Code of Federal Regulations, 2011 CFR
2011-01-01
.... agricultural and rural economy. (2) Administering a methodological research program to improve agricultural... design and data collection methodologies to the agricultural statistics program. Major functions include...) Designing, testing, and establishing survey techniques and standards, including sample design, sample...
Rat sperm motility analysis: methodologic considerations
The objective of these studies was to optimize conditions for computer-assisted sperm analysis (CASA) of rat epididymal spermatozoa. Methodologic issues addressed include sample collection technique, sampling region within the epididymis, type of diluent medium used, and sample c...
Methodological quality of behavioural weight loss studies: a systematic review
Lemon, S. C.; Wang, M. L.; Haughton, C. F.; Estabrook, D. P.; Frisard, C. F.; Pagoto, S. L.
2018-01-01
Summary This systematic review assessed the methodological quality of behavioural weight loss intervention studies conducted among adults and associations between quality and statistically significant weight loss outcome, strength of intervention effectiveness and sample size. Searches for trials published between January 2009 and December 2014 were conducted using PubMed, MEDLINE and PsycINFO and identified ninety studies. Methodological quality indicators included study design, anthropometric measurement approach, sample size calculations, intent-to-treat (ITT) analysis, loss to follow-up rate, missing data strategy, sampling strategy, report of treatment receipt and report of intervention fidelity (mean number of indicators met = 6.3). Indicators most commonly utilized included randomized design (100%), objectively measured anthropometrics (96.7%), ITT analysis (86.7%) and reporting treatment adherence (76.7%). Most studies (62.2%) had a follow-up rate >75% and reported a loss to follow-up analytic strategy or minimal missing data (69.9%). Describing intervention fidelity (34.4%) and sampling from a known population (41.1%) were least common. Methodological quality was not associated with reporting a statistically significant result, effect size or sample size. This review found the published literature of behavioural weight loss trials to be of high quality for specific indicators, including study design and measurement. Areas identified for improvement include the use of more rigorous statistical approaches to loss to follow-up and better fidelity reporting. PMID:27071775
77 FR 4002 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-26
... the methodological research previously included in the original System of Record Notice (SORN). This... methodological research on improving various aspects of surveys authorized by Title 13, U.S.C. 8(b), 182, and 196, such as: survey sampling frame design; sample selection algorithms; questionnaire development, design...
Documentation of indigenous Pacific agroforestry systems: a review of methodologies
Bill Raynor
1993-01-01
Recent interest in indigenous agroforestry has led to a need for documentation of these systems. However, previous work is very limited, and few methodologies are well-known or widely accepted. This paper outlines various methodologies (including sampling methods, data to be collected, and considerations in analysis) for documenting structure and productivity of...
A novel method to scale up fungal endophyte isolations
USDA-ARS's Scientific Manuscript database
Estimations of species diversity are influenced by sampling intensity which in turn is influenced by methodology. For fungal endophyte diversity studies, the methodology includes surface-sterilization prior to isolation of endophytes. Surface-sterilization is an essential component of fungal endophy...
New Methodology for Natural Gas Production Estimates
2010-01-01
A new methodology is implemented with the monthly natural gas production estimates from the EIA-914 survey this month. The estimates, to be released April 29, 2010, include revisions for all of 2009. The fundamental changes in the new process include the timeliness of the historical data used for estimation and the frequency of sample updates, both of which are improved.
Catastrophic incidents can generate a large number of samples with analytically diverse types including forensic, clinical, environmental, food, and others. Environmental samples include water, wastewater, soil, air, urban building and infrastructure materials, and surface resid...
Roadmap for Navy Family Research.
1980-08-01
of methodological limitations, including small, often non-representative or narrowly defined samples and inadequate statistical controls. ...The report was prepared for the Office of Naval Research by the Westinghouse Public Applied Systems Division, and is designed to provide the Navy with a systematic framework for
Social Competence in Late Elementary School: Relationships to Parenting and Neighborhood Context
ERIC Educational Resources Information Center
Caughy, Margaret O'Brien; Franzini, Luisa; Windle, Michael; Dittus, Patricia; Cuccaro, Paula; Elliott, Marc N.; Schuster, Mark A.
2012-01-01
Despite evidence that neighborhoods confer both risk and resilience for youth development, the existing neighborhood research has a number of methodological limitations including lack of diversity in neighborhoods sampled and neighborhood characteristics assessed. The purpose of this study was to address these methodological limitations of…
Sampling methods to the statistical control of the production of blood components.
Pereira, Paulo; Seghatchian, Jerard; Caldeira, Beatriz; Santos, Paula; Castro, Rosa; Fernandes, Teresa; Xavier, Sandra; de Sousa, Gracinda; de Almeida E Sousa, João Paulo
2017-12-01
The control of blood components specifications is a requirement generalized in Europe by the European Commission Directives and in the US by the AABB standards. The use of a statistical process control methodology is recommended in the related literature, including the EDQM guideline. The reliability of the control depends on the sampling. However, a correct sampling methodology does not seem to be systematically applied. Commonly, sampling is intended only to comply with the 1% specification for the produced blood components. Nevertheless, from a purely statistical viewpoint, this model is arguably not a consistent sampling technique. This could severely limit the ability to detect abnormal patterns and to assure that the production has a non-significant probability of producing nonconforming components. This article discusses what is happening in blood establishments. Three statistical methodologies are proposed: simple random sampling, sampling based on the proportion of a finite population, and sampling based on the inspection level. The empirical results demonstrate that these models are practicable in blood establishments, contributing to the robustness of sampling and related statistical process control decisions.
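As an illustration of the second proposed methodology, sampling based on the proportion of a finite population, the lot size can enter through the finite population correction. This is a minimal sketch with hypothetical lot, specification, and confidence values, not figures from the paper:

```python
import math

def sample_size_finite_population(N, p=0.01, e=0.01, z=1.96):
    """Sample size needed to estimate a nonconformity proportion p in a
    finite lot of N units with margin of error e at ~95% confidence (z),
    using the finite population correction (FPC)."""
    n0 = (z ** 2) * p * (1 - p) / (e ** 2)     # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / N))  # FPC-adjusted size

# Hypothetical monthly lot of 2000 red cell concentrates, 1% specification
print(sample_size_finite_population(2000))  # 320
```

The correction is what makes the lot size matter: the same inputs give 381 units for a very large lot but only 320 for the 2000-unit lot.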
Methodological reporting of randomized trials in five leading Chinese nursing journals.
Shi, Chunhu; Tian, Jinhui; Ren, Dan; Wei, Hongli; Zhang, Lihuan; Wang, Quan; Yang, Kehu
2014-01-01
Randomized controlled trials (RCTs) are not always well reported, especially in terms of their methodological descriptions. This study aimed to investigate the adherence of methodological reporting to CONSORT and to explore associated trial-level variables in the Chinese nursing care field. In June 2012, we identified RCTs published in five leading Chinese nursing journals and included trials with details of randomized methods. The quality of methodological reporting was measured through the methods section of the CONSORT checklist and the overall CONSORT methodological items score was calculated and expressed as a percentage. Meanwhile, we hypothesized that some general and methodological characteristics were associated with reporting quality and conducted a regression with these data to explore the correlation. The descriptive and regression statistics were calculated via SPSS 13.0. In total, 680 RCTs were included. The overall CONSORT methodological items score was 6.34 ± 0.97 (Mean ± SD). No RCT reported descriptions and changes in "trial design," changes in "outcomes" and "implementation," or descriptions of the similarity of interventions for "blinding." Poor reporting was found in detailing the "settings of participants" (13.1%), "type of randomization sequence generation" (1.8%), calculation methods of "sample size" (0.4%), explanation of any interim analyses and stopping guidelines for "sample size" (0.3%), "allocation concealment mechanism" (0.3%), additional analyses in "statistical methods" (2.1%), and targeted subjects and methods of "blinding" (5.9%). More than 50% of trials described randomization sequence generation, the eligibility criteria of "participants," "interventions," and definitions of the "outcomes" and "statistical methods." The regression analysis found that publication year and ITT analysis were weakly associated with CONSORT score.
The completeness of methodological reporting of RCTs in the Chinese nursing care field is poor, especially with regard to the reporting of trial design, changes in outcomes, sample size calculation, allocation concealment, blinding, and statistical methods.
Soininen, Päivi; Putkonen, Hanna; Joffe, Grigori; Korkeila, Jyrki; Välimäki, Maritta
2014-06-04
Despite improvements in psychiatric inpatient care, patient restrictions in psychiatric hospitals are still in use. Studying perceptions among patients who have been secluded or physically restrained during their hospital stay is challenging. We sought to review the methodological and ethical challenges in qualitative and quantitative studies aiming to describe patients' perceptions of coercive measures, especially seclusion and physical restraints during their hospital stay. Systematic mixed studies review was the study method. Studies reporting patients' perceptions of coercive measures, especially seclusion and physical restraints during hospital stay were included. Methodological issues such as study design, data collection and recruitment process, participants, sampling, patient refusal or non-participation, and ethical issues such as informed consent process, and approval were synthesized systematically. Electronic searches of CINAHL, MEDLINE, PsycINFO and The Cochrane Library (1976-2012) were carried out. Out of 846 initial citations, 32 studies were included, 14 qualitative and 18 quantitative studies. A variety of methodological approaches were used, although descriptive and explorative designs were used in most cases. Data were mainly collected in qualitative studies by interviews (n = 13) or in quantitative studies by self-report questionnaires (n = 12). The recruitment process was explained in 59% (n = 19) of the studies. In most cases convenience sampling was used, yet five studies used randomization. Patients' refusal or non-participation was reported in 37% (n = 11) of studies. Of all studies, 56% (n = 18) reported having undergone an ethical review process by an official board or committee. Respondents were informed and consent was requested in 69% of studies (n = 22). The use of different study designs made comparison methodologically challenging.
The timing of data collection (considering bias and confounding factors) and the reasons for non-participation of eligible participants are likewise methodological challenges; recommended flow charts, for example, could aid in reporting this information. Other challenges identified were the recruitment of large and representative samples. Ethical challenges included requesting participants' informed consent and respecting ethical procedures.
Adaptive Oceanographic Sampling in a Coastal Environment Using Autonomous Gliding Vehicles
2003-08-01
cost autonomous vehicles with near-global range and modular sensor payload. Particular emphasis is placed on the development of adaptive sampling... environment. Secondary objectives include continued development of adaptive sampling strategies suitable for large fleets of slow-moving autonomous vehicles, and development and implementation of new oceanographic sensors and sampling methodologies. The main task completed was a complete redesign of
A design methodology for nonlinear systems containing parameter uncertainty
NASA Technical Reports Server (NTRS)
Young, G. E.; Auslander, D. M.
1983-01-01
In the present design methodology for nonlinear systems containing parameter uncertainty, a generalized sensitivity analysis is incorporated which employs parameter space sampling and statistical inference. For the case of a system with j adjustable and k nonadjustable parameters, this methodology (which includes an adaptive random search strategy) is used to determine the combination of j adjustable parameter values which maximize the probability of those performance indices which simultaneously satisfy design criteria in spite of the uncertainty due to k nonadjustable parameters.
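The probability-maximizing scheme described above can be sketched with plain Monte Carlo sampling over a nonadjustable parameter and a random search over an adjustable one. The performance index, parameter ranges, and uniform distributions below are hypothetical stand-ins, and this plain random search is simpler than the paper's adaptive variant:

```python
import random

def success_probability(x, performance, criterion, n_samples=2000):
    """Monte Carlo estimate of the probability that the performance index
    meets the design criterion despite uncertainty in the nonadjustable
    parameter q (assumed here to be uniform on [0.8, 1.2])."""
    rng = random.Random(0)  # fixed seed: same q samples for every x
    hits = sum(criterion(performance(x, rng.uniform(0.8, 1.2)))
               for _ in range(n_samples))
    return hits / n_samples

def random_search(performance, criterion, n_iter=200):
    """Random search over the adjustable parameter x, keeping the value
    with the highest estimated probability of satisfying the criterion."""
    rng = random.Random(1)
    best_x, best_p = None, -1.0
    for _ in range(n_iter):
        x = rng.uniform(0.0, 2.0)  # trial adjustable parameter value
        p = success_probability(x, performance, criterion)
        if p > best_p:
            best_x, best_p = x, p
    return best_x, best_p

# Hypothetical performance index: error of a gain x against an uncertain
# plant gain q; the design criterion caps the error at 0.15.
perf = lambda x, q: abs(1.0 - x * q)
crit = lambda j: j < 0.15
x_star, p_star = random_search(perf, crit)
```

With these stand-ins the search settles near x = 1, where the criterion is satisfied for the widest range of q.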
Zimmermann, Boris; Kohler, Achim
2014-01-01
Background It is imperative to have reliable and timely methodologies for analysis and monitoring of seed plants in order to determine climate-related plant processes. Moreover, impact of environment on plant fitness is predominantly based on studies of female functions, while the contribution of male gametophytes is mostly ignored due to missing data on pollen quality. We explored the use of infrared spectroscopy of pollen for an inexpensive and rapid characterization of plants. Methodology The study was based on measurement of pollen samples by two Fourier transform infrared techniques: single reflectance attenuated total reflectance and transmission measurement of sample pellets. The experimental set, with a total of 813 samples, included five pollination seasons and 300 different plant species belonging to all principal spermatophyte clades (conifers, monocotyledons, eudicots, and magnoliids). Results The spectroscopic-based methodology enables detection of phylogenetic variations, including the separation of confamiliar and congeneric species. Furthermore, the methodology enables measurement of phenotypic plasticity by the detection of inter-annual variations within the populations. The spectral differences related to environment and taxonomy are interpreted biochemically, specifically variations of pollen lipids, proteins, carbohydrates, and sporopollenins. The study shows large variations of absolute content of nutrients for congeneric species pollinating under the same environmental conditions. Moreover, a clear correlation between carbohydrate-to-protein ratio and pollination strategy has been detected. An infrared spectral database covering biochemical variation across species, climates and biogeographies will significantly improve comprehension of plant-environment interactions, including the impact of global climate change on plant communities. PMID:24748390
National Sample Survey of Registered Nurses II. Status of Nurses: November 1980.
ERIC Educational Resources Information Center
Bentley, Barbara S.; And Others
This report provides data describing the nursing population as determined by the second national sample survey of registered nurses. A brief introduction is followed by a chapter that presents an overview of the survey methodology, including details on the sampling design, the response rate, and the statistical reliability. Chapter 3 provides a…
Measuring solids concentration in stormwater runoff: comparison of analytical methods.
Clark, Shirley E; Siu, Christina Y S
2008-01-15
Stormwater suspended solids typically are quantified using one of two methods: aliquot/subsample analysis (total suspended solids [TSS]) or whole-sample analysis (suspended solids concentration [SSC]). Interproject comparisons are difficult because of inconsistencies in the methods and in their application. To address this concern, the suspended solids content has been measured using both methodologies in many current projects, but the question remains about how to compare these values with historical water-quality data where the analytical methodology is unknown. This research was undertaken to determine the effect of analytical methodology on the relationship between these two methods of determination of the suspended solids concentration, including the effect of aliquot selection/collection method and of particle size distribution (PSD). The results showed that SSC was best able to represent the known sample concentration and that the results were independent of the sample's PSD. Correlations between the results and the known sample concentration could be established for TSS samples, but they were highly dependent on the sample's PSD and on the aliquot collection technique. These results emphasize the need to report not only the analytical method but also the particle size information on the solids in stormwater runoff.
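The core mechanism behind the TSS/SSC discrepancy, an aliquot missing fast-settling sand-sized particles, can be illustrated with back-of-the-envelope numbers. All concentrations and the recovery fraction below are hypothetical, chosen only to show why TSS can read far below SSC for coarse-particle samples:

```python
# Whole-sample SSC vs aliquot-based TSS when coarse particles settle
# before the aliquot is drawn. All numbers are hypothetical.
fine_mg = 40.0    # fines: stay suspended, captured by either method
coarse_mg = 60.0  # sand-sized fraction: settles out of a top-drawn aliquot
volume_L = 1.0

ssc = (fine_mg + coarse_mg) / volume_L  # whole sample filtered: 100.0 mg/L

coarse_recovery = 0.25  # fraction of the sand captured in the aliquot
tss = (fine_mg + coarse_mg * coarse_recovery) / volume_L  # 55.0 mg/L

print(ssc, tss)  # SSC reflects the true load; TSS understates it
```

The gap scales with the coarse fraction, which is why the TSS-to-known-concentration correlations reported above depend so strongly on the particle size distribution.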
Hosia, Aino; Falkenhaug, Tone; Baxter, Emily J.; Pagès, Francesc
2017-01-01
The diversity and distribution of gelatinous zooplankton were investigated along the northern Mid-Atlantic Ridge (MAR) from June to August 2004. Here, we present results from macrozooplankton trawl sampling, as well as comparisons made between five different methodologies that were employed during the MAR-ECO survey. In total, 16 species of hydromedusae, 31 species of siphonophores and four species of scyphozoans were identified to species level from macrozooplankton trawl samples. Additional taxa were identified to higher taxonomic levels and a single ctenophore genus was observed. Samples were collected at 17 stations along the MAR between the Azores and Iceland. A divergence in the species assemblages was observed at the southern limit of the Subpolar Frontal Zone. The catch composition of gelatinous zooplankton is compared between different sampling methodologies including: a macrozooplankton trawl; a Multinet; a ringnet attached to bottom trawl; and optical platforms (Underwater Video Profiler (UVP) & Remotely Operated Vehicle (ROV)). Different sampling methodologies are shown to exhibit selectivity towards different groups of gelatinous zooplankton. Only ~21% of taxa caught during the survey were caught by both the macrozooplankton trawl and the Multinet when deployed at the same station. The estimates of gelatinous zooplankton abundance calculated using these two gear types also varied widely: 1.4 ± 0.9 individuals per 1000 m3 estimated by the macrozooplankton trawl vs. 468.3 ± 315.4 individuals per 1000 m3 estimated by the Multinet (mean ± s.d.) when used at the same stations (n = 6). While it appears that traditional net sampling can generate useful data on pelagic cnidarians, comparisons with results from the optical platforms suggest that ctenophore diversity and abundance are consistently underestimated, particularly when net sampling is conducted in combination with formalin fixation.
The results emphasise the importance of considering sampling methodology both when planning surveys, as well as when interpreting existing data. PMID:29095891
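The abundance figures quoted above are volume-standardized net catches; the conversion is a simple scaling of the count by the volume of water filtered. The count and volume below are hypothetical examples, not survey data:

```python
def abundance_per_1000m3(count, volume_filtered_m3):
    """Standardize a net catch to individuals per 1000 m^3 of water filtered."""
    return count * 1000.0 / volume_filtered_m3

# Hypothetical haul: 7 medusae counted in a 5000 m^3 trawl
print(abundance_per_1000m3(7, 5000))  # 1.4 individuals per 1000 m^3
```

Because the trawl filters orders of magnitude more water than the Multinet, identical per-1000 m3 figures imply very different raw counts, one reason gear comparisons like the one above are so sensitive to selectivity.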
40 CFR 85.2120 - Maintenance and submittal of records.
Code of Federal Regulations, 2010 CFR
2010-07-01
... testing program, including all production part sampling techniques used to verify compliance of the... subsequent analyses of that data; (7) A description of all the methodology, analysis, testing and/or sampling techniques used to ascertain the emission critical parameter specifications of the original equipment part...
An Analysis of Methods Used to Examine Gender Differences in Computer-Related Behavior.
ERIC Educational Resources Information Center
Kay, Robin
1992-01-01
Review of research investigating gender differences in computer-related behavior examines statistical and methodological flaws. Issues addressed include sample selection, sample size, scale development, scale quality, the use of univariate and multivariate analyses, regressional analysis, construct definition, construct testing, and the…
A Modern Approach to College Analytical Chemistry.
ERIC Educational Resources Information Center
Neman, R. L.
1983-01-01
Describes a course which emphasizes all facets of analytical chemistry, including sampling, preparation, interference removal, selection of methodology, measurement of a property, and calculation/interpretation of results. Includes special course features (such as cooperative agreement with an environmental protection center) and course…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vitkus, Timothy J.
2012-04-24
This guidance provides information on methodologies and the technical bases that licensees should consider for incorporating composite sampling strategies into final status survey (FSS) plans. In addition, this guidance also includes appropriate uses of composite sampling for generating the data for other decommissioning site investigations such as characterization or other preliminary site investigations.
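One standard rationale for composite sampling in site investigations is that a composite of n equal aliquots dilutes a single elevated aliquot by at most a factor of n, so screening the composite against the release limit divided by n preserves sensitivity to one hot aliquot. The limit and concentrations below are hypothetical illustrations, not values from the guidance:

```python
def composite_screen_limit(release_limit, n_aliquots):
    """Screening level for a composite of n equal aliquots: a single
    aliquot at the release limit still raises the composite to at least
    release_limit / n, so flag composites above that level."""
    return release_limit / n_aliquots

# Hypothetical: 4 soil aliquots, release limit of 100 (arbitrary units)
concs = [20, 15, 30, 130]            # one aliquot exceeds the limit
composite = sum(concs) / len(concs)  # 48.75, under the limit itself
flagged = composite > composite_screen_limit(100, 4)  # 48.75 > 25.0
print(flagged)  # True: the screen still catches the hot aliquot
```

The trade-off is that tighter screening levels demand lower analytical detection limits, which bounds how many aliquots can usefully be composited.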
Ferrer, Imma; Thurman, E Michael
2012-10-12
A straightforward methodology for the chromatographic separation and accurate mass identification of 100 pharmaceuticals including some of their degradation products was developed using liquid chromatography/quadrupole time-of-flight mass spectrometry (LC/Q-TOF-MS). A table compiling the protonated or deprotonated exact masses for all compounds, as well as the exact mass of several fragment ions obtained by MS-MS, is included. Excellent chromatographic separation was achieved by using 3.5 μm particle size columns and a slow and generic 30-min gradient. Isobaric and isomeric compounds (same nominal mass and same exact mass, respectively) were distinguished by various methods, including chromatographic separation, MS-MS fragmentation, and isotopic signal identification. Method reporting limits of detection ranged from 1 to 1000 ng/L, after solid-phase extraction of 100 mL aqueous samples. The methodology was successfully applied to the analysis of surface water impacted by wastewater effluent by identifying many of the pharmaceuticals and metabolites included in the list. Examples are given for some of the most unusual findings in environmental samples. This paper is meant to serve as a guide for those doing analysis of pharmaceuticals in environmental samples, by providing exact mass measurements of several well-known, as well as newly identified and environmentally relevant, pharmaceuticals in water samples.
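Accurate-mass identification of this kind rests on computing theoretical monoisotopic masses for each protonated or deprotonated compound. A minimal sketch using standard isotope masses, with carbamazepine (C15H12N2O, a pharmaceutical widely reported in wastewater-impacted surface water) as the worked example; this is not the paper's own compound table:

```python
import re

# Monoisotopic masses (Da) of the most abundant isotopes
MONO = {'C': 12.0, 'H': 1.0078250319, 'N': 14.0030740052,
        'O': 15.9949146221, 'S': 31.97207069, 'Cl': 34.96885271}
PROTON = 1.00727646  # mass of a proton (H+), Da

def monoisotopic_mass(formula):
    """Sum monoisotopic masses for a simple formula like 'C15H12N2O'."""
    total = 0.0
    for elem, count in re.findall(r'([A-Z][a-z]?)(\d*)', formula):
        if elem:
            total += MONO[elem] * (int(count) if count else 1)
    return total

def mz_protonated(formula):
    """Exact m/z of the [M+H]+ ion used for positive-mode identification."""
    return monoisotopic_mass(formula) + PROTON

print(round(mz_protonated('C15H12N2O'), 4))  # carbamazepine [M+H]+: 237.1022
```

Matching a measured accurate mass to such a theoretical value within a few ppm is what distinguishes isobaric compounds that share only a nominal mass.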
Statistical power calculations for mixed pharmacokinetic study designs using a population approach.
Kloprogge, Frank; Simpson, Julie A; Day, Nicholas P J; White, Nicholas J; Tarning, Joel
2014-09-01
Simultaneous modelling of dense and sparse pharmacokinetic data is possible with a population approach. To determine the number of individuals required to detect the effect of a covariate, simulation-based power calculation methodologies can be employed. The Monte Carlo Mapped Power method (a simulation-based power calculation methodology using the likelihood ratio test) was extended in the current study to perform sample size calculations for mixed pharmacokinetic studies (i.e. both sparse and dense data collection). A workflow guiding an easy and straightforward pharmacokinetic study design, considering also the cost-effectiveness of alternative study designs, was used in this analysis. Initially, data were simulated for a hypothetical drug and then for the anti-malarial drug, dihydroartemisinin. Two datasets (sampling design A: dense; sampling design B: sparse) were simulated using a pharmacokinetic model that included a binary covariate effect and subsequently re-estimated using (1) the same model and (2) a model not including the covariate effect in NONMEM 7.2. Power calculations were performed for varying numbers of patients with sampling designs A and B. Study designs with statistical power >80% were selected and further evaluated for cost-effectiveness. The simulation studies of the hypothetical drug and the anti-malarial drug dihydroartemisinin demonstrated that the simulation-based power calculation methodology, based on the Monte Carlo Mapped Power method, can be utilised to evaluate and determine the sample size of mixed (part sparsely and part densely sampled) study designs. The developed method can contribute to the design of robust and efficient pharmacokinetic studies.
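The simulation-based power logic, simulate under the covariate model, refit with and without the covariate, and count likelihood ratio rejections, can be sketched with a toy nested-normal-model stand-in. This is not the authors' Monte Carlo Mapped Power implementation or a NONMEM model; the effect size, residual error, and balanced design below are hypothetical:

```python
import math, random

def lrt_power(n_subjects, beta, n_sim=500, sigma=1.0):
    """Fraction of simulated trials in which a likelihood ratio test
    detects a binary covariate effect beta on a subject-level parameter
    (nested normal models; chi-square(1) critical value, alpha = 0.05)."""
    rng = random.Random(42)
    crit = 3.841
    hits = 0
    for _ in range(n_sim):
        cov = [i % 2 for i in range(n_subjects)]  # half carry the covariate
        y = [10.0 + beta * c + rng.gauss(0, sigma) for c in cov]
        mean_all = sum(y) / n_subjects            # reduced model: one mean
        rss0 = sum((v - mean_all) ** 2 for v in y)
        g0 = [v for v, c in zip(y, cov) if c == 0]
        g1 = [v for v, c in zip(y, cov) if c == 1]
        m0, m1 = sum(g0) / len(g0), sum(g1) / len(g1)
        rss1 = (sum((v - m0) ** 2 for v in g0)    # full model: group means
                + sum((v - m1) ** 2 for v in g1))
        if n_subjects * math.log(rss0 / rss1) > crit:
            hits += 1
    return hits / n_sim

# Power grows with the number of subjects for a fixed covariate effect
print(lrt_power(20, 1.0), lrt_power(80, 1.0))
```

Sweeping `n_subjects` until the estimated power crosses 80% mirrors the sample size selection step of the workflow described above, before the cost-effectiveness comparison.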
2014-01-01
Background Despite improvements in psychiatric inpatient care, patient restrictions in psychiatric hospitals are still in use. Studying perceptions among patients who have been secluded or physically restrained during their hospital stay is challenging. We sought to review the methodological and ethical challenges in qualitative and quantitative studies aiming to describe patients’ perceptions of coercive measures, especially seclusion and physical restraints during their hospital stay. Methods Systematic mixed studies review was the study method. Studies reporting patients’ perceptions of coercive measures, especially seclusion and physical restraints during hospital stay were included. Methodological issues such as study design, data collection and recruitment process, participants, sampling, patient refusal or non-participation, and ethical issues such as informed consent process, and approval were synthesized systematically. Electronic searches of CINALH, MEDLINE, PsychINFO and The Cochrane Library (1976-2012) were carried out. Results Out of 846 initial citations, 32 studies were included, 14 qualitative and 18 quantitative studies. A variety of methodological approaches were used, although descriptive and explorative designs were used in most cases. Data were mainly collected in qualitative studies by interviews (n = 13) or in quantitative studies by self-report questionnaires (n = 12). The recruitment process was explained in 59% (n = 19) of the studies. In most cases convenience sampling was used, yet five studies used randomization. Patient’s refusal or non-participation was reported in 37% (n = 11) of studies. Of all studies, 56% (n = 18) had reported undergone an ethical review process in an official board or committee. Respondents were informed and consent was requested in 69% studies (n = 22). Conclusions The use of different study designs made comparison methodologically challenging. 
The timing of data collection (considering bias and confounding factors) and the reasons for non-participation of eligible participants are likewise methodological challenges; the use of recommended flow charts could aid the reporting of this information. Other challenges identified were the recruitment of large and representative samples. Ethical challenges included requesting participants’ informed consent and respecting ethical procedures. PMID:24894162
Methodological reporting of randomized clinical trials in respiratory research in 2010.
Lu, Yi; Yao, Qiuju; Gu, Jie; Shen, Ce
2013-09-01
Although randomized controlled trials (RCTs) are considered the highest level of evidence, they are also subject to bias due to a lack of adequately reported randomization; reporting should therefore be as explicit as possible so that readers can determine the significance of the contents. We evaluated the methodological quality of RCTs in respiratory research published in high-ranking clinical journals in 2010. We assessed methodological quality, including generation of the allocation sequence, allocation concealment, double-blinding, sample-size calculation, intention-to-treat analysis, flow diagrams, number of medical centers involved, diseases, funding sources, types of interventions, trial registration, number of times the papers have been cited, journal impact factor, journal type, and journal endorsement of the CONSORT (Consolidated Standards of Reporting Trials) rules, in RCTs published in 12 top-ranking clinical respiratory journals and 5 top-ranking general medical journals. We included 176 trials, of which 93 (53%) reported adequate generation of the allocation sequence, 66 (38%) reported adequate allocation concealment, 79 (45%) were double-blind, 123 (70%) reported adequate sample-size calculation, 88 (50%) reported intention-to-treat analysis, and 122 (69%) included a flow diagram. Multivariate logistic regression analysis revealed that journal impact factor ≥ 5 was the only variable that significantly influenced adequate allocation sequence generation. Trial registration and journal impact factor ≥ 5 significantly influenced adequate allocation concealment. Medical interventions, trial registration, and journal endorsement of the CONSORT statement influenced adequate double-blinding. Publication in one of the general medical journals influenced adequate sample-size calculation. The methodological quality of RCTs in respiratory research needs improvement. Stricter enforcement of the CONSORT statement should enhance the quality of RCTs.
A systematic review of grounded theory studies in physiotherapy.
Ali, Nancy; May, Stephen; Grafton, Kate
2018-05-23
This systematic review aimed to appraise the methodological rigor of grounded theory research published in the field of physiotherapy to assess how the methodology is understood and applied. A secondary aim was to provide research implications drawn from the findings to guide future grounded theory methodology (GTM) research. A systematic search was conducted in MEDLINE, CINAHL, SPORT Discus, Science Direct, PubMed, Scopus, and Web of Science to identify studies in the field of physiotherapy that reported using GTM and/or its methods in the study title and/or abstract. The descriptive characteristics and methodological quality of eligible studies were examined using grounded theory methodology assessment guidelines. The review included 68 studies conducted between 1998 and 2017. The findings showed that GTM is becoming increasingly used by physiotherapy researchers. Thirty-six studies (53%) demonstrated a good understanding and appropriate application of GTM. Thirty-two studies (47%) presented descriptive findings and were considered to be of poor methodological quality. There are several key tenets of GTM that are integral to the iterative process of qualitative theorizing and need to be applied throughout all research practices, including sampling, data collection, and analysis.
Lacey, John H.; Kelley-Baker, Tara; Voas, Robert B.; Romano, Eduardo; Furr-Holden, C. Debra; Torres, Pedro; Berning, Amy
2013-01-01
This article describes the methodology used in the 2007 U.S. National Roadside Survey to estimate the prevalence of alcohol- and drug-impaired driving and alcohol- and drug-involved driving. This study involved randomly stopping drivers at 300 locations across the 48 continental U.S. states at sites selected through a stratified random sampling procedure. Data were collected during a 2-hour Friday daytime session at 60 locations and during 2-hour nighttime weekend periods at 240 locations. Both self-report and biological measures were taken. Biological measures included breath alcohol measurements from 9,413 respondents, oral fluid samples from 7,719 respondents, and blood samples from 3,276 respondents. PMID:21997324
Methodological challenges when doing research that includes ethnic minorities: a scoping review.
Morville, Anne-Le; Erlandsson, Lena-Karin
2016-11-01
There are challenging methodological issues in obtaining valid and reliable results on which to base occupational therapy interventions for ethnic minorities. The aim of this scoping review is to describe the methodological problems within occupational therapy research when ethnic minorities are included. A thorough literature search yielded 21 articles obtained from the scientific databases PubMed, CINAHL, Web of Science and PsycINFO. Analysis followed Arksey and O'Malley's framework for scoping reviews, applying content analysis. The results showed methodological issues concerning the entire research process, from defining and recruiting samples, conceptual understanding, and the lack of appropriate instruments, to data collection using interpreters and data analysis. In order to avoid excluding ethnic minorities from adequate occupational therapy research and interventions, development of methods for the entire research process is needed. It is a costly and time-consuming process, but the results will be valid and reliable, and therefore more applicable in clinical practice.
Connor, Thomas H; Smith, Jerome P
2016-09-01
At the present time, the method of choice to determine surface contamination of the workplace with antineoplastic and other hazardous drugs is surface wipe sampling and subsequent sample analysis with a variety of analytical techniques. The purpose of this article is to review current methodology for determining the level of surface contamination with hazardous drugs in healthcare settings, to discuss recent advances in this area, and to provide some guidance for conducting surface wipe sampling and sample analysis for these drugs in healthcare settings. Published studies on the use of wipe sampling to measure hazardous drugs on surfaces in healthcare settings were reviewed. These studies include the use of well-documented chromatographic techniques for sample analysis in addition to newly evolving technology that provides rapid analysis of specific antineoplastic drugs. Methodology for the analysis of surface wipe samples for hazardous drugs is reviewed, including the purposes, technical factors, sampling strategy, materials required, and limitations. The use of lateral flow immunoassay (LFIA) and fluorescence covalent microbead immunosorbent assay (FCMIA) for surface wipe sample evaluation is also discussed. Current recommendations are that all healthcare settings where antineoplastic and other hazardous drugs are handled include surface wipe sampling as part of a comprehensive hazardous drug-safe handling program. Surface wipe sampling may be used as a method to characterize potential occupational dermal exposure risk and to evaluate the effectiveness of implemented controls and the overall safety program. New technology, although currently limited in scope, may make wipe sampling for hazardous drugs more routine, less costly, and provide a shorter response time than classical analytical techniques now in use.
Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Eguía, Pablo; Collazo, Joaquín
2010-01-01
The objective of this study was to develop a methodology for the determination of the maximum sampling error and confidence intervals of thermal properties obtained from thermogravimetric analysis (TG), including moisture, volatile matter, fixed carbon and ash content. The sampling procedure of the TG analysis was of particular interest and was conducted with care. The results of the present study were compared to those of a prompt analysis, and a correlation between the mean values and maximum sampling errors of the methods were not observed. In general, low and acceptable levels of uncertainty and error were obtained, demonstrating that the properties evaluated by TG analysis were representative of the overall fuel composition. The accurate determination of the thermal properties of biomass with precise confidence intervals is of particular interest in energetic biomass applications. PMID:20717532
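The "maximum sampling error" and confidence interval of a mean obtained from replicate determinations can be illustrated with a standard t-based calculation. This is a generic sketch, not the paper's exact procedure, and the function name is hypothetical:

```python
import math
from statistics import mean, stdev
from scipy import stats

def sampling_error(values, confidence=0.95):
    """Mean, maximum sampling error (half-width of the t-based
    confidence interval), and the interval itself, for replicate
    determinations of a thermal property (e.g. ash content by TG)."""
    n = len(values)
    t = stats.t.ppf((1 + confidence) / 2, df=n - 1)   # two-sided t quantile
    e = t * stdev(values) / math.sqrt(n)              # half-width
    m = mean(values)
    return m, e, (m - e, m + e)
```

For example, three replicate ash determinations of 9.8, 10.0 and 10.2 wt% give a mean of 10.0 wt% with a maximum sampling error of about 0.5 wt% at 95% confidence, reflecting the wide t quantile at only 2 degrees of freedom.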
Factors Associated with Job Content Plateauing among Older Workers
ERIC Educational Resources Information Center
Armstrong-Stassen, Marjorie
2008-01-01
Purpose: The purpose of this paper is to identify personal and work environment factors associated with the experience of job content plateauing among older workers. Design/methodology/approach: Two cross-sectional studies, each including two samples, were conducted. In each study, one sample consisted of a diverse group of older workers and the…
Faster the better: a reliable technique to sample anopluran lice in large hosts.
Leonardi, María Soledad
2014-06-01
Among Anoplura, the family Echinophthiriidae includes those species that infest mainly the pinnipeds. Working with large hosts implies methodological considerations such as the time spent sampling and the way in which the animal is restrained. Previous works on echinophthiriids combined a diverse array of analyses, including field counts of lice and in vitro observations. To collect lice, the authors used forceps, and each louse was collected individually. This implied a long manipulation time, i.e., ≈60 min, and the need to physically and/or chemically immobilize the animal. The present work described and discussed for the first time a sampling technique that minimized the manipulation time and also avoided the use of anesthesia. This methodology implied combing the host's pelage with a fine-tooth plastic comb, as used in the treatment of human pediculosis, and keeping the comb with the retained lice in a Ziploc® bag with ethanol. This technique was used successfully in studies of population dynamics, habitat selection, and transmission patterns, proving to be a reliable methodology. Lice are collected whole and in good condition for mounting and study under light or scanning electron microscopy. Moreover, the use of the plastic comb avoids damage to taxonomically important structures such as spines, so it is also recommended for taxonomic or morphological work.
Suyemoto, M M; Barnes, H J; Borst, L B
2017-03-01
Pathogenic strains of Enterococcus cecorum (EC) expressing multidrug resistance have emerged. In National Antimicrobial Resistance Monitoring System (NARMS) data, EC is rarely recovered from chickens. Two NARMS methodologies (FDA and USDA) were compared with standard culture (SC) techniques for recovery of EC. NARMS methods failed to detect EC in 58 caecal samples, 20 chicken breast samples or six whole broiler samples. EC was recovered from 1 of 38 (2·6%) and 2 of 38 (5·2%) preharvest spinal lesions (USDA and FDA method, respectively). In contrast, using the SC method, EC was recovered from 44 of 53 (83%) caecal samples, all 38 (100%) spinal lesions, 14 of 20 (70%) chicken breast samples, and all three spinal lesions identified in whole carcasses. Compared with other Enterococcus spp., EC isolates had a higher prevalence of resistance to macrolides. The NARMS methods significantly affected recovery of enterococcal species other than EC. When the postharvest FDA method was applied to preharvest caecal samples, isolates of Enterococcus faecium were preferentially recovered. All 11 E. faecium isolates were multidrug resistant, including resistance to penicillin, daptomycin and linezolid. These findings confirm that current methodologies may not accurately identify the amount and range of antimicrobial resistance of enterococci from chicken sources. Enterococci are an important reservoir for antimicrobial resistance. This study demonstrates how current culture methods underreport resistance to macrolides in enterococci by selecting against strains of Enterococcus cecorum in pre- and postharvest chicken. Further, the application of postharvest surveillance methods to preharvest samples resulted in selective recovery of Enterococcus faecium over Enterococcus faecalis. Isolates of E. faecium recovered exhibited multidrug resistance, including penicillin, daptomycin and linezolid resistance. 
These findings suggest that culture methodology significantly impacts the range and amount of antimicrobial resistance detected in enterococci isolated from chicken. © 2016 The Society for Applied Microbiology.
Mammana, Sabrina B; Berton, Paula; Camargo, Alejandra B; Lascalea, Gustavo E; Altamirano, Jorgelina C
2017-05-01
An analytical methodology based on coprecipitation-assisted coacervative extraction coupled to HPLC-UV was developed for the determination of five organophosphorus pesticides (OPPs), including fenitrothion, guthion, parathion, methidathion, and chlorpyrifos, in water samples. It involves a green technique leading to an efficient and simple analytical methodology suitable for high-throughput analysis. Relevant physicochemical variables affecting the analytical response of each OPP were studied and optimized. Under optimized conditions, the resulting methodology was as follows: an aliquot of 9 mL of water sample was placed into a centrifuge tube, and 0.5 mL of 0.1 M sodium citrate (pH 4), 0.08 mL of 0.1 M Al2(SO4)3, and 0.7 mL of 0.1 M SDS were added and homogenized. After centrifugation the supernatant was discarded. A 700 μL aliquot of the coacervate-rich phase obtained was dissolved in 300 μL of methanol, and 20 μL of the resulting solution was analyzed by HPLC-UV. The resulting LODs ranged within 0.7-2.5 ng/mL, and the achieved RSD and recovery values were <8% (n = 3) and >81%, respectively. The proposed analytical methodology was successfully applied to the analysis of the five OPPs in drinking water samples from different locations in Mendoza. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
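The figures of merit quoted here (LOD, RSD, recovery) follow standard analytical-chemistry definitions, which can be sketched as below. These are illustrative textbook formulas, such as the k·s/m detection-limit rule, not necessarily the authors' exact computation, and the function names are hypothetical:

```python
import statistics

def rsd_percent(replicates):
    """Relative standard deviation (%) of replicate measurements:
    100 * s / mean, a standard precision figure of merit."""
    return 100 * statistics.stdev(replicates) / statistics.mean(replicates)

def recovery_percent(measured, spiked):
    """Recovery (%) of a spiked analyte: found vs. added concentration."""
    return 100 * measured / spiked

def lod_calibration(s_blank, slope, k=3):
    """Detection limit from blank noise and calibration slope
    (the common k*s/m rule with k = 3)."""
    return k * s_blank / slope
```

For instance, three replicates of 9.8, 10.0 and 10.2 ng/mL give an RSD of 2%, a found concentration of 8.5 ng/mL on a 10 ng/mL spike gives 85% recovery, and a blank noise of 0.3 (signal units) with a slope of 1.2 signal units per ng/mL gives an LOD of 0.75 ng/mL.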
Assessing quality of reports on randomized clinical trials in nursing journals.
Parent, Nicole; Hanley, James A
2009-01-01
Several surveys have presented the quality of reports on randomized clinical trials (RCTs) published in general and specialty medical journals. The aim of these surveys was to raise scientific consciousness on methodological aspects pertaining to internal and external validity. These reviews have suggested that the methodological quality could be improved. We conducted a survey of reports on RCTs published in nursing journals to assess their methodological quality. The features we considered included sample size, flow of participants, assessment of baseline comparability, randomization, blinding, and statistical analysis. We collected data from all reports of RCTs published between January 1994 and December 1997 in Applied Nursing Research, Heart & Lung and Nursing Research. We hand-searched the journals and included all 54 articles in which authors reported that individuals had been randomly allocated to distinct groups. We collected data using a condensed form of the Consolidated Standards of Reporting Trials (CONSORT) statement for structured reporting of RCTs (Begg et al., 1996). Sample size calculations were included in only 22% of the reports. Only 48% of the reports provided information about the type of randomization, and a mere 22% described blinding strategies. Comparisons of baseline characteristics using hypothesis tests were inappropriately presented in more than 76% of the reports. Excessive use and unstructured reporting of significance testing were common (59%), and all reports failed to provide the magnitude of treatment differences with confidence intervals. Better methodological quality in reports of RCTs will help to raise the standards of nursing research.
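The effect-size reporting the survey found missing, a treatment difference accompanied by a confidence interval, is straightforward to compute. A minimal sketch using Welch's t interval follows (the function name is hypothetical; CONSORT asks for this kind of estimate rather than a bare p-value):

```python
import math
from statistics import mean, variance
from scipy import stats

def mean_diff_ci(a, b, confidence=0.95):
    """Difference in group means with a Welch's t confidence interval,
    i.e. the magnitude-of-effect reporting the surveyed RCTs lacked."""
    na, nb = len(a), len(b)
    d = mean(a) - mean(b)
    va, vb = variance(a), variance(b)
    se = math.sqrt(va / na + vb / nb)
    # Welch-Satterthwaite approximation for the degrees of freedom
    df = (va / na + vb / nb) ** 2 / (
        (va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    t = stats.t.ppf((1 + confidence) / 2, df)
    return d, (d - t * se, d + t * se)
```

Reporting "difference 4.0, 95% CI 1.7 to 6.3" conveys both the magnitude and the precision of a treatment effect, which a p-value alone does not.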
Landolt, Alison S; Milling, Leonard S
2011-08-01
This paper presents a comprehensive methodological review of research on the efficacy of hypnosis for reducing labor and delivery pain. To be included, studies were required to use a between-subjects or mixed model design in which hypnosis was compared with a control condition or alternative intervention in reducing labor pain. An exhaustive search of the PsycINFO and PubMed databases produced 13 studies satisfying these criteria. Hetero-hypnosis and self-hypnosis were consistently shown to be more effective than standard medical care, supportive counseling, and childbirth education classes in reducing pain. Other benefits included better infant Apgar scores and shorter Stage 1 labor. Common methodological limitations of the literature include a failure to use random assignment, to specify the demographic characteristics of samples, and to use a treatment manual. Copyright © 2011 Elsevier Ltd. All rights reserved.
21 CFR 118.7 - Sampling methodology for Salmonella Enteritidis (SE).
Code of Federal Regulations, 2013 CFR
2013-04-01
... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Sampling methodology for Salmonella Enteritidis (SE). 118.7 Section 118.7 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN....7 Sampling methodology for Salmonella Enteritidis (SE). (a) Environmental sampling. An environmental...
21 CFR 118.7 - Sampling methodology for Salmonella Enteritidis (SE).
Code of Federal Regulations, 2014 CFR
2014-04-01
... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Sampling methodology for Salmonella Enteritidis (SE). 118.7 Section 118.7 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN....7 Sampling methodology for Salmonella Enteritidis (SE). (a) Environmental sampling. An environmental...
21 CFR 118.7 - Sampling methodology for Salmonella Enteritidis (SE).
Code of Federal Regulations, 2012 CFR
2012-04-01
... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Sampling methodology for Salmonella Enteritidis (SE). 118.7 Section 118.7 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN....7 Sampling methodology for Salmonella Enteritidis (SE). (a) Environmental sampling. An environmental...
21 CFR 118.7 - Sampling methodology for Salmonella Enteritidis (SE).
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Sampling methodology for Salmonella Enteritidis (SE). 118.7 Section 118.7 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN....7 Sampling methodology for Salmonella Enteritidis (SE). (a) Environmental sampling. An environmental...
21 CFR 118.7 - Sampling methodology for Salmonella Enteritidis (SE).
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Sampling methodology for Salmonella Enteritidis (SE). 118.7 Section 118.7 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN....7 Sampling methodology for Salmonella Enteritidis (SE). (a) Environmental sampling. An environmental...
Characterizing Aeroallergens by Infrared Spectroscopy of Fungal Spores and Pollen
Zimmermann, Boris; Tkalčec, Zdenko; Mešić, Armin; Kohler, Achim
2015-01-01
Background Fungal spores and plant pollen cause respiratory diseases in susceptible individuals, such as asthma, allergic rhinitis and hypersensitivity pneumonitis. Aeroallergen monitoring networks are an important part of treatment strategies, but unfortunately traditional analysis is time consuming and expensive. We have explored the use of infrared spectroscopy of pollen and spores for an inexpensive and rapid characterization of aeroallergens. Methodology The study is based on measurement of spore and pollen samples by single reflectance attenuated total reflectance Fourier transform infrared spectroscopy (SR-ATR FTIR). The experimental set includes 71 spore (Basidiomycota) and 121 pollen (Pinales, Fagales and Poales) samples. Along with fresh basidiospores, the study has been conducted on the archived samples collected within the last 50 years. Results The spectroscopic-based methodology enables clear spectral differentiation between pollen and spores, as well as the separation of confamiliar and congeneric species. In addition, the analysis of the scattering signals inherent in the infrared spectra indicates that the FTIR methodology offers indirect estimation of morphology of pollen and spores. The analysis of fresh and archived spores shows that chemical composition of spores is well preserved even after decades of storage, including the characteristic taxonomy-related signals. Therefore, biochemical analysis of fungal spores by FTIR could provide economical, reliable and timely methodologies for improving fungal taxonomy, as well as for fungal identification and monitoring. This proof of principle study shows the potential for using FTIR as a rapid tool in aeroallergen studies. In addition, the presented method is ready to be immediately implemented in biological and ecological studies for direct measurement of pollen and spores from flowers and sporocarps. PMID:25867755
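Differentiation between pollen and spore spectra of the kind reported here typically rests on multivariate analysis of the FTIR measurements. The sketch below is a minimal, hypothetical illustration of one such tool (PCA via SVD on a toy spectral matrix), not the authors' actual analysis pipeline:

```python
import numpy as np

def pca_scores(spectra, n_components=2):
    """Project spectra (rows = samples, columns = wavenumbers) onto
    their leading principal components; chemically distinct groups
    (e.g. pollen vs. spores) tend to separate along these scores."""
    X = np.asarray(spectra, dtype=float)
    X = X - X.mean(axis=0)                       # mean-center each band
    # SVD-based PCA: rows of vt are the principal directions
    u, s, vt = np.linalg.svd(X, full_matrices=False)
    return X @ vt[:n_components].T
```

On a toy matrix where one group has a positive feature in a single band and the other a negative one, the two groups land on opposite sides of the first principal component.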
ERIC Educational Resources Information Center
Riccobono, John A.; Cominole, Melissa B.; Siegel, Peter H.; Gabel, Tim J.; Link, Michael W.; Berkner, Lutz K.
This report describes the methods and procedures used for the 2000 National Postsecondary Student Aid Study (NPSAS:2000). NPSAS:2000 included notable changes from previous NPSAS surveys (conducted in 1987, 1990, 1993, and 1996) in its sample design and collection of data. For example, this study is the first to restrict institutional sampling to…
Q-Sample Construction: A Critical Step for a Q-Methodological Study.
Paige, Jane B; Morin, Karen H
2016-01-01
Q-sample construction is a critical step in Q-methodological studies. Prior to conducting Q-studies, researchers start with a population of opinion statements (the concourse) on a particular topic of interest, from which a sample is drawn. These sampled statements are known as the Q-sample. Although literature exists on methodological processes for conducting Q-methodological studies, limited guidance exists on the practical steps to reduce the population of statements to a Q-sample. A case exemplar illustrates the steps to construct a Q-sample in preparation for a study that explored the perspectives nurse educators and nursing students hold about simulation design. Experts in simulation and Q-methodology evaluated the Q-sample for readability, clarity, and representativeness of the opinions contained within the concourse. The Q-sample was piloted, and feedback resulted in statement refinement. Researchers, especially those undertaking Q-method studies for the first time, may benefit from the practical considerations for constructing a Q-sample offered in this article. © The Author(s) 2014.
Salgueiro-González, N; Muniategui-Lorenzo, S; López-Mahía, P; Prada-Rodríguez, D
2017-04-15
In the last decade, the impact of alkylphenols and bisphenol A in the aquatic environment has been widely evaluated because of their high use in industrial and household applications as well as their toxicological effects. These compounds are well-known endocrine disrupting compounds (EDCs) which can affect the hormonal system of humans and wildlife, even at low concentrations. Because these pollutants enter the environment through waters, the most affected compartment, analytical methods which allow the determination of these compounds in aqueous samples at low levels are mandatory. In this review, an overview of the most significant advances in the analytical methodologies for the determination of alkylphenols and bisphenol A in waters is considered (from 2002 to the present). Sample handling and instrumental detection strategies are critically discussed, including analytical parameters related to quality assurance and quality control (QA/QC). Special attention is paid to miniaturized sample preparation methodologies and approaches proposed to reduce time and reagent consumption in accordance with Green Chemistry principles, which have increased in the last five years. Finally, relevant applications of these methods to the analysis of water samples are examined, with wastewater and surface water being the most investigated. Copyright © 2017 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dawson, Paul R.; Boyce, Donald E.; Park, Jun-Sang
A robust methodology is presented to extract slip system strengths from lattice strain distributions for polycrystalline samples obtained from high-energy x-ray diffraction (HEXD) experiments with in situ loading. The methodology consists of matching the evolution of coefficients of a harmonic expansion of the distributions from simulation to the coefficients derived from measurements. Simulation results are generated via finite element simulations of virtual polycrystals that are subjected to the loading history applied in the HEXD experiments. Advantages of the methodology include: (1) its ability to utilize extensive data sets generated by HEXD experiments; (2) its ability to capture trends in distributions that may be noisy (both measured and simulated); and (3) its sensitivity to the ratios of the family strengths. The approach is used to evaluate the slip system strengths of Ti-6Al-4V using samples having relatively equiaxed grains. These strength estimates are compared to values in the literature.
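The core of the matching step, expanding a distribution in an orthogonal basis and comparing simulated to measured coefficients by least squares, can be sketched in one dimension with Legendre polynomials. The paper's harmonic expansion is over orientation space; this 1-D version and the function names are illustrative only:

```python
import numpy as np

def harmonic_coeffs(x, y, degree=6):
    """Project a (possibly noisy) distribution sampled at points x onto
    Legendre polynomials; the low-order coefficients capture the trend
    while averaging out the noise."""
    return np.polynomial.legendre.legfit(x, y, degree)

def coefficient_misfit(x, measured, simulated, degree=6):
    """Least-squares distance between measured and simulated expansion
    coefficients: the kind of matching criterion minimised when
    calibrating strength parameters against experiment."""
    cm = harmonic_coeffs(x, measured, degree)
    cs = harmonic_coeffs(x, simulated, degree)
    return float(np.sum((cm - cs) ** 2))
```

Comparing expansion coefficients rather than raw curves is what gives the approach its robustness to noise: pointwise differences fluctuate, but the low-order coefficients of measured and simulated distributions agree when the underlying trends agree.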
Bogdanova, Yelena; Yee, Megan K; Ho, Vivian T; Cicerone, Keith D
Comprehensive review of the use of computerized treatment as a rehabilitation tool for attention and executive function in adults (aged 18 years or older) who suffered an acquired brain injury. Systematic review of empirical research. Two reviewers independently assessed articles using the methodological quality criteria of Cicerone et al. Data extracted included sample size, diagnosis, intervention information, treatment schedule, assessment methods, and outcome measures. A literature review (PubMed, EMBASE, Ovid, Cochrane, PsycINFO, CINAHL) generated a total of 4931 publications. Twenty-eight studies using computerized cognitive interventions targeting attention and executive functions were included in this review. In 23 studies, significant improvements in attention and executive function subsequent to training were reported; in the remaining 5, promising trends were observed. Preliminary evidence suggests improvements in cognitive function following computerized rehabilitation for acquired brain injury populations, including traumatic brain injury and stroke. Further studies are needed to address methodological issues (e.g., small sample size, inadequate control groups) and to inform the development of guidelines and standardized protocols.
NASA Astrophysics Data System (ADS)
Kamnev, Alexander A.; Tugarova, Anna V.; Dyatlova, Yulia A.; Tarantilis, Petros A.; Grigoryeva, Olga P.; Fainleib, Alexander M.; De Luca, Stefania
2018-03-01
A set of experimental data obtained by Fourier transform infrared (FTIR) spectroscopy (involving the use of samples ground and pressed with KBr, i.e. in a polar halide matrix) and by matrix-free transmission FTIR or diffuse reflectance infrared Fourier transform (DRIFT) spectroscopic methodologies (involving measurements of thin films or pure powdered samples, respectively) were compared for several different biomacromolecular substances. The samples under study included poly-3-hydroxybutyrate (PHB) isolated from cell biomass of the rhizobacterium Azospirillum brasilense; dry PHB-containing A. brasilense biomass; pectin (natural carboxylated heteropolysaccharide of plant origin; obtained from apple peel) as well as its chemically modified derivatives obtained by partial esterification of its galacturonide-chain hydroxyl moieties with palmitic, oleic and linoleic acids. Significant shifts of some FTIR vibrational bands related to polar functional groups of all the biomacromolecules under study, induced by the halide matrix used for preparing the samples for spectroscopic measurements, were shown and discussed. A polar halide matrix used for preparing samples for FTIR measurements was shown to be likely to affect band positions not only per se, by affecting band energies or via ion exchange (e.g., with carboxylate moieties), but also by inducing crystallisation of metastable amorphous biopolymers (e.g., PHB of microbial origin). The results obtained have important implications for correct structural analyses of polar, H-bonded and/or amphiphilic biomacromolecular systems using different methodologies of FTIR spectroscopy.
LC-MS based analysis of endogenous steroid hormones in human hair.
Gao, Wei; Kirschbaum, Clemens; Grass, Juliane; Stalder, Tobias
2016-09-01
The quantification of endogenous steroid hormone concentrations in hair is increasingly used as a method for obtaining retrospective information on long-term integrated hormone exposure. Several different analytical procedures have been employed for hair steroid analysis, with liquid chromatography-mass spectrometry (LC-MS) being recognized as a particularly powerful analytical tool. Several methodological aspects affect the performance of LC-MS systems for hair steroid analysis, including sample preparation and pretreatment, steroid extraction, post-incubation purification, LC methodology, ionization techniques and MS specifications. Here, we critically review the differential value of such protocol variants for hair steroid hormones analysis, focusing on both analytical quality and practical feasibility issues. Our results show that, when methodological challenges are adequately addressed, LC-MS protocols can not only yield excellent sensitivity and specificity but are also characterized by relatively simple sample processing and short run times. This makes LC-MS based hair steroid protocols particularly suitable as a high-quality option for routine application in research contexts requiring the processing of larger numbers of samples. Copyright © 2016 Elsevier Ltd. All rights reserved.
Pilot test of new roadside survey methodology for impaired driving
DOT National Transportation Integrated Search
2007-01-01
This study developed and tested procedures to enhance roadside survey procedures to include collecting and analyzing oral fluid and blood samples from the nighttime weekend driving population. Roadside surveys involve collecting information from a ra...
Monge, Susana; Ronda, Elena; Pons-Vigués, Mariona; Vives Cases, Carmen; Malmusi, Davide; Gil-González, Diana
2015-01-01
Our objective was to describe the methodological limitations and recommendations identified by authors of original articles on immigration and health in Spain. A literature review was conducted of original articles published in Spanish or English between 1998 and 2012 combining keywords on immigration and health. A total of 311 articles were included; of these, 176 (56.6%) mentioned limitations, and 15 (4.8%) made recommendations. The most frequently mentioned limitations included the following: reduced sample sizes; internal validity and sample representativeness issues, with under- or overrepresentation of specific groups; problems of validity of the collected information and missing data mostly related to measurement tools; and absence of key variables for adjustment or stratification. Based on these results, a series of recommendations are proposed to minimise common limitations and advance the quality of scientific production on immigration and health in our setting. Copyright © 2015 SESPAS. Published by Elsevier Espana. All rights reserved.
Suhonen, Riitta; Stolt, Minna; Katajisto, Jouko; Leino-Kilpi, Helena
2015-12-01
To report a review of the quality of sampling, sample and data collection procedures in empirical nursing research on ethical climate in which nurses were informants. Surveys are needed to obtain generalisable information about topics sensitive to nursing. The methodological quality of the studies is of key concern, especially the description of sampling and data collection procedures. Methodological literature review. Using the electronic MEDLINE database, empirical nursing research articles focusing on ethical climate were accessed in 2013 (earliest-22 November 2013). Using the search terms 'ethical' AND ('climate*' OR 'environment*') AND ('nurse*' OR 'nursing'), 376 citations were retrieved. Based on a four-phase retrieval process, 26 studies were included in the detailed analysis. The sampling method was reported in 58% of the studies, and it was random in a minority (26%). The target sample and its size were identified in 92% of the studies, whereas justification for the sample size was less often given. In over two-thirds (69%) of the studies with an identifiable response rate, it was below 75%. A variety of data collection procedures were used, with a large amount of missing data about who distributed, recruited and collected the questionnaires. Methods to increase response rates were seldom described. Discussion of nonresponse, representativeness of the sample and generalisability of the results was missing in many studies. This review highlights the methodological challenges and developments that need to be considered to ensure the use of valid information in developing health care through research findings. © 2015 Nordic College of Caring Science.
Taccolini Manzoni, Ana Carolina; Bastos de Oliveira, Naiane Teixeira; Nunes Cabral, Cristina Maria; Aquaroni Ricci, Natalia
2018-02-05
The aim of this systematic review was to investigate the role of therapeutic alliance in pain relief in patients with musculoskeletal disorders treated by physiotherapy. Manual and database searches (Medline, Embase, ISI Web of Knowledge, CINAHL, PEDro, Lilacs, Cochrane Library, and PsycINFO) were performed with no restrictions of language and publication date. We included prospective studies with samples of patients undergoing physiotherapy for musculoskeletal conditions, with one measure of therapeutic alliance and the outcome pain. Methodological quality was assessed by the Methodological Index for Nonrandomized Studies and the Cochrane tool for risk of bias. Six articles from four studies were included out of the 936 manuscripts identified. All studies used samples composed of patients with chronic low back pain. Two studies applied therapeutic alliance incentive measures during treatment and reported significant improvement in pain. The remaining studies, without alliance incentives, showed divergence regarding the relationship between the therapeutic alliance and pain. Methodological quality analysis determined low risk of bias of the studies. A lack of studies on the therapeutic alliance regarding musculoskeletal physiotherapy was verified. Existing studies fail to provide evidence of a strong relationship between the therapeutic alliance and pain relief.
Ho, Robin S T; Wu, Xinyin; Yuan, Jinqiu; Liu, Siya; Lai, Xin; Wong, Samuel Y S; Chung, Vincent C H
2015-01-08
Meta-analysis (MA) of randomised trials is considered to be one of the best approaches for summarising high-quality evidence on the efficacy and safety of treatments. However, methodological flaws in MAs can reduce the validity of conclusions, subsequently impairing the quality of decision making. To assess the methodological quality of MAs on COPD treatments. A cross-sectional study on MAs of COPD trials. MAs published during 2000-2013 were sampled from the Cochrane Database of Systematic Reviews and Database of Abstracts of Reviews of Effect. Methodological quality was assessed using the validated AMSTAR (Assessing the Methodological Quality of Systematic Reviews) tool. Seventy-nine MAs were sampled. Only 18% considered the scientific quality of primary studies when formulating conclusions and 49% used appropriate meta-analytic methods to combine findings. The problems were particularly acute among MAs on pharmacological treatments. In 48% of MAs the authors did not report conflict of interest. Fifty-eight percent reported harmful effects of treatment. Publication bias was not assessed in 65% of MAs, and only 10% had searched non-English databases. The methodological quality of the included MAs was disappointing. Consideration of scientific quality when formulating conclusions should be made explicit. Future MAs should improve on reporting conflict of interest and harm, assessment of publication bias, prevention of language bias and use of appropriate meta-analytic methods.
Ho, Robin ST; Wu, Xinyin; Yuan, Jinqiu; Liu, Siya; Lai, Xin; Wong, Samuel YS; Chung, Vincent CH
2015-01-01
Background: Meta-analysis (MA) of randomised trials is considered to be one of the best approaches for summarising high-quality evidence on the efficacy and safety of treatments. However, methodological flaws in MAs can reduce the validity of conclusions, subsequently impairing the quality of decision making. Aims: To assess the methodological quality of MAs on COPD treatments. Methods: A cross-sectional study on MAs of COPD trials. MAs published during 2000–2013 were sampled from the Cochrane Database of Systematic Reviews and Database of Abstracts of Reviews of Effect. Methodological quality was assessed using the validated AMSTAR (Assessing the Methodological Quality of Systematic Reviews) tool. Results: Seventy-nine MAs were sampled. Only 18% considered the scientific quality of primary studies when formulating conclusions and 49% used appropriate meta-analytic methods to combine findings. The problems were particularly acute among MAs on pharmacological treatments. In 48% of MAs the authors did not report conflict of interest. Fifty-eight percent reported harmful effects of treatment. Publication bias was not assessed in 65% of MAs, and only 10% had searched non-English databases. Conclusions: The methodological quality of the included MAs was disappointing. Consideration of scientific quality when formulating conclusions should be made explicit. Future MAs should improve on reporting conflict of interest and harm, assessment of publication bias, prevention of language bias and use of appropriate meta-analytic methods. PMID:25569783
30 CFR 780.21 - Hydrologic information.
Code of Federal Regulations, 2010 CFR
2010-07-01
... contain information on water availability and alternative water sources, including the suitability of...) flooding or streamflow alteration; (D) ground water and surface water availability; and (E) other... Hydrologic information. (a) Sampling and analysis methodology. All water-quality analyses performed to meet...
Steuten, Lotte; van de Wetering, Gijs; Groothuis-Oudshoorn, Karin; Retèl, Valesca
2013-01-01
This article provides a systematic and critical review of the evolving methods and applications of value of information (VOI) in academia and practice and discusses where future research needs to be directed. Published VOI studies were identified by conducting a computerized search on Scopus and ISI Web of Science from 1980 until December 2011 using pre-specified search terms. Only full-text papers that outlined and discussed VOI methods for medical decision making, and studies that applied VOI and explicitly discussed the results with a view to informing healthcare decision makers, were included. The included papers were divided into methodological and applied papers, based on the aim of the study. A total of 118 papers were included, of which 50% (n = 59) are methodological. A rapidly accumulating literature base on VOI from 1999 onwards for methodological papers and from 2005 onwards for applied papers is observed. Expected value of sample information (EVSI) is the preferred method of VOI to inform decision making regarding specific future studies, but real-life applications of EVSI remain scarce. Methodological challenges to VOI are numerous and include the high computational demands, dealing with non-linear models and interdependency between parameters, estimations of effective time horizons and patient populations, and structural uncertainties. VOI analysis receives increasing attention in both the methodological and the applied literature bases, but challenges to applying VOI in real-life decision making remain. For many technical and methodological challenges to VOI, analytic solutions have been proposed in the literature, including leaner methods for VOI. Further research should also focus on the needs of decision makers regarding VOI.
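The EVSI methods surveyed above build on a simpler Monte Carlo idea: the expected value of perfect information (EVPI), the gap between deciding with and without uncertainty resolved. A minimal sketch follows; the net-benefit models and all numbers are purely illustrative, not drawn from any reviewed study.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical net-benefit models for two treatments. Treatment A depends
# on an uncertain effectiveness parameter theta; treatment B is assumed known.
theta = rng.normal(0.6, 0.1, n)              # uncertain effectiveness of A
nb_a = 20_000 * theta - 10_000               # net benefit of A per draw
nb_b = 20_000 * 0.55 - 8_000 + np.zeros(n)   # net benefit of B (constant)

# Deciding now: choose the option with the best *expected* net benefit.
value_current = max(nb_a.mean(), nb_b.mean())

# With perfect information: choose the best option for *each* draw.
value_perfect = np.maximum(nb_a, nb_b).mean()

evpi = value_perfect - value_current  # upper bound on the value of any study
print(f"EVPI per patient: {evpi:.0f}")
```

EVPI is always non-negative and bounds from above what any sample information (EVSI) could be worth, which is why it is often computed first as a screening step.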
North, Carol S
2005-01-01
Several methodological issues may affect the findings of studies of the mental health effects of disasters over time. These issues include analysis of the course of individual disorders over time that may be lost when they are presented embedded in general summary statistics, consideration of assessment of psychiatric disorders versus symptoms, adherence to established criteria in assigning psychiatric diagnoses, and orientation of mental health issues to the type of disaster exposure of the sample. This report will explore these methodological issues in a review of disaster literature and in data obtained from study of survivors of the Oklahoma City bombing. Clinical implications of the data obtained from the Oklahoma City bombing study of survivors of the direct bomb blast are presented in the context of these methodological concerns.
Sampling maternal care behaviour in domestic dogs: What's the best approach?
Czerwinski, Veronika H; Smith, Bradley P; Hynd, Philip I; Hazel, Susan J
2017-07-01
Our understanding of the frequency and duration of maternal care behaviours in the domestic dog during the first two postnatal weeks is limited, largely due to the inconsistencies in the sampling methodologies that have been employed. In order to develop a more concise picture of maternal care behaviour during this period, and to help establish the sampling method that represents these behaviours best, we compared a variety of time sampling methods. Six litters were continuously observed for a total of 96 h over postnatal days 3, 6, 9 and 12 (24 h per day). Frequent (dam presence, nursing duration, contact duration) and infrequent maternal behaviours (anogenital licking duration and frequency) were coded using five different time sampling methods: 12-h night (1800-0600 h), 12-h day (0600-1800 h), one hour during the night (1800-0600 h), one hour during the day (0600-1800 h), and one hour at any time. Each one-hour time sampling method consisted of four randomly chosen 15-min periods. Two random sets of four 15-min periods were also analysed to ensure reliability. We then determined which of the time sampling methods, averaged over the three 24-h periods, best represented the frequency and duration of behaviours. As might be expected, frequently occurring behaviours were adequately represented by short (one-hour) sampling periods; however, this was not the case with the infrequent behaviours. Thus, we argue that the time sampling methodology employed must match the behaviour of interest. This caution applies to maternal behaviour in altricial species, such as canids, as well as to all systematic behavioural observations utilising time sampling methodology. Copyright © 2017. Published by Elsevier B.V.
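The trade-off described above, that short sampling windows represent frequent behaviours adequately but miss rare ones, can be illustrated with a small simulation. All rates and window sizes here are invented for illustration, not taken from the study.

```python
import random

random.seed(1)

# One synthetic 24-h observation day in 1-min bins: a frequent behaviour
# (e.g. dam-pup contact, ~50% of bins) and an infrequent one
# (e.g. anogenital licking, ~1% of bins). Rates are purely illustrative.
MIN_PER_DAY = 24 * 60
frequent   = [random.random() < 0.50 for _ in range(MIN_PER_DAY)]
infrequent = [random.random() < 0.01 for _ in range(MIN_PER_DAY)]

def estimate(behaviour, n_windows=4, window=15):
    """Estimate % of time spent in a behaviour from n random 15-min windows."""
    bins = []
    for _ in range(n_windows):
        start = random.randrange(MIN_PER_DAY - window)
        bins.extend(behaviour[start:start + window])
    return 100 * sum(bins) / len(bins)

# Ground truth from continuous observation vs. the one-hour sampling estimate.
true_freq = 100 * sum(frequent) / MIN_PER_DAY
true_infreq = 100 * sum(infrequent) / MIN_PER_DAY
print(estimate(frequent), "vs true", true_freq)
print(estimate(infrequent), "vs true", true_infreq)
```

Rerunning `estimate(infrequent)` typically swings between 0% and several times the true rate, while `estimate(frequent)` stays close to 50%, mirroring the paper's conclusion that the sampling scheme must match the behaviour of interest.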
Fabregat-Cabello, Neus; Castillo, Ángel; Sancho, Juan V; González, Florenci V; Roig-Navarro, Antoni Francesc
2013-08-02
In this work we have developed and validated an accurate and fast methodology for the determination of 4-nonylphenol (technical mixture) in complex-matrix water samples by UHPLC-ESI-MS/MS. The procedure is based on isotope dilution mass spectrometry (IDMS) in combination with isotope pattern deconvolution (IPD), which provides the concentration of the analyte directly from the spiked sample without requiring any methodological calibration graph. To avoid any possible isotopic effect during the analytical procedure, the in-house synthesized (13)C1-4-(3,6-dimethyl-3-heptyl)phenol was used as the labeled compound. This proposed surrogate was able to compensate for the matrix effect even in wastewater samples. An SPE pre-concentration step, together with exhaustive efforts to avoid contamination, was included to reach the signal-to-noise ratio necessary to detect the endogenous concentrations present in environmental samples. Calculations were performed acquiring only three transitions, achieving limits of detection lower than 100 ng/g for all water matrices assayed. Recoveries within 83-108% and coefficients of variation ranging from 1.5% to 9% were obtained. By contrast, a considerable overestimation was obtained with the most usual classical calibration procedure using 4-n-nonylphenol as internal standard, demonstrating the suitability of the minimal labeling approach. Copyright © 2013 Elsevier B.V. All rights reserved.
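The core of the IDMS/IPD approach above is that the spiked sample's isotope pattern is a linear mixture of the natural and labelled patterns, so the molar fractions can be recovered by least squares with no calibration graph. A minimal sketch, with all abundances and spike amounts hypothetical (not the paper's measured values):

```python
import numpy as np

# Hypothetical relative isotope abundances over M, M+1, M+2 (illustrative).
natural  = np.array([0.90, 0.09, 0.01])  # unlabelled analyte
labelled = np.array([0.01, 0.90, 0.09])  # 13C1-labelled spike

# The spiked sample's observed pattern is a mixture of the two.
x_true = np.array([0.30, 0.70])          # molar fractions (analyte, spike)
observed = x_true[0] * natural + x_true[1] * labelled

# Isotope pattern deconvolution: recover molar fractions by least squares.
A = np.column_stack([natural, labelled])
x_hat, *_ = np.linalg.lstsq(A, observed, rcond=None)

# The analyte amount then follows from the known amount of spike added.
n_spike = 2.0                            # nmol of labelled spike (hypothetical)
n_analyte = n_spike * x_hat[0] / x_hat[1]
print(x_hat, n_analyte)
```

In practice the observed pattern carries measurement noise, so the least-squares fit is over more isotopologue channels than unknowns; the noise-free version here just shows the algebra.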
Weuve, Jennifer; Proust-Lima, Cécile; Power, Melinda C.; Gross, Alden L.; Hofer, Scott M.; Thiébaut, Rodolphe; Chêne, Geneviève; Glymour, M. Maria; Dufouil, Carole
2015-01-01
Clinical and population research on dementia and related neurologic conditions, including Alzheimer’s disease, faces several unique methodological challenges. Progress to identify preventive and therapeutic strategies rests on valid and rigorous analytic approaches, but the research literature reflects little consensus on “best practices.” We present findings from a large scientific working group on research methods for clinical and population studies of dementia, which identified five categories of methodological challenges as follows: (1) attrition/sample selection, including selective survival; (2) measurement, including uncertainty in diagnostic criteria, measurement error in neuropsychological assessments, and practice or retest effects; (3) specification of longitudinal models when participants are followed for months, years, or even decades; (4) time-varying measurements; and (5) high-dimensional data. We explain why each challenge is important in dementia research and how it could compromise the translation of research findings into effective prevention or care strategies. We advance a checklist of potential sources of bias that should be routinely addressed when reporting dementia research. PMID:26397878
Cwikel, Julie; Hoban, Elizabeth
2005-11-01
The trafficking of women and children for work in the globalized sex industry is a global social problem. Quality data is needed to provide a basis for legislation, policy, and programs, but first, numerous research design, ethical, and methodological problems must be addressed. Research design issues in studying women trafficked for sex work (WTSW) include how to (a) develop coalitions to fund and support research, (b) maintain a critical stance on prostitution, and therefore WTSW (c) use multiple paradigms and methods to accurately reflect WTSW's reality, (d) present the purpose of the study, and (e) protect respondents' identities. Ethical issues include (a) complications with informed consent procedures, (b) problematic access to WTSW (c) loss of WTSW to follow-up, (d) inability to intervene in illegal acts or human rights violations, and (e) the need to maintain trustworthiness as researchers. Methodological issues include (a) constructing representative samples, (b) managing media interest, and (c) handling incriminating materials about law enforcement and immigration.
Bill Raynor; Roger R. Bay
1993-01-01
Includes 19 papers presented at the workshop, covering such topics as sampling techniques and statistical considerations, indigenous agricultural and agroforestry systems, crop testing and evaluation, and agroforestry practices in the Pacific Islands, including Micronesia, Northern Marianas Islands, Palau, and American Samoa.
2011-02-01
Excerpt from the report's list of figures: Figure 21, Stress-Strain Curve of Expanded Polystyrene Insulation Foam Samples; Figure 22, Stress-Strain Curve of Polyisocyanurate Insulation Foam Samples; Figure 23, Stress-Strain Curve of Extruded Expanded Polystyrene Insulation Foam Samples; used for modeling (Naito et al. 2009a). Insulating foams included expanded polystyrene (EPS), extruded expanded polystyrene (XPS), and polyisocyanurate.
A critical methodological review of discourse and conversation analysis studies of family therapy.
Tseliou, Eleftheria
2013-12-01
Discourse (DA) and conversation (CA) analysis, two qualitative research methods, have been recently suggested as potentially promising for the study of family therapy due to common epistemological adherences and their potential for an in situ study of therapeutic dialog. However, to date, there is no systematic methodological review of the few existing DA and CA studies of family therapy. This study aims at addressing this lack by critically reviewing published DA and CA studies of family therapy on methodological grounds. Twenty-eight articles in total are reviewed in relation to certain methodological axes identified in the relevant literature. These include choice of method, framing of research question(s), data/sampling, type of analysis, epistemological perspective, content/type of knowledge claims, and attendance to criteria for good quality practice. It is argued that the reviewed studies show "glimpses" of the methods' potential for family therapy research despite the identification of certain "shortcomings" regarding their methodological rigor. These include unclearly framed research questions and the predominance of case study designs. They also include inconsistencies between choice of method, stated or unstated epistemological orientations and knowledge claims, and limited attendance to criteria for good quality practice. In conclusion, it is argued that DA and CA can add to the existing quantitative and qualitative methods for family therapy research. They can both offer unique ways for a detailed study of the actual therapeutic dialog, provided that future attempts strive for a methodologically rigorous practice and against their uncritical deployment. © FPI, Inc.
Sosa-Ferrera, Zoraida; Mahugo-Santana, Cristina; Santana-Rodríguez, José Juan
2013-01-01
Endocrine-disruptor compounds (EDCs) can mimic natural hormones and produce adverse effects in the endocrine functions by interacting with estrogen receptors. EDCs include both natural and synthetic chemicals, such as hormones, personal care products, surfactants, and flame retardants, among others. EDCs are characterised by their ubiquitous presence at trace-level concentrations and their wide diversity. Since the discovery of the adverse effects of these pollutants on wildlife and human health, analytical methods have been developed for their qualitative and quantitative determination. In particular, mass-based analytical methods show excellent sensitivity and precision for their quantification. This paper reviews recently published analytical methodologies for the sample preparation and for the determination of these compounds in different environmental and biological matrices by liquid chromatography coupled with mass spectrometry. The various sample preparation techniques are compared and discussed. In addition, recent developments and advances in this field are presented. PMID:23738329
Dawson, Paul R.; Boyce, Donald E.; Park, Jun-Sang; ...
2017-10-15
A robust methodology is presented to extract slip system strengths from lattice strain distributions for polycrystalline samples obtained from high-energy x-ray diffraction (HEXD) experiments with in situ loading. The methodology consists of matching the evolution of coefficients of a harmonic expansion of the distributions from simulation to the coefficients derived from measurements. Simulation results are generated via finite element simulations of virtual polycrystals that are subjected to the loading history applied in the HEXD experiments. Advantages of the methodology include: (1) its ability to utilize extensive data sets generated by HEXD experiments; (2) its ability to capture trends in distributions that may be noisy (both measured and simulated); and (3) its sensitivity to the ratios of the family strengths. The approach is used to evaluate the slip system strengths of Ti-6Al-4V using samples having relatively equiaxed grains. These strength estimates are compared to values in the literature.
Holt, Martin; de Wit, John; Brown, Graham; Maycock, Bruce; Fairley, Christopher; Prestage, Garrett
2014-01-01
Background Behavioural surveillance and research among gay and other men who have sex with men (GMSM) commonly rely on non-random recruitment approaches. Methodological challenges limit their ability to accurately represent the population of adult GMSM. We compared the social and behavioural profiles of GMSM recruited via venue-based, online, and respondent-driven sampling (RDS) and discussed their utility for behavioural surveillance. Methods Data from four studies were selected to reflect each recruitment method. We compared demographic characteristics and the prevalence of key indicators, including sexual and HIV testing practices, obtained from samples recruited through different methods, and population estimates from respondent-driven sampling partition analysis. Results Overall, the socio-demographic profile of GMSM was similar across samples, with some differences observed in age and sexual identification. Men recruited through time-location sampling appeared more connected to the gay community and reported a greater number of sexual partners, but engaged in less unprotected anal intercourse with regular (UAIR) or casual partners (UAIC). The RDS sample overestimated the proportion of HIV-positive men and appeared to recruit men with an overall higher number of sexual partners. A single-website survey recruited a sample whose characteristics differed considerably from the population estimates with regard to age, ethnic diversity and behaviour. Data acquired through time-location sampling underestimated the rates of UAIR and UAIC, while RDS and online sampling both generated samples that underestimated UAIR. Simulated composite samples combining recruits from time-location and multi-website online sampling may produce characteristics more consistent with the population estimates, particularly with regard to sexual practices.
Conclusion Respondent-driven sampling produced the sample that was most consistent with population estimates, but this methodology is complex and logistically demanding. Time-location and online recruitment are more cost-effective and easier to implement; using these approaches in combination may offer the potential to recruit a more representative sample of GMSM. PMID:25409440
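The respondent-driven sampling estimates discussed above typically weight each recruit by the inverse of their reported network size, correcting for the over-sampling of well-connected men. A minimal sketch of one common such estimator (RDS-II, the Volz-Heckathorn form), with entirely hypothetical data rather than anything from the studies compared here:

```python
# Hypothetical recruits: (reported network size, tested for HIV recently?).
recruits = [
    (50, True), (40, True), (10, False), (5, False), (8, True), (4, False),
]

# RDS-II: weight each recruit by 1/degree, so well-connected (and hence
# over-recruited) men count for less in the prevalence estimate.
num = sum((1 / d) * y for d, y in recruits)
den = sum(1 / d for d, y in recruits)
rds_estimate = num / den

# Unweighted proportion for comparison.
naive = sum(y for _, y in recruits) / len(recruits)
print(f"naive: {naive:.2f}, RDS-II: {rds_estimate:.2f}")
```

In this toy data the men reporting the indicator also report large networks, so the degree-weighted estimate falls well below the naive proportion, illustrating how raw RDS counts can overstate the prevalence of traits concentrated among highly connected recruits.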
Zablotska, Iryna B; Frankland, Andrew; Holt, Martin; de Wit, John; Brown, Graham; Maycock, Bruce; Fairley, Christopher; Prestage, Garrett
2014-01-01
Behavioural surveillance and research among gay and other men who have sex with men (GMSM) commonly rely on non-random recruitment approaches. Methodological challenges limit their ability to accurately represent the population of adult GMSM. We compared the social and behavioural profiles of GMSM recruited via venue-based, online, and respondent-driven sampling (RDS) and discussed their utility for behavioural surveillance. Data from four studies were selected to reflect each recruitment method. We compared demographic characteristics and the prevalence of key indicators, including sexual and HIV testing practices, obtained from samples recruited through different methods, and population estimates from respondent-driven sampling partition analysis. Overall, the socio-demographic profile of GMSM was similar across samples, with some differences observed in age and sexual identification. Men recruited through time-location sampling appeared more connected to the gay community and reported a greater number of sexual partners, but engaged in less unprotected anal intercourse with regular (UAIR) or casual partners (UAIC). The RDS sample overestimated the proportion of HIV-positive men and appeared to recruit men with an overall higher number of sexual partners. A single-website survey recruited a sample whose characteristics differed considerably from the population estimates with regard to age, ethnic diversity and behaviour. Data acquired through time-location sampling underestimated the rates of UAIR and UAIC, while RDS and online sampling both generated samples that underestimated UAIR. Simulated composite samples combining recruits from time-location and multi-website online sampling may produce characteristics more consistent with the population estimates, particularly with regard to sexual practices. Respondent-driven sampling produced the sample that was most consistent with population estimates, but this methodology is complex and logistically demanding.
Time-location and online recruitment are more cost-effective and easier to implement; using these approaches in combination may offer the potential to recruit a more representative sample of GMSM.
Effects of MicroCAD on Learning Fundamental Engineering Graphical Concepts: A Qualitative Study.
ERIC Educational Resources Information Center
Leach, James A.; Gull, Randall L.
1990-01-01
Students' reactions and performances were examined when taught engineering geometry concepts using a standard microcomputer-aided drafting software package. Two sample groups were compared based on their computer experience. Included are the methodology, data analysis, and conclusions. (KR)
The Multigroup Multilevel Categorical Latent Growth Curve Models
ERIC Educational Resources Information Center
Hung, Lai-Fa
2010-01-01
Longitudinal data describe developmental patterns and enable predictions of individual changes beyond sampled time points. Major methodological issues in longitudinal data include modeling random effects, subject effects, growth curve parameters, and autoregressive residuals. This study embedded the longitudinal model within a multigroup…
How completely are physiotherapy interventions described in reports of randomised trials?
Yamato, Tiê P; Maher, Chris G; Saragiotto, Bruno T; Hoffmann, Tammy C; Moseley, Anne M
2016-06-01
Incomplete descriptions of interventions are a common problem in reports of randomised controlled trials. To date no study has evaluated the completeness of the descriptions of physiotherapy interventions. To evaluate the completeness of the descriptions of physiotherapy interventions in a random sample of reports of randomised controlled trials (RCTs). A random sample of 200 reports of RCTs from the PEDro database. We included full-text papers, written in English, reporting trials with two arms, and evaluating any type of physiotherapy intervention and subdiscipline. Methodological quality was evaluated using the PEDro scale, and completeness of intervention description using the Template for Intervention Description and Replication (TIDieR) checklist. The proportion and 95% confidence interval were calculated for intervention and control groups, and used to examine the relationship between completeness, methodological quality, and subdiscipline. Completeness of intervention reporting in physiotherapy RCTs was poor. For intervention groups, 46 (23%) trials did not describe at least half of the items. Reporting was worse for control groups: 149 (75%) trials described less than half of the items. There was no clear difference in completeness across subdisciplines or methodological quality. Our sample was restricted to trials published in English in 2013. Descriptions of interventions in physiotherapy RCTs are typically incomplete. Authors and journals should aim for more complete descriptions of interventions in physiotherapy trials. Copyright © 2016 Chartered Society of Physiotherapy. Published by Elsevier Ltd. All rights reserved.
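The proportions reported above come with 95% confidence intervals. One common way to compute such an interval for a proportion is the Wilson score interval, sketched here using the abstract's 46-of-200 figure; the choice of the Wilson form (rather than whatever method the authors used) is an assumption for illustration.

```python
from math import sqrt

def wilson_ci(k, n, z=1.96):
    """95% Wilson score interval for a proportion k/n."""
    p = k / n
    centre = (p + z**2 / (2 * n)) / (1 + z**2 / n)
    half = (z / (1 + z**2 / n)) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# 46 of 200 trials failing to describe at least half of the TIDieR items.
lo, hi = wilson_ci(46, 200)
print(f"23% (95% CI {lo:.1%} to {hi:.1%})")
```

Unlike the simple normal-approximation interval, the Wilson interval never strays outside [0, 1], which matters for the near-zero or near-one proportions common in reporting-quality audits.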
2013-01-01
Background Insomnia is a widespread human health problem, but the conventional therapies currently available have limitations. Suanzaoren decoction (SZRD) is a well-known classic Chinese herbal prescription that has been used to treat insomnia for more than a thousand years. The objective of this study was to evaluate the efficacy and safety of SZRD for insomnia. Methods A systematic literature search of 6 databases was performed up to July 2012 to identify randomized controlled trials (RCTs) involving SZRD for insomniac patients. The methodological quality of the RCTs was assessed independently using the Cochrane Handbook for Systematic Reviews of Interventions. Results Twelve RCTs with a total of 1376 adult participants were identified. The methodological quality of all included trials scored no more than 3/8. The majority of the RCTs concluded that SZRD was significantly more effective than benzodiazepines for treating insomnia. Despite these positive outcomes, there were many methodological shortcomings in the studies reviewed, including insufficient information about randomization generation and absence of allocation concealment, lack of blinding and no placebo control, absence of intention-to-treat analysis and lack of follow-up, selective publishing and reporting, and small sample sizes. Sources of clinical heterogeneity, such as diagnosis, intervention, control, and outcome measures, were also reviewed. Only 3 trials reported adverse events, whereas the other 9 trials did not provide safety information. Conclusions Despite the apparent reported positive findings, there is insufficient evidence to support the efficacy of SZRD for insomnia owing to the poor methodological quality and the small number of included trials. SZRD seems generally safe, but there is insufficient evidence to draw conclusions on safety because few studies reported adverse events. Further large-sample, well-designed RCTs are needed.
PMID:23336848
Adolphus, Katie; Bellissimo, Nick; Lawton, Clare L; Ford, Nikki A; Rains, Tia M; Totosy de Zepetnek, Julia; Dye, Louise
2017-01-01
Breakfast is purported to confer a number of benefits on diet quality, health, appetite regulation, and cognitive performance. However, new evidence has challenged the long-held belief that breakfast is the most important meal of the day. This review aims to provide a comprehensive discussion of the key methodological challenges and considerations in studies assessing the effect of breakfast on cognitive performance and appetite control, along with recommendations for future research. This review focuses on the myriad challenges involved in studying children and adolescents specifically. Key methodological challenges and considerations include study design and location, sampling and sample selection, choice of objective cognitive tests, choice of objective and subjective appetite measures, merits of providing a fixed breakfast compared with ad libitum, assessment and definition of habitual breakfast consumption, transparency of treatment condition, difficulty of isolating the direct effects of breakfast consumption, untangling acute and chronic effects, and influence of confounding variables. These methodological challenges have hampered a clear substantiation of the potential positive effects of breakfast on cognition and appetite control and contributed to the debate questioning the notion that breakfast is the most important meal of the day. © 2017 American Society for Nutrition.
Bellissimo, Nick; Ford, Nikki A; Rains, Tia M
2017-01-01
Breakfast is purported to confer a number of benefits on diet quality, health, appetite regulation, and cognitive performance. However, new evidence has challenged the long-held belief that breakfast is the most important meal of the day. This review aims to provide a comprehensive discussion of the key methodological challenges and considerations in studies assessing the effect of breakfast on cognitive performance and appetite control, along with recommendations for future research. This review focuses on the myriad challenges involved in studying children and adolescents specifically. Key methodological challenges and considerations include study design and location, sampling and sample selection, choice of objective cognitive tests, choice of objective and subjective appetite measures, merits of providing a fixed breakfast compared with ad libitum, assessment and definition of habitual breakfast consumption, transparency of treatment condition, difficulty of isolating the direct effects of breakfast consumption, untangling acute and chronic effects, and influence of confounding variables. These methodological challenges have hampered a clear substantiation of the potential positive effects of breakfast on cognition and appetite control and contributed to the debate questioning the notion that breakfast is the most important meal of the day. PMID:28096143
French, Deborah; Smith, Andrew; Powers, Martin P; Wu, Alan H B
2011-08-17
Binding of a ligand to the epidermal growth factor receptor (EGFR) stimulates various intracellular signaling pathways resulting in cell cycle progression, proliferation, angiogenesis and inhibition of apoptosis. KRAS is involved in signaling pathways including RAF/MAPK and PI3K, and mutations in this gene result in constitutive activation of these pathways, independent of EGFR activation. Seven mutations in codons 12 and 13 of KRAS comprise around 95% of the observed human mutations and render monoclonal antibodies against EGFR (e.g. cetuximab and panitumumab) ineffective in the treatment of colorectal cancer. KRAS mutation testing by two different methodologies, Sanger sequencing and the AutoGenomics INFINITI® assay, was compared on DNA extracted from colorectal cancers. Of the 29 colorectal tumor samples tested, 28 were concordant between the two methodologies for the KRAS mutations detected in both assays; the INFINITI® assay detected a mutation in one sample that was indeterminate by Sanger sequencing and by a third methodology, single nucleotide primer extension. This study indicates the utility of the AutoGenomics INFINITI® methodology in a clinical laboratory setting where technical expertise or access to equipment for DNA sequencing does not exist. Copyright © 2011 Elsevier B.V. All rights reserved.
2005-06-20
methodologies and partnership projects developed under the ONR Effect of Sound in the Marine Environment (ESME) Program. The effort involved an integration...computational models to predict audiograms for these species. National Security These data will assist in designing effective noise mitigation measures and...includes marine species for which there are reliable hearing data as well as sample sources with appropriate distance effects in their renditions, including
Showmaker, Kurt; Lawrence, Gary W.; Lu, Shien; Balbalian, Clarissa; Klink, Vincent P.
2011-01-01
A quantitative PCR procedure targeting the β-tubulin gene determined the number of Rotylenchulus reniformis Linford & Oliveira 1940 in metagenomic DNA samples isolated from soil. Notably, quantification was achieved in the presence of other soil-dwelling plant-parasitic nematodes, including the sister genus Helicotylenchus Steiner, 1945. The methodology provides a framework for molecular diagnostics of nematodes from metagenomic DNA isolated directly from soil. PMID:22194958
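Quantitative PCR of this kind rests on a standard curve relating threshold cycle (Ct) to known target copy numbers, which is then inverted for unknown samples. A minimal sketch of that arithmetic, with invented numbers (not the study's β-tubulin data):

```python
# Sketch: estimating target copy number from qPCR Ct values via a standard
# curve. All calibration values below are illustrative.
import numpy as np

# Hypothetical standard curve: Ct measured for known log10 copy numbers.
log10_copies = np.array([2.0, 3.0, 4.0, 5.0, 6.0])
ct_values    = np.array([30.1, 26.8, 23.4, 20.0, 16.7])

# Fit Ct = slope * log10(copies) + intercept.
slope, intercept = np.polyfit(log10_copies, ct_values, 1)

# Amplification efficiency: E = 10^(-1/slope) - 1 (1.0 means 100%).
efficiency = 10 ** (-1.0 / slope) - 1.0

def copies_from_ct(ct):
    """Invert the standard curve to estimate copies in an unknown sample."""
    return 10 ** ((ct - intercept) / slope)

unknown_ct = 24.9
print(f"slope={slope:.2f}, efficiency={efficiency:.2%}")
print(f"estimated copies: {copies_from_ct(unknown_ct):.0f}")
```

A slope near -3.32 (efficiency near 100%) is the usual sanity check before trusting the inversion.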
Carlsson, Ing-Marie; Blomqvist, Marjut; Jormfeldt, Henrika
2017-01-01
Undertaking research studies in the field of mental health is essential in mental health nursing. Qualitative research methodologies enable human experiences to become visible and recognize the importance of lived experiences. This paper argues that involving people with schizophrenia in research is critical to promote their health and well-being. The quality of qualitative research needs scrutinizing according to methodological issues such as trustworthiness and ethical standards that are a fundamental part of qualitative research and nursing curricula. The aim of this study was to critically review recent qualitative studies involving people with severe and persistent mental illness such as schizophrenia and other psychotic conditions, regarding descriptions of ethical and methodological issues in data collection and analysis. A search for relevant papers was conducted in three electronic databases, in December 2016. Fifteen qualitative interview studies were included and reviewed regarding methodological issues related to ethics, and data collection and analysis. The results revealed insufficient descriptions of methodology regarding ethical considerations and issues related to recruitment and sampling in qualitative interview studies with individuals with severe mental illness, putting trustworthiness at risk despite detailed descriptions of data analysis. Knowledge from the perspective of individuals with their own experience of mental illness is essential. Issues regarding sampling and trustworthiness in qualitative studies involving people with severe mental illness are vital to counteract the stigmatization of mental illness. PMID:28901217
Methodological integrative review of the work sampling technique used in nursing workload research.
Blay, Nicole; Duffield, Christine M; Gallagher, Robyn; Roche, Michael
2014-11-01
To critically review the work sampling technique used in nursing workload research. Work sampling is a technique frequently used by researchers and managers to explore and measure nursing activities. However, the work sampling methods used are diverse, making comparison of results between studies difficult. Methodological integrative review. Four electronic databases were systematically searched for peer-reviewed articles published between 2002 and 2012. Manual scanning of reference lists and Rich Site Summary feeds from contemporary nursing journals were other sources of data. Articles published in the English language between 2002 and 2012 reporting on research which used work sampling to examine nursing workload. Eighteen articles were reviewed. The review identified that the work sampling technique lacks a standardized approach, which may have an impact on the sharing or comparison of results. Specific areas needing a shared understanding included the training of observers and subjects who self-report, standardization of the techniques used to assess observer inter-rater reliability, sampling methods and reporting of outcomes. Work sampling is a technique that can be used to explore the many facets of nursing work. Standardized reporting measures would enable greater comparison between studies and contribute to knowledge more effectively. Author suggestions for the reporting of results may act as guidelines for researchers considering work sampling as a research method. © 2014 John Wiley & Sons Ltd.
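The core arithmetic of work sampling is simple: the proportion of instantaneous observations in which an activity is seen estimates the proportion of time spent on it, with a binomial confidence interval. A minimal sketch with hypothetical observation counts (not data from any reviewed study):

```python
# Sketch: estimating time proportions per activity from work sampling
# observations, with a normal-approximation 95% confidence interval.
import math

observations = {          # activity -> number of instantaneous observations
    "direct care": 420,
    "documentation": 310,
    "medication": 150,
    "other": 120,
}
n = sum(observations.values())

for activity, count in observations.items():
    p = count / n                                  # estimated time share
    half_width = 1.96 * math.sqrt(p * (1 - p) / n)  # 95% CI half-width
    print(f"{activity}: {p:.1%} ± {half_width:.1%}")
```

The half-width formula also answers the design question of how many observations are needed for a target precision, one of the standardization gaps the review identifies.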
Methodological Issues in the Classification of Attention-Related Disorders.
ERIC Educational Resources Information Center
Fletcher, Jack M.; And Others
1991-01-01
For successful classification of children with attention deficit-hyperactivity disorder, major issues include (1) the need for explicit studies of identification criteria; (2) the need for systematic sampling strategies; (3) development of hypothetical classifications; and (4) systematic assessment of reliability and validity of hypothetical…
Yoga in the schools: a systematic review of the literature.
Serwacki, Michelle L; Cook-Cottone, Catherine
2012-01-01
The objective of this research was to examine the evidence for delivering yoga-based interventions in schools. An electronic literature search was conducted to identify peer-reviewed, published studies in which yoga and a meditative component (breathing practices or meditation) were taught to youths in a school setting. Pilot studies, single cohort, quasi-experimental, and randomized clinical trials were considered. Study quality was evaluated and summarized. Twelve published studies were identified. Samples for which yoga was implemented as an intervention included youths with autism, intellectual disability, learning disability, and emotional disturbance, as well as typically developing youths. Although effects of participating in school-based yoga programs appeared to be beneficial for the most part, methodological limitations, including lack of randomization, small samples, limited detail regarding the intervention, and statistical ambiguities, curtailed the ability to provide definitive conclusions or recommendations. Findings speak to the need for greater methodological rigor and an increased understanding of the mechanisms of success for school-based yoga interventions.
Magalhaes, Sandra; Banwell, Brenda; Bar-Or, Amit; Fortier, Isabel; Hanwell, Heather E; Lim, Ming; Matt, Georg E; Neuteboom, Rinze F; O'Riordan, David L; Schneider, Paul K; Pugliatti, Maura; Shatenstein, Bryna; Tansey, Catherine M; Wassmer, Evangeline; Wolfson, Christina
2018-06-01
While studying the etiology of multiple sclerosis (MS) in children has several methodological advantages over studying etiology in adults, studies are limited by small sample sizes. Using a rigorous methodological process, we developed the Pediatric MS Tool-Kit, a measurement framework that includes a minimal set of core variables to assess etiological risk factors. We solicited input from the International Pediatric MS Study Group to select three risk factors: environmental tobacco smoke (ETS) exposure, sun exposure, and vitamin D intake. To develop the Tool-Kit, we used a Delphi study involving a working group of epidemiologists, neurologists, and content experts from North America and Europe. The Tool-Kit includes six core variables to measure ETS, six to measure sun exposure, and six to measure vitamin D intake. The Tool-Kit can be accessed online ( www.maelstrom-research.org/mica/network/tool-kit ). The goals of the Tool-Kit are to enhance exposure measurement in newly designed pediatric MS studies and comparability of results across studies, and in the longer term to facilitate harmonization of studies, a methodological approach that can be used to circumvent issues of small sample sizes. We believe the Tool-Kit will prove to be a valuable resource to guide pediatric MS researchers in developing study-specific questionnaires.
GEOTHERMAL EFFLUENT SAMPLING WORKSHOP
This report outlines the major recommendations resulting from a workshop to identify gaps in existing geothermal effluent sampling methodologies, define needed research to fill those gaps, and recommend strategies to lead to a standardized sampling methodology.
Studies on the Presence of Mycotoxins in Biological Samples: An Overview
Escrivá, Laura; Font, Guillermina; Manyes, Lara
2017-01-01
Mycotoxins are fungal secondary metabolites with bioaccumulation levels leading to their carry-over into animal fluids, organs, and tissues. As a consequence, mycotoxin determination in biological samples from humans and animals has been reported worldwide. Since most mycotoxins show toxic effects at low concentrations and considering the extremely low levels present in biological samples, the application of reliable detection methods is required. This review summarizes the information regarding the studies involving mycotoxin determination in biological samples over the last 10 years. Relevant data on extraction methodology, detection techniques, sample size, limits of detection, and quantitation are presented herein. Briefly, liquid-liquid extraction followed by LC-MS/MS determination was the most common technique. The most analyzed mycotoxin was ochratoxin A, followed by zearalenone and deoxynivalenol—including their metabolites, enniatins, fumonisins, aflatoxins, T-2 and HT-2 toxins. Moreover, the studies were classified by their purpose, mainly focused on the development of analytical methodologies, mycotoxin biomonitoring, and exposure assessment. The study of tissue distribution, bioaccumulation, carry-over, persistence and transference of mycotoxins, as well as toxicokinetics and ADME (absorption, distribution, metabolism and excretion), were other proposed goals for biological sample analysis. Finally, an overview of risk assessment was discussed. PMID:28820481
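The limits of detection and quantitation that such method papers report are commonly derived from a calibration curve using the ICH convention LOD ≈ 3.3·σ/slope and LOQ ≈ 10·σ/slope. A sketch with invented calibration data, not values from any reviewed study:

```python
# Sketch: ICH-style LOD/LOQ estimation from a linear calibration curve,
# using the residual standard deviation of the fit as sigma.
import numpy as np

conc   = np.array([0.5, 1.0, 2.0, 5.0, 10.0])    # standards, ng/mL
signal = np.array([52., 98., 205., 510., 1003.])  # detector response

slope, intercept = np.polyfit(conc, signal, 1)
residuals = signal - (slope * conc + intercept)
sd = residuals.std(ddof=2)  # residual SD, n-2 degrees of freedom

lod = 3.3 * sd / slope   # limit of detection
loq = 10.0 * sd / slope  # limit of quantitation
print(f"LOD ≈ {lod:.2f} ng/mL, LOQ ≈ {loq:.2f} ng/mL")
```

Other LOD conventions exist (signal-to-noise, blank-based); which one a study used is exactly the kind of methodological detail the review compares.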
Understanding Microplastic Distribution: A Global Citizen Monitoring Effort
NASA Astrophysics Data System (ADS)
Barrows, A.
2016-02-01
Understanding the distribution and abundance of microplastics in the world's oceans will continue to help inform global law-making. By recruiting and training over 500 volunteers, our study has collected over 1000 samples from remote and populated areas worldwide. Samples include water collected at the sea surface and throughout the water column. Surface-to-depth sampling has provided insight into vertical plastic distribution. The development of unique field and laboratory methodology has enabled plastics to be quantified down to 50 µm. In 2015, the study expanded to include global freshwater systems. By understanding plastic patterns, distribution and concentration in large and small watersheds, we will better understand how freshwater systems are contributing to marine microplastic pollution.
Is there an "abortion trauma syndrome"? Critiquing the evidence.
Robinson, Gail Erlick; Stotland, Nada L; Russo, Nancy Felipe; Lang, Joan A; Occhiogrosso, Mallay
2009-01-01
The objective of this review is to identify and illustrate methodological issues in studies used to support claims that induced abortion results in an "abortion trauma syndrome" or a psychiatric disorder. After identifying key methodological issues to consider when evaluating such research, we illustrate these issues by critically examining recent empirical studies that are widely cited in legislative and judicial testimony in support of the existence of adverse psychiatric sequelae of induced abortion. Recent studies that have been used to assert a causal connection between abortion and subsequent mental disorders are marked by methodological problems that include, but are not limited to: poor sample and comparison group selection; inadequate conceptualization and control of relevant variables; poor quality and lack of clinical significance of outcome measures; inappropriateness of statistical analyses; and errors of interpretation, including misattribution of causal effects. By way of contrast, we review some recent major studies that avoid these methodological errors. The most consistent predictor of mental disorders after abortion remains preexisting disorders, which, in turn, are strongly associated with exposure to sexual abuse and intimate violence. Educating researchers, clinicians, and policymakers on how to appropriately assess the methodological quality of research about abortion outcomes is crucial. Further, methodologically sound research is needed to evaluate not only psychological outcomes of abortion, but also the impact of existing legislation and the effects of social attitudes and behaviors on women who have abortions.
Berkhout, Daniel J. C.; Benninga, Marc A.; van Stein, Ruby M.; Brinkman, Paul; Niemarkt, Hendrik J.; de Boer, Nanne K. H.; de Meij, Tim G. J.
2016-01-01
Prior to implementation of volatile organic compound (VOC) analysis in clinical practice, substantial challenges, including methodological, biological and analytical difficulties, must be faced. The aim of this study was to evaluate the influence of several sampling conditions and environmental factors on fecal VOC profiles, analyzed by an electronic nose (eNose). Effects of fecal sample mass, water content, duration of storage at room temperature, fecal sample temperature, number of freeze–thaw cycles and sampling method (rectal swabs vs. fecal samples) on VOC profiles were assessed by analysis of a total of 725 fecal samples by means of an eNose (Cyranose320®). Furthermore, fecal VOC profiles of a total of 1285 fecal samples from 71 infants born at three different hospitals were compared to assess the influence of center of origin on VOC outcome. We observed that all analyzed variables significantly influenced fecal VOC composition. It was feasible to capture a VOC profile using rectal swabs, although this differed significantly from fecal VOC profiles of similar subjects. In addition, the 1285 fecal VOC profiles could be significantly discriminated based on center of birth. In conclusion, standardization of methodology is necessary before fecal VOC analysis can live up to its potential as a diagnostic tool in clinical practice. PMID:27886068
ORGANIC COMPOUNDS IN SURFACE SEDIMENTS AND OYSTER TISSUES FROM THE CHESAPEAKE BAY. APPENDICES
Detailed in the first part of this report is a development and discussion of the methodology used to extract and analyze sediment and oyster tissue samples from Chesapeake Bay for organic compounds. The method includes extraction, fractionation, and subsequent analysis using glas...
Factors Influencing Self-Directed Career Management: An Integrative Investigation
ERIC Educational Resources Information Center
Park, Yongho
2009-01-01
Purpose: This paper aims to investigate the relationship between the protean career and other variables, including organizational learning climate, individual calling work orientation, and demographic variables. Design/methodology/approach: The research data were obtained from a sample consisting of 292 employees of two South Korean manufacturing…
Debrus, Benjamin; Lebrun, Pierre; Ceccato, Attilio; Caliaro, Gabriel; Rozet, Eric; Nistor, Iolanda; Oprean, Radu; Rupérez, Francisco J; Barbas, Coral; Boulanger, Bruno; Hubert, Philippe
2011-04-08
HPLC separations of an unknown sample mixture and a pharmaceutical formulation have been optimized using a recently developed chemometric methodology proposed by W. Dewé et al. in 2004 and improved by P. Lebrun et al. in 2008. This methodology is based on experimental designs which are used to model retention times of compounds of interest. Then, the prediction accuracy and the optimal separation robustness, including the uncertainty study, were evaluated. Finally, the design space (ICH Q8(R1) guideline) was computed as the probability for a criterion to lie in a selected range of acceptance. Furthermore, the chromatograms were automatically read. Peak detection and peak matching were carried out with a previously developed methodology using independent component analysis published by B. Debrus et al. in 2009. The present successful applications strengthen the high potential of these methodologies for the automated development of chromatographic methods. Copyright © 2011 Elsevier B.V. All rights reserved.
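The design-space computation described here, the probability that a criterion lies in an acceptance range, can be illustrated with a toy Monte Carlo: propagate uncertainty in fitted model parameters and count how often the criterion is met at each candidate operating condition. The model, parameters, and numbers below are invented for illustration and are not the authors' retention models:

```python
# Sketch: probability-based design space. For each operating condition,
# draw model parameters from their (assumed Gaussian) uncertainty and
# estimate P(resolution >= 1.5). The design space is the region where
# this probability exceeds a chosen threshold (e.g. 0.90).
import random

random.seed(0)

def predicted_resolution(ph, draws=2000):
    """Toy quadratic model of resolution vs mobile-phase pH."""
    successes = 0
    for _ in range(draws):
        a = random.gauss(2.0, 0.15)    # "fitted" peak resolution at pH 6
        b = random.gauss(-0.08, 0.01)  # "fitted" curvature term
        rs = a + b * (ph - 6.0) ** 2   # invented response surface
        if rs >= 1.5:
            successes += 1
    return successes / draws

for ph in [4.0, 5.0, 6.0, 7.0, 8.0]:
    print(f"pH {ph}: P(Rs >= 1.5) = {predicted_resolution(ph):.2f}")
```

This probabilistic framing (rather than a point prediction) is what makes the approach robust to model uncertainty, in line with the ICH Q8(R1) design-space concept the abstract cites.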
Burbach, D J; Peterson, L
1986-01-01
Cognitive-developmental studies relevant to children's concepts of physical illness are reviewed and critiqued. Although numerous methodological weaknesses make firm conclusions difficult, most data appear to suggest that children's concepts of illness do evolve in a systematic and predictable sequence consistent with Piaget's theory of cognitive development. Methodological weaknesses identified include poor description of samples, assessment instruments, and procedures; lack of control over potential observer bias, expectancy effects, and other confounding variables; and minimal attention to reliability and validity issues. Increased methodological rigor and a further explication of the specific and unique ways in which children's concepts of illness develop over the course of cognitive development could substantially increase the value of these studies for professionals in pediatric health care settings.
An introduction to exemplar research: a definition, rationale, and conceptual issues.
Bronk, Kendall Cotton; King, Pamela Ebstyne; Matsuba, M Kyle
2013-01-01
The exemplar methodology represents a useful yet underutilized approach to studying developmental constructs. It features an approach to research whereby individuals, entities, or programs that exemplify the construct of interest in a particularly intense or highly developed manner compose the study sample. Accordingly, it reveals what the upper ends of development look like in practice. Utilizing the exemplar methodology allows researchers to glimpse not only what is but also what is possible with regard to the development of a particular characteristic. The present chapter includes a definition of the exemplar methodology, a discussion of some of the key conceptual issues to consider when employing it in empirical studies, and a brief overview of the other chapters featured in this volume. © Wiley Periodicals, Inc.
Methodological strategies in using home sleep apnea testing in research and practice.
Miller, Jennifer N; Schulz, Paula; Pozehl, Bunny; Fiedler, Douglas; Fial, Alissa; Berger, Ann M
2017-11-14
Home sleep apnea testing (HSAT) has increased due to improvements in technology, accessibility, and changes in third party reimbursement requirements. Research studies using HSAT have not consistently reported procedures and methodological challenges. This paper had two objectives: (1) summarize the literature on use of HSAT in research of adults and (2) identify methodological strategies to use in research and practice to standardize HSAT procedures and information. The search strategy included studies of participants undergoing sleep testing for obstructive sleep apnea (OSA) using HSAT. MEDLINE via PubMed, CINAHL, and Embase were searched with the following terms: "polysomnography," "home," "level III," "obstructive sleep apnea," and "out of center testing." Research articles that met inclusion criteria (n = 34) inconsistently reported methods and methodological challenges in terms of: (a) participant sampling; (b) instrumentation issues; (c) clinical variables; (d) data processing; and (e) patient acceptability. Ten methodological strategies were identified for adoption when using HSAT in research and practice. Future studies need to address the methodological challenges summarized in this paper as well as identify and report consistent HSAT procedures and information.
Weuve, Jennifer; Proust-Lima, Cécile; Power, Melinda C; Gross, Alden L; Hofer, Scott M; Thiébaut, Rodolphe; Chêne, Geneviève; Glymour, M Maria; Dufouil, Carole
2015-09-01
Clinical and population research on dementia and related neurologic conditions, including Alzheimer's disease, faces several unique methodological challenges. Progress to identify preventive and therapeutic strategies rests on valid and rigorous analytic approaches, but the research literature reflects little consensus on "best practices." We present findings from a large scientific working group on research methods for clinical and population studies of dementia, which identified five categories of methodological challenges as follows: (1) attrition/sample selection, including selective survival; (2) measurement, including uncertainty in diagnostic criteria, measurement error in neuropsychological assessments, and practice or retest effects; (3) specification of longitudinal models when participants are followed for months, years, or even decades; (4) time-varying measurements; and (5) high-dimensional data. We explain why each challenge is important in dementia research and how it could compromise the translation of research findings into effective prevention or care strategies. We advance a checklist of potential sources of bias that should be routinely addressed when reporting dementia research. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
Is a 'convenience' sample useful for estimating immunization coverage in a small population?
Weir, Jean E; Jones, Carrie
2008-01-01
Rapid survey methodologies are widely used for assessing immunization coverage in developing countries, approximating true stratified random sampling. Non-random ('convenience') sampling is not considered appropriate for estimating immunization coverage rates but has the advantages of low cost and expediency. We assessed the validity of a convenience sample of children presenting to a travelling clinic by comparing the coverage rate in the convenience sample to the true coverage established by surveying each child in three villages in rural Papua New Guinea. The rate of DPT immunization coverage as estimated by the convenience sample was within 10% of the true coverage when the proportion of children in the sample was two-thirds or when only children over the age of one year were counted, but differed by 11% when the sample included only 53% of the children and when all eligible children were included. The convenience sample may be sufficiently accurate for reporting purposes and is useful for identifying areas of low coverage.
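The comparison this abstract describes, a convenience-sample estimate checked against census-established true coverage in percentage points, is straightforward to express. The counts below are hypothetical, not the Papua New Guinea data:

```python
# Sketch: comparing a convenience-sample coverage estimate against the
# true village-wide coverage, using the absolute difference in percentage
# points as the validity criterion.
def coverage(immunized, total):
    return immunized / total

true_cov = coverage(152, 200)    # full census of eligible children
sample_cov = coverage(101, 130)  # children presenting to the clinic

diff_points = abs(sample_cov - true_cov) * 100
print(f"true {true_cov:.1%}, sample {sample_cov:.1%}, "
      f"difference {diff_points:.1f} points")
print("within 10 points" if diff_points <= 10 else "outside 10 points")
```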
77 FR 15092 - U.S. Energy Information Administration; Proposed Agency Information Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-14
... conducted under this clearance will generally be methodological studies of 500 cases or less. The samples... conducted under this clearance will generally be methodological studies of 500 cases or less, but will... the methodological design, sampling procedures (where possible) and questionnaires of the full scale...
Design of clinical trials of antidepressants: should a placebo control arm be included?
Fritze, J; Möller, H J
2001-01-01
There is no doubt that available antidepressants are efficacious and effective. Nevertheless, more effective drugs with improved tolerability are needed. With this need in mind, some protagonists claim that future antidepressants should be proved superior to, or at least as effective as, established antidepressants, making placebo control methodologically dispensable in clinical trials. Moreover, the use of placebo control is criticised as unethical because it might result in effective treatment being withheld. There are, however, a number of methodological reasons why placebo control is indispensable for the proof of efficacy of antidepressants. Comparing investigational antidepressants only with standard antidepressants and not placebo yields ambiguous results that are difficult to interpret, be it in superiority or equivalence testing, and this method of assessment requires larger sample sizes than those required with the use of placebo control. Experimental methodology not adhering to the optimal study design is ethically questionable. Restricting the testing of investigational antidepressants only to superiority over standard antidepressants is an obstacle to therapeutic progress in terms of tolerability and the detection of innovative mechanisms of action from which certain subgroups of future patients might benefit. The use of a methodology that requires larger samples for testing of superiority or equivalence is also ethically questionable. In view of the high placebo response rates in trials of antidepressants, placebo treatment does not mean withholding effective treatment. Accepting the necessity of the clinical evaluation of new, potentially ineffective antidepressants implicitly means accepting placebo control as ethically justified. Three- or multi-arm comparisons including placebo and an active reference represent the optimal study design.
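The sample-size argument above follows directly from the standard two-arm formula: the required n scales with the inverse square of the detectable difference, so an active-comparator trial chasing a small drug-vs-drug difference needs far more patients than a drug-vs-placebo trial. A sketch with illustrative (invented) effect sizes:

```python
# Sketch: two-sided, two-sample z-approximation for n per arm to detect a
# mean difference delta with common SD sigma, at alpha=0.05 and 80% power.
import math

def n_per_arm(delta, sigma):
    z_a = 1.959964  # z for alpha/2 = 0.025
    z_b = 0.841621  # z for power = 0.80
    return math.ceil(2 * (z_a + z_b) ** 2 * (sigma / delta) ** 2)

sigma = 8.0  # SD of depression-scale change scores (hypothetical)
print("drug vs placebo (delta=4):", n_per_arm(4.0, sigma))
print("drug vs active  (delta=2):", n_per_arm(2.0, sigma))
```

Halving the detectable difference quadruples n per arm, which is the methodological core of the case for retaining a placebo arm.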
Developing a Rubric to Assess Student Learning Outcomes Using a Class Assignment
ERIC Educational Resources Information Center
Thaler, Nicholas; Kazemi, Ellie; Huscher, Crystal
2009-01-01
We developed a rubric to assess several of our department's undergraduate student learning outcomes (SLOs). Target SLOs include applications of principles of research methodology, using appropriate statistics, adherence to the Publication Manual of the American Psychological Association, and written communication skills. We randomly sampled 20…
The Stanford Prison Experiment in Introductory Psychology Textbooks: A Content Analysis
ERIC Educational Resources Information Center
Bartels, Jared M.
2015-01-01
The present content analysis examines the coverage of theoretical and methodological problems with the Stanford prison experiment (SPE) in a sample of introductory psychology textbooks. Categories included the interpretation and replication of the study, variance in guard behavior, participant selection bias, the presence of demand characteristics…
Oracle or Monacle: Research Concerning Attitudes Toward Feminism.
ERIC Educational Resources Information Center
Prescott, Suzanne; Schmid, Margaret
Both popular studies and more serious empirical studies of attitudes toward feminism are reviewed beginning with Clifford Kirkpatrick's early empirical work and including the more recent empirical studies completed since 1970. The review examines the contents of items used to measure feminism, and the methodology and sampling used in studies, as…
USDA-ARS?s Scientific Manuscript database
Cellulolytic bacteria and lactobacilli are beneficial microbes in the equine hindgut. There are several existing methodologies for the enumeration of these bacteria, which vary based on selective and differential media and sample handling procedures including storage time and temperature. The object...
Living Green: Examining Sustainable Dorms and Identities
ERIC Educational Resources Information Center
Watson, Lesley; Johnson, Cathryn; Hegtvedt, Karen A.; Parris, Christie L.
2015-01-01
Purpose: The purpose of this study was to examine the effects of living in "green" dorms on students' environmentally responsible behaviors (ERBs), in concert with other factors, including individual identity and social context in the form of behavior modeling by peers. Design/methodology/approach: The sample of 243 consists of students…
Methodological quality and descriptive characteristics of prosthodontic-related systematic reviews.
Aziz, T; Compton, S; Nassar, U; Matthews, D; Ansari, K; Flores-Mir, C
2013-04-01
Ideally, healthcare systematic reviews (SRs) should be beneficial to practicing professionals in making evidence-based clinical decisions. However, the conclusions drawn from SRs are directly related to the quality of the SR and of the included studies. The aim was to investigate the methodological quality and key descriptive characteristics of SRs published in prosthodontics. Methodological quality was analysed using the Assessment of Multiple Reviews (AMSTAR) tool. Several electronic resources (MEDLINE, EMBASE, Web of Science and American Dental Association's Evidence-based Dentistry website) were searched. In total 106 SRs were located. Key descriptive characteristics and methodological quality features were gathered and assessed, and descriptive and inferential statistical testing was performed. Most SRs in this sample originated from the European continent followed by North America. Two to five authors conducted most SRs; the majority was affiliated with academic institutions and had prior experience publishing SRs. The majority of SRs were published in specialty dentistry journals, with implant or implant-related topics the primary topics of interest for most. According to AMSTAR, most quality aspects were adequately fulfilled by less than half of the reviews. Publication bias and grey literature searches were the most poorly adhered-to components. Overall, the methodological quality of the prosthodontic-related systematic reviews was deemed limited. Future recommendations would include that authors have prior training in conducting SRs and that journals include a universal checklist addressing all key characteristics of an unbiased SR process. © 2013 Blackwell Publishing Ltd.
Orellano-Colón, Elsa M; Jutai, Jeffrey; Santiago, Angélica; Torres, Víctor; Benítez, Keyla; Torres, Mayra
2016-09-01
(1) Knowledge about the assistive technology (AT) needs and psychosocial impact of AT in different populations is needed because the adoption, retention, or abandonment of AT may be influenced by the psychosocial impact that AT has on its users. The aims of this study were to: (a) identify the AT needs of a sample of Hispanic older adults with functional limitations, (b) describe the psychosocial impact of these technologies on the sample's quality of life, and (c) describe the methodological challenges in using the Puerto Rican version of the Psychosocial Impact of Assistive Device Scale (PR-PIADS) with a Hispanic sample. (2) Methods: This study used a cross-sectional design with a sample of 60 participants. Data were collected using the Assistive Technology Card Assessment Questionnaire (ATCAQ) and the PR-PIADS. Data analyses included descriptive statistics and bivariate analysis. (3) Results: The sample's most frequently reported needs for AT devices were in the areas of cooking, home tasks, and home safety activities. The sample reported a positive impact of AT use on their quality of life. Several methodological challenges of the PIADS were identified. (4) Conclusions: The sample has unmet needs for AT devices to overcome difficulties in daily living activities.
Liu, Wenjuan; Ji, Jianlin; Chen, Hua; Ye, Chenyu
2014-01-01
Color is one of the most powerful aspects of a psychological counseling environment. Little scientific research has been conducted on color design, and much of the existing literature is based on observational studies. Using design of experiments and response surface methodology, this paper proposes an optimal color design approach for translating patients' perceptions into color elements. Six indices, pleasant-unpleasant, interesting-uninteresting, exciting-boring, relaxing-distressing, safe-fearful, and active-inactive, were used to assess patients' impressions. A total of 75 patients participated: 42 in Experiment 1 and 33 in Experiment 2. In Experiment 1, 27 representative color samples were designed, and the color sample (L = 75, a = 0, b = -60) was the most preferred. In Experiment 2, this color sample was set as the 'central point', and three color attributes were optimized to maximize the patients' satisfaction. The experimental results show that the proposed method can obtain the optimal solution for the color design of a counseling room.
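As an illustrative sketch of the optimization step, a second-order response surface can be fitted to satisfaction ratings collected around the central point, and its stationary point located analytically. All design points and ratings below are invented for demonstration; they are not the study's data:

```python
import numpy as np

# Hypothetical satisfaction ratings at a face-centered design around the
# 'central point' (L = 75, a = 0, b = -60); x1, x2 are two color
# attributes in coded units. All numbers are invented for illustration.
X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1],
              [0, 0], [0, 0], [-1, 0], [1, 0], [0, -1], [0, 1]], float)
y = np.array([5.2, 5.8, 5.5, 6.0, 6.6, 6.5, 5.9, 6.1, 5.7, 6.2])

# Full second-order model: b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
def design(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

beta, *_ = np.linalg.lstsq(design(X), y, rcond=None)

# Stationary point: solve grad = 0, i.e. [[2*b11, b12], [b12, 2*b22]] x = -[b1, b2]
A = np.array([[2 * beta[3], beta[5]], [beta[5], 2 * beta[4]]])
x_opt = np.linalg.solve(A, -beta[1:3])
print("stationary point (coded units):", x_opt.round(2))
```

When the fitted Hessian is negative definite, the stationary point is the satisfaction maximum in coded units, which would then be translated back into CIELAB coordinates.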
NASA Technical Reports Server (NTRS)
Hallum, C. R.; Basu, J. P. (Principal Investigator)
1979-01-01
A natural stratum-based sampling scheme and the aggregation procedures for estimating wheat area, yield, and production, together with their associated prediction error estimates, are described. The methodology utilizes LANDSAT imagery and agrophysical data to permit an improved stratification in foreign areas by ignoring political boundaries and restratifying along boundaries that are more homogeneous with respect to the distribution of agricultural density, soil characteristics, and average climatic conditions. A summary of test results is given, including a discussion of the various problems encountered.
Validating Analytical Protocols to Determine Selected Pesticides and PCBs Using Routine Samples.
Pindado Jiménez, Oscar; García Alonso, Susana; Pérez Pastor, Rosa María
2017-01-01
This study provides recommendations for validating analytical protocols using routine samples, as a case study of how to validate analytical methods in different environmental matrices. To analyze the selected compounds (pesticides and polychlorinated biphenyls) in two environmental matrices, two analytical procedures based on GC-MS were developed and validated. The validation of the two protocols is described through the analysis of more than 30 samples of water and sediments collected over nine months. The work also estimates the uncertainty associated with both analytical protocols. The uncertainty for the water matrix was estimated through a conventional approach, whereas for the sediment matrix the estimation of proportional/constant bias was also included because of its inhomogeneity. Results for the sediment matrix are reliable, showing 25-35% analytical variability under intermediate conditions. The analytical methodology for the water matrix determines the selected compounds with acceptable recoveries, and the combined uncertainty ranges between 20 and 30%. The analysis of routine samples is rarely used to assess the trueness of novel analytical methods, and until now this approach had not been applied to organochlorine compounds in environmental matrices.
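The combination of uncertainty contributions described above can be sketched as a quadrature sum of relative standard uncertainties; the component values below are illustrative assumptions, not the paper's actual uncertainty budget:

```python
import math

# Illustrative relative standard uncertainties (as fractions), assumed
# for demonstration: an intermediate-precision term from repeated
# analyses of routine samples, and a recovery/bias correction term.
u_precision = 0.18   # intermediate-precision variability
u_recovery = 0.12    # uncertainty of the recovery correction

# Combined standard uncertainty by quadrature (uncorrelated components)
u_combined = math.sqrt(u_precision**2 + u_recovery**2)
U_expanded = 2 * u_combined   # expanded uncertainty, coverage factor k = 2

print(f"combined relative uncertainty: {u_combined:.1%}")
print(f"expanded uncertainty (k=2): {U_expanded:.1%}")
```

With these assumed components the combined relative uncertainty falls in the 20-30% range the abstract reports for the water matrix.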
Determination of tocopherols and sitosterols in seeds and nuts by QuEChERS-liquid chromatography.
Delgado-Zamarreño, M Milagros; Fernández-Prieto, Cristina; Bustamante-Rangel, Myriam; Pérez-Martín, Lara
2016-02-01
In the present work, a simple, reliable and affordable sample treatment method for the simultaneous analysis of tocopherols and free phytosterols in nuts was developed. Analyte extraction was carried out using the QuEChERS methodology, and analyte separation and detection were accomplished using HPLC-DAD. The use of this methodology for the extraction of naturally occurring substances provides advantages such as speed, simplicity and ease of use. The parameters evaluated for the validation of the developed method included the linearity of the calibration plots, the detection and quantification limits, repeatability, reproducibility and recovery. The proposed method was successfully applied to the analysis of tocopherols and free phytosterols in samples of almonds, cashew nuts, hazelnuts, peanuts, tiger nuts, sunflower seeds and pistachios. Copyright © 2015 Elsevier Ltd. All rights reserved.
On the effect of model parameters on forecast objects
NASA Astrophysics Data System (ADS)
Marzban, Caren; Jones, Corinne; Li, Ning; Sandgathe, Scott
2018-04-01
Many physics-based numerical models produce a gridded, spatial field of forecasts, e.g., a temperature map. The field for some quantities generally consists of spatially coherent and disconnected objects. Such objects arise in many problems, including precipitation forecasts in atmospheric models, eddy currents in ocean models, and models of forest fires. Certain features of these objects (e.g., location, size, intensity, and shape) are generally of interest. Here, a methodology is developed for assessing the impact of model parameters on the features of forecast objects. The main ingredients of the methodology include the use of (1) Latin hypercube sampling for varying the values of the model parameters, (2) statistical clustering algorithms for identifying objects, (3) multivariate multiple regression for assessing the impact of multiple model parameters on the distribution (across the forecast domain) of object features, and (4) methods for reducing the number of hypothesis tests and controlling the resulting errors. The final output of the methodology is a series of box plots and confidence intervals that visually display the sensitivities. The methodology is demonstrated on precipitation forecasts from a mesoscale numerical weather prediction model.
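Ingredient (1), Latin hypercube sampling, stratifies each parameter's range so that every model run explores a distinct bin in every dimension. A minimal sketch follows; the parameter count and bounds are hypothetical, not those of the actual forecast model:

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """Stratify each of d dimensions into n bins, one point per bin."""
    u = rng.random((n, d))                           # jitter within each bin
    bins = np.array([rng.permutation(n) for _ in range(d)]).T
    return (bins + u) / n                            # points in [0, 1)^d

rng = np.random.default_rng(42)
# Three hypothetical model parameters with assumed bounds
l_bounds = np.array([0.1, 1e-4, 0.5])
u_bounds = np.array([1.0, 1e-2, 2.0])

unit = latin_hypercube(20, 3, rng)                   # 20 runs, 3 parameters
params = l_bounds + unit * (u_bounds - l_bounds)     # scale to parameter ranges
print(params.shape)                                  # (20, 3)
```

Each row is one parameter setting for a model run; unlike plain random sampling, every parameter's marginal range is covered by exactly one point per stratum.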
Extending religion-health research to secular minorities: issues and concerns.
Hwang, Karen; Hammer, Joseph H; Cragun, Ryan T
2011-09-01
Claims about religion's beneficial effects on physical and psychological health have received substantial attention in popular media, but empirical support for these claims is mixed. Many of these claims are tenuous because they fail to address basic methodological issues relating to construct validity, sampling methods or analytical problems. A more conceptual problem is the near-universal lack of atheist control samples. While many studies include samples of individuals classified as "low spirituality" or religious "nones", these groups are heterogeneous and contain only a fraction of members who would be considered truly secular. We illustrate the importance of including an atheist control group whenever possible in religiosity/spirituality and health research and discuss areas for further investigation.
Game-based interventions and their impact on dementia: a narrative review.
Zheng, Jiaying; Chen, Xueping; Yu, Ping
2017-12-01
The aim of this review was to examine the efficacy of game-based interventions for people with dementia. Seven studies that met the inclusion criteria were found in four databases; their interventions and key findings were analysed and synthesised. Game-based interventions show promise for improving cognition, coordination, and behavioural and psychological symptoms in people with dementia, although the generalisability of the findings is limited by weak methodology and small sample sizes. Future research should include methodological improvement and practice guideline development.
Reviewing the methodology of an integrative review.
Hopia, Hanna; Latvala, Eila; Liimatainen, Leena
2016-12-01
Whittemore and Knafl's updated description of the methodological approach for integrative review was published in 2005. Since then, the five stages of the approach have been regularly used as the basic conceptual structure of integrative reviews conducted by nursing researchers. However, this methodological approach is seldom examined from the perspective of how systematically and rigorously the stages are implemented in published integrative reviews. The aim was to appraise selected integrative reviews against the five-stage methodological approach published by Whittemore and Knafl in 2005. A literature review was used in this study. CINAHL (Cumulative Index to Nursing and Allied Health), PubMed, OVID (Journals@Ovid) and the Cochrane Library databases were searched for integrative reviews published between 2002 and 2014. Papers were included if they used the methodological approach described by Whittemore and Knafl, were published in English and were focused on nursing education or nursing expertise. A total of 259 integrative review publications were identified for potential inclusion. Ten integrative reviews fulfilled the inclusion criteria. Findings from the studies were extracted and critically examined according to the five methodological stages. The reviews assessed followed the guidelines of the stated methodological approach to varying extents. The stages of literature search, data evaluation and data analysis were fairly poorly formulated and only partially implemented in the studies included in the sample. The other two stages, problem identification and presentation, followed the methodological approach quite well. Increasing use of research in clinical practice is inevitable, and therefore integrative reviews can play a greater role in developing evidence-based nursing practices. Because of this, nurse researchers should pay more attention to sound integrative nursing research, to systematise the review process and make it more rigorous. © 2016 Nordic College of Caring Science.
Dent, Andrew W; Asadpour, Ali; Weiland, Tracey J; Paltridge, Debbie
2008-02-01
Fellows of the Australasian College for Emergency Medicine (FACEM) have opportunities to participate in a range of continuing professional development activities. To inform FACEM and assist those involved in planning continuing professional development interventions for FACEM, we undertook a learning needs analysis of emergency physicians. This was an exploratory study using survey methodology. Following questionnaire development through iterative feedback with emergency physicians and researchers, a mailed survey was distributed to all FACEM. The survey comprised eight items on the work and demographic characteristics of FACEM, and 194 items on attitudes to existing learning opportunities, barriers to learning, and perceived learning needs and preferences. Fifty-eight percent (503/854) of all FACEM surveyed responded to the questionnaire, almost half of whom attained their FACEM after the year 2000. The sample comprised mostly males (72.8%), with a mean age of 41.6 years, similar to the ACEM database. Most respondents reported working in ACEM-accredited hospitals (89%) and major referral hospitals (54%), and 78% treated both children and adults. FACEM reported working on average 26.7 clinical hours per week, with those at private hospitals working a greater proportion of clinical hours than those at other hospital types. As the first of six related reports, this paper documents the methodology used, including questionnaire development, and provides the demographics of responding FACEM, including the clinical and non-clinical hours worked and the type of hospital of principal employment.
Roth, Alexis M; Rosenberger, Joshua G; Reece, Michael; Van Der Pol, Barbara
2012-02-01
Transactional sex has been associated with increased risk of adverse health outcomes, including sexually transmitted infections (STIs). Participants included female sex workers and men they recruited using incentivized snowball sampling. Participants provided specimens for STI diagnostic testing and completed a semi-structured interview. Forty-four participants aged 19-65 were interviewed. Participants found self-sampling to be acceptable and overwhelmingly endorsed sampling outside of a clinic (90%) for reasons such as convenience, privacy, and lack of stigma. A substantial minority (38%) tested positive for at least one STI. Novel strategies may encourage sexual health care and prevent STIs among sex workers. High infection and screening acceptance rates across the sample suggest that individuals engaged in transactional sex would benefit from, and would be responsive to, community-based self-sampling for STI screening.
Language Trends 2010 Secondary (CILT, ALL, ISMLA) Data Report
ERIC Educational Resources Information Center
CILT, the National Centre for Languages, 2011
2011-01-01
This survey has been carried out annually since 2002 to track developments in language provision and take up in secondary schools. The following sections are included in this analysis of key results from the Secondary Language Trends survey: (1) Survey background, methodology and sample design; (2) DfE (Department for Education) data trends of…
The Social Experiences of High School Students with Visual Impairments
ERIC Educational Resources Information Center
Jessup, Glenda; Bundy, Anita C.; Broom, Alex; Hancock, Nicola
2017-01-01
Introduction: This study explores the social experiences in high school of students with visual impairments. Methods: Experience sampling methodology was used to examine (a) how socially included students with visual impairments feel, (b) the internal qualities of their activities, and (c) the factors that influence a sense of inclusion. Twelve…
Documentation for the 2003-04 Schools and Staffing Survey. NCES 2007-337
ERIC Educational Resources Information Center
Tourkin, Steven C.; Warner, Toni; Parmer, Randall; Cole, Cornette; Jackson, Betty; Zukerberg, Andrew; Cox, Shawna; Soderberg, Andrew
2007-01-01
This report serves as the survey documentation for the design and implementation of the 2003-04 Schools and Staffing Survey. Topics covered include the sample design, survey methodology, data collection procedures, data processing, response rates, imputation procedures, weighting and variance estimation, review of the quality of data, the types of…
ERIC Educational Resources Information Center
Lee, Silvia Wen-Yu; Tsai, Chin-Chung
2013-01-01
We conducted a literature review of using educational technology in biology learning from 2001 to 2010. A total of 36 empirical articles were included for review. Based upon the content analyses of these studies, such as technologies utilized, student sample, biological topics involved, the research purpose, and methodology, the following…
Lund, Heidi Sjetne; Skogtun, Gaute; Sørum, Henning; Eggertsdóttir, Anna Vigdís
2015-10-01
A diagnosis of bacterial cystitis commonly relies on a positive microbiological culture demonstrating the presence of a significant number of colony-forming units/ml urine, as urine within the upper urinary tract, bladder and proximal urethra is generally considered sterile. Recent studies from human and veterinary medicine indicate the presence of non-culturable bacteria in culture-negative urine samples. The aim of the present study was to determine the occurrence of bacterial DNA in culture-negative urine samples from cats with signs of feline lower urinary tract disease (FLUTD) and healthy control cats by 16S ribosomal DNA PCR and subsequent sequencing. The study sample included 38 culture-negative urine samples from cats with FLUTD and 43 culture-negative samples from control cats. Eight culture-positive urine samples from cats with FLUTD were included as external positive controls, in addition to negative reaction controls. Possible methodological limitations include degradation of DNA during storage, the use of non-sedimented urine for DNA isolation, and the lack of internal positive reaction controls. The positive controls were recognised, but the occurrence of bacterial DNA in culture-negative urine from cats with or without signs of lower urinary tract disease was not demonstrated. However, considering the possible methodological limitations, the presence of bacterial DNA in the urine of culture-negative FLUTD cats cannot be excluded on the basis of the present results alone. Therefore, a prospective study reducing the possibility of DNA degradation due to storage, combined with modifications enhancing the chance of detecting even lower levels of bacterial DNA in culture-negative samples, seems warranted. © ISFM and AAFP 2014.
Not Your Same Old Story: New Rules for Thematic Apperceptive Techniques (TATs).
Jenkins, Sharon Rae
2017-01-01
Stories told about pictures have been used for both research and clinical practice since the beginning of modern personality assessment. However, with the growing science-practice gap, these thematic apperceptive techniques (TATs) have been used differently in those two venues. Scientific validation is presumptively general, but clinical application is idiographic and situation-specific. A bridge is needed. The manualized, human-scored narrative analysis systems discussed here are valuable scientist-practitioner tools, but they require a validation literature to support further research publication, maintain their role in clinical training, and justify clinicians' reimbursement by third-party payers. To facilitate wider understanding of manualized TAT methodologies, this article addresses long-standing criticisms of TAT reliability and proposes some strategic solutions to the measurement error problem for both researchers and clinicians, including analyzing person-situation interactions, purposeful situation sampling for within-storyteller comparisons, and uses of small samples. The new rules for TATs include conceptual and methodological standards that researchers should aim to meet and report, reviewers should apply to manuscripts, and clinical assessors can use to analyze their own data and justify third-party payment.
Stigma-related experiences in non-communicable respiratory diseases: A systematic review.
Rose, Shiho; Paul, Christine; Boyes, Allison; Kelly, Brian; Roach, Della
2017-08-01
The stigma of non-communicable respiratory diseases (NCRDs), whether perceived or otherwise, can be an important element of a patient's experience of his/her illness and a contributing factor to poor psychosocial, treatment and clinical outcomes. This systematic review examines the evidence regarding the associations between stigma-related experiences and patient outcomes, comparing findings across a range of common NCRDs. Electronic databases and manual searches were conducted to identify original quantitative research published to December 2015. Articles that focused on adult patient samples diagnosed with asthma, chronic obstructive pulmonary disease (COPD), cystic fibrosis, lung cancer or mesothelioma, and included a measurement of stigma-related experience (i.e. perceived stigma, shame, blame or guilt), were eligible for inclusion. Included articles were described in terms of study characteristics, outcome scores, correlations between stigma-related experiences and patient outcomes, and methodological rigor. Twenty-five articles were eligible for this review, with most (n = 20) related to lung cancer. No articles for cystic fibrosis were identified. Twenty unique scales were used, with low to moderate stigma-related experiences reported overall. Stigma-related experiences significantly correlated with all six patient-related domains explored (psychosocial, quality of life, behavioral, physical, treatment and work), which were investigated more widely in COPD and lung cancer samples. No studies adequately met all criteria for methodological rigor. The inter-connectedness of stigma-related experiences with other aspects of patient experience highlights that an integrated approach is needed to address this important issue. Future studies should adopt more rigorous methodology, including streamlined measures, to provide robust evidence.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klein, Adam
2015-01-01
This thesis presents work on advancements and applications of methodology for the analysis of biological samples using mass spectrometry. Included in this work are improvements to chemical cross-linking mass spectrometry (CXMS) for the study of protein structures and mass spectrometry imaging and quantitative analysis to study plant metabolites. Applications include using matrix-assisted laser desorption/ionization-mass spectrometry imaging (MALDI-MSI) to further explore metabolic heterogeneity in plant tissues and chemical interactions at the interface between plants and pests. Additional work was focused on developing liquid chromatography-mass spectrometry (LC-MS) methods to investigate metabolites associated with plant-pest interactions.
Evaluation Studies of Robotic Rollators by the User Perspective: A Systematic Review.
Werner, Christian; Ullrich, Phoebe; Geravand, Milad; Peer, Angelika; Hauer, Klaus
2016-01-01
Robotic rollators enhance the basic functions of established devices by technically advanced physical, cognitive, or sensory support to increase autonomy in persons with severe impairment. In the evaluation of such ambient assisted living solutions, both the technical and user perspectives are important to prove usability, effectiveness and safety, and to ensure adequate device application. The aim of this systematic review is to summarize the methodology of studies evaluating robotic rollators with focus on the user perspective and to give recommendations for future evaluation studies. A systematic literature search up to December 31, 2014, was conducted based on the Cochrane Review methodology using the electronic databases PubMed and IEEE Xplore. Articles were selected according to the following inclusion criteria: evaluation studies of robotic rollators documenting human-robot interaction, no case reports, published in English language. Twenty-eight studies were identified that met the predefined inclusion criteria. Large heterogeneity in the definitions of the target user group, study populations, study designs and assessment methods was found across the included studies. No generic methodology to evaluate robotic rollators could be identified. We found major methodological shortcomings related to insufficient sample descriptions and sample sizes, and lack of appropriate, standardized and validated assessment methods. Long-term use in the habitual environment was also not evaluated. Apart from the heterogeneity, methodological deficits in most of the identified studies became apparent. Recommendations for future evaluation studies include: clear definition of the target user group, adequate selection of subjects, inclusion of other assistive mobility devices for comparison, evaluation of the habitual use of advanced prototypes, an adequate assessment strategy with established, standardized and validated methods, and statistical analysis of study results. Assessment strategies may additionally focus on specific functionalities of the robotic rollators, allowing an individually tailored assessment of innovative features to document their added value. © 2016 S. Karger AG, Basel.
Single point aerosol sampling: evaluation of mixing and probe performance in a nuclear stack.
Rodgers, J C; Fairchild, C I; Wood, G O; Ortiz, C A; Muyshondt, A; McFarland, A R
1996-01-01
Alternative reference methodologies have been developed for the sampling of radionuclides from stacks and ducts that differ from the methods previously required by the United States Environmental Protection Agency (EPA). These alternative reference methodologies have recently been approved by the U.S. EPA for use in lieu of the current standard techniques. The standard EPA methods are prescriptive in the selection of sampling locations and in the design of sampling probes, whereas the alternative reference methodologies are performance driven. Tests were conducted in a stack at Los Alamos National Laboratory to demonstrate the efficacy of some aspects of the alternative reference methodologies. Coefficients of variation of velocity, tracer gas, and aerosol particle profiles were determined at three sampling locations. Results showed that the numerical criteria placed upon the coefficients of variation by the alternative reference methodologies were met at sampling stations located 9 and 14 stack diameters from the flow entrance, but not at a location 1.5 diameters downstream from the inlet. Experiments were conducted to characterize the transmission of 10-μm aerodynamic diameter liquid aerosol particles through three types of sampling probes. The transmission ratio (the ratio of the aerosol concentration at the probe exit plane to the concentration in the free stream) was 107% for a 113 L min⁻¹ (4-cfm) anisokinetic shrouded probe, but only 20% for an isokinetic probe that follows the existing EPA standard requirements. A specially designed isokinetic probe showed a transmission ratio of 63%. The shrouded probe's performance would conform to the alternative reference methodology criteria; the isokinetic probes' would not.
Flexible sampling large-scale social networks by self-adjustable random walk
NASA Astrophysics Data System (ADS)
Xu, Xiao-Ke; Zhu, Jonathan J. H.
2016-12-01
Online social networks (OSNs) have become an increasingly attractive gold mine for academic and commercial researchers. However, research on OSNs faces a number of difficult challenges. One bottleneck lies in the massive quantity, and often unavailability, of OSN population data. Sampling thus becomes perhaps the only feasible solution to these problems. How to draw samples that represent the underlying OSN remains a formidable task, for a number of conceptual and methodological reasons. In particular, most empirically driven studies of network sampling are confined to simulated data or sub-graph data, which are fundamentally different from real, complete-graph OSNs. In the current study, we propose a flexible sampling method, called Self-Adjustable Random Walk (SARW), and test it against the population data of a real large-scale OSN. We evaluate the strengths of the sampling method in comparison with four prevailing methods: uniform, breadth-first search (BFS), random walk (RW), and revised RW (i.e., MHRW) sampling. We mix both induced-edge and external-edge information of sampled nodes in the same sampling process. Our results show that the SARW sampling method generates unbiased samples of OSNs with maximal precision and minimal cost. The study supports the practice of OSN research by providing a much-needed sampling tool, supports the methodological development of large-scale network sampling through comparative evaluation of existing sampling methods, and supports the theoretical understanding of human networks by highlighting discrepancies and contradictions between existing knowledge/assumptions and large-scale real OSN data.
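For context, the plain random-walk (RW) baseline mentioned above can be sketched in a few lines; SARW itself is not reproduced here. The toy graph below demonstrates the degree bias, visits roughly proportional to node degree, that re-weighted walks such as MHRW or SARW are designed to correct:

```python
import random

def random_walk_sample(adj, start, n_steps, seed=0):
    """Collect the nodes visited by a simple random walk on graph `adj`
    (dict: node -> list of neighbours). A plain RW over-samples
    high-degree nodes; re-weighted walks correct this bias."""
    rng = random.Random(seed)
    node, visited = start, [start]
    for _ in range(n_steps):
        node = rng.choice(adj[node])   # uniform step to a neighbour
        visited.append(node)
    return visited

# Toy undirected graph: a hub (node 0) attached to a small cycle
adj = {0: [1, 2, 3, 4], 1: [0, 2], 2: [0, 1, 3], 3: [0, 2, 4], 4: [0, 3]}
sample = random_walk_sample(adj, start=0, n_steps=1000)
hub_share = sample.count(0) / len(sample)
print(f"share of visits to the hub: {hub_share:.2f}")
```

On this graph the walk's stationary distribution is proportional to degree, so the hub (degree 4 of 14 total edge-endpoints) is visited far more often than the uniform share of 1/5 would suggest.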
DEVELOPMENT OF A SUB-SLAB AIR SAMPLING PROTOCOL TO SUPPORT ASSESSMENT OF VAPOR INTRUSION
The primary purpose of this research effort is to develop a methodology for sub-slab sampling to support the EPA guidance and vapor intrusion investigations after vapor intrusion has been established at a site. Methodologies for sub-slab air sampling are currently lacking in ref...
Methodological Choices in Rating Speech Samples
ERIC Educational Resources Information Center
O'Brien, Mary Grantham
2016-01-01
Much pronunciation research critically relies upon listeners' judgments of speech samples, but researchers have rarely examined the impact of methodological choices. In the current study, 30 German native listeners and 42 German L2 learners (L1 English) rated speech samples produced by English-German L2 learners along three continua: accentedness,…
NASA Astrophysics Data System (ADS)
Kirkham, R.; Olsen, K.; Hayes, J. C.; Emer, D. F.
2013-12-01
Underground nuclear tests may be first detected by seismic monitoring or air samplers operated by the CTBTO (Comprehensive Nuclear-Test-Ban Treaty Organization). After initial detection of a suspicious event, member nations may call for an On-Site Inspection (OSI) that, in part, will sample for localized releases of radioactive noble gases and particles. Although much of the commercially available equipment and methods used for surface and subsurface environmental sampling of gases can be used in an OSI scenario, on-site sampling conditions, required sampling volumes and the establishment of background concentrations of noble gases require the development of specialized methodologies. To facilitate the development of sampling equipment and methodologies that address OSI sampling volume and detection objectives, and to collect information required for model development, a field test site was created at a former underground nuclear explosion site located in welded volcanic tuff. A mixture of SF6, 127Xe and 37Ar was metered into 4400 m3 of air as it was injected into the top region of the UNE cavity. These tracers were expected to move towards the surface primarily in response to barometric pumping or through delayed cavity pressurization (accelerated transport to minimize source decay time). Sampling approaches compared during the field exercise included sampling at the soil surface, inside surface fractures, and at soil vapor extraction points at depths down to 2 m. The effectiveness of the various sampling approaches and the results of the tracer gas measurements will be presented.
Leighton, Caroline; Botto, Alberto; Silva, Jaime R; Jiménez, Juan Pablo; Luyten, Patrick
2017-01-01
Research on the potential role of gene-environment interactions (GxE) in explaining vulnerability to psychopathology in humans has witnessed a shift from a diathesis-stress perspective to differential susceptibility approaches. This paper critically reviews methodological issues and trends in this body of research. Databases were screened for studies of GxE in the prediction of personality traits, behavior, and mental health disorders in humans published between January 2002 and January 2015. In total, 315 papers were included. Results showed that 34 candidate genes have been included in GxE studies. Independent of the type of environment studied (early or recent life events, positive or negative environments), about 67-83% of studies have reported significant GxE interactions, which is consistent with a differential susceptibility model. The percentage of positive results does not seem to differ depending on the gene studied, although publication bias might be involved. However, the number of positive findings differs depending on the population studied (i.e., young adults vs. older adults). Methodological considerations limit the ability to draw strong conclusions, particularly as almost 90% (n = 283/315) of published papers are based on samples from North America and Europe, and about 70% of published studies (219/315) are based on samples that were also used in other reports. At the same time, there are clear indications of methodological improvements over time, as shown by a significant increase in longitudinal and experimental studies as well as improved minimum genotyping. Recommendations for future research, such as minimum quality assessment of genes and environmental factors, specifying the theoretical models guiding the study, and taking into account cultural, ethnic, and lifetime perspectives, are formulated.
NASA Astrophysics Data System (ADS)
Richardson, Chris T.; Kannappan, Sheila; Bittner, Ashley; Isaac, Rohan; RESOLVE
2017-01-01
We present a novel methodology for modeling emission line galaxy samples that span the entire BPT diagram. Our methodology has several advantages over current modeling schemes: the free variables in the model are identical for both AGN and SF galaxies; these free variables are more closely linked to observable galaxy properties; and the ionizing spectra including an AGN and starlight are handled self-consistently rather than empirically. We show that our methodology is capable of fitting the vast majority of SDSS galaxies that fall within the traditional regions of galaxy classification on the BPT diagram. We also present current results for relaxing classification boundaries and extending our galaxies into the dwarf regime, using the REsolved Spectroscopy of a Local VolumE (RESOLVE) survey and the Environmental COntext (ECO) catalog, with special attention to compact blue E/S0s. We compare this methodology to PCA decomposition of the spectra. This work is supported by National Science Foundation awards AST-0955368 and CISE/ACI-1156614.
Funding source and the quality of reports of chronic wounds trials: 2004 to 2011
2014-01-01
Background Critical commentaries suggest that wound care randomised controlled trials (RCTs) are often poorly reported with many methodological flaws. Furthermore, interventions in chronic wounds, rather than being drugs, are often medical devices for which there are no requirements for RCTs to bring products to market. RCTs in wounds trials therefore potentially represent a form of marketing. This study presents a methodological overview of chronic wound trials published between 2004 and 2011 and investigates the influence of industry funding on methodological quality. Methods A systematic search for RCTs for the treatment of chronic wounds published in the English language between 2004 and 2011 (inclusive) in the Cochrane Wounds Group Specialised Register of Trials was carried out. Data were extracted on aspects of trial design, conduct and quality including sample size, duration of follow-up, specification of a primary outcome, use of surrogate outcomes, and risks of bias. In addition, the prevalence of industry funding was assessed, as was its influence on the above aspects of trial design, conduct and quality. Results A total of 167 RCTs met our inclusion criteria. We found chronic wound trials often have short durations of follow-up (median 12 weeks) and small sample sizes (median 63), and fail to define a primary outcome in 41% of cases; of those that do define a primary outcome, 40% use surrogate measures of healing. Only 40% of trials used appropriate methods of randomisation, 25% concealed allocation and 34% blinded outcome assessors. Of the included trials, 41% were wholly or partially funded by industry, 33% declared non-commercial funding and 26% did not report a funding source. Industry funding was not statistically significantly associated with any measure of methodological quality, though this analysis was probably underpowered.
Conclusions This overview confirms concerns raised about the methodological quality of RCTs in wound care and illustrates that greater efforts must be made to follow international standards for conducting and reporting RCTs. There is currently minimal evidence of an influence of industry funding on methodological quality although analyses had limited power and funding source was not reported for a quarter of studies. PMID:24422753
Hodgson, Robert; Allen, Richard; Broderick, Ellen; Bland, J Martin; Dumville, Jo C; Ashby, Rebecca; Bell-Syer, Sally; Foxlee, Ruth; Hall, Jill; Lamb, Karen; Madden, Mary; O'Meara, Susan; Stubbs, Nikki; Cullum, Nicky
2014-01-14
Zhang, J; Chen, X; Zhu, Q; Cui, J; Cao, L; Su, J
2016-11-01
In recent years, the number of randomized controlled trials (RCTs) in the field of orthopaedics has been increasing in Mainland China. However, RCTs are prone to bias if they lack methodological quality. We therefore performed a survey of RCTs to assess: (1) What is the quality of RCTs in the field of orthopaedics in Mainland China? (2) Is there a difference between the core Chinese orthopaedic journals and Orthopaedics Traumatology Surgery & Research (OTSR)? This research aimed to evaluate the methodological reporting quality, according to the CONSORT statement, of RCTs in seven key orthopaedic journals published in Mainland China over the 5 years from 2010 to 2014. All of the articles were hand-searched in the Chongqing VIP database between 2010 and 2014. Studies were considered eligible if the words "random", "randomly", "randomization", or "randomized" were employed to describe the allocation method. Trials involving animals or cadavers, trials published as abstracts or case reports, trials dealing with subgroup analyses, and trials without outcomes were excluded. In addition, eight articles selected from Orthopaedics Traumatology Surgery & Research (OTSR) between 2010 and 2014 were included for comparison. The identified RCTs were analyzed using a modified version of the Consolidated Standards of Reporting Trials (CONSORT) checklist, covering sample size calculation, allocation sequence generation, allocation concealment, blinding, and handling of dropouts. A total of 222 RCTs were identified in the seven core orthopaedic journals. No trials reported an adequate sample size calculation, 74 (33.4%) reported adequate allocation sequence generation, 8 (3.7%) reported adequate allocation concealment, 18 (8.1%) reported adequate blinding, and 16 (7.2%) reported handling of dropouts.
In OTSR, 1 (12.5%) trial reported an adequate sample size calculation, 4 (50.0%) reported adequate allocation sequence generation, 1 (12.5%) reported adequate allocation concealment, 2 (25.0%) reported adequate blinding, and 5 (62.5%) reported handling of dropouts. There were statistically significant differences in sample size calculation and handling of dropouts between papers from Mainland China and OTSR (P<0.05). The findings of this study show that the methodological reporting quality of RCTs in the seven core orthopaedic journals from Mainland China is far from satisfactory and needs further improvement to meet the standards of the CONSORT statement. Level III, case-control study. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
Qualitative case study methodology in nursing research: an integrative review.
Anthony, Susan; Jack, Susan
2009-06-01
This paper is a report of an integrative review conducted to critically analyse the contemporary use of qualitative case study methodology in nursing research. Increasing complexity in health care and increasing use of case study in nursing research support the need for a current examination of this methodology. In 2007, a search for case study research (published 2005-2007) indexed in the CINAHL, MEDLINE, EMBASE, PsycINFO, Sociological Abstracts and SCOPUS databases was conducted. A sample of 42 case study research papers met the inclusion criteria. Whittemore and Knafl's integrative review method guided the analysis. Confusion exists about the name, nature and use of case study. This methodology, including its terminology and concepts, is often invisible in qualitative study titles and abstracts. Case study serves both as an exclusive methodology and as an adjunct for exploring particular aspects of phenomena under investigation in larger or mixed-methods studies. High-quality case study research exists in nursing. Judicious selection and diligent application of literature review methods promote the development of nursing science. Case study is becoming entrenched in the nursing research lexicon as a well-accepted methodology for studying phenomena in health and social care, and its growing use warrants continued appraisal to promote nursing knowledge development. Attention to all case study elements, processes and publication is important in promoting authenticity, methodological quality and visibility.
Ugena, L.; Moncayo, S.; Manzoor, S.; Rosales, D.
2016-01-01
The detection of adulteration of fuels and their use in criminal scenes such as arson is of high interest in forensic investigations. In this work, a method based on gas chromatography (GC) and neural networks (NN) has been developed and applied to the identification and discrimination of brands of fuels such as gasoline and diesel without the need to determine the composition of the samples. The study included five main brands of fuels from Spain, collected from fifteen different local petrol stations. The methodology allowed the identification of the gasoline and diesel brands with a high accuracy, close to 100%, without any false positives or false negatives. Success rates for three blind samples were 73.3%, 80%, and 100%, respectively. The results obtained demonstrate the potential of this methodology to help in resolving criminal situations. PMID:27375919
Saturno-Hernández, Pedro J; Gutiérrez-Reyes, Juan Pablo; Vieyra-Romero, Waldo Ivan; Romero-Martínez, Martín; O'Shea-Cuevas, Gabriel Jaime; Lozano-Herrera, Javier; Tavera-Martínez, Sonia; Hernández-Ávila, Mauricio
2016-01-01
To describe the conceptual framework and methods for implementation and analysis of the satisfaction survey of the Mexican System for Social Protection in Health. We analyze the methodological elements of the 2013, 2014 and 2015 surveys, including the instrument, sampling method and study design, conceptual framework, and the characteristics and indicators of the analysis. The survey captures information on perceived quality and satisfaction. Sampling is representative at the national and state levels. Simple and composite indicators (an index of satisfaction and a rate of reported quality problems) are built and described. The analysis is completed using Pareto diagrams, correlations between indicators, and associations with satisfaction by means of multivariate models. The measurement of satisfaction and perceived quality is a complex but necessary process for complying with regulations and identifying strategies for improvement. The described survey presents a rigorous design and analysis focused on its utility for driving improvement.
QESA: Quarantine Extraterrestrial Sample Analysis Methodology
NASA Astrophysics Data System (ADS)
Simionovici, A.; Lemelle, L.; Beck, P.; Fihman, F.; Tucoulou, R.; Kiryukhina, K.; Courtade, F.; Viso, M.
2018-04-01
Our nondestructive, nm-sized, hyperspectral analysis methodology of combined X-rays/Raman/IR probes in BSL4 quarantine, renders our patented mini-sample holder ideal for detecting extraterrestrial life. Our Stardust and Archean results validate it.
Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek
2013-12-20
Solid-phase microextraction techniques find increasing application in the sample preparation step before chromatographic determination of analytes in samples with a complex composition. These techniques integrate several operations, such as sample collection, extraction, enrichment of analytes above the detection limit of a given measuring instrument, and isolation of analytes from the sample matrix. This work presents information about novel methodological and instrumental solutions for different variants of solid-phase extraction techniques: solid-phase microextraction (SPME), stir-bar sorptive extraction (SBSE), and magnetic solid-phase extraction (MSPE), including practical applications of these techniques and a critical discussion of their advantages and disadvantages. The proposed solutions fulfill the requirements of sustainable development, and specifically of the implementation of green chemistry principles in analytical laboratories. Particular attention is therefore paid to possible uses of novel, selective stationary phases in extraction techniques, inter alia polymeric ionic liquids, carbon nanotubes, and silica- and carbon-based sorbents. These methodological solutions, together with properly matched sampling devices for collecting analytes from samples with varying matrix composition, make it possible to reduce the number of errors during sample preparation prior to chromatographic analysis and to limit the negative impact of this analytical step on the natural environment and the health of laboratory employees. Copyright © 2013 Elsevier B.V. All rights reserved.
Yu, Dan-Dan; Xie, Yan-Ming; Liao, Xing; Zhi, Ying-Jie; Jiang, Jun-Jie; Chen, Wei
2018-02-01
To evaluate the methodological quality and reporting quality of randomized controlled trials (RCTs) published in China Journal of Chinese Materia Medica, we searched CNKI and the China Journal of Chinese Materia Medica webpage to collect RCTs published since the establishment of the journal. The Cochrane risk of bias assessment tool was used to evaluate the methodological quality of the RCTs, and the CONSORT 2010 checklist was adopted as the reporting quality evaluation tool. Finally, 184 RCTs were included and evaluated methodologically, of which 97 RCTs were also evaluated for reporting quality. In the methodological evaluation, 62 trials (33.70%) reported random sequence generation; 9 (4.89%) reported allocation concealment; 25 (13.59%) adopted a method of blinding; 30 (16.30%) reported the number of patients withdrawing, dropping out, or lost to follow-up; 2 (1.09%) reported trial registration, and none reported a trial protocol; only 8 (4.35%) reported the sample size estimation in detail. In the reporting quality appraisal, 3 of the 25 reporting items were rated high quality: abstract, participant eligibility criteria, and statistical methods. Four items were of medium quality: purpose, intervention, random sequence method, and data collection sites and locations. Nine items were of low quality: title, background, random sequence type, allocation concealment, blinding, recruitment of subjects, baseline data, harms, and funding. The remaining items were of extremely low quality (compliance rate of the reporting item <10%). On the whole, the methodological and reporting quality of RCTs published in the journal is generally low. Further improvement in both methodological and reporting quality for RCTs of traditional Chinese medicine is warranted. It is recommended that international standards and procedures for RCT design be strictly followed to conduct high-quality trials.
At the same time, in order to improve the reporting quality of randomized controlled trials, CONSORT standards should be adopted in the preparation of research reports and submissions. Copyright© by the Chinese Pharmaceutical Association.
Northern Marshall Islands radiological survey: sampling and analysis summary
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robison, W.L.; Conrado, C.L.; Eagle, R.J.
1981-07-23
A radiological survey was conducted in the Northern Marshall Islands to document remaining external gamma exposures from nuclear tests conducted at Enewetak and Bikini Atolls. An additional program was later included to obtain terrestrial and marine samples for radiological dose assessment for current or potential atoll inhabitants. This report is the first of a series summarizing the results from the terrestrial and marine surveys. The sample collection and processing procedures and the general survey methodology are discussed, and a summary of the collected samples and radionuclide analyses is presented. Over 5400 samples were collected from the 12 atolls and 2 islands and prepared for analysis, including 3093 soil, 961 vegetation, 153 animal, 965 fish composite samples (average of 30 fish per sample), 101 clam, 50 lagoon water, 15 cistern water, 17 groundwater, and 85 lagoon sediment samples. A complete breakdown by sample type, atoll, and island is given. The total numbers of analyses by radionuclide are 8840 for 241Am, 6569 for 137Cs, 4535 for 239+240Pu, 4431 for 90Sr, 1146 for 238Pu, 269 for 241Pu, and 114 each for 239Pu and 240Pu. A complete breakdown by sample category, atoll or island, and radionuclide is also included.
Grey literature in meta-analyses.
Conn, Vicki S; Valentine, Jeffrey C; Cooper, Harris M; Rantz, Marilyn J
2003-01-01
In meta-analysis, researchers combine the results of individual studies to arrive at cumulative conclusions. Meta-analysts sometimes include "grey literature" in their evidential base, which includes unpublished studies and studies published outside widely available journals. Because grey literature is a source of data that might not employ peer review, critics have questioned the validity of its data and the results of meta-analyses that include it. The purpose of this article is to examine evidence regarding whether grey literature should be included in meta-analyses, and strategies to manage grey literature in quantitative synthesis. The article reviews evidence on whether the results of studies published in peer-reviewed journals are representative of results from broader samplings of research on a topic, as a rationale for the inclusion of grey literature. Strategies to enhance access to grey literature are addressed. The most consistent and robust difference between published and grey literature is that published research is more likely to contain results that are statistically significant. Effect size estimates of published research are about one-third larger than those of unpublished studies. Unfunded and small-sample studies are less likely to be published. Yet, importantly, methodological rigor does not differ between published and grey literature. Meta-analyses that exclude grey literature likely (a) over-represent studies with statistically significant findings, (b) inflate effect size estimates, and (c) provide less precise effect size estimates than meta-analyses including grey literature. Meta-analyses should include grey literature to fully reflect the existing evidential base and should assess the impact of methodological variations through moderator analysis.
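The pooling logic behind points (b) and (c) can be illustrated with a minimal fixed-effect (inverse-variance) sketch. The effect sizes and variances below are invented numbers chosen only to mimic the review's pattern (published studies larger, grey studies smaller), and the helper function is hypothetical, not taken from the article:

```python
# Hypothetical illustration of inverse-variance pooling with and without
# grey literature; all effect sizes and variances are invented numbers.

def pooled_effect(effects, variances):
    """Fixed-effect (inverse-variance weighted) mean effect and its standard error."""
    weights = [1.0 / v for v in variances]
    estimate = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = (1.0 / sum(weights)) ** 0.5
    return estimate, se

published = [(0.45, 0.04), (0.50, 0.05), (0.38, 0.03)]   # published: larger effects
grey = [(0.20, 0.06), (0.15, 0.08)]                      # grey: smaller effects

e_pub, v_pub = zip(*published)
e_all, v_all = zip(*(published + grey))

est_pub, se_pub = pooled_effect(e_pub, v_pub)
est_all, se_all = pooled_effect(e_all, v_all)

assert est_all < est_pub   # including grey literature shrinks the pooled effect
assert se_all < se_pub     # and extra studies add weight, shrinking the SE
```

Adding grey studies both pulls the pooled estimate down (illustrating inflation when they are excluded) and increases the total weight, which is why the exclusion also costs precision.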
Does Maltreatment Beget Maltreatment? A Systematic Review of the Intergenerational Literature
Thornberry, Terence P.; Knight, Kelly E.; Lovegrove, Peter J.
2014-01-01
In this paper, we critically review the literature testing the cycle of maltreatment hypothesis which posits continuity in maltreatment across adjacent generations. That is, we examine whether a history of maltreatment victimization is a significant risk factor for the later perpetration of maltreatment. We begin by establishing 11 methodological criteria that studies testing this hypothesis should meet. They include such basic standards as using representative samples, valid and reliable measures, prospective designs, and different reporters for each generation. We identify 47 studies that investigated this issue and then evaluate them with regard to the 11 methodological criteria. Overall, most of these studies report findings consistent with the cycle of maltreatment hypothesis. Unfortunately, at the same time, few of them satisfy the basic methodological criteria that we established; indeed, even the stronger studies in this area only meet about half of them. Moreover, the methodologically stronger studies present mixed support for the hypothesis. As a result, the positive association often reported in the literature appears to be based largely on the methodologically weaker designs. Based on our systematic methodological review, we conclude that this small and methodologically weak body of literature does not provide a definitive test of the cycle of maltreatment hypothesis. We conclude that it is imperative to develop more robust and methodologically adequate assessments of this hypothesis to more accurately inform the development of prevention and treatment programs. PMID:22673145
Martinez, Marie-José; Durand, Benoit; Calavas, Didier; Ducrot, Christian
2010-06-01
Demonstrating disease freedom is becoming important in different fields, including animal disease control. Most methods consider sampling only from a homogeneous population in which each animal has the same probability of becoming infected. In this paper, we propose a new methodology to calculate the probability of detecting the disease if it is present in a heterogeneous population of small size with potentially different risk groups, differences in risk being defined using relative risks. To calculate this probability, for each possible arrangement of the infected animals in the different groups, the probability that all the animals tested are test-negative given this arrangement is multiplied by the probability that this arrangement occurs. The probability formula is developed under the assumption of a perfect test and hypergeometric sampling for finite, small populations. The methodology is applied to scrapie, a disease affecting small ruminants and characterized in sheep by a strong genetic susceptibility defining different risk groups. It illustrates that the genotypes of the tested animals heavily influence the confidence level of detecting scrapie. The results present the statistical power for substantiating disease freedom in a small heterogeneous population as a function of the design prevalence, the structure of the sample tested, the structure of the herd and the associated relative risks. (c) 2010 Elsevier B.V. All rights reserved.
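The arrangement summation described above can be sketched as follows. This is a hedged illustration, not the authors' exact formula: the function name and interface are invented, and the arrangement probabilities are assumed to take a multivariate Fisher noncentral hypergeometric form (weight proportional to the product over groups of C(N_g, d_g) · r_g^d_g), which is one plausible way to encode relative risks:

```python
from itertools import product
from math import comb

def detection_probability(group_sizes, sample_sizes, relative_risks, n_infected):
    """P(at least one test-positive animal) with a perfect test, summing over
    all arrangements of the infected animals across the risk groups."""
    total_weight = 0.0
    miss_weight = 0.0  # weight of arrangements in which every sampled animal is negative
    # enumerate every way to place n_infected infected animals in the groups
    for d in product(*(range(min(n, n_infected) + 1) for n in group_sizes)):
        if sum(d) != n_infected:
            continue
        # arrangement weight (assumed Fisher noncentral hypergeometric form)
        w = 1.0
        for n_g, d_g, r_g in zip(group_sizes, d, relative_risks):
            w *= comb(n_g, d_g) * r_g ** d_g
        total_weight += w
        # P(all sampled animals test negative | arrangement): hypergeometric per group
        p_miss = 1.0
        for n_g, s_g, d_g in zip(group_sizes, sample_sizes, d):
            p_miss *= comb(n_g - d_g, s_g) / comb(n_g, s_g)
        miss_weight += w * p_miss
    return 1.0 - miss_weight / total_weight
```

With a single group and relative risk 1 this reduces to the classical homogeneous result, 1 - C(N-d, n)/C(N, n), for sampling n animals from N of which d are infected with a perfect test.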
Methodological issues in the study of violence against women
Ruiz‐Pérez, Isabel; Plazaola‐Castaño, Juncal; Vives‐Cases, Carmen
2007-01-01
The objective of this paper is to review the methodological issues that arise when studying violence against women as a public health problem, focusing on intimate partner violence (IPV), since this is the form of violence that has the greatest consequences at a social and political level. The paper focuses first on the problems of defining what is meant by IPV. Secondly, the paper describes the difficulties in assessing the magnitude of the problem. Obtaining reliable data on this type of violence is a complex task, because of the methodological issues derived from the very nature of the phenomenon, such as the private, intimate context in which this violence often takes place, which means the problem cannot be directly observed. Finally, the paper examines the limitations and bias in research on violence, including the lack of consensus with regard to measuring events that may or may not represent a risk factor for violence against women or the methodological problem related to the type of sampling used in both aetiological and prevalence studies. PMID:18000113
Code of Federal Regulations, 2010 CFR
2010-10-01
... by ACF statistical staff from the Adoption and Foster Care Analysis and Reporting System (AFCARS... primary review utilizing probability sampling methodologies. Usually, the chosen methodology will be simple random sampling, but other probability samples may be utilized, when necessary and appropriate. (3...
Methodological Challenges in Physical Activity Research with Older Adults
Chase, Jo-Ana D.
2015-01-01
The aging adult population is growing, as well as the incidence of chronic illness among older adults. Physical activity has been demonstrated in the literature to be a beneficial component of self-management for chronic illnesses commonly found in the older adult population. Health sciences research seeks to develop new knowledge, practices, and policies that may benefit older adults’ management of chronic illness and quality of life. However, research with the older adult population, though beneficial, includes potential methodological challenges specific to this age group. This article discusses common methodological issues in research among older adults, with a focus on physical activity intervention studies. Awareness and understanding of these issues may facilitate future development of research studies devoted to the aging adult population, through appropriate modification and tailoring of sampling techniques, intervention development, and data measures and collection. PMID:21821726
Grabitz, Clara R; Button, Katherine S; Munafò, Marcus R; Newbury, Dianne F; Pernet, Cyril R; Thompson, Paul A; Bishop, Dorothy V M
2018-01-01
Genetics and neuroscience are two areas of science that pose particular methodological problems because they involve detecting weak signals (i.e., small effects) in noisy data. In recent years, increasing numbers of studies have attempted to bridge these disciplines by looking for genetic factors associated with individual differences in behavior, cognition, and brain structure or function. However, different methodological approaches to guarding against false positives have evolved in the two disciplines. To explore methodological issues affecting neurogenetic studies, we conducted an in-depth analysis of 30 consecutive articles in 12 top neuroscience journals that reported on genetic associations in nonclinical human samples. It was often difficult to estimate effect sizes in neuroimaging paradigms. Where effect sizes could be calculated, the studies reporting the largest effect sizes tended to have two features: (i) they had the smallest samples and were generally underpowered to detect genetic effects, and (ii) they did not fully correct for multiple comparisons. Furthermore, only a minority of studies used statistical methods for multiple comparisons that took into account correlations between phenotypes or genotypes, and only nine studies included a replication sample or explicitly set out to replicate a prior finding. Finally, presentation of methodological information was not standardized and was often distributed across Methods sections and Supplementary Material, making it challenging to assemble basic information from many studies. Space limits imposed by journals could mean that highly complex statistical methods were described in only a superficial fashion. In summary, methods that have become standard in the genetics literature (stringent statistical standards, use of large samples, and replication of findings) are not always adopted when behavioral, cognitive, or neuroimaging phenotypes are used, leading to an increased risk of false-positive findings.
Studies need to correct not just for the number of phenotypes collected but also for the number of genotypes examined, genetic models tested, and subsamples investigated. The field would benefit from more widespread use of methods that take into account correlations between the factors corrected for, such as spectral decomposition, or permutation approaches. Replication should become standard practice; this, together with the need for larger sample sizes, will entail greater emphasis on collaboration between research groups. We conclude with some specific suggestions for standardized reporting in this area.
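As a concrete sketch of the permutation approach recommended above (an illustration under assumptions, not code from any of the reviewed studies; the function name and interface are invented): a max-statistic ("maxT") permutation test applies each permutation of the genotype labels to all phenotypes at once, so the null distribution of the maximum statistic automatically reflects the correlations between phenotypes:

```python
import numpy as np

def maxT_permutation_pvalues(x, Y, n_perm=5000, seed=0):
    """Family-wise-error corrected p-values for the association between one
    genotype vector x (shape (n,)) and m correlated phenotypes Y (shape (n, m)),
    using the max-|correlation| ("maxT") permutation approach. Every phenotype
    column shares each permutation, so correlations among phenotypes are
    respected by the null distribution of the maximum."""
    rng = np.random.default_rng(seed)
    n, m = Y.shape
    xc = (x - x.mean()) / x.std()                  # standardize genotype scores
    Yc = (Y - Y.mean(axis=0)) / Y.std(axis=0)      # standardize each phenotype
    obs = np.abs(xc @ Yc) / n                      # observed |r| per phenotype
    max_null = np.empty(n_perm)
    for b in range(n_perm):
        perm = rng.permutation(n)                  # permute genotype labels only
        max_null[b] = np.max(np.abs(xc[perm] @ Yc) / n)
    # corrected p-value: how often the permutation maximum reaches each observed |r|
    return (1 + (max_null[None, :] >= obs[:, None]).sum(axis=1)) / (1 + n_perm)
```

Because the same permutation is shared across phenotypes, this correction is less conservative than Bonferroni when the phenotypes are correlated, while still controlling the family-wise error rate.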
ERIC Educational Resources Information Center
Alders, Amanda
2011-01-01
This article describes the methodology, data analysis, and results for a pilot study investigating perceived self-efficacy of cognitive performance among Latino American elderly. The sample included 24 Latino American elderly. A 12-week quasi-experimental design was utilized. Participants were provided with weekly 2-hr art education sessions…
A Mapping of Participation Rates in Junior Sport in the Australian Capital Territory.
ERIC Educational Resources Information Center
Clough, J. R.; Traill, R. D.
This executive summary discusses the methodology and findings of a survey of participation in sport by school-age young people in the Australian Capital Territory school system. The sample included 525 males and 523 females in grades kindergarten to 12. The survey assessed participation in 25 sports in 4 different contexts (playing sport in…
Successful Principalship in Norway: Sustainable Ethos and Incremental Changes?
ERIC Educational Resources Information Center
Moller, Jorunn; Vedoy, Gunn; Presthus, Anne Marie; Skedsmo, Guri
2009-01-01
Purpose: The purpose of this paper is to explore whether and how success has been sustained over time in schools which were identified as being successful five years ago. Design/methodology/approach: Three schools were selected for a revisit, and the sample included two combined schools (grade 1-10) and one upper secondary school (grade 11-13). In…
Prevalence of Childhood Sexual Abuse among Incarcerated Males in County Jail
ERIC Educational Resources Information Center
Johnson, Regina J.; Ross, Michael W.; Taylor, Wendell C.; Williams, Mark L.; Carvajal, Raul I.; Peters, Ronald J.
2006-01-01
Objective: The current study examined the prevalence and characteristics of childhood sexual abuse in a jail-based population. Methodology: A retrospective, self-reported survey was administered over an 8-week period to a random sample of 100 men who were incarcerated in a county jail in Southeastern Texas. The survey included questions about…
ERIC Educational Resources Information Center
Stokeld, Cheryl L.
This paper reviews literature on the adult outcomes for children diagnosed with attention deficit/hyperactivity disorder (AD/HD). It critiques methodological issues, including diagnostic definitions, research designs, sample characteristics, and assessment instruments. It examines the relationship of AD/HD to a variety of adult disorders and…
ERIC Educational Resources Information Center
Davis, G.; O'Callaghan, F.; Knox, K.
2009-01-01
Purpose: The purpose of this paper is to characterise sustainable attitudes and behaviours (including recycling and waste minimisation, energy efficiency, water conservation and "green" purchasing) amongst non-academic staff within Griffith University, Queensland. Design/methodology/approach: For this study, the attitudes and…
One-to-One Computing and Student Achievement in Ohio High Schools
ERIC Educational Resources Information Center
Williams, Nancy L.; Larwin, Karen H.
2016-01-01
This study explores the impact of one-to-one computing on student achievement in Ohio high schools as measured by performance on the Ohio Graduation Test (OGT). The sample included 24 treatment schools that were individually paired with a similar control school. An interrupted time series methodology was deployed to examine OGT data over a period…
Nessen, Merel A; van der Zwaan, Dennis J; Grevers, Sander; Dalebout, Hans; Staats, Martijn; Kok, Esther; Palmblad, Magnus
2016-05-11
Proteomics methodology has seen increased application in food authentication, including tandem mass spectrometry of targeted species-specific peptides in raw, processed, or mixed food products. We have previously described an alternative principle that uses untargeted data acquisition and spectral library matching, essentially spectral counting, to compare and identify samples without the need for genomic sequence information in food species populations. Here, we present an interlaboratory comparison demonstrating how a method based on this principle performs in a realistic context. We also increasingly challenge the method by using data from different types of mass spectrometers, by trying to distinguish closely related and commercially important flatfish, and by analyzing heavily contaminated samples. The method was found to be robust in different laboratories, and 94-97% of the analyzed samples were correctly identified, including all processed and contaminated samples.
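The spectral-library-matching principle described above can be illustrated with a toy sketch. All spectrum identifiers, counts, and species names below are invented, and the real method operates on tandem mass spectra rather than these simplified count profiles: each sample is represented by counts of matched spectra and assigned to the library entry with the highest similarity.

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two spectral-count profiles."""
    dot = sum(a[k] * b[k] for k in a.keys() & b.keys())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)

# Hypothetical spectral counts (spectrum identifier -> count) for a query
# sample and two library species; the query is assigned to the best match.
query = Counter({"s1": 10, "s2": 4, "s3": 1})
library = {
    "sole":   Counter({"s1": 9, "s2": 5}),
    "plaice": Counter({"s3": 8, "s4": 6}),
}
best = max(library, key=lambda sp: cosine(query, library[sp]))
```

The appeal of this untargeted comparison is visible even in the toy case: no species-specific peptide needs to be chosen in advance, only a library of reference profiles.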
Evaluation of ridesharing programs in Michigan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kulp, G.; Tsao, H.J.; Webber, R.E.
1982-10-01
The design, implementation, and results of a carpool and vanpool evaluation are described. Objectives of the evaluation were: to develop credible estimates of the energy savings attributable to the ridesharing program, to provide information for improving the performance of the ridesharing program, and to add to a general understanding of the ridesharing process. Previous evaluation work is critiqued and the research methodology adopted for this study is discussed. The ridesharing program in Michigan is described and the basis for selecting Michigan as the evaluation site is discussed. The evaluation methodology is presented, including research design, sampling procedure, data collection, and data validation. Evaluation results are analyzed. (LEW)
Oliver, Penelope; Cicerale, Sara; Pang, Edwin; Keast, Russell
2018-04-01
Temporal dominance of sensations (TDS) is a rapid descriptive method that offers a different magnitude of information to traditional descriptive analysis methodologies. This methodology considers the dynamic nature of eating, assessing sensory perception of foods as they change throughout the eating event. Limited research has applied the TDS methodology to strawberries and subsequently validated the results against Quantitative Descriptive Analysis (QDA™). The aim of this research is to compare the TDS methodology using an untrained consumer panel to the results obtained via QDA™ with a trained sensory panel. The trained panelists (n = 12, minimum 60 hr each panelist) were provided with six strawberry samples (three cultivars at two maturation levels) and applied QDA™ techniques to profile each strawberry sample. Untrained consumers (n = 103) were provided with six strawberry samples (three cultivars at two maturation levels) and required to use TDS methodology to assess the dominant sensations for each sample as they change over time. Results revealed moderately comparable product configurations produced via TDS in comparison to QDA™ (RV coefficient = 0.559), as well as similar application of the sweet attribute (correlation coefficient of 0.895 at first bite). The TDS methodology, however, was not in agreement with the QDA™ methodology regarding more complex flavor terms. These findings support the notion that the lack of training on the definition of terms, together with the methodology's design of ignoring all attributes other than the dominant ones, provides a different magnitude of information than the QDA™ methodology. A comparison of TDS to traditional descriptive analysis indicates that TDS provides additional information to QDA™ regarding the lingering component of eating. The QDA™ results, however, provide more precise detail regarding singular attributes.
Therefore, the TDS methodology has an application in industry when it is important to understand the lingering profile of products. However, this methodology should not be employed as a replacement to traditional descriptive analysis methods. © 2018 Institute of Food Technologists®.
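The RV coefficient reported above (0.559) compares the TDS and QDA™ product configurations. It measures the similarity of two multivariate configurations of the same set of samples; a minimal sketch, using random illustrative matrices rather than the study's data, is:

```python
import numpy as np

def rv_coefficient(X, Y):
    """RV coefficient between two configurations (rows = the same samples)."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    Sx, Sy = X @ X.T, Y @ Y.T
    return np.trace(Sx @ Sy) / np.sqrt(np.trace(Sx @ Sx) * np.trace(Sy @ Sy))

# Six samples (e.g., three cultivars at two maturation levels) described by
# two hypothetical attribute sets, one derived from the other
rng = np.random.default_rng(0)
config_a = rng.standard_normal((6, 4))
config_b = config_a @ rng.standard_normal((4, 3))  # related configuration
```

An RV of 1 indicates identical configurations up to rotation and scaling; values near 0.5, as in the study, indicate only moderate agreement between the two panels' product maps.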
Auditing of chromatographic data.
Mabie, J T
1998-01-01
During a data audit, it is important to ensure that there is clear documentation and an audit trail. The Quality Assurance Unit should review all areas, including the laboratory, during the conduct of the sample analyses. The analytical methodology that is developed should be documented prior to sample analyses. This is an important document for the auditor, as it is the instrumental piece used by the laboratory personnel to maintain integrity throughout the process. It is expected that this document will give insight into the sample analysis, run controls, run sequencing, instrument parameters, and acceptance criteria for the samples. The sample analysis and all supporting documentation should be audited in conjunction with this written analytical method and any supporting Standard Operating Procedures to ensure the quality and integrity of the data.
Updated methodology for nuclear magnetic resonance characterization of shales
NASA Astrophysics Data System (ADS)
Washburn, Kathryn E.; Birdwell, Justin E.
2013-08-01
Unconventional petroleum resources, particularly in shales, are expected to play an increasingly important role in the world's energy portfolio in the coming years. Nuclear magnetic resonance (NMR), particularly at low-field, provides important information in the evaluation of shale resources. Most of the low-field NMR analyses performed on shale samples rely heavily on standard T1 and T2 measurements. We present a new approach using solid echoes in the measurement of T1 and T1-T2 correlations that addresses some of the challenges encountered when making NMR measurements on shale samples compared to conventional reservoir rocks. Combining these techniques with standard T1 and T2 measurements provides a more complete assessment of the hydrogen-bearing constituents (e.g., bitumen, kerogen, clay-bound water) in shale samples. These methods are applied to immature and pyrolyzed oil shale samples to examine the solid and highly viscous organic phases present during the petroleum generation process. The solid echo measurements produce additional signal in the oil shale samples compared to the standard methodologies, indicating the presence of components undergoing homonuclear dipolar coupling. The results presented here include the first low-field NMR measurements performed on kerogen as well as detailed NMR analysis of highly viscous thermally generated bitumen present in pyrolyzed oil shale.
Orellano-Colón, Elsa M.; Jutai, Jeffrey; Santiago, Angélica; Torres, Víctor; Benítez, Keyla; Torres, Mayra
2016-01-01
(1) Knowledge about the assistive technology (AT) needs and psychosocial impact of AT in different populations is needed because the adoption, retention, or abandonment of AT may be influenced by the psychosocial impact that AT has on its users. The aims of this study were to: (a) identify the AT needs of a sample of Hispanic older adults with functional limitations, (b) describe the psychosocial impact of these technologies on the sample’s quality of life, and (c) describe the methodological challenges in using the Puerto Rican version of the Psychosocial Impact of Assistive Device Scale (PR-PIADS) with a Hispanic sample. (2) Methods: This study used a cross-sectional design conducted with a sample of 60 participants. Data was collected using the Assistive Technology Card Assessment Questionnaire (ATCAQ) and the PR-PIADS. Data analyses included descriptive statistics and bivariate analysis. (3) Results: The sample’s most frequently reported needs for AT devices were in the areas of cooking, home tasks, and home safety activities. The sample reported a positive impact of AT use in their quality of life. Several methodological challenges of the PIADS were identified. (4) Conclusions: The sample has unmet needs for using AT devices to overcome difficulties in daily living activities. PMID:27695688
Vergani, Stefano; Korsunsky, Ilya; Mazzarello, Andrea Nicola; Ferrer, Gerardo; Chiorazzi, Nicholas; Bagnara, Davide
2017-01-01
Efficient and accurate high-throughput DNA sequencing of the adaptive immune receptor repertoire (AIRR) is necessary to study immune diversity in healthy subjects and disease-related conditions. The high complexity and diversity of the AIRR, coupled with the limited amount of starting material that can compromise identification of the full biological diversity, make such sequencing particularly challenging. AIRR sequencing protocols often fail to fully capture the sampled AIRR diversity, especially for samples containing restricted numbers of B lymphocytes. Here, we describe a library preparation method for immunoglobulin sequencing that results in an exhaustive full-length repertoire where virtually every sampled B-cell is sequenced. This maximizes the likelihood of identifying and quantifying the entire IGHV-D-J repertoire of a sample, including the detection of rearrangements present in only one cell in the starting population. The methodology establishes the importance of circumventing genetic material dilution in the preamplification phases and incorporates several key concepts: (1) balancing the starting material amount and depth of sequencing, (2) avoiding IGHV gene-specific amplification, and (3) using Unique Molecular Identifiers. Together, this methodology is highly efficient, in particular for detecting rare rearrangements in the sampled population and when only a limited amount of starting material is available.
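The Unique Molecular Identifier (UMI) idea in point (3) can be sketched with a toy example; the UMIs and rearrangement names below are invented. Reads carrying the same UMI are PCR copies of a single starting molecule, so collapsing reads by UMI recovers molecule counts rather than amplification-inflated read counts.

```python
from collections import defaultdict

# Toy reads: (UMI, rearrangement). Reads sharing a UMI derive from the
# same starting molecule and collapse to one count.
reads = [
    ("AACG", "IGHV1-D2-J4"),
    ("AACG", "IGHV1-D2-J4"),   # PCR duplicate of the first molecule
    ("TTGC", "IGHV1-D2-J4"),   # same rearrangement, distinct molecule
    ("GGAT", "IGHV3-D1-J6"),
]

molecules = defaultdict(set)
for umi, rearr in reads:
    molecules[rearr].add(umi)

# Molecule counts per rearrangement after UMI collapsing
counts = {rearr: len(umis) for rearr, umis in molecules.items()}
```

This is why UMIs matter for detecting rearrangements present in only one cell: a single molecule amplified a thousand-fold still counts once.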
Mushkudiani, Nino A; Hukkelhoven, Chantal W P M; Hernández, Adrián V; Murray, Gordon D; Choi, Sung C; Maas, Andrew I R; Steyerberg, Ewout W
2008-04-01
To describe the modeling techniques used for early prediction of outcome in traumatic brain injury (TBI) and to identify aspects for potential improvements. We reviewed key methodological aspects of studies published between 1970 and 2005 that proposed a prognostic model for the Glasgow Outcome Scale of TBI based on admission data. We included 31 papers. Twenty-four were single-center studies, and 22 reported on fewer than 500 patients. The median of the number of initially considered predictors was eight, and on average five of these were selected for the prognostic model, generally including age, Glasgow Coma Score (or only motor score), and pupillary reactivity. The most common statistical technique was logistic regression with stepwise selection of predictors. Model performance was often quantified by accuracy rate rather than by more appropriate measures such as the area under the receiver-operating characteristic curve. Model validity was addressed in 15 studies, but mostly used a simple split-sample approach, and external validation was performed in only four studies. Although most models agree on the three most important predictors, many were developed on small sample sizes within single centers and hence lack generalizability. Modeling strategies have to be improved, and include external validation.
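The area under the receiver-operating characteristic curve, which the review recommends over raw accuracy, can be computed directly from predicted risks via the Mann-Whitney statistic. A minimal sketch with invented scores:

```python
def auc(scores_pos, scores_neg):
    """AUC = probability that a randomly chosen positive case is ranked
    above a randomly chosen negative case (ties count one half)."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical predicted probabilities of unfavourable outcome
unfavourable = [0.9, 0.8, 0.6]   # patients with unfavourable outcome
favourable = [0.7, 0.3, 0.2]     # patients with favourable outcome
```

Unlike accuracy, this measure does not depend on an arbitrary classification threshold, which is why it is the more appropriate summary of discrimination for prognostic models.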
Konz, Tobias; Migliavacca, Eugenia; Dayon, Loïc; Bowman, Gene; Oikonomidi, Aikaterini; Popp, Julius; Rezzi, Serge
2017-05-05
We here describe the development, validation and application of a quantitative methodology for the simultaneous determination of 29 elements in human serum using state-of-the-art inductively coupled plasma triple quadrupole mass spectrometry (ICP-MS/MS). This new methodology offers high-throughput elemental profiling using simple dilution of minimal quantity of serum samples. We report the outcomes of the validation procedure including limits of detection/quantification, linearity of calibration curves, precision, recovery and measurement uncertainty. ICP-MS/MS-based ionomics was used to analyze human serum of 120 older adults. Following a metabolomic data mining approach, the generated ionome profiles were subjected to principal component analysis revealing gender and age-specific differences. The ionome of female individuals was marked by higher levels of calcium, phosphorus, copper and copper to zinc ratio, while iron concentration was lower with respect to male subjects. Age was associated with lower concentrations of zinc. These findings were complemented with additional readouts to interpret micronutrient status including ceruloplasmin, ferritin and inorganic phosphate. Our data supports a gender-specific compartmentalization of the ionome that may reflect different bone remodelling in female individuals. Our ICP-MS/MS methodology enriches the panel of validated "Omics" approaches to study molecular relationships between the exposome and the ionome in relation with nutrition and health.
Kennedy, Gordon J; Afeworki, Mobae; Calabro, David C; Chase, Clarence E; Smiley, Randolph J
2004-06-01
Distinct hydrogen species are present in important inorganic solids such as zeolites, silicoaluminophosphates (SAPOs), mesoporous materials, amorphous silicas, and aluminas. These H species include hydrogens associated with acidic sites such as Al(OH)Si, non-framework aluminum sites, silanols, and surface functionalities. Direct and quantitative methodology to identify, measure, and monitor these hydrogen species is key to monitoring catalyst activity, optimizing synthesis conditions, tracking post-synthesis structural modifications, and preparing novel catalytic materials. Several techniques have been developed to address these issues, including 1H MAS NMR (magic-angle spinning nuclear magnetic resonance). 1H MAS NMR offers many potential advantages over other techniques, but care is needed in recognizing experimental limitations and developing sample handling and NMR methodology to obtain quantitatively reliable data. A simplified approach is described that permits vacuum dehydration of multiple samples simultaneously and directly in the MAS rotor without the need for epoxy, flame sealing, or extensive glovebox use. We have found that careful optimization of important NMR conditions, such as magnetic field homogeneity and magic angle setting, is necessary to acquire quantitative, high-resolution spectra that accurately measure the concentrations of the different hydrogen species present. Details of this 1H MAS NMR methodology with representative applications to zeolites, SAPOs, M41S, and silicas as a function of synthesis conditions and post-synthesis treatments (i.e., steaming, thermal dehydroxylation, and functionalization) are presented.
A methodology for the semi-automatic digital image analysis of fragmental impactites
NASA Astrophysics Data System (ADS)
Chanou, A.; Osinski, G. R.; Grieve, R. A. F.
2014-04-01
A semi-automated digital image analysis method is developed for the comparative textural study of impact melt-bearing breccias. This method uses the freeware software ImageJ developed by the National Institute of Health (NIH). Digital image analysis is performed on scans of hand samples (10-15 cm across), based on macroscopic interpretations of the rock components. All image processing and segmentation are done semi-automatically, with the least possible manual intervention. The areal fraction of components is estimated and modal abundances can be deduced, where the physical optical properties (e.g., contrast, color) of the samples allow it. Other parameters that can be measured include, for example, clast size, clast-preferred orientations, average box-counting dimension or fragment shape complexity, and nearest neighbor distances (NnD). This semi-automated method allows the analysis of a larger number of samples in a relatively short time. Textures, granulometry, and shape descriptors are of considerable importance in rock characterization. The methodology is used to determine the variations of the physical characteristics of some examples of fragmental impactites.
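The areal-fraction step of this kind of component segmentation can be sketched outside ImageJ as well. The following uses a synthetic grayscale image and a fixed brightness threshold purely for illustration; the study's actual segmentation is semi-automatic and guided by macroscopic interpretation of the rock components.

```python
import numpy as np

# Hypothetical grayscale scan of a hand sample (0-255 intensities)
rng = np.random.default_rng(1)
image = rng.integers(0, 256, size=(200, 200))

# Segment "clasts" as pixels brighter than a fixed threshold
threshold = 180
clast_mask = image > threshold

# Areal fraction of the clast component (area percent of the scan)
areal_fraction = 100 * clast_mask.mean()
```

From the same binary mask, connected-component labelling would then give per-clast measurements such as size, orientation, and nearest-neighbor distances.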
The use of microtomography in structural geology: A new methodology to analyse fault faces
NASA Astrophysics Data System (ADS)
Jacques, Patricia D.; Nummer, Alexis Rosa; Heck, Richard J.; Machado, Rômulo
2014-09-01
This paper describes a new methodology to kinematically analyze faults in microscale dimensions (voxel size = 40 μm), using images obtained by X-ray computed microtomography (μCT). The equipment used is a GE MS8x-130 scanner. The methodology was developed using rock samples from Santa Catarina State, Brazil, and involves constructing micro Digital Elevation Models (μDEMs) of the fault surface for analysing microscale brittle structures, including striations, roughness and steps. Shaded relief images were created for the μDEMs, which enabled the generation of profiles to classify the secondary structures associated with the main fault surface. In the case of a sample with mineral growth that covers the fault surface, it is possible to detect the kinematic geometry even with the mineral cover. This technique proved to be useful for determining the sense of movement of faults, especially when it is not possible to determine striations in macro or microscopic analysis. When the sample has a mineral deposit on the surface (mineral cover), this technique allows a relative chronology and geometric characterization between the faults with and without covering.
Zhang, Liding; Wei, Qiujiang; Han, Qinqin; Chen, Qiang; Tai, Wenlin; Zhang, Jinyang; Song, Yuzhu; Xia, Xueshan
2018-01-01
Shigella is an important human food-borne zoonosis bacterial pathogen, and can cause clinically severe diarrhea. There is an urgent need to develop a specific, sensitive, and rapid methodology for detection of this pathogen. In this study, loop-mediated isothermal amplification (LAMP) combined with magnetic immunocapture assay (IC-LAMP) was first developed for the detection of Shigella in pure culture, artificial milk, and clinical stool samples. This method exhibited a detection limit of 8.7 CFU/mL. Compared with polymerase chain reaction, IC-LAMP is sensitive, specific, and reliable for monitoring Shigella. Additionally, IC-LAMP is more convenient, efficient, and rapid than ordinary LAMP, as it more efficiently enriches pathogen cells without extraction of genomic DNA. Under isothermal conditions, the amplification curves and the green fluorescence were detected within 30 min in the presence of genomic DNA template. The overall analysis time was approximately 1 h, including the enrichment and lysis of the bacterial cells, a notably short detection time. Therefore, the IC-LAMP methodology described here is potentially useful for the efficient detection of Shigella in various samples. PMID:29467730
Paksi, Borbala; Demetrovics, Zsolt; Magi, Anna; Felvinczi, Katalin
2017-06-01
This paper introduces the methods and methodological findings of the National Survey on Addiction Problems in Hungary (NSAPH 2015). Use patterns of smoking, alcohol use and other psychoactive substances were measured as well as that of certain behavioural addictions (problematic gambling - PGSI, DSM-V, eating disorders - SCOFF, problematic internet use - PIUQ, problematic on-line gaming - POGO, problematic social media use - FAS, exercise addictions - EAI-HU, work addiction - BWAS, compulsive buying - CBS). The paper describes the applied measurement techniques, sample selection, recruitment of respondents and the data collection strategy as well. Methodological results of the survey including reliability and validity of the measures are reported. The NSAPH 2015 research was carried out on a nationally representative sample of the Hungarian adult population aged 16-64 yrs (gross sample 2477, net sample 2274 persons) with the age group of 18-34 being overrepresented. Statistical analysis of the weight-distribution suggests that weighting did not create any artificial distortion in the database, leaving the representativeness of the sample unaffected. The size of the weighted sample of the 18-64 years old adult population is 1490 persons. The extent of the theoretical margin of error in the weighted sample is ±2.5%, at a reliability level of 95%, which is in line with the original data collection plans. Based on the analysis of reliability and the extent of errors beyond sampling within the context of the database, we conclude that inconsistencies create relatively minor distortions in cumulative prevalence rates; consequently, the database makes possible the reliable estimation of risk factors related to different substance use behaviours. The reliability indexes of measurements used for prevalence estimates of behavioural addictions proved to be appropriate, though the psychometric features in some cases suggest the presence of redundant items.
The comparison of parameters of errors beyond sample selection in the current and previous data collections indicates that trend estimates and their interpretation require particular attention, and in some cases correction procedures might even become necessary.
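The quoted theoretical margin of error is consistent with the standard large-sample formula for a proportion at 95% confidence; as a quick check with the reported weighted sample size (worst case p = 0.5):

```python
import math

n = 1490  # weighted sample size of the 18-64-year-old adult population
# 95% margin of error for an estimated proportion, worst case p = 0.5
moe = 1.96 * math.sqrt(0.5 * 0.5 / n)
print(round(100 * moe, 1))  # prints 2.5 (percentage points)
```

The worst-case p = 0.5 convention gives the largest margin any single prevalence estimate from this sample can have; estimates far from 50% have smaller margins.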
Sampling in freshwater environments: suspended particle traps and variability in the final data.
Barbizzi, Sabrina; Pati, Alessandra
2008-11-01
This paper reports a practical method to estimate measurement uncertainty, including sampling, derived from the approach implemented by Ramsey for soil investigations. The methodology has been applied to estimate the measurement uncertainty (sampling and analysis) of (137)Cs activity concentration (Bq kg(-1)) and total carbon content (%) in suspended particle sampling in a freshwater ecosystem. Uncertainty estimates for between-location, sampling and analysis components have been evaluated. For the considered measurands, the relative expanded measurement uncertainties are 12.3% for (137)Cs and 4.5% for total carbon. For (137)Cs, the measurement (sampling + analysis) variance gives the major contribution to the total variance, while for total carbon the spatial variance is the dominant contributor to the total variance. The limitations and advantages of this basic method are discussed.
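The duplicate-based variance partition underlying Ramsey's approach can be sketched as follows. The design (duplicate samples per location, duplicate analyses per sample) and the activity values are invented for illustration; the classical balanced-nested-ANOVA expectations give the analysis, sampling, and between-location components.

```python
import numpy as np

# Duplicate design: at each location take two samples, analyse each twice.
# Hypothetical 137Cs activities (Bq/kg); shape (locations, samples, analyses).
x = np.array([
    [[10.1, 10.3], [11.0, 10.8]],
    [[12.5, 12.2], [11.9, 12.4]],
    [[ 9.4,  9.6], [ 9.9,  9.7]],
    [[13.0, 13.3], [12.6, 12.8]],
])

# Analysis variance: spread between duplicate analyses of the same sample
s2_analysis = np.mean(np.var(x, axis=2, ddof=1))
# Sampling variance: spread between duplicate samples, minus the analysis
# contribution to each sample mean (mean of 2 analyses)
sample_means = x.mean(axis=2)
s2_sampling = max(np.mean(np.var(sample_means, axis=1, ddof=1))
                  - s2_analysis / 2, 0.0)
# Between-location (spatial) variance, removing both lower-level contributions
loc_means = sample_means.mean(axis=1)
s2_between = max(np.var(loc_means, ddof=1)
                 - s2_sampling / 2 - s2_analysis / 4, 0.0)

s2_meas = s2_sampling + s2_analysis  # measurement (sampling + analysis) variance
# Relative expanded measurement uncertainty (coverage factor k = 2)
U_rel = 100 * 2 * np.sqrt(s2_meas) / x.mean()
```

Comparing `s2_meas` against `s2_between`, as the paper does for (137)Cs versus total carbon, shows whether the measurement process or the spatial heterogeneity dominates the total variance.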
Lores, Marta; Llompart, Maria; Alvarez-Rivera, Gerardo; Guerra, Eugenia; Vila, Marlene; Celeiro, Maria; Lamas, J Pablo; Garcia-Jares, Carmen
2016-04-07
Cosmetic products placed on the market and their ingredients, must be safe under reasonable conditions of use, in accordance to the current legislation. Therefore, regulated and allowed chemical substances must meet the regulatory criteria to be used as ingredients in cosmetics and personal care products, and adequate analytical methodology is needed to evaluate the degree of compliance. This article reviews the most recent methods (2005-2015) used for the extraction and the analytical determination of the ingredients included in the positive lists of the European Regulation of Cosmetic Products (EC 1223/2009): comprising colorants, preservatives and UV filters. It summarizes the analytical properties of the most relevant analytical methods along with the possibilities of fulfilment of the current regulatory issues. The cosmetic legislation is frequently being updated; consequently, the analytical methodology must be constantly revised and improved to meet safety requirements. The article highlights the most important advances in analytical methodology for cosmetics control, both in relation to the sample pretreatment and extraction and the different instrumental approaches developed to solve this challenge. Cosmetics are complex samples, and most of them require a sample pretreatment before analysis. In recent years, research covering this aspect has tended toward the use of green extraction and microextraction techniques. Analytical methods were generally based on liquid chromatography with UV detection, and gas and liquid chromatographic techniques hyphenated with single or tandem mass spectrometry; but some interesting proposals based on electrophoresis have also been reported, together with some electroanalytical approaches. Regarding the number of ingredients considered for analytical control, single analyte methods have been proposed, although the most useful ones in real-life cosmetic analysis are the multianalyte approaches.
Copyright © 2016 Elsevier B.V. All rights reserved.
Saltaji, Humam; Armijo-Olivo, Susan; Cummings, Greta G; Amin, Maryam; Flores-Mir, Carlos
2014-02-25
It is fundamental that randomised controlled trials (RCTs) are properly conducted in order to reach well-supported conclusions. However, there is emerging evidence that RCTs are subject to biases which can overestimate or underestimate the true treatment effect, due to flaws in the study design characteristics of such trials. The extent to which this holds true in oral health RCTs, which have some unique design characteristics compared to RCTs in other health fields, is unclear. As such, we aim to examine the empirical evidence quantifying the extent of bias associated with methodological and non-methodological characteristics in oral health RCTs. We plan to perform a meta-epidemiological study, where a sample size of 60 meta-analyses (MAs) including approximately 600 RCTs will be selected. The MAs will be randomly obtained from the Oral Health Database of Systematic Reviews using a random number table; and will be considered for inclusion if they include a minimum of five RCTs, and examine a therapeutic intervention related to one of the recognised dental specialties. RCTs identified in selected MAs will be subsequently included if their study design includes a comparison between an intervention group and a placebo group or another intervention group. Data will be extracted from selected trials included in MAs based on a number of methodological and non-methodological characteristics. Moreover, the risk of bias will be assessed using the Cochrane Risk of Bias tool. Effect size estimates and measures of variability for the main outcome will be extracted from each RCT included in selected MAs, and a two-level analysis will be conducted using a meta-meta-analytic approach with a random effects model to allow for intra-MA and inter-MA heterogeneity. The intended audiences of the findings will include dental clinicians, oral health researchers, policymakers and graduate students. 
These audiences will be introduced to the findings through workshops, seminars, round-table discussions and targeted individual meetings. Other opportunities for knowledge transfer will be pursued, such as key dental conferences. Finally, the results will be published as a scientific report in a dental peer-reviewed journal.
Maybe Small Is Too Small a Term: Introduction to Advancing Small Sample Prevention Science.
Fok, Carlotta Ching Ting; Henry, David; Allen, James
2015-10-01
Prevention research addressing health disparities often involves work with small population groups experiencing such disparities. The goals of this special section are to (1) address the question of what constitutes a small sample; (2) identify some of the key research design and analytic issues that arise in prevention research with small samples; (3) develop applied, problem-oriented, and methodologically innovative solutions to these design and analytic issues; and (4) evaluate the potential role of these innovative solutions in describing phenomena, testing theory, and evaluating interventions in prevention research. Through these efforts, we hope to promote broader application of these methodological innovations. We also seek, whenever possible, to explore their implications in more general problems that appear in research with small samples but concern all areas of prevention research. This special section comprises two parts: the first aims to provide input for researchers at the design phase, while the second focuses on analysis. Each article describes an innovative solution to one or more challenges posed by the analysis of small samples, with special emphasis on testing for intervention effects in prevention research. A concluding article summarizes some of their broader implications, along with conclusions regarding future directions in research with small samples in prevention science. Finally, a commentary provides the perspective of the federal agencies that sponsored the conference that gave rise to this special section.
Test-Retest Reliability of Pediatric Heart Rate Variability: A Meta-Analysis.
Weiner, Oren M; McGrath, Jennifer J
2017-01-01
Heart rate variability (HRV), an established index of autonomic cardiovascular modulation, is associated with health outcomes (e.g., obesity, diabetes) and mortality risk. Time- and frequency-domain HRV measures are commonly reported in longitudinal adult and pediatric studies of health. While test-retest reliability has been established among adults, less is known about the psychometric properties of HRV among infants, children, and adolescents. The objective was to conduct a meta-analysis of the test-retest reliability of time- and frequency-domain HRV measures from infancy to adolescence. Electronic searches (PubMed, PsycINFO; January 1970–December 2014) identified studies with nonclinical samples aged ≤ 18 years; ≥ 2 baseline HRV recordings separated by ≥ 1 day; and sufficient data for effect size computation. Forty-nine studies (N = 5,170) met inclusion criteria. Methodological variables coded included factors relevant to study protocol, sample characteristics, electrocardiogram (ECG) signal acquisition and preprocessing, and HRV analytical decisions. Fisher's Z was derived as the common effect size. Analyses were age-stratified (infant/toddler < 5 years, n = 3,329; child/adolescent 5–18 years, n = 1,841) due to marked methodological differences across the pediatric literature. Meta-analytic results revealed HRV demonstrated moderate reliability; child/adolescent studies (Z = 0.62, r = 0.55) had significantly higher reliability than infant/toddler studies (Z = 0.42, r = 0.40). Relative to other reported measures, HF exhibited the highest reliability among infant/toddler studies (Z = 0.42, r = 0.40), while rMSSD exhibited the highest reliability among child/adolescent studies (Z = 1.00, r = 0.76).
Moderator analyses indicated greater reliability with shorter test-retest interval length, reported exclusion criteria based on medical illness/condition, lower proportion of males, prerecording acclimatization period, and longer recording duration; differences were noted across age groups. HRV is reliable among pediatric samples. Reliability is sensitive to pertinent methodological decisions that require careful consideration by the researcher. Limited methodological reporting precluded several a priori moderator analyses. Suggestions for future research, including standards specified by Task Force Guidelines, are discussed.
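Fisher's Z, used above as the common effect size, is the variance-stabilizing transform of a correlation; back-transforming the reported Z values reproduces the reported r values. A minimal illustration, not the authors' analysis code:

```python
import math

def fisher_z(r: float) -> float:
    """Fisher's variance-stabilizing transform of a correlation r."""
    return math.atanh(r)  # equivalent to 0.5 * ln((1 + r) / (1 - r))

def inverse_fisher_z(z: float) -> float:
    """Back-transform an effect size Z to a correlation r."""
    return math.tanh(z)

# Back-transforming the reported effect sizes recovers the reported r values
print(round(inverse_fisher_z(0.62), 2))  # 0.55 (child/adolescent overall)
print(round(inverse_fisher_z(0.42), 2))  # 0.4  (infant/toddler overall)
print(round(inverse_fisher_z(1.00), 2))  # 0.76 (child/adolescent rMSSD)
```

The transform makes correlation effect sizes approximately normal with variance 1/(n − 3), which is why meta-analyses pool on the Z scale and back-transform at the end.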
López-Serna, Rebeca; Marín-de-Jesús, David; Irusta-Mata, Rubén; García-Encina, Pedro Antonio; Lebrero, Raquel; Fdez-Polanco, María; Muñoz, Raúl
2018-08-15
The work presented here aimed to develop an analytical method for the simultaneous determination of 22 pharmaceuticals and personal care products, including 3 transformation products, in sewage and sludge. A meticulous method optimization, involving an experimental design, was carried out. The developed method was fully automated and consisted of the online extraction of 17 mL of water sample by Direct Immersion Solid Phase MicroExtraction followed by On-fiber Derivatization coupled to Gas Chromatography–Mass Spectrometry (DI-SPME–On-fiber Derivatization–GC–MS). This methodology was validated for 12 of the initial compounds as a reliable (relative recoveries above 90% for sewage and 70% for sludge; repeatability as %RSD below 10% in all cases), sensitive (LODs below 20 ng L⁻¹ in sewage and 10 ng g⁻¹ in sludge), versatile (sewage and sewage-sludge samples up to 15,000 ng L⁻¹ and 900 ng g⁻¹, respectively) and green analytical alternative for many medium-tech routine laboratories around the world to keep up with both current and forecast environmental regulatory requirements. The remaining 10 analytes initially considered showed insufficient suitability to be included in the final method. The methodology was successfully applied to real samples generated in a pilot scale sewage treatment reactor. Copyright © 2018 Elsevier B.V. All rights reserved.
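The validation figures reported above (relative recovery and repeatability as %RSD) follow from standard formulas over replicate measurements of spiked samples. A sketch with hypothetical replicate values, not data from the study:

```python
from statistics import mean, stdev

def percent_rsd(values):
    """Repeatability: relative standard deviation as a percentage."""
    return 100.0 * stdev(values) / mean(values)

def relative_recovery(measured, spiked):
    """Measured concentration relative to the spiked (nominal) level, in %."""
    return 100.0 * measured / spiked

# Hypothetical replicate results (ng/L) for a 100 ng/L spike in sewage
replicates = [94.0, 96.0, 92.0, 95.0, 93.0]
print(round(percent_rsd(replicates), 1))               # 1.7  -> below 10%
print(relative_recovery(mean(replicates), 100.0))      # 94.0 -> above 90%
```

Both figures would be computed per analyte and per matrix (sewage vs. sludge) during validation.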
A meta-analysis of family accommodation and OCD symptom severity.
Wu, Monica S; McGuire, Joseph F; Martino, Charitie; Phares, Vicky; Selles, Robert R; Storch, Eric A
2016-04-01
Family accommodation in obsessive-compulsive disorder (OCD) is characterized by myriad behaviors, such as modifying family routines, facilitating avoidance, and engaging in compulsions to reduce obsessional distress. It has been linked to various deleterious outcomes including increased functional impairment and poorer treatment response for OCD. Although extant literature suggests a linear relationship between family accommodation and OCD symptom severity, the magnitude and statistical significance of this association has been inconsistent across studies, indicating that moderators may be influencing this relationship. The present study examined this relationship using meta-analytic techniques, and investigated sample-dependent (age, gender, comorbid anxiety/mood disorders) and methodological (administration method and number of items used in family accommodation measure, informant type, sample size, publication year) moderators. Forty-one studies were included in the present meta-analysis, and the overall effect size (ES) for the correlation between family accommodation and OCD symptom severity was moderate (r=.42). Moderator analyses revealed that the number of items on the family accommodation scale moderated the ES. No other sample-dependent or methodological characteristics emerged as moderators. In addition to being the first systematic examination of family accommodation moderators, these results highlight the moderate relationship between family accommodation and OCD severity that is influenced by measurement scales. Findings may be used to guide clinical care and inform future investigations by providing a more nuanced understanding of family accommodation in OCD. Copyright © 2016 Elsevier Ltd. All rights reserved.
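The overall effect size of r = .42 reflects standard meta-analytic pooling of study-level correlations. A fixed-effect sketch using Fisher-Z pooling with inverse-variance weights (n − 3); the study values below are hypothetical, and the authors may have used a random-effects model:

```python
import math

def pooled_correlation(studies):
    """Fixed-effect pooled correlation: weight each study's Fisher-Z value
    by its inverse variance (n - 3), average, then back-transform to r."""
    num = sum((n - 3) * math.atanh(r) for r, n in studies)
    den = sum(n - 3 for _, n in studies)
    return math.tanh(num / den)

# Hypothetical (r, sample size) pairs from three studies
studies = [(0.35, 50), (0.45, 120), (0.50, 80)]
print(round(pooled_correlation(studies), 2))  # 0.45
```

Moderator analyses (e.g., by number of scale items) would then compare pooled effects across subgroups of studies or fit a meta-regression on the Z scale.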
Novel methodology to isolate microplastics from vegetal-rich samples.
Herrera, Alicia; Garrido-Amador, Paloma; Martínez, Ico; Samper, María Dolores; López-Martínez, Juan; Gómez, May; Packard, Theodore T
2018-04-01
Microplastics are small plastic particles, globally distributed throughout the oceans. To properly study them, all the methodologies for their sampling, extraction, and measurement should be standardized. For heterogeneous samples containing sediments, animal tissues and zooplankton, several procedures have been described. However, definitive methodologies for samples rich in algae and plant material have not yet been developed. The aim of this study was to find the best extraction protocol for vegetal-rich samples by comparing the efficacies of five previously described digestion methods and a novel density separation method. A protocol using 96% ethanol for density separation performed better than the five digestion methods tested, even better than H₂O₂ digestion. As it was the most efficient, simple, safe and inexpensive method for isolating microplastics from vegetal-rich samples, we recommend it as a standard separation method. Copyright © 2018 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Wolf, Patrick J.
2012-01-01
This report contains a summary of the findings from the various topical reports that comprise the author's comprehensive longitudinal study. As a summary, it does not include extensive details regarding the study samples and scientific methodologies employed in those topical studies. The research revealed a pattern of school choice results that…
Morales Guerrero, Josefina C; García Zepeda, Rodrigo A; Flores Ruvalcaba, Edgar; Martínez Michel, Lorelei
2012-09-01
We evaluated the two methods accepted by the Mexican norm for the determination of nitrites in infant meat-based food with vegetables. We determined the nitrite content of the infant food, the raw materials, and products from the intermediate stages of production. A reagent blank and a reference sample were included in each analytical run. In addition, we determined the sensitivity, recovery percentage and accuracy of each methodology. Infant food results indicated an important difference in the nitrite content determined under each methodology, due to the persistent presence of turbidity in the extracts. Different treatments were proposed to eliminate the turbidity, but these only managed to reduce it. The turbidity was attributed to carbohydrates, whose measured concentrations exhibited wide dispersion and were below the quantifiable limit under both methodologies; therefore, it is not recommended to apply these techniques to food suspected to contain traces of nitrites.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carmichael, Joshua Daniel; Carr, Christina; Pettit, Erin C.
We apply a fully autonomous icequake detection methodology to a single day of high-sample-rate (200 Hz) seismic network data recorded from the terminus of Taylor Glacier, Antarctica, that temporally coincided with a brine release episode near Blood Falls (May 13, 2014). We demonstrate a statistically validated procedure to assemble waveforms triggered by icequakes into populations of clusters linked by intra-event waveform similarity. Our processing methodology implements a noise-adaptive power detector coupled with a complete-linkage clustering algorithm and noise-adaptive correlation detector. This detector chain reveals a population of 20 multiplet sequences that includes ~150 icequakes and produces zero false alarms on the concurrent, diurnally variable noise. Our results are very promising for identifying changes in background seismicity associated with the presence or absence of brine release episodes. We thereby suggest that our methodology could be applied to longer time periods to establish a brine-release monitoring program for Blood Falls based on icequake detections.
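A noise-adaptive power detector of the kind described can be sketched as windowed signal power compared against a threshold that tracks the background (median plus a multiple of the median absolute deviation), so slowly varying diurnal noise does not trigger false alarms. A toy illustration on synthetic data, not the authors' implementation; the window length and threshold factor are assumptions:

```python
import math
import random

def sliding_power(x, win):
    """Mean-square power in a sliding window of length win."""
    return [sum(v * v for v in x[i:i + win]) / win
            for i in range(len(x) - win + 1)]

def adaptive_threshold(power, factor):
    """Noise-adaptive threshold: median plus a multiple of the MAD, so the
    detector tracks the (possibly diurnally varying) background level."""
    s = sorted(power)
    med = s[len(s) // 2]
    mad = sorted(abs(p - med) for p in power)[len(power) // 2]
    return med + factor * mad

def detect(x, win=20, factor=8.0):
    """Return window indices whose power exceeds the adaptive threshold."""
    power = sliding_power(x, win)
    thr = adaptive_threshold(power, factor)
    return [i for i, p in enumerate(power) if p > thr]

# Synthetic trace: low-level noise plus one transient "icequake" burst
random.seed(0)
trace = [random.gauss(0, 0.1) for _ in range(500)]
for i in range(240, 260):
    trace[i] += 2.0 * math.sin(0.9 * (i - 240))
hits = detect(trace)
print(bool(hits) and all(200 <= i <= 280 for i in hits))  # detections bracket the burst
```

In the full pipeline, triggered waveforms would then be grouped by cross-correlation similarity (e.g., complete-linkage clustering) into multiplet sequences.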
Analysis of Material Sample Heated by Impinging Hot Hydrogen Jet in a Non-Nuclear Tester
NASA Technical Reports Server (NTRS)
Wang, Ten-See; Foote, John; Litchford, Ron
2006-01-01
A computational conjugate heat transfer methodology was developed and anchored with data obtained from a hot-hydrogen-jet-heated, non-nuclear materials tester, as a first step toward developing an efficient and accurate multiphysics, thermo-fluid computational methodology to predict environments for a hypothetical solid-core nuclear thermal engine thrust chamber. The computational methodology is based on a multidimensional, finite-volume, turbulent, chemically reacting, thermally radiating, unstructured-grid, and pressure-based formulation. The multiphysics invoked in this study include hydrogen dissociation kinetics and thermodynamics, turbulent flow, and convective, thermal-radiative, and conjugate heat transfer. Predicted hot-hydrogen jet and material surface temperatures were compared with measurements. Predicted solid temperatures were compared with those obtained with a standard heat transfer code. Interrogation of the physics revealed that hydrogen dissociation and recombination reactions are highly correlated with local temperature and are necessary for accurate prediction of the hot-hydrogen jet temperature.
Post-traumatic Stress Symptoms in Post-ICU Family Members: Review and Methodological Challenges
Petrinec, Amy B.; Daly, Barbara J.
2018-01-01
Family members of intensive care unit (ICU) patients are at risk for symptoms of post-traumatic stress disorder (PTSD) following ICU discharge. The aim of this systematic review is to examine the current literature regarding post-ICU family PTSD symptoms with an emphasis on methodological issues in conducting research on this challenging phenomenon. An extensive review of the literature was performed confining the search to English language studies reporting PTSD symptoms in adult family members of adult ICU patients. Ten studies were identified for review published from 2004–2012. Findings demonstrate a significant prevalence of family PTSD symptoms in the months following ICU hospitalization. However, there are several methodological challenges to the interpretation of existing studies and to the conduct of future research including differences in sampling, identification of risk factors and covariates of PTSD, and lack of consensus regarding the most appropriate PTSD symptom measurement tools and timing. PMID:25061017
Real time simulation of computer-assisted sequencing of terminal area operations
NASA Technical Reports Server (NTRS)
Dear, R. G.
1981-01-01
A simulation was developed to investigate the utilization of computer assisted decision making for the task of sequencing and scheduling aircraft in a high density terminal area. The simulation incorporates a decision methodology termed Constrained Position Shifting. This methodology accounts for aircraft velocity profiles, routes, and weight classes in dynamically sequencing and scheduling arriving aircraft. A sample demonstration of Constrained Position Shifting is presented where six aircraft types (including both light and heavy aircraft) are sequenced to land at Denver's Stapleton International Airport. A graphical display is utilized and Constrained Position Shifting with a maximum shift of four positions (rearward or forward) is compared to first come, first serve with respect to arrival at the runway. The implementation of computer assisted sequencing and scheduling methodologies is investigated. A time based control concept will be required and design considerations for such a system are discussed.
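Constrained Position Shifting can be sketched as a search over landing sequences in which no aircraft moves more than a fixed number of positions from its first-come, first-served slot, minimizing the time the last aircraft lands under pairwise separation requirements. The separation times below are illustrative placeholders, not actual wake-separation standards:

```python
from itertools import permutations

# Required time separation (seconds) behind a leader, keyed by
# (leader class, follower class) -- illustrative values only.
SEP = {("heavy", "heavy"): 90, ("heavy", "light"): 180,
       ("light", "heavy"): 60, ("light", "light"): 60}

def makespan(order, classes):
    """Time at which the last aircraft in the sequence lands."""
    return sum(SEP[(classes[lead], classes[follow])]
               for lead, follow in zip(order, order[1:]))

def cps(classes, max_shift=4):
    """Best landing sequence in which no aircraft is shifted more than
    max_shift positions (rearward or forward) from its FCFS position."""
    n = len(classes)
    best = min(
        (p for p in permutations(range(n))
         if all(abs(p.index(i) - i) <= max_shift for i in range(n))),
        key=lambda p: makespan(p, classes))
    return best, makespan(best, classes)

# Alternating heavy/light arrivals: FCFS forces many costly heavy->light pairs
fcfs = ["heavy", "light", "heavy", "light", "heavy", "light"]
order, t = cps(fcfs, max_shift=2)
print(order, t)
```

Brute-force enumeration is fine for a demonstration; the position-shift constraint is what makes dynamic-programming formulations tractable for realistic arrival streams.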
Krauth, David; Woodruff, Tracey J.
2013-01-01
Background: Results from animal toxicology studies are critical to evaluating the potential harm from exposure to environmental chemicals or the safety of drugs prior to human testing. However, there is significant debate about how to evaluate the methodology and potential biases of the animal studies. There is no agreed-upon approach, and a systematic evaluation of current best practices is lacking. Objective: We performed a systematic review to identify and evaluate instruments for assessing the risk of bias and/or other methodological criteria of animal studies. Method: We searched Medline (January 1966–November 2011) to identify all relevant articles. We extracted data on risk of bias criteria (e.g., randomization, blinding, allocation concealment) and other study design features included in each assessment instrument. Discussion: Thirty distinct instruments were identified, with the total number of assessed risk of bias, methodological, and/or reporting criteria ranging from 2 to 25. The most common criteria assessed were randomization (25/30, 83%), investigator blinding (23/30, 77%), and sample size calculation (18/30, 60%). In general, authors failed to empirically justify why these or other criteria were included. Nearly all (28/30, 93%) of the instruments have not been rigorously tested for validity or reliability. Conclusion: Our review highlights a number of risk of bias assessment criteria that have been empirically tested for animal research, including randomization, concealment of allocation, blinding, and accounting for all animals. In addition, there is a need for empirically testing additional methodological criteria and assessing the validity and reliability of a standard risk of bias assessment instrument. Citation: Krauth D, Woodruff TJ, Bero L. 2013. Instruments for assessing risk of bias and other methodological criteria of published animal studies: a systematic review. 
Environ Health Perspect 121:985–992 (2013); http://dx.doi.org/10.1289/ehp.1206389 PMID:23771496
Catchment-wide impacts on water quality: the use of 'snapshot' sampling during stable flow
NASA Astrophysics Data System (ADS)
Grayson, R. B.; Gippel, C. J.; Finlayson, B. L.; Hart, B. T.
1997-12-01
Water quality is usually monitored on a regular basis at only a small number of locations in a catchment, generally focused at the catchment outlet. This integrates the effect of all the point and non-point source processes occurring throughout the catchment. However, effective catchment management requires data which identify major sources and processes. As part of a wider study aimed at providing technical information for the development of integrated catchment management plans for a 5000 km² catchment in south eastern Australia, a 'snapshot' of water quality was undertaken during stable summer flow conditions. These low flow conditions exist for long periods, so water quality at these flow levels is an important constraint on the health of in-stream biological communities. Over a 4 day period, a study of the low flow water quality characteristics throughout the Latrobe River catchment was undertaken. Sixty-four sites were chosen to enable a longitudinal profile of water quality to be established. All tributary junctions and sites along major tributaries, as well as all major industrial inputs, were included. Samples were analysed for a range of parameters including total suspended solids concentration, pH, dissolved oxygen, electrical conductivity, turbidity, flow rate and water temperature. Filtered and unfiltered samples were taken from 27 sites along the main stream and tributary confluences for analysis of total N, NH₄, oxidised N, total P and dissolved reactive P concentrations. The data are used to illustrate the utility of this sampling methodology for establishing specific sources and estimating non-point source loads of phosphorus, total suspended solids and total dissolved solids. The methodology enabled several new insights into system behaviour, including quantification of unknown point discharges, identification of key in-stream sources of suspended material and the extent to which biological activity (phytoplankton growth) affects water quality.
The costs and benefits of the sampling exercise are reviewed.
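The load-estimation arithmetic behind such a snapshot is a simple mass balance per reach: load equals flow times concentration, and the residual after subtracting upstream, tributary, and known point-source loads points to diffuse (non-point) inputs or unknown discharges. A sketch with hypothetical numbers, not values from the Latrobe study:

```python
def load_kg_per_day(flow_m3_s, conc_mg_l):
    """Constituent load from flow (m^3/s) and concentration (mg/L).
    1 m^3/s at 1 mg/L is 1 g/s, i.e. 86.4 kg/day."""
    return flow_m3_s * conc_mg_l * 86.4

def unaccounted_load(downstream, upstream, tributaries, point_sources):
    """Mass balance over a reach: the load (kg/day) not explained by the
    upstream site, tributaries, or known point discharges."""
    return downstream - upstream - sum(tributaries) - sum(point_sources)

# Hypothetical total-P snapshot for one reach
down = load_kg_per_day(12.0, 0.08)   # downstream site
up = load_kg_per_day(8.0, 0.05)      # upstream site
trib = [load_kg_per_day(2.5, 0.06)]  # one sampled tributary
point = [10.0]                       # kg/day from a licensed discharge
print(round(unaccounted_load(down, up, trib, point), 1))  # 25.4 kg/day diffuse
```

Snapshot sampling under stable flow makes this balance meaningful because concentrations and flows are approximately steady over the sampling window.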
Volatile organic compounds: sampling methods and their worldwide profile in ambient air.
Kumar, Anuj; Víden, Ivan
2007-08-01
The atmosphere is a particularly difficult analytical system because of the very low levels of substances to be analysed, sharp variations in pollutant levels with time and location, and differences in wind, temperature and humidity. This makes the selection of an efficient sampling technique for air analysis a key step toward reliable results. Generally, methods for volatile organic compounds sampling include collection of the whole air or preconcentration of samples on adsorbents. The methods vary from each other according to the sampling technique, type of sorbent, method of extraction and identification technique. In this review paper we discuss various important aspects of sampling volatile organic compounds by the widely used and advanced sampling methods. Characteristics of various adsorbents used for VOCs sampling are also described. Furthermore, this paper makes an effort to comprehensively review the concentration levels of volatile organic compounds, along with the methodology used for analysis, in major cities of the world.
Charmaraman, Linda; Woo, Meghan; Quach, Ashley; Erkut, Sumru
2014-07-01
The U.S. Census shows that the racial-ethnic makeup of over 9 million people (2.9% of the total population) who self-identified as multiracial is extremely diverse. Each multiracial subgroup has unique social and political histories that may lead to distinct societal perceptions, economic situations, and health outcomes. Despite the increasing academic and media interest in multiracial individuals, there are methodological and definitional challenges in studying the population, resulting in conflicting representations in the literature. This content and methods review of articles on multiracial populations provides a comprehensive understanding of which multiracial populations have been included in research and how they have been studied, both to recognize emerging research and to identify gaps for guiding future research on this complex but increasingly visible population. We examine 125 U.S.-based peer-reviewed journal articles published over the past 20 years (1990 to 2009) containing 133 separate studies focused on multiracial individuals, primarily from the fields of psychology, sociology, social work, education, and public health. Findings include (a) descriptive data regarding the sampling strategies, methodologies, and demographic characteristics of studies, including which multiracial subgroups are most studied, gender, age range, region of country, and socioeconomic status; (b) major thematic trends in research topics concerning multiracial populations; and (c) implications and recommendations for future studies.
Abras, Alba; Ballart, Cristina; Llovet, Teresa; Roig, Carme; Gutiérrez, Cristina; Tebar, Silvia; Berenguer, Pere; Pinazo, María-Jesús; Posada, Elizabeth; Gascón, Joaquim; Schijman, Alejandro G; Gállego, Montserrat; Muñoz, Carmen
2018-01-01
Polymerase chain reaction (PCR) has become a useful tool for the diagnosis of Trypanosoma cruzi infection. The development of automated DNA extraction methodologies and PCR systems is an important step toward the standardization of protocols in routine diagnosis. To date, there are only two commercially available Real-Time PCR assays for the routine laboratory detection of T. cruzi DNA in clinical samples: TCRUZIDNA.CE (Diagnostic Bioprobes Srl) and RealCycler CHAG (Progenie Molecular). Our aim was to evaluate the RealCycler CHAG assay taking into account the whole process. We assessed the usefulness of an automated DNA extraction system based on magnetic particles (EZ1 Virus Mini Kit v2.0, Qiagen) combined with a commercially available Real-Time PCR assay targeting satellite DNA (SatDNA) of T. cruzi (RealCycler CHAG), a methodology used for routine diagnosis in our hospital. It was compared with a well-known strategy combining a commercial DNA isolation kit based on silica columns (High Pure PCR Template Preparation Kit, Roche Diagnostics) with an in-house Real-Time PCR targeting SatDNA. The results of the two methodologies were in almost perfect agreement, indicating they can be used interchangeably. However, when variations in protocol factors were applied (sample treatment, extraction method and Real-Time PCR), the results were less convincing. A comprehensive fine-tuning of the whole procedure is the key to successful results. Guanidine EDTA-blood (GEB) samples are not suitable for DNA extraction based on magnetic particles due to inhibition, at least when samples are not processed immediately. This is the first study to evaluate the RealCycler CHAG assay taking into account the overall process, including three variables (sample treatment, extraction method and Real-Time PCR). 
Our findings may contribute to the harmonization of protocols between laboratories and to a wider application of Real-Time PCR in molecular diagnostic laboratories associated with health centers.
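Agreement between two diagnostic workflows of the kind compared here is conventionally quantified with Cohen's kappa, where values above about 0.8 are read as "almost perfect" agreement. A sketch on hypothetical paired positive/negative calls, not the study's data:

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two binary call vectors (1 = positive, 0 = negative),
    e.g. T. cruzi PCR results from two extraction/amplification workflows."""
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    # Chance agreement from each method's marginal positive/negative rates
    p_pos = (sum(a) / n) * (sum(b) / n)
    p_neg = (1 - sum(a) / n) * (1 - sum(b) / n)
    expected = p_pos + p_neg
    return (observed - expected) / (1 - expected)

# Hypothetical paired calls for 12 clinical samples (one discordant result)
method_a = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0]
method_b = [1, 1, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0]
print(round(cohens_kappa(method_a, method_b), 2))  # 0.83 -> "almost perfect"
```

Kappa corrects raw percent agreement for the agreement expected by chance, which matters when most samples are negative.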
Analysis of on-line clinical laboratory manuals and practical recommendations.
Beckwith, Bruce; Schwartz, Robert; Pantanowitz, Liron
2004-04-01
On-line clinical laboratory manuals are a valuable resource for medical professionals. To our knowledge, no recommendations currently exist for their content or design. To analyze publicly accessible on-line clinical laboratory manuals and to propose guidelines for their content. We conducted an Internet search for clinical laboratory manuals written in English with individual test listings. Four individual test listings in each manual were evaluated for 16 data elements, including sample requirements, test methodology, units of measure, reference range, and critical values. Web sites were also evaluated for supplementary information and search functions. We identified 48 on-line laboratory manuals, including 24 academic or community hospital laboratories and 24 commercial or reference laboratories. All manuals had search engines and/or test indices. No single manual contained all 16 data elements evaluated. An average of 8.9 (56%) elements were present (range, 4-14). Basic sample requirements (specimen and volume needed) were the elements most commonly present (98% of manuals). The frequency of the remaining data elements varied from 10% to 90%. On-line clinical laboratory manuals originate from both hospital and commercial laboratories. While most manuals were user-friendly and contained adequate specimen-collection information, other important elements, such as reference ranges, were frequently absent. To ensure that clinical laboratory manuals are of maximal utility, we propose the following 13 data elements be included in individual test listings: test name, synonyms, test description, test methodology, sample requirements, volume requirements, collection guidelines, transport guidelines, units of measure, reference range, critical values, test availability, and date of latest revision.
Chaudhary, Rajendra; Dubey, Anju; Sonker, Atul
2017-01-01
Blood donor hemoglobin (Hb) estimation is an important donation test that is performed prior to blood donation. It serves the dual purpose of protecting the donors' health against anemia and ensuring good quality of blood components, which has an implication on recipients' health. Diverse cutoff criteria have been defined world over depending on population characteristics; however, no testing methodology and sample requirement have been specified for Hb screening. Besides the technique, there are several physiological and methodological factors that affect accuracy and reliability of Hb estimation. These include the anatomical source of blood sample, posture of the donor, timing of sample and several other biological factors. Qualitative copper sulfate gravimetric method has been the archaic time-tested method that is still used in resource-constrained settings. Portable hemoglobinometers are modern quantitative devices that have been further modified to reagent-free cuvettes. Furthermore, noninvasive spectrophotometry was introduced, mitigating pain to blood donor and eliminating risk of infection. Notwithstanding a tremendous evolution in terms of ease of operation, accuracy, mobility, rapidity and cost, a component of inherent variability persists, which may partly be attributed to pre-analytical variables. Hence, blood centers should pay due attention to validation of test methodology, competency of operating staff and regular proficiency testing of the outputs. In this article, we have reviewed various regulatory guidelines, described the variables that affect the measurements and compared the validated technologies for Hb screening of blood donors along with enumeration of their merits and limitations.
Single-Case Experimental Designs: A Systematic Review of Published Research and Current Standards
Smith, Justin D.
2013-01-01
This article systematically reviews the research design and methodological characteristics of single-case experimental design (SCED) research published in peer-reviewed journals between 2000 and 2010. SCEDs provide researchers with a flexible and viable alternative to group designs with large sample sizes. However, methodological challenges have precluded widespread implementation and acceptance of the SCED as a viable complementary methodology to the predominant group design. This article includes a description of the research design, measurement, and analysis domains distinctive to the SCED; a discussion of the results within the framework of contemporary standards and guidelines in the field; and a presentation of updated benchmarks for key characteristics (e.g., baseline sampling, method of analysis), and overall, it provides researchers and reviewers with a resource for conducting and evaluating SCED research. The results of the systematic review of 409 studies suggest that recently published SCED research is largely in accordance with contemporary criteria for experimental quality. Analytic method emerged as an area of discord. Comparison of the findings of this review with historical estimates of the use of statistical analysis indicates an upward trend, but visual analysis remains the most common analytic method and also garners the most support amongst those entities providing SCED standards. Although consensus exists along key dimensions of single-case research design and researchers appear to be practicing within these parameters, there remains a need for further evaluation of assessment and sampling techniques and data analytic methods. PMID:22845874
Quantitation of DNA adducts by stable isotope dilution mass spectrometry
Tretyakova, Natalia; Goggin, Melissa; Janis, Gregory
2012-01-01
Exposure to endogenous and exogenous chemicals can lead to the formation of structurally modified DNA bases (DNA adducts). If not repaired, these nucleobase lesions can cause polymerase errors during DNA replication, leading to heritable mutations potentially contributing to the development of cancer. Due to their critical role in cancer initiation, DNA adducts represent mechanism-based biomarkers of carcinogen exposure, and their quantitation is particularly useful for cancer risk assessment. DNA adducts are also valuable in mechanistic studies linking tumorigenic effects of environmental and industrial carcinogens to specific electrophilic species generated from their metabolism. While multiple experimental methodologies have been developed for DNA adduct analysis in biological samples – including immunoassay, HPLC, and ³²P-postlabeling – isotope dilution high performance liquid chromatography-electrospray ionization-tandem mass spectrometry (HPLC-ESI-MS/MS) generally has superior selectivity, sensitivity, accuracy, and reproducibility. As typical DNA adduct concentrations in biological samples are between 0.01 and 10 adducts per 10⁸ normal nucleotides, ultrasensitive HPLC-ESI-MS/MS methodologies are required for their analysis. Recent developments in analytical separations and biological mass spectrometry – especially nanoflow HPLC, nanospray ionization MS, chip-MS, and high resolution MS – have pushed the limits of analytical HPLC-ESI-MS/MS methodologies for DNA adducts, allowing researchers to accurately measure their concentrations in biological samples from patients treated with DNA alkylating drugs and in populations exposed to carcinogens from urban air, drinking water, cooked food, alcohol, and cigarette smoke. PMID:22827593
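Isotope dilution quantitation rests on the peak-area ratio of the analyte to a co-eluting isotope-labeled internal standard spiked in at a known amount. A sketch with hypothetical values; the response factor of 1.0 and the nucleotide-per-microgram conversion (assuming an average nucleotide MW near 309) are assumptions, not figures from the source:

```python
def isotope_dilution_amount(area_analyte, area_is, is_amount_fmol, rf=1.0):
    """Analyte amount (fmol) from the analyte/internal-standard peak-area
    ratio, scaled by the spiked amount of isotope-labeled standard.
    rf is a response factor from calibration (1.0 if the isotopologues
    respond identically in the mass spectrometer)."""
    return (area_analyte / area_is) * is_amount_fmol / rf

def adducts_per_1e8_nt(adduct_fmol, dna_ug):
    """Express a measured adduct amount as adducts per 10^8 normal
    nucleotides, assuming ~3.24 nmol of nucleotides per ug of DNA."""
    nucleotides_fmol = dna_ug * 3.24e6  # 3.24 nmol/ug = 3.24e6 fmol/ug
    return adduct_fmol / nucleotides_fmol * 1e8

# Hypothetical run: area ratio 0.25 against 10 fmol of labeled standard,
# measured in a hydrolysate of 100 ug of DNA
amount = isotope_dilution_amount(2.5e4, 1.0e5, 10.0)
print(amount)                                 # 2.5 fmol
print(round(adducts_per_1e8_nt(amount, 100.0), 3))  # 0.772 per 10^8 nt
```

The result falls inside the 0.01–10 adducts per 10⁸ nucleotides range the review cites, illustrating why sub-femtomole sensitivity is needed.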
Ecological Momentary Assessment is a Neglected Methodology in Suicidology.
Davidson, Collin L; Anestis, Michael D; Gutierrez, Peter M
2017-01-02
Ecological momentary assessment (EMA) is a group of research methods that collect data frequently, in many contexts, and in real-world settings. EMA has been fairly neglected in suicidology. The current article provides an overview of EMA for suicidologists, including definitions, data collection considerations, and different sampling strategies. Next, the benefits of EMA in suicidology (i.e., reduced recall bias, accurate tracking of fluctuating variables, testing assumptions of theories, use in interventions), participant safety considerations, and examples of published research that investigates self-directed violence variables using EMA are discussed. The article concludes with a summary and suggested directions for EMA research in suicidology, with the particular aim of spurring increased use of this methodology among suicidologists.
Yoshikawa, Hirokazu; Weisner, Thomas S; Kalil, Ariel; Way, Niobe
2008-03-01
Multiple methods are vital to understanding development as a dynamic, transactional process. This article focuses on the ways in which quantitative and qualitative methodologies can be combined to enrich developmental science and the study of human development, focusing on the practical questions of "when" and "how." Research situations that may be especially suited to mixing qualitative and quantitative approaches are described. The authors also discuss potential choices for using mixed quantitative-qualitative approaches in study design, sampling, construction of measures or interview protocols, collaborations, and data analysis relevant to developmental science. Finally, they discuss some common pitfalls that occur in mixing these methods and include suggestions for surmounting them.
Evaluation of errors in quantitative determination of asbestos in rock
NASA Astrophysics Data System (ADS)
Baietto, Oliviero; Marini, Paola; Vitaliti, Martina
2016-04-01
The quantitative determination of the content of asbestos in rock matrices is a complex operation that is susceptible to important errors. The principal methodologies for the analysis are Scanning Electron Microscopy (SEM) and Phase Contrast Optical Microscopy (PCOM). Although the resolution of PCOM is inferior to that of SEM, PCOM analysis has several advantages, including greater representativeness of the analyzed sample, more effective recognition of chrysotile, and lower cost. The DIATI LAA internal methodology for PCOM analysis is based on mild grinding of a rock sample, its subdivision into 5-6 grain size classes smaller than 2 mm, and subsequent microscopic analysis of a portion of each class. PCOM relies on the optical properties of asbestos and of liquids of known refractive index in which the particles under analysis are immersed. The error evaluation in the analysis of rock samples, contrary to the analysis of airborne filters, cannot be based on a statistical distribution. For airborne filters, a Poisson distribution, which theoretically defines the variation in the count of fibers resulting from the observation of analysis fields chosen randomly on the filter, can be applied. The analysis of rock matrices, by contrast, cannot rely on any statistical distribution, because the most important object of the analysis is the size of the asbestiform fibers and fiber bundles observed, and the resulting relationship between the weight of the fibrous component and that of the granular one. The error estimate generally provided by public and private institutions varies between 50 and 150 percent, but there are no specific studies that discuss the origin of the error or link it to the asbestos content. Our work aims to provide a reliable estimate of the error in relation to the applied methodologies and to the total asbestos content, especially for values close to the legal limits.
Error assessment must be carried out by repeating the same analysis on the same sample, in order to estimate both the error related to the representativeness of the sample and the error related to the sensitivity of the operator, and thus to provide a sufficiently reliable uncertainty for the method. We used about 30 natural rock samples with different asbestos contents, performing three analyses on each sample to obtain a sufficiently representative trend of the percentage. Furthermore, on one chosen sample we performed 10 repetitions of the analysis to define the error of the methodology more specifically.
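The contrast the authors draw between airborne filters and rock matrices can be illustrated with a small sketch (not taken from the paper): if fiber counts on a filter follow a Poisson distribution, the relative counting uncertainty falls as 1/√N, which is exactly the kind of statistical handle that is unavailable for rock-matrix analyses.

```python
import math

def poisson_relative_error(n_fibers):
    """Relative standard deviation of a Poisson count: sqrt(N)/N = 1/sqrt(N)."""
    if n_fibers <= 0:
        raise ValueError("fiber count must be positive")
    return 1.0 / math.sqrt(n_fibers)

# Counting more fibers (observing more analysis fields on the filter)
# shrinks the relative counting error, e.g. from ~32% at N=10 to 5% at N=400.
for n in (10, 100, 400):
    print(n, round(100 * poisson_relative_error(n), 1), "%")
```

The illustrative counts are hypothetical; the point is only that filter counting admits a closed-form uncertainty, whereas rock-matrix estimates, dominated by fiber size and mass ratios, do not.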
NASA Technical Reports Server (NTRS)
Chen, Y.; Nguyen, D.; Guertin, S.; Berstein, J.; White, M.; Menke, R.; Kayali, S.
2003-01-01
This paper presents a reliability evaluation methodology for obtaining statistical reliability information on memory chips for space applications when the test sample size must be kept small because of the high cost of radiation-hardened memories.
THE IMPACT OF PASSIVE SAMPLING METHODOLOGIES USED IN THE DEARS
This abstract details the use of passive sampling methodologies in the Detroit Exposure and Aerosol Research Study (DEARS). A discussion about the utility of various gas-phase passive samplers used in the study will be described along with examples of field data measurements empl...
Charan, J; Saxena, D
2014-01-01
Biased negative studies not only reflect poor research effort but also have an impact on patient care, as they prevent further research with similar objectives, leaving potential research areas unexplored. Hence, published negative studies should be methodologically strong. All parameters that may help a reader judge the validity of the results and conclusions should be reported in published negative studies. There is a paucity of data on the reporting of statistical and methodological parameters in negative studies published in Indian medical journals. The present systematic review was designed to critically evaluate negative studies published in prominent Indian medical journals for the reporting of statistical and methodological parameters. Systematic review. All negative studies published in 15 Science Citation Indexed (SCI) medical journals published from India were included. Investigators involved in the study evaluated all negative studies for the reporting of various parameters. Primary endpoints were the reporting of power and confidence intervals. Power was reported in 11.8% of studies. Confidence intervals were reported in 15.7% of studies. Most parameters, such as sample size calculation (13.2%), type of sampling method (50.8%), names of statistical tests (49.1%), adjustment for multiple endpoints (1%), and post hoc power calculation (2.1%), were poorly reported. Reporting frequency was higher in clinical trials than in other study designs, and in journals with an impact factor greater than 1 than in journals with an impact factor below 1. Negative studies published in prominent Indian medical journals do not report statistical and methodological parameters adequately, and this may create problems in the critical appraisal of their findings by readers.
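The review's two primary endpoints, power and confidence intervals, are inexpensive to report. As a hedged illustration (the counts below are hypothetical, not from the review), a 95% Wilson score interval for a reported proportion such as "11.8% of studies reported power" can be computed as:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z ** 2 / n
    center = (p + z ** 2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return center - half, center + half

# Hypothetical: 24 of 203 negative studies reporting power (~11.8%)
lo, hi = wilson_ci(24, 203)
print(f"95% CI: {lo:.3f} to {hi:.3f}")
```

The Wilson interval is a standard choice for proportions; the review itself does not specify which interval method it recommends.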
NASA Astrophysics Data System (ADS)
Li, Z.; Che, W.; Frey, H. C.; Lau, A. K. H.
2016-12-01
Portable air monitors are currently being developed and used to enable a move towards exposure monitoring as opposed to fixed-site monitoring. Reliable methods of capturing spatial and temporal variability in exposure concentration are needed to obtain credible data from which to develop efficient exposure mitigation measures. However, few studies quantify the validity and repeatability of the collected data. The objective of this study is to present and evaluate a collocated exposure monitoring (CEM) methodology comprising the calibration of portable air monitors against stationary reference equipment, side-by-side comparison of portable air monitors, personal or microenvironmental exposure monitoring, and the processing and interpretation of the collected data. The CEM methodology was evaluated by applying it to the portable monitors TSI DustTrak II Aerosol Monitor 8530 for fine particulate matter (PM2.5) and TSI Q-Trak model 7575 with probe model 982 for CO, CO2, temperature, and relative humidity. Using a school sampling campaign in Hong Kong in January and June 2015 as an example, the calibrated side-by-side 1 Hz PM2.5 measurements showed good consistency between two sets of portable air monitors. Confidence in the side-by-side comparison, in which PM2.5 concentrations agreed within 2 percent most of the time, enabled robust inference regarding differences when the monitors measured the classroom and pedestrian microenvironments during school hours. The proposed CEM methodology can be widely applied in sampling campaigns whose objective is to simultaneously characterize pollutant concentrations in two or more locations or microenvironments. A further application of the CEM methodology to transportation exposure will be presented and discussed.
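A minimal sketch of the calibration step in a CEM-style workflow, assuming an ordinary-least-squares correction of a portable monitor's readings against collocated reference measurements. The slope/intercept correction and all data values are illustrative assumptions, not the study's actual procedure or data:

```python
def ols_fit(x, y):
    """Ordinary least squares slope and intercept for paired readings."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical collocated PM2.5 readings (ug/m3): portable vs. reference
portable = [12.0, 20.0, 31.0, 44.0, 52.0]
reference = [10.0, 18.0, 28.0, 40.0, 47.0]
slope, intercept = ols_fit(portable, reference)

# Apply the correction to subsequent field readings from the portable unit
calibrated = [slope * v + intercept for v in portable]
```

After calibration, side-by-side runs of two calibrated units can be compared directly, which is the consistency check the abstract describes.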
The complexity of personality: advantages of a genetically sensitive multi-group design.
Hahn, Elisabeth; Spinath, Frank M; Siedler, Thomas; Wagner, Gert G; Schupp, Jürgen; Kandler, Christian
2012-03-01
Findings from many behavioral genetic studies utilizing the classical twin design suggest that genetic and non-shared environmental effects play a significant role in human personality traits. This study focuses on the methodological advantages of extending the sampling frame to include multiple dyads of relatives. We investigated the sensitivity of heritability estimates to the inclusion of sibling pairs, mother-child pairs and grandparent-grandchild pairs from the German Socio-Economic Panel Study in addition to a classical German twin sample consisting of monozygotic and dizygotic twins. The resulting dataset contained 1,308 pairs, including 202 monozygotic and 147 dizygotic twin pairs, along with 419 sibling pairs, 438 mother-child dyads, and 102 grandparent-child dyads. This genetically sensitive multi-group design allowed the simultaneous testing of additive and non-additive genetic, common and specific environmental effects, including cultural transmission and twin-specific environmental influences. Using manifest and latent modeling of phenotypes (i.e., controlling for measurement error), we compare results from the extended sample with those from the twin sample alone and discuss implications for future research.
Distribution of Amino Acids in Lunar Regolith
NASA Technical Reports Server (NTRS)
Elsila, J. E.; Callahan, M. P.; Glavin, D. P.; Dworkin, J. P.; Noble, S. K.; Gibson, E. K., Jr.
2014-01-01
One of the most eagerly studied questions upon initial return of lunar samples was whether significant amounts of organic compounds, including amino acids, were present. Analyses during the 1970s produced only tentative and inconclusive identifications of indigenous amino acids. Those analyses were hampered by analytical difficulties, including relative insensitivity to certain compounds, the inability to separate chiral enantiomers, and the lack of compound-specific isotopic measurements, which made it impossible to determine whether the detected amino acids were indigenous to the lunar samples or the result of contamination. Numerous advances have been made in instrumentation and methodology for amino acid characterization in extraterrestrial samples in the intervening years, yet the origin of amino acids in lunar regolith samples has been revisited only once, for a single lunar sample,(3) and remains unclear. Here, we present initial data from the analyses of amino acid abundances in 12 lunar regolith samples. We discuss these abundances in the context of four potential amino acid sources: (1) terrestrial biological contamination; (2) contamination from lunar module (LM) exhaust; (3) derivation from solar wind-implanted precursors; and (4) exogenous delivery from meteorites.
Chip-LC-MS for label-free profiling of human serum.
Horvatovich, Peter; Govorukhina, Natalia I; Reijmers, Theo H; van der Zee, Ate G J; Suits, Frank; Bischoff, Rainer
2007-12-01
The discovery of biomarkers in easily accessible body fluids such as serum is one of the most challenging topics in proteomics, requiring highly efficient separation and detection methodologies. Here, we present the application of a microfluidics-based LC-MS system (chip-LC-MS) to the label-free profiling of immunodepleted, trypsin-digested serum in comparison to conventional capillary LC-MS (cap-LC-MS). Both systems proved to have a repeatability of approximately 20% RSD for peak area, all sample preparation steps included, while the repeatability of the LC-MS part by itself was less than 10% RSD for the chip-LC-MS system. Importantly, the chip-LC-MS system had twice the resolution in the LC dimension and resulted in a lower average charge state of the tryptic peptide ions generated in the ESI interface when compared to cap-LC-MS, while requiring approximately 30 times less (~5 pmol) sample. In order to characterize both systems for their capability to find discriminating peptides in trypsin-digested serum samples, five out of ten individually prepared, identical sera were spiked with horse heart cytochrome c. A comprehensive data processing methodology was applied, including 2-D smoothing, resolution reduction, peak picking, time alignment, and matching of the individual peak lists to create an aligned peak matrix amenable to statistical analysis. Statistical analysis by supervised classification and variable selection showed that both LC-MS systems could discriminate the two sample groups. However, the chip-LC-MS system allowed 55% of the overall signal to be assigned to selected peaks, against 32% for the cap-LC-MS system.
[Theoretical and methodological uses of research in Social and Human Sciences in Health].
Deslandes, Suely Ferreira; Iriart, Jorge Alberto Bernstein
2012-12-01
The current article aims to map and critically reflect on the current theoretical and methodological uses of research in the subfield of social and human sciences in health. A convenience sample was used to select three Brazilian public health journals. Based on a reading of 1,128 abstracts published from 2009 to 2010, 266 articles were selected that presented the empirical base of research stemming from social and human sciences in health. The sample was classified thematically as "theoretical/methodological reference", "study type/methodological design", "analytical categories", "data production techniques", and "analytical procedures". We analyze the sample's emic categories, drawing on the authors' literal statements. All the classifications and respective variables were tabulated in Excel. Most of the articles were self-described as qualitative and used more than one data production technique. There was a wide variety of theoretical references, in contrast with the almost total predominance of a single type of data analysis (content analysis). In several cases, important gaps were identified in expounding the study methodology and instrumental use of the qualitative research techniques and methods. However, the review did highlight some new objects of study and innovations in theoretical and methodological approaches.
Kendall, Carl; Kerr, Ligia R F S; Gondim, Rogerio C; Werneck, Guilherme L; Macena, Raimunda Hermelinda Maia; Pontes, Marta Kerr; Johnston, Lisa G; Sabin, Keith; McFarland, Willi
2008-07-01
Obtaining samples of populations at risk for HIV challenges surveillance, prevention planning, and evaluation. Methods used include snowball sampling, time location sampling (TLS), and respondent-driven sampling (RDS). Few studies have made side-by-side comparisons to assess their relative advantages. We compared snowball, TLS, and RDS surveys of men who have sex with men (MSM) in Fortaleza, Brazil, with a focus on comparing the socio-economic status (SES) and risk behaviors of the samples to each other, to known AIDS cases, and to the general population. RDS produced a sample with wider inclusion of lower SES than snowball sampling or TLS, a finding of health significance given that the majority of AIDS cases reported among MSM in the state were low SES. RDS also achieved the sample size faster and at lower cost. For reasons of inclusion and cost-efficiency, RDS is the sampling methodology of choice for HIV surveillance of MSM in Fortaleza.
Botasini, Santiago; Heijo, Gonzalo; Méndez, Eduardo
2013-10-24
In recent years, the number of works focused on the development of novel nanoparticle-based sensors for mercury detection has increased, mainly motivated by the need for low-cost portable devices capable of giving a fast and reliable analytical response, thus contributing to analytical decentralization. Methodologies employing colorimetric, fluorometric, magnetic, and electrochemical output signals have reached detection limits within the pM and nM ranges. Most of these developments have proved their suitability for detecting and quantifying mercury(II) ions in synthetic solutions or spiked water samples. However, the state of the art in these technologies still lags behind standard methods of mercury quantification, such as cold vapor atomic absorption spectrometry and inductively coupled plasma techniques, in terms of reliability and sensitivity. This is mainly because the response of nanoparticle-based sensors is highly affected by the sample matrix. The developed analytical nanosystems may fail in real samples because of the negative incidence of ionic strength and the presence of exchangeable ligands. The aim of this review is to critically consider the recently published innovations in this area and to highlight the need to include more realistic assays in future research in order to make these advances suitable for on-site analysis.
Coleman, Laci S.; Ford, W. Mark; Dobony, Christopher A.; Britzke, Eric R.
2014-01-01
Concomitant with the emergence and spread of white-nose syndrome (WNS) and the precipitous decline of many bat species in North America, natural resource managers need modified and/or new techniques for bat inventory and monitoring that provide robust occupancy estimates. We used Anabat acoustic detectors to determine the most efficient passive acoustic sampling design for optimizing detection probabilities of multiple bat species in a WNS-impacted environment in New York, USA. Our sampling protocol included six acoustic stations deployed for the entire duration of monitoring, as well as a 4 x 4 grid and five transects of 5-10 acoustic units deployed for 6-8-night sample durations during the summers of 2011-2012. We used Program PRESENCE to determine detection probability and site occupancy estimates. Overall, the grid produced the highest detection probabilities for most species because it contained the most detectors and intercepted the greatest spatial area. However, big brown bats (Eptesicus fuscus) and species not impacted by WNS were detected easily regardless of sampling array. Endangered Indiana bats (Myotis sodalis), little brown bats (Myotis lucifugus), and tri-colored bats (Perimyotis subflavus) showed declines in detection probabilities over our study, potentially indicative of continued WNS-associated declines. Identification of species presence through efficient methodologies is vital for future conservation efforts as bat populations decline further due to WNS and other factors.
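The link between deployment length and detection in the protocol above can be sketched with a standard occupancy-modeling identity (this is textbook arithmetic, not output from Program PRESENCE): if a species present at a site is detected on any one night with probability p, the chance of at least one detection over n survey nights is 1 - (1 - p)^n.

```python
def cumulative_detection(p_nightly, n_nights):
    """Probability of at least one detection across n independent survey nights."""
    return 1.0 - (1.0 - p_nightly) ** n_nights

# e.g., a hypothetical nightly detection probability of 0.3 over the
# study's 6-8 night deployments:
for n in (6, 8):
    print(n, "nights:", round(cumulative_detection(0.3, n), 3))
```

This is why longer deployments and denser arrays (the 4 x 4 grid) raise detection probabilities, while species whose nightly p has collapsed under WNS remain hard to detect even with more survey nights.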
Boyacı, Ezel; Bojko, Barbara; Reyes-Garcés, Nathaly; Poole, Justen J; Gómez-Ríos, Germán Augusto; Teixeira, Alexandre; Nicol, Beate; Pawliszyn, Janusz
2018-01-18
In vitro high-throughput non-depletive quantitation of chemicals in biofluids is of growing interest in many areas. Some of the challenges facing researchers include the limited volume of biofluids, rapid and high-throughput sampling requirements, and the lack of reliable methods. Coupled to the above, growing interest in the monitoring of kinetics and dynamics of miniaturized biosystems has spurred the demand for development of novel and revolutionary methodologies for analysis of biofluids. The applicability of solid-phase microextraction (SPME) is investigated as a potential technology to fulfill the aforementioned requirements. As analytes with sufficient diversity in their physicochemical features, nicotine, N,N-Diethyl-meta-toluamide, and diclofenac were selected as test compounds for the study. The objective was to develop methodologies that would allow repeated non-depletive sampling from 96-well plates, using 100 µL of sample. Initially, thin film-SPME was investigated. Results revealed substantial depletion and consequent disruption in the system. Therefore, new ultra-thin coated fibers were developed. The applicability of this device to the described sampling scenario was tested by determining the protein binding of the analytes. Results showed good agreement with rapid equilibrium dialysis. The presented method allows high-throughput analysis using small volumes, enabling fast reliable free and total concentration determinations without disruption of system equilibrium.
Heterogenic Solid Biofuel Sampling Methodology and Uncertainty Associated with Prompt Analysis
Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Patiño, David; Collazo, Joaquín
2010-01-01
Accurate determination of the properties of biomass is of particular interest in studies on biomass combustion or cofiring. The aim of this paper is to develop a methodology for prompt analysis of heterogeneous solid fuels with an acceptable degree of accuracy. Special care must be taken with the sampling procedure to achieve an acceptable degree of error and low statistical uncertainty. A sampling and error determination methodology for prompt analysis is presented and validated. Two approaches for the propagation of errors are also given and some comparisons are made in order to determine which may be better in this context. Results show in general low, acceptable levels of uncertainty, demonstrating that the samples obtained in the process are representative of the overall fuel composition. PMID:20559506
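The two error-propagation approaches mentioned in the abstract are not specified in detail, but the classic contrast can be sketched as a worst-case linear sum of relative errors versus addition in quadrature for independent error sources. The component values below are illustrative assumptions, not the paper's data:

```python
import math

def propagate_linear(rel_errors):
    """Worst-case propagation: relative errors add linearly."""
    return sum(rel_errors)

def propagate_quadrature(rel_errors):
    """Propagation for independent sources: relative errors add in quadrature."""
    return math.sqrt(sum(e ** 2 for e in rel_errors))

# Hypothetical relative errors from sampling, grinding, and analysis steps
errors = [0.05, 0.03, 0.04]
print("linear:    ", round(propagate_linear(errors), 4))
print("quadrature:", round(propagate_quadrature(errors), 4))
```

The quadrature estimate is always the smaller of the two; which is appropriate depends on whether the error sources can reasonably be treated as independent, which is presumably the kind of comparison the paper makes.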
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-03
... determine endpoints; questionnaire design and analyses; and presentation of survey results. To date, FDA has..., the workshop will invest considerable time in identifying best methodological practices for conducting... sample, sample size, question design, process, and endpoints. Panel 2 will focus on alternatives to...
Byun, Min Soo; Yi, Dahyun; Lee, Jun Ho; Choe, Young Min; Sohn, Bo Kyung; Lee, Jun-Young; Choi, Hyo Jung; Baek, Hyewon; Kim, Yu Kyeong; Lee, Yun-Sang; Sohn, Chul-Ho; Mook-Jung, Inhee; Choi, Murim; Lee, Yu Jin; Lee, Dong Woo; Ryu, Seung-Ho; Kim, Shin Gyeom; Kim, Jee Wook; Woo, Jong Inn; Lee, Dong Young
2017-11-01
The Korean Brain Aging Study for the Early Diagnosis and Prediction of Alzheimer's disease (KBASE) aimed to recruit 650 individuals, aged from 20 to 90 years, to search for new biomarkers of Alzheimer's disease (AD) and to investigate how multi-faceted lifetime experiences and bodily changes contribute to the brain changes or brain pathologies related to the AD process. All participants received comprehensive clinical and neuropsychological evaluations; multi-modal brain imaging, including magnetic resonance imaging, magnetic resonance angiography, [11C]Pittsburgh compound B-positron emission tomography (PET), and [18F]fluorodeoxyglucose-PET; and blood and genetic marker analyses at baseline, and a subset of participants underwent actigraph monitoring and completed a sleep diary. Participants are to be followed annually with clinical and neuropsychological assessments, and biannually with the full KBASE assessment, including neuroimaging and laboratory tests. As of March 2017, in total, 758 individuals had volunteered for this study. Among them, 591 participants were enrolled at baseline after excluding 162 individuals: 291 cognitively normal (CN) old-aged individuals, 74 CN young- and middle-aged individuals, 139 individuals with mild cognitive impairment (MCI), and 87 individuals with AD dementia (ADD). A subset of participants (n=275) underwent actigraph monitoring. The KBASE cohort is a prospective, longitudinal cohort study that recruited participants with a wide age range and a wide distribution of cognitive status (CN, MCI, and ADD), and it has several strengths in its design and methodologies. Details of the recruitment, study methodology, and baseline sample characteristics are described in this paper.
Measuring the sea: the first oceanographic cruise (1679-1680) and the roots of oceanography
NASA Astrophysics Data System (ADS)
Pinardi, Nadia; Özsoy, Emin; Latif, Mohammed Abdul; Moroni, Franca; Grandi, Alessandro; Manzella, Giuseppe; De Strobel, Federico; Lyubartsev, Vladyslav
2017-04-01
The first quantitative measurements of seawater properties were carried out by Count Luigi Ferdinando Marsili in a cruise between 1679 and 1680 in the Aegean Sea, the Marmara Sea and the Bosphorus Strait. The data reported in the historical oceanographic treatise "Osservazioni intorno al Bosforo Tracio" (Marsili, 1681) allowed us to reconstruct the seawater density at different geographic locations in 1679-1680. The Marsili experimental methodology included the collection of surface and deep water samples from the ship, the analysis of the samples with a hydrostatic ampoule, and the choice of a reference water to standardize the measurements. Comparison of the reconstructed densities with present-day values shows agreement within a 10-20% uncertainty, owing to aspects of the measurement methodology that are difficult to reconstruct from the documentary evidence. The experimental data collected in the Bosphorus allowed Marsili to enunciate a theory on the cause of the two-layer flow at the Strait, thereafter confirmed by many laboratory and numerical studies.
Strategies to address participant misrepresentation for eligibility in Web-based research.
Kramer, Jessica; Rubin, Amy; Coster, Wendy; Helmuth, Eric; Hermos, John; Rosenbloom, David; Moed, Rich; Dooley, Meghan; Kao, Ying-Chia; Liljenquist, Kendra; Brief, Deborah; Enggasser, Justin; Keane, Terence; Roy, Monica; Lachowicz, Mark
2014-03-01
Emerging methodological research suggests that the World Wide Web ("Web") is an appropriate venue for survey data collection, and a promising area for delivering behavioral intervention. However, the use of the Web for research raises concerns regarding sample validity, particularly when the Web is used for recruitment and enrollment. The purpose of this paper is to describe the challenges experienced in two different Web-based studies in which participant misrepresentation threatened sample validity: a survey study and an online intervention study. The lessons learned from these experiences generated three types of strategies researchers can use to reduce the likelihood of participant misrepresentation for eligibility in Web-based research. Examples of procedural/design strategies, technical/software strategies and data analytic strategies are provided along with the methodological strengths and limitations of specific strategies. The discussion includes a series of considerations to guide researchers in the selection of strategies that may be most appropriate given the aims, resources and target population of their studies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lacy, Shaw Nozaki, E-mail: shaw.lacy@gmail.com; Departmento de Ecosistemas y Medio Ambiente, Pontificia Universidad Católica de Chile, Vicuña Mackenna 4860, Macul; Centro Interdisciplinario de Cambio Global, Vicuña Mackenna 4860, Macul
Chile was one of many countries that initiated environmental impact assessments in the 1990s, and has relied on their use for species conservation and territorial planning without the use of larger-scale environmental and ecological planning. The capacity of Chile's environmental impact assessment system (SEIA) to evaluate resident freshwater fishes and the potential impacts of water projects and aquaculture activities – two categories of projects that create direct threats to freshwater fishes – is assessed. Of the 3997 such submissions to the SEIA, only 0.6% conducted any freshwater fish assessment, and only 0.1% conducted any quantitative assessment of expected impacts from the associated project. The small number of assessments was characterized by poor study design, inconsistent sampling methodology, and species misidentification. Traditional assessments failed to include freshwater fish ecology in the general assessment framework. The new strategic environmental evaluation system only underscores the need for vastly improved field sampling protocols and assessment methodologies.
Cai, Dan; Stone, Teresa E; Petrini, Marcia A; McMillan, Margaret
2016-03-01
Q-methodology was used to investigate the health beliefs of Chinese clinical nurses and nurse academics. Twenty-eight participants from one hospital and nursing school in China were involved. The four stages of this study included: (i) concourse development from literature review, Internet searches, and key informant interviews; (ii) a pilot study to develop the Q-sample from the concourse; (iii) participants sorted the Q-sample statements along a continuum of preference (Q-sorting); and (iv) PQ data analysis using principal component analysis and varimax rotation. Five viewpoints were revealed: (i) factor 1--health management and the importance of evidence; (ii) factor 2--challenging local cultural belief, and Eastern and Western influences; (iii) factor 3--commonsense; (iv) factor 4--health and clinical practice; and (v) factor 5--health and nursing education. This study presents a need for nurses and nurse academics to think critically, examine their long-held health beliefs, and promote the use of evidence-based practice.
Pelvic floor muscle training protocol for stress urinary incontinence in women: A systematic review.
Oliveira, Marlene; Ferreira, Margarida; Azevedo, Maria João; Firmino-Machado, João; Santos, Paula Clara
2017-07-01
Strengthening exercises for pelvic floor muscles (SEPFM) are considered the first approach in the treatment of stress urinary incontinence (SUI). Nevertheless, there is no evidence about training parameters. To identify the protocol and/or most effective training parameters in the treatment of female SUI. A literature search was conducted in the PubMed, Cochrane Library, PEDro, Web of Science and Lilacs databases, with publishing dates ranging from January 1992 to March 2014. The included articles were experimental studies published in English in which SEPFM were compared with placebo treatment (usual or no treatment). The sample had a diagnosis of SUI, and ages ranged between 18 and 65 years. Methodological quality was assessed using the PEDro scale. Seven articles of high methodological quality were included in this review. The sample consisted of 331 women, mean age 44.4±5.51 years, average duration of urinary loss of 64±5.66 months, and severity of SUI ranging from mild to severe. SEPFM programs included different training parameters concerning the PFM. Some studies applied abdominal training and adjuvant techniques. Urine leakage cure rates varied from 28.6 to 80%, while the strength increase of the PFM varied from 15.6 to 161.7%. The most effective training protocol consists of SEPFM with digital palpation combined with biofeedback monitoring and vaginal cones, with 12 weeks of training and ten repetitions per series in different positions, compared with SEPFM alone or no treatment.
Quantitation and detection of vanadium in biologic and pollution materials
NASA Technical Reports Server (NTRS)
Gordon, W. A.
1974-01-01
A review is presented of special considerations and methodology for determining vanadium in biological and air pollution materials. In addition to descriptions of specific analysis procedures, general sections are included on quantitation of analysis procedures, sample preparation, blanks, and methods of detection of vanadium. Most of the information presented is applicable to the determination of other trace elements in addition to vanadium.
Draw-and-Write Technique Elicits Children's Perceptions of Health in the USA and Guatemala
ERIC Educational Resources Information Center
Renslow, Jillian; Maupin, Jonathan
2018-01-01
Objective: Using the draw-and-write methodology, this study examined cross-cultural similarities and differences in children's perceptions of health. Design: Cross-sectional design. Setting: One public elementary school in the USA and in Guatemala. Method: The total sample included 161 children 9-10 years of age, 80 in the USA and 81 in Guatemala.…
Andrew Moldenke; Becky Fichter
1988-01-01
A fully illustrated key is presented for identifying genera of oribatid mites known from or suspected of occurring in the Pacific Northwest. The manual includes an introduction detailing sampling methodology; an illustrated glossary of all terminology used; two color plates of all taxa from the H. J. Andrews Experimental Forest; a diagrammatic key to the 16 major...
ERIC Educational Resources Information Center
Çetin, Baris
2015-01-01
The aim of the present study was to investigate the differences in teacher candidates' metacognitive skills analyzed according to the year of study in their undergraduate program they were in. The research methodology in the study was survey. Among survey types, the cross-sectional design was used. The sample of the study included a total of 1072…
Assessing technical performance at diverse ambulatory care sites.
Osterweis, M; Bryant, E
1978-01-01
The purpose of the large study reported here was to develop and test methods for assessing the quality of health care that would be broadly applicable to diverse ambulatory care organizations for periodic comparative review. Methodological features included the use of an age-sex stratified random sampling scheme, dependence on medical records as the source of data, a fixed study period year, use of Kessner's tracer methodology (including not only acute and chronic diseases but also screening and immunization rates as indicators), and a fixed tracer matrix at all test sites. This combination of methods proved more efficacious in estimating certain parameters for the total patient populations at each site (including utilization patterns, screening, and immunization rates) and the process of care for acute conditions than it did in examining the process of care for the selected chronic condition. It was found that the actual process of care at all three sites for the three acute conditions (streptococcal pharyngitis, urinary tract infection, and iron deficiency anemia) often differed from the expected process in terms of both diagnostic procedures and treatment. For hypertension, the chronic disease tracer, medical records were frequently a deficient data source from which to draw conclusions about the adequacy of treatment. Several aspects of the study methodology were found to be detrimental to between-site comparisons of the process of care for chronic disease management. The use of an age-sex stratified random sampling scheme resulted in the identification of too few cases of hypertension at some sites for analytic purposes, thereby necessitating supplementary sampling by diagnosis. The use of a fixed study period year resulted in an arbitrary starting point in the course of the disease. 
Furthermore, in light of the diverse sociodemographic characteristics of the patient populations, the use of a fixed matrix of tracer conditions for all test sites is questionable. The discussion centers on these and other problems encountered in attempting to compare technical performance within diverse ambulatory care organizations and provides some guidelines as to the utility of alternative methods for assessing the quality of health care.
Quantitative x-ray diffraction mineralogy of Los Angeles basin core samples
Hein, James R.; McIntyre, Brandie R.; Edwards, Brian D.; Lakota, Orion I.
2006-01-01
This report contains X-ray diffraction (XRD) analysis of mineralogy for 81 sediment samples from cores taken from three drill holes in the Los Angeles Basin in 2000-2001. We analyzed 26 samples from Pier F core, 29 from Pier C core, and 26 from the Webster core. These three sites provide an offshore-onshore record across the Southern California coastal zone. This report is designed to be a data repository; these data will be used in further studies, including geochemical modeling as part of the CABRILLO project. Summary tables quantify the major mineral groups, whereas detailed mineralogy is presented in three appendices. The rationale, methodology, and techniques are described in the following paper.
Spindler, Patrice; Paretti, Nick V.
2007-01-01
The Arizona Department of Environmental Quality (ADEQ) and the U.S. Environmental Protection Agency (USEPA) Ecological Monitoring and Assessment Program (EMAP) use different field methods for collecting macroinvertebrate samples and habitat data for bioassessment purposes. Arizona's Biocriteria index was developed using a riffle-habitat sampling methodology, whereas the EMAP method employs a multi-habitat sampling protocol. There was a need to demonstrate the comparability of these different bioassessment methodologies, to allow use of the EMAP multi-habitat protocol both for statewide probabilistic assessments that integrate the EMAP data into the national (305b) assessment and for targeted in-state bioassessments supporting 303d determinations of standards violations and impaired aquatic-life conditions. The purpose of this study was to evaluate whether the two methods yield similar bioassessment results, such that the data could be used interchangeably in water quality assessments. In this Regional EMAP grant-funded project, a probabilistic survey of 30 sites in the Little Colorado River basin was conducted in the spring of 2007. Macroinvertebrate and habitat data were collected using both ADEQ and EMAP sampling methods, from adjacent reaches within these stream channels.
All analyses indicated that the two macroinvertebrate sampling methods were significantly correlated. ADEQ and EMAP samples were classified into the same scoring categories (meeting, inconclusive, violating the biocriteria standard) 82% of the time. When the ADEQ-IBI was applied to both the ADEQ and EMAP taxa lists, the resulting IBI scores were significantly correlated (r=0.91), even though only 4 of the 7 metrics in the IBI were significantly correlated. The IBI scores from both methods were significantly correlated to the percent of riffle habitat, even though the average percent riffle habitat was only 30% of the stream reach. Multivariate analyses found that the percent riffle was an important attribute for both datasets in classifying IBI scores into assessment categories.
Habitat measurements generated from EMAP and ADEQ methods were also significantly correlated; 13 of 16 habitat measures were significantly correlated (p<0.01). The visual-based percentage estimates of percent riffle and pool habitats, vegetative cover and percent canopy cover, and substrate measurements of percent fine substrate and embeddedness were all remarkably similar, given the different field methods used. A multivariate analysis identified substrate and flow conditions, as well as canopy cover, as important combinations of habitat attributes affecting the IBI scores from both methods. These results indicate that similar habitat measures can be obtained using two different field sampling protocols. In addition, similar combinations of these habitat parameters were important to macroinvertebrate community condition in multivariate analyses of both ADEQ and EMAP datasets.
These results indicate the two sampling methods for macroinvertebrates and habitat data were very similar in terms of bioassessment results and stressors. While the bioassessment category was not identical for all sites, overall the assessments were significantly correlated, providing similar bioassessment results for the cold water streams used in this study. The findings of this study indicate that ADEQ can utilize either a riffle-based sampling methodology or a multi-habitat sampling approach in cold water streams as both yield similar results relative to the macroinvertebrate assemblage. These results will allow for use of either macroinvertebrate dataset to determine water quality standards compliance with the ADEQ Indexes of Biological Integrity, for which threshold values were just recently placed into the Arizona Surface Water Quality Standards. While this survey did not include warm water desert streams of Arizona, we would predict that EMAP and ADEQ sampling methodologies would provide similar bioassessment results and would not be significantly different, as we have found that the percent riffle habitat in cold and warm water perennial, wadeable streams is not significantly different. However, a comparison study of sampling methodologies in warm water streams should be conducted to confirm the predicted similarity of bioassessment results. ADEQ will continue to implement a monitoring strategy that includes probabilistic monitoring for a statewide ecological assessment of stream conditions. Conclusions from this study will guide decisions regarding the most appropriate sampling methods for future probabilistic monitoring sample plans.
Validation of a sampling plan to generate food composition data.
Sammán, N C; Gimenez, M A; Bassett, N; Lobo, M O; Marcoleri, M E
2016-02-15
A methodology to develop systematic plans for food sampling was proposed. Long-life whole milk, long-life skimmed milk, and sunflower oil were selected to validate the methodology in Argentina. Fatty acid profiles in all foods, proximate composition, and calcium content in milk were determined with AOAC methods. The number of samples (n) was calculated by applying Cochran's formula with coefficients of variation ≤12% and a maximum permissible estimation error (r) ≤5% for calcium content in milk and unsaturated fatty acids in oil. The resulting n was 9, 11, and 21 for long-life whole milk, skimmed milk, and sunflower oil, respectively. Sample units were randomly collected from production sites and sent to laboratories. The r calculated from the experimental data was ≤10%, indicating high accuracy in determining the analytes of greatest variability and confirming the reliability of the proposed sampling plan. The methodology is an adequate and useful tool for developing sampling plans for food composition analysis. Copyright © 2015 Elsevier Ltd. All rights reserved.
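The sample-size arithmetic described above can be sketched as follows. This uses the relative-precision form of Cochran's formula with t = 1.96 for 95% confidence, a standard statistical result; the paper's exact computation and per-food coefficients of variation may differ.

```python
import math

def cochran_n(cv_percent, r_percent, t=1.96):
    """Cochran's sample-size formula in its relative-precision form:
    n = (t * CV / r)^2, where CV is the coefficient of variation (%)
    and r the maximum permissible relative error (%) of the mean."""
    return math.ceil((t * cv_percent / r_percent) ** 2)

# With the paper's stated limits (CV <= 12%, r <= 5%), the formula
# bounds n at about two dozen samples; smaller per-analyte CVs yield
# the smaller n values reported for the milks.
n_upper_bound = cochran_n(12, 5)
```

Foods with lower analyte variability (smaller CV) need fewer sample units for the same target precision, which is consistent with milk requiring fewer samples than the oil in the study.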
Li, Honghe; Ding, Ning; Zhang, Yuanyuan; Liu, Yang; Wen, Deliang
2017-01-01
Over the last three decades, various instruments have been developed and employed to assess medical professionalism, but their measurement properties have yet to be fully evaluated. This study aimed to systematically evaluate these instruments' measurement properties and the methodological quality of their related studies within a universally acceptable standardized framework, and then to provide corresponding recommendations. A systematic search of the electronic databases PubMed, Web of Science, and PsycINFO was conducted to collect studies published from 1990-2015. After screening titles, abstracts, and full texts for eligibility, the articles included in this study were classified according to their respective instrument's usage. A two-phase assessment was conducted: 1) methodological quality was assessed by following the COnsensus-based Standards for the selection of health status Measurement INstruments (COSMIN) checklist; and 2) the quality of measurement properties was assessed according to Terwee's criteria. Results were integrated using best-evidence synthesis to identify recommendable instruments. After screening 2,959 records, 74 instruments from 80 existing studies were included. The overall methodological quality of these studies was unsatisfactory, for reasons including, but not limited to, unknown missing data, inadequate sample sizes, and vague hypotheses. Content validity, cross-cultural validity, and criterion validity were either unreported or negatively rated in most studies. Based on best-evidence synthesis, three instruments were recommended: Hisar's instrument for nursing students, the Nurse Practitioners' Roles and Competencies Scale, and the Perceived Faculty Competency Inventory. Although instruments measuring medical professionalism are diverse, only a limited number of studies were methodologically sound. Future studies should give priority to systematically improving the performance of existing instruments and to longitudinal studies.
Mhaskar, Rahul; Djulbegovic, Benjamin; Magazin, Anja; Soares, Heloisa P.; Kumar, Ambuj
2011-01-01
Objectives: To assess whether the reported methodological quality of randomized controlled trials (RCTs) reflects their actual methodological quality, and to evaluate the association of effect size (ES) and sample size with methodological quality. Study design: Systematic review. Setting: Retrospective analysis of all consecutive phase III RCTs published by 8 National Cancer Institute Cooperative Groups until the year 2006. Data were extracted from protocols (actual quality) and publications (reported quality) for each study. Results: 429 RCTs met the inclusion criteria. Overall reporting of methodological quality was poor and did not reflect the actual high methodological quality of the RCTs. The results showed no association between sample size and the actual methodological quality of a trial. Poor reporting of allocation concealment and blinding exaggerated the ES by 6% (ratio of hazard ratios [RHR]: 0.94, 95%CI: 0.88, 0.99) and 24% (RHR: 1.24, 95%CI: 1.05, 1.43), respectively. However, assessment of actual quality showed no association between ES and methodological quality. Conclusion: This largest study to date shows that poor quality of reporting does not reflect the actual high methodological quality. Assessment of the impact of quality on the ES based on reported quality can produce misleading results. PMID:22424985
Food-service establishment wastewater characterization.
Lesikar, B J; Garza, O A; Persyn, R A; Kenimer, A L; Anderson, M T
2006-08-01
Food-service establishments that use on-site wastewater treatment systems are experiencing pretreatment system and/or drain field hydraulic and/or organic overloading. This study included characterization of four wastewater parameters (five-day biochemical oxygen demand [BOD5]; total suspended solids [TSS]; fats, oil, and grease [FOG]; and flow) from 28 restaurants located in Texas during June, July, and August 2002. The field sampling methodology consisted of taking a grab sample from each restaurant for 6 consecutive days at approximately the same time each day, followed by a 2-week break, and then sampling again for another 6 consecutive days, for a total of 12 samples per restaurant and 336 total observations. The analysis indicates higher organic (BOD5) and hydraulic values for restaurants than those typically found in the literature. The design values from this study for BOD5, TSS, FOG, and flow were 1523 mg/L, 664 mg/L, 197 mg/L, and 96 L/day per seat, respectively, which captured over 80% of the data collected.
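A design value that "captures over 80% of the data collected" can be read as an empirical percentile of the grab-sample concentrations. A minimal sketch using the nearest-rank convention follows; the study's exact percentile method is not stated, so this is an illustrative assumption.

```python
import math

def design_value(samples, capture=0.80):
    """Concentration at or below which a `capture` fraction of the
    observed grab samples fall (nearest-rank empirical percentile)."""
    ordered = sorted(samples)
    k = max(1, math.ceil(capture * len(ordered)))  # nearest rank
    return ordered[k - 1]
```

Applied to, say, the 12 BOD5 grab samples from one restaurant, this returns the concentration exceeded by at most 20% of the observations, which is how a design value stricter than the mean can arise.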
Plant gum identification in historic artworks
Granzotto, Clara; Arslanoglu, Julie; Rolando, Christian; Tokarski, Caroline
2017-01-01
We describe an integrated and straightforward new analytical protocol that identifies plant gums from various sample sources, including cultural heritage. Our approach is based on the identification of saccharidic fingerprints by mass spectrometry following controlled enzymatic hydrolysis. We developed an enzyme cocktail suitable for plant gums of unknown composition. Distinctive MS profiles of gums such as arabic, cherry and locust-bean gums were successfully identified. A wide range of oligosaccharidic combinations of pentose, hexose, deoxyhexose and hexuronic acid was accurately identified in gum arabic, whereas cherry and locust-bean gums showed Pent(x)Hex(y) and Hex(n) profiles, respectively. Optimized for low sample quantities, the analytical protocol was successfully applied to contemporary and historic samples, including a 'Colour Box Charles Roberson & Co' dating from the 1870s and drawings by the American painter Arthur Dove (1880-1946). This is the first time that a gum has been accurately identified in a cultural heritage sample using structural information. Furthermore, this methodology is applicable to other domains (food, cosmetic, pharmaceutical, biomedical). PMID:28425501
Frosini, Francesco; Miniati, Roberto; Grillone, Saverio; Dori, Fabrizio; Gentili, Guido Biffi; Belardinelli, Andrea
2016-11-14
The following study proposes and tests an integrated methodology combining Health Technology Assessment (HTA) and Failure Modes, Effects and Criticality Analysis (FMECA) for the assessment of specific aspects of robotic surgery involving safety, process and technology. The integrated methodology applies specific HTA techniques together with typical reliability-engineering models such as FMEA/FMECA. The study also included on-site data collection and interviews with medical personnel. The total number of robotic procedures included in the analysis was 44: 28 in urology and 16 in general surgery. The main outcomes refer to the comparative evaluation of robotic, laparoscopic and open surgery. Risk analysis and mitigation interventions derive from the FMECA application. The small sample size available for the study represents an important bias, especially for the reliability of the clinical outcomes. Despite this, the study seems to confirm the more favorable trend of robotic surgical times compared with the open technique, as well as the clinical benefits of robotics in urology. A more complex situation is observed for general surgery, where the only directly measured clinical benefit of robotics is a lower blood transfusion rate.
A method for deriving water-quality benchmarks using field data.
Cormier, Susan M; Suter, Glenn W
2013-02-01
The authors describe a methodology that characterizes effects on individual genera observed in the field and estimates the concentration at which 5% of genera are adversely affected. Ionic strength, measured as specific conductance, is used to illustrate the methodology. Assuming some resilience in the populations, 95% of the genera are afforded protection. The authors selected an unambiguous effect: the presence or absence of a genus at sampling locations. The absence of a genus, extirpation, is operationally defined as the point above which only 5% of the observations of that genus occur. The concentrations that cause extirpation of each genus are rank-ordered from least to greatest, and the benchmark is estimated at the 5th percentile of the distribution using two-point interpolation. When a full range of exposures and many taxa are included in the model of taxonomic sensitivity, the model broadly characterizes how species in general respond to a concentration gradient of the causal agent. This recognized U.S. Environmental Protection Agency methodology has many advantages. Observations from field studies include the full range of conditions, effects, species, and interactions that occur in the environment and can be used to model some causal relationships that laboratory studies cannot. Copyright © 2012 SETAC.
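The benchmark step described above (rank the per-genus extirpation concentrations, then take the 5th percentile with two-point interpolation) can be sketched as below. The exact plotting-position convention of the USEPA method may differ from this simple linear-interpolation choice, so treat it as an illustration.

```python
def hc5(extirpation_concs, p=0.05):
    """Benchmark at the p-th quantile (default 5th percentile) of the
    rank-ordered extirpation concentrations, with two-point linear
    interpolation between the bracketing ranked values."""
    xs = sorted(extirpation_concs)
    n = len(xs)
    pos = p * (n - 1)          # fractional rank of the quantile
    lo = int(pos)
    frac = pos - lo
    if lo + 1 >= n:            # quantile at or beyond the last value
        return xs[-1]
    return xs[lo] + frac * (xs[lo + 1] - xs[lo])
```

With many genera in the distribution, the returned concentration is the level below which roughly 95% of genera are expected to persist, matching the protection goal stated in the abstract.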
Horliana, Anna Carolina Ratto Tempestini; Chambrone, Leandro; Foz, Adriana Moura; Artese, Hilana Paula Carillo; Rabelo, Mariana de Sousa; Pannuti, Cláudio Mendes; Romito, Giuseppe Alexandre
2014-01-01
Background: To date, there is no compilation of evidence-based information associating bacteremia with periodontal procedures. This systematic review aims to assess the magnitude, duration, prevalence and nature of bacteremia caused by periodontal procedures. Study design: Systematic review. Types of studies reviewed: The MEDLINE, EMBASE and LILACS databases were searched in duplicate through August 2013 without language restriction. Observational studies were included if blood samples were collected before, during or after periodontal procedures in patients with periodontitis. Methodological quality was assessed in duplicate using the modified Newcastle-Ottawa scale (NOS). Results: The search strategy identified 509 potentially eligible articles, of which nine were included. Only four studies demonstrated high methodological quality, whereas five were of medium or low methodological quality. The study characteristics were considered too heterogeneous to conduct a meta-analysis. Among 219 analyzed patients, 106 (49.4%) had positive bacteremia. The most frequent bacteria were S. viridans, A. actinomycetemcomitans, P. gingivalis and M. micros, as well as Streptococcus and Actinomyces species, although the identification methods of the microbiologic assays differed among studies. Clinical implications: Although half of the patients presented positive bacteremia after periodontal procedures, accurate results regarding the magnitude, duration and nature of bacteremia could not be confidently assessed. PMID:24870125
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gibson-Grant, Amy
Postwave tracking study for the Energy Efficiency Adult Campaign. This study serves as a measure of key metrics among the campaign's target audience, homeowners age 25+. Key measures include: awareness of messages relating to the broad issue; recognition of the PSAs; relevant attitudes, including interest, ease of taking energy-efficient steps, and likelihood to act; relevant knowledge, including knowledge of light bulb alternatives and energy-efficient options; and relevant behaviors, including specific energy-saving behaviors mentioned within the PSAs. Wave 1: May 27 to June 7, 2011; Wave 2: May 29 to June 8, 2012; Wave 3: May 29 to June 19, 2014. General market sample of adults 25+ who own their homes (W1: n = 704; W2: n = 701; W3: n = 806). Online survey panel methodology: the study was fielded by Lightspeed Research among their survey panel. The sample is US Census representative of US homeowners by race/ethnicity, income, age, region, and family status. At least 30% of respondents were required to have not updated major appliances in their home in the past 5 years (dishwasher, stove, refrigerator, washer, or dryer).
Sánchez-Ribas, Jordi; Oliveira-Ferreira, Joseli; Rosa-Freitas, Maria Goreti; Trilla, Lluís; Silva-do-Nascimento, Teresa Fernandes
2015-09-01
Here we present the first in a series of articles about the ecology of immature stages of anophelines in the Brazilian Yanomami area. We propose a new larval habitat classification and a new larval sampling methodology. We also report some preliminary results illustrating the applicability of the methodology based on data collected in the Brazilian Amazon rainforest in a longitudinal study of two remote Yanomami communities, Parafuri and Toototobi. In these areas, we mapped and classified 112 natural breeding habitats located in low-order river systems based on their association with river flood pulses, seasonality and exposure to sun. Our classification rendered seven types of larval habitats: lakes associated with the river, which are subdivided into oxbow lakes and nonoxbow lakes, flooded areas associated with the river, flooded areas not associated with the river, rainfall pools, small forest streams, medium forest streams and rivers. The methodology for larval sampling was based on the accurate quantification of the effective breeding area, taking into account the area of the perimeter and subtypes of microenvironments present per larval habitat type using a laser range finder and a small portable inflatable boat. The new classification and new sampling methodology proposed herein may be useful in vector control programs.
PMID:26517655
A systematic examination of a random sampling strategy for source apportionment calculations.
Andersson, August
2011-12-15
Estimating the relative contributions from multiple potential sources of a specific component in a mixed environmental matrix is a general challenge in diverse fields such as atmospheric, environmental and earth sciences. Perhaps the most common strategy for tackling such problems is to set up a system of linear equations for the fractional influence of the different sources. Even though an algebraic solution of this approach is possible for the common situation of N+1 sources and N source markers, such a methodology introduces a bias, since it implicitly assumes that the calculated fractions and the corresponding uncertainties are independent of the variability of the source distributions. Here, a random sampling (RS) strategy for accounting for this statistical bias is examined by investigating rationally designed synthetic data sets. The random sampling methodology is found to be robust and accurate with respect to reproducibility and predictability. The method is also compared to a numerical-integration solution for a two-source situation in which source variability is also included. A general observation from this examination is that the variability of the source profiles affects not only the calculated precision but also the mean/median source contributions. Copyright © 2011 Elsevier B.V. All rights reserved.
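The N+1-source / N-marker linear system and the random-sampling loop around it can be sketched as follows. This is a minimal illustration, not the paper's implementation: marker profiles are assumed Gaussian with a given mean and standard deviation, and the `apportion`/`rs_apportion` names are hypothetical.

```python
import random

def apportion(sources, mixture):
    """Solve the fractions f for N+1 sources given N markers plus the
    mass-balance constraint sum(f) = 1. `sources` is a list of N+1
    marker vectors (each length N); `mixture` has length N. Builds the
    (N+1)x(N+1) system and solves it by Gaussian elimination."""
    n1 = len(sources)
    A = [[sources[i][j] for i in range(n1)] for j in range(n1 - 1)]
    A.append([1.0] * n1)               # sum-to-one constraint row
    b = list(mixture) + [1.0]
    for col in range(n1):              # elimination with partial pivoting
        piv = max(range(col, n1), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n1):
            f = A[r][col] / A[col][col]
            for c in range(col, n1):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    f = [0.0] * n1                     # back substitution
    for r in range(n1 - 1, -1, -1):
        f[r] = (b[r] - sum(A[r][c] * f[c] for c in range(r + 1, n1))) / A[r][r]
    return f

def rs_apportion(source_dists, mixture, draws=2000, seed=1):
    """Random-sampling strategy: draw marker profiles from per-source
    (mean, sd) distributions, solve each draw, return mean fractions,
    so source variability propagates into the estimated contributions."""
    rng = random.Random(seed)
    sums = [0.0] * len(source_dists)
    for _ in range(draws):
        sampled = [[rng.gauss(mu, sd) for (mu, sd) in src]
                   for src in source_dists]
        for i, fi in enumerate(apportion(sampled, mixture)):
            sums[i] += fi
    return [s / draws for s in sums]
```

Because each draw perturbs the source profiles before solving, the distribution of solved fractions reflects the bias the abstract discusses: the mean/median contributions can shift away from the single algebraic solution.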
Alteration of histological gastritis after cure of Helicobacter pylori infection.
Hojo, M; Miwa, H; Ohkusa, T; Ohkura, R; Kurosawa, A; Sato, N
2002-11-01
It is still disputed whether gastric atrophy or intestinal metaplasia improves after the cure of Helicobacter pylori infection. The aim was to clarify the histological changes after the cure of H. pylori infection through a literature survey. Fifty-one selected reports from 1066 relevant articles were reviewed. The extracted data were pooled according to the histological parameters of gastritis based on the (updated) Sydney system. Activity improved more rapidly than inflammation. Eleven of 25 reports described significant improvement of atrophy. Atrophy was not improved in one of four studies with a large sample size (> 100 samples) and in two of five studies with a long follow-up period (> 12 months), suggesting that the disagreement between the studies was not entirely due to sample size or follow-up period. Methodological flaws, such as patient selection, and statistical analysis based on the assumption that atrophy improves continuously and generally in all patients might be responsible for the inconsistent results. Four of 28 studies described significant improvement of intestinal metaplasia. Activity and inflammation were improved after the cure of H. pylori infection. Atrophy did not improve generally among all patients, but improved in certain patients. Improvement of intestinal metaplasia was difficult to analyse due to methodological problems, including statistical power.
The validation of the Z-Scan technique for the determination of plasma glucose
NASA Astrophysics Data System (ADS)
Alves, Sarah I.; Silva, Elaine A. O.; Costa, Simone S.; Sonego, Denise R. N.; Hallack, Maira L.; Coppini, Ornela L.; Rowies, Fernanda; Azzalis, Ligia A.; Junqueira, Virginia B. C.; Pereira, Edimar C.; Rocha, Katya C.; Fonseca, Fernando L. A.
2013-11-01
Glucose is the main energy source for the human body. The concentration of blood glucose is regulated by several hormones, including the antagonistic pair insulin and glucagon. Quantification of glucose in the blood is used for diagnosing metabolic disorders of carbohydrates, such as diabetes, idiopathic hypoglycemia and pancreatic diseases. Currently, the methodology used for this determination is an enzymatic colorimetric assay with spectrophotometric detection. This study aimed to validate the use of measurements of the nonlinear optical properties of plasma glucose via the Z-Scan technique. For this we used samples of calibrator standards that simulate commercial samples from patients (ELITech©). Besides the calibrators, serum with glucose levels within acceptable reference values (normal control serum - Brazilian Society of Clinical Pathology and Laboratory Medicine) and with elevated levels (pathological control serum - Brazilian Society of Clinical Pathology and Laboratory Medicine) was used in the proposed methodology. Calibrator dilutions were prepared and measured by the Z-Scan technique to construct the calibration curve. In conclusion, the Z-Scan method can be used to determine glucose levels in biological samples subjected to the enzymatic colorimetric reaction, applying the same quality control parameters used in clinical biochemistry.
O'Maille, Grace; Go, Eden P.; Hoang, Linh; ...
2008-01-01
Comprehensive detection and quantitation of metabolites from a biological source constitute major challenges of current metabolomics research. Two chemical derivatization methodologies, butylation and amination, were applied to human serum for ionization enhancement of a broad spectrum of metabolite classes, including steroids and amino acids. LC-ESI-MS analysis of the derivatized serum samples showed a significant signal elevation across the total ion chromatogram, with up to a 100-fold increase in ionization efficiency. It was also demonstrated that derivatization combined with isotopically labeled reagents facilitated the relative quantitation of derivatized metabolites from individual as well as pooled samples.
77 FR 7109 - Establishment of User Fees for Filovirus Testing of Nonhuman Primate Liver Samples
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-10
... assay (ELISA) or other appropriate methodology. Each specimen will be held for six months. After six... loss of the only commercially available antigen-detection ELISA filovirus testing facility. Currently... current methodology (ELISA) used to test NHP liver samples. This cost determines the amount of the user...
NASA Technical Reports Server (NTRS)
Scott, Elaine P.
1994-01-01
Thermal stress analyses are an important aspect of the development of aerospace vehicles at NASA-LaRC. These analyses require knowledge of the temperature distributions within the vehicle structures, which in turn necessitates accurate thermal property data. The overall goal of this ongoing research effort is to develop methodologies for the estimation of the thermal property data needed to describe the temperature responses of these complex structures. The research strategy undertaken utilizes a building-block approach: first develop property estimation methodologies for relatively simple conditions, such as isotropic materials at constant temperatures, and then systematically modify the technique for the analysis of progressively more complex systems, such as anisotropic multi-component systems. The estimation methodology is a statistically based method that incorporates experimental data and a mathematical model of the system. Several aspects of this overall research effort were investigated during the time of the ASEE summer program. One important aspect involved the calibration of the estimation procedure for the estimation of thermal properties through the thickness of a standard material. Transient experiments were conducted using a Pyrex standard at various temperatures, and the thermal properties (thermal conductivity and volumetric heat capacity) were estimated at each temperature. Confidence regions for the estimated values were also determined. These results were then compared to documented values. Another set of experimental tests was conducted on carbon composite samples at different temperatures. Again, the thermal properties were estimated for each temperature, and the results were compared with values obtained using another technique. In both sets of experiments, a 10-15 percent offset between the estimated values and the previously determined values was found.
Another effort was related to the development of the experimental techniques. Initial experiments required a resistance heater placed between two samples. The design was modified such that the heater was placed on the surface of only one sample, as would be necessary in the analysis of built-up structures. Experiments using the modified technique were conducted on the composite sample used previously at different temperatures. The results were within 5 percent of those found using two samples. Finally, an initial heat transfer analysis, including conduction, convection and radiation components, was completed on a titanium sandwich structural sample. Experiments utilizing this sample are currently being designed and will be used first to estimate the material's effective thermal conductivity and later to determine the properties associated with each individual heat transfer component.
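The statistically based estimation step described above pairs transient temperature data with a conduction model. As an illustration only (not the authors' actual code), the following numpy sketch assumes a simplified configuration: a semi-infinite solid under constant surface heat flux q, for which the surface temperature rise grows as dT(t) = 2*q*sqrt(t/pi)/e, where e = sqrt(k*rho*c) is the thermal effusivity combining the two estimated properties. All parameter values are illustrative.

```python
import numpy as np

def estimate_effusivity(t, dT, q):
    """Least-squares estimate of thermal effusivity e = sqrt(k*rho*c)
    from the surface temperature rise of a semi-infinite solid under
    constant heat flux q, using dT(t) = 2*q*sqrt(t/pi)/e."""
    x = np.sqrt(t)                       # regressor: sqrt(time)
    slope = (x @ dT) / (x @ x)           # least squares through the origin
    return 2.0 * q / (np.sqrt(np.pi) * slope)

# Synthetic check with a Pyrex-like effusivity (illustrative values)
e_true, q = 1450.0, 5000.0               # W s^0.5 m^-2 K^-1, W m^-2
t = np.linspace(0.1, 60.0, 200)          # s
dT = 2.0 * q * np.sqrt(t / np.pi) / e_true
print(round(estimate_effusivity(t, dT, q), 1))  # recovers 1450.0
```

A full implementation would estimate conductivity and volumetric heat capacity separately from interior sensor data and report confidence regions, as the abstract describes.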
Tsunami hazard assessments with consideration of uncertain earthquake characteristics
NASA Astrophysics Data System (ADS)
Sepulveda, I.; Liu, P. L. F.; Grigoriu, M. D.; Pritchard, M. E.
2017-12-01
The uncertainty quantification of tsunami assessments due to uncertain earthquake characteristics faces important challenges. First, the generated earthquake samples must be consistent with the properties observed in past events. Second, an uncertainty propagation method must be adopted to determine tsunami uncertainties at a feasible computational cost. In this study we propose a new methodology, which improves on existing tsunami uncertainty assessment methods. The methodology considers two uncertain earthquake characteristics: the slip distribution and the location. First, the methodology generates consistent earthquake slip samples by means of a Karhunen-Loève (K-L) expansion and a translation process (Grigoriu, 2012), applicable to any non-rectangular rupture area and marginal probability distribution. The K-L expansion was recently applied by LeVeque et al. (2016). We have extended the methodology by analyzing accuracy criteria in terms of the tsunami initial conditions. Furthermore, and unlike this reference, we preserve the original probability properties of the slip distribution by avoiding post-sampling treatments such as earthquake slip scaling. Our approach is analyzed and justified in the framework of the present study. Second, the methodology uses a stochastic reduced-order model (SROM) (Grigoriu, 2009) instead of a classic Monte Carlo simulation, which reduces the computational cost of the uncertainty propagation. The methodology is applied to a real case: tsunamis generated at the site of the 2014 Chilean earthquake, using earthquake samples of expected magnitude Mw 8. We first demonstrate that the stochastic approach of our study generates earthquake samples consistent with the target probability laws. We also show that the results obtained from SROM are more accurate than those from classic Monte Carlo simulation.
We finally validate the methodology by comparing the simulated tsunamis and the tsunami records for the 2014 Chilean earthquake. Results show that leading wave measurements fall within the tsunami sample space. At later times, however, there are mismatches between measured data and the simulated results, suggesting that other sources of uncertainty are as relevant as the uncertainty of the studied earthquake characteristics.
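The Karhunen-Loève construction above amounts to drawing correlated random slip fields from the eigendecomposition of a chosen spatial covariance. The following is a minimal numpy illustration, assuming a 1-D fault discretization and an exponential correlation kernel (both illustrative choices, not the paper's actual setup):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 1-D fault: n patches, exponential covariance kernel
n, corr_len = 50, 10.0
x = np.arange(n, dtype=float)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

# Karhunen-Loeve expansion: slip = mean + sum_k sqrt(lam_k) * z_k * phi_k
lam, phi = np.linalg.eigh(C)
lam = np.clip(lam, 0.0, None)            # guard against tiny negatives
k = 10                                   # truncation order
idx = np.argsort(lam)[::-1][:k]          # keep the k largest modes
mean_slip = np.full(n, 2.0)              # illustrative mean slip (m)
z = rng.standard_normal(k)
slip = mean_slip + phi[:, idx] @ (np.sqrt(lam[idx]) * z)
print(slip.shape)  # (50,)
```

The sample produced here is Gaussian; the translation process cited in the abstract would then map each Gaussian value through the inverse CDF of the desired marginal distribution to obtain non-Gaussian slip while preserving the correlation structure.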
Petraco, Ricardo; Dehbi, Hakim-Moulay; Howard, James P; Shun-Shin, Matthew J; Sen, Sayan; Nijjer, Sukhjinder S; Mayet, Jamil; Davies, Justin E; Francis, Darrel P
2018-01-01
Diagnostic accuracy is widely accepted by researchers and clinicians as an optimal expression of a test's performance. The aim of this study was to evaluate the effects of the disease severity distribution on values of diagnostic accuracy, and to propose a sample-independent methodology to calculate and display the accuracy of diagnostic tests. We evaluated the diagnostic relationship between two hypothetical methods of measuring serum cholesterol (Chol_rapid and Chol_gold) by generating samples with statistical software, (1) keeping the numerical relationship between methods unchanged and (2) changing the distribution of cholesterol values. Metrics of categorical agreement were calculated (accuracy, sensitivity and specificity). Finally, a novel methodology to display and calculate accuracy values was presented (the V-plot of accuracies). No single value of diagnostic accuracy can be used to describe the relationship between tests, as accuracy is a metric heavily affected by the underlying sample distribution. Our proposed methodology, the V-plot of accuracies, can be used as a sample-independent measure of a test's performance against a reference gold standard.
Heller, Melina; Vitali, Luciano; Oliveira, Marcone Augusto Leal; Costa, Ana Carolina O; Micke, Gustavo Amadeu
2011-07-13
The present study aimed to develop a methodology using capillary electrophoresis for the determination of sinapaldehyde, syringaldehyde, coniferaldehyde, and vanillin in whiskey samples. The main objective was to obtain a screening method to differentiate authentic samples from seized samples suspected of being counterfeit, using the phenolic aldehydes as chemical markers. The optimized background electrolyte was composed of 20 mmol L(-1) sodium tetraborate with 10% MeOH at pH 9.3. The study examined two kinds of sample stacking, using a long-end injection mode: normal sample stacking (NSM) and sample stacking with matrix removal (SWMR). In SWMR, the optimized injection time of the samples was 42 s (SWMR42); at this time, no matrix effects were observed. Values of r were >0.99 for both methods. The LOD and LOQ were better than 100 and 330 mg mL(-1) for NSM and better than 22 and 73 mg L(-1) for SWMR. The reliability of the CE-UV aldehyde analysis in real samples was compared statistically with an LC-MS/MS methodology, and no significant differences were found at a 95% confidence interval between the methodologies.
Vicario, Ana; Aragón, Leslie; Wang, Chien C; Bertolino, Franco; Gomez, María R
2018-02-05
In this work, a novel molecularly imprinted polymer (MIP) proposed as a solid-phase extraction sorbent was developed for the determination of propylparaben (PP) in diverse cosmetic samples. The use of parabens (PAs) as microbiological preservatives is authorized by regulatory agencies; however, several recent studies claim that large-scale use of these preservatives can be a potential health risk and harmful to the environment. Diverse factors that influence polymer synthesis were studied, including the template, functional monomer, porogen, and crosslinker used. Morphological characterization of the MIP was performed using SEM and BET analysis. Parameters affecting the molecularly imprinted solid phase extraction (MISPE) and elution efficiency of PP were evaluated. After sample clean-up, the analyte was analyzed by high performance liquid chromatography (HPLC). The whole procedure was validated, showing satisfactory analytical parameters. After applying the MISPE methodology, the extraction recoveries were always better than 86.15%; the precision, expressed as RSD%, was always lower than 2.19 for the corrected peak areas. A good linear relationship was obtained within the range 8-500 ng mL(-1) of PP, r2 = 0.99985. Limits of detection and quantification after the MISPE procedure of 2.4 and 8 ng mL(-1), respectively, were reached, lower than those of previously reported methodologies. The developed MISPE-HPLC methodology provided a simple and economical way to accomplish a clean-up/preconcentration step and the subsequent determination of PP in a complex matrix. The performance of the proposed method was compared against C-18 and silica solid phase extraction (SPE) cartridges. The recovery factors obtained after applying the extraction methods were 96.6, 64.8 and 0.79 for the MISPE, C18-SPE and silica-SPE procedures, respectively.
The proposed methodology improves the retention capability of the SPE material and adds robustness and the possibility of reuse, enabling it to be used for routine PP monitoring in diverse personal care products (PCP) and environmental samples.
A New Method for Generating Probability Tables in the Unresolved Resonance Region
Holcomb, Andrew M.; Leal, Luiz C.; Rahnema, Farzad; ...
2017-04-18
A new method for constructing probability tables in the unresolved resonance region (URR) has been developed. This new methodology is an extensive modification of the single-level Breit-Wigner (SLBW) pseudo-resonance pair sequence method commonly used to generate probability tables in the URR. The new method uses a Monte Carlo process to generate many pseudo-resonance sequences by first sampling the average resonance parameter data in the URR and then converting the sampled resonance parameters to the more robust R-matrix limited (RML) format. Furthermore, for each sampled set of pseudo-resonance sequences, the temperature-dependent cross sections are reconstructed on a small grid around the energy of reference using the Reich-Moore formalism and the Leal-Hwang Doppler broadening methodology. The effective cross sections calculated at the energies of reference are then used to construct probability tables in the URR. The RML cross-section reconstruction algorithm has been rigorously tested for a variety of isotopes, including 16O, 19F, 35Cl, 56Fe, 63Cu, and 65Cu. The new URR method also produced normalized cross-section factor probability tables for 238U that were found to be in agreement with current standards. The modified 238U probability tables were shown to produce results in excellent agreement with several standard benchmarks, including the IEU-MET-FAST-007 (BIG TEN), IEU-MET-FAST-003, and IEU-COMP-FAST-004 benchmarks.
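The ladder-sampling step in methods of this kind conventionally draws level spacings from a Wigner distribution and partial widths from chi-squared (Porter-Thomas) distributions about the average parameters; the RML conversion and Doppler broadening in the report are beyond a short sketch. A generic, illustrative numpy version of the sampling step (parameter values are made up, not taken from the report):

```python
import numpy as np

rng = np.random.default_rng(3)

def sample_resonance_ladder(e_start, n_levels, avg_spacing, avg_width, dof=1):
    """Sample a pseudo-resonance ladder: Wigner-distributed level spacings
    (unit-mean surmise, inverted from its CDF) and Porter-Thomas widths
    (chi-squared with `dof` degrees of freedom), scaled to the averages."""
    u = rng.random(n_levels)
    spacings = avg_spacing * np.sqrt(-4.0 * np.log(1.0 - u) / np.pi)
    energies = e_start + np.cumsum(spacings)
    widths = avg_width * rng.chisquare(dof, n_levels) / dof
    return energies, widths

# Illustrative ladder: 1000 levels above 10 keV, 20 eV mean spacing
energies, widths = sample_resonance_ladder(10e3, 1000, 20.0, 0.05)
print(abs(np.diff(energies).mean() / 20.0 - 1.0) < 0.1)  # True
```

Cross sections reconstructed over many such ladders at a reference energy would then be binned to form the probability table at that energy.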
Bastarrachea, Raúl A; López-Alvarenga, Juan Carlos; Kent, Jack W; Laviada-Molina, Hugo A; Cerda-Flores, Ricardo M; Calderón-Garcidueñas, Ana Laura; Torres-Salazar, Amada; Torres-Salazar, Amanda; Nava-González, Edna J; Solis-Pérez, Elizabeth; Gallegos-Cabrales, Esther C; Cole, Shelley A; Comuzzie, Anthony G
2008-01-01
We describe the methodology used to analyze multiple transcripts using microarray techniques in simultaneous biopsies of muscle, adipose tissue and lymphocytes obtained from the same individual as part of the standard protocol of the Genetics of Metabolic Diseases in Mexico (GEMM) Family Study. We recruited 4 healthy male subjects with BMI 20-41 who signed an informed consent letter. Subjects participated in a clinical examination that included anthropometric and body composition measurements, muscle biopsies (vastus lateralis), subcutaneous fat biopsies and a blood draw. All samples provided sufficient amplified RNA for microarray analysis. Total RNA was extracted from the biopsy samples and amplified for analysis. Of the 48,687 transcript targets queried, 39.4% were detectable in at least one of the studied tissues. Leptin was not detectable in lymphocytes and weakly expressed in muscle, but overexpressed and highly correlated with BMI in subcutaneous fat. Another example was GLUT4, which was detectable only in muscle and not correlated with BMI. Expression level concordance was 0.7 (p < 0.001) for the three tissues studied. We demonstrated the feasibility of carrying out simultaneous analysis of gene expression in multiple tissues, showed concordance of genetic expression across different tissues, and confirmed that this method corroborates the expected biological relationships between LEP and GLUT4. The GEMM study will provide a broad and valuable overview of metabolic diseases, including obesity and type 2 diabetes.
Saltaji, Humam; Armijo-Olivo, Susan; Cummings, Greta G; Amin, Maryam; Flores-Mir, Carlos
2014-01-01
Introduction: It is fundamental that randomised controlled trials (RCTs) are properly conducted in order to reach well-supported conclusions. However, there is emerging evidence that RCTs are subject to biases which can overestimate or underestimate the true treatment effect, due to flaws in the study design characteristics of such trials. The extent to which this holds true in oral health RCTs, which have some unique design characteristics compared to RCTs in other health fields, is unclear. As such, we aim to examine the empirical evidence quantifying the extent of bias associated with methodological and non-methodological characteristics in oral health RCTs. Methods and analysis: We plan to perform a meta-epidemiological study, where a sample size of 60 meta-analyses (MAs) including approximately 600 RCTs will be selected. The MAs will be randomly obtained from the Oral Health Database of Systematic Reviews using a random number table, and will be considered for inclusion if they include a minimum of five RCTs and examine a therapeutic intervention related to one of the recognised dental specialties. RCTs identified in selected MAs will be subsequently included if their study design includes a comparison between an intervention group and a placebo group or another intervention group. Data will be extracted from selected trials included in MAs based on a number of methodological and non-methodological characteristics. Moreover, the risk of bias will be assessed using the Cochrane Risk of Bias tool. Effect size estimates and measures of variability for the main outcome will be extracted from each RCT included in selected MAs, and a two-level analysis will be conducted using a meta-meta-analytic approach with a random effects model to allow for intra-MA and inter-MA heterogeneity. Ethics and dissemination: The intended audiences of the findings will include dental clinicians, oral health researchers, policymakers and graduate students.
These audiences will be introduced to the findings through workshops, seminars, round table discussions and targeted individual meetings. Other opportunities for knowledge transfer will be pursued, such as key dental conferences. Finally, the results will be published as a scientific report in a dental peer-reviewed journal. PMID:24568962
Batchu, Sudha Rani; Ramirez, Cesar E; Gardinali, Piero R
2015-05-01
Because of its widespread consumption and its persistence during wastewater treatment, the artificial sweetener sucralose has gained considerable interest as a proxy to detect wastewater intrusion into usable water resources. The molecular resilience of this compound dictates that coastal and oceanic waters are its final recipient, with unknown effects on ecosystems. Furthermore, no suitable methodologies have been reported for routine, ultra-trace detection of sucralose in seawater, as the sensitivity of traditional liquid chromatography-tandem mass spectrometry (LC-MS/MS) analysis is limited by a low yield of product ions upon collision-induced dissociation (CID). In this work, we report the development and field test of an alternative analysis tool for sucralose in environmental waters, with enough sensitivity for the proper quantitation and confirmation of this analyte in seawater. The methodology is based on automated online solid-phase extraction (SPE) and high-resolving-power Orbitrap MS detection. Operating in full scan mode (no CID), the method uses the compound's unique isotopic pattern (100:96:31 for [M-H](-), [M-H+2](-), and [M-H+4](-), respectively) for ultra-trace quantitation and analyte identification. The method offers fast analysis (14 min per run) and low sample consumption (10 mL per sample), with method detection and confirmation limits (MDLs and MCLs) of 1.4 and 5.7 ng/L in seawater, respectively. The methodology involves low operating costs, due to virtually no sample preparation steps or consumables. As an application example, samples were collected from 17 oceanic and estuarine sites in Broward County, FL, with varying salinity (6-40 PSU). Samples included the ocean outfall of the Southern Regional Wastewater Treatment Plant (WWTP) that serves Hollywood, FL.
Sucralose was detected above MCL in 78% of the samples at concentrations ranging from 8 to 148 ng/L, with the exception of the WWTP ocean outfall (at pipe end, 28 m below the surface) where the measured concentration was 8418 ± 3813 ng/L. These results demonstrate the applicability of this monitoring tool for the trace-level detection of this wastewater marker in very dilute environmental waters.
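The 100:96:31 pattern quoted above follows directly from sucralose's three chlorine atoms and the natural abundance of 37Cl (about 24.24%): the relative intensities of the M, M+2, and M+4 peaks are binomial probabilities of 0, 1, or 2 heavy chlorines. A short check:

```python
from math import comb

p37 = 0.2424                      # natural abundance of 37Cl
p35 = 1.0 - p37
# Probability of j heavy chlorines among 3 Cl atoms -> M, M+2, M+4 peaks
pattern = [comb(3, j) * p35**(3 - j) * p37**j for j in range(3)]
rel = [round(100 * p / pattern[0]) for p in pattern]
print(rel)  # [100, 96, 31]
```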
Pereira, Jorge; Câmara, José S; Colmsjö, Anders; Abdel-Rehim, Mohamed
2014-06-01
Sample preparation is an important analytical step for the isolation and concentration of desired components from complex matrices, and it greatly influences their reliable and accurate analysis and data quality. It is the most labor-intensive and error-prone process in analytical methodology and, therefore, may influence the analytical performance of target analyte quantification. Many conventional sample preparation methods are relatively complicated, involving time-consuming procedures and requiring large volumes of organic solvents. Recent trends in sample preparation include miniaturization, automation, high-throughput performance, on-line coupling with analytical instruments and low-cost operation through extremely low or no solvent consumption. Micro-extraction techniques, such as micro-extraction by packed sorbent (MEPS), have these advantages over the traditional techniques. This paper gives an overview of the MEPS technique, including the role of sample preparation in bioanalysis, a description of MEPS formats (on- and off-line), sorbents, and experimental protocols, the factors that affect MEPS performance, and the major advantages and limitations of MEPS compared with other sample preparation techniques. We also summarize recent applications of MEPS in bioanalysis.
Guidelines for Initiating a Research Agenda: Research Design and Dissemination of Results.
Delost, Maria E; Nadder, Teresa S
2014-01-01
Successful research outcomes require selection and implementation of the appropriate research design. A realistic sampling plan appropriate for the design is essential. Qualitative or quantitative methodology may be utilized, depending on the research question and goals. Quantitative research may be experimental where there is an intervention, or nonexperimental, if no intervention is included in the design. Causation can only be established with experimental research. Popular types of nonexperimental research include descriptive and survey research. Research findings may be disseminated via presentations, posters, and publications, such as abstracts and manuscripts.
Respondent-Driven Sampling: An Assessment of Current Methodology.
Gile, Krista J; Handcock, Mark S
2010-08-01
Respondent-Driven Sampling (RDS) employs a variant of a link-tracing network sampling strategy to collect data from hard-to-reach populations. By tracing the links in the underlying social network, the process exploits the social structure to expand the sample and reduce its dependence on the initial (convenience) sample. The current estimators of population averages make strong assumptions in order to treat the data as a probability sample. We evaluate three critical sensitivities of the estimators: to bias induced by the initial sample, to uncontrollable features of respondent behavior, and to the without-replacement structure of sampling. Our analysis indicates: (1) that the convenience sample of seeds can induce bias, and the number of sample waves typically used in RDS is likely insufficient for the type of nodal mixing required to obtain the reputed asymptotic unbiasedness; (2) that preferential referral behavior by respondents leads to bias; (3) that when a substantial fraction of the target population is sampled, the current estimators can have substantial bias. This paper sounds a cautionary note for the users of RDS. While current RDS methodology is powerful and clever, the favorable statistical properties claimed for the current estimates are shown to be heavily dependent on often unrealistic assumptions. We recommend ways to improve the methodology.
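The estimators under scrutiny here typically weight each respondent by the inverse of their reported network size, on the assumption that inclusion probability is proportional to degree. A minimal sketch in the style of the Volz-Heckathorn estimator, with made-up data:

```python
import numpy as np

def rds_vh_estimate(y, degrees):
    """Volz-Heckathorn-style RDS estimator: inverse-degree weighting to
    correct for the higher inclusion probability of well-connected
    respondents (assumes sampling probability proportional to degree)."""
    w = 1.0 / np.asarray(degrees, dtype=float)
    return float(np.sum(w * np.asarray(y)) / np.sum(w))

# Illustrative sample: binary trait y, self-reported network sizes
y = np.array([1, 0, 1, 1, 0, 1])
degrees = np.array([10, 2, 5, 20, 4, 8])
naive = y.mean()
adjusted = rds_vh_estimate(y, degrees)
print(round(naive, 3), round(adjusted, 3))  # 0.667 0.388
```

The gap between the naive and adjusted estimates illustrates why the estimator's validity hinges on the degree-proportional sampling assumption that the paper shows is often unrealistic.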
Autism and family home movies: a comprehensive review.
Palomo, Rubén; Belinchón, Mercedes; Ozonoff, Sally
2006-04-01
In this article, we focus on the early development of autism studied through family home movies. We review all investigations published in English that met specific methodological standards, including the use of comparison samples, coding blind to group membership, and adequate levels of interrater reliability. After discussing in detail the pros and cons of the home-movie methodology, we review the results of all empirical studies conducted to date. We then present a summary of the features found consistently across studies that differentiate autism from typical development and mental retardation in the first 2 years of life. How family home movies can contribute to our understanding of the regression phenomenon is also addressed. Finally, the results are interpreted from both a theoretical and clinical point of view.
Comparison of historical documents for writership
NASA Astrophysics Data System (ADS)
Ball, Gregory R.; Pu, Danjun; Stritmatter, Roger; Srihari, Sargur N.
2010-01-01
Over the last century forensic document science has developed progressively more sophisticated pattern recognition methodologies for ascertaining the authorship of disputed documents. These include advances not only in computer assisted stylometrics, but forensic handwriting analysis. We present a writer verification method and an evaluation of an actual historical document written by an unknown writer. The questioned document is compared against two known handwriting samples of Herman Melville, a 19th century American author who has been hypothesized to be the writer of this document. The comparison led to a high confidence result that the questioned document was written by the same writer as the known documents. Such methodology can be applied to many such questioned documents in historical writing, both in literary and legal fields.
Holland, Christine M; Ritchie, Natalie D; Du Bois, Steve N
2015-10-01
This brief report describes the methodology and results of a novel, efficient, and low-cost recruitment tool to engage high-risk MSM in online research. We developed an incentivization protocol using iTunes song-gifting to encourage participation of high-risk MSM in an Internet-based survey of HIV status, childhood sexual abuse, and adult behavior and functioning. Our recruitment methodology yielded 489 participants in 4.5 months at a total incentive cost of $1.43 (USD) per participant. The sample comprised a critically high-risk group of MSM, including 71.0% who reported recent condomless anal intercourse. We offer a "how-to" guide to aid future investigators in using iTunes song-gifting incentives.
Park, Bongki; Noh, Hyeonseok; Choi, Dong-Jun
2018-06-01
Xerostomia (dry mouth) causes many clinical problems, including oral infections, speech difficulties, and impaired chewing and swallowing of food. Many cancer patients have complained of xerostomia induced by cancer therapy. The aim of this systematic review is to assess the efficacy of herbal medicine for the treatment of xerostomia in cancer patients. Randomized controlled trials investigating the use of herbal medicines to treat xerostomia in cancer patients were included. We searched 12 databases without restrictions on time or language. The risk of bias was assessed using the Cochrane Risk of Bias Tool. Twenty-five randomized controlled trials involving 1586 patients met the inclusion criteria. A total of 24 formulas were examined in the included trials. Most of the included trials reported their methodology insufficiently. Five formulas were shown to significantly improve the salivary flow rate compared to comparators. Regarding the grade of xerostomia, all formulas, with the exception of a Dark Plum gargle solution with normal saline, were significantly effective in reducing the severity of dry mouth. Adverse events were reported in 4 trials, and adverse effects of herbal medicine were reported in 3 trials. We found that herbal medicines had potential benefits for improving salivary function and reducing the severity of dry mouth in cancer patients. However, methodological limitations and relatively small sample sizes reduced the strength of the evidence. More high-quality trials reporting sufficient methodological detail are needed to strengthen the evidence on the effectiveness of herbal medicines.
Report of AAPM Task Group 162: Software for planar image quality metrology.
Samei, Ehsan; Ikejimba, Lynda C; Harrawood, Brian P; Rong, John; Cunningham, Ian A; Flynn, Michael J
2018-02-01
The AAPM Task Group 162 aimed to provide a standardized approach for the assessment of image quality in planar imaging systems. This report offers a description of the approach as well as the details of the resultant software bundle to measure detective quantum efficiency (DQE), its basis components, and its derivatives. The methodology and the associated software include the characterization of the noise power spectrum (NPS) from planar images acquired under specific acquisition conditions, the modulation transfer function (MTF) using an edge test object, the DQE, and the effective DQE (eDQE). First, a methodological framework is provided to highlight the theoretical basis of the work. Then, a step-by-step guide is included to assist in proper execution of each component of the code. Lastly, an evaluation of the method is included to validate its accuracy against model-based and experimental data. The code was built under the Macintosh OS X operating system. The software package contains all the source code to permit an experienced user to build the suite on a Linux or other *nix type system. The package further includes manuals, sample images, and scripts to demonstrate use of the software for new users. The results of the code are in close alignment with theoretical expectations and published experimental results. The methodology and the software package offered in AAPM TG162 can be used as a baseline for characterization of the inherent image quality attributes of planar imaging systems.
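The NPS component described above is, in essence, an ensemble average of squared Fourier amplitudes of mean-subtracted flat-field regions of interest. The following is a minimal 2-D numpy sketch of that idea only; TG162's actual implementation includes detrending, ROI overlap, and normalization details not shown here:

```python
import numpy as np

def nps_2d(rois, pixel_pitch):
    """2-D noise power spectrum from a stack of flat-field ROIs:
    NPS(u,v) = (dx*dy/(Nx*Ny)) * <|DFT(ROI - mean)|^2> over ROIs."""
    rois = np.asarray(rois, dtype=float)
    n_roi, ny, nx = rois.shape
    rois = rois - rois.mean(axis=(1, 2), keepdims=True)   # remove DC per ROI
    spectra = np.abs(np.fft.fft2(rois)) ** 2
    return pixel_pitch**2 / (nx * ny) * spectra.mean(axis=0)

# White-noise check: for variance sigma^2, NPS should be flat with
# level ~ dx*dy*sigma^2 = 0.1*0.1*100 = 1.0 (mm^2 * signal^2 units)
rng = np.random.default_rng(1)
rois = rng.normal(0.0, 10.0, size=(64, 32, 32))
nps = nps_2d(rois, pixel_pitch=0.1)
print(nps.shape)  # (32, 32)
```

Averaging many ROIs reduces the variance of the spectral estimate; radial averaging of the 2-D NPS would then yield the 1-D curves typically reported.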
Methodological quality of systematic reviews on treatments for depression: a cross-sectional study.
Chung, V C H; Wu, X Y; Feng, Y; Ho, R S T; Wong, S Y S; Threapleton, D
2017-05-02
Depression is one of the most common mental disorders and identifying effective treatment strategies is crucial for the control of depression. Well-conducted systematic reviews (SRs) and meta-analyses can provide the best evidence for supporting treatment decision-making. Nevertheless, the trustworthiness of conclusions can be limited by lack of methodological rigour. This study aims to assess the methodological quality of a representative sample of SRs on depression treatments. A cross-sectional study on the bibliographical and methodological characteristics of SRs published on depression treatments trials was conducted. Two electronic databases (the Cochrane Database of Systematic Reviews and the Database of Abstracts of Reviews of Effects) were searched for potential SRs. SRs with at least one meta-analysis on the effects of depression treatments were considered eligible. The methodological quality of included SRs was assessed using the validated AMSTAR (Assessing the Methodological Quality of Systematic Reviews) tool. The associations between bibliographical characteristics and scoring on AMSTAR items were analysed using logistic regression analysis. A total of 358 SRs were included and appraised. Over half of included SRs (n = 195) focused on non-pharmacological treatments and harms were reported in 45.5% (n = 163) of all studies. Studies varied in methods and reporting practices: only 112 (31.3%) took the risk of bias among primary studies into account when formulating conclusions; 245 (68.4%) did not fully declare conflict of interests; 93 (26.0%) reported an 'a priori' design and 104 (29.1%) provided lists of both included and excluded studies. 
Results from regression analyses showed: more recent publications were more likely to report 'a priori' designs [adjusted odds ratio (AOR) 1.31, 95% confidence interval (CI) 1.09-1.57], to describe study characteristics fully (AOR 1.16, 95% CI 1.06-1.28), and to assess presence of publication bias (AOR 1.13, 95% CI 1.06-1.19), but were less likely to list both included and excluded studies (AOR 0.86, 95% CI 0.81-0.92). SRs published in journals with higher impact factor (AOR 1.14, 95% CI 1.04-1.25), completed by more review authors (AOR 1.12, 95% CI 1.01-1.24) and SRs on non-pharmacological treatments (AOR 1.62, 95% CI 1.01-2.59) were associated with better performance in publication bias assessment. The methodological quality of included SRs is disappointing. Future SRs should strive to improve rigour by considering of risk of bias when formulating conclusions, reporting conflict of interests and authors should explicitly describe harms. SR authors should also use appropriate methods to combine the results, prevent language and publication biases, and ensure timely updates.
NASA Technical Reports Server (NTRS)
Francois, J.
1981-01-01
The effects of aircraft noise on humans living near airports were studied. Two main questions were considered: do residents give evidence of psychological or physiological disturbances in unusually intense noise sectors; and do personality or health factors account for the high interindividual variability of annoyance? The methodology used and results obtained are presented. Samples of the survey questionnaires are included.
Ozone data and mission sampling analysis
NASA Technical Reports Server (NTRS)
Robbins, J. L.
1980-01-01
A methodology was developed to analyze discrete data obtained from the global distribution of ozone. Statistical analysis techniques were applied to describe the distribution of data variance in terms of empirical orthogonal functions and components of spherical harmonic models. The effects of uneven data distribution and missing data were considered. Data fill based on the autocorrelation structure of the data is described. Computer coding of the analysis techniques is included.
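The empirical orthogonal functions mentioned above are the singular vectors of the mean-removed data matrix. A compact sketch, using synthetic data as a stand-in for the gridded ozone fields:

```python
import numpy as np

def eof_analysis(data):
    """EOF decomposition of a (time x space) data matrix. Returns the
    EOF spatial patterns, principal-component time series, and the
    fraction of variance explained by each mode."""
    anomalies = data - data.mean(axis=0)            # remove the time mean
    u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
    variance_frac = s**2 / np.sum(s**2)
    return vt, u * s, variance_frac                 # EOFs, PCs, explained var

# Synthetic field: one dominant spatial mode plus weak noise
rng = np.random.default_rng(2)
t, x = np.linspace(0, 6 * np.pi, 120), np.linspace(0, np.pi, 40)
field = np.outer(np.sin(t), np.sin(x)) + 0.05 * rng.standard_normal((120, 40))
eofs, pcs, var = eof_analysis(field)
print(var[0] > 0.9)  # leading mode dominates -> True
```

With irregular or missing data, as the abstract notes, the covariance (and hence the EOFs) must be estimated after gap-filling, e.g. using the autocorrelation structure of the observations.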
Brunovskis, Anette; Surtees, Rebecca
2010-01-01
Recent discussions of trafficking research have included calls for more innovative studies and new methodologies in order to move beyond the current trafficking narrative, which is often based on unrepresentative samples and overly simplified images. While new methods can potentially play a role in expanding the knowledge base on trafficking, this article argues that the solution is not entirely about applying new methods, but as much about using current methods to greater effect and with careful attention to their limitations and ethical constraints. Drawing on the authors' experience in researching trafficking issues in a number of projects over the past decade, the article outlines and exemplifies some of the methodological and ethical issues to be considered and accommodated when conducting research with trafficked persons, including unrepresentative samples, access to respondents, selection biases by "gatekeepers", and self-selection by potential respondents. Such considerations should inform not only how research is undertaken but also how this information is read and understood. Moreover, many of these considerations apply equally when considering the application of new methods within this field. The article maintains that a better understanding of how these issues come into play and inform trafficking research will translate into tools for conducting improved research in this field and, by implication, new perspectives on human trafficking.
Cryptosporidium as a testbed for single cell genome characterization of unicellular eukaryotes.
Troell, Karin; Hallström, Björn; Divne, Anna-Maria; Alsmark, Cecilia; Arrighi, Romanico; Huss, Mikael; Beser, Jessica; Bertilsson, Stefan
2016-06-23
Infections involving multiple genetically distinct populations of a pathogen occur frequently, but are difficult to detect or describe with current routine methodology. Cryptosporidium sp. is a widespread gastrointestinal protozoan of global significance in both animals and humans. It cannot be easily maintained in culture, and infections of multiple strains have been reported. To explore the potential use of single cell genomics methodology for revealing genome-level variation in clinical samples from Cryptosporidium-infected hosts, we sorted individual oocysts for subsequent genome amplification and full-genome sequencing. Cells were identified with fluorescent antibodies, with an 80% success rate for the entire single cell genomics workflow, demonstrating that the methodology can be applied directly to purified fecal samples. Ten amplified genomes from sorted single cells were selected for genome sequencing and compared both to the original population and to a reference genome in order to evaluate the accuracy and performance of the method. Single cell genome coverage was on average 81% even with a moderate sequencing effort, and by combining the 10 single cell genomes the full genome was accounted for. By comparison to the original sample, biological variation could be distinguished and separated from noise introduced in the amplification. As a proof of principle, we have demonstrated the power of applying single cell genomics to dissect infectious disease caused by closely related parasite species or subtypes. The workflow can easily be expanded and adapted to target other protozoans, and potential applications include mapping genome-encoded traits, virulence, pathogenicity, host specificity and resistance at the level of cells as truly meaningful biological units.
Roberts, Tonya; Nolet, Kimberly; Bowers, Barbara
2015-06-01
Consistent assignment of nursing staff to residents is promoted by a number of national organizations as a strategy for improving nursing home quality and is included in pay for performance schedules in several states. However, research has shown inconsistent effects of consistent assignment on quality outcomes. In order to advance the state of the science of research on consistent assignment and inform current practice and policy, a literature review was conducted to critique conceptual and methodological understandings of consistent assignment. Twenty original research reports of consistent assignment in nursing homes were found through a variety of search strategies. Consistent assignment was conceptualized and operationalized in multiple ways with little overlap from study to study. There was a lack of established methods to measure consistent assignment. Methodological limitations included a lack of control and statistical analyses of group differences in experimental-level studies, small sample sizes, lack of attention to confounds in multicomponent interventions, and outcomes that were not theoretically linked. Future research should focus on developing a conceptual understanding of consistent assignment focused on definition, measurement, and links to outcomes. To inform current policies, testing consistent assignment should include attention to contexts within and levels at which it is most effective. Published by Oxford University Press on behalf of the Gerontological Society of America 2013.
The Fungal Frontier: A Comparative Analysis of Methods Used in the Study of the Human Gut Mycobiome.
Huseyin, Chloe E; Rubio, Raul Cabrera; O'Sullivan, Orla; Cotter, Paul D; Scanlan, Pauline D
2017-01-01
The human gut is host to a diverse range of fungal species, collectively referred to as the gut "mycobiome". The gut mycobiome is emerging as an area of considerable research interest due to the potential roles of these fungi in human health and disease. However, there is no consensus as to which of the available methodologies are best suited to characterizing the human gut mycobiome. The aim of this study is to provide a comparative analysis of several previously published mycobiome-specific culture-dependent and -independent methodologies, including choice of culture media, incubation conditions (aerobic versus anaerobic), DNA extraction method, primer set and freezing of fecal samples, to assess their relative merits and suitability for gut mycobiome analysis. There was no significant effect of media type or aeration on culture-dependent results. However, freezing was found to have a significant effect on fungal viability, with significantly lower fungal numbers recovered from frozen samples. DNA extraction method had a significant effect on DNA yield and quality. However, freezing and extraction method did not have any impact on either α or β diversity. There was also considerable variation in the ability of different fungal-specific primer sets to generate PCR products for subsequent sequence analysis. Through this investigation, two DNA extraction methods and one primer set were identified that facilitated the analysis of the mycobiome for all samples in this study. Ultimately, a diverse range of fungal species were recovered using both approaches, with Candida and Saccharomyces identified as the most common fungal species recovered using culture-dependent and culture-independent methods, respectively. As has been apparent from ecological surveys of the bacterial fraction of the gut microbiota, the use of different methodologies can also impact our understanding of gut mycobiome composition and therefore requires careful consideration.
Future research into the gut mycobiome needs to adopt a common strategy to minimize potentially confounding effects of methodological choice and to facilitate comparative analysis of datasets.
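The α- and β-diversity comparisons mentioned above can be illustrated with a minimal sketch. The fungal count table is invented, and the choice of the Shannon index (α diversity) and Bray-Curtis dissimilarity (β diversity) is an assumption for illustration, not necessarily the study's exact metrics.

```python
import numpy as np

# Invented fungal-taxon count table: two fecal samples, four taxa.
counts = np.array([[120, 30,  0, 5],    # sample A
                   [ 80, 60, 10, 0]])   # sample B

def shannon(x):
    """Shannon alpha diversity of one sample (natural log)."""
    p = x[x > 0] / x.sum()
    return -np.sum(p * np.log(p))

def bray_curtis(a, b):
    """Bray-Curtis dissimilarity between two samples (0 = identical)."""
    return np.abs(a - b).sum() / (a + b).sum()

alpha = [shannon(s) for s in counts]            # within-sample diversity
beta = bray_curtis(counts[0], counts[1])        # between-sample dissimilarity
print(alpha, beta)
```

With metrics like these, a methodological factor (freezing, extraction kit) "has no impact on α or β diversity" when the per-sample indices and between-sample dissimilarities do not differ systematically across treatment arms.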
2013-01-01
Introduction Small-study effects refer to the fact that trials with limited sample sizes are more likely to report larger beneficial effects than large trials. However, this has never been investigated in critical care medicine. Thus, the present study aimed to examine the presence and extent of small-study effects in critical care medicine. Methods Critical care meta-analyses involving randomized controlled trials that reported mortality as an outcome measure were considered eligible for the study. Component trials were classified as large (≥100 patients per arm) or small (<100 patients per arm) according to their sample sizes. The ratio of odds ratios (ROR) was calculated for each meta-analysis, and the RORs were then combined using a meta-analytic approach. An ROR < 1 indicated a larger beneficial effect in small trials. Small and large trials were compared on methodological qualities including sequence generation, blinding, allocation concealment, intention to treat and sample size calculation. Results A total of 27 critical care meta-analyses involving 317 trials were included. Of these, five meta-analyses showed statistically significant RORs < 1, and the other meta-analyses did not reach statistical significance. Overall, the pooled ROR was 0.60 (95% CI: 0.53 to 0.68); heterogeneity was moderate, with an I2 of 50.3% (chi-squared = 52.30; P = 0.002). Large trials showed significantly better reporting quality than small trials in terms of sequence generation, allocation concealment, blinding, intention to treat, sample size calculation and incomplete follow-up data. Conclusions Small trials are more likely to report larger beneficial effects than large trials in critical care medicine, which could be partly explained by the lower methodological quality of small trials. Caution should be exercised in the interpretation of meta-analyses involving small trials. PMID:23302257
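The ratio-of-odds-ratios pooling described in the Methods can be sketched as a fixed-effect inverse-variance combination of log(ROR) values across meta-analyses. The per-meta-analysis values and standard errors below are invented for illustration; the study itself pooled 27 meta-analyses and found ROR = 0.60.

```python
import numpy as np

# Invented log(ROR) = log(OR_small / OR_large) for five meta-analyses,
# with the standard error of each log(ROR).
log_ror = np.log(np.array([0.55, 0.70, 0.62, 0.90, 0.48]))
se = np.array([0.20, 0.25, 0.15, 0.30, 0.22])

w = 1.0 / se**2                                   # inverse-variance weights
pooled = np.sum(w * log_ror) / np.sum(w)          # pooled log(ROR)
pooled_se = np.sqrt(1.0 / np.sum(w))
ci = np.exp([pooled - 1.96 * pooled_se,
             pooled + 1.96 * pooled_se])          # 95% CI on the ROR scale

# Pooled ROR < 1 means small trials report larger beneficial effects.
print(np.exp(pooled), ci)
```

A random-effects version would additionally estimate between-meta-analysis heterogeneity (e.g., via DerSimonian-Laird) before pooling, which matters here given the reported I2 of 50.3%.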
Trippi, Michael H.; Belkin, Harvey E.
2015-09-10
Geographic information system (GIS) information may facilitate energy studies, which in turn provide input for energy policy decisions. The U.S. Geological Survey (USGS) has compiled GIS data representing coal mines, deposits (including those with and without coal mines), occurrences, areas, basins, and provinces of Mongolia as of 2009. These data are now available for download, and may be used in a GIS for a variety of energy resource and environmental studies of Mongolia. Chemical data for 37 coal samples from a previous USGS study of Mongolia (Tewalt and others, 2010) are included in a downloadable GIS point shapefile and shown on the map of Mongolia. A brief report summarizes the methodology used for creation of the shapefiles and the chemical analyses run on the samples.
Mhaskar, Rahul; Djulbegovic, Benjamin; Magazin, Anja; Soares, Heloisa P; Kumar, Ambuj
2012-06-01
To assess whether the reported methodological quality of randomized controlled trials (RCTs) reflects the actual methodological quality and to evaluate the association of effect size (ES) and sample size with methodological quality. Systematic review. This is a retrospective analysis of all consecutive phase III RCTs published by eight National Cancer Institute Cooperative Groups up to 2006. Data were extracted from protocols (actual quality) and publications (reported quality) for each study. Four hundred twenty-nine RCTs met the inclusion criteria. Overall reporting of methodological quality was poor and did not reflect the actual high methodological quality of RCTs. The results showed no association between sample size and actual methodological quality of a trial. Poor reporting of allocation concealment and blinding exaggerated the ES by 6% (ratio of hazard ratio [RHR]: 0.94; 95% confidence interval [CI]: 0.88, 0.99) and 24% (RHR: 1.24; 95% CI: 1.05, 1.43), respectively. However, actual quality assessment showed no association between ES and methodological quality. The largest study to date shows that poor quality of reporting does not reflect the actual high methodological quality. Assessment of the impact of quality on the ES based on reported quality can produce misleading results. Copyright © 2012 Elsevier Inc. All rights reserved.
Olkowska, Ewa; Polkowska, Żaneta; Namieśnik, Jacek
2013-11-15
A new analytical procedure for the simultaneous determination of individual cationic surfactants (alkyl benzyl dimethyl ammonium chlorides) in surface water samples has been developed. We describe this methodology for the first time: it involves the application of solid phase extraction (SPE, for sample preparation) coupled with ion chromatography-conductivity detection (IC-CD, for the final determination). Mean recoveries of analytes between 79% and 93%, and overall method quantification limits in the range from 0.0018 to 0.038 μg/mL for surface water and CRM samples, were achieved. The methodology was applied to the determination of individual alkyl benzyl quaternary ammonium compounds in environmental samples (reservoir water) and enables their presence in such types of waters to be confirmed. In addition, it is simpler, less time-consuming and less labour-intensive than previously described approaches (liquid-liquid extraction coupled with liquid chromatography-mass spectrometry), avoids the use of toxic chloroform, and is significantly less expensive. Copyright © 2013 Elsevier B.V. All rights reserved.
Insights from two industrial hygiene pilot e-cigarette passive vaping studies.
Maloney, John C; Thompson, Michael K; Oldham, Michael J; Stiff, Charles L; Lilly, Patrick D; Patskan, George J; Shafer, Kenneth H; Sarkar, Mohamadi A
2016-01-01
While several reports have been published using research methods to estimate exposure risk to e-cigarette vapors in nonusers, only two have directly measured indoor air concentrations from vaping using validated industrial hygiene sampling methodology. Our first study was designed to measure indoor air concentrations of nicotine, menthol, propylene glycol, glycerol, and total particulates during the use of multiple e-cigarettes in a well-characterized room over a period of time. Our second study was a repeat of the first and also evaluated levels of formaldehyde. Measurements were collected using active sampling, near real-time and direct measurement techniques. Air sampling incorporated industrial hygiene sampling methodology using analytical methods established by the National Institute for Occupational Safety and Health and the Occupational Safety and Health Administration. Active samples were collected over a 12-hr period, for 4 days. Background measurements were taken in the same room the day before and the day after vaping. Panelists (n = 185 Study 1; n = 145 Study 2) used menthol and non-menthol MarkTen prototype e-cigarettes. Vaping sessions (six, 1-hr each) included 3 prototypes, with the total number of puffs ranging from 36 to 216 per session. Results of the active samples were below the limit of quantitation of the analytical methods. Near real-time data were below the lowest concentration on the established calibration curves. Data from this study indicate that the majority of chemical constituents sampled were below quantifiable levels. Formaldehyde was detected at consistent levels during all sampling periods. These two studies found that indoor vaping of the MarkTen prototype e-cigarette does not produce chemical constituents at quantifiable levels above background using standard industrial hygiene collection techniques and analytical methods.
A call to improve sampling methodology and reporting in young novice driver research.
Scott-Parker, B; Senserrick, T
2017-02-01
Young drivers continue to be over-represented in road crash fatalities despite a multitude of research, communication and intervention efforts. Evidence-based improvement depends to a great extent upon research methodology quality and its reporting, with known limitations in the peer-review process. The aim of the current research was to review the scope of research methodologies applied in 'young driver' and 'teen driver' research and their reporting in four peer-review journals in the field between January 2006 and December 2013. In total, 806 articles were identified and assessed. Reporting omissions included participant gender (11% of papers), response rates (49%), retention rates (39%) and information regarding incentives (44%). Greater breadth and specific improvements in study designs and reporting are thereby identified as a means to further advance the field. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Ramírez-Silva, Ivonne; Jiménez-Aguilar, Alejandra; Valenzuela-Bravo, Danae; Martinez-Tapia, Brenda; Rodríguez-Ramírez, Sonia; Gaona-Pineda, Elsa Berenice; Angulo-Estrada, Salomón; Shamah-Levy, Teresa
2016-01-01
To describe the methodology used to clean up and estimate dietary intake (DI) data from the Semi-Quantitative Food Frequency Questionnaire (SFFQ) of the Mexican National Health and Nutrition Survey 2012. DI was collected through a short-term SFFQ covering 140 foods (from October 2011 to May 2012). Energy and nutrient intake was calculated according to a nutrient database constructed specifically for the SFFQ. A total of 133 nutrients, including energy and fiber, were generated from SFFQ data. Between 4.8 and 9.6% of the survey sample was excluded as a result of the cleaning process. Valid DI data were obtained regarding energy and nutrients consumed by 1,212 pre-school children, 1,323 school children, 1,961 adolescents, 2,027 adults and 526 older adults. We documented the methodology used to clean up and estimate DI from the SFFQ used in national dietary assessments in Mexico.
Post-Traumatic Stress Symptoms in Post-ICU Family Members: Review and Methodological Challenges.
Petrinec, Amy B; Daly, Barbara J
2016-01-01
Family members of intensive care unit (ICU) patients are at risk for symptoms of post-traumatic stress disorder (PTSD) following ICU discharge. The aim of this systematic review is to examine the current literature regarding post-ICU family PTSD symptoms with an emphasis on methodological issues in conducting research on this challenging phenomenon. An extensive review of the literature was performed confining the search to English language studies reporting PTSD symptoms in adult family members of adult ICU patients. Ten studies were identified for review published from 2004 to 2012. Findings demonstrate a significant prevalence of family PTSD symptoms in the months following ICU hospitalization. However, there are several methodological challenges to the interpretation of existing studies and to the conduct of future research including differences in sampling, identification of risk factors and covariates of PTSD, and lack of consensus regarding the most appropriate PTSD symptom measurement tools and timing. © The Author(s) 2014.
Dipnall, Joanna F; Pasco, Julie A; Berk, Michael; Williams, Lana J; Dodd, Seetal; Jacka, Felice N; Meyer, Denny
2016-01-01
Background Atheoretical large-scale data mining techniques using machine learning algorithms have promise in the analysis of large epidemiological datasets. This study illustrates the use of a hybrid methodology for variable selection that took account of missing data and complex survey design to identify key biomarkers associated with depression from a large epidemiological study. Methods The study used a three-step methodology amalgamating multiple imputation, a machine learning boosted regression algorithm and logistic regression, to identify key biomarkers associated with depression in the National Health and Nutrition Examination Study (2009–2010). Depression was measured using the Patient Health Questionnaire-9 and 67 biomarkers were analysed. Covariates in this study included gender, age, race, smoking, food security, Poverty Income Ratio, Body Mass Index, physical activity, alcohol use, medical conditions and medications. The final imputed weighted multiple logistic regression model included possible confounders and moderators. Results After the creation of 20 imputation data sets from multiple chained regression sequences, machine learning boosted regression initially identified 21 biomarkers associated with depression. Using traditional logistic regression methods, including controlling for possible confounders and moderators, a final set of three biomarkers were selected. The final three biomarkers from the novel hybrid variable selection methodology were red cell distribution width (OR 1.15; 95% CI 1.01, 1.30), serum glucose (OR 1.01; 95% CI 1.00, 1.01) and total bilirubin (OR 0.12; 95% CI 0.05, 0.28). Significant interactions were found between total bilirubin with Mexican American/Hispanic group (p = 0.016), and current smokers (p<0.001). 
Conclusion The systematic use of a hybrid methodology for variable selection, fusing data mining techniques using a machine learning algorithm with traditional statistical modelling, accounted for missing data and complex survey sampling methodology and was demonstrated to be a useful tool for detecting three biomarkers associated with depression for future hypothesis generation: red cell distribution width, serum glucose and total bilirubin. PMID:26848571
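A minimal sketch of the three-step pipeline described above (imputation by chained regression, boosted-regression screening, then conventional logistic regression), using scikit-learn on synthetic data. The single imputation run, model settings and three-variable cutoff are simplifying assumptions for illustration; the study itself used 20 imputation data sets and accounted for the complex survey design and weights.

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.datasets import make_classification

# Synthetic stand-in for the biomarker data (not NHANES).
X, y = make_classification(n_samples=400, n_features=30, n_informative=5,
                           random_state=0)
rng = np.random.default_rng(0)
X[rng.random(X.shape) < 0.05] = np.nan           # 5% missing at random

# Step 1: impute missing values by chained regression.
X_imp = IterativeImputer(random_state=0).fit_transform(X)

# Step 2: screen candidate variables with a boosted-tree model.
boost = GradientBoostingClassifier(random_state=0).fit(X_imp, y)
top = np.argsort(boost.feature_importances_)[::-1][:3]   # keep top 3 markers

# Step 3: conventional logistic regression on the retained variables.
model = LogisticRegression(max_iter=1000).fit(X_imp[:, top], y)
odds_ratios = np.exp(model.coef_[0])             # per-variable ORs
print(top, odds_ratios)
```

The design choice here mirrors the abstract's rationale: the flexible learner does atheoretical variable selection over many candidates, while the final logistic model keeps the interpretable odds-ratio output (with confounders and moderators added in practice).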
NASA Astrophysics Data System (ADS)
Zhu, Wei; Udalski, A.; Calchi Novati, S.; Chung, S.-J.; Jung, Y. K.; Ryu, Y.-H.; Shin, I.-G.; Gould, A.; Lee, C.-U.; Albrow, M. D.; Yee, J. C.; Han, C.; Hwang, K.-H.; Cha, S.-M.; Kim, D.-J.; Kim, H.-W.; Kim, S.-L.; Kim, Y.-H.; Lee, Y.; Park, B.-G.; Pogge, R. W.; KMTNet Collaboration; Poleski, R.; Mróz, P.; Pietrukowicz, P.; Skowron, J.; Szymański, M. K.; Kozłowski, S.; Ulaczyk, K.; Pawlak, M.; OGLE Collaboration; Beichman, C.; Bryden, G.; Carey, S.; Fausnaugh, M.; Gaudi, B. S.; Henderson, C. B.; Shvartzvald, Y.; Wibking, B.; Spitzer Team
2017-11-01
We analyze an ensemble of microlensing events from the 2015 Spitzer microlensing campaign, all of which were densely monitored by ground-based high-cadence survey teams. The simultaneous observations from Spitzer and the ground yield measurements of the microlensing parallax vector π_E, from which compact constraints on the microlens properties are derived, including ≲25% uncertainties on the lens mass and distance. With the current sample, we demonstrate that the majority of microlenses are indeed in the mass range of M dwarfs. The planet sensitivities of all 41 events in the sample are calculated, from which we provide constraints on the planet distribution function. In particular, assuming a planet distribution function that is uniform in log q, where q is the planet-to-star mass ratio, we find a 95% upper limit on the fraction of stars that host typical microlensing planets of 49%, which is consistent with previous studies. Based on this planet-free sample, we develop the methodology to statistically study the Galactic distribution of planets using microlensing parallax measurements. Under the assumption that the planet distributions are the same in the bulge as in the disk, we predict that ∼1/3 of all planet detections from the microlensing campaigns with Spitzer should be in the bulge. This prediction will be tested with a much larger sample, and deviations from it can be used to constrain the abundance of planets in the bulge relative to the disk.
Ancient DNA studies: new perspectives on old samples
2012-01-01
In spite of past controversies, the field of ancient DNA is now a reliable research area thanks to recent methodological improvements. A series of recent large-scale studies have revealed the true potential of ancient DNA samples for studying the processes of evolution and for testing models and assumptions commonly used to reconstruct patterns of evolution and to analyze population genetics and palaeoecological changes. Recent advances in DNA technologies, such as next-generation sequencing, make it possible to recover DNA information from archaeological and paleontological remains, allowing us to go back in time and study the genetic relationships between extinct organisms and their contemporary relatives. With next-generation sequencing methodologies, DNA sequences can be retrieved even from samples (for example, human remains) for which the technical pitfalls of classical methodologies required stringent criteria to guarantee the reliability of the results. In this paper, we review the methodologies applied to ancient DNA analysis and the perspectives that next-generation sequencing applications provide in this field. PMID:22697611
Liese, Angela D; Crandell, Jamie L; Tooze, Janet A; Kipnis, Victor; Bell, Ronny; Couch, Sarah C; Dabelea, Dana; Crume, Tessa L; Mayer-Davis, Elizabeth J
2015-08-14
The SEARCH Nutrition Ancillary Study aims to investigate the role of dietary intake on the development of long-term complications of type 1 diabetes in youth, and capitalise on measurement error (ME) adjustment methodology. Using the National Cancer Institute (NCI) method for episodically consumed foods, we evaluated the relationship between sugar-sweetened beverage (SSB) intake and cardiovascular risk factor profile, with the application of ME adjustment methodology. The calibration sample included 166 youth with two FFQ and three 24 h dietary recall data within 1 month. The full sample included 2286 youth with type 1 diabetes. SSB intake was significantly associated with higher TAG, total and LDL-cholesterol concentrations, after adjusting for energy, age, diabetes duration, race/ethnicity, sex and education. The estimated effect size was larger (model coefficients increased approximately 3-fold) after the application of the NCI method than without adjustment for ME. Compared with individuals consuming one serving of SSB every 2 weeks, those who consumed one serving of SSB every 2 d had 3.7 mg/dl (0.04 mmol/l) higher TAG concentrations and 4.0 mg/dl (0.10 mmol/l) higher total cholesterol and LDL-cholesterol concentrations, after adjusting for ME and covariates. SSB intake was not associated with measures of adiposity and blood pressure. Our findings suggest that SSB intake is significantly related to increased lipid levels in youth with type 1 diabetes, and that estimates of the effect size of SSB on lipid levels are severely attenuated in the presence of ME. Future studies in youth with diabetes should consider a design that will allow for the adjustment for ME when studying the influence of diet on health status.
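Why measurement error attenuates diet-outcome associations, and the direction of the correction, can be shown with a toy regression-calibration simulation. This is a simplified stand-in for the NCI method (which additionally handles episodic consumption), and all variances and effect sizes below are invented.

```python
import numpy as np

# Simulate a true exposure, an error-prone FFQ measurement of it, and an
# outcome driven by the true exposure (all standardized, invented units).
rng = np.random.default_rng(1)
n = 5000
true_intake = rng.normal(0, 1, n)                  # true SSB intake
observed = true_intake + rng.normal(0, 1, n)       # FFQ report with error
outcome = 3.0 * true_intake + rng.normal(0, 1, n)  # e.g., a lipid level

# Naive regression on the error-prone measurement is attenuated toward 0.
beta_naive = np.polyfit(observed, outcome, 1)[0]

# Regression calibration: divide by the attenuation (reliability) factor
# lambda = var(true) / (var(true) + var(error)) = 1 / (1 + 1) here.
lam = 0.5
beta_corrected = beta_naive / lam

print(beta_naive, beta_corrected)  # naive ≈ 1.5, corrected ≈ 3.0
```

The roughly threefold change in the corrected coefficient here is the same qualitative phenomenon the abstract reports: model coefficients increased about 3-fold after measurement-error adjustment.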
Bellamy, Kim; Ostini, Remo; Martini, Nataly; Kairuz, Therese
2016-06-01
Introduction There are challenges associated with selecting a qualitative research approach. In a field abundant with terminology and theories, it may be difficult for a pharmacist to know where and how to begin a qualitative research journey. The purpose of this paper is to provide insight into generic qualitative research and to describe the journey of data collection of a novice qualitative researcher in the quest to answer her research question: 'What are the barriers to accessing medicines and pharmacy services for resettled refugees in Queensland, Australia?' Methodology Generic qualitative research draws on the strengths of one or more qualitative approaches. The aim is to draw out participants' ideas about things that are 'outside themselves'; rather than focussing on their inner feelings the research seeks to understand a phenomenon, a process, or the perspectives of participants. Sampling is designed to obtain a broad range of opinions about events and experiences and data collection includes interviews, questionnaires or surveys; thematic analysis is often used to analyse data. When to use Generic qualitative research provides an opportunity to develop research designs that fit researchers' epistemological stance and discipline, with research choices, including methodology and methods, being informed by the research question. Limitations Generic qualitative research is one of many methodologies that may be used to answer a research question and there is a paucity of literature about how to do it well. There is also debate about its validity as a qualitative methodology.
Clausen, J L; Georgian, T; Gardner, K H; Douglas, T A
2018-01-01
This study compares conventional grab sampling to incremental sampling methodology (ISM) for characterizing metal contamination at a military small-arms range. Grab sample results had large variances, positively skewed non-normal distributions, extreme outliers, and poor agreement between duplicate samples even when samples were co-located within tens of centimeters of each other. The extreme outliers strongly influenced the grab sample means for the primary contaminants lead (Pb) and antimony (Sb). In contrast, median and mean metal concentrations were similar for the ISM samples. ISM significantly reduced the measurement uncertainty of estimates of the mean, increasing data quality (e.g., for environmental risk assessments) with fewer samples (e.g., decreasing total project costs). Based on Monte Carlo resampling simulations, grab sampling resulted in highly variable means and upper confidence limits of the mean relative to ISM.
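The Monte Carlo resampling comparison can be sketched as follows. The lognormal concentration field and the increment counts are hypothetical, chosen only to reproduce the qualitative effect: positively skewed, outlier-prone grab results versus tighter ISM means.

```python
import numpy as np

# Invented positively skewed site: lognormal "Pb concentrations" in mg/kg.
rng = np.random.default_rng(42)
site = rng.lognormal(mean=3.0, sigma=1.5, size=100_000)

def mean_estimates(n_points, reps):
    """Sampling distribution of the mean of n_points random increments."""
    return np.array([rng.choice(site, n_points).mean() for _ in range(reps)])

# A grab-based estimate averages a few discrete points; an ISM sample
# physically composites many increments before a single analysis.
grab = mean_estimates(5, 1000)    # mean of 5 grab samples per estimate
ism = mean_estimates(50, 1000)    # one ISM sample built from 50 increments

print(grab.std(), ism.std())      # ISM estimates of the mean vary far less
```

Both estimators are unbiased for the site mean; the difference is the spread, which is what drives the highly variable means and upper confidence limits reported for grab sampling.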
1987-08-01
out. To use each animal as its own control, arterial blood was sampled by means of chronically implanted aortic cannulas [12,13,14]. This simple... APPENDIX B: STATISTICAL METHODOLOGY. The balanced design of this experiment (requiring that 25 animals from each... protocol in that, in numerous cases, samples were collected at odd intervals (invalidating the orthogonality of the design) and the number of samples taken
Li, Honghe; Liu, Yang; Wen, Deliang
2017-01-01
Background Over the last three decades, various instruments were developed and employed to assess medical professionalism, but their measurement properties have yet to be fully evaluated. This study aimed to systematically evaluate these instruments’ measurement properties and the methodological quality of their related studies within a universally acceptable standardized framework and then provide corresponding recommendations. Methods A systematic search of the electronic databases PubMed, Web of Science, and PsycINFO was conducted to collect studies published from 1990–2015. After screening titles, abstracts, and full texts for eligibility, the articles included in this study were classified according to their respective instrument’s usage. A two-phase assessment was conducted: 1) methodological quality was assessed by following the COnsensus-based Standards for the selection of health status Measurement INstruments (COSMIN) checklist; and 2) the quality of measurement properties was assessed according to Terwee’s criteria. Results were integrated using best-evidence synthesis to look for recommendable instruments. Results After screening 2,959 records, 74 instruments from 80 existing studies were included. The overall methodological quality of these studies was unsatisfactory, with reasons including but not limited to unknown missing data, inadequate sample sizes, and vague hypotheses. Content validity, cross-cultural validity, and criterion validity were either unreported or negative ratings in most studies. Based on best-evidence synthesis, three instruments were recommended: Hisar’s instrument for nursing students, Nurse Practitioners’ Roles and Competencies Scale, and Perceived Faculty Competency Inventory. Conclusion Although instruments measuring medical professionalism are diverse, only a limited number of studies were methodologically sound. 
Future studies should give priority to systematically improving the performance of existing instruments and to longitudinal studies. PMID:28498838
Sample preparation and EFTEM of Meat Samples for Nanoparticle Analysis in Food
NASA Astrophysics Data System (ADS)
Lari, L.; Dudkiewicz, A.
2014-06-01
Nanoparticles are used in industry for personal care products and in the preparation of food. In the latter application, their functions include preventing microbial growth and increasing the food's nutritional value and sensory quality. EU regulations require a risk assessment of the nanoparticles used in foods and food contact materials before the products can reach the market. However, the lack of validated analytical methodologies for the detection and characterisation of nanoparticles in food hampers appropriate risk assessment. As part of research evaluating methods for screening and quantification of Ag nanoparticles in meat, we tested a new TEM sample preparation approach as an alternative to resin embedding and cryo-sectioning. Energy-filtered TEM analysis was applied to evaluate the thickness and uniformity of thin meat layers prepared at increasing sample input, demonstrating that the protocols used ensured good stability under the electron beam, reliable sample concentration, and reproducibility.
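Thickness evaluation of this kind is commonly done with the EFTEM log-ratio method; the abstract does not state which thickness measure was used, so the following is a minimal sketch under that assumption, computing relative thickness t/λ from the total and zero-loss image intensities:

```python
import math

def relative_thickness(total_intensity, zero_loss_intensity):
    """EFTEM log-ratio estimate of relative specimen thickness:
    t / lambda = ln(I_total / I_zero_loss), where lambda is the
    inelastic mean free path of electrons in the specimen."""
    return math.log(total_intensity / zero_loss_intensity)

# Hypothetical pixel intensities from an unfiltered image and a
# zero-loss-filtered image of the same meat layer.
print(relative_thickness(2000.0, 1000.0))  # ln(2) ≈ 0.693
```

A uniform layer would give a near-constant t/λ map, and increasing the sample input should shift the whole map upward.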
Evaluating the statistical methodology of randomized trials on dentin hypersensitivity management.
Matranga, Domenica; Matera, Federico; Pizzo, Giuseppe
2017-12-27
The present study aimed to evaluate the characteristics and quality of the statistical methodology used in clinical studies on dentin hypersensitivity management. An electronic search was performed for data published from 2009 to 2014 by using the PubMed, Ovid/MEDLINE, and Cochrane Library databases. The primary search terms were used in combination. Eligibility criteria included randomized clinical trials that evaluated the efficacy of desensitizing agents in terms of reducing dentin hypersensitivity. A total of 40 studies were considered eligible for assessment of the quality of their statistical methodology. The four main concerns identified were i) use of nonparametric tests in the presence of large samples, coupled with lack of information about normality and equality of variances of the response; ii) lack of P-value adjustment for multiple comparisons; iii) failure to account for interactions between treatment and follow-up time; and iv) no information about the number of teeth examined per patient and the consequent lack of a cluster-specific approach in data analysis. Owing to these concerns, statistical methodology was judged as inappropriate in 77.1% of the 35 studies that used parametric methods. Additional studies with appropriate statistical analysis are required to obtain a proper assessment of the efficacy of desensitizing agents.
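Concern (ii), missing multiple-comparison adjustment, is straightforward to repair. A minimal Holm-Bonferroni step-down sketch (the p-values below are hypothetical, not taken from any reviewed trial) shows how comparisons that look significant at α = 0.05 can fail after adjustment:

```python
def holm_bonferroni(p_values, alpha=0.05):
    """Holm-Bonferroni step-down adjustment.

    Returns a list of booleans, True where the null hypothesis is
    rejected while controlling the family-wise error rate at `alpha`.
    """
    m = len(p_values)
    # Sort p-values ascending, remembering original positions.
    order = sorted(range(m), key=lambda i: p_values[i])
    reject = [False] * m
    for rank, idx in enumerate(order):
        # Step-down threshold: alpha / (m - rank).
        if p_values[idx] <= alpha / (m - rank):
            reject[idx] = True
        else:
            break  # once one test fails, all larger p-values fail too
    return reject

# Hypothetical p-values from four pairwise treatment comparisons:
# unadjusted, all four pass at 0.05; after Holm, only two survive.
pvals = [0.001, 0.015, 0.030, 0.045]
print(holm_bonferroni(pvals))  # [True, True, False, False]
```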
Sexuality Research in Iran: A Focus on Methodological and Ethical Considerations.
Rahmani, Azam; Merghati-Khoei, Effat; Moghaddam-Banaem, Lida; Zarei, Fatemeh; Montazeri, Ali; Hajizadeh, Ebrahim
2015-07-01
Research on sensitive topics, such as sexuality, can raise technical, methodological, ethical, political, and legal challenges. The aim of this paper was to describe the methodological challenges that the authors confronted during sexuality research with a young population in Iranian culture. This was an exploratory mixed-methods study conducted in 2013-14. We interviewed 63 young women aged 18-34 yr in the qualitative phase and 265 young women in the quantitative phase, in (university and non-university) dormitories and in an Adolescent Friendly Center. Data were collected using focus group discussions and individual interviews in the qualitative phase. We employed conventional content analysis to analyze the data. To enhance the rigor of the data, multiple data collection methods, maximum variation sampling, and peer checks were applied. Five main themes emerged from the data: interaction with the opposite sex, sexual risk, sexual protection, sex education, and sexual vulnerability. Challenges encountered while conducting sex research are discussed. These challenges included the assumption of promiscuity, the language of silence and privacy concerns, and the sex segregation policy. We describe the strategies applied in our study and the rationale for each strategy. The strategies applied in the present study can be employed in contexts with similar methodological and moral concerns.
Sensory re-education after nerve injury of the upper limb: a systematic review.
Oud, Tanja; Beelen, Anita; Eijffinger, Elianne; Nollet, Frans
2007-06-01
To systematically review the available evidence for the effectiveness of sensory re-education to improve the sensibility of the hand in patients with a peripheral nerve injury of the upper limb. Studies were identified by an electronic search in the databases MEDLINE, Cumulative Index to Nursing & Allied Health Literature (CINAHL), EMBASE, the Cochrane Library, the Physiotherapy Evidence Database (PEDro), and the database of the Dutch National Institute of Allied Health Professions (Doconline) and by screening the reference lists of relevant articles. Two reviewers selected studies that met the following inclusion criteria: all designs except case reports, adults with impaired sensibility of the hand due to a peripheral nerve injury of the upper limb, and sensibility and functional sensibility as outcome measures. The methodological quality of the included studies was independently assessed by two reviewers. A best-evidence synthesis was performed, based on design, methodological quality, and significant findings on outcome measures. Seven studies, with sample sizes ranging from 11 to 49, were included in the systematic review and appraised for content. Five of these studies were of poor methodological quality. One uncontrolled study (N = 13) was considered to be of sufficient methodological quality, and one randomized controlled trial (N = 49) was of high methodological quality. Best-evidence synthesis showed that there is limited evidence for the effectiveness of sensory re-education, based on a statistically significant improvement in sensibility found in the one high-quality randomized controlled trial. There is a need for further well-designed clinical trials to assess the effectiveness of sensory re-education in patients with impaired sensibility of the hand due to a peripheral nerve injury.
Disordered Gambling Prevalence: Methodological Innovations in a General Danish Population Survey.
Harrison, Glenn W; Jessen, Lasse J; Lau, Morten I; Ross, Don
2018-03-01
We study Danish adult gambling behavior with an emphasis on discovering patterns relevant to public health forecasting and economic welfare assessment of policy. Methodological innovations include measurement of formative in addition to reflective constructs, estimation of prospective risk for developing gambling disorder rather than risk of being falsely negatively diagnosed, analysis with attention to sample weights and correction for sample selection bias, estimation of the impact of trigger questions on prevalence estimates and sample characteristics, and distinguishing between total and marginal effects of risk-indicating factors. The most significant novelty in our design is that nobody was excluded on the basis of their response to a 'trigger' or 'gateway' question about previous gambling history. Our sample consists of 8405 adult Danes. We administered the Focal Adult Gambling Screen to all subjects and estimate prospective risk for disordered gambling. We find that 87.6% of the population is indicated for no detectable risk, 5.4% is indicated for early risk, 1.7% is indicated for intermediate risk, 2.6% is indicated for advanced risk, and 2.6% is indicated for disordered gambling. Correcting for sample weights and controlling for sample selection has a significant effect on prevalence rates. Although these estimates of the 'at risk' fraction of the population are significantly higher than conventionally reported, we infer a significant decrease in overall prevalence rates of detectable risk with these corrections, since gambling behavior is positively correlated with the decision to participate in gambling surveys. We also find that imposing a threshold gambling history leads to underestimation of the prevalence of gambling problems.
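The effect of the sample-weight correction the study reports can be illustrated with a toy calculation (all respondents and weights below are hypothetical): because gambling behavior is positively correlated with survey participation, over-represented gamblers carry small weights, and weighting lowers the estimated prevalence relative to the raw sample proportion.

```python
# Each record: (risk_indicated, sampling_weight). Frequent gamblers
# over-participate in gambling surveys, so they carry smaller weights.
respondents = [
    (True, 0.5), (True, 0.5), (False, 1.5), (False, 1.5),
    (False, 1.5), (True, 0.5), (False, 1.5), (False, 1.5),
]

def prevalence_unweighted(data):
    """Raw sample proportion of risk-indicated respondents."""
    return sum(flag for flag, _ in data) / len(data)

def prevalence_weighted(data):
    """Design-weighted prevalence estimate."""
    total = sum(w for _, w in data)
    return sum(w for flag, w in data if flag) / total

print(prevalence_unweighted(respondents))  # 0.375
print(prevalence_weighted(respondents))    # 1.5 / 9.0 ≈ 0.167
```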
Harvey, J.B.J.; Hoy, M.S.; Rodriguez, R.J.
2009-01-01
Non-native marine species have been and continue to be introduced into Puget Sound via several vectors, including ships' ballast water. Some non-native species become invasive and negatively impact native species or nearshore habitats. We present a new methodology for the development and testing of taxon-specific PCR primers designed to assess environmental samples of ocean water for the presence of native and non-native bivalves, crustaceans, and algae. The intergenic spacer regions (IGS; ITS1, ITS2 and 5.8S) of the ribosomal DNA were sequenced for adult samples of each taxon studied. We used these data, along with those available in GenBank, to design taxon- and group-specific primers and tested their stringency against artificial populations of plasmid constructs containing the entire IGS region for each of the 25 taxa in our study. Taxon- and group-specific primer sets were then used to detect the presence or absence of native and non-native planktonic life-history stages (propagules) in environmental samples of ballast water and plankton tow net samples collected in Puget Sound. This methodology provides an inexpensive and efficient way to test the discriminatory ability of taxon-specific oligonucleotides (PCR primers) before creating molecular probes or beacons for use in molecular ecological applications such as probe hybridizations or microarray analyses. This work addresses the current need to develop molecular tools capable of diagnosing the presence of planktonic life-history stages from non-native marine species (potential invaders) in ballast water and other environmental samples. © 2008 Elsevier B.V.
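Primer stringency testing ultimately reduces to asking how closely a primer matches non-target sequences. The following toy sketch is not the authors' actual pipeline, and the sequences are made up; it simply counts the best-case number of mismatches of a primer slid along a template's forward strand:

```python
def best_mismatch_count(primer, template):
    """Slide the primer along the template (forward strand only) and
    return the minimum number of mismatches over all alignments.
    0 means a perfect match exists somewhere in the template."""
    best = len(primer)
    for i in range(len(template) - len(primer) + 1):
        window = template[i:i + len(primer)]
        mismatches = sum(a != b for a, b in zip(primer, window))
        best = min(best, mismatches)
    return best

# Hypothetical 4-mer primers against a hypothetical template: a
# taxon-specific primer should match its target perfectly and carry
# several mismatches against every non-target sequence.
print(best_mismatch_count("ACGT", "TTACGTTT"))  # 0 (perfect match)
print(best_mismatch_count("ACGA", "TTACGTTT"))  # 1 (one mismatch)
```

Real primer design would also consider the reverse complement, melting temperature, and 3'-end mismatch position; this sketch only shows the core discrimination idea.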
Amy L. Sheaffer; Jay Beaman; Joseph T. O' Leary; Rebecca L. Williams; Doran M. Mason
2001-01-01
Sampling for research in recreation settings is an ongoing challenge. Often, certain groups of users are more likely to be sampled than others. In measuring public support for resource conservation and in understanding the use of natural resources for recreation, it is important to evaluate issues of bias in survey methodologies. Important methodological issues emerged from a statewide...
Dehbi, Hakim-Moulay; Howard, James P; Shun-Shin, Matthew J; Sen, Sayan; Nijjer, Sukhjinder S; Mayet, Jamil; Davies, Justin E; Francis, Darrel P
2018-01-01
Background Diagnostic accuracy is widely accepted by researchers and clinicians as an optimal expression of a test’s performance. The aim of this study was to evaluate the effects of disease severity distribution on values of diagnostic accuracy as well as propose a sample-independent methodology to calculate and display accuracy of diagnostic tests. Methods and findings We evaluated the diagnostic relationship between two hypothetical methods to measure serum cholesterol (Cholrapid and Cholgold) by generating samples with statistical software and (1) keeping the numerical relationship between methods unchanged and (2) changing the distribution of cholesterol values. Metrics of categorical agreement were calculated (accuracy, sensitivity and specificity). Finally, a novel methodology to display and calculate accuracy values was presented (the V-plot of accuracies). Conclusion No single value of diagnostic accuracy can be used to describe the relationship between tests, as accuracy is a metric heavily affected by the underlying sample distribution. Our novel proposed methodology, the V-plot of accuracies, can be used as a sample-independent measure of a test performance against a reference gold standard. PMID:29387424
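The paper's central point, that accuracy is sample-dependent even when a test's behavior is fixed, follows directly from the identity accuracy = sensitivity × prevalence + specificity × (1 − prevalence). A minimal sketch with hypothetical sensitivity and specificity values (not figures from the study):

```python
def accuracy(sensitivity, specificity, prevalence):
    """Overall accuracy implied by a fixed sensitivity/specificity
    pair at a given prevalence of the 'positive' condition."""
    return sensitivity * prevalence + specificity * (1.0 - prevalence)

# One hypothetical test (e.g., Cholrapid judged against Cholgold):
# same behavior, three different sample distributions, three
# different "accuracies".
sens, spec = 0.90, 0.70
for p in (0.1, 0.5, 0.9):
    print(p, accuracy(sens, spec, p))  # 0.72, 0.80, 0.88
```

This is why no single accuracy value characterizes a test: the reported number reflects the sample's severity distribution as much as the test itself.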
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-23
... collection: Extension of the time frame required to complete approved and ongoing methodological research on... methodological research on the National Crime Victimization Survey. (2) Title of the Form/Collection: National.... This generic clearance will cover methodological research that will use existing or new sampled...
Critical Thinking: Comparing Instructional Methodologies in a Senior-Year Learning Community
ERIC Educational Resources Information Center
Zelizer, Deborah A.
2013-01-01
This quasi-experimental, nonequivalent control group study compared the impact of Ennis's (1989) mixed instructional methodology to the immersion methodology on the development of critical thinking in a multicultural, undergraduate senior-year learning community. A convenience sample of students (n = 171) was selected from four sections of a…
Valderrama, Katherine; Castellanos, Leonardo; Zea, Sven
2010-08-01
The sponge Discodermia dissoluta is the source of the potent antimitotic compound (+)-discodermolide. The relatively abundant and shallow populations of this sponge in Santa Marta, Colombia, allow for studies to evaluate the natural and biotechnological supply options of (+)-discodermolide. In this work, an RP-HPLC-UV methodology for the quantification of (+)-discodermolide in sponge samples was tested and validated. Our protocol for extracting this compound from the sponge included lyophilization, exhaustive methanol extraction, partitioning using water and dichloromethane, purification of the organic fraction on RP-18 cartridges, and finally retrieving the (+)-discodermolide in the methanol-water (80:20 v/v) fraction. This fraction was injected into an HPLC system with an Xterra RP-18 column and a detection wavelength of 235 nm. The calibration curve was linear, making it possible to calculate the limits of detection and quantification for these experiments. The intra-day and inter-day precision showed relative standard deviations lower than 5%. The accuracy, determined as the percentage recovery, was 99.4%. Nine samples of the sponge from the Bahamas, Bonaire, Curaçao, and Santa Marta had concentrations of (+)-discodermolide ranging from 5.3 to 29.3 µg/g of wet sponge. This methodology is quick and simple, allowing for quantification in sponges from natural environments, in situ cultures, or dissociated cells.
Hawkley, Gavin
2014-12-01
Atmospheric dispersion modeling within the near field of a nuclear facility typically applies a building wake correction to the Gaussian plume model, whereby a point source is modeled as a plane source. The plane source results in greater near field dilution and reduces the far field effluent concentration. However, the correction does not account for the concentration profile within the near field. Receptors of interest, such as the maximally exposed individual, may exist within the near field and thus the realm of building wake effects. Furthermore, release parameters and displacement characteristics may be unknown, particularly during upset conditions. Therefore, emphasis is placed upon the need to analyze and estimate an enveloping concentration profile within the near field of a release. This investigation included the analysis of 64 air samples collected over 128 wk. Variables of importance were then derived from the measurement data, and a methodology was developed that allowed for the estimation of Lorentzian-based dispersion coefficients along the lateral axis of the near field recirculation cavity; the development of recirculation cavity boundaries; and conservative evaluation of the associated concentration profile. The results evaluated the effectiveness of the Lorentzian distribution methodology for estimating near field releases and emphasized the need to place air-monitoring stations appropriately for complete concentration characterization. Additionally, the importance of the sampling period and operational conditions were discussed to balance operational feedback and the reporting of public dose.
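A Lorentzian (Cauchy-shaped) lateral concentration profile of the kind estimated in the study can be sketched as follows; the peak value, centerline, and half-width below are hypothetical illustrations, not values derived from the 64 air samples:

```python
def lorentzian_concentration(y, peak, y0, gamma):
    """Lorentzian lateral concentration profile across the near-field
    recirculation cavity: C(y) = peak / (1 + ((y - y0) / gamma)**2),
    where gamma is the half-width at half-maximum (HWHM)."""
    return peak / (1.0 + ((y - y0) / gamma) ** 2)

# Hypothetical fit: peak concentration on the cavity centerline
# (y0 = 0 m) with a 15 m half-width.
print(lorentzian_concentration(0.0, 100.0, 0.0, 15.0))   # 100.0 (peak)
print(lorentzian_concentration(15.0, 100.0, 0.0, 15.0))  # 50.0 (half)
```

The heavy tails of the Lorentzian, relative to a Gaussian, are what make it a conservative envelope for receptors off the plume centerline, and they underline the paper's point about placing air-monitoring stations across the full lateral profile.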
Fuller, Daniel; Gauvin, Lise; Dubé, Anne-Sophie; Winters, Meghan; Teschke, Kay; Russo, Elizabeth T; Camden, Andi; Mee, Carol; Friedman, Steven Marc
2014-10-25
Few international studies have examined the health impacts of public bicycle share programs (PBSPs). We describe the protocol for the International Bikeshare Impacts on Cycling and Collisions Study (IBICCS). A quasi-experimental non-equivalent groups design was used. Intervention cities (Montreal, Toronto, Boston, New York, and Vancouver) were matched to control cities (Chicago, Detroit, and Philadelphia) on total population, population density, cycling rates, and average yearly temperature. The study used three repeated, cross-sectional surveys in intervention and control cities in Fall 2012 (baseline), 2013 (year 1), and 2014 (year 2). A non-probabilistic online panel survey with a sampling frame of individuals residing in and around areas where PBSPs are/would be implemented was used. A total of 12,000 respondents will be sampled. In each of the 8 cities, 1000 respondents will be sampled, with an additional 4000 respondents sampled based on the total population of the city. Survey questions include measures of self-rated health, self-reported height and weight, knowledge and experience using PBSPs, physical activity, bicycle helmet use and history of collisions and injuries while cycling, socio-demographic questions, and home/workplace locations. Respondents could complete questionnaires in English, French, and Spanish. Two weights will be applied to the data: inverse probability of selection and post-stratification on age and sex. A triple difference analysis will be used. This approach includes in the models time, exposure, and treatment group, as well as interaction terms between these variables, to estimate changes across time, between exposure groups, and between cities. There are scientific and practical challenges in evaluating PBSPs. Methodological challenges included: appropriate sample recruitment, exchangeability of treatment and control groups, controlling unmeasured confounding, and specifying exposure. 
Practical challenges arise in the evaluation of environmental interventions such as a PBSP: one of the companies involved filed for bankruptcy, a hurricane devastated New York City, and one PBSP was not implemented. Overall, this protocol provides methodological and practical guidance for researchers wanting to study the impacts of PBSPs on health.
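A triple difference (difference-in-difference-in-differences) estimate can be sketched from cell means alone; all outcome values below are hypothetical, not IBICCS data. Take the pre/post difference for respondents near versus far from a PBSP within each city group, then net the control-city difference out of the intervention-city difference:

```python
# Cell means of an outcome (e.g., minutes cycled per week), keyed by
# (city_group, exposure, time). All numbers are hypothetical.
means = {
    ("intervention", "near_pbsp", "pre"): 30.0,
    ("intervention", "near_pbsp", "post"): 42.0,
    ("intervention", "far", "pre"): 28.0,
    ("intervention", "far", "post"): 31.0,
    ("control", "near_pbsp", "pre"): 29.0,
    ("control", "near_pbsp", "post"): 33.0,
    ("control", "far", "pre"): 27.0,
    ("control", "far", "post"): 30.0,
}

def diff_in_diff(city_group):
    """Pre/post change near a PBSP site minus the pre/post change
    far from it, within one city group."""
    near = (means[(city_group, "near_pbsp", "post")]
            - means[(city_group, "near_pbsp", "pre")])
    far = (means[(city_group, "far", "post")]
           - means[(city_group, "far", "pre")])
    return near - far

# Triple difference: the intervention-city DiD net of the control-city
# DiD, which absorbs common time trends and common exposure effects.
ddd = diff_in_diff("intervention") - diff_in_diff("control")
print(ddd)  # (12 - 3) - (4 - 3) = 8.0
```

In the regression stated in the protocol, this same quantity is the coefficient on the three-way time × exposure × treatment-group interaction.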
ERIC Educational Resources Information Center
Martin, Ian; Lauterbach, Alexandra; Carey, John
2015-01-01
A grounded theory methodology was used to analyze articles and book chapters describing the development and practice of school-based counseling in 25 different countries in order to identify the factors that affect development and practice. An 11-factor analytic framework was developed. Factors include: Cultural Factors, National Needs, Larger…
Measuring Substance Use and Misuse via Survey Research: Unfinished Business.
Johnson, Timothy P
2015-01-01
This article reviews unfinished business regarding the assessment of substance use behaviors by using survey research methodologies, a practice that dates back to the earliest years of this journal's publication. Six classes of unfinished business are considered, including errors of sampling, coverage, non-response, measurement, processing, and ethics. It may be that there is more now that we do not know than when this work began some 50 years ago.
Thermal Inactivation of Bacillus anthracis Spores Using Rapid Resistive Heating
2016-03-24
thermal inactivation research. However, the research conducted to support this thesis utilizes the B.a. Sterne strain which is used in livestock vaccines...methodology conducted for this research including hard surface recovery, thermal inactivation of Bacillus anthracis spores, and the rapid resistive heating...to 500°C range but again, many of the thermal inactivation studies were conducted in the 350 to 2000°C range. Sample plots will be discussed in
NASA Technical Reports Server (NTRS)
Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe
2008-01-01
NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks among all three components. Strategic analysis supports strategic decision-making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high-level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper presents an overview of the strategic analysis methodology and sample results from its application to the Constellation Program lunar architecture.
Manies, Kristen L.; Harden, Jennifer W.; Holingsworth, Teresa N.
2014-01-01
This report describes the collection and processing methodologies for samples obtained at two sites within Interior Alaska: (1) a location within the 2001 Survey Line burn, and (2) an unburned location, selected as a control. In 2002 and 2004 U.S. Geological Survey investigators measured soil properties including, but not limited to, bulk density, volumetric water content, carbon content, and nitrogen content from samples obtained from these sites. Stand properties, such as tree density, the amount of woody debris, and understory vegetation, were also measured and are presented in this report.
Whitmore, Roy W; Chen, Wenlin
2013-12-04
The ability to infer human exposure to substances from drinking water using monitoring data helps determine and/or refine potential risks associated with drinking water consumption. We describe a survey sampling approach and its application to an atrazine groundwater monitoring study to adequately characterize upper exposure centiles and associated confidence intervals with predetermined precision. Study design and data analysis included sampling frame definition, sample stratification, sample size determination, allocation to strata, analysis weights, and weighted population estimates. The sampling frame encompassed 15,840 groundwater community water systems (CWS) in 21 states throughout the U.S. Median and 95th percentile atrazine concentrations were 0.0022 and 0.024 ppb, respectively, for all CWS. Statistical estimates agreed with historical monitoring results, suggesting that the study design was adequate and robust. This methodology makes no assumptions regarding the occurrence distribution (e.g., lognormality); thus, analyses based on the design-induced distribution provide the most robust basis for making inferences from the sample to the target population.
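A design-based percentile of the kind reported above can be computed from the analysis weights without any distributional assumption. The sketch below uses hypothetical concentrations and weights, not the study's data, and takes the smallest value at which the cumulative weight reaches the target fraction:

```python
def weighted_percentile(values, weights, q):
    """Design-weighted percentile (q in [0, 100]): the smallest value
    whose cumulative weight reaches q percent of the total weight.
    No distributional assumption (e.g., lognormality) is made."""
    pairs = sorted(zip(values, weights))
    cutoff = sum(weights) * q / 100.0
    running = 0.0
    for value, weight in pairs:
        running += weight
        if running >= cutoff:
            return value
    return pairs[-1][0]

# Hypothetical atrazine concentrations (ppb) with analysis weights:
# heavily weighted low-concentration systems pull the median down.
conc = [0.001, 0.002, 0.003, 0.010, 0.050]
wts = [4.0, 3.0, 1.0, 1.0, 1.0]
print(weighted_percentile(conc, wts, 50))  # 0.002
print(weighted_percentile(conc, wts, 95))  # 0.050
```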
NASA Technical Reports Server (NTRS)
Sankararaman, Shankar; Goebel, Kai
2013-01-01
This paper investigates the use of the inverse first-order reliability method (inverse-FORM) to quantify the uncertainty in the remaining useful life (RUL) of aerospace components. The prediction of remaining useful life is an integral part of system health prognosis and directly helps in online health monitoring and decision-making. However, the prediction of remaining useful life is affected by several sources of uncertainty, and therefore it is necessary to quantify the uncertainty in the remaining useful life prediction. While system parameter uncertainty and physical variability can be easily included in inverse-FORM, this paper extends the methodology to include: (1) future loading uncertainty; (2) process noise; and (3) uncertainty in the state estimate. The inverse-FORM method has been used in this paper to (1) quickly obtain probability bounds on the remaining useful life prediction; and (2) calculate the entire probability distribution of the remaining useful life prediction, and the results are verified against Monte Carlo sampling. The proposed methodology is illustrated using a numerical example.
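The Monte Carlo benchmark mentioned above can be sketched with a toy degradation model; the linear model and all distributions below are assumptions for illustration, not the paper's numerical example. Each draw samples the uncertain state estimate, model parameter, and future load, propagates it to failure, and probability bounds are read off the empirical RUL distribution (inverse-FORM obtains comparable bounds without brute-force sampling):

```python
import random

random.seed(42)

def remaining_useful_life(state, rate, load):
    """Hypothetical linear damage model: damage grows at rate * load
    per unit time; the component fails when damage reaches 1.0."""
    return (1.0 - state) / (rate * load)

def rul_samples(n=10000):
    """Monte Carlo draws over the three uncertainty sources."""
    samples = []
    for _ in range(n):
        state = random.gauss(0.40, 0.02)   # uncertain state estimate
        rate = random.gauss(0.010, 0.001)  # uncertain model parameter
        load = random.gauss(1.00, 0.10)    # uncertain future loading
        samples.append(remaining_useful_life(state, rate, load))
    return samples

samples = sorted(rul_samples())
lo = samples[len(samples) // 20]        # empirical 5th percentile
hi = samples[len(samples) - len(samples) // 20 - 1]  # 95th percentile
print("90% RUL bounds:", lo, hi)
```

With the nominal values above, the central RUL is (1.0 − 0.4) / (0.01 × 1.0) = 60 time units; the spread of the bounds around it reflects the combined state, parameter, and loading uncertainty.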
Representation of scientific methodology in secondary science textbooks
NASA Astrophysics Data System (ADS)
Binns, Ian C.
The purpose of this investigation was to assess the representation of scientific methodology in secondary science textbooks. More specifically, this study looked at how textbooks introduced scientific methodology and to what degree the examples from the rest of the textbook, the investigations, and the images were consistent with the text's description of scientific methodology, if at all. The sample included eight secondary science textbooks from two publishers, McGraw-Hill/Glencoe and Harcourt/Holt, Rinehart & Winston. Data consisted of all student text and teacher text that referred to scientific methodology. In addition, all investigations in the textbooks were analyzed. Finally, any images that depicted scientists working were also collected and analyzed. The text analysis and activity analysis used the ethnographic content analysis approach developed by Altheide (1996). The rubrics used for the text analysis and activity analysis were initially guided by the Benchmarks (AAAS, 1993), the NSES (NRC, 1996), and the nature of science literature. Preliminary analyses helped to refine each of the rubrics and grounded them in the data. Image analysis used stereotypes identified in the Draw-A-Scientist Test (DAST) literature. Findings indicated that all eight textbooks presented mixed views of scientific methodology in their initial descriptions. Five textbooks placed more emphasis on the traditional view and three placed more emphasis on the broad view. Results also revealed that the initial descriptions, examples, investigations, and images all emphasized the broad view for Glencoe Biology and the traditional view for Chemistry: Matter and Change. The initial descriptions, examples, investigations, and images in the other six textbooks were not consistent. Overall, the textbook with the most appropriate depiction of scientific methodology was Glencoe Biology, and the textbook with the least appropriate depiction was Physics: Principles and Problems. 
These findings suggest that compared to earlier investigations, textbooks have begun to improve in how they represent scientific methodology. However, there is still much room for improvement. Future research needs to consider how textbooks impact teachers' and students' understandings of scientific methodology.
What Is the Methodologic Quality of Human Therapy Studies in ISI Surgical Publications?
Manterola, Carlos; Pineda, Viviana; Vial, Manuel; Losada, Héctor
2006-01-01
Objective: To determine the methodologic quality of therapy articles about humans published in ISI surgical journals, and to explore the association between methodologic quality, origin, and subject matter. Summary Background Data: It is supposed that ISI journals contain the best methodologic articles. Methods: This is a bibliometric study. All journals listed in the 2002 ISI under the subject heading of “Surgery” were included. A simple randomized sampling was conducted for selected journals (Annals of Surgery, The American Surgeon, Archives of Surgery, British Journal of Surgery, European Journal of Surgery, Journal of the American College of Surgeons, Surgery, and World Journal of Surgery). Published articles related to therapy on humans of the selected journals were reviewed and analyzed. All kinds of clinical designs were considered, excluding editorials, review articles, letters to the editor, and experimental studies. The variables considered were: place of origin, design, and the methodologic quality of articles, which was determined by applying a valid and reliable scale. The review was performed interchangeably and independently by 2 research teams. Descriptive and analytical statistics were used. Statistical significance was defined as P values less than 1%. Results: A total of 653 articles were studied. Studies came predominantly from the United States and Europe (43.6% and 36.8%, respectively). The subject areas most frequently found were digestive and hepatobiliopancreatic surgery (29.1% and 24.5%, respectively). Average and median methodologic quality scores of the entire series were 11.6 ± 4.9 points and 11 points, respectively. The association between methodologic quality and journals was determined. Also, the association between methodologic quality and origin was observed, but no association with subject area was verified. 
Conclusions: The methodologic quality of therapy articles published in the journals analyzed is low; however, statistically significant differences between journals were observed. An association was observed between methodologic quality and origin, but not with subject matter. PMID:17060778
Methodology to estimate particulate matter emissions from certified commercial aircraft engines.
Wayson, Roger L; Fleming, Gregg G; Lovinelli, Ralph
2009-01-01
Today, about one-fourth of U.S. commercial service airports, including 41 of the busiest 50, are either in nonattainment or maintenance areas per the National Ambient Air Quality Standards. U.S. aviation activity is forecasted to triple by 2025, while at the same time, the U.S. Environmental Protection Agency (EPA) is evaluating stricter particulate matter (PM) standards on the basis of documented human health and welfare impacts. Stricter federal standards are expected to impede capacity and limit aviation growth if regulatory mandated emission reductions occur as for other non-aviation sources (i.e., automobiles, power plants, etc.). In addition, strong interest exists as to the role aviation emissions play in air quality and climate change issues. These reasons underpin the need to quantify and understand PM emissions from certified commercial aircraft engines, which has led to the need for a methodology to predict these emissions. Standardized sampling techniques to measure volatile and nonvolatile PM emissions from aircraft engines do not exist. As such, a first-order approximation (FOA) was derived to fill this need based on available information. FOA1.0 only allowed prediction of nonvolatile PM. FOA2.0 was a change to include volatile PM emissions on the basis of the ratio of nonvolatile to volatile emissions. Recent collaborative efforts by industry (manufacturers and airlines), research establishments, and regulators have begun to provide further insight into the estimation of the PM emissions. The resultant PM measurement datasets are being analyzed to refine sampling techniques and progress towards standardized PM measurements. These preliminary measurement datasets also support the continued refinement of the FOA methodology. FOA3.0 disaggregated the prediction techniques to allow for independent prediction of nonvolatile and volatile emissions on a more theoretical basis. 
The Committee for Aviation Environmental Protection of the International Civil Aviation Organization endorsed the use of FOA3.0 in February 2007. Further commitment was made to improve the FOA as new data become available, until such time the methodology is rendered obsolete by a fully validated database of PM emission indices for today's certified commercial fleet. This paper discusses related assumptions and derived equations for the FOA3.0 methodology used worldwide to estimate PM emissions from certified commercial aircraft engines within the vicinity of airports.
Versatile fluid-mixing device for cell and tissue microgravity research applications.
Wilfinger, W W; Baker, C S; Kunze, E L; Phillips, A T; Hammerstedt, R H
1996-01-01
Microgravity life-science research requires hardware that can be easily adapted to a variety of experimental designs and working environments. The Biomodule is a patented, computer-controlled fluid-mixing device that can accommodate these diverse requirements. A typical shuttle payload contains eight Biomodules with a total of 64 samples, a sealed containment vessel, and a NASA refrigeration-incubation module. Each Biomodule contains eight gas-permeable Silastic T tubes that are partitioned into three fluid-filled compartments. The fluids can be mixed at any user-specified time. Multiple investigators and complex experimental designs can be easily accommodated with the hardware. During flight, the Biomodules are sealed in a vessel that provides two levels of containment (liquids and gas) and a stable, investigator-controlled experimental environment that includes regulated temperature, internal pressure, humidity, and gas composition. A cell microencapsulation methodology has also been developed to streamline launch-site sample manipulation and accelerate postflight analysis through the use of fluorescence-activated cell sorting. The Biomodule flight hardware and analytical cell encapsulation methodology are ideally suited for temporal, qualitative, or quantitative life-science investigations.
Report: new guidelines for characterization of municipal solid waste: the Portuguese case.
da Graça Madeira Martinho, Maria; Silveira, Ana Isabel; Fernandes Duarte Branco, Elsa Maria
2008-10-01
This report proposes a new set of guidelines for the characterization of municipal solid waste. It is based on an analysis of reference methodologies used internationally and a case study of Valorsul (a company that handles recovery and treatment of solid waste in the North Lisbon Metropolitan Area). In particular, the suggested guidelines present a new definition of the waste to be analysed, change the sampling unit, and establish statistical standards for the results obtained. In these new guidelines, the sampling unit is the waste collection vehicle, and contamination and moisture are taken into consideration. Finally, the focus is on the quality of the resulting data, which is essential for comparability of data between countries. These new guidelines may also be applicable outside Portugal because the methodology includes, besides municipal mixed waste, separately collected fractions of municipal waste. They are a response to the need for information concerning Portugal (e.g. Eurostat or OECD inquiries) and follow European Union municipal solid waste management policies (e.g. packaging waste recovery and recycling targets and the reduction of biodegradable waste going to landfill).
The identification of fungi collected from the ceca of commercial poultry.
Byrd, J A; Caldwell, D Y; Nisbet, D J
2017-07-01
Under normal conditions, fungi are ignored unless clinical signs of a disease or syndrome are reported. The scientific community is largely unaware of the roles fungi play in normal production parameters. Numerous preharvest interventions have demonstrated that beneficial bacteria can play a role in improving production parameters; however, most researchers have ignored the impact that fungi may have on production. The goal of the present study was to record fungi recovered from commercial broiler and layer houses during production. Over 3,000 cecal samples were isolated using conventional culture methodology, and over 890 samples were further characterized using an automated repetitive sequence-based PCR (rep-PCR) methodology. Eighty-eight different fungal and yeast species were identified, including Aspergillus spp., Penicillium spp., and Sporidiobolus spp., and 18 unknown genera were separated using rep-PCR. The results of the present study provide a baseline of the fungal genera present under commercial conditions and will be a stepping stone for investigating the impact of fungi on the gastrointestinal tract and on the health of poultry. Published by Oxford University Press on behalf of Poultry Science Association 2017.
Quality of Reporting Nutritional Randomized Controlled Trials in Patients With Cystic Fibrosis.
Daitch, Vered; Babich, Tanya; Singer, Pierre; Leibovici, Leonard
2016-08-01
Randomized controlled trials (RCTs) have a major role in the making of evidence-based guidelines. The aim of the present study was to critically appraise the RCTs that addressed nutritional interventions in patients with cystic fibrosis (CF). Embase, PubMed, and the Cochrane Library were systematically searched until July 2015. The methodology and reporting of nutritional RCTs were evaluated using the Consolidated Standards of Reporting Trials (CONSORT) checklist and additional dimensions relevant to patients with CF. Fifty-one RCTs were included. Full details on methods were provided in a minority of studies. The mean duration of intervention was <6 months. 56.9% of the RCTs did not define a primary outcome; 70.6% of studies did not provide details on sample size calculation; and only 31.4% reported on subgroups or separated between important subgroups. The examined RCTs were characterized by weak methodology, small numbers of patients with no sample size calculations, and relatively short interventions, and many did not examine outcomes that are important to the patient. Improvement over the years has been minor.
Valenzuela, Aníbal; Lespes, Gaëtane; Quiroz, Waldo; Aguilar, Luis F; Bravo, Manuel A
2014-07-01
A new headspace solid-phase micro-extraction (HS-SPME) method followed by gas chromatography with pulsed flame photometric detection (GC-PFPD) analysis has been developed for the simultaneous determination of 11 organotin compounds, including methyl-, butyl-, phenyl- and octyltin derivates, in human urine. The methodology has been validated by the analysis of urine samples fortified with all analytes at different concentration levels, and recovery rates above 87% and relative precisions between 2% and 7% were obtained. Additionally, an experimental-design approach has been used to model the storage stability of organotin compounds in human urine, demonstrating that organotins are highly degraded in this medium, although their stability is satisfactory during the first 4 days of storage at 4 °C and pH=4. Finally, this methodology was applied to urine samples collected from harbor workers exposed to antifouling paints; methyl- and butyltins were detected, confirming human exposure in this type of work environment. Copyright © 2014 Elsevier B.V. All rights reserved.
Kretschy, Daniela; Koellensperger, Gunda; Hann, Stephan
2012-01-01
This article reviews novel quantification concepts where elemental labelling is combined with flow injection inductively coupled plasma mass spectrometry (FI-ICP-MS) or liquid chromatography inductively coupled plasma mass spectrometry (LC-ICP-MS) and employed for quantification of biomolecules such as proteins, peptides and related molecules in challenging sample matrices. The first sections present an overview of general aspects of biomolecule quantification, as well as of labelling, emphasizing the potential of such methodological approaches. In this context, ICP-MS as detector provides high sensitivity, selectivity and robustness in biological samples and offers the capability for multiplexing and isotope dilution mass spectrometry (IDMS). Fundamental methodology of elemental labelling will be highlighted, and analytical as well as biomedical applications will be presented. A special focus will lie on established applications, underlining the benefits and bottlenecks of such approaches for implementation in real-life analysis. Key research in this field will be summarized, and a perspective for future developments, including sophisticated and innovative applications, will be given. PMID:23062431
[Methodological Aspects of the Sampling Design for the 2015 National Mental Health Survey].
Rodríguez, Nelcy; Rodríguez, Viviana Alejandra; Ramírez, Eugenia; Cediel, Sandra; Gil, Fabián; Rondón, Martín Alonso
2016-12-01
The WHO has encouraged the development, implementation and evaluation of policies related to mental health all over the world. In Colombia, within this framework, promoted by the Ministry of Health and Social Protection and supported by Colciencias, the fourth National Mental Health Survey (NMHST) was conducted as an observational cross-sectional study. Following the corresponding guidelines and sampling design, a summary of the methodology used for this sampling process is presented. The fourth NMHST used the Homes Master Sample for Studies in Health from the National System of Studies and Population Surveys for Health to calculate its sample. This Master Sample was developed and implemented in 2013 by the Ministry of Social Protection. The study included the non-institutionalised civilian population divided into four age groups: children 7-11 years, adolescents 12-17 years, adults 18-44 years, and adults 45 years or older. The sample size calculation was based on the prevalences reported in other studies for the outcomes of mental disorders, depression, suicide, associated morbidity, and alcohol use. A probabilistic, cluster, stratified and multistage selection process was used, and expansion factors to the total population were calculated. A total of 15,351 completed surveys were collected, distributed according to the age groups as follows: 2,727 for 7-11 years; 1,754 for 12-17 years; 5,889 for 18-44 years; and 4,981 for ≥45 years. The surveys were distributed across five regions: Atlantic, Oriental, Bogotá, Central and Pacific. A sufficient number of surveys was collected in this study to obtain a more precise approximation of mental problems and disorders at the regional and national level. Copyright © 2016 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.
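As a rough illustration of how expansion factors feed into population-level estimates in a survey like this, a weighted prevalence can be computed as the expansion-weighted mean of a binary outcome. This is a minimal sketch under standard survey-sampling conventions, not the NMHST estimation code; the function name and toy numbers are invented for illustration:

```python
def weighted_prevalence(outcomes, expansion_factors):
    """Expansion-weighted prevalence of a binary outcome.

    outcomes: 1 if the respondent reports the condition, else 0.
    expansion_factors: inverse selection probabilities, i.e. how many
    people in the target population each respondent represents.
    """
    total_weight = sum(expansion_factors)
    return sum(w * y for w, y in zip(expansion_factors, outcomes)) / total_weight

# Toy example: four respondents with unequal expansion factors.
# The two positive respondents represent 400 of the 600 people covered.
prevalence = weighted_prevalence([1, 0, 1, 0], [100.0, 100.0, 300.0, 100.0])
```

In a real multistage design the expansion factors would additionally carry nonresponse and post-stratification adjustments.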
Aviation Particle Emissions Workshop
NASA Technical Reports Server (NTRS)
Wey, Chowen C. (Editor)
2004-01-01
The Aviation Particle Emissions Workshop was held on November 18-19, 2003, in Cleveland, Ohio. It was sponsored by the National Aeronautics and Space Administration (NASA) under the Vehicle Systems Program (VSP) and the Ultra-Efficient Engine Technology (UEET) Project. The objectives were to build a sound foundation for a comprehensive particulate research roadmap and to provide a forum for discussion among U.S. stakeholders and researchers. Presentations included perspectives from the Federal Aviation Administration, the U.S. Environmental Protection Agency, NASA, and United States airports. There were five interactive technical sessions: sampling methodology; measurement methodology; particle modeling; database, inventory, and test venue; and air quality. Each group presented technical issues that generated excellent discussion. The five session leads collaborated with their members to present summaries and conclusions for each content area.
Chen, Bi-Cang; Wu, Qiu-Ying; Xiang, Cheng-Bin; Zhou, Yi; Guo, Ling-Xiang; Zhao, Neng-Jiang; Yang, Shu-Yu
2006-01-01
To evaluate the quality of reports published in the last 10 years in China on the quantitative analysis of syndrome differentiation for diabetes mellitus (DM), in order to explore the methodological problems in these reports and find possible solutions. The main medical literature databases in China were searched. Thirty-one articles were included and evaluated by the principles of clinical epidemiology. There were many mistakes and deficiencies in these articles, concerning clinical trial design, diagnostic criteria for DM, standards of syndrome differentiation of DM, case inclusion and exclusion criteria, sample size estimation, data comparability and statistical methods. It is necessary and important to improve the quality of reports on the quantitative analysis of syndrome differentiation of DM in light of the principles of clinical epidemiology.
International gender bias in nursing research, 2005-2006: a quantitative content analysis.
Polit, Denise F; Beck, Cheryl Tatano
2009-08-01
This paper reports a study that examined the extent to which nurse researchers internationally disproportionately include females as participants in their research. A bias toward predominantly male samples has been well-documented in medical research, but recently a gender bias favoring women in nursing research has been identified in studies published in four North American journals. We extracted information about study samples and characteristics of the studies and authors from a consecutive sample of 834 studies published in eight leading English-language nursing research journals in 2005-2006. The primary analyses involved one-sample t-tests that tested the null hypothesis that males and females are equally represented as participants in nursing studies. Studies from different countries, in different specialty areas, and with varying author and methodologic characteristics were compared with regard to the key outcome variable, percent of participants who were female. Overall, 71% of participants, on average, were female, including 68% in client-focused research and 83% in nurse-focused studies (all p<.001). Females were significantly overrepresented as participants in client-focused research in almost all specialty areas, particularly in mental health, community health, health promotion, and geriatrics. The bias favoring female participants in client-focused studies was especially strong in the United States and Canada, but was also present in European countries, most Asian countries, and in Australia. Female overrepresentation was persistent, regardless of methodological characteristics (e.g., qualitative versus quantitative), funding source, and most researcher characteristics (e.g., academic rank). Studies with male authors, however, had more sex-balanced samples. The mean percentage female in client-focused studies with a female lead author was 70.0, compared to 52.1 for male lead authors. 
Nurse researchers not only in North America but around the globe need to pay attention to who will benefit from their research and to whether they are adequately inclusive in studying client groups about which there are knowledge gaps.
Microbiology: lessons from a first attempt at Lake Ellsworth.
Pearce, D A; Magiopoulos, I; Mowlem, M; Tranter, M; Holt, G; Woodward, J; Siegert, M J
2016-01-28
During the attempt to directly access, measure and sample Subglacial Lake Ellsworth in 2012-2013, we conducted microbiological analyses of the drilling equipment, scientific instrumentation, field camp and natural surroundings. From these studies, a number of lessons can be learned about the cleanliness of deep Antarctic subglacial lake access leading to, in particular, knowledge of the limitations of some of the most basic relevant microbiological principles. Here, we focus on five of the core challenges faced and describe how cleanliness and sterilization were implemented in the field. In the light of our field experiences, we consider how effective these actions were, and what can be learnt for future subglacial exploration missions. The five areas covered are: (i) field camp environment and activities, (ii) the engineering processes surrounding the hot water drilling, (iii) sample handling, including recovery, stability and preservation, (iv) clean access methodologies and removal of sample material, and (v) the biodiversity and distribution of bacteria around the Antarctic. Comparisons are made between the microbiology of the Lake Ellsworth field site and other Antarctic systems, including the lakes on Signy Island, and on the Antarctic Peninsula at Lake Hodgson. Ongoing research to better define and characterize the behaviour of natural and introduced microbial populations in response to deep-ice drilling is also discussed. 
We recommend that future access programmes: (i) assess each specific local environment in enhanced detail due to the potential for local contamination, (ii) consider the sterility of the access in more detail, specifically focusing on single cell colonization and the introduction of new species through contamination of pre-existing microbial communities, (iii) consider experimental bias in methodological approaches, (iv) undertake in situ biodiversity detection to mitigate risk of non-sample return and post-sample contamination, and (v) address the critical question of how important these microbes are in the functioning of Antarctic ecosystems. © 2015 The Author(s).
Eye-gaze determination of user intent at the computer interface
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldberg, J.H.; Schryver, J.C.
1993-12-31
Determination of user intent at the computer interface through eye-gaze monitoring can significantly aid applications for the disabled, as well as telerobotics and process control interfaces. Whereas current eye-gaze control applications are limited to object selection and x/y gazepoint tracking, a methodology was developed here to discriminate a more abstract interface operation: zooming in or out. This methodology first collects samples of eye-gaze location looking at controlled stimuli, at 30 Hz, just prior to a user's decision to zoom. The sample is broken into data frames, or temporal snapshots. Within a data frame, all spatial samples are connected into a minimum spanning tree, then clustered, according to user-defined parameters. Each cluster is mapped to one in the prior data frame, and statistics are computed from each cluster. These characteristics include cluster size, position, and pupil size. A multiple discriminant analysis uses these statistics both within and between data frames to formulate optimal rules for assigning the observations into zoom-in, zoom-out, or no-zoom conditions. The statistical procedure effectively generates heuristics for future assignments based upon these variables. Future work will enhance the accuracy and precision of the modeling technique and will empirically test users in controlled experiments.
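The frame-clustering step described in the abstract (connect the gaze samples of a frame into a minimum spanning tree, then cut long edges to form clusters and compute per-cluster statistics) can be sketched as follows. This is an illustrative reconstruction, not the authors' code; the cut-distance parameter and all function names are assumptions:

```python
import math

def mst_edges(points):
    # Prim's algorithm: grow a minimum spanning tree over 2D gaze samples.
    n = len(points)
    in_tree = {0}
    edges = []
    while len(in_tree) < n:
        best = None
        for i in in_tree:
            for j in range(n):
                if j not in in_tree:
                    d = math.dist(points[i], points[j])
                    if best is None or d < best[0]:
                        best = (d, i, j)
        edges.append(best)
        in_tree.add(best[2])
    return edges  # list of (distance, i, j)

def cluster_frame(points, cut_dist):
    # Remove MST edges longer than cut_dist; the remaining connected
    # components are the gaze clusters of this data frame.
    parent = list(range(len(points)))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for d, i, j in mst_edges(points):
        if d <= cut_dist:
            parent[find(i)] = find(j)
    groups = {}
    for idx in range(len(points)):
        groups.setdefault(find(idx), []).append(idx)
    return list(groups.values())

def cluster_stats(points, cluster):
    # Per-cluster statistics of the kind fed to the discriminant analysis.
    xs = [points[i][0] for i in cluster]
    ys = [points[i][1] for i in cluster]
    return {"size": len(cluster),
            "centroid": (sum(xs) / len(xs), sum(ys) / len(ys))}
```

In the paper's pipeline, such statistics (plus pupil size) would then be compared across consecutive frames before the discriminant analysis assigns a zoom condition.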
Texas Adolescent Tobacco and Marketing Surveillance System’s Design
Pérez, Adriana; Harrell, Melissa B.; Malkani, Raja I.; Jackson, Christian D.; Delk, Joanne; Allotey, Prince A.; Matthews, Krystin J.; Martinez, Pablo; Perry, Cheryl L.
2017-01-01
Objectives: To provide a full methodological description of the design of the wave I and wave II (6-month follow-up) surveys of the Texas Adolescent Tobacco and Marketing Surveillance System (TATAMS), a longitudinal surveillance study of 6th, 8th, and 10th grade students who attended schools in Bexar, Dallas, Tarrant, Harris, or Travis counties, where the 5 largest cities in Texas (San Antonio, Dallas, Fort Worth, Houston, and Austin, respectively) are located. Methods: TATAMS used a complex probability design, yielding representative estimates of these students in these counties during the 2014–2015 academic year. Weighted prevalences of the use of tobacco products, drugs, and alcohol in wave I, and the percent (i) bias, (ii) relative bias, and (iii) relative bias ratio between waves I and II, are estimated. Results: The wave I sample included 79 schools and 3,907 students. The prevalence of current cigarette, e-cigarette, and hookah use at wave I was 3.5%, 7.4%, and 2.5%, respectively. Small biases, mostly less than 3.5%, were observed for nonrespondents in wave II. Conclusions: Even with adaptations to the sampling methodology, the resulting sample adequately represents the target population. Results from TATAMS will have important implications for future tobacco policy in Texas and federal regulation. PMID:29098172
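The nonresponse-bias quantities mentioned in the abstract can be illustrated with common survey definitions, which are assumptions here since the abstract does not spell them out: the bias of a follow-up estimate is the difference between the estimate computed on wave-II respondents only and the full wave-I estimate, and the relative bias expresses that difference as a percentage of the wave-I estimate. A minimal sketch with invented numbers:

```python
def bias(estimate_respondents, estimate_full):
    # Difference between the estimate restricted to wave-II respondents
    # and the full wave-I estimate (percentage points).
    return estimate_respondents - estimate_full

def relative_bias_percent(estimate_respondents, estimate_full):
    # Bias expressed as a percentage of the full-sample estimate.
    return 100.0 * bias(estimate_respondents, estimate_full) / estimate_full

# Toy example: 7.4% e-cigarette prevalence in the full wave-I sample
# versus a hypothetical 7.2% among those retained at wave II.
rb = relative_bias_percent(7.2, 7.4)
```

A small |relative bias|, as reported for most TATAMS outcomes, indicates that wave-II attrition did not meaningfully distort the estimates.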
Lorenzetti, Valentina; Solowij, Nadia; Fornito, Alex; Lubman, Dan Ian; Yucel, Murat
2014-01-01
Cannabis is the most widely used illicit drug worldwide, though it is unclear whether its regular use is associated with persistent alterations in brain morphology. This review examines evidence from human structural neuroimaging investigations of regular cannabis users and focuses on achieving three main objectives. These include examining whether the literature to date provides evidence that alteration of brain morphology in regular cannabis users: i) is apparent, compared to non-cannabis using controls; ii) is associated with patterns of cannabis use; and with iii) measures of psychopathology and neurocognitive performance. The published findings indicate that regular cannabis use is associated with alterations in medial temporal, frontal and cerebellar brain regions. Greater brain morphological alterations were evident among samples that used at higher doses for longer periods. However, the evidence for an association between brain morphology and cannabis use parameters was mixed. Further, there is poor evidence for an association between measures of brain morphology and of psychopathology symptoms/neurocognitive performance. Overall, numerous methodological issues characterize the literature to date. These include investigation of small sample sizes, heterogeneity across studies in sample characteristics (e.g., sex, comorbidity) and in employed imaging techniques, as well as the examination of only a limited number of brain regions. These factors make it difficult to draw firm conclusions from the existing findings. Nevertheless, this review supports the notion that regular cannabis use is associated with alterations of brain morphology, and highlights the need to consider particular methodological issues when planning future cannabis research.
ERIC Educational Resources Information Center
Vázquez-Alonso, Ángel; Manassero-Mas, María-Antonia; García-Carmona, Antonio; Montesano de Talavera, Marisa
2016-01-01
This study applies a new quantitative methodological approach to diagnose epistemology conceptions in a large sample. The analyses use seven multiple-rating items on the epistemology of science drawn from the item pool Views on Science-Technology-Society (VOSTS). The bases of the new methodological diagnostic approach are the empirical…
ERIC Educational Resources Information Center
Ndirangu, Caroline
2017-01-01
This study aims to evaluate teachers' attitude towards implementation of learner-centered methodology in science education in Kenya. The study used a survey design methodology, adopting the purposive, stratified random and simple random sampling procedures and hypothesised that there was no significant relationship between the head teachers'…
Armijo-Olivo, Susan; Cummings, Greta G.; Amin, Maryam; Flores-Mir, Carlos
2017-01-01
Objectives To examine the risks of bias, risks of random errors, reporting quality, and methodological quality of randomized clinical trials of oral health interventions and the development of these aspects over time. Methods We included 540 randomized clinical trials from 64 selected systematic reviews. We extracted, in duplicate, details from each of the selected randomized clinical trials with respect to publication and trial characteristics, reporting and methodologic characteristics, and Cochrane risk of bias domains. We analyzed data using logistic regression and Chi-square statistics. Results Sequence generation was assessed to be inadequate (at unclear or high risk of bias) in 68% (n = 367) of the trials, while allocation concealment was inadequate in the majority of trials (n = 464; 85.9%). Blinding of participants and blinding of the outcome assessment were judged to be inadequate in 28.5% (n = 154) and 40.5% (n = 219) of the trials, respectively. A sample size calculation before the initiation of the study was not performed/reported in 79.1% (n = 427) of the trials, while the sample size was assessed as adequate in only 17.6% (n = 95) of the trials. Two thirds of the trials were not described as double blinded (n = 358; 66.3%), while the method of blinding was appropriate in 53% (n = 286) of the trials. We identified a significant decrease over time (1955–2013) in the proportion of trials assessed as having inadequately addressed methodological quality items (P < 0.05) in 30 out of the 40 quality criteria, or as being inadequate (at high or unclear risk of bias) in five domains of the Cochrane risk of bias tool: sequence generation, allocation concealment, incomplete outcome data, other sources of bias, and overall risk of bias. 
Conclusions The risks of bias, risks of random errors, reporting quality, and methodological quality of randomized clinical trials of oral health interventions have improved over time; however, further efforts that contribute to the development of more stringent methodology and detailed reporting of trials are still needed. PMID:29272315
BACKWARD ESTIMATION OF STOCHASTIC PROCESSES WITH FAILURE EVENTS AS TIME ORIGINS
Gary Chan, Kwun Chuen; Wang, Mei-Cheng
2011-01-01
Stochastic processes often exhibit sudden systematic changes in pattern a short time before certain failure events. Examples include increase in medical costs before death and decrease in CD4 counts before AIDS diagnosis. To study such terminal behavior of stochastic processes, a natural and direct way is to align the processes using failure events as time origins. This paper studies backward stochastic processes counting time backward from failure events, and proposes one-sample nonparametric estimation of the mean of backward processes when follow-up is subject to left truncation and right censoring. We will discuss benefits of including prevalent cohort data to enlarge the identifiable region and large sample properties of the proposed estimator with related extensions. A SEER–Medicare linked data set is used to illustrate the proposed methodologies. PMID:21359167
Versatile electrophoresis-based self-test platform.
Guijt, Rosanne M
2015-03-01
Lab on a Chip technology offers the possibility to extract chemical information from a complex sample in a simple, automated way without the need for a laboratory setting. In the health care sector, this chemical information could be used as a diagnostic tool, for example to inform dosing. In this issue, the research underpinning a family of electrophoresis-based point-of-care devices for self-testing of ionic analytes in various sample matrices is described [Electrophoresis 2015, 36, 712-721]. Hardware, software, and methodological changes made to improve the overall analytical performance in terms of accuracy, precision, detection limit, and reliability are discussed. In addition to the main focus on lithium monitoring, new applications are included, such as the use of the platform for veterinary purposes and for sodium and creatinine measurements. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Alvarez-Nemegyei, José; Buenfil-Rello, Fátima Annai; Pacheco-Pantoja, Elda Leonor
2016-01-01
Reports regarding the association between body composition and inflammatory activity in rheumatoid arthritis (RA) have consistently yielded contradictory results. To perform a systematic review on the association between overweight/obesity and inflammatory activity in RA. FAST approach: article search (Medline, EBSCO, Cochrane Library), followed by abstract retrieval, full-text review and blinded assessment of methodological quality for final inclusion. Because of marked heterogeneity in statistical approach and RA activity assessment methods, a meta-analysis could not be done; results are presented as a qualitative synthesis. One hundred and nineteen reports were found, 16 of which qualified for full-text review. Eleven studies (8,147 patients; n range: 37-5,161) passed the methodological quality filter and were finally included. Interobserver agreement was excellent for both the methodological quality score (ICC: 0.93; 95% CI: 0.82-0.98; P<.001) and the inclusion/rejection decision (κ = 1.00; P<.001). In all reports body composition was assessed by BMI; however, marked heterogeneity was found in the method used for RA activity assessment. A significant association between BMI and RA activity was found in the 6 reports with the larger mean sample size: 1,274 (range: 140-5,161). On the other hand, this association was not found in the 5 studies with the lower mean sample size: 100 (range: 7-150). A modulation of RA clinical status by body fat mass is suggested, because a significant association was found between BMI and inflammatory activity in those reports with a trend toward higher statistical power. The relationship between body composition and clinical activity in RA needs to be approached in further studies with higher methodological quality. Copyright © 2015 Elsevier España, S.L.U. and Sociedad Española de Reumatología y Colegio Mexicano de Reumatología. All rights reserved.
Plue, J; Colas, F; Auffret, A G; Cousins, S A O
2017-03-01
Persistent seed banks are a key plant regeneration strategy, buffering environmental variation to allow population and species persistence. Understanding seed bank functioning within herb layer dynamics is therefore important. However, rather than assessing emergence from the seed bank in herb layer gaps, most studies evaluate the seed bank functioning via a greenhouse census. We hypothesise that greenhouse data may not reflect seed bank-driven emergence in disturbance gaps due to methodological differences. Failure in detecting (specialist) species may then introduce methodological bias into the ecological interpretation of seed bank functions using greenhouse data. The persistent seed bank was surveyed in 40 semi-natural grassland plots across a fragmented landscape, quantifying seedling emergence in both the greenhouse and in disturbance gaps. Given the suspected interpretational bias, we tested whether each census uncovers similar seed bank responses to fragmentation. Seed bank characteristics were similar between censuses. Census type affected seed bank composition, with >25% of species retrieved better by either census type, dependent on functional traits including seed longevity, production and size. Habitat specialists emerged more in disturbance gaps than in the greenhouse, while the opposite was true for ruderal species. Both censuses uncovered fragmentation-induced seed bank patterns. Low surface area sampling, larger depth of sampling and germination conditions cause underrepresentation of the habitat-specialised part of the persistent seed bank flora during greenhouse censuses. Methodological bias introduced in the recorded seed bank data may consequently have significant implications for the ecological interpretation of seed bank community functions based on greenhouse data. © 2016 German Botanical Society and The Royal Botanical Society of the Netherlands.
Boix, C; Ibáñez, M; Fabregat-Safont, D; Morales, E; Pastor, L; Sancho, J V; Sánchez-Ramírez, J E; Hernández, F
2016-01-01
In this work, two analytical methodologies based on liquid chromatography coupled to tandem mass spectrometry (LC-MS/MS) were developed for quantification of emerging pollutants identified in sewage sludge after a previous wide-scope screening. The target list included 13 emerging contaminants (ECs): thiabendazole, acesulfame, fenofibric acid, valsartan, irbesartan, salicylic acid, diclofenac, carbamazepine, 4-aminoantipyrine (4-AA), 4-acetyl aminoantipyrine (4-AAA), 4-formyl aminoantipyrine (4-FAA), venlafaxine and benzoylecgonine. The aqueous and solid phases of the sewage sludge were analyzed using solid-phase extraction (SPE) and ultrasonic extraction (USE) for sample treatment, respectively. The methods were validated at three concentration levels: 0.2, 2 and 20 μg L(-1) for the aqueous phase, and 50, 500 and 2000 μg kg(-1) for the solid phase of the sludge. In general, the methods were satisfactorily validated, showing good recoveries (70-120%) and precision (RSD < 20%). The limit of quantification (LOQ) was below 0.1 μg L(-1) in the aqueous phase and below 50 μg kg(-1) in the solid phase for the majority of the analytes. The method applicability was tested by analysis of samples from a wider study on degradation of emerging pollutants in sewage sludge under anaerobic digestion. The key benefits of these methodologies are: • SPE and USE are appropriate sample treatment procedures to extract the selected emerging contaminants from the aqueous phase of the sewage sludge and the solid residue. • LC-MS/MS is highly suitable for determining emerging contaminants in both sludge phases. • To our knowledge, the main metabolites of dipyrone had not been studied before in sewage sludge.
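The validation criteria quoted here (recovery within 70-120% and RSD < 20%) reduce to two short formulas. A minimal sketch with invented replicate measurements at the 2 μg L(-1) validation level:

```python
def recovery_pct(measured, spiked):
    """Mean recovery (%) of an analyte spiked at a known level."""
    return 100.0 * sum(measured) / (len(measured) * spiked)

def rsd_pct(values):
    """Relative standard deviation (%), using the sample std dev (n-1)."""
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / (n - 1)) ** 0.5
    return 100.0 * sd / mean

# hypothetical replicates at the 2 ug/L level (not the study's data)
reps = [1.84, 1.92, 2.05, 1.78, 1.96]
print(round(recovery_pct(reps, 2.0), 1), round(rsd_pct(reps), 1))
```

With these figures the method would pass both acceptance criteria (recovery 95.5%, RSD about 5.5%).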
Zhang, Fengrui; Adeola, Olayiwola
2017-12-01
Sound feed formulation depends on precise evaluation of energy and nutrient values in feed ingredients. Hence, the methodology used to determine the digestibility of energy and nutrients in feedstuffs should be chosen carefully before conducting experiments. The direct and difference procedures are widely used to determine the digestibility of energy and nutrients in feedstuffs. The direct procedure is normally considered when the test feedstuff can be formulated as the sole source of the component of interest in the test diet. However, in cases where test ingredients can only be formulated to replace a portion of the basal diet to provide the component of interest, the difference procedure can be applied to obtain equally robust values. Based on the component of interest, ileal digesta or feces can be collected, and different sample collection processes can be used. For example, for amino acids (AA), to avoid the interference of fermentation in the hindgut, ileal digesta samples are collected to determine ileal digestibility, and the simple T-cannula and index method are commonly used techniques for AA digestibility analysis. For energy, phosphorus, and calcium, fecal samples are normally collected to determine total tract digestibility, and the total collection method is therefore recommended to obtain more accurate estimates. Concerns with the use of apparent digestibility values include different estimated values at different inclusion levels and non-additivity in mixtures of feed ingredients. These concerns can be overcome by using standardized or true digestibility, obtained by correcting apparent digestibility values for endogenous losses of the components. In this review, methodologies used to determine energy and nutrient digestibility in pigs are discussed. It is suggested that the methodology be carefully selected based on the component of interest, the feed ingredients, and the available experimental facilities.
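The index (marker) method and the difference procedure mentioned above each reduce to a one-line formula. A sketch with invented marker and nutrient concentrations; the function names and figures are illustrative, not from the review:

```python
def apparent_digestibility(marker_diet, marker_output, nutrient_diet, nutrient_output):
    """Apparent digestibility (%) by the index (marker) method, using an
    indigestible marker's concentration in diet and digesta/feces."""
    return 100.0 * (1 - (marker_diet / marker_output) * (nutrient_output / nutrient_diet))

def digestibility_by_difference(d_test_diet, d_basal, p_test):
    """Difference procedure: digestibility of the test ingredient, where
    p_test is the fraction of the dietary component it supplies."""
    return (d_test_diet - (1 - p_test) * d_basal) / p_test

# hypothetical marker and amino-acid concentrations, g/kg DM
ad = apparent_digestibility(3.0, 12.0, 10.0, 8.0)
print(ad)                                          # index method
print(digestibility_by_difference(78.0, 82.0, 0.4))  # difference procedure
```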
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
This report provides a detailed summary of the activities carried out to sample groundwater at Waste Area Grouping (WAG) 6. The analytical results for samples collected during Phase 1, Activity 2 of the WAG 6 Resource Conservation and Recovery Act Facility Investigation (RFI) are also presented. In addition, analytical results for Phase 1 activity sampling events for which data were not previously reported are included in this TM. A summary of the groundwater sampling activities at WAG 6, to date, is given in the Introduction. The Methodology section describes the sampling procedures and analytical parameters. Six attachments are included. Attachments 1 and 2 provide analytical results for selected RFI groundwater samples and ORNL sampling events. Attachment 3 provides a summary of the contaminants detected in each well sampled for all sampling events conducted at WAG 6. Bechtel National Inc. (BNI)/IT Corporation Contract Laboratory (IT) RFI analytical methods and detection limits are given in Attachment 4. Attachment 5 provides the Oak Ridge National Laboratory (ORNL)/Analytical Chemistry Division (ACD) analytical methods and detection limits and Resource Conservation and Recovery Act (RCRA) quarterly compliance monitoring (1988-1989). Attachment 6 provides ORNL/ACD groundwater analytical methods and detection limits (for the 1990 RCRA semi-annual compliance monitoring).
Efficient computation of the joint sample frequency spectra for multiple populations.
Kamm, John A; Terhorst, Jonathan; Song, Yun S
2017-01-01
A wide range of studies in population genetics have employed the sample frequency spectrum (SFS), a summary statistic which describes the distribution of mutant alleles at a polymorphic site in a sample of DNA sequences and provides a highly efficient dimensional reduction of large-scale population genomic variation data. Recently, there has been much interest in analyzing the joint SFS data from multiple populations to infer parameters of complex demographic histories, including variable population sizes, population split times, migration rates, admixture proportions, and so on. SFS-based inference methods require accurate computation of the expected SFS under a given demographic model. Although much methodological progress has been made, existing methods suffer from numerical instability and high computational complexity when multiple populations are involved and the sample size is large. In this paper, we present new analytic formulas and algorithms that enable accurate, efficient computation of the expected joint SFS for thousands of individuals sampled from hundreds of populations related by a complex demographic model with arbitrary population size histories (including piecewise-exponential growth). Our results are implemented in a new software package called momi (MOran Models for Inference). Through an empirical study we demonstrate our improvements to numerical stability and computational complexity.
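momi's multi-population machinery is far more involved, but for orientation, the classical single-population, constant-size expectation that such methods generalize, E[ξ_i] = θ/i for i = 1..n-1, can be written directly (this is the textbook neutral-coalescent result, not momi's algorithm):

```python
def expected_sfs(n, theta=1.0):
    """Expected neutral SFS for one panmictic, constant-size population:
    E[xi_i] = theta / i for derived-allele counts i = 1..n-1."""
    return [theta / i for i in range(1, n)]

spec = expected_sfs(5, theta=2.0)
print(spec)  # singletons are most frequent; counts fall off as 1/i
```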
L'Engle, Kelly; Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne
2018-01-01
Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. 
Random digit dialing of mobile phones offers promise for future data collection in Ghana and may be suitable for other developing countries.
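The AAPOR-style outcome rates reported above can be sketched in simplified form. This omits several AAPOR disposition categories and uses invented call counts, not the Ghana survey's actual dispositions; see the AAPOR Standard Definitions for the full RR1-RR6 variants:

```python
def aapor_rates(I, P, R, NC, O, UNK, e):
    """Simplified AAPOR-style outcome rates (sketch only).
    I: complete interviews, P: partials, R: refusals, NC: non-contacts,
    O: other eligible non-interviews, UNK: unknown-eligibility numbers,
    e: estimated fraction of UNK cases that are actually eligible."""
    eligible = I + P + R + NC + O + e * UNK
    response = (I + P) / eligible          # RR2-style (partials counted)
    cooperation = (I + P) / (I + P + R)
    refusal = R / eligible
    contact = (I + P + R) / eligible
    return response, cooperation, refusal, contact

# invented call-outcome counts for illustration
rr, coop, ref, con = aapor_rates(I=300, P=100, R=100, NC=500, O=0, UNK=1000, e=0.5)
print(round(rr, 2), round(coop, 2), round(ref, 2), round(con, 2))
```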
Rastall, Andrew C; Getting, Dominic; Goddard, Jon; Roberts, David R; Erdinger, Lothar
2006-07-01
Some anthropogenic pollutants possess the capacity to disrupt endogenous control of developmental and reproductive processes in aquatic biota by activating estrogen receptors. Many anthropogenic estrogen receptor agonists (ERAs) are hydrophobic and will therefore readily partition into the abiotic organic carbon phases present in natural waters. This partitioning process effectively reduces the proportion of ERAs readily available for bioconcentration by aquatic biota. Results from some studies have suggested that for many aquatic species, bioconcentration of the freely-dissolved fraction may be the principal route of uptake for hydrophobic pollutants with logarithm n-octanol/water partition coefficient (log Kow) values less than approximately 6.0, which includes the majority of known anthropogenic ERAs. The detection and identification of freely-dissolved, readily bioconcentratable ERAs is therefore an important aspect of exposure and risk assessment. However, most studies use conventional techniques to sample total ERA concentrations and in doing so frequently fail to account for bioconcentration of the freely-dissolved fraction. The aim of the current study was to couple the biomimetic sampling properties of semipermeable membrane devices (SPMDs) to a bioassay-directed chemical analysis (BDCA) scheme for the detection and identification of readily bioconcentratable ERAs in surface waters. SPMDs were constructed and deployed at a number of sites in Germany and the UK. Following the dialytic recovery of target compounds and size exclusion chromatographic cleanup, SPMD samples were fractionated using a reverse-phase HPLC method calibrated to provide an estimation of target analyte log Kow. A portion of each HPLC fraction was then subjected to the yeast estrogen screen (YES) to determine estrogenic potential. Results were plotted in the form of 'estrograms' which displayed profiles of estrogenic potential as a function of HPLC retention time (i.e.
hydrophobicity) for each of the samples. Where significant activity was elicited in the YES, the remaining portion of the respective active fraction was subjected to GC-MS analysis in an attempt to identify the ERAs present. Estrograms from each of the field samples showed that readily bioconcentratable ERAs were present at each of the sampling sites. Estimated log Kow values for the various active fractions ranged from 1.92 to 8.63. For some samples, estrogenic potential was associated with a relatively narrow range of log Kow values whilst in others estrogenic potential was more widely distributed across the respective estrograms. ERAs identified in active fractions included some benzophenones, various nonylphenol isomers, benzyl butyl phthalate, dehydroabietic acid, sitosterol, 3-(4-methylbenzylidine)camphor (4-MBC) and 6-acetyl-1,1,2,4,4,7-hexamethyltetralin (AHTN). Other tentatively identified compounds which may have contributed to the observed YES activity included various polycyclic aromatic hydrocarbons (PAHs) and their alkylated derivatives, methylated benzylphenols, various alkyl-phenols and dialkylphenols. However, potential ERAs present in some active fractions remain unidentified. Our results show that SPMD-YES-based BDCA can be used to detect and identify readily bioconcentratable ERAs in surface waters. As such, this biomimetic approach can be employed as an alternative to conventional methodologies to provide investigators with a more environmentally relevant insight into the distribution and identity of ERAs in surface waters. The use of alternative bioassays also has the potential to expand SPMD-based BDCA to include a wide range of toxicological endpoints. Improvements to the analytical methodology used to identify ERAs or other target compounds in active fractions in the current study could greatly enhance the applicability of the methodology to risk assessment and monitoring programmes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wen, Haiming; Lin, Yaojun; Seidman, David N.
The preparation of transmission electron microscopy (TEM) samples from powders with particle sizes larger than ~100 nm poses a challenge. The existing methods are complicated and expensive, or have a low probability of success. Herein, we report a modified methodology for preparation of TEM samples from powders, which is efficient, cost-effective, and easy to perform. This method involves mixing powders with an epoxy on a piece of weighing paper, curing the powder–epoxy mixture to form a bulk material, grinding the bulk to obtain a thin foil, punching TEM discs from the foil, dimpling the discs, and ion milling the dimpled discs to electron transparency. Compared with the well-established and robust grinding–dimpling–ion-milling method for TEM sample preparation for bulk materials, our modified approach for preparing TEM samples from powders only requires two additional simple steps. In this article, step-by-step procedures for our methodology are described in detail, and important strategies to ensure success are elucidated. Furthermore, our methodology has been applied successfully for preparing TEM samples with large thin areas and high quality for many different mechanically milled metallic powders.
Saadat, Victoria M
2015-01-01
The dissolution of the USSR resulted in independence for its constituent republics but left them battling an unstable economic environment and healthcare system. Increases in injection drug use, prostitution, and migration were widespread responses to this transition and have contributed to the emergence of an HIV epidemic in the countries of the former Soviet Union. Researchers have begun to identify the risks of HIV infection as well as the barriers to HIV testing and treatment in the former Soviet Union. Significant methodological challenges have arisen and need to be addressed. The objective of this review is to identify common threads in HIV research in the former Soviet Union and provide useful recommendations for future research studies. In this systematic review of the literature, PubMed was searched for English-language studies using the key search terms "HIV", "AIDS", "human immunodeficiency virus", "acquired immune deficiency syndrome", "Central Asia", "Kazakhstan", "Kyrgyzstan", "Uzbekistan", "Tajikistan", "Turkmenistan", "Russia", "Ukraine", "Armenia", "Azerbaijan", and "Georgia". Studies were evaluated against eligibility criteria for inclusion. Thirty-nine studies were identified across the two main topic areas of HIV risk and barriers to testing and treatment, themes subsequently referred to as "risk" and "barriers". Study design was predominantly cross-sectional. The most frequently used sampling methods were peer-to-peer and non-probabilistic sampling. The most frequently reported risks were condom misuse, risky intercourse, and unsafe practices among injection drug users. Common barriers to testing included that testing was inconvenient and that results would not remain confidential. Frequent barriers to treatment were based on distrust of the treatment system. The findings of this review reveal methodological limitations that span the existing studies.
Small sample size, cross-sectional design, and non-probabilistic sampling methods were frequently reported limitations. Future work is needed to examine barriers to testing and treatment as well as longitudinal studies on HIV risk over time in most-at-risk populations.
A review of the additive health risk of cannabis and tobacco co-use.
Meier, Ellen; Hatsukami, Dorothy K
2016-09-01
Cannabis and tobacco are the most widely used substances, and are often used together. The present review examines the toxicant exposure associated with co-use (e.g., carbon monoxide, carcinogens), co-use via electronic nicotine delivery systems (ENDS), and problematic methodological issues present across co-use studies. An extensive literature search through PubMed was conducted and studies utilizing human subjects and in vitro methods were included. Keywords included tobacco, cigarette, e-cigarette, ENDS, smoking, or nicotine AND marijuana OR cannabis OR THC. Co-use may pose additive risk for toxicant exposure as certain co-users (e.g., blunt users) tend to have higher breath carbon monoxide levels and cannabis smoke can have higher levels of some carcinogens than tobacco smoke. Cannabis use via ENDS is low and occurs primarily among established tobacco or cannabis users, but its incidence may be increasing and expanding to tobacco/cannabis naïve individuals. There are several methodological issues across co-use research including varying definitions of co-use, sample sizes, lack of control for important covariates (e.g., time since last cigarette), and inconsistent measurement of outcome variables. There are some known additive risks for toxicant exposure as a result of co-use. Research utilizing consistent methodologies is needed to further establish the additive risk of co-use. Future research should also be aware of novel technologies (e.g., ENDS) as they likely alter some toxicant exposure when used alone and with cannabis. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
CBT Specific Process in Exposure-Based Treatments: Initial Examination in a Pediatric OCD Sample
Benito, Kristen Grabill; Conelea, Christine; Garcia, Abbe M.; Freeman, Jennifer B.
2012-01-01
Cognitive-Behavioral theory and empirical support suggest that optimal activation of fear is a critical component for successful exposure treatment. Using this theory, we developed coding methodology for measuring CBT-specific process during exposure. We piloted this methodology in a sample of young children (N = 18) who previously received CBT as part of a randomized controlled trial. Results supported the preliminary reliability and predictive validity of coding variables with 12 week and 3 month treatment outcome data, generally showing results consistent with CBT theory. However, given our limited and restricted sample, additional testing is warranted. Measurement of CBT-specific process using this methodology may have implications for understanding mechanism of change in exposure-based treatments and for improving dissemination efforts through identification of therapist behaviors associated with improved outcome. PMID:22523609
Polyphenols excreted in urine as biomarkers of total polyphenol intake.
Medina-Remón, Alexander; Tresserra-Rimbau, Anna; Arranz, Sara; Estruch, Ramón; Lamuela-Raventos, Rosa M
2012-11-01
Nutritional biomarkers have several advantages in acquiring data for epidemiological and clinical studies over traditional dietary assessment tools, such as food frequency questionnaires. While food frequency questionnaires constitute a subjective methodology, biomarkers can provide a less biased and more accurate measure of specific nutritional intake. A precise estimation of polyphenol consumption requires blood or urine sample biomarkers, although their association is usually highly complex. This article reviews recent research on urinary polyphenols as potential biomarkers of polyphenol intake, focusing on clinical and epidemiological studies. We also report a potentially useful methodology to assess total polyphenols in urine samples, which allows a rapid, simultaneous determination of total phenols in a large number of samples. This methodology can be applied in studies evaluating the utility of urinary polyphenols as markers of polyphenol intake, bioavailability and accumulation in the body.
D'Amato, Marilena; Turrini, Aida; Aureli, Federica; Moracci, Gabriele; Raggi, Andrea; Chiaravalle, Eugenio; Mangiacotti, Michele; Cenci, Telemaco; Orletti, Roberta; Candela, Loredana; di Sandro, Alessandra; Cubadda, Francesco
2013-01-01
This article presents the methodology of the Italian Total Diet Study 2012-2014 aimed at assessing the dietary exposure of the general Italian population to selected nonessential trace elements (Al, inorganic As, Cd, Pb, methyl-Hg, inorganic Hg, U) and radionuclides (40K, 134Cs, 137Cs, 90Sr). The establishment of the TDS food list, the design of the sampling plan, and details about the collection of food samples, their standardized culinary treatment, pooling into analytical samples and subsequent sample treatment are described. Analytical techniques and quality assurance are discussed, with emphasis on the need for speciation data and for minimizing the percentage of left-censored data so as to reduce uncertainties in exposure assessment. Finally the methodology for estimating the exposure of the general population and of population subgroups according to age (children, teenagers, adults, and the elderly) and gender, both at the national level and for each of the four main geographical areas of Italy, is presented.
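The exposure-assessment step described above amounts to summing concentration × consumption over foods and scaling by body weight. A minimal sketch with invented cadmium figures (in practice, left-censored results are also substituted at lower/upper bounds before this step):

```python
def dietary_exposure(concentrations, consumption, body_weight):
    """Chronic dietary exposure: sum over foods of concentration
    (ug/kg food) x consumption (kg food/day), divided by body
    weight (kg bw), giving ug per kg bw per day."""
    return sum(c * q for c, q in zip(concentrations, consumption)) / body_weight

# hypothetical Cd concentrations and adult consumption figures
conc = [20.0, 5.0, 50.0]   # ug/kg in cereals, milk, shellfish
cons = [0.25, 0.30, 0.02]  # kg/day
print(round(dietary_exposure(conc, cons, 70.0), 4))
```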
Korsgaard, Inge Riis; Lund, Mogens Sandø; Sorensen, Daniel; Gianola, Daniel; Madsen, Per; Jensen, Just
2003-01-01
A fully Bayesian analysis using Gibbs sampling and data augmentation in a multivariate model of Gaussian, right censored, and grouped Gaussian traits is described. The grouped Gaussian traits are either ordered categorical traits (with more than two categories) or binary traits, where the grouping is determined via thresholds on the underlying Gaussian scale, the liability scale. Allowances are made for unequal models, unknown covariance matrices and missing data. Having outlined the theory, strategies for implementation are reviewed. These include joint sampling of location parameters; efficient sampling from the fully conditional posterior distribution of augmented data, a multivariate truncated normal distribution; and sampling from the conditional inverse Wishart distribution, the fully conditional posterior distribution of the residual covariance matrix. Finally, a simulated dataset was analysed to illustrate the methodology. This paper concentrates on a model where residuals associated with liabilities of the binary traits are assumed to be independent. A Bayesian analysis using Gibbs sampling is outlined for the model where this assumption is relaxed. PMID:12633531
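The truncated-normal draws used to sample augmented liabilities in such threshold-model Gibbs steps can be generated by inverse-CDF sampling. A minimal univariate sketch, not the authors' implementation:

```python
from statistics import NormalDist
import random

def sample_truncated_normal(mu, sigma, lo, hi, rng=random):
    """Inverse-CDF draw from N(mu, sigma^2) truncated to (lo, hi), the kind
    of draw used for augmented liabilities in threshold models."""
    nd = NormalDist(mu, sigma)
    a, b = nd.cdf(lo), nd.cdf(hi)
    return nd.inv_cdf(a + rng.random() * (b - a))

# liabilities for a binary trait observed as 1 must lie above the threshold 0
rng = random.Random(1)
draws = [sample_truncated_normal(0.0, 1.0, 0.0, float("inf"), rng) for _ in range(5)]
print(all(d > 0.0 for d in draws))
```

Multivariate truncated-normal sampling, as in the paper, is typically done one coordinate at a time with exactly this kind of univariate draw conditioned on the others.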
Teodoro, Janaína Aparecida Reis; Pereira, Hebert Vinicius; Sena, Marcelo Martins; Piccin, Evandro; Zacca, Jorge Jardim; Augusti, Rodinei
2017-12-15
A direct method based on paper spray mass spectrometry (PS-MS) combined with a supervised chemometric method (partial least squares discriminant analysis, PLS-DA) was developed and applied to the discrimination of authentic and counterfeit samples of blended Scottish whiskies. The developed methodology employed negative ion mode MS and included 44 authentic whiskies from diverse brands and batches and 44 counterfeit samples of the same brands seized during operations of the Brazilian Federal Police, totaling 88 samples. An exploratory principal component analysis (PCA) model showed reasonable discrimination of the counterfeit whiskies along PC2. In spite of the samples' heterogeneity, a robust, reliable and accurate PLS-DA model was generated and validated, which was able to correctly classify the samples with a success rate of nearly 100%. The use of PS-MS also allowed the identification of the main marker compounds associated with each type of sample analyzed: authentic or counterfeit. Copyright © 2017 Elsevier Ltd. All rights reserved.
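The exploratory PCA step can be sketched with a plain SVD on mean-centred data. The "spectra" below are synthetic, with a single invented marker channel separating the two classes; this is an illustration of PCA itself, not the study's PS-MS data processing:

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Principal-component scores via SVD of the mean-centred data matrix
    (rows = samples, columns = m/z channels)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

# synthetic 'spectra': the classes differ only in one invented marker channel
rng = np.random.default_rng(0)
authentic = rng.normal(10.0, 0.5, size=(10, 6))
counterfeit = rng.normal(10.0, 0.5, size=(10, 6))
counterfeit[:, 2] += 4.0
scores = pca_scores(np.vstack([authentic, counterfeit]))
print(scores.shape)
```

With a class difference this strong, the two groups separate cleanly along the first component (sign of the axis is arbitrary, as always with SVD-based PCA).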
Tucker, Jalie A; Simpson, Cathy A; Chandler, Susan D; Borch, Casey A; Davies, Susan L; Kerbawy, Shatomi J; Lewis, Terri H; Crawford, M Scott; Cheong, JeeWon; Michael, Max
2016-01-01
Emerging adulthood often entails heightened risk-taking with potential life-long consequences, and research on risk behaviors is needed to guide prevention programming, particularly in under-served and difficult-to-reach populations. This study evaluated the utility of Respondent Driven Sampling (RDS), a peer-driven methodology that corrects limitations of snowball sampling, for reaching at-risk African American emerging adults from disadvantaged urban communities. Initial "seed" participants from the target group recruited peers, who then recruited their peers in an iterative process (110 males, 234 females; M age = 18.86 years). Structured field interviews assessed common health risk factors, including substance use, overweight/obesity, and sexual behaviors. Established gender- and age-related associations with risk factors were replicated, and sample risk profiles and prevalence estimates compared favorably with matched samples from representative U.S. national surveys. Findings supported the use of RDS as a sampling method and grassroots platform for research and prevention with community-dwelling risk groups.
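One common way RDS corrects snowball sampling's bias toward well-connected respondents is the RDS-II (Volz-Heckathorn) estimator, which weights each respondent by the inverse of their reported network size. A sketch with invented outcome and degree data; the study does not state that this particular estimator was used:

```python
def rds_ii_prevalence(outcomes, degrees):
    """RDS-II estimator: prevalence of a binary outcome with each
    respondent weighted by 1 / (reported network size)."""
    weights = [1.0 / d for d in degrees]
    return sum(w * y for w, y in zip(weights, outcomes)) / sum(weights)

# hypothetical data: 1 = reports the risk behaviour; degree = network size
outcomes = [1, 0, 1, 1, 0, 0]
degrees  = [10, 2, 5, 20, 4, 2]
print(round(rds_ii_prevalence(outcomes, degrees), 3))
```

Here the weighted estimate (about 0.219) is well below the naive sample mean (0.5), because the behaviour happens to be concentrated among the high-degree, over-recruited respondents.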
2016-05-01
and Kroeger (2002) provide details on sampling and weighting. Following the summary of the survey methodology is a description of the survey analysis... description of priority, for the ADDRESS file). At any given time, the current address used corresponded to the address number with the highest priority...types of address updates provided by the postal service. They are detailed below; each includes a description of the processing steps. 1. Postal Non
NASA Technical Reports Server (NTRS)
Boyce, Lola; Lovelace, Thomas B.
1989-01-01
FORTRAN program RANDOM2 is presented in the form of a user's manual. RANDOM2 is based on fracture mechanics using a probabilistic fatigue crack growth model. It predicts the random lifetime of an engine component to reach a given crack size. Details of the theoretical background, input data instructions, and a sample problem illustrating the use of the program are included.
NASA Technical Reports Server (NTRS)
Boyce, Lola; Lovelace, Thomas B.
1989-01-01
FORTRAN programs RANDOM3 and RANDOM4 are documented in the form of a user's manual. Both programs are based on fatigue strength reduction, using a probabilistic constitutive model. The programs predict the random lifetime of an engine component to reach a given fatigue strength. The theoretical backgrounds, input data instructions, and sample problems illustrating the use of the programs are included.
Ozminkowski, R J; Goetzel, R Z
2001-01-01
The authors describe the most important methodological challenges often encountered in conducting research and evaluation on the financial impact of health promotion. These include selection bias, skewed data, small sample sizes, and the choice of metrics. They discuss when these problems can and cannot be overcome, and suggest how some of them can be addressed by creating an appropriate framework for the study and using state-of-the-art statistical methods.
Demonstration of Incremental Sampling Methodology for Soil Containing Metallic Residues
2013-09-01
and includes metamorphic, sedimentary, and volcanic rocks of Paleozoic age (Péwé et al. 1966). Upland areas adjacent to the Tanana River usually are...as 5 m of silt, Late Pleistocene to Holocene in age. Gravel consists mostly of quartz and metamorphic rock with clasts ranging from 0.3 to 7.5 cm in...and Shawna Tazik September 2013 Approved for public release; distribution is unlimited. The US Army Engineer Research and
Rapid Analysis of Ash Composition Using Laser-Induced Breakdown Spectroscopy (LIBS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tyler L. Westover
2013-01-01
Inorganic compounds are known to be problematic in the thermochemical conversion of biomass to syngas and ultimately hydrocarbon fuels. The elements Si, K, Ca, Na, S, P, Cl, Mg, Fe, and Al are particularly problematic and are known to influence reaction pathways, contribute to fouling and corrosion, poison catalysts, and impact waste streams. Substantial quantities of inorganic species can be entrained in the bark of trees during harvest operations. Herbaceous feedstocks often have even greater quantities of inorganic constituents, which can account for as much as one-fifth of the total dry matter. Current methodologies to measure the concentrations of these elements, such as inductively coupled plasma-optical emission spectrometry/mass spectrometry (ICP-OES/MS), are expensive in time and reagents. This study demonstrates that a new methodology employing laser-induced breakdown spectroscopy (LIBS) can rapidly and accurately analyze the inorganic constituents in a wide range of biomass materials, including both woody and herbaceous examples. This technique requires little or no sample preparation, does not consume any reagents, and the analytical data are available immediately. In addition to comparing LIBS data with the results from ICP-OES methods, this work also includes discussions of sample preparation techniques, calibration curves for interpreting LIBS spectra, minimum detection limits, and the use of internal standards and standard reference materials.
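The calibration-curve step mentioned above can be illustrated with a least-squares line relating internal-standard-normalized line intensity to reference concentrations (e.g., from ICP-OES). The numbers below are invented, not measured LIBS data:

```python
# Least-squares calibration: element line intensity (normalized to an internal
# standard) versus reference concentration. All values are illustrative.
conc = [0.5, 1.0, 2.0, 4.0, 8.0]            # wt% of the element in reference samples
intensity = [0.11, 0.21, 0.42, 0.79, 1.62]  # normalized line intensity

n = len(conc)
mx, my = sum(conc) / n, sum(intensity) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(conc, intensity))
         / sum((x - mx) ** 2 for x in conc))
intercept = my - slope * mx

def predict_conc(i):
    """Invert the calibration line to read a concentration off a new spectrum."""
    return (i - intercept) / slope

print(f"slope={slope:.3f}, intercept={intercept:.3f}, "
      f"intensity 0.50 -> {predict_conc(0.50):.2f} wt%")
```

A minimum detection limit can then be estimated from the calibration slope together with the noise level of blank spectra.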
NASA Astrophysics Data System (ADS)
Sepúlveda, J.; Hoyos Ortiz, C. D.
2017-12-01
An adequate quantification of precipitation over land is critical for many societal applications, including agriculture, hydroelectricity generation, water supply, and risk management associated with extreme events. Rain gauges, the traditional method for precipitation estimation and an excellent one for measuring the volume of liquid water during a particular precipitation event, do not fully capture the high spatial variability of the phenomenon, which is a requirement for almost all practical applications. The weather radar, on the other hand, an active remote sensor, provides a proxy for rainfall with fine spatial resolution and adequate temporal sampling; however, it does not measure surface precipitation directly. To fully exploit the capabilities of the weather radar, it is necessary to develop quantitative precipitation estimation (QPE) techniques combining radar information with in-situ measurements. Different QPE methodologies are explored and adapted to local observations in a highly complex terrain region in tropical Colombia using a C-band radar and a relatively dense network of rain gauges and disdrometers. One important result is that the expressions reported in the literature for extratropical locations are not representative of the conditions found in the tropical region studied. In addition to reproducing the state-of-the-art techniques, a new multi-stage methodology based on radar-derived variables and disdrometer data is proposed in order to achieve the best QPE possible. The main motivation for this new methodology is that most traditional QPE methods do not directly account for the different sources of uncertainty involved in the process. The main advantage of the multi-stage model over traditional models is that it allows assessing and quantifying the uncertainty in the surface rain rate estimation.
The sub-hourly rainfall estimations using the multi-stage methodology are realistic compared to observed data in spite of the many sources of uncertainty including the sampling volume, the different physical principles of the sensors, the incomplete understanding of the microphysics of precipitation and, the most important, the rapidly varying droplet size distribution.
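The simplest radar QPE step such work builds on is inverting a Z-R power law. A minimal sketch using the classic Marshall-Palmer coefficients (an extratropical default, which is exactly why locally fitted coefficients are needed in the tropics) is:

```python
import math

def rain_rate_from_dbz(dbz, a=200.0, b=1.6):
    """Invert Z = a * R**b (Marshall-Palmer defaults; tropical sites need local a, b)."""
    z = 10.0 ** (dbz / 10.0)        # reflectivity factor in mm^6 m^-3
    return (z / a) ** (1.0 / b)     # rain rate in mm/h

for dbz in (20, 35, 50):
    print(f"{dbz} dBZ -> {rain_rate_from_dbz(dbz):.1f} mm/h")
```

Fitting a and b to collocated disdrometer or gauge data replaces the defaults; a multi-stage scheme would additionally carry an uncertainty estimate alongside each rain rate.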
NASA Astrophysics Data System (ADS)
Maymandi, Nahal; Kerachian, Reza; Nikoo, Mohammad Reza
2018-03-01
This paper presents a new methodology for optimizing Water Quality Monitoring (WQM) networks of reservoirs and lakes using the concept of the value of information (VOI) and the results of a calibrated numerical water quality simulation model. Under the value of information theory, the water quality at each checkpoint, characterized by a specific prior probability, varies in time. After analyzing water quality samples taken from potential monitoring points, the posterior probabilities are updated using Bayes' theorem, and the VOI of the samples is calculated. In the next step, the stations with maximum VOI are selected as optimal stations. This process is repeated for each sampling interval to obtain optimal monitoring network locations for each interval. The results of the proposed VOI-based methodology are compared with those obtained using an entropy-theoretic approach. As the results of the two methodologies are partially different, they are then combined using a weighting method. Finally, the optimal sampling interval and locations of WQM stations are chosen using the Evidential Reasoning (ER) decision-making method. The efficiency and applicability of the methodology are evaluated using available water quantity and quality data for the Karkheh Reservoir in southwestern Iran.
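The core VOI computation (a Bayes update of the checkpoint's state probabilities, then the drop in expected loss from observing a sample) can be sketched with a two-state toy example; the priors, likelihoods, and losses below are invented:

```python
# Two water-quality states and a binary test result; all numbers are illustrative.
prior = {"clean": 0.7, "polluted": 0.3}
likelihood = {"clean":    {"pos": 0.1, "neg": 0.9},   # P(result | state)
              "polluted": {"pos": 0.8, "neg": 0.2}}
loss = {("treat", "clean"): 10, ("treat", "polluted"): 0,
        ("ignore", "clean"): 0, ("ignore", "polluted"): 50}

def expected_loss(p):
    """Loss of the best action given belief p over states."""
    return min(sum(p[s] * loss[(a, s)] for s in p) for a in ("treat", "ignore"))

def posterior(result):
    """Bayes' theorem: update the prior with the observed sample result."""
    joint = {s: prior[s] * likelihood[s][result] for s in prior}
    z = sum(joint.values())
    return {s: joint[s] / z for s in joint}

p_result = {r: sum(prior[s] * likelihood[s][r] for s in prior) for r in ("pos", "neg")}
voi = expected_loss(prior) - sum(p_result[r] * expected_loss(posterior(r))
                                 for r in p_result)
print(f"VOI of sampling this station: {voi:.2f}")
```

Repeating this for every candidate station and picking the maximum-VOI locations mirrors the selection step described in the abstract.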
Credit risk migration rates modeling as open systems: A micro-simulation approach
NASA Astrophysics Data System (ADS)
Landini, S.; Uberti, M.; Casellina, S.
2018-05-01
The financial crisis of 2008 stimulated the development of new regulatory criteria (commonly known as Basel III) that pushed banking activity to become more prudential, in both the short and the long run. As is well known, in 2014 the International Accounting Standards Board (IASB) promulgated the new International Financial Reporting Standard 9 (IFRS 9) for financial instruments, effective in January 2018. Since the delayed recognition of credit losses on loans was identified as a weakness in existing accounting standards, the IASB introduced an Expected Loss model that requires more timely recognition of credit losses. Specifically, the new standards require entities to account for expected losses both from when impairments are first recognized and over the full loan lifetime; moreover, a clear preference for forward-looking models is expressed. In this new framework, a re-thinking of the widespread standard theoretical approach on which the well-known prudential model is founded is necessary. The aim of this paper is therefore to define an original methodological approach to migration rates modeling for credit risk that is innovative with respect to the standard method, from the point of view of a bank as well as from a regulatory perspective. Accordingly, the proposed non-standard approach considers a portfolio as an open sample, allowing for entries, migrations of stayers, and exits. While being consistent with empirical observations, this open-sample approach contrasts with the standard closed-sample method. In particular, this paper offers a methodology to integrate the outcomes of the standard closed-sample method into the open-sample perspective while removing some of the assumptions of the standard method.
Three main conclusions can be drawn in terms of economic capital provision: (a) based on the Markovian hypothesis with an a-priori absorbing state at default, the standard closed-sample method should be abandoned so as not to predict lenders' bankruptcy by construction; (b) to obtain more reliable estimates in line with the new regulatory standards, the sample used to estimate migration rate matrices for credit risk should include both entries and exits; (c) the static eigen-decomposition standard procedure for forecasting migration rates should be replaced with a stochastic process dynamics methodology, conditioning forecasts on macroeconomic scenarios.
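The open-sample idea, keeping exits (and, symmetrically, entries) in the denominator instead of conditioning on survivors, can be sketched with row-normalized transition counts; the rating grades and counts below are invented:

```python
# Row-normalized migration rates from one period of transition counts.
# The open-sample view keeps an explicit "exit" column (repayments, write-offs)
# instead of renormalizing over survivors only. Counts are illustrative.
counts = {  # counts[i][j]: loans moving from grade i to state j
    "A": {"A": 900, "B": 80,  "C": 15,  "default": 5,  "exit": 50},
    "B": {"A": 40,  "B": 800, "C": 100, "default": 20, "exit": 40},
    "C": {"A": 5,   "B": 60,  "C": 700, "default": 85, "exit": 150},
}

rates = {}
for i, row in counts.items():
    total = sum(row.values())        # open-sample denominator, exits included
    rates[i] = {j: row[j] / total for j in row}

print(rates["C"]["default"])         # grade-C default rate, not inflated by dropping exits
```

Dropping the "exit" column and renormalizing would reproduce the closed-sample rates, making the difference between the two conventions explicit.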
Fusion And Inference From Multiple And Massive Disparate Distributed Dynamic Data Sets
2017-07-01
principled methodology for two-sample graph testing; designed a provably almost-surely perfect vertex clustering algorithm for block model graphs; proved...3.7 Semi-Supervised Clustering Methodology...3.8 Robust Hypothesis Testing...dimensional Euclidean space allows the full arsenal of statistical and machine learning methodology for multivariate Euclidean data to be deployed for
Axial Magneto-Inductive Effect in Soft Magnetic Microfibers, Test Methodology, and Experiments
2016-03-24
NUWC-NPT Technical Report 12,186, 24 March 2016: Axial Magneto-Inductive Effect in Soft Magnetic Microfibers, Test Methodology, and Experiments. Author(s): Anthony B... Sections include Measurements and Experimental Apparatus and Sample Preparation.
77 FR 6971 - Establishment of User Fees for Filovirus Testing of Nonhuman Primate Liver Samples
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-10
... (ELISA) or other appropriate methodology. Each specimen will be held for six months. After six months.../CDC's analysis of costs to the Government is based on the current methodology (ELISA) used to test NHP... different methodology or changes in the availability of ELISA reagents will affect the amount of the user...
In situ study of live specimens in an environmental scanning electron microscope.
Tihlaříková, Eva; Neděla, Vilém; Shiojiri, Makoto
2013-08-01
In this paper we introduce a new methodology for the observation of living biological samples in an environmental scanning electron microscope (ESEM). The methodology is based on an unconventional initiation procedure for ESEM chamber pumping, free from purge-flood cycles, and on the ability to control thermodynamic processes close to the sample. The gradual and gentle change of the working environment from air to water vapor enables the study not only of living samples in dynamic in situ experiments and their manifestations of life (sample walking) but also of their experimentally stimulated physiological reactions. Moreover, Monte Carlo simulations of primary electron beam energy losses in a water layer on the sample surface were studied; consequently, the influence of the water thickness on radiation, temperature, or chemical damage to the sample was considered.
Vernazza, Christopher R; Carr, Katherine; Wildman, John; Gray, Joanne; Holmes, Richard D; Exley, Catherine; Smith, Robert A; Donaldson, Cam
2018-06-22
Resources in any healthcare system are scarce relative to need, and therefore choices must be made that often involve difficult decisions about the best allocation of these resources. One pragmatic and robust tool to aid resource allocation is Programme Budgeting and Marginal Analysis (PBMA), but there is mixed evidence on its uptake and effectiveness. Furthermore, there is no evidence on the incorporation of the preferences of a large and representative sample of the general public into such a process. The study therefore aims to undertake, evaluate and refine a PBMA process within the exemplar of NHS dentistry in England, whilst also using an established methodology (Willingness to Pay (WTP)) to systematically gather views from a representative sample of the public. Stakeholders including service buyers (commissioners), dentists, dental public health representatives and patient representatives will be recruited to participate in a PBMA process involving defining current spend, agreeing criteria to judge services/interventions, defining areas for investment and disinvestment, rating these areas against the criteria and making final recommendations. The process will be refined based on participatory action research principles and evaluated through semi-structured interviews, focus groups and observation of the process by the research team. In parallel, a representative sample of English adults will be recruited to complete a series of four surveys including WTP valuations of programmes being considered by the PBMA panel. In addition, a methodological experiment comparing two ways of eliciting WTP will be undertaken. The project will allow the PBMA process, and particularly the use of WTP within it, to be investigated and developed. There will be challenges around engagement with the task by the panel undertaking it and with the outputs by stakeholders, but careful relationship building will help to mitigate this.
The large volume of data will be managed through careful segmenting of the analysis and the use of the well-established Framework approach to qualitative data analysis. WTP has various potential biases but the elicitation will be carefully designed to minimise these and some methodological investigation will take place.
Modified graphene oxide sensors for ultra-sensitive detection of nitrate ions in water.
Ren, Wen; Mura, Stefania; Irudayaraj, Joseph M K
2015-10-01
Nitrate is a very common contaminant in drinking water and has a significant impact on the environment, necessitating routine monitoring. Due to its chemical and physical properties, it is hard to detect nitrate ions directly with high sensitivity in a simple and inexpensive manner. Here, with amino-group-modified graphene oxide (GO) as a sensing element, we show a direct and ultra-sensitive method to detect nitrate ions, with a lowest detected concentration of 5 nM in river water samples, much lower than the reported methods based on absorption spectroscopy. Furthermore, unlike the reported absorption-spectroscopy strategies in which the nitrate concentration is determined by monitoring an increase in the aggregation of gold nanoparticles (GNPs), our method evaluates the concentration of nitrate ions based on a reduction in the aggregation of GNPs for monitoring in real samples. To improve sensitivity, several optimizations were performed, including assessment of the amount of modified GO required, the concentration of GNPs, and the incubation time. The detection methodology was characterized by zeta potential, TEM, and SEM. Our results indicate that enrichment of modified GO with nitrate ions contributed to the excellent sensitivity, and the entire detection procedure could be completed within 75 min with only 20 μl of sample. This simple and rapid methodology was applied to monitor nitrate ions in real samples with excellent sensitivity and minimal pretreatment. The proposed approach paves the way for a novel means to detect anions in real samples and highlights the potential of GO-based detection strategies for water quality monitoring. Copyright © 2015 Elsevier B.V. All rights reserved.
Grant, Susan; Grant, William D; Cowan, Don A; Jones, Brian E; Ma, Yanhe; Ventosa, Antonio; Heaphy, Shaun
2006-01-01
Here we describe the application of metagenomic technologies to construct cDNA libraries from RNA isolated from environmental samples. RNAlater (Ambion) was shown to stabilize RNA in environmental samples for periods of at least 3 months at -20 degrees C. Protocols for library construction were established on total RNA extracted from Acanthamoeba polyphaga trophozoites. The methodology was then used on algal mats from geothermal hot springs in Tengchong county, Yunnan Province, People's Republic of China, and activated sludge from a sewage treatment plant in Leicestershire, United Kingdom. The Tengchong libraries were dominated by RNA from prokaryotes, reflecting the mainly prokaryotic microbial composition. The majority of these clones resulted from rRNA; only a few appeared to be derived from mRNA. In contrast, many clones from the activated sludge library had significant similarity to eukaryote mRNA-encoded protein sequences. A library was also made using polyadenylated RNA isolated from total RNA from activated sludge; many more clones in this library were related to eukaryotic mRNA sequences and proteins. Open reading frames (ORFs) up to 378 amino acids in size could be identified. Some resembled known proteins over their full length, e.g., a 36% match to cystatin, a 49% match to ribosomal protein L32, a 63% match to ribosomal protein S16, and a 70% match to CPC2 protein. The methodology described here permits the polyadenylated transcriptome to be isolated from environmental samples with no knowledge of the identity of the microorganisms in the sample or the necessity to culture them. It has many uses, including the identification of novel eukaryotic ORFs encoding proteins and enzymes.
Booth, Andrew
2016-05-04
Qualitative systematic reviews or qualitative evidence syntheses (QES) are increasingly recognised as a way to enhance the value of systematic reviews (SRs) of clinical trials. They can explain the mechanisms by which interventions, evaluated within trials, might achieve their effect. They can investigate differences in effects between different population groups. They can identify which outcomes are most important to patients, carers, health professionals and other stakeholders. QES can explore the impact of acceptance, feasibility, meaningfulness and implementation-related factors within a real world setting and thus contribute to the design and further refinement of future interventions. To produce valid, reliable and meaningful QES requires systematic identification of relevant qualitative evidence. Although the methodologies of QES, including methods for information retrieval, are well-documented, little empirical evidence exists to inform their conduct and reporting. This structured methodological overview examines papers on searching for qualitative research identified from the Cochrane Qualitative and Implementation Methods Group Methodology Register and from citation searches of 15 key papers. A single reviewer reviewed 1299 references. Papers reporting methodological guidance, use of innovative methodologies or empirical studies of retrieval methods were categorised under eight topical headings: overviews and methodological guidance, sampling, sources, structured questions, search procedures, search strategies and filters, supplementary strategies and standards. This structured overview presents a contemporaneous view of information retrieval for qualitative research and identifies a future research agenda. This review concludes that poor empirical evidence underpins current information practice in information retrieval of qualitative research. 
A trend towards improved transparency of search methods and further evaluation of key search procedures offers the prospect of rapid development of search methods.
Ultra-Low Background Measurements Of Decayed Aerosol Filters
NASA Astrophysics Data System (ADS)
Miley, H.
2009-04-01
To experimentally evaluate the opportunity to apply ultra-low background measurement methods to samples collected, for instance, by the Comprehensive Test Ban Treaty International Monitoring System (IMS), aerosol samples collected on filter media were measured using HPGe spectrometers representing a range of low-background technology approaches. In this way, realistic estimates can be made of the impact of low-background methodology on the Minimum Detectable Activities obtained in systems such as the IMS. The current measurement requirement for stations in the IMS is 30 microBq per cubic meter of air for 140Ba, or about 10^6 fissions per daily sample. Importantly, this is for a fresh aerosol filter. Decay times varying from 3 days to one week reduce the intrinsic background from radon daughters in the sample. Computational estimates for these decayed filters indicate that underground-based HPGe in clean shielding materials can reduce backgrounds by orders of magnitude, even when the decay of the isotopes of interest is included.
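The benefit of decaying a filter before counting follows from the half-lives involved: short-lived radon/thoron daughters (e.g., 212Pb, half-life 10.64 h) die away far faster than the 140Ba analyte (half-life 12.75 d). A minimal illustration:

```python
import math

def remaining(t_hours, half_life_hours):
    """Fraction of activity left after decay time t."""
    return math.exp(-math.log(2) * t_hours / half_life_hours)

T_PB212 = 10.64        # h, thoron daughter 212Pb, a dominant short-lived filter background
T_BA140 = 12.75 * 24   # h, 140Ba, the fission product of interest

for days in (3, 7):
    t = days * 24
    print(f"{days} d decay: daughter background x{remaining(t, T_PB212):.1e}, "
          f"140Ba retained x{remaining(t, T_BA140):.2f}")
```

After three days, the 212Pb background has dropped by roughly two orders of magnitude while most of the 140Ba remains, which is why decayed filters pair well with ultra-low background counting.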
Diabetes and end of life: ethical and methodological issues in gathering evidence to guide care.
Dunning, Trisha; Duggan, Nicole; Savage, Sally; Martin, Peter
2013-03-01
Providing palliative care for people with diabetes at the end of life is part of the chronic disease care trajectory, but end-of-life care is complex and the presence of diabetes further complicates management. The aim of this paper is to discuss the ethical and methodological issues encountered when undertaking research to develop guidelines for managing diabetes at the end of life, and the strategies used to address them. The issues emerged as we developed guidelines for managing diabetes at the end of life, which included conducting individual interviews with 14 people with diabetes requiring palliative care and 10 family members. A reflexive researcher journal was maintained throughout the guideline development process. The interview transcripts and researcher's journal were analysed to determine key methodological, ethical and researcher-related issues. A key theme was the vulnerability of the sample population; methodological issues included recruiting participants and ensuring rigour; ethical issues concerned benefit and risk, justice, autonomy, privacy, professional boundaries and informed consent; and researcher-related issues included managing participants' distress and the researchers' own emotional distress. People were willing to discuss end-of-life diabetes management preferences. Undertaking research with people at the end of life is complex because of their vulnerability and the ethical issues involved. However, the ethical principles of autonomy and justice apply, and people should be given the relevant information and the opportunity to decide whether or not to participate. © 2012 The Authors. Scandinavian Journal of Caring Sciences © 2012 Nordic College of Caring Science.
Parker, R David; Regier, Michael; Brown, Zachary; Davis, Stephen
2015-02-01
Homelessness is a primary concern for community health. The scientific literature on homelessness is wide-ranging and diverse. One opportunity to add to the existing literature is the development and testing of affordable, easily implemented methods for measuring the impact of homelessness on the healthcare system. Such methodological approaches rely on the strengths of a multidisciplinary approach, including providers of both healthcare and homeless services and applied clinical researchers. This paper is a proof of concept for a methodology that is easily adaptable nationwide, given the mandated implementation of homeless management information systems in the United States and other countries, the medical billing systems used by hospitals, and standard research methods. Adaptation is independent of geographic region, budget restraints, specific agency skill sets, and many other factors that affect the application of a consistent, methodological, science-based approach to assessing and addressing homelessness. We conducted a secondary data analysis merging homeless service utilization data with hospital case-based data. These data detailed care utilization among homeless persons in a small Appalachian city in the United States. In our sample of 269 persons who received at least one hospital-based service and one homeless service between July 1, 2012 and June 30, 2013, the total billed costs were $5,979,463, with 10 people accounting for more than one-third ($1,957,469) of the total. Those persons were primarily men, living in an emergency shelter, with pre-existing disabling conditions. We theorize that targeted services, including Housing First, would be an effective intervention. This is proposed in a future study.
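The data-merging step at the heart of the method, linking homeless-management-information-system records to hospital billing by a shared client identifier, can be sketched as a simple keyed join; all field names and amounts below are invented:

```python
# Minimal sketch of the record linkage behind the cost analysis: join
# homeless-service records to hospital billing records on a shared client id.
hmis = {101: {"shelter": "emergency"}, 102: {"shelter": "transitional"}}
billing = [(101, 1200.0), (101, 450.0), (103, 300.0)]  # (client id, billed amount)

costs = {}
for client, amount in billing:
    if client in hmis:  # keep only clients present in both systems
        costs[client] = costs.get(client, 0.0) + amount

print(costs)            # total billed cost per linked client
```

Summing and ranking these per-client totals is what surfaces the small group of high-cost utilizers described in the abstract.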
Olivo, Susan Armijo; Bravo, Jaime; Magee, David J; Thie, Norman M R; Major, Paul W; Flores-Mir, Carlos
2006-01-01
To carry out a systematic review to assess the evidence concerning the association between head and cervical posture and temporomandibular disorders (TMD). A search of Medline, Pubmed, Embase, Web of Science, Lilacs, and Cochrane Library databases was conducted in all languages with the help of a health sciences librarian. Key words used in the search were posture, head posture, cervical spine or neck, vertebrae, cervical lordosis, craniomandibular disorders or temporomandibular disorders, temporomandibular disorders, and orofacial pain or facial pain. Abstracts which appeared to fulfill the initial selection criteria were selected by consensus. The original articles were retrieved and evaluated to ensure they met the inclusion criteria. A methodological checklist was used to evaluate the quality of the selected articles and their references were hand-searched for possible missing articles. Twelve studies met all inclusion criteria and were analyzed in detail for their methodology and information quality. Nine articles that analyzed the association between head posture and TMD included patients with mixed TMD diagnosis; 1 article differentiated among muscular, articular, and mixed symptomatology; and 3 articles analyzed information from patients with only articular problems. Finally, 2 studies evaluated the association between head posture and TMD in patients with muscular TMD. Several methodological defects were noted in the 12 studies. Since most of the studies included in this systematic review were of poor methodological quality, the findings of the studies should be interpreted with caution. The association between intra-articular and muscular TMD and head and cervical posture is still unclear, and better controlled studies with comprehensive TMD diagnoses, greater sample sizes, and objective posture evaluation are necessary.
Prediction and standard error estimation for a finite universe total when a stratum is not sampled
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wright, T.
1994-01-01
In the context of a universe of trucks operating in the United States in 1990, this paper presents statistical methodology for estimating a finite universe total on a second occasion when part of the universe is sampled and the remainder is not. Prediction is used to compensate for the lack of data from the unsampled portion of the universe. The sample is assumed to be a subsample of an earlier sample, where stratification is used on both occasions before sample selection. Accounting for births and deaths in the universe between the two points in time, the detailed sampling plan, estimator, standard error, and optimal sample allocation are presented with a focus on the second occasion. If prior auxiliary information is available, the methodology is also applicable to a first occasion.
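The estimator's structure, an expansion estimate over the sampled strata plus a model prediction for the unsampled stratum, can be sketched as follows; the stratum sizes, means, and growth ratio are invented, not the truck survey's values:

```python
# Stratified expansion estimate with one unsampled stratum predicted from
# prior-occasion data. All numbers are illustrative.
sampled = {            # stratum: (N_h, sample mean of y on the current occasion)
    "light": (5000, 12.0),
    "medium": (1200, 45.0),
}
unsampled_N = 300      # stratum size on the current occasion
prior_mean = 150.0     # stratum mean observed on the first occasion
growth = 1.04          # assumed change ratio, e.g. estimated from the sampled strata

estimate = sum(N * ybar for N, ybar in sampled.values())  # design-based expansion part
estimate += unsampled_N * prior_mean * growth             # model-based prediction part
print(f"predicted universe total: {estimate:,.0f}")
```

A standard error for the full estimate would combine the sampling variance of the expansion part with the prediction variance of the modeled stratum.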
Lami, Francesca; Egberts, Kristine; Ure, Alexandra; Conroy, Rowena; Williams, Katrina
2018-03-01
To systematically review the measurement properties of instruments assessing participation in young people with autism spectrum disorder (ASD). A search was performed in MEDLINE, PsycINFO, and PubMed combining three constructs ('ASD', 'test of participation', 'measurement properties'). Results were restricted to articles including people aged 6 to 29 years. The 2539 identified articles were independently screened by two reviewers. For the included articles, data were extracted using standard forms and their risk of bias was assessed. Nine studies (8 cross-sectional) met the inclusion criteria, providing information on seven different instruments. The total sample included 634 participants, with sex available for 600 (males=494; females=106) and age available for 570; the mean age for these participants was 140.58 months (SD=9.11; range=36-624). Included instruments were the School Function Assessment, Vocational Index, Children's Assessment of Participation and Enjoyment/Preferences for Activities of Children, Experience Sampling Method, Pediatric Evaluation of Disability Inventory Computer Adaptive Test, Adolescent and Young Adult Activity Card Sort, and Patient-Reported Outcomes Measurement Information System parent-proxy peer relationships. Seven studies assessed reliability and validity; good properties were reported for half of the instruments considered. Most studies (n=6) had high risk of bias. Overall the quality of the evidence for each tool was limited. Validation of these instruments, or of others that comprehensively assess participation, is needed. Future studies should follow recommended methodological standards. Seven instruments have been used to assess participation in young people with autism. One instrument, with excellent measurement properties in one study, does not comprehensively assess participation. Studies of three instruments that incorporate a more comprehensive assessment of participation have methodological limitations.
Overall, limited evidence exists regarding measurement properties of participation assessments for young people with autism. © 2017 Mac Keith Press.
Comparing EQ-5D valuation studies: a systematic review and methodological reporting checklist.
Xie, Feng; Gaebel, Kathryn; Perampaladas, Kuhan; Doble, Brett; Pullenayegum, Eleanor
2014-01-01
There has been a growing interest around the world in developing country-specific scoring algorithms for the EQ-5D. This study systematically reviews all existing EQ-5D valuation studies to highlight their strengths and limitations, explores heterogeneity in observed utilities using meta-regression, and proposes a methodological checklist for reporting EQ-5D valuation studies. We searched Medline, EMBASE, the National Health Service Economic Evaluation Database (NHS EED) via Wiley's Cochrane Library, and Wiley's Health Economic Evaluation Database from inception through November 2012, as well as bibliographies of key papers and the EuroQol Plenary Meeting Proceedings from 1991 to 2012, for English-language reports of EQ-5D valuation studies. Two reviewers independently screened the titles and abstracts for relevance. Three reviewers performed data extraction and compared the characteristics and scoring algorithms developed in the included valuation studies. Of the 31 studies included in the review, 19 used the time trade-off (TTO) technique, 10 used the visual analogue scale (VAS) technique, and 2 used both TTO and VAS. Most studies included respondents from the general population selected by random or quota sampling and used face-to-face interviews or postal surveys. Studies valued between 7 and 198 total states, with 1-23 states valued per respondent. Different model specifications have been proposed for scoring. Some sample or demographic factors, including gender, education, percentage urban population, and national health care expenditure, were associated with differences in observed utilities for moderate or severe health states. EQ-5D valuation studies conducted to date have varied widely in their design and in the resulting scoring algorithms. Therefore, we propose the Checklist for Reporting Valuation Studies of the EQ-5D (CREATE) for those conducting valuation studies.
Thompson, Laura A.; Spoon, Tracey R.; Goertz, Caroline E. C.; Hobbs, Roderick C.; Romano, Tracy A.
2014-01-01
Non-invasive sampling techniques are increasingly being used to monitor glucocorticoids, such as cortisol, as indicators of stressor load and fitness in zoo and wildlife conservation, research and medicine. For cetaceans, exhaled breath condensate (blow) provides a unique sampling matrix for such purposes. The purpose of this work was to develop an appropriate collection methodology and validate the use of a commercially available EIA for measuring cortisol in blow samples collected from belugas (Delphinapterus leucas). Nitex membrane stretched over a petri dish provided the optimal method for collecting blow. A commercially available EIA for measuring human cortisol (detection limit 35 pg ml−1) was adapted and validated for beluga cortisol using tests of parallelism, accuracy and recovery. Blow samples were collected from aquarium belugas during monthly health checks and during out-of-water examination, as well as from wild belugas. Two aquarium belugas showed increased blow cortisol between baseline samples and 30 minutes out of water (Baseline, 0.21 and 0.04 µg dl−1; 30 minutes, 0.95 and 0.14 µg dl−1). Six wild belugas also showed increases in blow cortisol from before to after a 1.5 hour examination (Pre 0.03, 0.23, 0.13, 0.19, 0.13, 0.04 µg dl−1; Post 0.60, 0.31, 0.36, 0.24, 0.14, 0.16 µg dl−1). Though this methodology needs further investigation, this study suggests that blow sampling is a good candidate for non-invasive monitoring of cortisol in belugas. Blow can be collected from both wild and aquarium animals efficiently for the purposes of health monitoring and research, and may ultimately be useful in obtaining data on wild populations, including endangered species, which are difficult to handle directly. PMID:25464121
Fernández, Elena; Vidal, Lorena; Martín-Yerga, Daniel; Blanco, María del Carmen; Canals, Antonio; Costa-García, Agustín
2015-04-01
A novel approach is presented, whereby gold nanostructured screen-printed carbon electrodes (SPCnAuEs) are combined with in-situ ionic liquid formation dispersive liquid-liquid microextraction (in-situ IL-DLLME) and microvolume back-extraction for the determination of mercury in water samples. In-situ IL-DLLME is based on a simple metathesis reaction between a water-miscible IL and a salt to form a water-immiscible IL in the sample solution. The mercury complex with ammonium pyrrolidinedithiocarbamate is extracted from the sample solution into the water-immiscible IL formed in-situ. Then, an ultrasound-assisted procedure is employed to back-extract the mercury into 10 µL of a 4 M HCl aqueous solution, which is finally analyzed using SPCnAuEs. The sample preparation methodology was optimized using a multivariate optimization strategy. Under optimized conditions, a linear range between 0.5 and 10 µg L(-1) was obtained with a correlation coefficient of 0.997 for six calibration points. The limit of detection obtained was 0.2 µg L(-1), which is lower than the threshold values established by the Environmental Protection Agency and the European Union (i.e., 2 µg L(-1) and 1 µg L(-1), respectively). The repeatability of the proposed method was evaluated at two different spiking levels (3 and 10 µg L(-1)), and a coefficient of variation of 13% was obtained in both cases. The performance of the proposed methodology was evaluated in real-world water samples including tap water, bottled water, river water and industrial wastewater. Relative recoveries between 95% and 108% were obtained. Copyright © 2014 Elsevier B.V. All rights reserved.
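The figures of merit reported here (calibration linearity over 0.5-10 µg L(-1), relative recovery in spiked samples) follow from a standard external-calibration workflow. A minimal sketch of that arithmetic, with made-up signal values rather than the study's data:

```python
import numpy as np

# Hypothetical external calibration for Hg(II): spiked concentration (ug/L)
# vs. instrument signal at the electrode. All values are illustrative only.
conc = np.array([0.5, 1.0, 2.5, 5.0, 7.5, 10.0])
signal = np.array([0.9, 1.8, 4.6, 9.1, 13.4, 18.0])

# Least-squares fit: signal = slope * conc + intercept
slope, intercept = np.polyfit(conc, signal, 1)
r = np.corrcoef(conc, signal)[0, 1]  # correlation coefficient of the fit

# Back-calculate a spiked real-water sample (nominal spike 3 ug/L) from its
# signal, then express relative recovery as found/nominal * 100.
sample_signal = 5.2
found = (sample_signal - intercept) / slope
recovery_pct = found / 3.0 * 100.0

print(f"slope={slope:.3f}, r={r:.4f}, recovery={recovery_pct:.1f}%")
```

The correlation coefficient checks linearity; relative recovery compares the back-calculated concentration of a spiked sample against its nominal spike level.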
McDade, Thomas W; Williams, Sharon; Snodgrass, J Josh
2007-11-01
Logistical constraints associated with the collection and analysis of biological samples in community-based settings have been a significant impediment to integrative, multilevel bio-demographic and biobehavioral research. However, recent methodological developments have overcome many of these constraints and have also expanded the options for incorporating biomarkers into population-based health research in international as well as domestic contexts. In particular, dried blood spot (DBS) samples (drops of whole blood collected on filter paper from a simple finger prick) provide a minimally invasive method for collecting blood samples in nonclinical settings. After a brief discussion of biomarkers more generally, we review procedures for collecting, handling, and analyzing DBS samples. Advantages of DBS samples compared with venipuncture include the relative ease and low cost of sample collection, transport, and storage. Disadvantages include requirements for assay development and validation, as well as the relatively small sample volumes. We present the results of a comprehensive literature review of published protocols for analysis of DBS samples, and we provide more detailed analysis of protocols for 45 analytes likely to be of particular relevance to population-level health research. Our objective is to provide investigators with the information they need to make informed decisions regarding the appropriateness of blood spot methods for their research interests.
Impact of Processing Method on Recovery of Bacteria from Wipes Used in Biological Surface Sampling
Olson, Nathan D.; Filliben, James J.; Morrow, Jayne B.
2012-01-01
Environmental sampling for microbiological contaminants is a key component of hygiene monitoring and risk characterization practices utilized across diverse fields of application. However, confidence in surface sampling results, both in the field and in controlled laboratory studies, has been undermined by large variation in sampling performance results. Sources of variation include controlled parameters, such as sampling materials and processing methods, which often differ among studies, as well as random and systematic errors; however, the relative contributions of these factors remain unclear. The objective of this study was to determine the relative impacts of sample processing methods, including extraction solution and physical dissociation method (vortexing and sonication), on recovery of Gram-positive (Bacillus cereus) and Gram-negative (Burkholderia thailandensis and Escherichia coli) bacteria from directly inoculated wipes. This work showed that the target organism had the largest impact on extraction efficiency and recovery precision, as measured by traditional colony counts. The physical dissociation method (PDM) had negligible impact, while the effect of the extraction solution was organism dependent. Overall, however, extraction of organisms from wipes using phosphate-buffered saline with 0.04% Tween 80 (PBST) resulted in the highest mean recovery across all three organisms. The results from this study contribute to a better understanding of the factors that influence sampling performance, which is critical to the development of efficient and reliable sampling methodologies relevant to public health and biodefense. PMID:22706055
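Recovery and precision figures of the kind reported above reduce to simple arithmetic on colony counts: percent recovery per replicate wipe, then the mean and coefficient of variation across replicates. A sketch with hypothetical CFU values (none taken from the study):

```python
import statistics

# Hypothetical colony counts (CFU) recovered from replicate wipes, each
# directly inoculated with a nominal 1.0e4 CFU; values are illustrative only.
inoculum_cfu = 1.0e4
replicate_cfu = [7200, 6800, 7900, 7400, 6500]

# Percent recovery for each replicate wipe.
recoveries = [100.0 * cfu / inoculum_cfu for cfu in replicate_cfu]
mean_recovery = statistics.mean(recoveries)
# Coefficient of variation (%) summarizes recovery precision across replicates.
cv_pct = 100.0 * statistics.stdev(recoveries) / mean_recovery

print(f"mean recovery = {mean_recovery:.1f}%, CV = {cv_pct:.1f}%")
```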
Methods development for total organic carbon accountability
NASA Technical Reports Server (NTRS)
Benson, Brian L.; Kilgore, Melvin V., Jr.
1991-01-01
This report describes the efforts completed during the contract period beginning November 1, 1990 and ending April 30, 1991. Samples of product hygiene and potable water from WRT 3A were supplied by NASA/MSFC prior to contract award on July 24, 1990. Humidity condensate samples were supplied on August 3, 1990. During the course of this contract, chemical analyses were performed on these samples to qualitatively determine the specific components comprising the measured organic carbon concentration. In addition, these samples and known standard solutions were used to identify and develop methodology useful to future comprehensive characterization of similar samples. Standard analyses including pH, conductivity, and total organic carbon (TOC) were conducted. Colorimetric and enzyme-linked assays for total protein, bile acid, B-hydroxybutyric acid, methylene blue active substances (MBAS), urea nitrogen, ammonia, and glucose were also performed. Gas chromatographic procedures for non-volatile fatty acids and EPA priority pollutants were also used. Liquid chromatography was used to screen for non-volatile, water-soluble compounds not amenable to GC techniques. Methods development efforts were initiated to separate and quantitate certain chemical classes not classically analyzed in water and wastewater samples, including carbohydrates, organic acids, and amino acids. Finally, efforts were initiated to identify useful concentration techniques to enhance detection limits and recovery of non-volatile, water-soluble compounds.
Clark, J. L.; Konda, K. A.; Silva-Santisteban, A.; Peinado, J.; Lama, J. R.; Kusunoki, L.; Perez-Brumer, A.; Pun, M.; Cabello, R.; Sebastian, J. L.; Suarez-Ognio, L.; Sanchez, J.
2014-01-01
Alternatives to convenience sampling (CS) are needed for HIV/STI surveillance of most-at-risk populations in Latin America. We compared CS, time space sampling (TSS), and respondent driven sampling (RDS) for recruitment of men who have sex with men (MSM) and transgender women (TW) in Lima, Peru. During concurrent 60-day periods from June–August, 2011, we recruited MSM/TW for epidemiologic surveillance using CS, TSS, and RDS. A total of 748 participants were recruited through CS, 233 through TSS, and 127 through RDS. The TSS sample included the largest proportion of TW (30.7 %) and the lowest percentage of subjects who had previously participated in HIV/STI research (14.9 %). The prevalence of newly diagnosed HIV infection, according to participants’ self-reported previous HIV diagnosis, was highest among TSS recruits (17.9 %) compared with RDS (12.6 %) and CS (10.2 %). TSS identified diverse populations of MSM/TW with higher prevalences of HIV/STIs not accessed by other methods. PMID:24362754
Porras, Mauricio A; Villar, Marcelo A; Cubitto, María A
2018-05-01
The presence of intracellular polyhydroxyalkanoates (PHAs) is usually studied using Sudan black dye solution (SB). In a previous work it was shown that PHA could be directly quantified using the absorbance of SB fixed by PHA granules in wet cell samples. In the present paper, the optimum SB amount and the optimum conditions to be used for SB assays were determined following an experimental design by hybrid response surface methodology and a desirability function. In addition, a new methodology was developed in which it is shown that the amount of SB fixed by PHA granules can also be determined indirectly through the absorbance of the supernatant obtained from the stained cell samples. This alternative methodology allows a faster determination of the PHA content (involving 23 and 42 min for indirect and direct determinations, respectively), and can be undertaken by means of basic laboratory equipment and reagents. The correlation between PHA content in wet cell samples and the spectra of the SB-stained supernatant was determined by means of multivariate and linear regression analysis. The best calibration adjustment (R² = 0.91, RSE = 1.56%) and the good PHA prediction obtained (RSE = 1.81%) show that the proposed methodology constitutes a reasonably precise way for PHA content determination. Thus, this methodology could anticipate the probable results of the direct PHA determination mentioned above. Compared with the most used techniques described in the scientific literature, the combined implementation of these two methodologies seems to be one of the most economical and environmentally friendly options, suitable for rapid monitoring of the intracellular PHA content. Copyright © 2018 Elsevier B.V. All rights reserved.
A methodology to assess performance of human-robotic systems in achievement of collective tasks
NASA Technical Reports Server (NTRS)
Howard, Ayanna M.
2005-01-01
In this paper, we present a methodology to assess system performance of human-robotic systems in achievement of collective tasks such as habitat construction, geological sampling, and space exploration.
Giménez, Estela; Balmaña, Meritxell; Figueras, Joan; Fort, Esther; de Bolós, Carme; Sanz-Nebot, Victòria; Peracaula, Rosa; Rizzi, Andreas
2015-03-25
In this work we demonstrate the potential of glycan reductive isotope labeling (GRIL) using [(12)C]- and [(13)C]-coded aniline and zwitterionic hydrophilic interaction capillary liquid chromatography electrospray mass spectrometry (μZIC-HILIC-ESI-MS) for relative quantitation of glycosylation variants in selected glycoproteins present in samples from cancer patients. Human α1-acid-glycoprotein (hAGP) is an acute-phase serum glycoprotein whose glycosylation has been described to be altered in cancer and chronic inflammation. However, it is not yet clear whether some particular glycans in hAGP can be used as biomarkers for differentiating between these two pathologies. In this work, hAGP was isolated by immunoaffinity chromatography (IAC) from serum samples of healthy individuals and from those suffering chronic pancreatitis and different stages of pancreatic cancer, respectively. After de-N-glycosylation, relative quantitation of the hAGP glycans was carried out using stable isotope labeling and μZIC-HILIC-ESI-MS analysis. First, protein denaturing conditions prior to PNGase F digestion were optimized to achieve quantitative digestion yields, and the reproducibility of the established methodology was evaluated with standard hAGP. Then, the proposed method was applied to the analysis of the clinical samples (control vs. pathological). Pancreatic cancer samples clearly showed an increase in the abundance of fucosylated glycans as the stage of the disease increased, unlike samples from chronic pancreatitis. The results gained here indicate this glycan feature in hAGP as a candidate structure worth corroborating in an extended study including more clinical cases, especially those with chronic pancreatitis and initial stages of pancreatic cancer.
Importantly, the results demonstrate that the presented methodology, combining enrichment of a target protein by IAC with isotope-coded relative quantitation of N-glycans, can be successfully used for targeted glycomics studies. The methodology is assumed to be suitable as well for other such studies aimed at finding novel cancer-associated glycoprotein biomarkers. Copyright © 2015 Elsevier B.V. All rights reserved.
Gill, Emily L; Marks, Megan; Yost, Richard A; Vedam-Mai, Vinata; Garrett, Timothy J
2017-12-19
Liquid-microjunction surface sampling (LMJ-SS) is an ambient ionization technique based on a continuous flow of solvent through an in situ microextraction probe, which draws in analytes for ionization by an electrospray ionization source. Unlike traditional mass spectrometry (MS) techniques, it operates under ambient pressure and requires no sample preparation, making it well suited to rapid sampling of the thicker tissue sections used in electrophysiological and other neuroscientific research. Studies interrogating neural synapses, or a specific neural circuit, typically employ thick, ex vivo tissue sections maintained under near-physiological conditions to preserve tissue viability and maintain the neural networks. Deep brain stimulation (DBS) is a surgical procedure used to treat the neurological symptoms associated with certain neurodegenerative and neuropsychiatric diseases, and is commonly used to treat Parkinson's disease (PD), a neurological disorder characterized by the degeneration of dopaminergic neurons in the substantia nigra pars compacta. Here, we demonstrate that the LMJ-SS methodology can provide a platform for ex vivo analysis of the brain during electrical stimulation, such as DBS. We employ LMJ-SS in the ex vivo analysis of mouse brain tissue for monitoring dopamine during electrical stimulation of the striatum region. The mouse brain tissue was sectioned fresh post-sacrifice and maintained in artificial cerebrospinal fluid to create near-physiological conditions before direct sampling using LMJ-SS. A selection of metabolites, including time-sensitive metabolites involved in energy regulation in the brain, was identified using standards, and the mass spectral database mzCloud was used to assess the feasibility of the methodology.
Thereafter, the intensity of m/z 154 corresponding to protonated dopamine was monitored before and after electrical stimulation of the striatum region, showing an increase in signal directly following a stimulation event. Dopamine is the key neurotransmitter implicated in PD, and although electrochemical detectors have shown such increases in dopamine post-DBS, this is the first study to do so using MS methodologies.
A Validated Methodology for Genetic Identification of Tuna Species (Genus Thunnus)
Viñas, Jordi; Tudela, Sergi
2009-01-01
Background Tuna species of the genus Thunnus, such as the bluefin tunas, are some of the most important and yet most endangered trade fish in the world. Identification of these species in traded forms, however, may be difficult depending on the presentation of the products, which may hamper conservation efforts on trade control. In this paper, we validated a genetic methodology that can fully distinguish between the eight Thunnus species from any kind of processed tissue. Methodology After testing several genetic markers, a complete discrimination of the eight tuna species was achieved using Forensically Informative Nucleotide Sequencing based primarily on the sequence variability of the hypervariable genetic marker mitochondrial DNA control region (mtDNA CR), followed, in some specific cases, by a second validation by a nuclear marker, the rDNA first internal transcribed spacer (ITS1). This methodology was able to distinguish all tuna species, including those belonging to the subgenus Neothunnus, which are very closely related and consequently cannot be differentiated with other genetic markers of lower variability. This methodology also took into consideration the presence of introgression that has been reported in past studies between T. thynnus, T. orientalis and T. alalunga. Finally, we applied the methodology to cross-check the species identity of 26 processed tuna samples. Conclusions Using the combination of two genetic markers, one mitochondrial and one nuclear, allows a full discrimination between all eight tuna species. Unexpectedly, the genetic marker traditionally used for DNA barcoding, cytochrome oxidase 1, could not differentiate all species; thus its use as a genetic marker for tuna species identification is questioned. PMID:19898615
Białk-Bielińska, Anna; Kumirska, Jolanta; Borecka, Marta; Caban, Magda; Paszkiewicz, Monika; Pazdro, Ksenia; Stepnowski, Piotr
2016-03-20
Recent developments and improvements in advanced instruments and analytical methodologies have made the detection of pharmaceuticals at low concentration levels in different environmental matrices possible. As a result of these advances, over the last 15 years residues of these compounds and their metabolites have been detected in different environmental compartments and pharmaceuticals have now become recognized as so-called 'emerging' contaminants. To date, a lot of papers have been published presenting the development of analytical methodologies for the determination of pharmaceuticals in aqueous and solid environmental samples. Many papers have also been published on the application of the new methodologies, mainly to the assessment of the environmental fate of pharmaceuticals. Although impressive improvements have undoubtedly been made, in order to fully understand the behavior of these chemicals in the environment, there are still numerous methodological challenges to be overcome. The aim of this paper therefore, is to present a review of selected recent improvements and challenges in the determination of pharmaceuticals in environmental samples. Special attention has been paid to the strategies used and the current challenges (also in terms of Green Analytical Chemistry) that exist in the analysis of these chemicals in soils, marine environments and drinking waters. There is a particular focus on the applicability of modern sorbents such as carbon nanotubes (CNTs) in sample preparation techniques, to overcome some of the problems that exist in the analysis of pharmaceuticals in different environmental samples. Copyright © 2016 Elsevier B.V. All rights reserved.
Sequencing CYP2D6 for the detection of poor-metabolizers in post-mortem blood samples with tramadol.
Fonseca, Suzana; Amorim, António; Costa, Heloísa Afonso; Franco, João; Porto, Maria João; Santos, Jorge Costa; Dias, Mário
2016-08-01
Tramadol concentrations and analgesic effect are dependent on CYP2D6 enzymatic activity. It is well known that some genetic polymorphisms are responsible for the variability in the expression of this enzyme and in the individual drug response. The detection of allelic variants described as non-functional can be useful to explain some circumstances of death in the study of post-mortem cases with tramadol. A Sanger sequencing methodology was developed for the detection of genetic variants that cause absent or reduced CYP2D6 activity, such as the *3, *4, *6, *8, *10 and *12 alleles. This methodology, as well as the GC/MS method for the detection and quantification of tramadol and its main metabolites in blood samples, was fully validated in accordance with international guidelines. Both methodologies were successfully applied to 100 post-mortem blood samples, and the relation between toxicological and genetic results was evaluated. Tramadol metabolism, expressed as its metabolite concentration ratio (N-desmethyltramadol/O-desmethyltramadol), has been shown to be correlated with the poor-metabolizer phenotype based on genetic characterization. The importance of identifying enzyme inhibitors in toxicological analysis was also demonstrated. To our knowledge, this is the first study in which a CYP2D6 sequencing methodology has been validated and applied to post-mortem samples in Portugal. The developed methodology allows the data collection of post-mortem cases, which is of primordial importance to enhance the application of these genetic tools to forensic toxicology and pathology. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Recent approaches for enhancing sensitivity in enantioseparations by CE.
Sánchez-Hernández, Laura; García-Ruiz, Carmen; Luisa Marina, María; Luis Crego, Antonio
2010-01-01
This article reviews the latest methodological and instrumental improvements for enhancing sensitivity in chiral analysis by CE. The review covers literature from March 2007 until May 2009, that is, the works published after the appearance of the latest review article on the same topic by Sánchez-Hernández et al. [Electrophoresis 2008, 29, 237-251]. Off-line and on-line sample treatment techniques, on-line sample preconcentration strategies based on electrophoretic and chromatographic principles, and alternative detection systems to the widely employed UV/Vis detection in CE are the most relevant approaches discussed for improving sensitivity. Microchip technologies are also included since they can open up great possibilities to achieve sensitive and fast enantiomeric separations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Poore III, Willis P; Belles, Randy; Mays, Gary T
This report summarizes the approach that ORNL developed for screening a sample set of US Department of Defense (DOD) military base sites and DOE sites for possible powering with a small modular reactor (SMR); the methodology employed, including spatial modeling; and initial results for several sample sites. The objective in conducting this type of siting evaluation is to demonstrate the capability to characterize specific DOD and DOE sites to identify any particular issues associated with powering the sites with an SMR using OR-SAGE; it is not intended to be a definitive assessment per se as to the absolute suitability of any particular site.
A land manager's guide to point counts of birds in the Southeast
Hamel, P.B.; Smith, W.P.; Twedt, D.J.; Woehr, J.R.; Morris, E.; Hamilton, R.B.; Cooper, R.J.
1996-01-01
Current widespread concern for the status of neotropical migratory birds has sparked interest in techniques for inventorying and monitoring populations of these and other birds in southeastern forest habitats. The present guide gives detailed instructions for conducting point counts of birds. It further presents a detailed methodology for the design and conduct of inventorial and monitoring surveys based on point counts, including discussion of sample size determination, distribution of counts among habitats, cooperation among neighboring land managers, vegetation sampling, standard data format, and other topics. Appendices provide additional information, making this guide a stand-alone text for managers interested in developing inventories of bird populations on their lands.
Quality and methodological challenges in Internet-based mental health trials.
Ye, Xibiao; Bapuji, Sunita Bayyavarapu; Winters, Shannon; Metge, Colleen; Raynard, Mellissa
2014-08-01
To review the quality of Internet-based mental health intervention studies and their methodological challenges. We searched multiple literature databases to identify relevant studies according to the Population, Interventions, Comparators, Outcomes, and Study Design framework. Two reviewers independently assessed selection bias, allocation bias, confounding bias, blinding, data collection methods, and withdrawals/dropouts, using the Quality Assessment Tool for Quantitative Studies. We rated each component as strong, moderate, or weak and assigned a global rating (strong, moderate, or weak) to each study. We discussed methodological issues related to study quality. Of 122 studies included, 31 (25%), 44 (36%), and 47 (39%) were rated strong, moderate, and weak, respectively. Only five studies were rated strong for all six quality components (three of them were published by the same group). Lack of blinding, selection bias, and low adherence were the top three challenges in Internet-based mental health intervention studies. The overall quality of Internet-based mental health intervention studies needs to improve. In particular, studies need to improve sample selection, intervention allocation, and blinding.
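The roll-up from component ratings to a global study rating in the Quality Assessment Tool for Quantitative Studies (EPHPP) is commonly described as: no weak components gives a strong global rating, exactly one weak gives moderate, and two or more give weak. A sketch assuming that convention (the rule here is my paraphrase, not quoted from the tool):

```python
def global_rating(components):
    """Roll six component ratings ('strong'/'moderate'/'weak') into a
    global study rating, assuming the common EPHPP convention:
    no weak components -> strong; one weak -> moderate; two+ -> weak."""
    weak_count = sum(1 for c in components if c == "weak")
    if weak_count == 0:
        return "strong"
    if weak_count == 1:
        return "moderate"
    return "weak"

# Example: a study weak only in blinding is rated moderate overall.
ratings = ["strong", "strong", "moderate", "weak", "strong", "strong"]
print(global_rating(ratings))
```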
Yuen, Hon K; Austin, Sarah L
2014-01-01
We describe the methodological quality of recent studies on instrument development and testing published in the American Journal of Occupational Therapy (AJOT). We conducted a systematic review using the COnsensus-based Standards for the selection of health status Measurement INstruments (COSMIN) checklist to appraise 48 articles on measurement properties of assessments for adults published in AJOT between 2009 and 2013. Most studies had adequate methodological quality in design and statistical analysis. Common methodological limitations included that methods used to examine internal consistency were not consistently linked to the theoretical constructs underpinning assessments; participants in some test-retest reliability studies were not stable during the interim period; and in several studies of reliability and convergent validity, sample sizes were inadequate. AJOT's dissemination of psychometric research evidence has made important contributions to moving the profession toward the American Occupational Therapy Association's Centennial Vision. This study's results provide a benchmark by which to evaluate future accomplishments. Copyright © 2014 by the American Occupational Therapy Association, Inc.
VARIABLE SELECTION FOR REGRESSION MODELS WITH MISSING DATA
Garcia, Ramon I.; Ibrahim, Joseph G.; Zhu, Hongtu
2009-01-01
We consider the variable selection problem for a class of statistical models with missing data, including missing covariate and/or response data. We investigate the smoothly clipped absolute deviation penalty (SCAD) and adaptive LASSO and propose a unified model selection and estimation procedure for use in the presence of missing data. We develop a computationally attractive algorithm for simultaneously optimizing the penalized likelihood function and estimating the penalty parameters. Particularly, we propose to use a model selection criterion, called the ICQ statistic, for selecting the penalty parameters. We show that the variable selection procedure based on ICQ automatically and consistently selects the important covariates and leads to efficient estimates with oracle properties. The methodology is very general and can be applied to numerous situations involving missing data, from covariates missing at random in arbitrary regression models to nonignorably missing longitudinal responses and/or covariates. Simulations are given to demonstrate the methodology and examine the finite sample performance of the variable selection procedures. Melanoma data from a cancer clinical trial is presented to illustrate the proposed methodology. PMID:20336190
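The paper's EM-based penalized-likelihood machinery is involved, but the core shrinkage step behind the adaptive LASSO is easy to illustrate in the special case of an orthonormal design, where the solution is coefficient-wise soft-thresholding of the OLS estimate. A sketch under that simplifying assumption (simulated data, not the melanoma trial, and no missing-data handling):

```python
import numpy as np

rng = np.random.default_rng(0)

# Orthonormal design: with X'X = I the adaptive LASSO has a closed-form
# solution, coefficient-wise soft-thresholding of the OLS estimate.
n, p = 200, 6
X, _ = np.linalg.qr(rng.standard_normal((n, p)))   # columns are orthonormal
beta_true = np.array([3.0, 1.5, 0.0, 0.0, 2.0, 0.0])
y = X @ beta_true + 0.05 * rng.standard_normal(n)

b_ols = X.T @ y                        # OLS estimate since X'X = I
weights = 1.0 / np.abs(b_ols)          # adaptive weights w_j = 1/|b_ols_j|
lam = 0.1
# b_j = sign(b_ols_j) * max(|b_ols_j| - lam * w_j, 0): small, noisy
# coefficients get large weights and are shrunk exactly to zero.
b_alasso = np.sign(b_ols) * np.maximum(np.abs(b_ols) - lam * weights, 0.0)

selected = np.nonzero(b_alasso)[0]
print("selected covariates:", selected.tolist())
```

The coefficient-specific weights are what give the adaptive LASSO its oracle property: large true coefficients are barely penalized, while noise coefficients are driven exactly to zero.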
The EURO-URHIS 2 project in Ho Chi Minh City: contextual adequacy in cross-cultural research.
Steels, Stephanie Linawati
2016-03-01
The European Urban Health Indicators System Project Part 2 (EURO-URHIS 2) is a cross-national study that was implemented in Europe. It consists of four data collection tools that were specifically developed to collect health data at an urban level. This paper reviews some of the methodological constraints in adapting the EURO-URHIS 2 study in Ho Chi Minh City, Vietnam. No attempt to extend the original study beyond Europe has been reported before. Cultural, political, economic and social differences create specific obstacles as well as challenges. This paper sets out how these challenges were addressed, examining key aspects of the methodology, including study design, translation of the questionnaire and data collection. It was found that the EURO-URHIS 2 adult data collection tool methodology could not be replicated in Vietnam. A lack of basic infrastructure and population registers led to significant changes being made to the sampling and survey administration. It was recommended that the Expanded Programme on Immunization (EPI) sampling method be used as the replacement. Despite its limitations, the EPI method's overall strengths and benefits were found to address the methodological issues posed by this resource-poor setting. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
A probabilistic assessment of health risks associated with short-term exposure to tropospheric ozone
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitfield, R.G.; Biller, W.F.; Jusko, M.J.
1996-06-01
The work described in this report is part of a larger risk assessment sponsored by the U.S. Environmental Protection Agency. Earlier efforts developed exposure-response relationships for acute health effects among populations engaged in heavy exertion. Those efforts also developed a probabilistic national ambient air quality standards exposure model and a general methodology for integrating probabilistic exposure-response relationships and exposure estimates to calculate overall risk results. Recently published data make it possible to model additional health endpoints (for exposure at moderate exertion), including hospital admissions. New air quality and exposure estimates for alternative national ambient air quality standards for ozone are combined with exposure-response models to produce the risk results for hospital admissions and acute health effects. Sample results explain the methodology and introduce risk output formats.
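The core operation this report describes — integrating a probabilistic exposure distribution with an exposure-response relationship to get a headcount risk — can be sketched in a few lines. All numbers below are invented for illustration and are not the EPA's estimates:

```python
# Hypothetical exposure distribution: probability that a person's peak
# 1-hour ozone exposure falls in each bin (ppm) under some candidate standard.
exposure_probs = {0.06: 0.70, 0.10: 0.20, 0.14: 0.10}

# Hypothetical exposure-response relationship: probability of the health
# endpoint (e.g. a hospital admission) given exposure in that bin.
response_probs = {0.06: 0.001, 0.10: 0.010, 0.14: 0.050}

population = 1_000_000  # exposed population engaged in moderate exertion

# Overall risk = sum over bins of P(exposure) * P(response | exposure).
per_person_risk = sum(p * response_probs[e] for e, p in exposure_probs.items())
expected_cases = population * per_person_risk
print(f"expected cases: {expected_cases:.0f}")
```

Comparing `expected_cases` across alternative standards (i.e., across alternative `exposure_probs`) is what produces the risk results the abstract mentions.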
Molecular plant breeding: methodology and achievements.
Varshney, Rajeev K; Hoisington, Dave A; Nayak, Spurthi N; Graner, Andreas
2009-01-01
The progress made in DNA marker technology has been remarkable and exciting in recent years. DNA markers have proved valuable tools in various analyses in plant breeding, for example, early generation selection, enrichment of complex F1s, choice of donor parent in backcrossing, recovery of recurrent parent genotype in backcrossing, linkage block analysis and selection. Other main areas of applications of molecular markers in plant breeding include germplasm characterization/fingerprinting, determining seed purity, systematic sampling of germplasm, and phylogenetic analysis. Molecular markers have thus proved powerful tools in replacing bioassays, and there are now many examples available that show the efficacy of such markers. We have illustrated some basic concepts and methodology of applying molecular markers for enhancing the selection efficiency in plant breeding. Some successful examples of product developments of molecular breeding have also been presented.
Leigh, Barbara C.; Stall, Ron
2008-01-01
Recent reports have suggested that the use of alcohol or drugs is related to sexual behavior that is high-risk for HIV infection. If substance use leads to unsafe sexual activity, understanding the dynamics of this relationship can contribute to research, preventive and education efforts to contain the spread of AIDS. In this paper, we review research on the relationship between substance use and high-risk sexual behavior. We then consider the inherent limitations of the research designs used to study this relationship, outline some methodological concerns including measurement and sampling issues, and comment on causal interpretations of correlational research findings. We end with a consideration of potential avenues for future research and a discussion of implications of these findings for current AIDS prevention policies. PMID:8256876
Niccolai, Linda M.; Ogden, Lorraine G.; Muehlenbein, Catherine E.; Dziura, James D.; Vázquez, Marietta; Shapiro, Eugene D.
2007-01-01
Objective: Case-control studies of the effectiveness of a vaccine are useful to answer important questions, such as the effectiveness of a vaccine over time, that usually are not addressed by pre-licensure clinical trials of the vaccine’s efficacy. This report describes methodological issues related to design and analysis that were used to determine the effects of time since vaccination and age at the time of vaccination. Study Design and Setting: A matched case-control study of the effectiveness of varicella vaccine. Results: Sampling procedures and conditional logistic regression models including interaction terms are described. Conclusion: Use of these methods will allow investigators to assess the effects of a wide range of variables, such as time since vaccination and age at the time of vaccination, on the effectiveness of a vaccine. PMID:17938054
ERIC Educational Resources Information Center
Bethel, James; Green, James L.; Nord, Christine; Kalton, Graham; West, Jerry
2005-01-01
This report is Volume 2 of the methodology report that provides information about the development, design, and conduct of the 9-month data collection of the Early Childhood Longitudinal Study, Birth Cohort (ECLS-B). This volume begins with a brief overview of the ECLS-B, but focuses on the sample design, calculation of response rates, development…
Optimizing Clinical Trial Enrollment Methods Through "Goal Programming"
Davis, J.M.; Sandgren, A.J.; Manley, A.R.; Daleo, M.A.; Smith, S.S.
2014-01-01
Introduction: Clinical trials often fail to reach desired goals due to poor recruitment outcomes, including low participant turnout, high recruitment cost, or poor representation of minorities. At present, there is limited literature available to guide recruitment methodology. This study, conducted by researchers at the University of Wisconsin Center for Tobacco Research and Intervention (UW-CTRI), provides an example of how iterative analysis of recruitment data may be used to optimize recruitment outcomes during ongoing recruitment. Study methodology: UW-CTRI’s research team provided a description of methods used to recruit smokers in two randomized trials (n = 196 and n = 175). The trials targeted low socioeconomic status (SES) smokers and involved time-intensive smoking cessation interventions. Primary recruitment goals were to meet required sample size and provide representative diversity while working with limited funds and limited time. Recruitment data was analyzed repeatedly throughout each study to optimize recruitment outcomes. Results: Estimates of recruitment outcomes based on prior studies on smoking cessation suggested that researchers would be able to recruit 240 low SES smokers within 30 months at a cost of $72,000. With employment of methods described herein, researchers were able to recruit 374 low SES smokers over 30 months at a cost of $36,260. Discussion: Each human subjects study presents unique recruitment challenges with time and cost of recruitment dependent on the sample population and study methodology. Nonetheless, researchers may be able to improve recruitment outcomes through iterative analysis of recruitment data and optimization of recruitment methods throughout the recruitment period. PMID:25642125
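The headline numbers in this abstract imply the size of the gain from iterative recruitment analysis: the prior-study projection works out to $300 per enrollee, while the achieved result is roughly $97. A quick check (the raw figures are taken from the abstract; the per-enrollee and per-month rates are derived here):

```python
# Figures from the abstract; derived metrics computed below.
projected = {"n": 240, "months": 30, "cost": 72_000}  # prior-study estimate
achieved  = {"n": 374, "months": 30, "cost": 36_260}  # actual outcome

def cost_per_enrollee(r):
    return r["cost"] / r["n"]

def enrollees_per_month(r):
    return r["n"] / r["months"]

print(cost_per_enrollee(projected))            # 300.0
print(round(cost_per_enrollee(achieved), 2))   # 96.95
print(round(enrollees_per_month(achieved), 1))  # 12.5, vs. 8.0 projected
```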
Persisting mathematics and science high school teachers: A Q-methodology study
NASA Astrophysics Data System (ADS)
Robbins-Lavicka, Michelle M.
There is a lack of qualified mathematics and science teachers at all levels of education in Arkansas. Lasting teaching initiative programs are needed to address retention so qualified teachers remain in the classroom. The dearth of studies regarding why mathematics and science teachers persist in the classroom beyond the traditional 5-year attrition period led this Q-methodological study to evaluate the subjective perceptions of persistent mathematics and science teachers to determine what makes them stay. This study sought to understand what factors persisting mathematics and science teachers used to explain their persistence in the classroom beyond 5 years and what educational factors contributed to persisting mathematics and science teachers. Q-methodology combines qualitative and quantitative techniques and provided a systematic means to investigate personal beliefs by collecting a concourse, developing a Q-sample and a person-sample, conducting a Q-sorting process, and analyzing the data. The results indicated that to encourage longevity within mathematics and science classrooms (a) teachers should remain cognizant of their ability to influence student attitudes toward teaching; (b) administrators should provide support for teachers and emphasize the role and importance of professional development; and (c) policy makers should focus their efforts and resources on developing recruitment plans, including mentorship programs, while providing and improving financial compensation. Significantly, the findings indicate that providing mentorship and role models at every level of mathematics and science education will likely encourage qualified teachers to remain in the mathematics and science classrooms, thus increasing the chance of positive social change.
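The Q-sorting and factor-analysis steps described above can be illustrated with a toy example. The sketch below uses invented Q-sorts (not the study's data): it correlates each pair of participants' sorts, then extracts the leading factor from the person-by-person correlation matrix by power iteration, the simplest stand-in for the factor extraction used in Q-methodology:

```python
import math

def pearson(a, b):
    """Pearson correlation between two Q-sorts."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def leading_factor(R, iters=200):
    """Power iteration for the first eigenvector (factor loadings)."""
    v = [1.0] * len(R)
    for _ in range(iters):
        v = [sum(R[i][j] * v[j] for j in range(len(R))) for i in range(len(R))]
        norm = math.sqrt(sum(x * x for x in v))
        v = [x / norm for x in v]
    return v

# Hypothetical Q-sorts: each teacher ranks 9 statements from -4 to +4.
sorts = [
    [4, 3, 2, 1, 0, -1, -2, -3, -4],   # teacher A
    [3, 4, 1, 2, -1, 0, -2, -4, -3],   # teacher B: similar viewpoint to A
    [-4, -3, -2, -1, 0, 1, 2, 3, 4],   # teacher C: opposing viewpoint
]
R = [[pearson(a, b) for b in sorts] for a in sorts]
loadings = leading_factor(R)
print(loadings)  # A and B load together on the factor; C loads oppositely
```

Teachers loading together on a factor share a viewpoint; interpreting what the high- and low-ranked statements of each factor have in common is the qualitative half of the method.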
Cordeiro, Liliana; Valente, Inês M; Santos, João Rodrigo; Rodrigues, José A
2018-05-01
In this work, an analytical methodology for volatile carbonyl compounds characterization in green and roasted coffee beans was developed. The methodology relied on a recent and simple sample preparation technique, gas diffusion microextraction for extraction of the samples' volatiles, followed by HPLC-DAD-MS/MS analysis. The experimental conditions in terms of extraction temperature and extraction time were studied. A profile for carbonyl compounds was obtained for both arabica and robusta coffee species (green and roasted samples). Twenty-seven carbonyl compounds were identified and further discussed, in light of reported literature, with different coffee characteristics: coffee ageing, organoleptic impact, presence of defective beans, authenticity, human health implications, post-harvest coffee processing and roasting. The applied methodology proved to be a powerful analytical tool for coffee characterization, as it measures marker compounds of different coffee characteristics. Copyright © 2018 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Brown, Robert D.; Gortmaker, Valerie J.
2009-01-01
Methodological and political issues arise during the designing, conducting, and reporting of campus-climate studies for LGBT students. These issues interact; making a decision about a methodological issue (e.g., sample size) has an impact on a political issue (e.g., how well the findings will be received). Ten key questions that must be addressed…
76 FR 62068 - Proposed Data Collections Submitted for Public Comment and Recommendations
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-06
... methodological objectives. The first objective is to test the feasibility of the proposed sampling frame and to... minutes. Results of the methodological component of the feasibility study will be used to assess the...
NASA Technical Reports Server (NTRS)
Baker, T. C. (Principal Investigator)
1982-01-01
A general methodology is presented for estimating a stratum's at-harvest crop acreage proportion for a given crop year (target year) from the crop's estimated acreage proportion for sample segments from within the stratum. Sample segments from crop years other than the target year are (usually) required for use in conjunction with those from the target year. In addition, the stratum's (identifiable) crop acreage proportion may be estimated for times other than at-harvest in some situations. A by-product of the procedure is a methodology for estimating the change in the stratum's at-harvest crop acreage proportion from crop year to crop year. An implementation of the proposed procedure as a Statistical Analysis System (SAS) routine using the system's matrix language module, PROC MATRIX, is described and documented. Three examples illustrating use of the methodology and algorithm are provided.
The Fungal Frontier: A Comparative Analysis of Methods Used in the Study of the Human Gut Mycobiome
Huseyin, Chloe E.; Rubio, Raul Cabrera; O’Sullivan, Orla; Cotter, Paul D.; Scanlan, Pauline D.
2017-01-01
The human gut is host to a diverse range of fungal species, collectively referred to as the gut “mycobiome”. The gut mycobiome is emerging as an area of considerable research interest due to the potential roles of these fungi in human health and disease. However, there is no consensus as to what the best or most suitable methodologies available are with respect to characterizing the human gut mycobiome. The aim of this study is to provide a comparative analysis of several previously published mycobiome-specific culture-dependent and -independent methodologies, including choice of culture media, incubation conditions (aerobic versus anaerobic), DNA extraction method, primer set and freezing of fecal samples to assess their relative merits and suitability for gut mycobiome analysis. There was no significant effect of media type or aeration on culture-dependent results. However, freezing was found to have a significant effect on fungal viability, with significantly lower fungal numbers recovered from frozen samples. DNA extraction method had a significant effect on DNA yield and quality. However, freezing and extraction method did not have any impact on either α or β diversity. There was also considerable variation in the ability of different fungal-specific primer sets to generate PCR products for subsequent sequence analysis. Through this investigation, two DNA extraction methods and one primer set were identified that facilitated the analysis of the mycobiome for all samples in this study. Ultimately, a diverse range of fungal species were recovered using both approaches, with Candida and Saccharomyces identified as the most common fungal species recovered using culture-dependent and culture-independent methods, respectively. As has been apparent from ecological surveys of the bacterial fraction of the gut microbiota, the use of different methodologies can also impact on our understanding of gut mycobiome composition and therefore requires careful consideration.
Future research into the gut mycobiome needs to adopt a common strategy to minimize potentially confounding effects of methodological choice and to facilitate comparative analysis of datasets. PMID:28824566
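The α- and β-diversity comparisons in this record rest on standard community-ecology metrics. A minimal sketch, assuming the Shannon index for α-diversity and Bray-Curtis dissimilarity for β-diversity (common choices, though the paper does not specify its metrics here, and the counts below are invented):

```python
import math

def shannon(counts):
    """Shannon α-diversity index H = -sum p_i * ln(p_i) over nonzero taxa."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

def bray_curtis(a, b):
    """Bray-Curtis β-diversity: 0 = identical communities, 1 = disjoint."""
    shared = sum(min(x, y) for x, y in zip(a, b))
    return 1 - 2 * shared / (sum(a) + sum(b))

# Hypothetical fungal OTU counts for a fresh vs. a frozen aliquot of one sample.
fresh  = [40, 25, 20, 10, 5]   # e.g. Candida, Saccharomyces, ...
frozen = [55, 30, 10, 5, 0]    # lower viable counts, one taxon lost

print(round(shannon(fresh), 3), round(shannon(frozen), 3))
print(round(bray_curtis(fresh, frozen), 3))
```

The study's finding that freezing changed viable counts but not α or β diversity corresponds to these metrics staying stable even when absolute abundances shift.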
Modeling Amorphous Microporous Polymers for CO2 Capture and Separations.
Kupgan, Grit; Abbott, Lauren J; Hart, Kyle E; Colina, Coray M
2018-06-13
This review concentrates on the advances of atomistic molecular simulations to design and evaluate amorphous microporous polymeric materials for CO2 capture and separations. A description of atomistic molecular simulations is provided, including simulation techniques, structural generation approaches, relaxation and equilibration methodologies, and considerations needed for validation of simulated samples. The review provides general guidelines and a comprehensive update of the recent literature (since 2007) to promote the acceleration of the discovery and screening of amorphous microporous polymers for CO2 capture and separation processes.
Computer-aided testing of pilot response to critical in-flight events
NASA Technical Reports Server (NTRS)
Giffin, W. C.; Rockwell, T. H.
1984-01-01
This research on pilot response to critical in-flight events employs a unique methodology including an interactive computer-aided scenario-testing system. Navigation displays, instrument-panel displays, and assorted textual material are presented on a touch-sensitive CRT screen. Problem diagnosis scenarios, destination-diversion scenarios and combined destination/diagnostic tests are available. A complete time history of all data inquiries and responses is maintained. Sample results of diagnosis scenarios obtained from testing 38 licensed pilots are presented and discussed.
Young, Rhea; Camic, Paul M; Tischler, Victoria
2016-01-01
Dementia is a progressive condition, affecting increasing numbers of people, characterised by cognitive decline. The current systematic review aimed to evaluate research pertaining to the impact of arts and health interventions on cognition in people with dementia. A literature search was conducted utilising PsycINFO, Cochrane Reviews, Web of Science, Medline and British Humanities Index databases. Seventeen studies were included in the review, including those related to literary, performing and visual arts. The review highlighted this as an emerging area of research with the literature consisting largely of small-scale studies with methodological limitations including lack of control groups and often poorly defined samples. All the studies suggested, however, that arts-based activities had a positive impact on cognitive processes, in particular on attention, stimulation of memories, enhanced communication and engagement with creative activities. The existent literature suggests that arts activities are helpful interventions within dementia care. A consensus has yet to emerge, however, about the direction for future research including the challenge of measurement and the importance of methodological flexibility. It is suggested that further research address some of these limitations by examining whether the impact of interventions vary depending on cognitive ability and to continue to assess how arts interventions can be of use across the stages of dementia.
Integrative data analysis in clinical psychology research.
Hussong, Andrea M; Curran, Patrick J; Bauer, Daniel J
2013-01-01
Integrative data analysis (IDA), a novel framework for conducting the simultaneous analysis of raw data pooled from multiple studies, offers many advantages including economy (i.e., reuse of extant data), power (i.e., large combined sample sizes), the potential to address new questions not answerable by a single contributing study (e.g., combining longitudinal studies to cover a broader swath of the lifespan), and the opportunity to build a more cumulative science (i.e., examining the similarity of effects across studies and potential reasons for dissimilarities). There are also methodological challenges associated with IDA, including the need to account for sampling heterogeneity across studies, to develop commensurate measures across studies, and to account for multiple sources of study differences as they impact hypothesis testing. In this review, we outline potential solutions to these challenges and describe future avenues for developing IDA as a framework for studies in clinical psychology.
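One of the challenges this review names, sampling heterogeneity across pooled studies, has a classic minimal remedy: absorb between-study differences with study-specific intercepts (equivalently, center within each study) before estimating the common effect. The sketch below uses invented noise-free data to make the contrast exact; it is an illustration of the within-study estimator, not the review's own analysis:

```python
def ols_slope(x, y):
    """Simple least-squares slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))

def center_within(groups):
    """Center x and y within each study (fixed-effects / within estimator)."""
    xs, ys = [], []
    for x, y in groups:
        mx, my = sum(x) / len(x), sum(y) / len(y)
        xs += [a - mx for a in x]
        ys += [b - my for b in y]
    return xs, ys

# Two hypothetical studies sharing a true slope of 2 but with different
# intercepts and different x ranges (the sampling-heterogeneity problem).
study_a = ([0, 1, 2],    [10, 12, 14])   # y = 10 + 2x
study_b = ([10, 11, 12], [20, 22, 24])   # y =  0 + 2x

naive = ols_slope(study_a[0] + study_b[0], study_a[1] + study_b[1])
within = ols_slope(*center_within([study_a, study_b]))
print(naive, within)  # naive pooled slope is biased; within-study slope is 2.0
```

Testing whether the within-study slopes themselves differ across studies is the IDA step of "examining the similarity of effects across studies."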
High Throughput, Multiplexed Pathogen Detection Authenticates Plague Waves in Medieval Venice, Italy
Tran, Thi-Nguyen-Ny; Signoli, Michel; Fozzati, Luigi; Aboudharam, Gérard; Raoult, Didier; Drancourt, Michel
2011-01-01
Background: Historical records suggest that multiple burial sites from the 14th–16th centuries in Venice, Italy, were used during the Black Death and subsequent plague epidemics. Methodology/Principal Findings: High throughput, multiplexed real-time PCR detected DNA of seven highly transmissible pathogens in 173 dental pulp specimens collected from 46 graves. Bartonella quintana DNA was identified in five (2.9%) samples, including three from the 16th century and two from the 15th century, and Yersinia pestis DNA was detected in three (1.7%) samples, including two from the 14th century and one from the 16th century. Partial glpD gene sequencing indicated that the detected Y. pestis was the Orientalis biotype. Conclusions: These data document for the first time successive plague epidemics in the medieval European city where quarantine was first instituted in the 14th century. PMID:21423736
NASA Astrophysics Data System (ADS)
Chen, Daniel; Chen, Damian; Yen, Ray; Cheng, Mingjen; Lan, Andy; Ghaskadvi, Rajesh
2008-11-01
Identifying hotspots--structures that limit the lithography process window--becomes increasingly important as the industry relies heavily on RET to print sub-wavelength designs. KLA-Tencor's patented Process Window Qualification (PWQ) methodology has been used for this purpose in various fabs. PWQ methodology has three key advantages: (a) PWQ Layout--to obtain the best sensitivity; (b) Design Based Binning--for pattern repeater analysis; (c) Intelligent sampling--for the best DOI sampling rate. This paper evaluates two different analysis strategies for SEM review sampling successfully deployed at Inotera Memories, Inc. We propose a new approach combining the location repeater and pattern repeaters. Based on a recent case study, the new sampling flow reduces the data analysis and sampling time from 6 hours to 1.5 hours while maintaining the maximum DOI sample rate.
NASA Astrophysics Data System (ADS)
Vaganov, A. V.; Zhestkov, B. E.; Lyamin, Yu. B.; Poilov, V. Z.; Pryamilova, E. N.
2016-10-01
Twelve ceramic samples from the Ural Research Institute of Composite Materials were investigated in the TsAGI VAT-104 wind tunnel in an air plasma flow simulating hypervelocity flight. The models used were discs and blunted cones. All samples withstood the tests without decomposition, at sample temperatures up to 2800 K and test times up to 1200 seconds. A large delay in the heating of the samples was found, even though they have high thermal conductivity. A notable phenomenon was also observed: the formation of a highly catalytic thermal-barrier film on the front surface of the samples. It was the formation of this film that caused a jump of 500-1000 K in surface temperature during the test. The catalytic activity of the samples was evaluated using a modernized methodology based on parametric numerical simulation.
Knopman, Debra S.; Voss, Clifford I.; Garabedian, Stephen P.
1991-01-01
Tests of a one-dimensional sampling design methodology on measurements of bromide concentration collected during the natural gradient tracer test conducted by the U.S. Geological Survey on Cape Cod, Massachusetts, demonstrate its efficacy for field studies of solute transport in groundwater and the utility of one-dimensional analysis. The methodology was applied to design of sparse two-dimensional networks of fully screened wells typical of those often used in engineering practice. In one-dimensional analysis, designs consist of the downstream distances to rows of wells oriented perpendicular to the groundwater flow direction and the timing of sampling to be carried out on each row. The power of a sampling design is measured by its effectiveness in simultaneously meeting objectives of model discrimination, parameter estimation, and cost minimization. One-dimensional models of solute transport, differing in processes affecting the solute and assumptions about the structure of the flow field, were considered for description of tracer cloud migration. When fitting each model using nonlinear regression, additive and multiplicative error forms were allowed for the residuals which consist of both random and model errors. The one-dimensional single-layer model of a nonreactive solute with multiplicative error was judged to be the best of those tested. Results show the efficacy of the methodology in designing sparse but powerful sampling networks. Designs that sample five rows of wells at five or fewer times in any given row performed as well for model discrimination as the full set of samples taken up to eight times in a given row from as many as 89 rows. Also, designs for parameter estimation judged to be good by the methodology were as effective in reducing the variance of parameter estimates as arbitrary designs with many more samples. 
Results further showed that estimates of velocity and longitudinal dispersivity in one-dimensional models based on data from only five rows of fully screened wells each sampled five or fewer times were practically equivalent to values determined from moments analysis of the complete three-dimensional set of 29,285 samples taken during 16 sampling times.
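The one-dimensional models being discriminated here are variants of the advection-dispersion equation; for a nonreactive solute released as an instantaneous point source, the textbook solution is C(x,t) = M/√(4πD_L t) · exp(−(x − vt)²/(4 D_L t)) per unit cross-section. A sketch of the predicted plume such sampling designs are built around, with illustrative parameter values (not the Cape Cod tracer test's estimates):

```python
import math

def concentration(x, t, v, DL, M=1.0):
    """1-D advection-dispersion solution, instantaneous point source at x=0.
    v: mean groundwater velocity, DL: longitudinal dispersion coefficient."""
    return (M / math.sqrt(4 * math.pi * DL * t)
            * math.exp(-(x - v * t) ** 2 / (4 * DL * t)))

v, DL, t = 0.5, 0.1, 20.0                 # hypothetical values (m/d, m^2/d, d)
xs = [i * 0.05 for i in range(0, 801)]    # sampling transect, 0-40 m
c = [concentration(x, t, v, DL) for x in xs]

peak_x = xs[max(range(len(c)), key=c.__getitem__)]
mass = sum(ci * 0.05 for ci in c)         # numerical mass-balance check
print(peak_x, round(mass, 3))             # peak near x = v*t; mass ~ M
```

Fitting v and D_L to concentrations observed at a few rows of wells at a few times is exactly the parameter-estimation objective the design methodology optimizes.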
Choi, Yoonha; Liu, Tiffany Ting; Pankratz, Daniel G; Colby, Thomas V; Barth, Neil M; Lynch, David A; Walsh, P Sean; Raghu, Ganesh; Kennedy, Giulia C; Huang, Jing
2018-05-09
We developed a classifier using RNA sequencing data that identifies the usual interstitial pneumonia (UIP) pattern for the diagnosis of idiopathic pulmonary fibrosis. We addressed significant challenges, including limited sample size, biological and technical sample heterogeneity, and reagent and assay batch effects. We identified inter- and intra-patient heterogeneity, particularly within the non-UIP group. The models classified UIP on transbronchial biopsy samples with a receiver-operating characteristic area under the curve of ~ 0.9 in cross-validation. Using in silico mixed samples in training, we prospectively defined a decision boundary to optimize specificity at ≥85%. The penalized logistic regression model showed greater reproducibility across technical replicates and was chosen as the final model. The final model showed sensitivity of 70% and specificity of 88% in the test set. We demonstrated that the suggested methodologies appropriately addressed challenges of the sample size, disease heterogeneity and technical batch effects and developed a highly accurate and robust classifier leveraging RNA sequencing for the classification of UIP.
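The step of prospectively fixing a decision boundary to guarantee specificity ≥ 85% can be sketched as a simple threshold sweep over classifier scores. The scores below are invented, not the authors' UIP classifier output:

```python
def pick_threshold(neg_scores, pos_scores, min_spec=0.85):
    """Smallest cutoff whose specificity meets the floor; report sensitivity."""
    for t in sorted(set(neg_scores + pos_scores)):
        specificity = sum(s < t for s in neg_scores) / len(neg_scores)
        if specificity >= min_spec:
            sensitivity = sum(s >= t for s in pos_scores) / len(pos_scores)
            return t, specificity, sensitivity
    raise ValueError("no threshold achieves the required specificity")

# Hypothetical classifier scores (probability of UIP) on a validation set.
non_uip = [i / 20 for i in range(20)]        # 0.00, 0.05, ..., 0.95
uip     = [0.5 + i / 40 for i in range(20)]  # 0.500, 0.525, ..., 0.975

t, spec, sens = pick_threshold(non_uip, uip)
print(t, spec, sens)
```

In the paper the boundary was chosen on in silico mixed training samples, so that the specificity floor would hold prospectively on the independent test set.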
Contemporary Impact Analysis Methodology for Planetary Sample Return Missions
NASA Technical Reports Server (NTRS)
Perino, Scott V.; Bayandor, Javid; Samareh, Jamshid A.; Armand, Sasan C.
2015-01-01
Development of an Earth entry vehicle and the methodology created to evaluate the vehicle's impact landing response when returning to Earth is reported. NASA's future Mars Sample Return Mission requires a robust vehicle to return Martian samples back to Earth for analysis. The Earth entry vehicle is a proposed solution to this Mars mission requirement. During Earth reentry, the vehicle slows within the atmosphere and then impacts the ground at its terminal velocity. To protect the Martian samples, a spherical energy absorber called an impact sphere is under development. The impact sphere is composed of hybrid composite and crushable foam elements that endure large plastic deformations during impact and cause a highly nonlinear vehicle response. The developed analysis methodology captures a range of complex structural interactions and much of the failure physics that occurs during impact. Numerical models were created and benchmarked against experimental tests conducted at NASA Langley Research Center. The postimpact structural damage assessment showed close correlation between simulation predictions and experimental results. Acceleration, velocity, displacement, damage modes, and failure mechanisms were all effectively captured. These investigations demonstrate that the Earth entry vehicle has great potential in facilitating future sample return missions.
da Silva Filho, Manoel; Santos, Daniel Valle Vasconcelos; Costa, Kauê Machado
2013-01-01
Analyzing cell morphology is crucial in the fields of cell biology and neuroscience. One of the main methods for evaluating cell morphology is by using intracellular fluorescent markers, including various commercially available dyes and genetically encoded fluorescent proteins. These markers can be used as free radical sources in photooxidation reactions, which in the presence of diaminobenzidine (DAB) forms an opaque and electron-dense precipitate that remains localized within the cellular and organelle membranes. This method confers many methodological advantages for the investigator, including absence of photo-bleaching, high visual contrast and the possibility of correlating optical imaging with electron microscopy. However, current photooxidation techniques require the continuous use of fluorescent or confocal microscopes, which wastes valuable mercury lamp lifetime and limits the conversion process to a few cells at a time. We developed a low cost optical apparatus for performing photooxidation reactions and propose a new procedure that solves these methodological restrictions. Our “photooxidizer” consists of a high power light emitting diode (LED) associated with a custom aluminum and acrylic case and a microchip-controlled current source. We demonstrate the efficacy of our method by converting intracellular DiI in samples of developing rat neocortex and post-mortem human retina. DiI crystals were inserted in the tissue and allowed to diffuse for 20 days. The samples were then processed with the new photooxidation technique and analyzed under optical microscopy. The results show that our protocols can unveil the fine morphology of neurons in detail. Cellular structures such as axons, dendrites and spine-like appendages were well defined. 
In addition to its low cost, simplicity and reliability, our method precludes the use of microscope lamps for photooxidation and allows the processing of many labeled cells simultaneously in relatively large tissue samples with high efficacy. PMID:23441199
Effect of maternal body mass index on hormones in breast milk: a systematic review.
Andreas, Nicholas J; Hyde, Matthew J; Gale, Chris; Parkinson, James R C; Jeffries, Suzan; Holmes, Elaine; Modi, Neena
2014-01-01
Maternal Body Mass Index (BMI) is positively associated with infant obesity risk. Breast milk contains a number of hormones that may influence infant metabolism during the neonatal period; these may have additional downstream effects on infant appetite regulatory pathways, thereby influencing propensity towards obesity in later life. Our objective was to conduct a systematic review of studies examining the association between maternal BMI and the concentration of appetite-regulating hormones in breast milk. PubMed was searched for studies reporting the association between maternal BMI and leptin, adiponectin, insulin, ghrelin, resistin, obestatin, Peptide YY and Glucagon-Like Peptide 1 in breast milk. Twenty-six studies were identified and included in the systematic review. There was a high degree of variability between studies with regard to collection, preparation and analysis of breast milk samples. Eleven of fifteen studies reporting breast milk leptin found a positive association between maternal BMI and milk leptin concentration. Two of nine studies investigating adiponectin found an association between maternal BMI and breast milk adiponectin concentration; however significance was lost in one study following adjustment for time post-partum. No association was seen between maternal BMI and milk adiponectin in the other seven studies identified. Evidence for an association between other appetite regulating hormones and maternal BMI was either inconclusive, or lacking. A positive association between maternal BMI and breast milk leptin concentration is consistently found in most studies, despite variable methodology. Evidence for such an association with breast milk adiponectin concentration, however, is lacking, with additional research needed for other hormones including insulin, ghrelin, resistin, obestatin, peptide YY and glucagon-like peptide-1.
As most current studies have been conducted with small sample sizes, future studies should ensure adequate sample sizes and standardized methodology.
Alzheimer's disease and diet: a systematic review.
Yusufov, Miryam; Weyandt, Lisa L; Piryatinsky, Irene
2017-02-01
Purpose/Aim: Approximately 44 million people worldwide have Alzheimer's disease (AD). Numerous claims have been made regarding the influence of diet on AD development. The aims of this systematic review were to summarize the evidence considering diet as a protective or risk factor for AD, identify methodological challenges and limitations, and provide future research directions. Medline, PsycINFO and PsycARTICLES were searched for articles that examined the relationship between diet and AD. On the basis of the inclusion and exclusion criteria, 64 studies were included, generating a total of 141 dietary patterns or "models". All studies were published between 1997 and 2015, with a total of 132,491 participants. Twelve studies examined the relationship between a Mediterranean (MeDi) diet and AD development, 10 of which revealed a significant association. Findings were inconsistent with respect to sample size, AD diagnosis and food measures. Further, the majority of studies (81.3%) included samples with mean baseline ages (ranging from 52.0 to 85.4 years) that placed them at risk for AD on the basis of age (>65 years). The range of follow-up periods was 1.5-32.0 years. The mean age of the samples poses a limitation in determining the influence of diet on AD, given that AD has a long prodromal phase prior to the manifestation of symptoms and decline. Further studies are necessary to determine whether diet is a risk or protective factor for AD, foster translation of research into clinical practice and elucidate dietary recommendations. Despite the methodological limitations, the finding that 50 of the 64 reviewed studies revealed an association between diet and AD incidence offers promising implications for diet as a modifiable risk factor for AD.
Lachmann, Bernd; Sariyska, Rayna; Kannen, Christopher; Błaszkiewicz, Konrad; Trendafilov, Boris; Andone, Ionut; Eibes, Mark; Markowetz, Alexander; Li, Mei; Kendrick, Keith M.
2017-01-01
Virtually everybody would agree that life satisfaction is of immense importance in everyday life. Thus, it is not surprising that a considerable amount of research using many different methodological approaches has investigated what the best predictors of life satisfaction are. In the present study, we have focused on several key potential influences on life satisfaction including bottom-up and top-down models, cross-cultural effects, and demographic variables. In four independent (large scale) surveys with sample sizes ranging from N = 488 to 40,297, we examined the associations between life satisfaction and various related variables. Our findings demonstrate that prediction of overall life satisfaction works best when including information about specific life satisfaction variables. From this perspective, satisfaction with leisure showed the highest impact on overall life satisfaction in our European samples. Personality was also robustly associated with life satisfaction, but only when life satisfaction variables were not included in the regression model. These findings could be replicated in all four independent samples, but it was also demonstrated that the relevance of life satisfaction variables changed under the influence of cross-cultural effects. PMID:29295529
Chemical and toxicologic assessment of organic contaminants in surface water using passive samplers
Alvarez, D.A.; Cranor, W.L.; Perkins, S.D.; Clark, R.C.; Smith, S.B.
2008-01-01
Passive sampling methodologies were used to conduct a chemical and toxicologic assessment of organic contaminants in the surface waters of three geographically distinct agricultural watersheds. A selection of current-use agrochemicals and persistent organic pollutants, including polycyclic aromatic hydrocarbons, polychlorinated biphenyls, and organochlorine pesticides, were targeted using the polar organic chemical integrative sampler (POCIS) and the semipermeable membrane device passive samplers. In addition to the chemical analysis, the Microtox assay for acute toxicity and the yeast estrogen screen (YES) were conducted as potential assessment tools in combination with the passive samplers. During the spring of 2004, the passive samplers were deployed for 29 to 65 d at Leary Weber Ditch, IN; Morgan Creek, MD; and DR2 Drain, WA. Chemical analysis of the sampler extracts identified the agrochemicals predominantly used in those areas, including atrazine, simazine, acetochlor, and metolachlor. Other chemicals identified included deethylatrazine and deisopropylatrazine, trifluralin, fluoranthene, pyrene, cis- and trans-nonachlor, and pentachloroanisole. Screening using Microtox resulted in no acutely toxic samples. POCIS samples screened by the YES assay failed to elicit a positive estrogenic response. Copyright © 2008 by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America. All rights reserved.
Repp, Kimberly K; Hawes, Eva; Rees, Kathleen J; Vorderstrasse, Beth; Mohnkern, Sue
2018-06-07
Conducting a large-scale Community Assessment for Public Health Emergency Response (CASPER) in a geographically and linguistically diverse county presents significant methodological challenges that require advance planning. The Centers for Disease Control and Prevention (CDC) has adapted methodology and provided a toolkit for a rapid needs assessment after a disaster. The assessment provides representative data of the sampling frame to help guide effective distribution of resources. This article describes methodological considerations and lessons learned from a CASPER exercise conducted by Washington County Public Health in June 2016 to assess community emergency preparedness. The CDC's CASPER toolkit provides detailed guidance for exercises in urban areas where city blocks are well defined with many single family homes. Converting the exercise to include rural areas with challenging geographical terrain, including accessing homes without public roads, required considerable adjustments in planning. Adequate preparations for vulnerable populations with English linguistic barriers required additional significant resources. Lessons learned are presented from the first countywide CASPER exercise in Oregon. Approximately 61% of interviews were completed, and 85% of volunteers reported they would participate in another CASPER exercise. Results from the emergency preparedness survey will be presented elsewhere. This experience indicates the most important considerations for conducting a CASPER exercise are oversampling clusters, overrecruiting volunteers, anticipating the actual cost of staff time, and ensuring timely language services are available during the event.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, David; Brunett, Acacia J.; Passerini, Stefano
GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory (Argonne) participated in a two-year collaboration to modernize and update the probabilistic risk assessment (PRA) for the PRISM sodium fast reactor. At a high level, the primary outcome of the project was the development of a next-generation PRA that is intended to enable risk-informed prioritization of safety- and reliability-focused research and development. A central Argonne task during this project was a reliability assessment of passive safety systems, which included the Reactor Vessel Auxiliary Cooling System (RVACS) and the inherent reactivity feedbacks of the metal fuel core. Both systems were examined utilizing a methodology derived from the Reliability Method for Passive Safety Functions (RMPS), with an emphasis on developing success criteria based on mechanistic system modeling while also maintaining consistency with the Fuel Damage Categories (FDCs) of the mechanistic source term assessment. This paper provides an overview of the reliability analyses of both systems, including highlights of the FMEAs, the construction of best-estimate models, uncertain parameter screening and propagation, and the quantification of system failure probability. In particular, special focus is given to the methodologies used to analyze uncertainty propagation and to determine the likelihood of violating FDC limits. Additionally, important lessons learned are reviewed, such as optimal sampling methodologies for the discovery of low-likelihood failure events and strategies for the combined treatment of aleatory and epistemic uncertainties.
Variability estimation of urban wastewater biodegradable fractions by respirometry.
Lagarde, Fabienne; Tusseau-Vuillemin, Marie-Hélène; Lessard, Paul; Héduit, Alain; Dutrop, François; Mouchel, Jean-Marie
2005-11-01
This paper presents a methodology for assessing the variability of biodegradable chemical oxygen demand (COD) fractions in urban wastewaters. Thirteen raw wastewater samples from combined and separate sewers feeding the same plant were characterised, and two optimisation procedures were applied in order to evaluate the variability in biodegradable fractions and related kinetic parameters. Through an overall optimisation on all the samples, a unique kinetic parameter set was obtained with a three-substrate model including an adsorption stage. This method required powerful numerical treatment, but improved the identifiability problem compared to the usual sample-to-sample optimisation. The results showed that the fractionation of samples collected in the combined sewer was much more variable (standard deviation of 70% of the mean values) than the fractionation of the separate sewer samples, and the slowly biodegradable COD fraction was the most significant fraction (45% of the total COD on average). Because these samples were collected under various rain conditions, the standard deviations obtained here on the combined sewer biodegradable fractions could be used as a first estimation of the variability of this type of sewer system.
NASA Technical Reports Server (NTRS)
1971-01-01
Appendixes are presented that provide model input requirements, a sample case, flow charts, and a program listing. At the beginning of each appendix, descriptive details and technical comments are provided to indicate any special instructions applicable to the use of that appendix. In addition, the program listing includes comment cards that state the purpose of each subroutine in the complete program and describe operations performed within that subroutine. The input requirements include details on the many options that adapt the program to the specific needs of the analyst for a particular problem.
Conaway, Christopher; Thordsen, James J.; Manning, Michael A.; Cook, Paul J.; Trautz, Robert C.; Thomas, Burt; Kharaka, Yousif K.
2016-01-01
The chemical composition of formation water and associated gases from the lower Cretaceous Paluxy Formation was determined using four different sampling methods at a characterization well in the Citronelle Oil Field, Alabama, as part of the Southeast Regional Carbon Sequestration Partnership (SECARB) Phase III Anthropogenic Test, which is an integrated carbon capture and storage project. In this study, formation water and gas samples were obtained from well D-9-8 #2 at Citronelle using gas lift, electric submersible pump, U-tube, and a downhole vacuum sampler (VS) and subjected to both field and laboratory analyses. Field chemical analyses included electrical conductivity, dissolved sulfide concentration, alkalinity, and pH; laboratory analyses included major, minor and trace elements, dissolved carbon, volatile fatty acids, free and dissolved gas species. The formation water obtained from this well is a Na–Ca–Cl-type brine with a salinity of about 200,000 mg/L total dissolved solids. Differences were evident between sampling methodologies, particularly in pH, Fe and alkalinity. There was little gas in samples, and gas composition results were strongly influenced by sampling methods. The results of the comparison demonstrate the difficulty and importance of preserving volatile analytes in samples, with the VS and U-tube system performing most favorably in this aspect.
Evaluation of glucose controllers in virtual environment: methodology and sample application.
Chassin, Ludovic J; Wilinska, Malgorzata E; Hovorka, Roman
2004-11-01
Adaptive systems to deliver medical treatment in humans are safety-critical systems and require particular care in both the testing and the evaluation phase, which are time-consuming, costly, and confounded by ethical issues. The objective of the present work is to develop a methodology to test glucose controllers of an artificial pancreas in a simulated (virtual) environment. A virtual environment comprising a model of the carbohydrate metabolism and models of the insulin pump and the glucose sensor is employed to simulate individual glucose excursions in subjects with type 1 diabetes. The performance of the control algorithm within the virtual environment is evaluated by considering treatment and operational scenarios. The developed methodology includes two dimensions: testing in relation to specific life style conditions, i.e. fasting, post-prandial, and life style (metabolic) disturbances; and testing in relation to various operating conditions, i.e. expected operating conditions, adverse operating conditions, and system failure. We define safety and efficacy criteria and describe the measures to be taken prior to clinical testing. The use of the methodology is exemplified by tuning and evaluating a model predictive glucose controller being developed for a wearable artificial pancreas focused on fasting conditions. Our methodology to test glucose controllers in a virtual environment is instrumental in anticipating the results of real clinical tests for different physiological conditions and for different operating conditions. The thorough testing in the virtual environment reduces costs and speeds up the development process.
Shevchenko, V E; Arnotskaia, N E; Ogorodnikova, E V; Davydov, M M; Ibraev, M A; Turkin, I N; Davydov, M I
2014-01-01
Gastric cancer, one of the most widespread malignant tumors, still lacks reliable serum/plasma biomarkers for its early detection. In this study we have developed, unified, and tested a new methodology for the search for gastric cancer biomarkers based on profiling of the low molecular weight proteome (LMWP) (1-17 kDa). This approach included three main components: sample pre-fractionation, matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI-TOF-MS), and data analysis by a bioinformatics software package. The applicability and perspectives of the developed approach for the detection of potential gastric cancer markers during LMWP analysis have been demonstrated using 69 plasma samples from patients with gastric cancer (stages I-IV) and 238 control samples. The study revealed peptides/polypeptides that may potentially be used for the detection of this pathology.
Enhanced sampling techniques in biomolecular simulations.
Spiwok, Vojtech; Sucur, Zoran; Hosek, Petr
2015-11-01
Biomolecular simulations are routinely used in biochemistry and molecular biology research; however, they often fail to match expectations of their impact on pharmaceutical and biotech industry. This is caused by the fact that a vast amount of computer time is required to simulate short episodes from the life of biomolecules. Several approaches have been developed to overcome this obstacle, including application of massively parallel and special purpose computers or non-conventional hardware. Methodological approaches are represented by coarse-grained models and enhanced sampling techniques. These techniques can show how the studied system behaves in long time-scales on the basis of relatively short simulations. This review presents an overview of new simulation approaches, the theory behind enhanced sampling methods and success stories of their applications with a direct impact on biotechnology or drug design. Copyright © 2014 Elsevier Inc. All rights reserved.
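One of the enhanced sampling techniques such reviews typically cover is replica exchange (parallel tempering), whose core is a simple Metropolis swap criterion between replicas at different temperatures. The sketch below is a generic textbook formulation for illustration only, not code from any package discussed in the review.

```python
import math

def swap_probability(beta_i, beta_j, energy_i, energy_j):
    """Metropolis acceptance probability for exchanging configurations
    between replicas i and j at inverse temperatures beta_i and beta_j."""
    delta = (beta_i - beta_j) * (energy_i - energy_j)
    # delta >= 0 means the colder replica would receive the lower energy:
    # the swap is always accepted; otherwise accept with probability exp(delta).
    return 1.0 if delta >= 0 else math.exp(delta)

p_up = swap_probability(1.0, 0.5, -90.0, -100.0)    # delta = +5 -> 1.0
p_down = swap_probability(1.0, 0.5, -100.0, -90.0)  # delta = -5 -> exp(-5)
```

Repeated swaps of this form let a trapped configuration diffuse to high temperature, cross barriers, and return, which is how the method reaches long-time-scale behaviour from short simulations.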
Mashile, Geaneth Pertunia; Nomngongo, Philiswa N
2017-03-04
Cyanotoxins are toxic and are found in eutrophic, municipal, and residential water supplies. For this reason, their occurrence in drinking water systems has become a global concern. Therefore, monitoring, control, risk assessment, and prevention of these contaminants in the environmental bodies are important subjects associated with public health. Thus, rapid, sensitive, selective, simple, and accurate analytical methods for the identification and determination of cyanotoxins are required. In this paper, the sampling methodologies and applications of solid phase-based sample preparation methods for the determination of cyanotoxins in environmental matrices are reviewed. The sample preparation techniques mainly include solid phase micro-extraction (SPME), solid phase extraction (SPE), and solid phase adsorption toxin tracking technology (SPATT). In addition, advantages and disadvantages and future prospects of these methods have been discussed.
Martin, James; Taljaard, Monica; Girling, Alan; Hemming, Karla
2016-01-01
Background Stepped-wedge cluster randomised trials (SW-CRT) are increasingly being used in health policy and services research, but unless they are conducted and reported to the highest methodological standards, they are unlikely to be useful to decision-makers. Sample size calculations for these designs require allowance for clustering, time effects and repeated measures. Methods We carried out a methodological review of SW-CRTs up to October 2014. We assessed adherence to reporting each of the 9 sample size calculation items recommended in the 2012 extension of the CONSORT statement to cluster trials. Results We identified 32 completed trials and 28 independent protocols published between 1987 and 2014. Of these, 45 (75%) reported a sample size calculation, with a median of 5.0 (IQR 2.5–6.0) of the 9 CONSORT items reported. Of those that reported a sample size calculation, the majority, 33 (73%), allowed for clustering, but just 15 (33%) allowed for time effects. There was a small increase in the proportions reporting a sample size calculation (from 64% before to 84% after publication of the CONSORT extension, p=0.07). The type of design (cohort or cross-sectional) was not reported clearly in the majority of studies, but cohort designs seemed to be most prevalent. Sample size calculations in cohort designs were particularly poor with only 3 out of 24 (13%) of these studies allowing for repeated measures. Discussion The quality of reporting of sample size items in stepped-wedge trials is suboptimal. There is an urgent need for dissemination of the appropriate guidelines for reporting and methodological development to match the proliferation of the use of this design in practice. Time effects and repeated measures should be considered in all SW-CRT power calculations, and there should be clarity in reporting trials as cohort or cross-sectional designs. PMID:26846897
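The "allowance for clustering" the review checks for is, in its simplest form, the standard design-effect inflation of an individually randomised sample size. The sketch below uses that textbook formula; note it deliberately omits the time effects and repeated measures that a full SW-CRT calculation requires, which is precisely the gap the review documents. Default z-values correspond to 5% two-sided alpha and 80% power.

```python
import math

def cluster_trial_sample_size(delta, sd, m, icc, z_alpha=1.96, z_beta=0.84):
    """Per-arm sample size for a two-arm trial with a continuous outcome,
    inflated by the design effect 1 + (m - 1) * ICC for clusters of size m."""
    n_individual = 2 * ((z_alpha + z_beta) * sd / delta) ** 2
    design_effect = 1 + (m - 1) * icc
    return math.ceil(n_individual * design_effect)

# Example: detect a 0.5 SD difference with clusters of 20 and ICC = 0.05
n_per_arm = cluster_trial_sample_size(delta=0.5, sd=1.0, m=20, icc=0.05)  # 123
```

With m = 1 the design effect is 1 and the formula reduces to the usual individually randomised calculation.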
Pla-Tolós, J; Serra-Mora, P; Hakobyan, L; Molins-Legua, C; Moliner-Martinez, Y; Campins-Falcó, P
2016-11-01
In this work, in-tube solid phase microextraction (in-tube SPME) coupled to capillary LC (CapLC) with diode array detection is reported for on-line extraction and enrichment of the booster biocides irgarol-1051 and diuron, included in Water Framework Directive 2013/39/EU (WFD). The analytical performance has been successfully demonstrated. Furthermore, the environmental friendliness of the procedure has been quantified by calculating the carbon footprint of the analytical procedure and comparing it with other methodologies previously reported. Under the optimum conditions, the method presents good linearity over the ranges assayed, 0.05-10 μg/L for irgarol-1051 and 0.7-10 μg/L for diuron. The LODs were 0.015 μg/L and 0.2 μg/L for irgarol-1051 and diuron, respectively. Precision was also satisfactory (relative standard deviation, RSD < 3.5%). The proposed methodology was applied to monitor water samples, taking into account the EQS standards for these compounds. The carbon footprint values for the proposed procedure consolidate the operational efficiency (analytical and environmental performance) of in-tube SPME-CapLC-DAD in general, and in particular for determining irgarol-1051 and diuron in water samples. Copyright © 2016 Elsevier B.V. All rights reserved.
RAS testing in metastatic colorectal cancer: advances in Europe.
Van Krieken, J Han J M; Rouleau, Etienne; Ligtenberg, Marjolijn J L; Normanno, Nicola; Patterson, Scott D; Jung, Andreas
2016-04-01
Personalized medicine shows promise for maximizing efficacy and minimizing toxicity of anti-cancer treatment. KRAS exon 2 mutations are predictive of resistance to epidermal growth factor receptor-directed monoclonal antibodies in patients with metastatic colorectal cancer. Recent studies have shown that broader RAS testing (KRAS and NRAS) is needed to select patients for treatment. While Sanger sequencing is still used, approaches based on various methodologies are available. Few CE-approved kits, however, detect the full spectrum of RAS mutations. More recently, "next-generation" sequencing has been developed for research use, including parallel semiconductor sequencing and reversible termination. These techniques have high technical sensitivities for detecting mutations, although the ideal threshold is currently unknown. Finally, liquid biopsy has the potential to become an additional tool to assess tumor-derived DNA. For accurate and timely RAS testing, appropriate sampling and prompt delivery of material is critical. Processes to ensure efficient turnaround from sample request to RAS evaluation must be implemented so that patients receive the most appropriate treatment. Given the variety of methodologies, external quality assurance programs are important to ensure a high standard of RAS testing. Here, we review technical and practical aspects of RAS testing for pathologists working with metastatic colorectal cancer tumor samples. The extension of markers from KRAS to RAS testing is the new paradigm for biomarker testing in colorectal cancer.
NASA Technical Reports Server (NTRS)
Birmele, Michele
2012-01-01
The International Space Station (ISS) is a closed environment with rotations of crew and equipment, each introducing its own microbial flora, making it necessary to monitor the air, surfaces, and water for microbial contamination. Current microbial monitoring includes labor- and time-intensive methods to enumerate total bacterial and fungal cells, with limited characterization during in-flight testing. Although this culture-based method has been sufficient for monitoring the ISS, future long-duration missions will need to perform more comprehensive characterization in-flight, since sample return and ground characterization may not be available. A workshop was held in 2011 at the Johnson Space Center to discuss alternative methodologies and technologies suitable for microbial monitoring on these long-term exploration missions, where molecular-based methodologies, such as polymerase chain reaction (PCR), were recommended. In response, a multi-center (Marshall Space Flight Center, Johnson Space Center, Jet Propulsion Laboratory, and Kennedy Space Center) collaborative research effort was initiated to explore novel commercial-off-the-shelf hardware options for spaceflight environmental monitoring. The goal was to evaluate quantitative/semi-quantitative PCR approaches for space applications enabling low-cost, in-flight rapid identification of microorganisms affecting crew safety. The initial phase of this project identified commercially available platforms that could be minimally modified to perform nominally in microgravity, followed by proof-of-concept testing on the highest-qualifying candidates with a universally available test organism, Salmonella enterica. The platforms evaluated during proof-of-concept testing included the iCubate 2.0™ (iCubate, Huntsville, AL), RAZOR EX (BioFire Diagnostics; Salt Lake City, UT), and SmartCycler™ (Cepheid; Sunnyvale, CA).
The analysis identified two potential technologies (iCubate 2.0 and RAZOR EX) that were able to perform sample-to-answer testing with cell sample concentrations between 50 and 400 cells. In addition, the commercial systems were evaluated for initial flight safety and readiness, sample concentration needs were reviewed, and a competitive procurement of commercially available platforms was initiated.
Wijngaarden, Peter Van; Keel, Stuart; Hodgson, Lauren A B; Kumar, Dinesh K; Aliahmad, Behzad; Paim, Cistiane C; Kiely, Kim M; Cherbuin, Nicolas; Anstey, Kaarin J; Dirani, Mohamed
2017-01-01
To describe the methodology and present the retinal grading findings of an older sample of Australians with well-defined indices of neurocognitive function in the Personality and Total Health (PATH) Through Life project. A cross-sectional study. Three hundred twenty-six individuals from the PATH Through Life project were invited to participate. Participants completed a general questionnaire and 2-field, 45-degree nonmydriatic color digital retinal photography. Photographs were graded for retinal pathology according to established protocols. Two hundred fifty-four (77.9%) subjects, aged 72 to 78 years, agreed to participate in the eye substudy. Gradable images of at least 1 eye were acquired in 211 of 254 subjects (83.1%). Retinal photographic screening identified 1 or more signs of pathology in 130 of the 174 subjects (74.7%) with gradable images of both eyes. A total of 45 participants (17.7%) had self-reported diabetes, and diabetic retinopathy was observed in 22 (48.9%) of these participants. This well-defined sample of older Australians provides a unique opportunity to interrogate associations between retinal findings, including retinal vascular geometric parameters, and indices of neurocognitive function. Copyright 2017 Asia-Pacific Academy of Ophthalmology.
Utility-based designs for randomized comparative trials with categorical outcomes
Murray, Thomas A.; Thall, Peter F.; Yuan, Ying
2016-01-01
A general utility-based testing methodology for design and conduct of randomized comparative clinical trials with categorical outcomes is presented. Numerical utilities of all elementary events are elicited to quantify their desirabilities. These numerical values are used to map the categorical outcome probability vector of each treatment to a mean utility, which is used as a one-dimensional criterion for constructing comparative tests. Bayesian tests are presented, including fixed sample and group sequential procedures, assuming Dirichlet-multinomial models for the priors and likelihoods. Guidelines are provided for establishing priors, eliciting utilities, and specifying hypotheses. Efficient posterior computation is discussed, and algorithms are provided for jointly calibrating test cutoffs and sample size to control overall type I error and achieve specified power. Asymptotic approximations for the power curve are used to initialize the algorithms. The methodology is applied to re-design a completed trial that compared two chemotherapy regimens for chronic lymphocytic leukemia, in which an ordinal efficacy outcome was dichotomized and toxicity was ignored to construct the trial’s design. The Bayesian tests also are illustrated by several types of categorical outcomes arising in common clinical settings. Freely available computer software for implementation is provided. PMID:27189672
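The core construction described above — eliciting a numerical utility for each elementary outcome, mapping each arm's categorical outcome probabilities to a mean utility, and comparing arms under Dirichlet-multinomial models — can be sketched as follows. The utilities, counts, and prior below are hypothetical illustrations, not values from the paper, and the sketch omits the group sequential machinery and cutoff calibration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical elicited utilities for 4 elementary outcomes (worst ... best)
utilities = np.array([0.0, 40.0, 70.0, 100.0])

def posterior_prob_better(counts_a, counts_b, prior=1.0, draws=10_000):
    """Pr(mean utility of arm A > mean utility of arm B) under independent
    Dirichlet-multinomial models with a symmetric Dirichlet(prior) prior."""
    theta_a = rng.dirichlet(np.asarray(counts_a) + prior, size=draws)
    theta_b = rng.dirichlet(np.asarray(counts_b) + prior, size=draws)
    # Each draw of the probability vector maps to a scalar mean utility
    return float(np.mean(theta_a @ utilities > theta_b @ utilities))

p = posterior_prob_better([2, 5, 10, 13], [8, 10, 7, 5])
```

In a full design, a test would declare A superior when this posterior probability exceeds a cutoff jointly calibrated with the sample size to control type I error.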
Home and health in the third age - methodological background and descriptive findings.
Kylén, Maya; Ekström, Henrik; Haak, Maria; Elmståhl, Sölve; Iwarsson, Susanne
2014-07-11
The understanding of the complex relationship between the home environment, well-being and daily functioning in the third age is currently weak. The aim of this paper is to present the methodological background of the Home and Health in the Third Age Study, and describe a sample of men and women in relation to their home and health situation. The study sample included 371 people aged 67-70, living in ordinary housing in the south of Sweden. Structured interviews and observations were conducted to collect data about objective and perceived aspects of home and health. The majority of the participants were in good health and had few functional limitations. Women had more functional limitations and reported more symptoms than men. Environmental barriers were found in every home investigated; the most were found in the kitchen and hygiene area. Environmental barriers were more common in multi-family than in one-family dwellings. This study will increase our knowledge on home and health dynamics among people in the third age. The results have potential to contribute to societal planning related to housing provision, home care and social services for senior citizens.
Jack, Allison; Pelphrey, Kevin
2017-01-01
Background Autism spectrum disorders (ASDs) are a heterogeneous group of neurodevelopmental conditions that vary in both etiology and phenotypic expression. Expressions of ASD characterized by a more severe phenotype, including autism with intellectual disability (ASD+ID), autism with a history of developmental regression (ASD+R), and minimally verbal autism (ASD+MV) are understudied generally, and especially in the domain of neuroimaging. However, neuroimaging methods are a potentially powerful tool for understanding the etiology of these ASD subtypes. Scope and Methodology This review evaluates existing neuroimaging research on ASD+MV, ASD+ID, and ASD+R, identified by a search of the literature using the PubMed database, and discusses methodological, theoretical, and practical considerations for future research involving neuroimaging assessment of these populations. Findings There is a paucity of neuroimaging research on ASD+ID, ASD+MV, and ASD+R, and what findings do exist are often contradictory, or so sparse as to be ungeneralizable. We suggest that while greater sample sizes and more studies are necessary, more important would be a paradigm shift toward multimodal (e.g., imaging genetics) approaches that allow for the characterization of heterogeneity within etiologically diverse samples. PMID:28102566
Inferring Molecular Processes Heterogeneity from Transcriptional Data.
Gogolewski, Krzysztof; Wronowska, Weronika; Lech, Agnieszka; Lesyng, Bogdan; Gambin, Anna
2017-01-01
RNA microarrays and RNA-seq are nowadays standard technologies for studying the transcriptional activity of cells. Most studies focus on tracking transcriptional changes caused by specific experimental conditions. Information on gene up- and downregulation is evaluated by analyzing the behaviour of a relatively large population of cells and averaging its properties. However, even assuming perfect sample homogeneity, different subpopulations of cells can exhibit diverse transcriptomic profiles, as they may follow different regulatory/signaling pathways. The purpose of this study is to provide a novel methodological scheme to account for possible internal, functional heterogeneity in homogeneous cell lines, including cancer ones. We propose a novel computational method to infer the proportions of subpopulations of cells that manifest various functional behaviours in a given sample. Our method was validated using two datasets from RNA microarray experiments. Both experiments aimed to examine cell viability under specific experimental conditions. The presented methodology can be easily extended to RNA-seq data as well as other molecular processes. Moreover, it complements standard tools for indicating the most important networks in transcriptomic data and could be particularly useful in the analysis of cancer cell lines affected by biologically active compounds or drugs.
Gutierrez-Navarro, Omar; Campos-Delgado, Daniel U; Arce-Santana, Edgar R; Maitland, Kristen C; Cheng, Shuna; Jabbour, Joey; Malik, Bilal; Cuenca, Rodrigo; Jo, Javier A
2014-05-19
Multispectral fluorescence lifetime imaging (m-FLIM) can potentially allow identification of the endogenous fluorophores present in biological tissue. Quantitative description of such data requires estimating the number of components in the sample, their characteristic fluorescent decays, and their relative contributions or abundances. Unfortunately, this inverse problem usually requires prior knowledge about the data, which is seldom available in biomedical applications. This work presents a new methodology to estimate the number of potential endogenous fluorophores present in biological tissue samples from time-domain m-FLIM data. Furthermore, a completely blind linear unmixing algorithm is proposed. The method was validated using both synthetic and experimental m-FLIM data. The experimental m-FLIM data include in-vivo measurements from healthy and cancerous hamster cheek-pouch epithelial tissue, and ex-vivo measurements from human coronary atherosclerotic plaques. The analysis of m-FLIM data from in-vivo hamster oral mucosa distinguished healthy tissue from precancerous lesions, based on the relative concentrations of their characteristic fluorophores. The algorithm also provided a better description of atherosclerotic plaques in terms of their endogenous fluorophores. These results demonstrate the potential of this methodology to provide a quantitative description of tissue biochemical composition.
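Abundance estimation in a linear mixture model of decay signatures can be illustrated with a toy solver. This is not the authors' blind unmixing algorithm (which also estimates the number of components and their decays); it is a minimal projected-gradient sketch assuming the endmember decays, sampled at a few hypothetical time points, are already known:

```python
def estimate_abundances(y, endmembers, iters=2000, lr=0.05):
    """Estimate nonnegative abundances a (summing to 1) such that
    y ~= sum_k a[k] * endmembers[k], by projected gradient descent."""
    k = len(endmembers)
    n = len(y)
    a = [1.0 / k] * k  # start from equal abundances
    for _ in range(iters):
        # residual r = model - y at each time point
        r = [sum(a[j] * endmembers[j][i] for j in range(k)) - y[i] for i in range(n)]
        # gradient of 0.5 * ||model - y||^2 with respect to each abundance
        grad = [sum(r[i] * endmembers[j][i] for i in range(n)) for j in range(k)]
        a = [max(0.0, a[j] - lr * grad[j]) for j in range(k)]  # clamp to >= 0
        s = sum(a) or 1.0
        a = [aj / s for aj in a]  # renormalize so abundances sum to one
    return a

# hypothetical two-component decays sampled at four time points
e1 = [1.0, 0.6, 0.36, 0.22]
e2 = [1.0, 0.9, 0.81, 0.73]
y = [0.3 * a + 0.7 * b for a, b in zip(e1, e2)]
abund = estimate_abundances(y, [e1, e2])  # close to [0.3, 0.7]
```

The renormalization step is a simple stand-in for an exact simplex projection; it suffices here because the synthetic measurement lies exactly in the convex hull of the two endmembers.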
Hydrogen leak detection using laser-induced breakdown spectroscopy.
Ball, A J; Hohreiter, V; Hahn, D W
2005-03-01
Laser-induced breakdown spectroscopy (LIBS) is investigated as a technique for real-time monitoring of hydrogen gas. Two methodologies were examined: the use of a 100 mJ laser pulse to create a laser-induced breakdown directly in a sample gas stream, and the use of a 55 mJ laser pulse to create a laser-induced plasma on a solid substrate surface, with the expanding plasma sampling the gas stream. Various metals were analyzed as candidate substrate surfaces, including aluminum, copper, molybdenum, stainless steel, titanium, and tungsten. Stainless steel was selected, and a detailed analysis of hydrogen detection in binary mixtures of nitrogen and hydrogen at atmospheric pressure was performed. Both the gaseous plasma and the plasma initiated on the stainless steel surface generated comparable hydrogen emission signals, using the 656.28 nm Hα emission line, and exhibited excellent signal linearity. The limit of detection is about 20 ppm (mass) as determined for both methodologies, with the solid-initiated plasma yielding a slightly better value. Overall, LIBS is concluded to be a viable candidate for hydrogen sensing, offering a combination of high sensitivity with a technique that is well suited to implementation in field environments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tom Elicson; Bentley Harwood; Jim Bouchard
Over a 12 month period, a fire PRA was developed for a DOE facility using the NUREG/CR-6850 EPRI/NRC fire PRA methodology. The fire PRA modeling included calculation of fire severity factors (SFs) and fire non-suppression probabilities (PNS) for each safe shutdown (SSD) component considered in the fire PRA model. The SFs were developed by performing detailed fire modeling through a combination of CFAST fire zone model calculations and Latin Hypercube Sampling (LHS). Component damage times and automatic fire suppression system actuation times calculated in the CFAST LHS analyses were then input to a time-dependent model of fire non-suppression probability. The fire non-suppression probability model is based on the modeling approach outlined in NUREG/CR-6850 and is supplemented with plant-specific data. This paper presents the methodology used in the DOE facility fire PRA for modeling fire-induced SSD component failures and includes discussions of modeling techniques for: • Development of time-dependent fire heat release rate profiles (required as input to CFAST), • Calculation of fire severity factors based on CFAST detailed fire modeling, and • Calculation of fire non-suppression probabilities.
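The severity-factor step described above propagates uncertain fire parameters through CFAST runs via Latin Hypercube Sampling. As a rough sketch of the LHS step itself (not the facility's actual analysis; the variable names and bounds below are hypothetical), a minimal stratified sampler might look like:

```python
import random

def latin_hypercube(n_samples, bounds, rng=None):
    """Draw a Latin Hypercube sample: each variable's range is split into
    n_samples equal-probability strata, and exactly one point is drawn
    per stratum; strata are paired randomly across variables."""
    rng = rng or random.Random(0)  # deterministic default seed for repeatability
    dims = len(bounds)
    samples = [[0.0] * dims for _ in range(n_samples)]
    for d, (lo, hi) in enumerate(bounds):
        strata = list(range(n_samples))
        rng.shuffle(strata)  # random pairing of strata across dimensions
        for i, s in enumerate(strata):
            u = (s + rng.random()) / n_samples  # uniform draw within stratum s
            samples[i][d] = lo + u * (hi - lo)
    return samples

# e.g. sample a peak heat release rate (kW) and a growth time (s)
# as inputs to repeated CFAST runs (hypothetical ranges)
pts = latin_hypercube(10, [(100.0, 1000.0), (60.0, 600.0)])
```

Compared with plain Monte Carlo, this guarantees that every tenth of each parameter's range is represented once, which is why far fewer fire-model runs are needed for stable severity-factor estimates.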
Sikirzhytski, Vitali; Sikirzhytskaya, Aliaksandra; Lednev, Igor K
2012-10-10
Conventional confirmatory biochemical tests used in the forensic analysis of body fluid traces found at a crime scene are destructive and not universal. Recently, we reported on the application of near-infrared (NIR) Raman microspectroscopy for non-destructive confirmatory identification of pure blood, saliva, semen, vaginal fluid and sweat. Here we expand the method to include dry mixtures of semen and blood. A classification algorithm was developed for differentiating pure body fluids and their mixtures. The classification methodology is based on an effective combination of Support Vector Machine (SVM) regression (data selection) and SVM Discriminant Analysis of preprocessed experimental Raman spectra collected using an automatic mapping of the sample. Extensive cross-validation of the obtained results demonstrated that the detection limit of the minor contributor is as low as a few percent. The developed methodology can be further expanded to any binary mixture of complex solutions, including but not limited to mixtures of other body fluids. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Effects-Based Operations in the Cyber Domain
2017-05-03
as the joint targeting methodology. The description that Batschelet gave the traditional targeting methodology included a process of, “Decide, Detect...technology, requires new planning and methodology to fight back. This paper evaluates current Department of Defense doctrine to look at ways to conduct...developing its cyber tactics, techniques, and procedures, which includes various targeting methodologies, such as the use of effects-based
ERIC Educational Resources Information Center
Brino, Ana Leda F.; Barros, Romariz S.; Galvao, Ol; Garotti, M.; Da Cruz, Ilara R. N.; Santos, Jose R.; Dube, William V.; McIlvane, William J.
2011-01-01
This paper reports use of sample stimulus control shaping procedures to teach arbitrary matching-to-sample to 2 capuchin monkeys ("Cebus apella"). The procedures started with identity matching-to-sample. During shaping, stimulus features of the sample were altered gradually, rendering samples and comparisons increasingly physically dissimilar. The…
Prevalence of hypertension among adolescents: systematic review and meta-analysis.
Gonçalves, Vivian Siqueira Santos; Galvão, Taís Freire; de Andrade, Keitty Regina Cordeiro; Dutra, Eliane Said; Bertolin, Maria Natacha Toral; de Carvalho, Kenia Mara Baiocchi; Pereira, Mauricio Gomes
2016-01-01
To estimate the prevalence of hypertension among adolescent Brazilian students. A systematic review of school-based cross-sectional studies was conducted. The articles were searched in the databases MEDLINE, Embase, Scopus, LILACS, SciELO, Web of Science, CAPES thesis database and Trip Database. In addition, we examined the lists of references of relevant studies to identify potentially eligible articles. No restrictions regarding publication date, language, or status were applied. The studies were selected by two independent evaluators, who also extracted the data and assessed the methodological quality following eight criteria related to sampling, measuring blood pressure, and presenting results. The meta-analysis was calculated using a random-effects model, and analyses were performed to investigate heterogeneity. We retrieved 1,577 articles from the search and included 22 in the review. The included articles corresponded to 14,115 adolescents, 51.2% (n = 7,230) female. We observed a variety of techniques, equipment, and references used. The prevalence of hypertension was 8.0% (95%CI 5.0-11.0; I2 = 97.6%), 9.3% (95%CI 5.6-13.6; I2 = 96.4%) in males and 6.5% (95%CI 4.2-9.1; I2 = 94.2%) in females. The meta-regression failed to identify the causes of the heterogeneity among studies. Despite the differences found in the methodologies of the included studies, the results of this systematic review indicate that hypertension is prevalent in the Brazilian adolescent school population. For future investigations, we suggest the standardization of techniques, equipment, and references, aiming at improving the methodological quality of the studies.
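Pooling of this kind is commonly done with a DerSimonian-Laird random-effects estimator. A minimal sketch (not the authors' exact computation; the input prevalences and variances below are hypothetical) of that estimator, including the I² heterogeneity statistic the abstract reports:

```python
import math

def dersimonian_laird(estimates, variances):
    """Random-effects pooling (DerSimonian-Laird): returns the pooled
    estimate, its standard error, and the I^2 heterogeneity statistic (%)."""
    w = [1.0 / v for v in variances]                      # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, estimates)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, estimates))  # Cochran's Q
    k = len(estimates)
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                    # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]        # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_star, estimates)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    i2 = max(0.0, (q - (k - 1)) / q) * 100 if q > 0 else 0.0
    return pooled, se, i2

# hypothetical study prevalences and their sampling variances
pooled, se, i2 = dersimonian_laird([0.05, 0.09, 0.12], [4e-4, 4e-4, 4e-4])
```

In practice prevalences are usually transformed (e.g. logit or double-arcsine) before pooling to stabilize variances; that step is omitted here for brevity.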
Methodological review of the quality of Reach Out and Read: does it "work"?
Yeager Pelatti, Christina; Pentimonti, Jill M; Justice, Laura M
2014-04-01
A considerable percentage of American children and adults fail to learn adequate literacy skills and read below a third grade level. Shared book reading is perhaps the single most important activity to prepare young children for success in reading. The primary objective of this manuscript was to critically review the methodological quality of Reach Out and Read (ROR), a clinically based literacy program/intervention that teaches parents strategies to incorporate while sharing books with children as a method of preventing reading difficulties and academic struggles. A PubMed search was conducted. Articles that met three criteria were considered. First, the study must be clinically based and include parent contact with a pediatrician. Second, parental counseling ("anticipatory guidance") about the importance of parent-child book reading must be included. Third, only experimental or quasi-experimental studies were included; no additional criteria were used. Published articles from any year and peer-reviewed journal were considered. Study quality was determined using a modified version of the Downs and Black (1998) checklist assessing four categories: (1) Reporting, (2) External Validity, (3) Internal Validity-Bias, and (4) Internal Validity-Confounding. We were also interested in whether quality differed based on study design, children's age, sample size, and study outcome. Eleven studies met the inclusion criteria. The overall quality of evidence was variable across all studies; the Reporting and External Validity categories were relatively strong, while methodological concerns were found in the area of internal validity. Quality scores differed across the four study characteristics. Implications related to clinical practice and future studies are discussed.
Lessons Learned From Methodological Validation Research in E-Epidemiology.
Kesse-Guyot, Emmanuelle; Assmann, Karen; Andreeva, Valentina; Castetbon, Katia; Méjean, Caroline; Touvier, Mathilde; Salanave, Benoît; Deschamps, Valérie; Péneau, Sandrine; Fezeu, Léopold; Julia, Chantal; Allès, Benjamin; Galan, Pilar; Hercberg, Serge
2016-10-18
Traditional epidemiological research methods exhibit limitations leading to high logistic, human, and financial burdens. The continued development of innovative digital tools has the potential to overcome many of the existing methodological issues. Nonetheless, Web-based studies remain relatively uncommon, partly due to persistent concerns about validity and generalizability. The objective of this viewpoint is to summarize findings from methodological studies carried out in the NutriNet-Santé study, a French Web-based cohort study. On the basis of the previous findings from the NutriNet-Santé e-cohort (>150,000 participants are currently included), we synthesized e-epidemiological knowledge on sample representativeness, advantageous recruitment strategies, and data quality. Overall, the reported findings support the usefulness of Web-based studies in overcoming common methodological deficiencies in epidemiological research, in particular with regard to data quality (eg, the concordance for body mass index [BMI] classification was 93%), reduced social desirability bias, and access to a wide range of participant profiles, including hard-to-reach subgroups such as young people (12.30% [15,118/122,912], <25 years), older people (6.60% [8112/122,912], ≥65 years), unemployed people or homemakers (12.60% [15,487/122,912]), and people with low educational attainment (38.50% [47,312/122,912]). However, some selection bias remained (78.00% [95,871/122,912] of the participants were women, and 61.50% [75,590/122,912] had postsecondary education), which is an inherent aspect of cohort study inclusion; other specific types of bias may also have occurred. Given the rapidly growing access to the Internet across social strata, the recruitment of participants with diverse socioeconomic profiles and health risk exposures was highly feasible. Continued efforts concerning the identification of specific biases in e-cohorts and the collection of comprehensive and valid data are still needed.
This summary of methodological findings from the NutriNet-Santé cohort may help researchers in the development of the next generation of high-quality Web-based epidemiological studies.
Reference values for muscle strength: a systematic review with a descriptive meta-analysis.
Benfica, Poliana do Amaral; Aguiar, Larissa Tavares; Brito, Sherindan Ayessa Ferreira de; Bernardino, Luane Helena Nunes; Teixeira-Salmela, Luci Fuscaldi; Faria, Christina Danielli Coelho de Morais
2018-05-03
Muscle strength is an important component of health. To describe and evaluate the studies which have established the reference values for muscle strength on healthy individuals and to synthesize these values with a descriptive meta-analysis approach. A systematic review was performed in MEDLINE, LILACS, and SciELO databases. Studies that investigated the reference values for muscle strength of two or more appendicular/axial muscle groups of healthy individuals were included. Methodological quality, including risk of bias, was assessed by the QUADAS-2. Data extracted included: country of the study, sample size, population characteristics, equipment/method used, and muscle groups evaluated. Of the 414 studies identified, 46 were included. Most of the studies had adequate methodological quality. Included studies evaluated: appendicular (80.4%) and axial (36.9%) muscles; adults (78.3%), elderly (58.7%), adolescents (43.5%), children (23.9%); isometric (91.3%) and isokinetic (17.4%) strength. Six studies (13%) with similar procedures were synthesized with meta-analysis. Generally, the coefficient of variation values that resulted from the meta-analysis ranged from 20.1% to 30% and were similar to those reported by the original studies. The meta-analysis synthesized the reference values of isometric strength of 14 muscle groups of the dominant/non-dominant sides of the upper/lower limbs of adults/elderly from developed countries, using dynamometers/myometers. Most of the included studies had adequate methodological quality. The meta-analysis provided reference values for the isometric strength of 14 appendicular muscle groups of the dominant/non-dominant sides, measured with dynamometers/myometers, of men/women, of adults/elderly. These data may be used to interpret the results of the evaluations and establish appropriate treatment goals. Copyright © 2018 Associação Brasileira de Pesquisa e Pós-Graduação em Fisioterapia. Published by Elsevier Editora Ltda.
All rights reserved.
Assessment of Patient Empowerment - A Systematic Review of Measures
Barr, Paul J.; Scholl, Isabelle; Bravo, Paulina; Faber, Marjan J.; Elwyn, Glyn; McAllister, Marion
2015-01-01
Background Patient empowerment has gained considerable importance but uncertainty remains about the best way to define and measure it. The validity of empirical findings depends on the quality of measures used. This systematic review aims to provide an overview of studies assessing psychometric properties of questionnaires purporting to capture patient empowerment, evaluate the methodological quality of these studies and assess the psychometric properties of measures identified. Methods Electronic searches in five databases were combined with reference tracking of included articles. Peer-reviewed articles reporting psychometric testing of empowerment measures for adult patients in French, German, English, Portuguese and Spanish were included. Study characteristics, constructs operationalised and psychometric properties were extracted. The quality of study design, methods and reporting was assessed using the COSMIN checklist. The quality of psychometric properties was assessed using Terwee’s 2007 criteria. Findings 30 studies on 19 measures were included. Six measures are generic, while 13 were developed for a specific condition (N=4) or specialty (N=9). Most studies tested measures in English (N=17) or Swedish (N=6). Sample sizes of included studies varied from N=35 to N=8261. A range of patient empowerment constructs was operationalised in included measures. These were classified into four domains: patient states, experiences and capacities; patient actions and behaviours; patient self-determination within the healthcare relationship and patient skills development. Quality assessment revealed several flaws in methodological study quality with COSMIN scores mainly fair or poor. The overall quality of psychometric properties of included measures was intermediate to positive. Certain psychometric properties were not tested for most measures. 
Discussion Findings provide a basis from which to develop consensus on a core set of patient empowerment constructs and for further work to develop a (set of) appropriately validated measure(s) to capture this. The methodological quality of psychometric studies could be improved by adhering to published quality criteria. PMID:25970618
A Methodology for Studying Noninstitutionalized Psychopaths
ERIC Educational Resources Information Center
Widom, Cathy S.
1977-01-01
Presents a methodological approach to studying noninstitutionalized psychopaths and presents data on criteria associated with psychopathy. The recruitment procedure involved incorporating the characteristics of psychopathy into an advertisement. The present sample fulfilled the criteria for psychopathy, and the recruitment method used was a…
Rapid Sampling of Molecules via Skin for Diagnostic and Forensic Applications
Paliwal, Sumit; Ogura, Makoto
2010-01-01
Purpose Skin provides an excellent portal for diagnostic monitoring of a variety of entities; however, there is a dearth of reliable methods for patient-friendly sampling of skin constituents. This study describes the use of low-frequency ultrasound as a one-step methodology for rapid sampling of molecules from the skin. Methods Sampling was performed using a brief exposure of 20 kHz ultrasound to skin in the presence of a sampling fluid. In vitro sampling from porcine skin was performed to assess the effectiveness of the method and its ability to sample drugs and endogenous epidermal biomolecules from the skin. Dermal presence of an antifungal drug (fluconazole) and an abused substance (cocaine) was assessed in rats. Results Ultrasonic sampling captured the native profile of various naturally occurring moisturizing factors in skin. A high sampling efficiency (79 ± 13%) of topically delivered drug was achieved. Ultrasound consistently sampled greater amounts of drug from the skin compared to tape stripping. Ultrasonic sampling also detected sustained presence of cocaine in rat skin for up to 7 days as compared to its rapid disappearance from the urine. Conclusions Ultrasonic sampling provides significant advantages including enhanced sampling from deeper layers of skin and high temporal sampling sensitivity. PMID:20238151
Robles-Molina, José; Gilbert-López, Bienvenida; García-Reyes, Juan F; Molina-Díaz, Antonio
2013-12-15
The European Water Framework Directive (WFD) 2000/60/EC establishes guidelines to control the pollution of surface water by setting out a list of priority substances that pose a significant risk to or via the aquatic environment. In this article, the analytical performance of three different sample preparation methodologies for the determination of multiclass organic contaminants (including priority compounds from the WFD) in wastewater samples by gas chromatography-tandem mass spectrometry (GC-MS/MS) was evaluated. The methodologies tested were: (a) liquid-liquid extraction (LLE) with n-hexane; (b) solid-phase extraction (SPE) with C18 cartridges and elution with ethyl acetate:dichloromethane (1:1 (v/v)); and (c) headspace solid-phase microextraction (HS-SPME) using two different fibers: polyacrylate and polydimethylsiloxane/carboxen/divinylbenzene. Identification and confirmation of the 57 compounds included in the study (comprising polycyclic aromatic hydrocarbons (PAHs), pesticides and other contaminants) were accomplished using a triple quadrupole instrument operated in the multiple reaction monitoring (MRM) mode. Three MS/MS transitions were selected for unambiguous confirmation of the target chemicals. The advantages and pitfalls of each method are discussed. For both the LLE and SPE procedures, the method was validated at two concentration levels (15 and 150 ng L(-1)), obtaining recovery rates in the range 70-120% for most of the target compounds. In terms of analyte coverage, results with HS-SPME were not satisfactory, since 14 of the compounds tested were not properly recovered and the overall performance was worse than that of the other two methods. The LLE, SPE and HS-SPME (using the polyacrylate fiber) procedures also showed good linearity and precision.
Using any of the three methodologies tested, limits of quantitation obtained for most of the detected compounds were in the low nanogram per liter range. © 2013 Elsevier B.V. All rights reserved.
Besley, Aiken; Vijver, Martina G; Behrens, Paul; Bosker, Thijs
2017-01-15
Microplastics are ubiquitous in the environment, are frequently ingested by organisms, and may potentially cause harm. A range of studies have found significant levels of microplastics in beach sand. However, there is a considerable amount of methodological variability among these studies. Methodological variation currently limits comparisons as there is no standard procedure for sampling or extraction of microplastics. We identify key sampling and extraction procedures across the literature through a detailed review. We find that sampling depth, sampling location, number of repeat extractions, and settling times are the critical parameters of variation. Next, using a case-study we determine whether and to what extent these differences impact study outcomes. By investigating the common practices identified in the literature with the case-study, we provide a standard operating procedure for sampling and extracting microplastics from beach sand. Copyright © 2016 Elsevier Ltd. All rights reserved.
40 CFR 141.803 - Coliform sampling.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 23 2011-07-01 2011-07-01 false Coliform sampling. 141.803 Section 141...) NATIONAL PRIMARY DRINKING WATER REGULATIONS Aircraft Drinking Water Rule § 141.803 Coliform sampling. (a) Analytical methodology. Air carriers must follow the sampling and analysis requirements under this section...
40 CFR 141.803 - Coliform sampling.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 22 2010-07-01 2010-07-01 false Coliform sampling. 141.803 Section 141...) NATIONAL PRIMARY DRINKING WATER REGULATIONS Aircraft Drinking Water Rule § 141.803 Coliform sampling. (a) Analytical methodology. Air carriers must follow the sampling and analysis requirements under this section...
GROUND WATER ISSUE: LOW-FLOW (MINIMAL DRAWDOWN) GROUND-WATER SAMPLING PROCEDURES
This paper is intended to provide background information on the development of low-flow sampling procedures and its application under a variety of hydrogeologic settings. The sampling methodology described in this paper assumes that the monitoring goal is to sample monitoring wel...
Wastewater Sampling Methodologies and Flow Measurement Techniques.
ERIC Educational Resources Information Center
Harris, Daniel J.; Keffer, William J.
This document provides a ready source of information about water/wastewater sampling activities using various commercial sampling and flow measurement devices. The report consolidates the findings and summarizes the activities, experiences, sampling methods, and field measurement techniques conducted by the Environmental Protection Agency (EPA),…
Using forbidden ordinal patterns to detect determinism in irregularly sampled time series.
Kulp, C W; Chobot, J M; Niskala, B J; Needhammer, C J
2016-02-01
It is known that when symbolizing a time series into ordinal patterns using the Bandt-Pompe (BP) methodology, there will be ordinal patterns, called forbidden patterns, that do not occur in a deterministic series. The existence of forbidden patterns can be used to identify deterministic dynamics. In this paper, the ability to use forbidden patterns to detect determinism in irregularly sampled time series is tested on data generated from a continuous model system. The study is done in three parts. First, the effects of sampling time on the number of forbidden patterns are studied for regularly sampled time series. The next two parts focus on two types of irregular sampling: missing data and timing jitter. It is shown that forbidden patterns can be used to detect determinism in irregularly sampled time series for low degrees of sampling irregularity (as defined in the paper). In addition, comments are made about the appropriateness of using the BP methodology to symbolize irregularly sampled time series.
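The Bandt-Pompe symbolization and the forbidden-pattern count it enables can be sketched compactly. The following is a minimal illustration, not the paper's code; the logistic map stands in for a generic deterministic series (a decreasing triple is provably impossible for it, so the pattern (2, 1, 0) is forbidden at order 3):

```python
from itertools import permutations

def ordinal_patterns(series, order):
    """Symbolize a series into Bandt-Pompe ordinal patterns of the given
    embedding order: each length-`order` window is mapped to the tuple of
    its indices sorted by increasing value (its rank-order pattern)."""
    pats = []
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        pats.append(tuple(sorted(range(order), key=lambda j: window[j])))
    return pats

def forbidden_pattern_count(series, order):
    """Number of the order! possible patterns never observed in the series."""
    seen = set(ordinal_patterns(series, order))
    return sum(1 for p in permutations(range(order)) if p not in seen)

# A chaotic but deterministic series: the logistic map x -> 4x(1-x).
x, xs = 0.4, []
for _ in range(2000):
    x = 4.0 * x * (1.0 - x)
    xs.append(x)
n_forbidden = forbidden_pattern_count(xs, 4)  # > 0 for a deterministic series
```

A sufficiently long stochastic series, by contrast, tends to realize all order! patterns, which is the basis of the determinism test discussed in the abstract.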
Duarte, Elisabeth Carmen; Garcia, Leila Posenato; de Araújo, Wildo Navegantes; Velez, Maria P
2017-12-02
Zika infection during pregnancy (ZIKVP) is known to be associated with adverse outcomes. Studies on this matter involve both rare outcomes and rare exposures, and methodological choices are not straightforward. Cohort studies will surely offer more robust evidence, but their efficiency must be enhanced. We aim to contribute to the debate on sample selection strategies in cohort studies to assess outcomes associated with ZIKVP. A study can be statistically more efficient than another if its estimates are more accurate (precise and valid), even if the studies involve the same number of subjects. Sample size and specific design strategies can enhance or impair the statistical efficiency of a study, depending on how the subjects are distributed in subgroups pertinent to the analysis. In most ZIKVP cohort studies to date there is an a priori identification of the source population (pregnant women, regardless of their exposure status) which is then sampled or included in its entirety (census). Subsequently, the group of pregnant women is classified according to exposure (presence or absence of ZIKVP), respecting the exposed:unexposed ratio in the source population. We propose that the sample selection be done from the a priori identification of groups of pregnant women exposed and unexposed to ZIKVP. This method will allow for an oversampling (even 100%) of the pregnant women with ZIKVP and an optimized sampling from the general population of pregnant women unexposed to ZIKVP, saving resources in the unexposed group and improving the expected number of incident cases (outcomes) overall. We hope that this proposal will broaden the methodological debate on the improvement of statistical power and protocol harmonization of cohort studies that aim to evaluate the association between Zika infection during pregnancy and outcomes for the offspring, as well as those with similar objectives.
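The efficiency argument can be made concrete with simple expected-count arithmetic comparing a census-style cohort with an exposure-stratified one. A small sketch (all prevalence and risk figures below are hypothetical, not estimates from the Zika literature):

```python
def expected_outcomes(n_total, exposed_fraction, risk_exposed, risk_unexposed,
                      oversample_exposed=None):
    """Expected numbers of incident outcomes (exposed, unexposed) in a cohort
    of n_total pregnancies. With oversample_exposed=None the cohort mirrors
    the source population's exposed:unexposed ratio; otherwise that fraction
    of the cohort is drawn from the exposed group."""
    f = exposed_fraction if oversample_exposed is None else oversample_exposed
    n_exp = n_total * f
    n_unexp = n_total - n_exp
    return n_exp * risk_exposed, n_unexp * risk_unexposed

# hypothetical: 1% ZIKVP prevalence, 5% outcome risk if exposed, 0.5% otherwise
census = expected_outcomes(10000, 0.01, 0.05, 0.005)            # few exposed cases
stratified = expected_outcomes(10000, 0.01, 0.05, 0.005, 0.5)   # many more
```

Under these assumed figures, a proportional cohort of 10,000 yields only about 5 exposed cases, while devoting half of the same cohort to the exposed group yields about 250, which is the gain in expected incident cases the authors describe.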
Barazzetti Barbieri, Cristina; de Souza Sarkis, Jorge Eduardo
2018-07-01
The forensic interpretation of environmental analytical data is usually challenging due to the high geospatial variability of these data. The measurements' uncertainty includes contributions from the sampling and from the sample handling and preparation processes. These contributions are often disregarded in the quality assurance of analytical results. A pollution crime investigation case was used to develop a methodology for addressing these uncertainties in two different environmental compartments, freshwater sediments and landfill leachate. The methodology used to estimate the uncertainty was the duplicate method (which replicates predefined steps of the measurement procedure in order to assess its precision), and the parameters used to investigate the pollution were metals (Cr, Cu, Ni, and Zn) in the leachate, the suspect source, and in the sediment, the possible sink. The metal analysis results were compared to statutory limits, and it was demonstrated that Cr and Ni concentrations in sediment samples exceeded the threshold levels at all sites downstream of the pollution sources, considering the expanded uncertainty U of the measurements and a probability of contamination >0.975 at most sites. Cu and Zn concentrations were above the statutory limits at two sites, but the classification was inconclusive considering the uncertainties of the measurements. Metal analyses in leachate revealed that Cr concentrations were above the statutory limits with a probability of contamination >0.975 in all leachate ponds, while the Cu, Ni and Zn probability of contamination was below 0.025. The results demonstrated that the estimation of the sampling uncertainty, which was the dominant component of the combined uncertainty, is required for a comprehensive interpretation of environmental analysis results, particularly in forensic cases. Copyright © 2018 Elsevier B.V. All rights reserved.
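The duplicate method estimates measurement precision by repeating the sampling-and-analysis procedure at a subset of targets. A minimal sketch of the pooled within-pair calculation (the pair values below are hypothetical; the full method typically also partitions the variance into sampling and analytical components by ANOVA, which this sketch omits):

```python
import math

def duplicate_method_uncertainty(duplicates):
    """Estimate measurement uncertainty from duplicate-pair results.
    `duplicates` is a list of (result_1, result_2) pairs obtained by
    repeating the whole sampling-and-analysis procedure at each target.
    Returns the standard uncertainty s and the expanded uncertainty
    U = 2s (coverage factor k = 2, ~95% confidence)."""
    n = len(duplicates)
    # pooled within-pair variance: s^2 = mean of (d_i^2 / 2) over all pairs
    s2 = sum((a - b) ** 2 / 2.0 for a, b in duplicates) / n
    s = math.sqrt(s2)
    return s, 2.0 * s

# hypothetical duplicate Cr results (mg/kg) from four sediment targets
pairs = [(52.1, 49.8), (47.5, 51.0), (60.2, 58.7), (55.0, 53.1)]
u, U = duplicate_method_uncertainty(pairs)
```

Comparing a result against a statutory limit then uses the interval result ± U, which is how the abstract's "probability of contamination" classifications arise.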
Guerrero-Preston, Rafael; White, James Robert; Godoy-Vitorino, Filipa; Rodríguez-Hilario, Arnold; Navarro, Kelvin; González, Herminio; Michailidi, Christina; Jedlicka, Anne; Canapp, Sierra; Bondy, Jessica; Dziedzic, Amanda; Mora-Lagos, Barbara; Rivera-Alvarez, Gustavo; Ili-Gangas, Carmen; Brebi-Mieville, Priscilla; Westra, William; Koch, Wayne; Kang, Hyunseok; Marchionni, Luigi; Kim, Young; Sidransky, David
2017-01-01
Microbiome studies show altered microbiota in head and neck squamous cell carcinoma (HNSCC), both in terms of taxonomic composition and metabolic capacity. These studies utilized a traditional bioinformatics methodology, which allows for accurate taxonomic assignment down to the genus level, but cannot accurately resolve species-level membership. We applied Resphera Insight, a high-resolution methodology for 16S rRNA taxonomic assignment that is able to provide species-level context in its assignments of 16S rRNA next generation sequencing (NGS) data. Resphera Insight applied to saliva samples from HNSCC patients and healthy controls led to the discovery that a subset of HNSCC saliva samples is significantly enriched with commensal species from the vaginal flora, including Lactobacillus gasseri/johnsonii (710x higher in saliva) and Lactobacillus vaginalis (52x higher in saliva). These species were not observed in normal saliva from Johns Hopkins patients, nor in 16S rRNA NGS saliva samples from the Human Microbiome Project (HMP). Interestingly, both species were only observed in saliva from Human Papilloma Virus (HPV) positive and HPV negative oropharyngeal cancer patients. We confirmed the representation of both species in HMP data obtained from mid-vagina (n=128) and vaginal introitus (n=121) samples. Resphera Insight also led to the discovery that Fusobacterium nucleatum, an oral cavity flora commensal bacterium linked to colon cancer, is enriched (600x higher) in saliva from a subset of HNSCC patients with advanced tumor stages. Together, these high-resolution analyses on 583 samples suggest a possible role for bacterial species in the therapeutic outcome of HPV positive and HPV negative HNSCC patients. PMID:29340028
Prevalence of self-medication in the adult population of Brazil: a systematic review
Domingues, Paulo Henrique Faria; Galvão, Taís Freire; de Andrade, Keitty Regina Cordeiro; de Sá, Pedro Terra Teles; Silva, Marcus Tolentino; Pereira, Mauricio Gomes
2015-01-01
OBJECTIVE To evaluate the prevalence of self-medication in Brazil’s adult population. METHODS Systematic review of cross-sectional population-based studies. The following databases were used: Medline, Embase, Scopus, ISI, CINAHL, Cochrane Library, CRD, Lilacs, SciELO, the Banco de teses brasileiras (Brazilian theses database, Capes) and files from the Portal Domínio Público (Brazilian Public Domain). In addition, the reference lists from relevant studies were examined to identify potentially eligible articles. No restrictions were applied in terms of publication date, language or publication status. Data related to publication, population, methods and prevalence of self-medication were extracted by three independent researchers. Methodological quality was assessed following eight criteria related to sampling, measurement and presentation of results. The prevalences were estimated from participants who used at least one medication during the recall period of the studies. RESULTS The literature screening identified 2,778 records, from which 12 were included for analysis. Most studies were conducted in the Southeastern region of Brazil, after 2000 and with a 15-day recall period. Only five studies achieved high methodological quality, of which one study had a 7-day recall period, in which the prevalence of self-medication was 22.9% (95%CI 14.6;33.9). The prevalence of self-medication in three studies of high methodological quality with a 15-day recall period was 35.0% (95%CI 29.0;40.0, I2 = 83.9%) in the adult Brazilian population. CONCLUSIONS Despite differences in the methodologies of the included studies, the results of this systematic review indicate that a significant proportion of the adult Brazilian population self-medicates. It is suggested that future research projects that assess self-medication in Brazil standardize their methods. PMID:26083944
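The I2 statistic reported alongside the pooled 15-day prevalence quantifies between-study heterogeneity. As a generic illustration (standard formula, synthetic inputs, not the review's data), I2 can be computed from Cochran's Q with inverse-variance weights:

```python
def i_squared(estimates, variances):
    """Higgins' I^2 from study estimates and their variances.

    Uses fixed-effect inverse-variance weights; q is Cochran's Q and
    df = k - 1 its degrees of freedom.
    """
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, estimates))
    df = len(estimates) - 1
    return max(0.0, (q - df) / q) if q > 0 else 0.0

# three synthetic prevalence estimates with equal variances
het = i_squared([0.30, 0.35, 0.40], [0.001, 0.001, 0.001])  # I^2 = 0.6
```

An I2 around 84%, as reported in the review, indicates that most of the observed variation between study estimates reflects real heterogeneity rather than sampling error.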
Munday, Cathy; Domagalski, Joseph L.
2003-01-01
Evaluating the extent to which bias and variability affect the interpretation of ground- and surface-water data is necessary to meet the objectives of the National Water-Quality Assessment (NAWQA) Program. Quality-control samples used to evaluate the bias and variability include annual equipment blanks, field blanks, field matrix spikes, surrogates, and replicates. This report contains quality-control results for the constituents critical to the ground- and surface-water components of the Sacramento River Basin study unit of the NAWQA Program. A critical constituent is one that was detected frequently (more than 50 percent of the time in blank samples), was detected at amounts exceeding water-quality standards or goals, or was important for the interpretation of water-quality data. Quality-control samples were collected along with ground- and surface-water samples during the high intensity phase (cycle 1) of the Sacramento River Basin NAWQA beginning early in 1996 and ending in 1998. Ground-water field blanks indicated contamination of varying levels of significance when compared with concentrations detected in environmental ground-water samples for ammonia, dissolved organic carbon, aluminum, and copper. Concentrations of aluminum in surface-water field blanks were significant when compared with environmental samples. Field blank samples collected for pesticide and volatile organic compound analyses revealed no contamination in either ground- or surface-water samples that would affect the interpretation of environmental data, with the possible exception of the volatile organic compound trichloromethane (chloroform) in ground water. Replicate samples for ground water and surface water indicate that variability resulting from sample collection, processing, and analysis was generally low.
Some of the larger maximum relative percentage differences calculated for replicate samples occurred between samples having the lowest absolute concentration differences and(or) values near the reporting limit. Surrogate recoveries for pesticides analyzed by gas chromatography/mass spectrometry (GC/MS), pesticides analyzed by high performance liquid chromatography (HPLC), and volatile organic compounds in ground- and surface-water samples were within the acceptable limits of 70 to 130 percent, with median recovery values between 82 and 113 percent. The recovery percentages for surrogate compounds analyzed by HPLC had the highest standard deviation, 20 percent for ground-water samples and 16 percent for surface-water samples, and the lowest median values, 82 percent for ground-water samples and 91 percent for surface-water samples. Results were consistent with the recovery results described for the analytical methods. Field matrix spike recoveries for pesticide compounds analyzed using GC/MS in ground- and surface-water samples were comparable with published recovery data. Recoveries of carbofuran, a critical constituent in ground- and surface-water studies, and desethyl atrazine, a critical constituent in the ground-water study, could not be calculated because of problems with the analytical method. Recoveries of pesticides analyzed using HPLC in ground- and surface-water samples were generally low and comparable with published recovery data. Other methodological problems for HPLC analytes included nondetection of the spike compounds and estimated values of spike concentrations. Recovery of field matrix spikes for volatile organic compounds generally was within the acceptable range of 70 to 130 percent for both ground- and surface-water samples, with median recoveries from 62 to 127 percent. High or low recoveries could be related to errors in the field, such as double spiking or using spike solution past its expiration date, rather than problems during analysis.
The methodological changes in the field spike protocol during the course of the Sacramento River Basin study, which included decreasing the amount of spike solution
Evidence-based pharmacotherapy of post-traumatic stress disorder (PTSD).
Ipser, Jonathan C; Stein, Dan J
2012-07-01
Post-traumatic stress disorder (PTSD) is a prevalent and disabling disorder. Recognition of neurobiological abnormalities associated with this condition suggests the potential efficacy of medication in its treatment. Nevertheless, questions regarding the efficacy of medications remain, despite general endorsement by clinical practice guidelines of selective serotonin reuptake inhibitors (SSRIs) as first-line agents in treating PTSD. This paper reviews evidence from randomized controlled trials (RCTs) for the efficacy of acute and long-term pharmacotherapy for PTSD, including the treatment of refractory PTSD. In addition, we conducted a systematic meta-analysis to compare the efficacy of different medications in treating PTSD. The effects of methodological study features (including year of publication, duration, number of centres) and sample characteristics (proportion of combat veterans, gender composition) were also tested. The largest body of evidence for short- and long-term efficacy of medication currently exists for SSRIs, with promising initial findings for the selective noradrenergic reuptake inhibitor venlafaxine and the atypical antipsychotic risperidone. Treatment effect was predicted by number of centres and recency of the study, with little evidence that sample characteristics predicted response. Evidence for the effectiveness of benzodiazepines is lacking, despite their continued use in clinical practice. Finally, the α1 antagonist prazosin and the atypical antipsychotics show some efficacy in treatment-resistant PTSD. Adequately powered trials that are designed in accordance with best-practice guidelines are required to provide conclusive evidence of clinically relevant differences in efficacy between agents in treating PTSD, and to help estimate clinical and methodological predictors of treatment response.
A systematic review of the evidence base for telehospice.
Oliver, Debra Parker; Demiris, George; Wittenberg-Lyles, Elaine; Washington, Karla; Day, Tami; Novak, Hannah
2012-01-01
The use of telehealth technologies to overcome the geographic distances in the delivery of hospice care has been termed telehospice. Although telehospice research has been conducted over the last 10 years, little is known about the comprehensive findings within the field. The purpose of this systematic review was to focus on available research and answer the question, What is the state of the evidence related to telehospice services? The review was limited to studies that had been published in the English language and indexed between January 1, 2000 and March 23, 2010. Indexed databases, including PubMed and PsycINFO, were searched using specified key words. Only research published in peer-reviewed journals and reporting empirical data, rather than opinion or editorials, was included. A two-part scoring framework was modified and applied to assess the methodological rigor and pertinence of each study. Scoring criteria allowed the evaluation of both quantitative and qualitative methodologies. Twenty-six studies were identified with the search strategy. Although limited in number and in strength, studies have evaluated the use of a variety of technologies, attitudes toward use by providers and consumers, clinical outcomes, barriers, readiness, and cost. A small evidence base for telehospice has emerged over the last 10 years. Although the evidence is of medium strength, its pertinence is strong. The evidence base could be strengthened with randomized trials and additional clinical-outcome-focused research in larger randomized samples and in qualitative studies with better-described samples.
Use of a machine learning framework to predict substance use disorder treatment success
Acion, Laura; Kelmansky, Diana; van der Laan, Mark; Sahker, Ethan; Jones, DeShauna; Arndt, Stephan
2017-01-01
There are several methods for building prediction models. The wealth of currently available modeling techniques usually forces the researcher to judge, a priori, what will likely be the best method. Super learning (SL) is a methodology that facilitates this decision by combining all identified prediction algorithms pertinent for a particular prediction problem. SL generates a final model that is at least as good as any of the other models considered for predicting the outcome. The overarching aim of this work is to introduce SL to analysts and practitioners. This work compares the performance of logistic regression, penalized regression, random forests, deep learning neural networks, and SL to predict successful substance use disorders (SUD) treatment. A nationwide database including 99,013 SUD treatment patients was used. All algorithms were evaluated using the area under the receiver operating characteristic curve (AUC) in a test sample that was not included in the training sample used to fit the prediction models. AUC for the models ranged between 0.793 and 0.820. SL was superior to all but one of the algorithms compared. An explanation of SL steps is provided. SL is the first step in targeted learning, an analytic framework that yields double robust effect estimation and inference with fewer assumptions than the usual parametric methods. Different aspects of SL depending on the context, its function within the targeted learning framework, and the benefits of this methodology in the addiction field are discussed. PMID:28394905
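The core of SL is choosing, by held-out performance, the best combination of base learners. The toy sketch below illustrates only that idea on invented scores (a real super learner uses cross-validation and a full library of algorithms, and the data here are not the SUD database): it computes AUC via the Mann-Whitney rank identity and picks the convex-combination weight of two base predictors that maximizes it on a validation fold.

```python
def auc(scores, labels):
    """AUC via the Mann-Whitney identity: the probability that a random
    positive is scored above a random negative (ties count one half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 0, 0, 0]
model_a = [0.9, 0.4, 0.8, 0.5, 0.2, 0.1]  # validation scores, learner A
model_b = [0.6, 0.7, 0.5, 0.4, 0.6, 0.3]  # validation scores, learner B

# grid-search the convex weight w over the validation fold
best_w, best_auc = max(
    (
        (w / 10, auc([w / 10 * a + (1 - w / 10) * b
                      for a, b in zip(model_a, model_b)], labels))
        for w in range(11)
    ),
    key=lambda t: t[1],
)
```

In this contrived example neither base learner alone ranks all positives above all negatives, but a weighted blend does, which is the sense in which the combined learner is "at least as good as" any single candidate.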
Literature Review of Research on Chronic Pain and Yoga in Military Populations
Miller, Shari; Gaylord, Susan; Buben, Alex; Brintz, Carrie; Rae Olmsted, Kristine; Asefnia, Nakisa; Bartoszek, Michael
2017-01-01
Background: Although yoga is increasingly being provided to active duty soldiers and veterans, studies with military populations are limited and effects on chronic pain are largely unknown. We reviewed the existing body of literature and provide recommendations for future research. Methods: We conducted a literature review of electronic databases (PubMed, PsycINFO, Web of Science, Science Citation Index Expanded, Social Sciences Citation Index, Conference Proceedings Citation Index—Science, and Conference Proceedings Citation Index—Social Science & Humanities). The studies were reviewed for characteristics such as mean age of participants, sample size, yoga type, and study design. Only peer-reviewed studies were included in the review. Results: The search yielded only six studies that examined pain as an outcome of yoga for military populations. With one exception, studies were with veteran populations. Only one study was conducted with Operation Enduring Freedom (OEF) or Operation Iraqi Freedom (OIF) veterans. One study was a randomized controlled trial (RCT). Four of the remaining five studies used a pre/post design, while the last used a post-only design. Conclusions: Studies on the use of yoga to treat chronic pain in military populations are in their infancy. Methodological weaknesses include small sample sizes, a lack of studies with key groups (active duty and OEF/OIF veterans), and use of single-group uncontrolled designs (pre/post; post only) for all but one study. Future research is needed to address these methodological limitations and build on this small body of literature. PMID:28930278
Analytical methodologies for broad metabolite coverage of exhaled breath condensate.
Aksenov, Alexander A; Zamuruyev, Konstantin O; Pasamontes, Alberto; Brown, Joshua F; Schivo, Michael; Foutouhi, Soraya; Weimer, Bart C; Kenyon, Nicholas J; Davis, Cristina E
2017-09-01
Breath analysis has been gaining popularity as a non-invasive technique that is amenable to a broad range of medical uses. One of the persistent problems hampering the wide application of breath analysis is measurement variability of metabolite abundances stemming from differences in both the sampling and analysis methodologies used in various studies. Mass spectrometry has been a method of choice for comprehensive metabolomic analysis. In the present study, for the first time, we juxtapose the most commonly employed mass spectrometry-based analysis methodologies and directly compare the resultant coverages of detected compounds in exhaled breath condensate in order to guide methodology choices for exhaled breath condensate analysis studies. Four methods were explored to broaden the range of measured compounds across both the volatile and non-volatile domains. Liquid phase sampling with a polyacrylate Solid-Phase MicroExtraction (SPME) fiber, liquid phase extraction with a polydimethylsiloxane patch, and headspace sampling using Carboxen/polydimethylsiloxane SPME, each followed by gas chromatography mass spectrometry, were tested for the analysis of the volatile fraction. Hydrophilic interaction liquid chromatography and reversed-phase high performance liquid chromatography mass spectrometry were used for analysis of the non-volatile fraction. We found that liquid phase breath condensate extraction was notably superior to headspace extraction, and differences in the sorbents employed yielded altered metabolite coverages. The most pronounced effect was substantially enhanced capture of larger, higher-boiling compounds using polyacrylate SPME liquid phase sampling. The analysis of the non-volatile fraction of breath condensate by hydrophilic interaction and reversed-phase high performance liquid chromatography mass spectrometry indicated orthogonal metabolite coverage by these chromatography modes.
We found that the metabolite coverage could be enhanced significantly with the use of an organic solvent as a device rinse after breath sampling to collect the non-aqueous fraction, as opposed to the neat breath condensate sample. Here, we show the detected ranges of compounds in each case and provide a practical guide for methodology selection for optimal detection of specific compounds. Copyright © 2017 Elsevier B.V. All rights reserved.
Cuesta, D; Varela, M; Miró, P; Galdós, P; Abásolo, D; Hornero, R; Aboy, M
2007-07-01
Body temperature is a classical diagnostic tool for a number of diseases. However, it is usually employed as a plain binary classification function (febrile or not febrile), and therefore its diagnostic power has not been fully developed. In this paper, we describe how body temperature regularity can be used for diagnosis. Our proposed methodology is based on obtaining accurate long-term temperature recordings at high sampling frequencies and analyzing the temperature signal using a regularity metric (approximate entropy). In this study, we assessed our methodology using temperature registers acquired from patients with multiple organ failure admitted to an intensive care unit. Our results indicate there is a correlation between the patient's condition and the regularity of the body temperature. This finding enabled us to design a classifier for two outcomes (survival or death) and test it on a dataset including 36 subjects. The classifier achieved an accuracy of 72%.
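Approximate entropy, the regularity metric used above, compares how often patterns of length m and m+1 repeat within a tolerance r: regular signals repeat predictably and score near zero, irregular ones score higher. A compact O(n^2) Python sketch, run on synthetic series rather than the patients' temperature recordings:

```python
import math
import random

def approximate_entropy(series, m=2, r=0.2):
    """ApEn(m, r) in the sense of Pincus: phi(m) - phi(m + 1), where phi
    measures the average log-frequency of template matches within
    Chebyshev tolerance r. Here r is absolute; in practice it is often
    set to a fraction of the series' standard deviation."""
    def phi(m):
        n = len(series) - m + 1
        templates = [series[i:i + m] for i in range(n)]
        log_freqs = []
        for t1 in templates:
            matches = sum(
                max(abs(a - b) for a, b in zip(t1, t2)) <= r
                for t2 in templates
            )  # self-match included, so matches >= 1 and log is safe
            log_freqs.append(math.log(matches / n))
        return sum(log_freqs) / n
    return phi(m) - phi(m + 1)

random.seed(0)
ap_regular = approximate_entropy([0.0, 1.0] * 50)                   # periodic
ap_irregular = approximate_entropy([random.random() for _ in range(100)])
```

The strictly periodic series scores near zero while the random one scores much higher, which is the contrast the classifier in the study exploits between stable and deteriorating patients.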
Gendre, Laura; Marchante, Veronica; Abhyankar, Hrushikesh A; Blackburn, Kim; Temple, Clive; Brighton, James L
2016-01-01
This work focuses on the release of nanoparticles from commercially used nanocomposites during machining operations. A reliable and repeatable method was developed to assess the unintentional exposure to nanoparticles, in particular during drilling. This article presents the description and validation of results obtained from a new prototype used for the measurement and monitoring of nanoparticles in a controlled environment. This methodology was compared with the methodologies applied in other studies. Also, some preliminary experiments on drilling nanocomposites are included. Size, shape and chemical composition of the released nanoparticles were investigated in order to understand their hazard potential. No significant differences were found in the amount of nanoparticles released between samples with and without nanoadditives. Also, no chemical alteration was observed between the dust generated and the bulk material. Finally, further developments of the prototype are proposed.
Determination of element affinities by density fractionation of bulk coal samples
Querol, X.; Klika, Z.; Weiss, Z.; Finkelman, R.B.; Alastuey, A.; Juan, R.; Lopez-Soler, A.; Plana, F.; Kolker, A.; Chenery, S.R.N.
2001-01-01
A review has been made of the various methods of determining major and trace element affinities for different phases, both mineral and organic in coals, citing their various strengths and weaknesses. These include mathematical deconvolution of chemical analyses, direct microanalysis, sequential extraction procedures and density fractionation. A new methodology combining density fractionation with mathematical deconvolution of chemical analyses of whole coals and their density fractions has been evaluated. These coals formed part of the IEA-Coal Research project on the Modes of Occurrence of Trace Elements in Coal. Results were compared to a previously reported sequential extraction methodology and showed good agreement for most elements. For particular elements (Be, Mo, Cu, Se and REEs) in specific coals where disagreement was found, it was concluded that the occurrence of rare trace element bearing phases may account for the discrepancy, and modifications to the general procedure must be made to account for these.
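One common deconvolution step in affinity studies (a generic illustration, not necessarily this study's exact procedure) is to regress an element's concentration in each density fraction on the fraction's ash yield: the slope reflects mineral association, and the intercept extrapolated to zero ash approximates the organically bound concentration. A sketch with hypothetical fractions:

```python
# Sketch: affinity deconvolution by regression on ash yield.
# Fraction data below are invented for illustration.

def linear_fit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

ash_pct = [5.0, 15.0, 30.0, 55.0]   # ash yield of density fractions (%)
conc = [1.2, 2.9, 5.3, 9.6]         # element concentration in fraction (ppm)

slope, organic_bound = linear_fit(ash_pct, conc)
# positive slope -> mineral affinity dominates; the intercept (~0.35 ppm)
# approximates the organically associated share at zero ash
```

A rare trace-element-bearing phase concentrated in one fraction would break this linear trend, which is consistent with the discrepancies the authors report for Be, Mo, Cu, Se and the REEs.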
Exercise as Treatment for Anxiety: Systematic Review and Analysis
Stonerock, Gregory L.; Hoffman, Benson M.; Smith, Patrick J.; Blumenthal, James A.
2015-01-01
Background Exercise has been shown to reduce symptoms of anxiety, but few studies have examined exercise in individuals pre-selected because of their high anxiety. Purpose To review and critically evaluate studies of exercise training in adults with either high levels of anxiety or an anxiety disorder. Methods We conducted a systematic review of randomized clinical trials (RCTs) in which anxious adults were randomized to an exercise or non-exercise control condition. Data were extracted concerning anxiety outcomes and study design. Existing meta-analyses were also reviewed. Results Evidence from 12 RCTs suggested benefits of exercise, for select groups, similar to established treatments and greater than placebo. However, most studies had significant methodological limitations, including small sample sizes, concurrent therapies, and inadequate assessment of adherence and fitness levels. Conclusions Exercise may be a useful treatment for anxiety, but lack of data from rigorous, methodologically sound RCTs precludes any definitive conclusions about its effectiveness. PMID:25697132
Stakeholder analysis methodologies resource book
DOE Office of Scientific and Technical Information (OSTI.GOV)
Babiuch, W.M.; Farhar, B.C.
1994-03-01
Stakeholder analysis allows analysts to identify how parties might be affected by government projects. This process involves identifying the likely impacts of a proposed action and the stakeholder groups affected by that action. Additionally, the process involves assessing how these groups might be affected and suggesting measures to mitigate any adverse effects. Evidence suggests that the efficiency and effectiveness of government actions can be increased and adverse social impacts mitigated when officials understand how a proposed action might affect stakeholders. This report discusses how to conduct useful stakeholder analyses for government officials making decisions on energy-efficiency and renewable-energy technologies and their commercialization. It discusses methodological issues that may affect the validity and reliability of findings, including sampling, generalizability, validity, "uncooperative" stakeholder groups, using social indicators, and the effect of government regulations. The Appendix contains resource directories and a list of specialists in stakeholder analysis and involvement.
Challenges and perspectives in quantitative NMR.
Giraudeau, Patrick
2017-01-01
This perspective article summarizes, from the author's point of view at the beginning of 2016, the major challenges and perspectives in the field of quantitative NMR. The key concepts in quantitative NMR are first summarized; then, the most recent evolutions in terms of resolution and sensitivity are discussed, as well as some potential future research directions in this field. A particular focus is made on methodologies capable of boosting the resolution and sensitivity of quantitative NMR, which could open application perspectives in fields where the sample complexity and the analyte concentrations are particularly challenging. These include multi-dimensional quantitative NMR and hyperpolarization techniques such as para-hydrogen-induced polarization or dynamic nuclear polarization. Because quantitative NMR cannot be dissociated from the key concepts of analytical chemistry, i.e. trueness and precision, the methodological developments are systematically described together with their level of analytical performance. Copyright © 2016 John Wiley & Sons, Ltd.
Domínguez, Marina A; Grünhut, Marcos; Pistonesi, Marcelo F; Di Nezio, María S; Centurión, María E
2012-05-16
An automatic flow-batch system that includes two borosilicate glass chambers to perform sample digestion and cold vapor atomic absorption spectroscopy determination of mercury in honey samples was designed. The sample digestion was performed by using a low-cost halogen lamp to obtain the optimum temperature. Optimization of the digestion procedure was done using a Box-Behnken experimental design. A linear response was observed from 2.30 to 11.20 μg Hg L(-1). The relative standard deviation was 3.20% (n = 11, 6.81 μg Hg L(-1)), the sample throughput was 4 samples h(-1), and the detection limit was 0.68 μg Hg L(-1). The results obtained with the flow-batch method are in good agreement with those obtained with the reference method. The flow-batch system is simple, allows the use of both chambers simultaneously, is seen as a promising methodology for achieving green chemistry goals, and is a good proposal for improving the quality control of honey.
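A detection limit like the one reported is conventionally derived from a calibration line, for instance via the common 3-sigma criterion LOD = 3·s_blank / slope (one standard approach, not necessarily the authors' exact procedure). A sketch with invented absorbance readings over the paper's working range:

```python
# Sketch: least-squares calibration and a 3*sigma/slope detection limit.
# Absorbance readings and blank SD are hypothetical, not the paper's data.

def linear_fit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

conc = [2.30, 4.00, 6.00, 8.00, 11.20]            # ug Hg / L standards
absorbance = [0.023, 0.040, 0.061, 0.079, 0.112]  # hypothetical readings

slope, intercept = linear_fit(conc, absorbance)
blank_sd = 0.0023                                 # hypothetical blank SD
lod = 3 * blank_sd / slope                        # 3-sigma detection limit
```

With these made-up inputs the limit comes out near 0.7 μg Hg L(-1), the same order as the study's reported 0.68 μg Hg L(-1).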
Methodological Issues in Curriculum-Based Reading Assessment.
ERIC Educational Resources Information Center
Fuchs, Lynn S.; And Others
1984-01-01
Three studies involving elementary students examined methodological issues in curriculum-based reading assessment. Results indicated that (1) whereas sample duration did not affect concurrent validity, increasing duration reduced performance instability and increased performance slopes and (2) domain size was related inversely to performance slope…
Human Prenatal Effects: Methodological Problems and Some Suggested Solutions
ERIC Educational Resources Information Center
Copans, Stuart A.
1974-01-01
Briefly reviews the relevant literature on human prenatal effects, describes some of the possible designs for such studies, and discusses some of the methodological problem areas: sample choice, measurement of prenatal variables, monitoring of labor and delivery, and neonatal assessment. (CS)
1998 motor vehicle occupant safety survey. Volume 1, methodology report
DOT National Transportation Integrated Search
2000-03-01
This is the Methodology Report for the 1998 Motor Vehicle Occupant Safety Survey. The survey is conducted on a biennial basis (initiated in 1994), and is administered by telephone to a randomly selected national sample. Two questionnaires are used, e...
Evaluation of bias and logistics in a survey of adults at increased risk for oral health decrements.
Gilbert, G H; Duncan, R P; Kulley, A M; Coward, R T; Heft, M W
1997-01-01
Designing research to include sufficient respondents in groups at highest risk for oral health decrements can present unique challenges. Our purpose was to evaluate bias and logistics in this survey of adults at increased risk for oral health decrements. We used a telephone survey methodology that employed both listed numbers and random digit dialing to identify dentate persons 45 years old or older and to oversample blacks, poor persons, and residents of nonmetropolitan counties. At a second stage, a subsample of the respondents to the initial telephone screening was selected for further study, which consisted of a baseline in-person interview and a clinical examination. We assessed bias due to: (1) limiting the sample to households with telephones, (2) using predominantly listed numbers instead of random digit dialing, and (3) nonresponse at two stages of data collection. While this approach apparently created some biases in the sample, they were small in magnitude. Specifically, limiting the sample to households with telephones biased the sample overall toward more females, larger households, and fewer functionally impaired persons. Using predominantly listed numbers led to a modest bias toward selection of persons more likely to be younger, healthier, female, have had a recent dental visit, and reside in smaller households. Blacks who were selected randomly at a second stage were more likely to participate in baseline data gathering than their white counterparts. Comparisons of the data obtained in this survey with those from recent national surveys suggest that this methodology for sampling high-risk groups did not substantively bias the sample with respect to two important dental parameters, prevalence of edentulousness and dental care use, nor were conclusions about multivariate associations with dental care recency substantively affected. 
This method of sampling persons at high risk for oral health decrements resulted in only modest bias with respect to the population of interest.
Guglielminotti, Jean; Dechartres, Agnès; Mentré, France; Montravers, Philippe; Longrois, Dan; Laouénan, Cedric
2015-10-01
Prognostic research studies in anesthesiology aim to identify risk factors for an outcome (explanatory studies) or calculate the risk of this outcome on the basis of patients' risk factors (predictive studies). Multivariable models express the relationship between predictors and an outcome and are used in both explanatory and predictive studies. Model development demands a strict methodology and a clear reporting to assess its reliability. In this methodological descriptive review, we critically assessed the reporting and methodology of multivariable analysis used in observational prognostic studies published in anesthesiology journals. A systematic search was conducted on Medline through Web of Knowledge, PubMed, and journal websites to identify observational prognostic studies with multivariable analysis published in Anesthesiology, Anesthesia & Analgesia, British Journal of Anaesthesia, and Anaesthesia in 2010 and 2011. Data were extracted by 2 independent readers. First, studies were analyzed with respect to reporting of outcomes, design, size, methods of analysis, model performance (discrimination and calibration), model validation, clinical usefulness, and STROBE (i.e., Strengthening the Reporting of Observational Studies in Epidemiology) checklist. A reporting rate was calculated on the basis of 21 items of the aforementioned points. Second, they were analyzed with respect to some predefined methodological points. Eighty-six studies were included: 87.2% were explanatory and 80.2% investigated a postoperative event. The reporting was fairly good, with a median reporting rate of 79% (75% in explanatory studies and 100% in predictive studies). 
Six items had a reporting rate <36% (i.e., the 25th percentile), with some of them not identified in the STROBE checklist: blinded evaluation of the outcome (11.9%), reason for sample size (15.1%), handling of missing data (36.0%), assessment of colinearity (17.4%), assessment of interactions (13.9%), and calibration (34.9%). When reported, a few methodological shortcomings were observed, both in explanatory and predictive studies, such as an insufficient number of events of the outcome (44.6%), exclusion of cases with missing data (93.6%), or categorization of continuous variables (65.1%). The reporting of multivariable analysis was fairly good and could be further improved by consulting reporting guidelines and the EQUATOR Network website. Limiting the number of candidate variables, including cases with missing data, and not arbitrarily categorizing continuous variables should be encouraged.
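The 21-item reporting rate used in the review reduces to a simple proportion per study, with the median taken across studies. A minimal sketch, with hypothetical per-study counts (the actual item list and scores are in the paper):

```python
import statistics

CHECKLIST_ITEMS = 21  # number of reporting items scored per study in the review

def reporting_rate(items_reported):
    """Percentage of the checklist items a study reports."""
    return items_reported / CHECKLIST_ITEMS * 100

# Hypothetical counts of reported items for five studies
counts = [15, 17, 21, 12, 18]
median_rate = statistics.median(reporting_rate(n) for n in counts)
print(round(median_rate, 1))
```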
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lunden, Melissa; Faulkner, David; Heredia, Elizabeth
2012-10-01
This report documents experiments performed in three homes to assess the methodology used to determine air exchange rates using passive tracer techniques. The experiments used four different tracer gases emitted simultaneously but implemented with different spatial coverage in the home. Two different tracer gas sampling methods were used. The results characterize the factors of the execution and analysis of the passive tracer technique that affect the uncertainty in the calculated air exchange rates. These factors include uncertainties in tracer gas emission rates, differences in measured concentrations for different tracer gases, temporal and spatial variability of the concentrations, the comparison between different gas sampling methods, and the effect of different ventilation conditions.
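The report's analysis details are not given in this abstract, but a constant-injection passive tracer measurement ultimately reduces to a single-zone mass balance. A minimal steady-state sketch, with hypothetical values, assuming a well-mixed zone and a constant, known emission rate:

```python
def air_exchange_rate(emission_mg_per_h, avg_conc_mg_per_m3, volume_m3):
    """Steady-state single-zone estimate: ACH = E / (C * V).

    Uncertainty in any of the three inputs (emission rate, measured
    concentration, zone volume) propagates directly into the result,
    which is the kind of sensitivity the experiments above characterize.
    """
    return emission_mg_per_h / (avg_conc_mg_per_m3 * volume_m3)

# Hypothetical: 50 mg/h tracer emission, 0.25 mg/m3 average, 400 m3 home
print(air_exchange_rate(50.0, 0.25, 400.0))  # 0.5 air changes per hour
```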
Relationships between Death Anxiety and Quality of Life in Iranian Patients with Cancer
Soleimani, Mohammad A.; Lehto, Rebecca H.; Negarandeh, Reza; Bahrami, Nasim; Nia, Hamid Sharif
2016-01-01
Objective: The purpose of the study was to examine relationships between death anxiety and quality of life (QOL) parameters of patients with cancer in the Iranian sociocultural context. Methods: A descriptive, correlational methodology was used. The sample included 330 patients. Demographics, health information, religious behaviors, death anxiety, and QOL data were collected. Results: Overall death anxiety levels were moderate with satisfactory overall QOL. Death anxiety was predictive of lowered QOL. Female patients had lower QOL and higher death anxiety compared to men. Conclusions: Findings support that higher death anxiety negatively impacts QOL in an Iranian sample with cancer. Alleviation of existential concerns in vulnerable patients may palliate mental health distress associated with facing cancer and its challenging treatments. PMID:27981157
Framework for managing mycotoxin risks in the food industry.
Baker, Robert C; Ford, Randall M; Helander, Mary E; Marecki, Janusz; Natarajan, Ramesh; Ray, Bonnie
2014-12-01
We propose a methodological framework for managing mycotoxin risks in the food processing industry. Mycotoxin contamination is a well-known threat to public health that has economic significance for the food processing industry; it is imperative to address mycotoxin risks holistically, at all points in the procurement, processing, and distribution pipeline, by tracking the relevant data, adopting best practices, and providing suitable adaptive controls. The proposed framework includes (i) an information and data repository, (ii) a collaborative infrastructure with analysis and simulation tools, (iii) standardized testing and acceptance sampling procedures, and (iv) processes that link the risk assessments and testing results to the sourcing, production, and product release steps. The implementation of suitable acceptance sampling protocols for mycotoxin testing is considered in some detail.
Modeling Payload Stowage Impacts on Fire Risks On-Board the International Space Station
NASA Technical Reports Server (NTRS)
Anton, Kellie E.; Brown, Patrick F.
2010-01-01
The purpose of this presentation is to determine the risks of fire on-board the ISS due to non-standard stowage. ISS stowage is constantly being reexamined for optimality. Non-standard stowage involves stowing items outside of rack drawers; fire risk is a key concern and is heavily mitigated. A methodology is needed to capture and account for the fire risk introduced by non-standard stowage. The contents include: 1) Fire Risk Background; 2) General Assumptions; 3) Modeling Techniques; 4) Event Sequence Diagram (ESD); 5) Qualitative Fire Analysis; 6) Sample Qualitative Results for Fire Risk; 7) Qualitative Stowage Analysis; 8) Sample Qualitative Results for Non-Standard Stowage; and 9) Quantitative Analysis Basic Event Data.
Why minimally invasive skin sampling techniques? A bright scientific future.
Wang, Christina Y; Maibach, Howard I
2011-03-01
There is increasing interest in minimally invasive skin sampling techniques to assay markers of molecular biology and biochemical processes. This overview examines methodology strengths and limitations, and exciting developments pending in the scientific community. Publications were searched via PubMed, the U.S. Patent and Trademark Office Website, the DermTech Website and the CuDerm Website. The keywords used were noninvasive skin sampling, skin stripping, skin taping, detergent method, ring method, mechanical scrub, reverse iontophoresis, glucose monitoring, buccal smear, hair root sampling, mRNA, DNA, RNA, and amino acid. There is strong interest in finding methods to access internal biochemical, molecular, and genetic processes through noninvasive and minimally invasive external means. Minimally invasive techniques include the widely used skin tape stripping, the abrasion method that includes scraping and detergent, and reverse iontophoresis. The first 2 methods harvest largely the stratum corneum. Hair root sampling (material deeper than the epidermis), buccal smear, shave biopsy, punch biopsy, and suction blistering are also methods used to obtain cellular material for analysis, but involve some degree of increased invasiveness and thus are only briefly mentioned. Existing and new sampling methods are being refined and validated, offering exciting, different noninvasive means of quickly and efficiently obtaining molecular material with which to monitor bodily functions and responses, assess drug levels, and follow disease processes without subjecting patients to unnecessary discomfort and risk.
Epistemological Issues in Astronomy Education Research: How Big of a Sample is "Big Enough"?
NASA Astrophysics Data System (ADS)
Slater, Stephanie; Slater, T. F.; Souri, Z.
2012-01-01
As astronomy education research (AER) continues to evolve into a sophisticated enterprise, we must begin to grapple with defining our epistemological parameters. Moreover, as we attempt to make pragmatic use of our findings, we must make a concerted effort to communicate those parameters in a sensible way to the larger astronomical community. One area of much current discussion involves the methodologies, and subsequent sample sizes, that should be considered appropriate for generating knowledge in the field. To address this question, we completed a meta-analysis of nearly 1,000 peer-reviewed studies published in top-tier professional journals. Data related to methodologies and sample sizes were collected from "hard science" and "human science" journals to compare the epistemological systems of these two bodies of knowledge. Working back in time from August 2011, the 100 most recent studies reported in each journal were used as a data source: Icarus, ApJ and AJ, NARST, IJSE, and SciEd. In addition, data were collected from the 10 most recent AER dissertations, a set of articles determined by the science education community to be the most influential in the field, and the nearly 400 articles used as reference materials for the NRC's Taking Science to School. Analysis indicates these bodies of knowledge have a great deal in common: each relies on a wide variety of methodologies, and each builds its knowledge through studies with surprisingly small sample sizes. While both fields publish a small percentage of studies with large sample sizes, the vast majority of top-tier publications consist of rich studies of a small number of objects. We conclude that rigor in each field is determined not by a circumscription of methodologies and sample sizes, but by peer judgments that the methods and sample sizes are appropriate to the research question.