Q-Sample Construction: A Critical Step for a Q-Methodological Study.
Paige, Jane B; Morin, Karen H
2016-01-01
Q-sample construction is a critical step in Q-methodological studies. Prior to conducting Q-studies, researchers start with a population of opinion statements (concourse) on a particular topic of interest from which a sample is drawn. These sampled statements are known as the Q-sample. Although literature exists on methodological processes to conduct Q-methodological studies, limited guidance exists on the practical steps to reduce the population of statements to a Q-sample. A case exemplar illustrates the steps to construct a Q-sample in preparation for a study that explored the perspectives nurse educators and nursing students hold about simulation design. Experts in simulation and Q-methodology evaluated the Q-sample for readability, clarity, and representativeness of the opinions contained within the concourse. The Q-sample was piloted, and feedback resulted in statement refinement. Researchers, especially those undertaking Q-method studies for the first time, may benefit from the practical considerations for constructing a Q-sample offered in this article. © The Author(s) 2014.
CBT Specific Process in Exposure-Based Treatments: Initial Examination in a Pediatric OCD Sample
Benito, Kristen Grabill; Conelea, Christine; Garcia, Abbe M.; Freeman, Jennifer B.
2012-01-01
Cognitive-Behavioral theory and empirical support suggest that optimal activation of fear is a critical component for successful exposure treatment. Using this theory, we developed coding methodology for measuring CBT-specific process during exposure. We piloted this methodology in a sample of young children (N = 18) who previously received CBT as part of a randomized controlled trial. Results supported the preliminary reliability and predictive validity of coding variables with 12 week and 3 month treatment outcome data, generally showing results consistent with CBT theory. However, given our limited and restricted sample, additional testing is warranted. Measurement of CBT-specific process using this methodology may have implications for understanding mechanism of change in exposure-based treatments and for improving dissemination efforts through identification of therapist behaviors associated with improved outcome. PMID:22523609
NASA Astrophysics Data System (ADS)
Chen, Daniel; Chen, Damian; Yen, Ray; Cheng, Mingjen; Lan, Andy; Ghaskadvi, Rajesh
2008-11-01
Identifying hotspots (structures that limit the lithography process window) becomes increasingly important as the industry relies heavily on RET to print sub-wavelength designs. KLA-Tencor's patented Process Window Qualification (PWQ) methodology has been used for this purpose in various fabs. The PWQ methodology has three key advantages: (a) the PWQ layout, to obtain the best sensitivity; (b) Design Based Binning, for pattern repeater analysis; and (c) intelligent sampling, for the best DOI sampling rate. This paper evaluates two different analysis strategies for SEM review sampling successfully deployed at Inotera Memories, Inc. We propose a new approach combining location repeaters and pattern repeaters. Based on a recent case study, the new sampling flow reduces the data analysis and sampling time from 6 hours to 1.5 hours while maintaining the maximum DOI sample rate.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-03
... determine endpoints; questionnaire design and analyses; and presentation of survey results. To date, FDA has..., the workshop will invest considerable time in identifying best methodological practices for conducting... sample, sample size, question design, process, and endpoints. Panel 2 will focus on alternatives to...
Sampling methods to the statistical control of the production of blood components.
Pereira, Paulo; Seghatchian, Jerard; Caldeira, Beatriz; Santos, Paula; Castro, Rosa; Fernandes, Teresa; Xavier, Sandra; de Sousa, Gracinda; de Almeida E Sousa, João Paulo
2017-12-01
The control of blood component specifications is a requirement generalized in Europe by the European Commission Directives and in the US by the AABB standards. The use of a statistical process control methodology is recommended in the related literature, including the EDQM guideline. The reliability of this control depends on the sampling. However, a correct sampling methodology does not seem to be systematically applied. Commonly, the sampling is intended only to comply with the 1% specification for the produced blood components. Nevertheless, from a purely statistical viewpoint, this model is arguably not related to a consistent sampling technique. This could be a severe limitation in detecting abnormal patterns and in assuring that the production has a non-significant probability of producing nonconforming components. This article discusses what is happening in blood establishments. Three statistical methodologies are proposed: simple random sampling, sampling based on the proportion of a finite population, and sampling based on the inspection level. The empirical results demonstrate that these models are practicable in blood establishments, contributing to the robustness of sampling and of the related statistical process control decisions for the purpose for which they are suggested. Copyright © 2017 Elsevier Ltd. All rights reserved.
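As a rough illustration of the second proposed approach (sampling based on the proportion of a finite population), the sketch below computes a sample size with the standard finite population correction. The lot size, target proportion and margin are illustrative assumptions, not figures from the article.

```python
import math

def sample_size_finite_population(N, p=0.01, margin=0.01, z=1.96):
    """Sample size for estimating a proportion p in a finite lot of N units,
    within a given margin of error, using the finite population correction.
    Illustrative only; the article's exact formulas are not reproduced here."""
    n0 = z**2 * p * (1 - p) / margin**2        # infinite-population sample size
    return math.ceil(n0 / (1 + (n0 - 1) / N))  # finite population correction

# e.g. a month's production of 3000 red cell concentrates,
# targeting the 1% nonconformity specification within +/-1%
print(sample_size_finite_population(N=3000, p=0.01, margin=0.01))
```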
Heterogenic Solid Biofuel Sampling Methodology and Uncertainty Associated with Prompt Analysis
Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Patiño, David; Collazo, Joaquín
2010-01-01
Accurate determination of the properties of biomass is of particular interest in studies on biomass combustion or cofiring. The aim of this paper is to develop a methodology for prompt analysis of heterogeneous solid fuels with an acceptable degree of accuracy. Special care must be taken with the sampling procedure to achieve an acceptable degree of error and low statistical uncertainty. A sampling and error determination methodology for prompt analysis is presented and validated. Two approaches for the propagation of errors are also given and some comparisons are made in order to determine which may be better in this context. Results show in general low, acceptable levels of uncertainty, demonstrating that the samples obtained in the process are representative of the overall fuel composition. PMID:20559506
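The two error-propagation approaches mentioned above can be contrasted with a toy calculation. The sketch below compares first-order (quadrature) propagation against a Monte Carlo estimate for a fixed-carbon-by-difference quantity; the relation and all numerical values are assumed for illustration and are not taken from the paper.

```python
import numpy as np

# Measured prompt-analysis fractions (wt%) with their standard uncertainties.
# Values are illustrative placeholders, not the paper's data.
moisture, u_m = 8.2, 0.4
volatiles, u_v = 72.5, 1.1
ash, u_a = 1.9, 0.3

# Approach 1: first-order propagation, uncertainties added in quadrature
fixed_carbon = 100.0 - moisture - volatiles - ash
u_fc_linear = np.sqrt(u_m**2 + u_v**2 + u_a**2)

# Approach 2: Monte Carlo propagation
rng = np.random.default_rng(0)
samples = 100.0 - (rng.normal(moisture, u_m, 100_000)
                   + rng.normal(volatiles, u_v, 100_000)
                   + rng.normal(ash, u_a, 100_000))
print(f"fixed carbon = {fixed_carbon:.1f} wt%, "
      f"u(linear) = {u_fc_linear:.2f}, u(MC) = {samples.std():.2f}")
```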
Soininen, Päivi; Putkonen, Hanna; Joffe, Grigori; Korkeila, Jyrki; Välimäki, Maritta
2014-06-04
Despite improvements in psychiatric inpatient care, patient restrictions in psychiatric hospitals are still in use. Studying perceptions among patients who have been secluded or physically restrained during their hospital stay is challenging. We sought to review the methodological and ethical challenges in qualitative and quantitative studies aiming to describe patients' perceptions of coercive measures, especially seclusion and physical restraints, during their hospital stay. A systematic mixed studies review was the study method. Studies reporting patients' perceptions of coercive measures, especially seclusion and physical restraints during hospital stay, were included. Methodological issues such as study design, data collection and recruitment process, participants, sampling, and patient refusal or non-participation, and ethical issues such as the informed consent process and approval, were synthesized systematically. Electronic searches of CINAHL, MEDLINE, PsycINFO and The Cochrane Library (1976-2012) were carried out. Out of 846 initial citations, 32 studies were included, 14 qualitative and 18 quantitative. A variety of methodological approaches were used, although descriptive and explorative designs were used in most cases. Data were mainly collected by interviews in the qualitative studies (n = 13) or by self-report questionnaires in the quantitative studies (n = 12). The recruitment process was explained in 59% (n = 19) of the studies. In most cases convenience sampling was used, yet five studies used randomization. Patients' refusal or non-participation was reported in 37% (n = 11) of the studies. Of all studies, 56% (n = 18) reported having undergone an ethical review process in an official board or committee. Respondents were informed and consent was requested in 69% of the studies (n = 22). The use of different study designs made comparison methodologically challenging. The timing of data collection (considering bias and confounding factors) and the reasons for non-participation of eligible participants are likewise methodological challenges; recommended flow charts could aid in reporting this information. Other challenges identified were the recruitment of large and representative samples. Ethical challenges included requesting participants' informed consent and respecting ethical procedures.
Šumić, Zdravko; Vakula, Anita; Tepić, Aleksandra; Čakarević, Jelena; Vitas, Jasmina; Pavlić, Branimir
2016-07-15
Fresh red currants were dried by a vacuum drying process under different drying conditions. A Box-Behnken experimental design with response surface methodology was used for optimization of the drying process in terms of physical (moisture content, water activity, total color change, firmness and rehydration power) and chemical (total phenols, total flavonoids, monomeric anthocyanins and ascorbic acid content and antioxidant activity) properties of the dried samples. Temperature (48-78 °C), pressure (30-330 mbar) and drying time (8-16 h) were investigated as independent variables. Experimental results were fitted to a second-order polynomial model where regression analysis and analysis of variance were used to determine model fitness and optimal drying conditions. The optimal conditions of the simultaneously optimized responses were a temperature of 70.2 °C, a pressure of 39 mbar and a drying time of 8 h. It could be concluded that vacuum drying provides samples with good physico-chemical properties, similar to the lyophilized sample and better than the conventionally dried sample. Copyright © 2016 Elsevier Ltd. All rights reserved.
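To make the Box-Behnken/response-surface step concrete, the following sketch fits a second-order polynomial to a three-factor Box-Behnken design and locates the fitted optimum inside the coded design cube. The design runs, response values and factor coding are placeholders, not the study's data.

```python
import numpy as np
from scipy.optimize import minimize

# X: coded factor settings (temperature, pressure, time) of a 3-factor Box-Behnken
# design (12 edge midpoints + 3 center points); y: a measured response.
X = np.array([[-1,-1, 0], [ 1,-1, 0], [-1, 1, 0], [ 1, 1, 0],
              [-1, 0,-1], [ 1, 0,-1], [-1, 0, 1], [ 1, 0, 1],
              [ 0,-1,-1], [ 0, 1,-1], [ 0,-1, 1], [ 0, 1, 1],
              [ 0, 0, 0], [ 0, 0, 0], [ 0, 0, 0]], dtype=float)
y = np.array([55, 61, 52, 66, 58, 63, 54, 60, 57, 59, 56, 62, 68, 67, 69], dtype=float)

def second_order_terms(x):
    x1, x2, x3 = x
    return np.array([1, x1, x2, x3, x1*x1, x2*x2, x3*x3, x1*x2, x1*x3, x2*x3])

A = np.array([second_order_terms(row) for row in X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)     # second-order regression coefficients

# Locate the factor combination maximizing the fitted response within the design cube
res = minimize(lambda x: -(second_order_terms(x) @ beta),
               x0=np.zeros(3), bounds=[(-1, 1)] * 3)
print("coded optimum:", res.x.round(2), "predicted response:", round(float(-res.fun), 1))
```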
Muñoz-Colmenero, Marta; Martínez, Jose Luis; Roca, Agustín; Garcia-Vazquez, Eva
2017-01-01
Next Generation Sequencing methodologies are considered the next step within DNA-based methods, and their applicability in different fields is being evaluated. Here, we tested the usefulness of the Ion Torrent Personal Genome Machine (PGM) in food traceability, analyzing candies as a model of highly processed foods, and compared the results with those obtained by PCR-cloning-sequencing (PCR-CS). The majority of samples exhibited consistency between methodologies, yielding more information and species per product from the PGM platform than from PCR-CS. Significantly higher AT-content in sequences of the same species was also obtained from PGM. This, together with some taxonomical discrepancies between methodologies, suggests that the PGM platform is still premature for use in food traceability of complex, highly processed products. It could be a good option for analysis of less complex food, saving time and cost per sample. Copyright © 2016 Elsevier Ltd. All rights reserved.
Choi, Byungchuel; Lee, Nan Bok
2014-01-01
Developed by Helen Bonny, Guided Imagery and Music (BMGIM) has mainly been used to assist people with mental health issues. In order to provide clients with the most effective therapy, we need to examine the BMGIM process from the clients' perspective rather than the therapists'. Understanding the types and characteristics of clients' experiences within the BMGIM process would be helpful to therapists. In order to assess clients' experiences more objectively, a different research methodology is needed to measure and compare the perspectives of clients in the BMGIM process. The purpose of this study was to identify the types and characteristics of perceptions of the BMGIM experience in clients with mental health problems. Q methodology was used to characterize client BMGIM perceptions. Scores from the Q samples were coded into Q-sample scores in order to calculate the Q sorts collected from a P sample of 20 participants. Participants were involved in the Q sorting as both Q sorters and the P sample. Q factor analysis was conducted using the QUANL program. The types and characteristics of the participants' perceptions were analyzed for three segments of the BMGIM session. From the factor analysis, (a) two factors were identified in the before-music-experience segment, (b) three factors in the during-music-experience segment, and (c) three factors in the after-music-experience segment. Factors that intervened in the therapeutic process of BMGIM were obtained from participants' direct GIM experiences. Knowledge of the types and characteristics of participants' perceptions of the GIM process will help therapists deliver more effective therapeutic interventions. Q methodology may also contribute to gaining a better understanding of the BMGIM process. © the American Music Therapy Association 2014. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
[Qualitative research methodology in health care].
Bedregal, Paula; Besoain, Carolina; Reinoso, Alejandro; Zubarew, Tamara
2017-03-01
Health care research requires different methodological approaches, such as qualitative and quantitative analyses, to understand the phenomena under study. Qualitative research is usually the least considered. Central elements of the qualitative method are that the object of study is constituted by perceptions, emotions and beliefs, non-random purposive sampling, a circular process of knowledge construction, and methodological rigor throughout the research process, from the quality of the design to the consistency of results. The objective of this work is to contribute to methodological knowledge about qualitative research in health services, based on the implementation of the study "The transition process from pediatric to adult services: perspectives from adolescents with chronic diseases, caregivers and health professionals". The information gathered through the qualitative methodology facilitated the understanding of critical points, barriers and facilitators of the transition process of adolescents with chronic diseases, considering the perspective of users and the health team. This study allowed the design of a model for transition from pediatric to adult health services based on the needs of adolescents with chronic diseases, their caregivers and the health team.
LC-MS based analysis of endogenous steroid hormones in human hair.
Gao, Wei; Kirschbaum, Clemens; Grass, Juliane; Stalder, Tobias
2016-09-01
The quantification of endogenous steroid hormone concentrations in hair is increasingly used as a method for obtaining retrospective information on long-term integrated hormone exposure. Several different analytical procedures have been employed for hair steroid analysis, with liquid chromatography-mass spectrometry (LC-MS) being recognized as a particularly powerful analytical tool. Several methodological aspects affect the performance of LC-MS systems for hair steroid analysis, including sample preparation and pretreatment, steroid extraction, post-incubation purification, LC methodology, ionization techniques and MS specifications. Here, we critically review the differential value of such protocol variants for hair steroid hormones analysis, focusing on both analytical quality and practical feasibility issues. Our results show that, when methodological challenges are adequately addressed, LC-MS protocols can not only yield excellent sensitivity and specificity but are also characterized by relatively simple sample processing and short run times. This makes LC-MS based hair steroid protocols particularly suitable as a high-quality option for routine application in research contexts requiring the processing of larger numbers of samples. Copyright © 2016 Elsevier Ltd. All rights reserved.
Methodological challenges when doing research that includes ethnic minorities: a scoping review.
Morville, Anne-Le; Erlandsson, Lena-Karin
2016-11-01
There are challenging methodological issues in obtaining valid and reliable results on which to base occupational therapy interventions for ethnic minorities. The aim of this scoping review is to describe the methodological problems within occupational therapy research when ethnic minorities are included. A thorough literature search yielded 21 articles obtained from the scientific databases PubMed, CINAHL, Web of Science and PsycINFO. Analysis followed Arksey and O'Malley's framework for scoping reviews, applying content analysis. The results showed methodological issues concerning the entire research process, from defining and recruiting samples, conceptual understanding, the lack of appropriate instruments, and data collection using interpreters, to analyzing data. In order to avoid excluding ethnic minorities from adequate occupational therapy research and interventions, development of methods for the entire research process is needed. It is a costly and time-consuming process, but the results will be valid and reliable, and therefore more applicable in clinical practice.
Weighted Ensemble Simulation: Review of Methodology, Applications, and Software.
Zuckerman, Daniel M; Chong, Lillian T
2017-05-22
The weighted ensemble (WE) methodology orchestrates quasi-independent parallel simulations run with intermittent communication that can enhance sampling of rare events such as protein conformational changes, folding, and binding. The WE strategy can achieve superlinear scaling: the unbiased estimation of key observables such as rate constants and equilibrium state populations to greater precision than would be possible with ordinary parallel simulation. WE software can be used to control any dynamics engine, such as standard molecular dynamics and cell-modeling packages. This article reviews the theoretical basis of WE and goes on to describe successful applications to a number of complex biological processes: protein conformational transitions, (un)binding, and assembly processes, as well as cell-scale processes in systems biology. We furthermore discuss the challenges that need to be overcome in the next phase of WE methodological development. Overall, the combined advances in WE methodology and software have enabled the simulation of long-timescale processes that would otherwise not be practical on typical computing resources using standard simulation.
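The split/merge bookkeeping at the heart of WE is simple enough to sketch. The following is a minimal, generic version of the per-bin resampling step that conserves total weight; it is not the WESTPA implementation, and the target walker count and toy inputs are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def we_resample_bin(positions, weights, target=4):
    """Split/merge walkers in one bin to a target count, conserving total weight
    (a minimal sketch of the standard WE scheme, not a production implementation)."""
    pos, w = list(positions), list(weights)
    while len(pos) < target:                 # split the heaviest walker in two
        i = int(np.argmax(w))
        w[i] /= 2.0
        pos.append(pos[i]); w.append(w[i])
    while len(pos) > target:                 # merge the two lightest walkers
        i, j = np.argsort(w)[:2]
        keep = i if rng.random() < w[i] / (w[i] + w[j]) else j
        w[keep] = w[i] + w[j]                # merged walker carries the summed weight
        drop = j if keep == i else i
        pos.pop(drop); w.pop(drop)
    return np.array(pos), np.array(w)

pos, w = we_resample_bin(np.array([0.1, 0.2]), np.array([0.6, 0.4]))
print(pos, w, w.sum())   # total weight in the bin is still 1.0
```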
Ancient DNA studies: new perspectives on old samples
2012-01-01
In spite of past controversies, the field of ancient DNA is now a reliable research area due to recent methodological improvements. A series of recent large-scale studies have revealed the true potential of ancient DNA samples to study the processes of evolution and to test models and assumptions commonly used to reconstruct patterns of evolution and to analyze population genetics and palaeoecological changes. Recent advances in DNA technologies, such as next-generation sequencing, make it possible to recover DNA information from archaeological and paleontological remains, allowing us to go back in time and study the genetic relationships between extinct organisms and their contemporary relatives. With next-generation sequencing methodologies, DNA sequences can be retrieved even from samples (for example human remains) for which the technical pitfalls of classical methodologies required stringent criteria to guarantee the reliability of the results. In this paper, we review the methodologies applied to ancient DNA analysis and the perspectives that next-generation sequencing applications provide in this field. PMID:22697611
In situ study of live specimens in an environmental scanning electron microscope.
Tihlaříková, Eva; Neděla, Vilém; Shiojiri, Makoto
2013-08-01
In this paper we introduce a new methodology for the observation of living biological samples in an environmental scanning electron microscope (ESEM). The methodology is based on an unconventional initiation procedure for ESEM chamber pumping, free from purge-flood cycles, and on the ability to control thermodynamic processes close to the sample. The gradual and gentle change of the working environment from air to water vapor enables the study not only of living samples in dynamic in situ experiments and their manifestations of life (sample walking) but also of their experimentally stimulated physiological reactions. Moreover, Monte Carlo simulations of primary electron beam energy losses in a water layer on the sample surface were studied; consequently, the influence of the water thickness on radiation, temperature, or chemical damage of the sample was considered.
Perez, Pablo A; Hintelman, Holger; Quiroz, Waldo; Bravo, Manuel A
2017-11-01
In the present work, the efficiency of a distillation process for extracting monomethylmercury (MMHg) from soil samples was studied and optimized using an experimental design methodology. The influence of soil composition on MMHg extraction was evaluated by testing four soil samples with different geochemical characteristics. Optimization suggested that the acid concentration and the duration of the distillation process were most significant, and the most favorable conditions, established as a compromise for the studied soils, were determined to be a 70 min distillation using 0.2 M acid. The corresponding limits of detection (LOD) and quantification (LOQ) were 0.21 and 0.7 pg absolute, respectively. The optimized methodology was applied with satisfactory results to soil samples and was compared to a reference methodology based on isotopic dilution analysis followed by gas chromatography-inductively coupled plasma mass spectrometry (IDA-GC-ICP-MS). Using the optimized conditions, recoveries ranged from 82 to 98%, which is an increase of 9-34% relative to the previously used standard operating procedure. Finally, the validated methodology was applied to quantify MMHg in soils collected from different sites impacted by coal-fired power plants in the north-central zone of Chile, measuring MMHg concentrations ranging from 0.091 to 2.8 ng g⁻¹. These data are, to the best of our knowledge, the first MMHg measurements reported for Chile. Copyright © 2017 Elsevier Ltd. All rights reserved.
COMPONENTS IDENTIFIED IN ENERGY-RELATED WASTES AND EFFLUENTS
A state-of-the-art review of the characterization of solid wastes and aqueous effluents generated by energy-related processes was conducted. The reliability of these data was evaluated according to preselected criteria or sample source, sampling and analytical methodology, and da...
The National Visitor Use Monitoring methodology and final results for round 1
S.J. Zarnoch; E.M. White; D.B.K. English; Susan M. Kocis; Ross Arnold
2011-01-01
A nationwide, systematic monitoring process has been developed to provide improved estimates of recreation visitation on National Forest System lands. Methodology is presented to provide estimates of site visits and national forest visits based on an onsite sampling design of site-days and last-exiting recreationists. Stratification of the site days, based on site type...
Simulation of Ejecta Production and Mixing Process of Sn Sample under shock loading
NASA Astrophysics Data System (ADS)
Wang, Pei; Chen, Dawei; Sun, Haiquan; Ma, Dongjun
2017-06-01
Ejection may occur when a strong shock wave releases at the free surface of a metal material, forming an ejecta of high-speed particulate matter that further mixes with the surrounding gas. Ejecta production and its mixing process remain among the most difficult unresolved problems in shock physics and have many important engineering applications in implosion compression science. The present paper introduces a methodology for the theoretical modeling and numerical simulation of the complex ejection and mixing process. The ejecta production is decoupled from the particle mixing process, and the ejecta state can be obtained by direct numerical simulation of the evolution of initial defects on the metal surface. The particle mixing process can then be simulated and resolved by a two-phase gas-particle model which uses the aforementioned ejecta state as the initial condition. A preliminary ejecta experiment on a planar Sn metal sample has validated the feasibility of the proposed methodology.
Mars Science Laboratory CHIMRA/IC/DRT Flight Software for Sample Acquisition and Processing
NASA Technical Reports Server (NTRS)
Kim, Won S.; Leger, Chris; Carsten, Joseph; Helmick, Daniel; Kuhn, Stephen; Redick, Richard; Trujillo, Diana
2013-01-01
The design methodologies of using sequence diagrams, multi-process functional flow diagrams, and hierarchical state machines were successfully applied in designing three MSL (Mars Science Laboratory) flight software modules responsible for handling actuator motions of the CHIMRA (Collection and Handling for In Situ Martian Rock Analysis), IC (Inlet Covers), and DRT (Dust Removal Tool) mechanisms. The methodologies were essential to specify complex interactions with other modules, support concurrent foreground and background motions, and handle various fault protections. Studying task scenarios with multi-process functional flow diagrams yielded great insight into overall design perspectives. Since the three modules require three different levels of background motion support, the methodologies presented in this paper provide an excellent comparison. All three modules are fully operational in flight.
Differences in Identity Style and Process: Can Less Be More
ERIC Educational Resources Information Center
Reio, Thomas G., Jr.; Portes, Pedro R.; Nixon, Casey B.
2014-01-01
This study examines relationships between identity status and process measure scores that advance our understanding of methodological characteristics in the context of gender and age. A sample of 391 adolescents and adults (215 males, 176 females) completed the Identity Style Inventory (ISI; Berzonsky, 1992) and Ego Identity Process Questionnaire…
Respondent-Driven Sampling: An Assessment of Current Methodology.
Gile, Krista J; Handcock, Mark S
2010-08-01
Respondent-Driven Sampling (RDS) employs a variant of a link-tracing network sampling strategy to collect data from hard-to-reach populations. By tracing the links in the underlying social network, the process exploits the social structure to expand the sample and reduce its dependence on the initial (convenience) sample. The current estimators of population averages make strong assumptions in order to treat the data as a probability sample. We evaluate three critical sensitivities of the estimators: to bias induced by the initial sample, to uncontrollable features of respondent behavior, and to the without-replacement structure of sampling. Our analysis indicates: (1) that the convenience sample of seeds can induce bias, and the number of sample waves typically used in RDS is likely insufficient for the type of nodal mixing required to obtain the reputed asymptotic unbiasedness; (2) that preferential referral behavior by respondents leads to bias; (3) that when a substantial fraction of the target population is sampled, the current estimators can have substantial bias. This paper sounds a cautionary note for the users of RDS. While current RDS methodology is powerful and clever, the favorable statistical properties claimed for the current estimates are shown to be heavily dependent on often unrealistic assumptions. We recommend ways to improve the methodology.
Saldaña, Erick; Castillo, Luiz Saldarriaga; Sánchez, Jorge Cabrera; Siche, Raúl; de Almeida, Marcio Aurélio; Behrens, Jorge H; Selani, Miriam Mabel; Contreras-Castillo, Carmen J
2018-06-01
The aim of this study was to perform a descriptive analysis (DA) of bacons smoked with woods from reforestation and liquid smokes in order to investigate their sensory profile. Six samples of bacon were selected: three smoked bacons with different wood species (Eucalyptus citriodora, Acacia mearnsii, and Bambusa vulgaris), two artificially smoked bacon samples (liquid smoke) and one negative control (unsmoked bacon). Additionally, a commercial bacon sample was also evaluated. DA was developed successfully, presenting a good performance in terms of discrimination, consensus and repeatability. The study revealed that the smoking process modified the sensory profile by intensifying the "saltiness" and differentiating the unsmoked from the smoked samples. The results from the current research represent the first methodological development of descriptive analysis of bacon and may be used by food companies and other stakeholders to understand the changes in sensory characteristics of bacon due to traditional smoking process. Copyright © 2018 Elsevier Ltd. All rights reserved.
Cordeiro, Liliana; Valente, Inês M; Santos, João Rodrigo; Rodrigues, José A
2018-05-01
In this work, an analytical methodology for the characterization of volatile carbonyl compounds in green and roasted coffee beans was developed. The methodology relied on a recent and simple sample preparation technique, gas diffusion microextraction, for extraction of the samples' volatiles, followed by HPLC-DAD-MS/MS analysis. The experimental conditions in terms of extraction temperature and extraction time were studied. A profile of carbonyl compounds was obtained for both arabica and robusta coffee species (green and roasted samples). Twenty-seven carbonyl compounds were identified and further discussed, in light of the reported literature, in relation to different coffee characteristics: coffee ageing, organoleptic impact, presence of defective beans, authenticity, human health implications, post-harvest coffee processing and roasting. The applied methodology proved to be a powerful analytical tool for coffee characterization, as it measures marker compounds of different coffee characteristics. Copyright © 2018 Elsevier Ltd. All rights reserved.
New Methodology for Natural Gas Production Estimates
2010-01-01
A new methodology is implemented with the monthly natural gas production estimates from the EIA-914 survey this month. The estimates, to be released April 29, 2010, include revisions for all of 2009. The fundamental changes in the new process include the timeliness of the historical data used for estimation and the frequency of sample updates, both of which are improved.
How to do a grounded theory study: a worked example of a study of dental practices.
Sbaraini, Alexandra; Carter, Stacy M; Evans, R Wendell; Blinkhorn, Anthony
2011-09-09
Qualitative methodologies are increasingly popular in medical research. Grounded theory is the methodology most-often cited by authors of qualitative studies in medicine, but it has been suggested that many 'grounded theory' studies are not concordant with the methodology. In this paper we provide a worked example of a grounded theory project. Our aim is to provide a model for practice, to connect medical researchers with a useful methodology, and to increase the quality of 'grounded theory' research published in the medical literature. We documented a worked example of using grounded theory methodology in practice. We describe our sampling, data collection, data analysis and interpretation. We explain how these steps were consistent with grounded theory methodology, and show how they related to one another. Grounded theory methodology assisted us to develop a detailed model of the process of adapting preventive protocols into dental practice, and to analyse variation in this process in different dental practices. By employing grounded theory methodology rigorously, medical researchers can better design and justify their methods, and produce high-quality findings that will be more useful to patients, professionals and the research community.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-17
... risk-based sampling methodologies will be reviewed and approved by the contract auditors for... the disbursing office. All interim vouchers are subject to an audit of actual costs incurred after... process currently referenced...
Lean six sigma methodologies improve clinical laboratory efficiency and reduce turnaround times.
Inal, Tamer C; Goruroglu Ozturk, Ozlem; Kibar, Filiz; Cetiner, Salih; Matyar, Selcuk; Daglioglu, Gulcin; Yaman, Akgun
2018-01-01
Organizing work flow is a major task of laboratory management. Recently, clinical laboratories have started to adopt methodologies such as Lean Six Sigma, and some successful implementations have been reported. This study used Lean Six Sigma to simplify the laboratory work process and decrease the turnaround time by eliminating non-value-adding steps. The five-stage Six Sigma system known as define, measure, analyze, improve, and control (DMAIC) was used to identify and solve problems. The laboratory turnaround time for individual tests, the total delay time in the sample reception area, and the percentage of steps involving risks of medical errors and biological hazards in the overall process were measured. The pre-analytical process in the reception area was improved by eliminating 3 h and 22.5 min of non-value-adding work. Turnaround time for stat samples also improved from 68 to 59 min after applying Lean. Steps prone to medical errors and posing potential biological hazards to receptionists were reduced from 30% to 3%. Successful implementation of Lean Six Sigma significantly improved all of the selected performance metrics. This quality-improvement methodology has the potential to significantly improve clinical laboratories. © 2017 Wiley Periodicals, Inc.
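For readers unfamiliar with Six Sigma arithmetic, the sketch below converts a defect count into DPMO and an approximate sigma level using the conventional 1.5-sigma shift; the counts are invented for illustration and are not the laboratory's data.

```python
from scipy.stats import norm

def sigma_level(defects, units, opportunities_per_unit=1, shift=1.5):
    """Convert a defect count to DPMO and an approximate process sigma level,
    using the conventional 1.5-sigma shift. Illustrative only."""
    dpmo = defects / (units * opportunities_per_unit) * 1_000_000
    return dpmo, norm.ppf(1 - dpmo / 1_000_000) + shift

# e.g. 120 stat samples out of 4000 exceeding the turnaround-time target (assumed numbers)
print(sigma_level(defects=120, units=4000))
```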
Di Tecco, Cristina; Ronchetti, Matteo; Ghelli, Monica; Russo, Simone; Persechino, Benedetta; Iavicoli, Sergio
2015-01-01
Studies on Intervention Process Evaluation are attracting growing attention in the literature on interventions linked to stress and the wellbeing of workers. There is evidence that some elements relating to the process and content of an intervention may have a decisive role in implementing it by facilitating or hindering the effectiveness of the results. This study aimed to provide a process evaluation on interventions to assess and manage risks related to work-related stress using a methodological path offered by INAIL. The final sample is composed of 124 companies participating to an interview on aspects relating to each phase of the INAIL methodological path put in place to implement the intervention. INAIL methodology has been defined as useful in the process of assessing and managing the risks related to work-related stress. Some factors related to the process (e.g., implementation of a preliminary phase, workers' involvement, and use of external consultants) showed a role in significant differences that emerged in the levels of risk, particularly in relation to findings from the preliminary assessment. Main findings provide information on the key aspects of process and content that are useful in implementing an intervention for assessing and managing risks related to work-related stress.
Yu, Juping
2009-04-01
Qualitative approaches have been increasingly used to explore ethnic differences in teenage sexual behavior, and methodological issues of conducting such research often remain unaddressed. This article discusses issues related to sampling, rapport, language, and ethical considerations arising while undertaking research on attitudes toward teenage sexual behavior held by Chinese British families. It highlights the value of using snowball sampling, the importance of establishing rapport, and some advantages of matching the ethnic background between researcher and participants. The researcher's gender and social and cultural backgrounds affect research processes and findings, and this itself merits further reflection.
NASA Astrophysics Data System (ADS)
Srivastava, Y.; Srivastava, S.; Boriwal, L.
2016-09-01
Mechanical alloying is a novel solid-state process that has received considerable attention due to its many advantages over other conventional processes. In the present work, Co2FeAl Heusler alloy powder was successfully prepared from a premix of basic powders of cobalt (Co), iron (Fe) and aluminum (Al) in the stoichiometry 60Co-26Fe-14Al (weight %) by a novel mechano-chemical route. Magnetic properties of the mechanically alloyed powders were characterized by vibrating sample magnetometry (VSM). A two-factor, five-level design matrix was applied to the experimental process. Experimental results were used for response surface methodology. The interaction between the input process parameters and the response has been established with the help of regression analysis. Further, the analysis of variance technique was applied to check the adequacy of the developed model and the significance of the process parameters. A test case study was performed with parameters that were not selected for the main experimentation but lay within the same range. With response surface methodology, the process parameters must be optimized to obtain improved magnetic properties. Optimum process parameters were further identified using numerical and graphical optimization techniques.
77 FR 2682 - Defense Federal Acquisition Regulation Supplement; DoD Voucher Processing
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-19
... selected using sampling methodologies will be reviewed and approved by the contract auditors for... Disallowing costs after incurrence. * * * * * (b) * * * (i) The contract auditor is the authorized...
NASA Astrophysics Data System (ADS)
Maymandi, Nahal; Kerachian, Reza; Nikoo, Mohammad Reza
2018-03-01
This paper presents a new methodology for optimizing Water Quality Monitoring (WQM) networks of reservoirs and lakes using the concept of the value of information (VOI) and utilizing results of a calibrated numerical water quality simulation model. With reference to the value of information theory, the water quality of every checkpoint, with a specific prior probability, differs in time. After analyzing water quality samples taken from potential monitoring points, the posterior probabilities are updated using Bayes' theorem, and the VOI of the samples is calculated. In the next step, the stations with maximum VOI are selected as optimal stations. This process is repeated for each sampling interval to obtain optimal monitoring network locations for each interval. The results of the proposed VOI-based methodology are compared with those obtained using an entropy theoretic approach. As the results of the two methodologies can be partially different, in the next step the results are combined using a weighting method. Finally, the optimal sampling interval and locations of the WQM stations are chosen using the Evidential Reasoning (ER) decision making method. The efficiency and applicability of the methodology are evaluated using available water quantity and quality data of the Karkheh Reservoir in the southwestern part of Iran.
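The Bayes/VOI step can be illustrated with a two-state toy problem: update a prior on a station's water-quality state from a sample outcome and score the station by the expected reduction in decision loss. The prior, likelihoods and loss table below are illustrative assumptions, not values from the study.

```python
import numpy as np

# Two states at a candidate station: 0 = acceptable, 1 = impaired water quality.
prior = np.array([0.8, 0.2])
loss = np.array([[0.0, 10.0],     # action 0 (do nothing): cost of missing an impairment
                 [2.0,  1.0]])    # action 1 (intervene): cost of acting
p_pos_given_state = np.array([0.1, 0.9])   # chance a sample tests "positive" in each state

def expected_loss_without_info(prior):
    return min(loss @ prior)                         # best action on the prior alone

def expected_loss_with_sample(prior):
    total = 0.0
    for outcome_prob in (p_pos_given_state, 1 - p_pos_given_state):  # positive / negative
        marginal = outcome_prob @ prior
        posterior = outcome_prob * prior / marginal  # Bayes' theorem
        total += marginal * min(loss @ posterior)    # act optimally after seeing the sample
    return total

voi = expected_loss_without_info(prior) - expected_loss_with_sample(prior)
print(f"value of sampling this station: {voi:.3f}")  # stations with the largest VOI are kept
```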
USDA-ARS's Scientific Manuscript database
The prevalence and serogroups of Salmonella recovered following air chilling were determined for both enriched neck skin and matching enriched whole carcass samples. Commercially processed and eviscerated carcasses were air chilled to 4C before removing the neck skin (8.3 g) and stomaching in 83 mL...
Methodological issues with adaptation of clinical trial design.
Hung, H M James; Wang, Sue-Jane; O'Neill, Robert T
2006-01-01
Adaptation of clinical trial design generates many issues that have not been resolved for practical applications, even though statistical methodology has advanced greatly. This paper focuses on some methodological issues. In one type of adaptation, such as sample size re-estimation, only the postulated value of a parameter for planning the trial size may be altered. In another type, the originally intended hypothesis for testing may be modified using the internal data accumulated at an interim time of the trial, such as changing the primary endpoint and dropping a treatment arm. For sample size re-estimation, we contrast an adaptive test that weights the two-stage test statistics with the statistical information given by the original design against the original sample mean test with a properly corrected critical value. We point out the difficulty in planning a confirmatory trial based on the crude information generated by exploratory trials. With regard to selecting a primary endpoint, we argue that a selection process that allows switching from one endpoint to the other with the internal data of the trial is not very likely to gain a power advantage over the simple process of selecting one of the two endpoints by testing them with an equal split of alpha (Bonferroni adjustment). For dropping a treatment arm, distributing the remaining sample size of the discontinued arm to the other treatment arms can substantially improve the statistical power of identifying a superior treatment arm in the design. A common and difficult methodological issue is how to select an adaptation rule at the trial planning stage. Pre-specification of the adaptation rule is important for practical considerations.
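Two of the devices discussed above can be sketched directly: the inverse-normal weighted two-stage test (weights fixed by the originally planned stage sizes) and an equal alpha split (Bonferroni) across two endpoints. This is a textbook sketch with invented numbers, not the paper's exact procedure.

```python
import numpy as np
from scipy.stats import norm

def weighted_two_stage_z(z1, z2, n1_planned, n2_planned):
    """Inverse-normal combination of stage-wise Z statistics with weights fixed
    by the planned stage sample sizes, so the type I error is preserved even if
    the stage-2 size is re-estimated (textbook sketch)."""
    w1 = n1_planned / (n1_planned + n2_planned)
    return np.sqrt(w1) * z1 + np.sqrt(1 - w1) * z2

z = weighted_two_stage_z(z1=1.4, z2=1.7, n1_planned=100, n2_planned=100)
print(z, z > norm.ppf(0.975))   # one-sided test at alpha = 0.025

# Endpoint selection by Bonferroni: test each of two endpoints at alpha/2
alpha = 0.025
print(norm.ppf(1 - alpha / 2))  # critical value when the alpha is split equally
```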
Abras, Alba; Ballart, Cristina; Llovet, Teresa; Roig, Carme; Gutiérrez, Cristina; Tebar, Silvia; Berenguer, Pere; Pinazo, María-Jesús; Posada, Elizabeth; Gascón, Joaquim; Schijman, Alejandro G; Gállego, Montserrat; Muñoz, Carmen
2018-01-01
Polymerase chain reaction (PCR) has become a useful tool for the diagnosis of Trypanosoma cruzi infection. The development of automated DNA extraction methodologies and PCR systems is an important step toward the standardization of protocols in routine diagnosis. To date, there are only two commercially available Real-Time PCR assays for the routine laboratory detection of T. cruzi DNA in clinical samples: TCRUZIDNA.CE (Diagnostic Bioprobes Srl) and RealCycler CHAG (Progenie Molecular). Our aim was to evaluate the RealCycler CHAG assay taking into account the whole process. We assessed the usefulness of an automated DNA extraction system based on magnetic particles (EZ1 Virus Mini Kit v2.0, Qiagen) combined with a commercially available Real-Time PCR assay targeting satellite DNA (SatDNA) of T. cruzi (RealCycler CHAG), a methodology used for routine diagnosis in our hospital. It was compared with a well-known strategy combining a commercial DNA isolation kit based on silica columns (High Pure PCR Template Preparation Kit, Roche Diagnostics) with an in-house Real-Time PCR targeting SatDNA. The results of the two methodologies were in almost perfect agreement, indicating they can be used interchangeably. However, when variations in protocol factors were applied (sample treatment, extraction method and Real-Time PCR), the results were less convincing. A comprehensive fine-tuning of the whole procedure is the key to successful results. Guanidine EDTA-blood (GEB) samples are not suitable for DNA extraction based on magnetic particles due to inhibition, at least when samples are not processed immediately. This is the first study to evaluate the RealCycler CHAG assay taking into account the overall process, including three variables (sample treatment, extraction method and Real-Time PCR). Our findings may contribute to the harmonization of protocols between laboratories and to a wider application of Real-Time PCR in molecular diagnostic laboratories associated with health centers.
Hermans, Artur; Kieninger, Clemens; Koskinen, Kalle; Wickberg, Andreas; Solano, Eduardo; Dendooven, Jolien; Kauranen, Martti; Clemmen, Stéphane; Wegener, Martin; Koos, Christian; Baets, Roel
2017-01-01
The determination of the second-order susceptibility (χ(2)) of thin film samples can be a delicate matter since well-established χ(2) measurement methodologies such as the Maker fringe technique are best suited for nonlinear materials with large thicknesses typically ranging from tens of microns to several millimeters. Here we compare two different second-harmonic generation setups and the corresponding measurement methodologies that are especially advantageous for thin film χ(2) characterization. This exercise allows for cross-checking the χ(2) obtained for identical samples and identifying the main sources of error for the respective techniques. The development of photonic integrated circuits makes nonlinear thin films of particular interest, since they can be processed into long waveguides to create efficient nonlinear devices. The investigated samples are ABC-type nanolaminates, which were reported recently by two different research groups. However, the subsequent analysis can be useful for all researchers active in the field of thin film χ(2) characterization. PMID:28317938
Sampling design for spatially distributed hydrogeologic and environmental processes
Christakos, G.; Olea, R.A.
1992-01-01
A methodology for the design of sampling networks over space is proposed. The methodology is based on spatial random field representations of nonhomogeneous natural processes and on optimal spatial estimation techniques. One of the most important results of random field theory for the physical sciences is its rationalization of correlations in the spatial variability of natural processes. This correlation is extremely important both for interpreting spatially distributed observations and for predictive performance. The extent of site sampling and the types of data to be collected will depend on the relationship of subsurface variability to predictive uncertainty. While hypothesis formulation and initial identification of spatial variability characteristics are based on scientific understanding (such as knowledge of the physics of the underlying phenomena, geological interpretations, intuition and experience), the support offered by field data is statistically modelled. This model is not limited by the geometric nature of sampling and covers a wide range of subsurface uncertainties. A factorization scheme for the sampling error variance is derived, which possesses certain attractive properties allowing significant savings in computations. By means of this scheme, a practical sampling design procedure providing suitable indices of the sampling error variance is established. These indices can be used by way of multiobjective decision criteria to obtain the best sampling strategy. Neither the actual implementation of the in-situ sampling nor the solution of the large spatial estimation systems of equations is necessary. The required values of the accuracy parameters involved in the network design are derived using reference charts (readily available for various combinations of data configurations and spatial variability parameters) and certain simple yet accurate analytical formulas. Insight is gained by applying the proposed sampling procedure to realistic examples related to sampling problems in two dimensions. © 1992.
Galfi, Istvan; Virtanen, Jorma; Gasik, Michael M.
2017-01-01
A new, faster and more reliable analytical methodology for the analysis of S(IV) species in low-pH solutions by bichromatometry is proposed. For decades the state-of-the-art methodology has been iodometry, which is still a well-justified method for neutral solutions, whereas in low-pH media various side reactions increase inaccuracy. In contrast, the new methodology has no side reactions in low-pH media, requires only one titration step and provides a clear color change if S(IV) species are present in the solution. The method is validated using model solutions with known concentrations and applied to analyses of gaseous SO2 from purged solutions in low-pH media samples. The results indicate that bichromatometry can accurately analyze SO2 from liquid samples having a pH even below 0, relevant to metallurgical industrial processes. PMID:29145479
Stoeckel, D.M.; Stelzer, E.A.; Dick, L.K.
2009-01-01
Quantitative PCR (qPCR), applied to complex environmental samples such as water, wastewater, and feces, is susceptible to methodological and sample-related biases. In this study, we evaluated two exogenous DNA spike-and-recovery controls as proxies for the recovery efficiency of Bacteroidales 16S rDNA gene sequences (AllBac and qHF183) that are used for microbial source tracking (MST) in river water. Two controls, (1) the plant pathogen Pantoea stewartii, carrying the chromosomal target gene cpsD, and (2) Escherichia coli, carrying the plasmid-borne target gene DsRed2, were added to raw water samples immediately prior to concentration and DNA extraction for qPCR. When applied to samples processed in replicate, recovery of each control was positively correlated with the observed concentration of each MST marker. Adjustment of MST marker concentrations according to recovery efficiency reduced variability in replicate analyses when consistent processing and extraction methodologies were applied. Although the effects of this procedure on accuracy could not be tested due to uncertainties in control DNA concentrations, the observed reduction in variability should improve the strength of statistical comparisons. These findings suggest that either of the tested spike-and-recovery controls can be useful for measuring the efficiency of extraction and recovery in routine laboratory processing. © 2009 Elsevier Ltd.
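The recovery adjustment described above is simple arithmetic; a minimal sketch with invented numbers and hypothetical variable names is shown below.

```python
def recovery_adjusted(marker_copies, control_recovered, control_spiked):
    """Adjust an MST marker concentration by the recovery efficiency of an
    exogenous spike-and-recovery control (illustrative arithmetic; the numbers
    and names are not from the study)."""
    recovery = control_recovered / control_spiked     # e.g. 0.42 means 42% recovered
    return marker_copies / recovery

# AllBac marker observed at 3.2e4 copies/100 mL; spiked control recovered at 42% (assumed)
print(f"{recovery_adjusted(3.2e4, control_recovered=4.2e3, control_spiked=1.0e4):.2e}")
```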
NASA Astrophysics Data System (ADS)
Klus, Jakub; Pořízka, Pavel; Prochazka, David; Mikysek, Petr; Novotný, Jan; Novotný, Karel; Slobodník, Marek; Kaiser, Jozef
2017-05-01
This paper presents a novel approach for processing the spectral information obtained from high-resolution elemental mapping performed by means of Laser-Induced Breakdown Spectroscopy. The proposed methodology is aimed at the description of possible elemental associations within a heterogeneous sample. High-resolution elemental mapping provides a large number of measurements. Moreover, a typical laser-induced plasma spectrum consists of several thousands of spectral variables. Analysis of heterogeneous samples, where valuable information is hidden in a limited fraction of the sample mass, requires special treatment. The sample under study is a sandstone-hosted uranium ore that shows irregular distribution of ore elements such as zirconium, titanium, uranium and niobium. The presented processing methodology shows a way to reduce the dimensionality of the data and retain the spectral information by utilizing self-organizing maps (SOM). The spectral information from the SOM is processed further to detect either simultaneous or isolated presence of elements. Conclusions suggested by the SOM are in good agreement with geological studies of mineralization phases performed at the deposit. Even deeper investigation of the SOM results enables discrimination of interesting measurements and reveals new possibilities in the visualization of chemical mapping information. The suggested approach improves the description of elemental associations in mineral phases, which is crucial for the mining industry.
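A self-organizing map of the kind used here can be sketched in a few lines of NumPy; the grid size, learning schedule and toy data below are generic assumptions, not the configuration used in the study.

```python
import numpy as np

def train_som(spectra, grid=(8, 8), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Minimal self-organizing map in plain NumPy for grouping spectra
    (a generic sketch; the paper's SOM configuration is not reproduced)."""
    rng = np.random.default_rng(seed)
    n_nodes, n_vars = grid[0] * grid[1], spectra.shape[1]
    weights = rng.random((n_nodes, n_vars))
    coords = np.array([(i, j) for i in range(grid[0]) for j in range(grid[1])], float)
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)                    # shrinking learning rate
        sigma = sigma0 * (1 - epoch / epochs) + 0.5        # shrinking neighborhood
        for x in spectra[rng.permutation(len(spectra))]:
            bmu = np.argmin(((weights - x) ** 2).sum(axis=1))   # best-matching unit
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
            h = np.exp(-d2 / (2 * sigma ** 2))                  # neighborhood function
            weights += lr * h[:, None] * (x - weights)          # pull nodes toward spectrum
    return weights

# toy usage: 200 "spectra" of 50 spectral variables each
som = train_som(np.random.default_rng(1).random((200, 50)))
print(som.shape)   # (64, 50): one prototype spectrum per map node
```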
Bayesian experimental design for models with intractable likelihoods.
Drovandi, Christopher C; Pettitt, Anthony N
2013-12-01
In this paper we present a methodology for designing experiments for efficiently estimating the parameters of models with computationally intractable likelihoods. The approach combines a commonly used methodology for robust experimental design, based on Markov chain Monte Carlo sampling, with approximate Bayesian computation (ABC) to ensure that no likelihood evaluations are required. The utility function considered for precise parameter estimation is based upon the precision of the ABC posterior distribution, which we form efficiently via the ABC rejection algorithm based on pre-computed model simulations. Our focus is on stochastic models and, in particular, we investigate the methodology for Markov process models of epidemics and macroparasite population evolution. The macroparasite example involves a multivariate process and we assess the loss of information from not observing all variables. © 2013, The International Biometric Society.
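The sketch below illustrates the ABC rejection step on a toy one-parameter model: pre-computed simulations are filtered by their distance to an observed summary, and the retained draws yield a precision-based utility. The model, prior, and utility definition are placeholders, not those of the paper.

```python
import numpy as np

def abc_rejection(prior_draws, simulated_summaries, observed_summary, keep_frac=0.01):
    """Keep the prior draws whose pre-computed simulated summaries are closest to the data."""
    d = np.linalg.norm(simulated_summaries - observed_summary, axis=1)
    k = max(1, int(keep_frac * len(prior_draws)))
    idx = np.argsort(d)[:k]
    return prior_draws[idx]

def precision_utility(posterior_draws):
    """Utility for precise estimation: inverse determinant of the ABC posterior covariance."""
    cov = np.atleast_2d(np.cov(posterior_draws, rowvar=False))
    return 1.0 / np.linalg.det(cov)

# Toy use: 1-D parameter, summaries simulated in advance from a stochastic model
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2, size=50_000)                          # prior draws
sims = rng.poisson(lam=theta[:, None] * 10, size=(50_000, 1))   # pre-computed simulations
obs = np.array([12.0])                                          # "observed" summary
post = abc_rejection(theta[:, None], sims.astype(float), obs)
print(precision_utility(post))
```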
Tsunami hazard assessments with consideration of uncertain earthquake characteristics
NASA Astrophysics Data System (ADS)
Sepulveda, I.; Liu, P. L. F.; Grigoriu, M. D.; Pritchard, M. E.
2017-12-01
The uncertainty quantification of tsunami assessments due to uncertain earthquake characteristics faces important challenges. First, the generated earthquake samples must be consistent with the properties observed in past events. Second, the assessment must adopt an uncertainty propagation method to determine tsunami uncertainties with a feasible computational cost. In this study we propose a new methodology, which improves on existing tsunami uncertainty assessment methods. The methodology considers two uncertain earthquake characteristics, the slip distribution and location. First, the methodology considers the generation of consistent earthquake slip samples by means of a Karhunen Loeve (K-L) expansion and a translation process (Grigoriu, 2012), applicable to any non-rectangular rupture area and marginal probability distribution. The K-L expansion was recently applied by Le Veque et al. (2016). We have extended the methodology by analyzing accuracy criteria in terms of the tsunami initial conditions. Furthermore, and unlike this reference, we preserve the original probability properties of the slip distribution by avoiding post-sampling treatments such as earthquake slip scaling. Our approach is analyzed and justified in the framework of the present study. Second, the methodology uses a Stochastic Reduced Order Model (SROM) (Grigoriu, 2009) instead of a classic Monte Carlo simulation, which reduces the computational cost of the uncertainty propagation. The methodology is applied to a real case. We study tsunamis generated at the site of the 2014 Chilean earthquake. We generate earthquake samples with expected magnitude Mw 8. We first demonstrate that the stochastic approach of our study generates consistent earthquake samples with respect to the target probability laws. We also show that the results obtained from SROM are more accurate than classic Monte Carlo simulations. We finally validate the methodology by comparing the simulated tsunamis and the tsunami records for the 2014 Chilean earthquake. Results show that leading wave measurements fall within the tsunami sample space. At later times, however, there are mismatches between measured data and the simulated results, suggesting that other sources of uncertainty are as relevant as the uncertainty of the studied earthquake characteristics.
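The following sketch shows a truncated Karhunen-Loeve expansion for drawing correlated slip realizations along a one-dimensional fault, assuming an exponential covariance; it omits the translation process to non-Gaussian marginals and any non-rectangular geometry used in the study, so all parameters are illustrative.

```python
import numpy as np

def kl_slip_samples(x, corr_len=40.0, mean_slip=3.0, sigma=1.0,
                    n_modes=10, n_samples=5, seed=1):
    """Draw Gaussian slip realizations from a truncated Karhunen-Loeve expansion.

    x        : 1-D along-strike positions of subfaults (km)
    corr_len : assumed exponential correlation length (km)
    """
    C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)  # covariance matrix
    vals, vecs = np.linalg.eigh(C)
    order = np.argsort(vals)[::-1][:n_modes]                            # leading K-L modes
    vals, vecs = vals[order], vecs[:, order]
    rng = np.random.default_rng(seed)
    xi = rng.standard_normal((n_samples, n_modes))                      # random K-L coefficients
    return mean_slip + xi @ (vecs * np.sqrt(vals)).T                    # (n_samples, n_subfaults)

slips = kl_slip_samples(np.linspace(0, 400, 80))
print(slips.shape)   # (5, 80): five slip realizations over 80 subfaults
```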
10 CFR 26.167 - Quality assurance and quality control.
Code of Federal Regulations, 2012 CFR
2012-01-01
... be designed, implemented, and reviewed to monitor the conduct of each step of the testing process. (b... sample and the error is determined to be technical or methodological, the licensee or other entity shall...
10 CFR 26.167 - Quality assurance and quality control.
Code of Federal Regulations, 2011 CFR
2011-01-01
... be designed, implemented, and reviewed to monitor the conduct of each step of the testing process. (b... sample and the error is determined to be technical or methodological, the licensee or other entity shall...
10 CFR 26.167 - Quality assurance and quality control.
Code of Federal Regulations, 2014 CFR
2014-01-01
... be designed, implemented, and reviewed to monitor the conduct of each step of the testing process. (b... sample and the error is determined to be technical or methodological, the licensee or other entity shall...
10 CFR 26.167 - Quality assurance and quality control.
Code of Federal Regulations, 2010 CFR
2010-01-01
... be designed, implemented, and reviewed to monitor the conduct of each step of the testing process. (b... sample and the error is determined to be technical or methodological, the licensee or other entity shall...
10 CFR 26.167 - Quality assurance and quality control.
Code of Federal Regulations, 2013 CFR
2013-01-01
... be designed, implemented, and reviewed to monitor the conduct of each step of the testing process. (b... sample and the error is determined to be technical or methodological, the licensee or other entity shall...
Microcomputer Processing and Analysis of Sample Survey in Education. A Methodological Case Study.
ERIC Educational Resources Information Center
Guo, Sheng
This report discusses the methods, techniques, and software applications used in processing the data gathered in a survey of the physical condition and health of students in Guangdong Province, China. The introduction provides background on the survey. Survey grouping, data items, and survey procedures are then described. A discussion of…
Household Energy Consumption Segmentation Using Hourly Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kwac, J; Flora, J; Rajagopal, R
2014-01-01
The increasing US deployment of residential advanced metering infrastructure (AMI) has made hourly energy consumption data widely available. Using CA smart meter data, we investigate a household electricity segmentation methodology that uses an encoding system with a pre-processed load shape dictionary. Structured approaches using features derived from the encoded data drive five sample program and policy relevant energy lifestyle segmentation strategies. We also ensure that the methodologies developed scale to large data sets.
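One common way to implement the load-shape dictionary encoding described above is k-means on normalized daily profiles, followed by a second clustering of per-household shape frequencies; the sketch below (with simulated data and arbitrary cluster counts) illustrates that idea rather than the exact encoding system used in the study.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
daily = rng.random((5_000, 24))                       # hourly kWh, one row per household-day
shapes = daily / daily.sum(axis=1, keepdims=True)     # normalize: keep the shape, drop magnitude

# 1) learn a load-shape "dictionary" and encode every household-day by its nearest shape
dictionary = KMeans(n_clusters=16, n_init=10, random_state=0).fit(shapes)
codes = dictionary.predict(shapes)

# 2) describe each household by how often it uses each shape, then segment households
household_id = rng.integers(0, 500, size=len(daily))
profiles = np.zeros((500, 16))
for h, c in zip(household_id, codes):
    profiles[h, c] += 1
row_sums = profiles.sum(axis=1, keepdims=True)
profiles = profiles / np.where(row_sums == 0, 1.0, row_sums)   # guard empty households
segments = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(profiles)
print(np.bincount(segments))                          # households per lifestyle segment
```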
Cárdenas, V; Cordobés, M; Blanco, M; Alcalà, M
2015-10-10
The pharmaceutical industry is under stringent regulations on quality control of its products because quality is critical for both the production process and consumer safety. According to the framework of "process analytical technology" (PAT), a complete understanding of the process and stepwise monitoring of manufacturing are required. Near infrared spectroscopy (NIRS) combined with chemometrics has lately proved efficient, useful and robust for pharmaceutical analysis. One crucial step in developing effective NIRS-based methodologies is selecting an appropriate calibration set to construct models affording accurate predictions. In this work, we developed calibration models for a pharmaceutical formulation during its three manufacturing stages: blending, compaction and coating. A novel methodology, the "process spectrum", is proposed for selecting the calibration set, into which physical changes in the samples at each stage are algebraically incorporated. Also, we established a "model space" defined by Hotelling's T(2) and Q-residuals statistics for outlier identification (inside/outside the defined space) in order to select objectively the factors to be used in calibration set construction. The results obtained confirm the efficacy of the proposed methodology for stepwise pharmaceutical quality control, and the relevance of the study as a guideline for the implementation of this easy and fast methodology in the pharma industry. Copyright © 2015 Elsevier B.V. All rights reserved.
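A generic way to build such a "model space" is to fit a PCA model and compute Hotelling's T(2) and Q (squared residual) statistics for each sample; the sketch below uses simple empirical percentiles as limits, which is a simplification of the formal confidence limits typically used.

```python
import numpy as np
from sklearn.decomposition import PCA

def t2_and_q(X, n_components=3):
    """Hotelling's T^2 and Q (squared residual) statistics from a PCA model of X."""
    Xc = X - X.mean(axis=0)
    pca = PCA(n_components=n_components).fit(Xc)
    scores = pca.transform(Xc)
    t2 = np.sum(scores**2 / pca.explained_variance_, axis=1)   # distance inside the model
    residual = Xc - scores @ pca.components_
    q = np.sum(residual**2, axis=1)                            # distance to the model plane
    return t2, q

rng = np.random.default_rng(0)
spectra = rng.normal(size=(120, 600))                 # placeholder NIR spectra
t2, q = t2_and_q(spectra)
inside = (t2 < np.percentile(t2, 95)) & (q < np.percentile(q, 95))
print(inside.sum(), "samples inside the model space")
```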
Using Six Sigma to improve once daily gentamicin dosing and therapeutic drug monitoring performance.
Egan, Sean; Murphy, Philip G; Fennell, Jerome P; Kelly, Sinead; Hickey, Mary; McLean, Carolyn; Pate, Muriel; Kirke, Ciara; Whiriskey, Annette; Wall, Niall; McCullagh, Eddie; Murphy, Joan; Delaney, Tim
2012-12-01
Safe, effective therapy with the antimicrobial gentamicin requires good practice in dose selection and monitoring of serum levels. Suboptimal therapy occurs with breakdown in the process of drug dosing, serum blood sampling, laboratory processing and level interpretation. Unintentional underdosing may result. This improvement effort aimed to optimise this process in an academic teaching hospital using Six Sigma process improvement methodology. A multidisciplinary project team was formed. Process measures considered critical to quality were defined, and baseline practice was examined through process mapping and audit. Root cause analysis informed improvement measures. These included a new dosing and monitoring schedule, and standardised assay sampling and drug administration timing which maximised local capabilities. Three iterations of the improvement cycle were conducted over a 24-month period. The attainment of serum level sampling in the required time window improved by 85% (p≤0.0001). A 66% improvement in accuracy of dosing was observed (p≤0.0001). Unnecessary dose omission while awaiting level results and inadvertent disruption to therapy due to dosing and monitoring process breakdown were eliminated. Average daily dose administered increased from 3.39 mg/kg to 4.78 mg/kg/day. Using Six Sigma methodology enhanced gentamicin usage process performance. Local process related factors may adversely affect adherence to practice guidelines for gentamicin, a drug which is complex to use. It is vital to adapt dosing guidance and monitoring requirements so that they are capable of being implemented in the clinical environment as a matter of routine. Improvement may be achieved through a structured localised approach with multidisciplinary stakeholder involvement.
How Is This Flower Pollinated? A Polyclave Key to Use in Teaching.
ERIC Educational Resources Information Center
Tyrrell, Lucy
1989-01-01
Presents an identification method which uses the process of elimination to identify pollination systems. Provides the polyclave key, methodology for using the key, a sample worksheet, and abbreviation codes for pollination systems. (MVL)
Torre, Michele; Digka, Nikoletta; Anastasopoulou, Aikaterini; Tsangaris, Catherine; Mytilineou, Chryssi
2016-12-15
Research studies on the effects of microlitter on marine biota have become more and more frequent over the last few years. However, there is strong evidence that scientific results based on microlitter analyses can be biased by contamination from airborne fibres. This study demonstrates a low-cost and easy-to-apply methodology to minimize background contamination and thus increase the validity of results. Contamination during the gastrointestinal content analysis of 400 fishes was tested for several sample processing steps at high risk of airborne contamination (e.g. dissection, stereomicroscopic analysis, and chemical digestion treatment for microlitter extraction). It was demonstrated that, using our methodology based on hermetic enclosure devices that isolate the working areas during the various processing steps, airborne contamination was reduced by 95.3%. The simplicity and low cost of this methodology mean that it could be applied not only to laboratory work but also to field or on-board work. Copyright © 2016 Elsevier Ltd. All rights reserved.
Argumentation: A Methodology to Facilitate Critical Thinking.
Makhene, Agnes
2017-06-20
Caring is a demanding nursing activity that involves the complex nature of a human being and requires complex decision-making and problem solving through the critical thinking process. It is mandatory that critical thinking be facilitated in general, and in nursing education in particular, in order to render care in diverse multicultural patient care settings. This paper aims to describe how argumentation can be used to facilitate critical thinking in learners. A qualitative, exploratory and descriptive design that is contextual was used. A purposive sampling method was used to draw the sample, and Miles and Huberman's methodology of qualitative analysis was used to analyse the data. Lincoln and Guba's strategies were employed to ensure trustworthiness, while Dhai and McQuoid-Mason's principles of ethical consideration were used. Following data analysis, the findings were integrated with the literature, which culminated in the formulation of guidelines that can be followed when using argumentation as a methodology to facilitate critical thinking.
NASA Technical Reports Server (NTRS)
Vangenderen, J. L. (Principal Investigator); Lock, B. F.
1976-01-01
The author has identified the following significant results. Techniques of preprocessing, interpretation, classification, and ground truth sampling were studied. The study has shown the need for a low-cost, low-technology, viable, operational methodology to replace the emphasis given in the U.S. to machine processing, which many developing countries cannot afford, understand, or implement.
Vargas, E; Ruiz, M A; Campuzano, S; Reviejo, A J; Pingarrón, J M
2016-03-31
A non-destructive, rapid and simple-to-use sensing method for direct determination of glucose in non-processed fruits is described. The strategy involved on-line microdialysis sampling coupled with a continuous flow system with amperometric detection at an enzymatic biosensor. Apart from direct determination of glucose in fruit juices and blended fruits, this work describes for the first time the successful application of an enzymatic biosensor-based electrochemical approach to the non-invasive determination of glucose in raw fruits. The methodology correlates, through a previous calibration set-up, the amperometric signal generated from glucose in non-processed fruits with its content in % (w/w). The comparison of the results obtained using the proposed approach in different fruits with those provided by another method involving the same commercial biosensor as amperometric detector in stirred solutions showed that there were no significant differences. Moreover, in comparison with other available methodologies, this microdialysis-coupled continuous flow system amperometric biosensor-based procedure features straightforward sample preparation, low cost, reduced assay time (sampling rate of 7 h(-1)) and ease of automation. Copyright © 2016 Elsevier B.V. All rights reserved.
Life Design Counseling Group Intervention with Portuguese Adolescents: A Process and Outcome Study
ERIC Educational Resources Information Center
Cardoso, Paulo; Janeiro, Isabel Nunes; Duarte, Maria Eduarda
2018-01-01
This article examines the process and outcome of a life design counseling group intervention with students in Grades 9 and 12. First, we applied a quasi-experimental methodology to analyze the intervention's effectiveness in promoting career certainty, career decision-making, self-efficacy, and career adaptability in a sample of 236 students.…
ERIC Educational Resources Information Center
Salami, Samuel O.; Aremu, A. Oyesoji
2007-01-01
Purpose: The purpose of this paper was to investigate the relationships of parental attachment and psychological separation to the career development process of secondary school adolescents. Design/methodology/approach: An "ex post facto" survey research design was adopted. The sample comprised 242 (males = 121, females = 121) senior…
Analysis of effects of impurities intentionally incorporated into silicon
NASA Technical Reports Server (NTRS)
Uno, F.
1977-01-01
A methodology was developed and implemented to allow silicon samples containing intentionally incorporated impurities to be fabricated into finished solar cells under carefully controlled conditions. The electrical and spectral properties were then measured for each group processed.
A Science and Risk-Based Pragmatic Methodology for Blend and Content Uniformity Assessment.
Sayeed-Desta, Naheed; Pazhayattil, Ajay Babu; Collins, Jordan; Doshi, Chetan
2018-04-01
This paper describes a pragmatic approach that can be applied in assessing powder blend and unit dosage uniformity of solid dose products at Process Design, Process Performance Qualification, and Continued/Ongoing Process Verification stages of the Process Validation lifecycle. The statistically based sampling, testing, and assessment plan was developed due to the withdrawal of the FDA draft guidance for industry "Powder Blends and Finished Dosage Units-Stratified In-Process Dosage Unit Sampling and Assessment." This paper compares the proposed Grouped Area Variance Estimate (GAVE) method with an alternate approach outlining the practicality and statistical rationalization using traditional sampling and analytical methods. The approach is designed to fit solid dose processes assuring high statistical confidence in both powder blend uniformity and dosage unit uniformity during all three stages of the lifecycle complying with ASTM standards as recommended by the US FDA.
Shirazi, Mohammadali; Reddy Geedipally, Srinivas; Lord, Dominique
2017-01-01
Severity distribution functions (SDFs) are used in highway safety to estimate the severity of crashes and conduct different types of safety evaluations and analyses. Developing a new SDF is a difficult task and demands significant time and resources. To simplify the process, the Highway Safety Manual (HSM) has started to document SDF models for different types of facilities. As such, SDF models have recently been introduced for freeway and ramps in HSM addendum. However, since these functions or models are fitted and validated using data from a few selected number of states, they are required to be calibrated to the local conditions when applied to a new jurisdiction. The HSM provides a methodology to calibrate the models through a scalar calibration factor. However, the proposed methodology to calibrate SDFs was never validated through research. Furthermore, there are no concrete guidelines to select a reliable sample size. Using extensive simulation, this paper documents an analysis that examined the bias between the 'true' and 'estimated' calibration factors. It was indicated that as the value of the true calibration factor deviates further away from '1', more bias is observed between the 'true' and 'estimated' calibration factors. In addition, simulation studies were performed to determine the calibration sample size for various conditions. It was found that, as the average of the coefficient of variation (CV) of the 'KAB' and 'C' crashes increases, the analyst needs to collect a larger sample size to calibrate SDF models. Taking this observation into account, sample-size guidelines are proposed based on the average CV of crash severities that are used for the calibration process. Copyright © 2016 Elsevier Ltd. All rights reserved.
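The sketch below is a toy Monte Carlo of the calibration idea: observed severe-crash counts are simulated around a known "true" calibration factor and the spread of the estimated factor is examined for different site counts. The crash model and numbers are illustrative, not the HSM SDF form.

```python
import numpy as np

def simulate_calibration(true_c=1.4, n_sites=100, mean_crashes=5.0, n_rep=2000, seed=0):
    """Monte Carlo spread of an estimated scalar calibration factor.

    'predicted' counts come from a baseline model; 'observed' counts are Poisson
    draws around true_c times the prediction (illustrative only).
    """
    rng = np.random.default_rng(seed)
    predicted = rng.gamma(shape=2.0, scale=mean_crashes / 2.0, size=(n_rep, n_sites))
    observed = rng.poisson(true_c * predicted)
    c_hat = observed.sum(axis=1) / predicted.sum(axis=1)   # estimated calibration factor
    return c_hat.mean(), c_hat.std()

for n in (25, 50, 100, 200):
    m, s = simulate_calibration(n_sites=n)
    print(f"n_sites={n:4d}  mean C-hat={m:.3f}  sd={s:.3f}")   # sd shrinks with sample size
```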
Marin, Tania; Taylor, Anne Winifred; Grande, Eleonora Dal; Avery, Jodie; Tucker, Graeme; Morey, Kim
2015-05-19
The considerably lower average life expectancy of Aboriginal and Torres Strait Islander Australians, compared with non-Aboriginal and non-Torres Strait Islander Australians, has been widely reported. Prevalence data for chronic disease and health risk factors are needed to provide evidence-based estimates for Australian Aboriginal and Torres Strait Islander population health planning. Representative surveys for these populations are difficult due to complex methodology. The focus of this paper is to describe in detail the methodological challenges and resolutions of a representative South Australian Aboriginal population-based health survey. Using a stratified multi-stage sampling methodology based on the Australian Bureau of Statistics 2006 Census with culturally appropriate and epidemiologically rigorous methods, 11,428 randomly selected dwellings were approached from a total of 209 census collection districts. All persons eligible for the survey identified as Aboriginal and/or Torres Strait Islander and were selected from dwellings identified as having one or more Aboriginal person(s) living there at the time of the survey. Overall, the 399 interviews from an eligible sample of 691 SA Aboriginal adults yielded a response rate of 57.7%. These face-to-face interviews were conducted by ten interviewers retained from a total of 27 trained Aboriginal interviewers. Challenges were found in three main areas: identification and recruitment of participants; interviewer recruitment and retention; and appropriate engagement with communities. These challenges were resolved, or at least mainly overcome, by following local protocols with communities and their representatives, and reaching agreement on the process of research for Aboriginal people. Obtaining a representative sample of Aboriginal participants in a culturally appropriate way was methodologically challenging and required high levels of commitment and resources. Adhering to these principles has resulted in a rich and unique data set that provides an overview of the self-reported health status for Aboriginal people living in South Australia. This process provides some important principles to be followed when engaging with Aboriginal people and their communities for the purpose of health research.
Weld defect identification in friction stir welding using power spectral density
NASA Astrophysics Data System (ADS)
Das, Bipul; Pal, Sukhomay; Bag, Swarup
2018-04-01
Power spectral density estimates are powerful for extracting useful information retained in a signal. In the current research work, classical periodogram and Welch periodogram algorithms are used to estimate the power spectral density of the vertical force and transverse force signals acquired during the friction stir welding process. The estimated spectral densities reveal notable insight into the identification of defects in friction stir welded samples. It was observed that a higher spectral density in each process signal is a key indication of the presence of possible internal defects in the welded samples. The developed methodology offers preliminary information regarding the presence of internal defects in friction stir welded samples and can best be regarded as a first level of safeguard in monitoring the friction stir welding process.
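For reference, the sketch below estimates the power spectral density of a synthetic force signal with both the classical periodogram and Welch's method via scipy.signal; the sampling rate and signal content are invented for illustration.

```python
import numpy as np
from scipy.signal import periodogram, welch

fs = 1000.0                                   # assumed sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
# synthetic stand-in for a vertical force signal: one harmonic plus noise
force = np.sin(2 * np.pi * 20 * t) + 0.5 * np.random.default_rng(0).normal(size=t.size)

f_per, p_per = periodogram(force, fs=fs)              # classical periodogram
f_wel, p_wel = welch(force, fs=fs, nperseg=2048)      # Welch's averaged periodogram

band = (f_wel > 15) & (f_wel < 25)
print("peak PSD near 20 Hz:", p_wel[band].max())      # elevated PSD levels would flag suspect welds
```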
Sharma, Govind K; Kumar, Anish; Jayakumar, T; Purnachandra Rao, B; Mariyappa, N
2015-03-01
A signal processing methodology is proposed in this paper for effective reconstruction of ultrasonic signals in coarse grained, highly scattering austenitic stainless steel. The proposed methodology comprises Ensemble Empirical Mode Decomposition (EEMD) processing of ultrasonic signals and application of a signal minimisation algorithm on selected Intrinsic Mode Functions (IMFs) obtained by EEMD. The methodology is applied to ultrasonic signals obtained from austenitic stainless steel specimens of different grain size, with and without defects. The influence of probe frequency and data length of a signal on EEMD decomposition is also investigated. For a particular sampling rate and probe frequency, the same range of IMFs can be used to reconstruct the ultrasonic signal, irrespective of the grain size in the range of 30-210 μm investigated in this study. This methodology is successfully employed for detection of defects in 50 mm thick coarse grain austenitic stainless steel specimens. A signal to noise ratio improvement of better than 15 dB is observed for the ultrasonic signal obtained from a 25 mm deep flat bottom hole in a 200 μm grain size specimen. For ultrasonic signals obtained from defects at different depths, a minimum of 7 dB extra enhancement in SNR is achieved as compared to the sum of selected IMFs approach. The application of the minimisation algorithm to the EEMD-processed signal in the proposed methodology proves to be effective for adaptive signal reconstruction with improved signal to noise ratio. This methodology was further employed for successful imaging of defects in a B-scan. Copyright © 2014. Published by Elsevier B.V.
Quantifying chemical reactions by using mixing analysis.
Jurado, Anna; Vázquez-Suñé, Enric; Carrera, Jesús; Tubau, Isabel; Pujades, Estanislao
2015-01-01
This work is motivated by a sound understanding of the chemical processes that affect the organic pollutants in an urban aquifer. We propose an approach to quantify such processes using mixing calculations. The methodology consists of the following steps: (1) identification of the recharge sources (end-members) and selection of the species (conservative and non-conservative) to be used, (2) identification of the chemical processes and (3) evaluation of mixing ratios including the chemical processes. This methodology has been applied in the Besòs River Delta (NE Barcelona, Spain), where the River Besòs is the main aquifer recharge source. A total number of 51 groundwater samples were collected from July 2007 to May 2010 during four field campaigns. Three river end-members were necessary to explain the temporal variability of the River Besòs: one river end-member is from the wet periods (W1) and two are from dry periods (D1 and D2). This methodology has proved to be useful not only to compute the mixing ratios but also to quantify processes such as calcite and magnesite dissolution, aerobic respiration and denitrification undergone at each observation point. Copyright © 2014 Elsevier B.V. All rights reserved.
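A common way to compute mixing ratios from conservative species is a bounded least-squares fit of end-member concentrations to the sample, with a row that forces the ratios to sum to one; the sketch below uses invented concentrations, and the end-member names are borrowed from the abstract only as placeholders.

```python
import numpy as np
from scipy.optimize import lsq_linear

# Columns = end-members (river wet period W1, dry periods D1, D2); rows = conservative species.
# Concentrations are illustrative only.
E = np.array([[120.0, 310.0, 250.0],    # chloride (mg/L)
              [ 15.0,  40.0,  30.0],    # a second conservative tracer
              [  1.0,   1.0,   1.0]])   # last row enforces: ratios sum to 1
sample = np.array([210.0, 26.0, 1.0])

res = lsq_linear(E, sample, bounds=(0.0, 1.0))    # nonnegative mixing ratios in [0, 1]
ratios = res.x / res.x.sum()                      # renormalize small numerical drift
print(dict(zip(["W1", "D1", "D2"], np.round(ratios, 3))))
```

Deviations of non-conservative species from the concentrations predicted by these ratios can then be attributed to reactions such as calcite dissolution or denitrification.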
Analytical control test plan and microbiological methods for the water recovery test
NASA Technical Reports Server (NTRS)
Traweek, M. S. (Editor); Tatara, J. D. (Editor)
1994-01-01
Qualitative and quantitative laboratory results are important to the decision-making process. In some cases, they may represent the only basis for deciding between two or more given options or processes. Therefore, it is essential that handling of laboratory samples and analytical operations employed are performed at a deliberate level of conscientious effort. Reporting erroneous results can lead to faulty interpretations and result in misinformed decisions. This document provides analytical control specifications which will govern future test procedures related to all Water Recovery Test (WRT) Phase 3 activities to be conducted at the National Aeronautics and Space Administration/Marshall Space Flight Center (NASA/MSFC). This document addresses the process which will be used to verify analytical data generated throughout the test period, and to identify responsibilities of key personnel and participating laboratories, the chains of communication to be followed, and ensure that approved methodology and procedures are used during WRT activities. This document does not outline specifics, but provides a minimum guideline by which sampling protocols, analysis methodologies, test site operations, and laboratory operations should be developed.
Defining And Characterizing Sample Representativeness For DWPF Melter Feed Samples
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shine, E. P.; Poirier, M. R.
2013-10-29
Representative sampling is important throughout the Defense Waste Processing Facility (DWPF) process, and the demonstrated success of the DWPF process to achieve glass product quality over the past two decades is a direct result of the quality of information obtained from the process. The objective of this report was to present sampling methods that the Savannah River Site (SRS) used to qualify waste being dispositioned at the DWPF. The goal was to emphasize the methodology, not a list of outcomes from those studies. This methodology includes proven methods for taking representative samples, the use of controlled analytical methods, and data interpretation and reporting that considers the uncertainty of all error sources. Numerous sampling studies were conducted during the development of the DWPF process and still continue to be performed in order to evaluate options for process improvement. Study designs were based on use of statistical tools applicable to the determination of uncertainties associated with the data needs. Successful designs are apt to be repeated, so this report chose only to include prototypic case studies that typify the characteristics of frequently used designs. Case studies have been presented for studying in-tank homogeneity, evaluating the suitability of sampler systems, determining factors that affect mixing and sampling, comparing the final waste glass product chemical composition and durability to that of the glass pour stream sample and other samples from process vessels, and assessing the uniformity of the chemical composition in the waste glass product. Many of these studies efficiently addressed more than one of these areas of concern associated with demonstrating sample representativeness and provide examples of statistical tools in use for DWPF. The time when many of these designs were implemented was in an age when the sampling ideas of Pierre Gy were not as widespread as they are today. Nonetheless, the engineers and statisticians used carefully thought out designs that systematically and economically provided plans for data collection from the DWPF process. Key shared features of the sampling designs used at DWPF and the Gy sampling methodology were the specification of a standard for sample representativeness, an investigation that produced data from the process to study the sampling function, and a decision framework used to assess whether the specification was met based on the data. Without going into detail with regard to the seven errors identified by Pierre Gy, as excellent summaries are readily available such as Pitard [1989] and Smith [2001], SRS engineers understood, for example, that samplers can be biased (Gy's extraction error), and developed plans to mitigate those biases. Experiments that compared installed samplers with more representative samples obtained directly from the tank may not have resulted in systematically partitioning sampling errors into the now well-known error categories of Gy, but did provide overall information on the suitability of sampling systems. Most of the designs in this report are related to the DWPF vessels, not the large SRS Tank Farm tanks. Samples from the DWPF Slurry Mix Evaporator (SME), which contains the feed to the DWPF melter, are characterized using standardized analytical methods with known uncertainty. The analytical error is combined with the established error from sampling and processing in DWPF to determine the melter feed composition.
This composition is used with the known uncertainty of the models in the Product Composition Control System (PCCS) to ensure that the wasteform that is produced is comfortably within the acceptable processing and product performance region. Having the advantage of many years of processing that meets the waste glass product acceptance criteria, the DWPF process has provided a considerable amount of data about itself in addition to the data from many special studies. Demonstrating representative sampling directly from the large Tank Farm tanks is a difficult, if not unsolvable enterprise due to limited accessibility. However, the consistency and the adequacy of sampling and mixing at SRS could at least be studied under the controlled process conditions based on samples discussed by Ray and others [2012a] in Waste Form Qualification Report (WQR) Volume 2 and the transfers from Tanks 40H and 51H to the Sludge Receipt and Adjustment Tank (SRAT) within DWPF. It is important to realize that the need for sample representativeness becomes more stringent as the material gets closer to the melter, and the tanks within DWPF have been studied extensively to meet those needs.
How to: identify non-tuberculous Mycobacterium species using MALDI-TOF mass spectrometry.
Alcaide, F; Amlerová, J; Bou, G; Ceyssens, P J; Coll, P; Corcoran, D; Fangous, M-S; González-Álvarez, I; Gorton, R; Greub, G; Hery-Arnaud, G; Hrábak, J; Ingebretsen, A; Lucey, B; Marekoviċ, I; Mediavilla-Gradolph, C; Monté, M R; O'Connor, J; O'Mahony, J; Opota, O; O'Reilly, B; Orth-Höller, D; Oviaño, M; Palacios, J J; Palop, B; Pranada, A B; Quiroga, L; Rodríguez-Temporal, D; Ruiz-Serrano, M J; Tudó, G; Van den Bossche, A; van Ingen, J; Rodriguez-Sanchez, B
2018-06-01
The implementation of MALDI-TOF MS for microorganism identification has changed the routine of microbiology laboratories as we knew it. Most microorganisms can now be reliably identified within minutes using this inexpensive, user-friendly methodology. However, its application in the identification of mycobacteria isolates has been hampered by the structure of their cell wall. Improvements in the sample processing method and in the available database have proved key factors for the rapid and reliable identification of non-tuberculous mycobacteria isolates using MALDI-TOF MS. The main objective is to provide information about the procedures for the identification of non-tuberculous isolates using MALDI-TOF MS and to review different sample processing methods, available databases, and the interpretation of the results. Results from relevant studies on the use of the available MALDI-TOF MS instruments, the implementation of innovative sample processing methods, or the implementation of improved databases are discussed. Insight into the methodology required for reliable identification of non-tuberculous mycobacteria and its implementation in the microbiology laboratory routine is provided. Microbiology laboratories where MALDI-TOF MS is available can benefit from its capacity to identify most clinically interesting non-tuberculous mycobacteria in a rapid, reliable, and inexpensive manner. Copyright © 2017 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.
Boaretti, Carlo; Roso, Martina; Lorenzetti, Alessandra; Modesti, Michele
2015-07-07
In this study electrospun nanofibers of partially sulfonated polyether ether ketone have been produced as a preliminary step for a possible development of composite proton exchange membranes for fuel cells. Response surface methodology has been employed for the modelling and optimization of the electrospinning process, using a Box-Behnken design. The investigation, based on a second order polynomial model, has been focused on the analysis of the effect of both process (voltage, tip-to-collector distance, flow rate) and material (sulfonation degree) variables on the mean fiber diameter. The final model has been verified by a series of statistical tests on the residuals and validated by a comparison procedure of samples at different sulfonation degrees, realized according to optimized conditions, for the production of homogeneous thin nanofibers.
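The sketch below fits a second-order polynomial response surface to coded factors, in the spirit of the Box-Behnken analysis described above; the factor settings are random placeholders rather than an actual Box-Behnken design matrix, and the simulated diameters are illustrative.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
# Coded factor settings (-1, 0, +1): voltage, tip-to-collector distance, flow rate, sulfonation degree
X = rng.choice([-1.0, 0.0, 1.0], size=(27, 4))
# Simulated mean fiber diameter (nm) with curvature in one factor, for illustration only
y = 300 + 40 * X[:, 0] - 25 * X[:, 2] + 30 * X[:, 0] ** 2 + rng.normal(0, 10, size=27)

quad = PolynomialFeatures(degree=2, include_bias=False)   # linear, interaction and squared terms
model = LinearRegression().fit(quad.fit_transform(X), y)
print("R^2:", model.score(quad.transform(X), y))
# Predicted diameter at the centre point (all factors at their mid level)
print("predicted diameter at centre point:", model.predict(quad.transform([[0, 0, 0, 0]])))
```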
Suhonen, Riitta; Stolt, Minna; Katajisto, Jouko; Leino-Kilpi, Helena
2015-12-01
To report a review of quality regarding sampling, sample and data collection procedures of empirical nursing research of ethical climate studies where nurses were informants. Surveys are needed to obtain generalisable information about topics sensitive to nursing. Methodological quality of the studies is of key concern, especially the description of sampling and data collection procedures. Methodological literature review. Using the electronic MEDLINE database, empirical nursing research articles focusing on ethical climate were accessed in 2013 (earliest-22 November 2013). Using the search terms 'ethical' AND ('climate*' OR 'environment*') AND ('nurse*' OR 'nursing'), 376 citations were retrieved. Based on a four-phase retrieval process, 26 studies were included in the detailed analysis. The sampling method was reported in 58% of the studies, and it was random in a minority of the studies (26%). The identification of the target sample and its size (92%) was reported, whereas justification for sample size was less often given. In over two-thirds (69%) of the studies with an identifiable response rate, it was below 75%. A variety of data collection procedures were used, with a large amount of missing data about the details of who distributed, recruited and collected the questionnaires. Methods to increase response rates were seldom described. Discussion about nonresponse, representativeness of the sample and generalisability of the results was missing in many studies. This review highlights the methodological challenges and developments that need to be considered in ensuring the use of valid information in developing health care through research findings. © 2015 Nordic College of Caring Science.
NEW SAMPLING THEORY FOR MEASURING ECOSYSTEM STRUCTURE
This research considered the application of systems analysis to the study of laboratory ecosystems. The work concerned the development of a methodology which was shown to be useful in the design of laboratory experiments, the processing and interpretation of the results of these ...
Application of Advanced Nondestructive Evaluation Techniques for Cylindrical Composite Test Samples
NASA Technical Reports Server (NTRS)
Martin, Richard E.; Roth, Donald J.; Salem, Jonathan A.
2013-01-01
Two nondestructive methods were applied to composite cylinder samples pressurized to failure in order to determine manufacturing quality and monitor damage progression under load. A unique computed tomography (CT) image processing methodology developed at NASA Glenn Research was used to assess the condition of the as-received samples while acoustic emission (AE) monitoring was used to identify both the extent and location of damage within the samples up to failure. Results show the effectiveness of both of these methods in identifying potentially critical fabrication issues and their resulting impact on performance.
Methodologies for Reservoir Characterization Using Fluid Inclusion Gas Chemistry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dilley, Lorie M.
2015-04-13
The purpose of this project was to: 1) evaluate the relationship between geothermal fluid processes and the compositions of the fluid inclusion gases trapped in the reservoir rocks; and 2) develop methodologies for interpreting fluid inclusion gas data in terms of the chemical, thermal and hydrological properties of geothermal reservoirs. Phase 1 of this project was designed to conduct the following: 1) model the effects of boiling, condensation, conductive cooling and mixing on selected gaseous species using fluid compositions obtained from geothermal wells; 2) evaluate, using quantitative analyses provided by New Mexico Tech (NMT), how these processes are recorded by fluid inclusions trapped in individual crystals; and 3) determine if the results obtained on individual crystals can be applied to the bulk fluid inclusion analyses determined by Fluid Inclusion Technology (FIT). Our initial studies, however, suggested that numerical modeling of the data would be premature. We observed that the gas compositions, determined on bulk and individual samples, were not the same as those discharged by the geothermal wells. Gases discharged from geothermal wells are CO2-rich and contain low concentrations of light gases (i.e. H2, He, N, Ar, CH4). In contrast, many of our samples displayed enrichments in these light gases. Efforts were initiated to evaluate the reasons for the observed gas distributions. As a first step, we examined the potential importance of different reservoir processes using a variety of commonly employed gas ratios (e.g. Giggenbach plots). The second technical target was the development of interpretational methodologies. We have developed methodologies for the interpretation of fluid inclusion gas data, based on the results of Phase 1, geologic interpretation of fluid inclusion data, and integration of the data. These methodologies can be used in conjunction with the relevant geological and hydrological information on the system to create fluid models for the system. The hope is that the methodologies developed will allow bulk fluid inclusion gas analysis to be a useful tool for estimating relative temperatures, identifying the sources and origins of the geothermal fluids, and developing conceptual models that can be used to help target areas of enhanced permeability.
1995 American travel survey : an overview of the survey design and methodology
DOT National Transportation Integrated Search
1995-01-01
This paper describes the methods used in the 1995 ATS. The introduction provides an overview of : the purpose and objectives of the survey followed by a description of the survey and sample designs, survey field operations, and processing of survey d...
Methodology for diagnosing of skin cancer on images of dermatologic spots by spectral analysis.
Guerra-Rosas, Esperanza; Álvarez-Borrego, Josué
2015-10-01
In this paper a new methodology for diagnosing skin cancer on images of dermatologic spots using image processing is presented. Currently skin cancer is one of the most frequent diseases in humans. This methodology is based on Fourier spectral analysis by using filters such as the classic, inverse and k-law nonlinear. The sample images were obtained by a medical specialist and a new spectral technique is developed to obtain a quantitative measurement of the complex pattern found in cancerous skin spots. Finally a spectral index is calculated to obtain a range of spectral indices defined for skin cancer. Our results show a confidence level of 95.4%.
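The sketch below shows one plausible reading of the k-law spectral approach: apply a k-law nonlinearity to the Fourier modulus of the image and collapse the radially integrated spectrum into a single index. The exact filter and index definitions of the paper are not reproduced; everything here is illustrative.

```python
import numpy as np

def spectral_index(image, k=0.3):
    """Scalar index from the k-law processed Fourier spectrum of a gray-level image."""
    F = np.fft.fftshift(np.fft.fft2(image))
    mag = np.abs(F) ** k                                  # k-law nonlinearity on the modulus
    cy, cx = np.array(image.shape) // 2
    y, x = np.indices(image.shape)
    r = np.hypot(y - cy, x - cx).astype(int)
    radial = np.bincount(r.ravel(), weights=mag.ravel())  # radially integrated spectrum
    return radial[1:].sum() / radial[0]                   # energy outside DC, relative to DC

rng = np.random.default_rng(0)
lesion = rng.random((128, 128))                           # placeholder dermatologic spot image
print(spectral_index(lesion))
```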
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pombet, Denis; Desnoyers, Yvon; Charters, Grant
2013-07-01
The TruPro® process enables the collection of a significant number of samples to characterize radiological materials. This innovative and alternative technique is being tested for the ANDRA quality-control inspection of cemented packages. It proves to be quicker and more prolific than the current methodology. Using classical statistics and geostatistics approaches, the physical and radiological characteristics of two hulls containing immobilized wastes (sludges or concentrates) in a hydraulic binder are assessed in this paper. The waste homogeneity is also evaluated in comparison to the ANDRA criterion. Sensitivity to sample size (support effect), presence of extreme values, acceptable deviation rate and minimum number of data are discussed. The final objectives are to check the homogeneity of the two characterized radwaste packages and also to validate and reinforce this alternative characterization methodology. (authors)
Bouchard, Daniel; Wanner, Philipp; Luo, Hong; McLoughlin, Patrick W; Henderson, James K; Pirkle, Robert J; Hunkeler, Daniel
2017-10-20
The methodology of the solvent-based dissolution method used to sample gas phase volatile organic compounds (VOC) for compound-specific isotope analysis (CSIA) was optimized to lower the method detection limits for TCE and benzene. The sampling methodology previously evaluated by [1] consists of pulling air through a solvent to dissolve and accumulate the gaseous VOC. After the sampling process, the solvent can then be treated in the same way as groundwater samples to perform routine CSIA by diluting an aliquot of the solvent into water to reach the required concentration of the targeted contaminant. Among the solvents tested, tetraethylene glycol dimethyl ether (TGDE) showed the best aptitude for the method. TGDE has a great affinity for TCE and benzene, hence efficiently dissolving the compounds during their transition through the solvent. The method detection limit for TCE (5±1 μg/m3) and benzene (1.7±0.5 μg/m3) is lower when using TGDE compared to methanol, which was previously used (385 μg/m3 for TCE and 130 μg/m3 for benzene) [2]. The method detection limit refers to the minimal gas phase concentration in ambient air required to load sufficient VOC mass into TGDE to perform δ13C analysis. Due to a different analytical procedure, the method detection limit associated with δ37Cl analysis was found to be 156±6 μg/m3 for TCE. Furthermore, the experimental results validated the relationship between the gas phase TCE and the progressive accumulation of dissolved TCE in the solvent during the sampling process. Accordingly, based on the air-solvent partitioning coefficient, the sampling methodology (e.g. sampling rate, sampling duration, amount of solvent) and the final TCE concentration in the solvent, the concentration of TCE in the gas phase prevailing during the sampling event can be determined. Moreover, the possibility of analysing the TCE concentration in the solvent after sampling (or that of other targeted VOCs) allows field deployment of the sampling method without the need to determine the initial gas phase TCE concentration. The simplified field deployment approach of the solvent-based dissolution method combined with the conventional analytical procedure used for groundwater samples substantially facilitates the application of CSIA to gas phase studies. Copyright © 2017 Elsevier B.V. All rights reserved.
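The back-calculation of the gas-phase concentration can be illustrated with a simple mass balance over the sampled air volume; the collection-efficiency term below stands in for the air-solvent partitioning behaviour, and all numbers are invented.

```python
def gas_phase_concentration(c_solvent_ug_per_L, v_solvent_mL, flow_L_per_min,
                            duration_min, collection_efficiency=1.0):
    """Average gas-phase VOC concentration (ug/m3) back-calculated from the solvent.

    Mass balance only: mass found in the solvent = C_gas * sampled air volume * efficiency.
    The efficiency term is a placeholder for the air-solvent partitioning behaviour,
    which in practice would be derived from the partition coefficient.
    """
    mass_ug = c_solvent_ug_per_L * (v_solvent_mL / 1000.0)     # VOC mass accumulated in solvent
    air_volume_m3 = flow_L_per_min * duration_min / 1000.0     # total air pulled through
    return mass_ug / (air_volume_m3 * collection_efficiency)

# Illustrative numbers only: 10 mL of TGDE, 0.2 L/min for 8 h, 4 ug/L found in the solvent
print(gas_phase_concentration(4.0, 10.0, 0.2, 8 * 60))   # about 0.4 ug/m3
```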
Methodology for Augmenting Existing Paths with Additional Parallel Transects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, John E.
2013-09-30
Visual Sample Plan (VSP) is sample planning software that is used, among other purposes, to plan transect sampling paths to detect areas that were potentially used for munition training. This module was developed for application on a large site where existing roads and trails were to be used as primary sampling paths. Gap areas between these primary paths needed to be found and covered with parallel transect paths. These gap areas represent areas on the site that are more than a specified distance from a primary path. These added parallel paths needed to optionally be connected together into a single path, the shortest path possible. The paths also needed to optionally be attached to existing primary paths, again with the shortest possible path. Finally, the process must be repeatable and predictable so that the same inputs (primary paths, specified distance, and path options) will result in the same set of new paths every time. This methodology was developed to meet those specifications.
Vernazza, Christopher R; Carr, Katherine; Wildman, John; Gray, Joanne; Holmes, Richard D; Exley, Catherine; Smith, Robert A; Donaldson, Cam
2018-06-22
Resources in any healthcare systems are scarce relative to need and therefore choices need to be made which often involve difficult decisions about the best allocation of these resources. One pragmatic and robust tool to aid resource allocation is Programme Budgeting and Marginal Analysis (PBMA), but there is mixed evidence on its uptake and effectiveness. Furthermore, there is also no evidence on the incorporation of the preferences of a large and representative sample of the general public into such a process. The study therefore aims to undertake, evaluate and refine a PBMA process within the exemplar of NHS dentistry in England whilst also using an established methodology (Willingness to Pay (WTP)) to systematically gather views from a representative sample of the public. Stakeholders including service buyers (commissioners), dentists, dental public health representatives and patient representatives will be recruited to participate in a PBMA process involving defining current spend, agreeing criteria to judge services/interventions, defining areas for investment and disinvestment, rating these areas against the criteria and making final recommendations. The process will be refined based on participatory action research principles and evaluated through semi-structured interviews, focus groups and observation of the process by the research team. In parallel a representative sample of English adults will be recruited to complete a series of four surveys including WTP valuations of programmes being considered by the PBMA panel. In addition a methodological experiment comparing two ways of eliciting WTP will be undertaken. The project will allow the PBMA process and particularly the use of WTP within it to be investigated and developed. There will be challenges around engagement with the task by the panel undertaking it and with the outputs by stakeholders but careful relationship building will help to mitigate this. The large volume of data will be managed through careful segmenting of the analysis and the use of the well-established Framework approach to qualitative data analysis. WTP has various potential biases but the elicitation will be carefully designed to minimise these and some methodological investigation will take place.
Application of gamma-ray spectrometry in a NORM industry for its radiometrical characterization
NASA Astrophysics Data System (ADS)
Mantero, J.; Gázquez, M. J.; Hurtado, S.; Bolívar, J. P.; García-Tenorio, R.
2015-11-01
Industrial activities involving Naturally Occurring Radioactive Materials (NORM) are found among the most important industrial sectors worldwide, such as oil/gas facilities, metal production, the phosphate industry and zircon treatment; the radioactive characterization of the materials involved in their production processes is therefore essential in order to assess the potential radiological risk for workers and the natural environment. High resolution gamma spectrometry is a versatile non-destructive radiometric technique that makes simultaneous determination of several radionuclides possible with little sample preparation. However, NORM samples cover a wide variety of densities and compositions, as opposed to the standards used in gamma efficiency calibration, which are either water-based solutions or standard/reference sources of similar composition. For that reason self-absorption correction effects (especially in the low energy range) must be considered individually in every sample. In this work an experimental and a semi-empirical methodology of self-absorption correction were applied to NORM samples, and the obtained results compared critically, in order to establish the best practice in relation to the circumstances of an individual laboratory. This methodology was applied to samples coming from a TiO2 factory (a NORM industry) located in the south-west of Spain, where the activity concentrations of several radionuclides from the uranium and thorium series were measured through the production process. These results are presented in this work.
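One widely used transmission-based (Cutshall-type) self-absorption correction is sketched below; it is a generic approach rather than necessarily the experimental or semi-empirical method applied in the paper, and the transmittance values are illustrative.

```python
import numpy as np

def cutshall_factor(T):
    """Self-absorption correction factor from a transmission measurement.

    T = (count rate of an external point source through the sample) /
        (count rate through the calibration standard), at the gamma energy of interest.
    The factor tends to 1 as T -> 1 (negligible extra absorption).
    """
    T = np.asarray(T, dtype=float)
    return np.log(T) / (T - 1.0)

# Illustrative transmittance values at a low-energy line (e.g. 46.5 keV of Pb-210)
for T in (0.95, 0.70, 0.40):
    print(T, round(float(cutshall_factor(T)), 3))
# corrected activity concentration = measured value * cutshall_factor(T)
```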
Knopman, Debra S.; Voss, Clifford I.; Garabedian, Stephen P.
1991-01-01
Tests of a one-dimensional sampling design methodology on measurements of bromide concentration collected during the natural gradient tracer test conducted by the U.S. Geological Survey on Cape Cod, Massachusetts, demonstrate its efficacy for field studies of solute transport in groundwater and the utility of one-dimensional analysis. The methodology was applied to design of sparse two-dimensional networks of fully screened wells typical of those often used in engineering practice. In one-dimensional analysis, designs consist of the downstream distances to rows of wells oriented perpendicular to the groundwater flow direction and the timing of sampling to be carried out on each row. The power of a sampling design is measured by its effectiveness in simultaneously meeting objectives of model discrimination, parameter estimation, and cost minimization. One-dimensional models of solute transport, differing in processes affecting the solute and assumptions about the structure of the flow field, were considered for description of tracer cloud migration. When fitting each model using nonlinear regression, additive and multiplicative error forms were allowed for the residuals which consist of both random and model errors. The one-dimensional single-layer model of a nonreactive solute with multiplicative error was judged to be the best of those tested. Results show the efficacy of the methodology in designing sparse but powerful sampling networks. Designs that sample five rows of wells at five or fewer times in any given row performed as well for model discrimination as the full set of samples taken up to eight times in a given row from as many as 89 rows. Also, designs for parameter estimation judged to be good by the methodology were as effective in reducing the variance of parameter estimates as arbitrary designs with many more samples. Results further showed that estimates of velocity and longitudinal dispersivity in one-dimensional models based on data from only five rows of fully screened wells each sampled five or fewer times were practically equivalent to values determined from moments analysis of the complete three-dimensional set of 29,285 samples taken during 16 sampling times.
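To illustrate the kind of one-dimensional analysis referred to above, the sketch below fits a 1-D instantaneous-slug advection-dispersion solution to synthetic row/time observations with multiplicative noise; the velocity, dispersivity and sampling layout are invented, not the Cape Cod values.

```python
import numpy as np
from scipy.optimize import curve_fit

def slug_1d(xt, v, alpha_l, m0):
    """1-D solution for an instantaneous tracer slug: concentration at distance x, time t."""
    x, t = xt
    D = alpha_l * v                                   # longitudinal dispersion coefficient
    return m0 / np.sqrt(4 * np.pi * D * t) * np.exp(-(x - v * t) ** 2 / (4 * D * t))

# Synthetic 'row' observations: five rows of wells, each sampled at five times
rng = np.random.default_rng(0)
x_obs = np.repeat([50.0, 100.0, 150.0, 200.0, 250.0], 5)    # m
t_obs = np.tile([100.0, 200.0, 300.0, 400.0, 500.0], 5)     # days
true = slug_1d((x_obs, t_obs), v=0.5, alpha_l=1.0, m0=500.0)
c_obs = true * rng.lognormal(0.0, 0.1, size=true.size)       # multiplicative error

popt, pcov = curve_fit(slug_1d, (x_obs, t_obs), c_obs, p0=(0.4, 2.0, 300.0))
print("v, alpha_L, M0 estimates:", np.round(popt, 3))
print("standard errors:", np.round(np.sqrt(np.diag(pcov)), 3))
```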
21 CFR 118.7 - Sampling methodology for Salmonella Enteritidis (SE).
Code of Federal Regulations, 2013 CFR
2013-04-01
... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Sampling methodology for Salmonella Enteritidis (SE). 118.7 Section 118.7 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN....7 Sampling methodology for Salmonella Enteritidis (SE). (a) Environmental sampling. An environmental...
21 CFR 118.7 - Sampling methodology for Salmonella Enteritidis (SE).
Code of Federal Regulations, 2014 CFR
2014-04-01
... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Sampling methodology for Salmonella Enteritidis (SE). 118.7 Section 118.7 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN....7 Sampling methodology for Salmonella Enteritidis (SE). (a) Environmental sampling. An environmental...
21 CFR 118.7 - Sampling methodology for Salmonella Enteritidis (SE).
Code of Federal Regulations, 2012 CFR
2012-04-01
... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Sampling methodology for Salmonella Enteritidis (SE). 118.7 Section 118.7 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN....7 Sampling methodology for Salmonella Enteritidis (SE). (a) Environmental sampling. An environmental...
21 CFR 118.7 - Sampling methodology for Salmonella Enteritidis (SE).
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Sampling methodology for Salmonella Enteritidis (SE). 118.7 Section 118.7 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN....7 Sampling methodology for Salmonella Enteritidis (SE). (a) Environmental sampling. An environmental...
21 CFR 118.7 - Sampling methodology for Salmonella Enteritidis (SE).
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Sampling methodology for Salmonella Enteritidis (SE). 118.7 Section 118.7 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN....7 Sampling methodology for Salmonella Enteritidis (SE). (a) Environmental sampling. An environmental...
Framework for managing mycotoxin risks in the food industry.
Baker, Robert C; Ford, Randall M; Helander, Mary E; Marecki, Janusz; Natarajan, Ramesh; Ray, Bonnie
2014-12-01
We propose a methodological framework for managing mycotoxin risks in the food processing industry. Mycotoxin contamination is a well-known threat to public health that has economic significance for the food processing industry; it is imperative to address mycotoxin risks holistically, at all points in the procurement, processing, and distribution pipeline, by tracking the relevant data, adopting best practices, and providing suitable adaptive controls. The proposed framework includes (i) an information and data repository, (ii) a collaborative infrastructure with analysis and simulation tools, (iii) standardized testing and acceptance sampling procedures, and (iv) processes that link the risk assessments and testing results to the sourcing, production, and product release steps. The implementation of suitable acceptance sampling protocols for mycotoxin testing is considered in some detail.
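The acceptance-sampling component mentioned above can be illustrated with the operating characteristic (OC) curve of a simple attribute plan. This is a generic sketch rather than the authors' protocol; the plan parameters n and c are assumptions.

```python
# Minimal sketch of an attribute acceptance-sampling plan: a lot is accepted
# if at most c of n tested increments exceed the mycotoxin limit.
from scipy.stats import binom

def prob_accept(p_exceed, n=10, c=1):
    """Probability of accepting a lot when each increment exceeds the limit with probability p_exceed."""
    return binom.cdf(c, n, p_exceed)

for p in (0.01, 0.05, 0.10, 0.20):
    print(f"exceedance rate {p:.2f}: P(accept) = {prob_accept(p):.3f}")
```

Plotting P(accept) against the exceedance rate gives the OC curve that such protocols use to balance producer and consumer risk.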
RAPID DETECTION METHOD FOR E.COLI, ENTEROCOCCI AND BACTEROIDES IN RECREATIONAL WATER
Current methodology for determining fecal contamination of drinking water sources and recreational waters relies on the time-consuming process of bacterial multiplication and requires at least 24 hours from the time of sampling to the possible determination that the water is unsafe ...
Tack, Lois C; Thomas, Michelle; Reich, Karl
2007-03-01
Forensic labs globally face the same problem: a growing need to process a greater number and wider variety of samples for DNA analysis. The same forensic lab can be tasked all at once with processing mixed casework samples from crime scenes, convicted offender samples for database entry, and tissue from tsunami victims for identification. Besides flexibility in the robotic system chosen for forensic automation, there is a need, for each sample type, to develop new methodology that is not only faster but also more reliable than past procedures. FTA is a chemical treatment of paper, unique to Whatman Bioscience, and is used for the stabilization and storage of biological samples. Here, the authors describe optimization of the Whatman FTA Purification Kit protocol for use with the AmpFlSTR Identifiler PCR Amplification Kit.
Evaluation of soil water stable isotope analysis by H2O(liquid)-H2O(vapor) equilibration method
NASA Astrophysics Data System (ADS)
Gralher, Benjamin; Stumpp, Christine
2014-05-01
Environmental tracers like stable isotopes of water (δ18O, δ2H) have proven to be valuable tools to study water flow and transport processes in soils. Recently, a new technique for soil water isotope analysis has been developed that employs a vapor phase in isothermal equilibrium with the liquid phase of interest. This has increased the potential application of water stable isotopes in unsaturated zone studies as it supersedes laborious extraction of soil water. However, uncertainties of analysis and influencing factors need to be considered. Therefore, the objective of this study was to evaluate different methodologies of analysing stable isotopes in soil water in order to reduce measurement uncertainty. The methodologies included different preparation procedures of soil cores for equilibration of vapor and soil water as well as raw data correction. Two different inflatable sample containers (freezer bags, bags containing a metal layer) and equilibration atmospheres (N2, dry air) were tested. The results showed that uncertainties for δ18O were higher than for δ2H, which could not be attributed to any specific detail of the processing routine. In particular, soil samples with high contents of organic matter showed an apparent isotope enrichment indicative of fractionation due to evaporation. However, comparison of water samples obtained from suction cups with the local meteoric water line indicated negligible fractionation processes in the investigated soils. Therefore, a method was developed to correct the raw data, reducing the uncertainties of the analysis. We conclude that the evaluated method is advantageous over traditional methods regarding simplicity, resource requirements and sample throughput, but careful consideration needs to be given to sample handling and data processing. Thus, stable isotopes of water remain a good tool to determine water flow and transport processes in the unsaturated zone.
Hepburn, Sophie; Cairns, David A; Jackson, David; Craven, Rachel A; Riley, Beverley; Hutchinson, Michelle; Wood, Steven; Smith, Matthew Welberry; Thompson, Douglas; Banks, Rosamonde E
2015-06-01
We have examined the impact of sample processing time delay, temperature, and the addition of protease inhibitors (PIs) on the urinary proteome and peptidome, an important aspect of biomarker studies. Ten urine samples from patients with varying pathologies were each divided and PIs added to one-half, with aliquots of each then processed and frozen immediately, or after a delay of 6 h at 4°C or room temperature (20-22°C), effectively yielding 60 samples in total. Samples were then analyzed by 2D-PAGE, SELDI-TOF-MS, and immunoassay. Interindividual variability in profiles was the dominant feature in all analyses. Minimal changes were observed by 2D-PAGE as a result of delay in processing, temperature, or PIs and no changes were seen in IgG, albumin, β2-microglobulin, or α1-microglobulin measured by immunoassay. Analysis of peptides showed clustering of some samples by presence/absence of PIs but the extent was very patient-dependent with most samples showing minimal effects. The extent of processing-induced changes and the benefit of PI addition are patient- and sample-dependent. A consistent processing methodology is essential within a study to avoid any confounding of the results. © 2014 The Authors PROTEOMICS Clinical Applications Published by Wiley-VCH Verlag GmbH & Co. KGaA.
BACKWARD ESTIMATION OF STOCHASTIC PROCESSES WITH FAILURE EVENTS AS TIME ORIGINS
Gary Chan, Kwun Chuen; Wang, Mei-Cheng
2011-01-01
Stochastic processes often exhibit sudden systematic changes in pattern a short time before certain failure events. Examples include increase in medical costs before death and decrease in CD4 counts before AIDS diagnosis. To study such terminal behavior of stochastic processes, a natural and direct way is to align the processes using failure events as time origins. This paper studies backward stochastic processes counting time backward from failure events, and proposes one-sample nonparametric estimation of the mean of backward processes when follow-up is subject to left truncation and right censoring. We will discuss benefits of including prevalent cohort data to enlarge the identifiable region and large sample properties of the proposed estimator with related extensions. A SEER–Medicare linked data set is used to illustrate the proposed methodologies. PMID:21359167
Surface laser marking optimization using an experimental design approach
NASA Astrophysics Data System (ADS)
Brihmat-Hamadi, F.; Amara, E. H.; Lavisse, L.; Jouvard, J. M.; Cicala, E.; Kellou, H.
2017-04-01
Laser surface marking is performed on a titanium substrate using a pulsed frequency-doubled Nd:YAG laser (λ = 532 nm, τpulse = 5 ns) to process the substrate surface under normal atmospheric conditions. The aim of the work is to investigate, following experimental and statistical approaches, the correlation between the process parameters and the response variables (output), using Design of Experiments (DOE) methods: Taguchi methodology and response surface methodology (RSM). A design is first created using the MINITAB program, and the laser marking process is then performed according to the planned design. The response variables, surface roughness and surface reflectance, were measured for each sample and incorporated into the design matrix. The results are then analyzed and the RSM model is developed and verified for predicting the process output for a given set of process parameter values. The analysis shows that the laser beam scanning speed is the most influential operating factor, followed by the laser pumping intensity during marking, while the other factors show complex influences on the objective functions.
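A minimal sketch of the response-surface step follows, assuming synthetic data and two coded factors; it is not the authors' MINITAB model, but it shows how a full second-order polynomial is fitted by least squares.

```python
# Minimal sketch of fitting a second-order response surface linking two coded
# factors (scan speed, pump intensity) to a measured response such as surface
# roughness.  The data are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(1)
speed, pump = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
x1, x2 = speed.ravel(), pump.ravel()
y = 1.2 - 0.8 * x1 + 0.3 * x2 + 0.4 * x1**2 + 0.1 * x1 * x2 + rng.normal(0, 0.05, x1.size)

# Design matrix for b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted coefficients:", np.round(coef, 3))   # the linear scan-speed term dominates in this synthetic fit
```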
Applying a contemporary grounded theory methodology.
Licqurish, Sharon; Seibold, Carmel
2011-01-01
The aim of this paper is to discuss the application of a contemporary grounded theory methodology to a research project exploring the experiences of students studying for a degree in midwifery. Grounded theory is a qualitative research approach developed by Glaser and Strauss in the 1960s but the methodology for this study was modelled on Clarke's (2005) approach and was underpinned by a symbolic interactionist theoretical perspective, post-structuralist theories of Michel Foucault and a constructionist epistemology. The study participants were 19 midwifery students completing their final placement. Data were collected through individual in-depth interviews and participant observation, and analysed using the grounded theory analysis techniques of coding, constant comparative analysis and theoretical sampling, as well as situational maps. The analysis focused on social action and interaction and the operation of power in the students' environment. The social process in which the students were involved, as well as the actors and discourses that affected the students' competency development, were highlighted. The methodology allowed a thorough exploration of the students' experiences of achieving competency. However, some difficulties were encountered. One of the major issues related to the understanding and application of complex sociological theories that challenged positivist notions of truth and power. Furthermore, the mapping processes were complex. Despite these minor challenges, the authors recommend applying this methodology to other similar research projects.
Ligand diffusion in proteins via enhanced sampling in molecular dynamics.
Rydzewski, J; Nowak, W
2017-12-01
Computational simulations in biophysics describe the dynamics and functions of biological macromolecules at the atomic level. Among motions particularly important for life are the transport processes in heterogeneous media. The process of ligand diffusion inside proteins is an example of a complex rare event that can be modeled using molecular dynamics simulations. The study of physical interactions between a ligand and its biological target is of paramount importance for the design of novel drugs and enzymes. Unfortunately, the process of ligand diffusion is difficult to study experimentally. The need for identifying the ligand egress pathways and understanding how ligands migrate through protein tunnels has spurred the development of several methodological approaches to this problem. The complex topology of protein channels and the transient nature of the ligand passage pose difficulties in the modeling of the ligand entry/escape pathways by canonical molecular dynamics simulations. In this review, we report a methodology involving a reconstruction of the ligand diffusion reaction coordinates and the free-energy profiles along these reaction coordinates using enhanced sampling of conformational space. We illustrate the above methods on several ligand-protein systems, including cytochromes and G-protein-coupled receptors. The methods are general and may be adopted to other transport processes in living matter. Copyright © 2017 Elsevier B.V. All rights reserved.
Risk-based Methodology for Validation of Pharmaceutical Batch Processes.
Wiles, Frederick
2013-01-01
In January 2011, the U.S. Food and Drug Administration published new process validation guidance for pharmaceutical processes. The new guidance debunks the long-held industry notion that three consecutive validation batches or runs are all that are required to demonstrate that a process is operating in a validated state. Instead, the new guidance now emphasizes that the level of monitoring and testing performed during process performance qualification (PPQ) studies must be sufficient to demonstrate statistical confidence both within and between batches. In some cases, three qualification runs may not be enough. Nearly two years after the guidance was first published, little has been written defining a statistical methodology for determining the number of samples and qualification runs required to satisfy Stage 2 requirements of the new guidance. This article proposes using a combination of risk assessment, control charting, and capability statistics to define the monitoring and testing scheme required to show that a pharmaceutical batch process is operating in a validated state. In this methodology, an assessment of process risk is performed through application of a process failure mode, effects, and criticality analysis (PFMECA). The output of PFMECA is used to select appropriate levels of statistical confidence and coverage which, in turn, are used in capability calculations to determine when significant Stage 2 (PPQ) milestones have been met. The achievement of Stage 2 milestones signals the release of batches for commercial distribution and the reduction of monitoring and testing to commercial production levels. Individuals, moving range, and range/sigma charts are used in conjunction with capability statistics to demonstrate that the commercial process is operating in a state of statistical control. The new process validation guidance published by the U.S. Food and Drug Administration in January of 2011 indicates that the number of process validation batches or runs required to demonstrate that a pharmaceutical process is operating in a validated state should be based on sound statistical principles. The old rule of "three consecutive batches and you're done" is no longer sufficient. The guidance, however, does not provide any specific methodology for determining the number of runs required, and little has been published to augment this shortcoming. The paper titled "Risk-based Methodology for Validation of Pharmaceutical Batch Processes" describes a statistically sound methodology for determining when a statistically valid number of validation runs has been acquired based on risk assessment and calculation of process capability.
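The statistical building blocks named in the abstract, a capability index and a confidence/coverage criterion, can be sketched as below. The specification limits, the 95% confidence and 99% coverage choices, and the 30 assay results are illustrative assumptions, not the author's PFMECA-driven values.

```python
# Minimal sketch of a capability index (Ppk) and a one-sided normal tolerance
# bound with stated confidence and coverage; not the author's full workflow.
import numpy as np
from scipy.stats import norm, nct

def ppk(data, lsl, usl):
    m, s = np.mean(data), np.std(data, ddof=1)
    return min((usl - m) / (3 * s), (m - lsl) / (3 * s))

def one_sided_tolerance_k(n, coverage=0.99, confidence=0.95):
    """k such that mean + k*s bounds `coverage` of the population with `confidence` (normal data)."""
    delta = norm.ppf(coverage) * np.sqrt(n)
    return nct.ppf(confidence, df=n - 1, nc=delta) / np.sqrt(n)

rng = np.random.default_rng(2)
assay = rng.normal(100.0, 1.2, 30)                 # e.g. 30 PPQ assay results (% label claim)
print("Ppk =", round(ppk(assay, 95.0, 105.0), 2))
print("upper tolerance bound =", round(assay.mean() + one_sided_tolerance_k(30) * assay.std(ddof=1), 2))
```

In this framing, a Stage 2 milestone is reached when the tolerance bound sits inside the specification limits at the confidence and coverage levels chosen from the risk assessment.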
Auditory Backward Masking Deficits in Children with Reading Disabilities
ERIC Educational Resources Information Center
Montgomery, Christine R.; Morris, Robin D.; Sevcik, Rose A.; Clarkson, Marsha G.
2005-01-01
Studies evaluating temporal auditory processing among individuals with reading and other language deficits have yielded inconsistent findings due to methodological problems (Studdert-Kennedy & Mody, 1995) and sample differences. In the current study, seven auditory masking thresholds were measured in fifty-two 7- to 10-year-old children (26…
NASA Astrophysics Data System (ADS)
Guan, Xuefei; Rasselkorde, El Mahjoub; Abbasi, Waheed; Zhou, S. Kevin
2015-03-01
The study presents a data processing methodology for weld build-up using multiple scan patterns. To achieve an overall high probability of detection for flaws with different orientations, an inspection procedure with three different scan patterns is proposed. The three scan patterns are radial-tangential longitude wave pattern, axial-radial longitude wave pattern, and tangential shear wave pattern. Scientific fusion of the inspection data is implemented using volume reconstruction techniques. The idea is to perform spatial domain forward data mapping for all sampling points. A conservative scheme is employed to handle the case that multiple sampling points are mapped to one grid location. The scheme assigns the maximum value for the grid location to retain the largest equivalent reflector size for the location. The methodology is demonstrated and validated using a realistic ring of weld build-up. Tungsten balls and bars are embedded to the weld build-up during manufacturing process to represent natural flaws. Flat bottomed holes and side drilled holes are installed as artificial flaws. Automatic flaw identification and extraction are demonstrated. Results indicate the inspection procedure with multiple scan patterns can identify all the artificial and natural flaws.
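The conservative fusion scheme described above can be sketched as a nearest-cell mapping that keeps the maximum amplitude per grid location. Grid size, spacing and the random sampling points are illustrative assumptions rather than the authors' scan geometry.

```python
# Minimal sketch: map each sampling point to its nearest grid cell and keep
# the maximum amplitude per cell so the largest equivalent reflector is kept.
import numpy as np

def fuse_to_grid(points, amplitudes, grid_shape, spacing):
    """points: (N,3) coordinates; amplitudes: (N,); returns a dense 3-D grid of max amplitudes."""
    grid = np.zeros(grid_shape)
    idx = np.clip(np.round(points / spacing).astype(int), 0, np.array(grid_shape) - 1)
    for (i, j, k), a in zip(idx, amplitudes):
        grid[i, j, k] = max(grid[i, j, k], a)   # conservative: keep the largest value per cell
    return grid

rng = np.random.default_rng(3)
pts = rng.uniform(0, 100, size=(5000, 3))        # mm, pooled from the three scan patterns
amp = rng.rayleigh(0.1, size=5000)
volume = fuse_to_grid(pts, amp, grid_shape=(50, 50, 50), spacing=2.0)
print("grid max:", volume.max())
```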
Mayer, B; Muche, R
2013-01-01
Animal studies are highly relevant for basic medical research, although their usage is discussed controversially in public. Thus, an optimal sample size should be aimed for in these projects from a biometrical point of view. Statistical sample size calculation is usually the appropriate methodology in planning medical research projects. However, the required information is often not valid or becomes available only during the course of an animal experiment. This article critically discusses the validity of formal sample size calculation for animal studies. Within the discussion, some requirements are formulated to fundamentally regulate the process of sample size determination for animal experiments.
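For context, a conventional two-sample calculation of the kind the article scrutinises is sketched below; the assumed effect size is precisely the input that is often unknown before an animal experiment, which is the article's central concern.

```python
# Minimal sketch: group size for a two-sided two-sample test given an assumed
# standardized effect size, alpha and power (normal approximation).
from math import ceil
from scipy.stats import norm

def n_per_group(effect_size, alpha=0.05, power=0.80):
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return ceil(2 * (z / effect_size) ** 2)

print(n_per_group(0.8))   # assumed "large" effect size
print(n_per_group(0.5))   # assumed "medium" effect size
```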
Assessing technical performance at diverse ambulatory care sites.
Osterweis, M; Bryant, E
1978-01-01
The purpose of the large study reported here was to develop and test methods for assessing the quality of health care that would be broadly applicable to diverse ambulatory care organizations for periodic comparative review. Methodological features included the use of an age-sex stratified random sampling scheme, dependence on medical records as the source of data, a fixed study period year, use of Kessner's tracer methodology (including not only acute and chronic diseases but also screening and immunization rates as indicators), and a fixed tracer matrix at all test sites. This combination of methods proved more efficacious in estimating certain parameters for the total patient populations at each site (including utilization patterns, screening, and immunization rates) and the process of care for acute conditions than it did in examining the process of care for the selected chronic condition. It was found that the actual process of care at all three sites for the three acute conditions (streptococcal pharyngitis, urinary tract infection, and iron deficiency anemia) often differed from the expected process in terms of both diagnostic procedures and treatment. For hypertension, the chronic disease tracer, medical records were frequently a deficient data source from which to draw conclusions about the adequacy of treatment. Several aspects of the study methodology were found to be detrimental to between-site comparisons of the process of care for chronic disease management. The use of an age-sex stratified random sampling scheme resulted in the identification of too few cases of hypertension at some sites for analytic purposes, thereby necessitating supplementary sampling by diagnosis. The use of a fixed study period year resulted in an arbitrary starting point in the course of the disease. Furthermore, in light of the diverse sociodemographic characteristics of the patient populations, the use of a fixed matrix of tracer conditions for all test sites is questionable. The discussion centers on these and other problems encountered in attempting to compare technical performance within diverse ambulatory care organizations and provides some guidelines as to the utility of alternative methods for assessing the quality of health care.
Methodological strategies in using home sleep apnea testing in research and practice.
Miller, Jennifer N; Schulz, Paula; Pozehl, Bunny; Fiedler, Douglas; Fial, Alissa; Berger, Ann M
2017-11-14
Home sleep apnea testing (HSAT) has increased due to improvements in technology, accessibility, and changes in third-party reimbursement requirements. Research studies using HSAT have not consistently reported procedures and methodological challenges. This paper had two objectives: (1) summarize the literature on the use of HSAT in research on adults and (2) identify methodological strategies to use in research and practice to standardize HSAT procedures and information. The search strategy included studies of participants undergoing sleep testing for OSA using HSAT, searching MEDLINE via PubMed, CINAHL, and Embase with the following search terms: "polysomnography," "home," "level III," "obstructive sleep apnea," and "out of center testing." Research articles that met inclusion criteria (n = 34) inconsistently reported methods and methodological challenges in terms of: (a) participant sampling; (b) instrumentation issues; (c) clinical variables; (d) data processing; and (e) patient acceptability. Ten methodological strategies were identified for adoption when using HSAT in research and practice. Future studies need to address the methodological challenges summarized in this paper as well as identify and report consistent HSAT procedures and information.
A systematic review of grounded theory studies in physiotherapy.
Ali, Nancy; May, Stephen; Grafton, Kate
2018-05-23
This systematic review aimed at appraising the methodological rigor of grounded theory research published in the field of physiotherapy to assess how the methodology is understood and applied. A secondary aim was to provide research implications drawn from the findings to guide future grounded theory methodology (GTM) research. A systematic search was conducted in MEDLINE, CINHAL, SPORT Discus, Science Direct, PubMed, Scopus, and Web of Science to identify studies in the field of physiotherapy that reported using GTM and/or methods in the study title and/or abstract. The descriptive characteristics and methodological quality of eligible studies were examined using grounded theory methodology assessment guidelines. The review included 68 studies conducted between 1998 and 2017. The findings showed that GTM is becoming increasingly used by physiotherapy researchers. Thirty-six studies (53%) demonstrated a good understanding and appropriate application of GTM. Thirty-two studies (47%) presented descriptive findings and were considered to be of poor methodological quality. There are several key tenets of GTM that are integral to the iterative process of qualitative theorizing and need to be applied throughout all research practices including sampling, data collection, and analysis.
NASA Astrophysics Data System (ADS)
van Eycke, Yves-Rémi; Allard, Justine; Salmon, Isabelle; Debeir, Olivier; Decaestecker, Christine
2017-02-01
Immunohistochemistry (IHC) is a widely used technique in pathology to evidence protein expression in tissue samples. However, this staining technique is known for presenting inter-batch variations. Whole slide imaging in digital pathology offers a possibility to overcome this problem by means of image normalisation techniques. In the present paper we propose a methodology to objectively evaluate the need of image normalisation and to identify the best way to perform it. This methodology uses tissue microarray (TMA) materials and statistical analyses to evidence the possible variations occurring at colour and intensity levels as well as to evaluate the efficiency of image normalisation methods in correcting them. We applied our methodology to test different methods of image normalisation based on blind colour deconvolution that we adapted for IHC staining. These tests were carried out for different IHC experiments on different tissue types and targeting different proteins with different subcellular localisations. Our methodology enabled us to establish and to validate inter-batch normalization transforms which correct the non-relevant IHC staining variations. The normalised image series were then processed to extract coherent quantitative features characterising the IHC staining patterns.
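The colour-deconvolution step that underlies normalisation methods of this kind can be sketched as below. This is not the authors' blind estimation procedure; it uses the widely quoted Ruifrok and Johnston haematoxylin/DAB vectors as an assumed reference stain matrix.

```python
# Minimal sketch of colour deconvolution for IHC: convert RGB to optical
# density via Beer-Lambert and unmix with a reference stain matrix.
import numpy as np

STAINS = np.array([[0.650, 0.704, 0.286],    # haematoxylin (reference values, an assumption here)
                   [0.268, 0.570, 0.776],    # DAB
                   [0.000, 0.000, 0.000]])   # residual channel, filled below
STAINS[2] = np.cross(STAINS[0], STAINS[1])
STAINS /= np.linalg.norm(STAINS, axis=1, keepdims=True)

def deconvolve(rgb_image):
    """rgb_image: float array in (0, 1], shape (H, W, 3) -> per-stain optical densities."""
    od = -np.log(np.maximum(rgb_image, 1e-6))           # Beer-Lambert optical density
    return od.reshape(-1, 3) @ np.linalg.inv(STAINS)    # concentrations per stain channel

img = np.random.default_rng(4).uniform(0.2, 1.0, (8, 8, 3))   # toy tile standing in for a TMA core
print(deconvolve(img).reshape(8, 8, 3)[..., 1].mean())        # mean DAB signal of the toy tile
```

Normalisation methods of the kind compared in the paper operate on such per-stain channels before recombining them, which is what allows inter-batch intensity shifts to be corrected.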
Fluorescence Spectroscopy for the Monitoring of Food Processes.
Ahmad, Muhammad Haseeb; Sahar, Amna; Hitzmann, Bernd
Different analytical techniques have been used to examine the complexity of food samples. Among them, fluorescence spectroscopy cannot be ignored in developing rapid and non-invasive analytical methodologies. It is one of the most sensitive spectroscopic approaches employed in the identification, classification, authentication, quantification, and optimization of different parameters during food handling, processing, and storage, in combination with different chemometric tools. Chemometrics helps to retrieve useful information from spectral data utilized in the characterization of food samples. This contribution discusses in detail the potential of fluorescence spectroscopy of different foods, such as dairy, meat, fish, eggs, edible oil, cereals, fruit, and vegetables, for qualitative and quantitative analysis with different chemometric approaches.
do Nascimento, Cássio; dos Santos, Janine Navarro; Pedrazzi, Vinícius; Pita, Murillo Sucena; Monesi, Nadia; Ribeiro, Ricardo Faria; de Albuquerque, Rubens Ferreira
2014-01-01
Molecular diagnosis methods have been largely used in epidemiological or clinical studies to detect and quantify microbial species that may colonize the oral cavity in health or disease. The preservation of genetic material from samples remains the major challenge to ensure the feasibility of these methodologies. Long-term storage may compromise the final result. The aim of this study was to evaluate the effect of storage temperature and time on the microbial detection of oral samples by Checkerboard DNA-DNA hybridization. Saliva and supragingival biofilm were taken from 10 healthy subjects, aliquoted (n=364) and processed according to proposed protocols: immediate processing and processing after 2 or 4 weeks, and 6 or 12 months of storage at 4°C, -20°C and -80°C. Both total and individual microbial counts were lower for samples processed after 12 months of storage, irrespective of the temperatures tested. Samples stored up to 6 months at cold temperatures showed counts similar to those of immediately processed samples. The microbial incidence was also significantly reduced in samples stored for 12 months at all temperatures. The temperature and storage time of oral samples have a relevant impact on the detection and quantification of bacterial and fungal species by the Checkerboard DNA-DNA hybridization method. Samples should be processed immediately after collection or within 6 months if conserved at cold temperatures to avoid false-negative results. Copyright © 2013 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Clayton, Berwyn; Fisher, Thea; Harris, Roger; Bateman, Andrea; Brown, Mike
2008-01-01
This document supports the report "A Study in Difference: Structures and Cultures in Registered Training Organisations." The first section outlines the methodology used to undertake the research and covers the design of the research, sample details, the data collection process and the strategy for data analysis and reporting. The…
Achieving Gender Equity in Science Class: Shift from Competition to Cooperative Learning
ERIC Educational Resources Information Center
Esiobu, G. O.
2011-01-01
Purpose: This study aims to verify the impact of cooperative learning as an intervention strategy towards the achievement of peace, equality and equity in the science classroom as part of the democratic process necessary for sustainable development. Design/methodology/approach: The study sample comprised 56 SSS 2 students in one public…
Documentation for the 2003-04 Schools and Staffing Survey. NCES 2007-337
ERIC Educational Resources Information Center
Tourkin, Steven C.; Warner, Toni; Parmer, Randall; Cole, Cornette; Jackson, Betty; Zukerberg, Andrew; Cox, Shawna; Soderberg, Andrew
2007-01-01
This report serves as the survey documentation for the design and implementation of the 2003-04 Schools and Staffing Survey. Topics covered include the sample design, survey methodology, data collection procedures, data processing, response rates, imputation procedures, weighting and variance estimation, review of the quality of data, the types of…
ERIC Educational Resources Information Center
Quintans, C.; Colmenar, A.; Castro, M.; Moure, M. J.; Mandado, E.
2010-01-01
ADCs (analog-to-digital converters), especially Pipeline and Sigma-Delta converters, are designed using complex architectures in order to increase their sampling rate and/or resolution. Consequently, the learning of ADC devices also encompasses complex concepts such as multistage synchronization, latency, oversampling, modulation, noise shaping,…
Computational neural learning formalisms for manipulator inverse kinematics
NASA Technical Reports Server (NTRS)
Gulati, Sandeep; Barhen, Jacob; Iyengar, S. Sitharama
1989-01-01
An efficient, adaptive neural learning paradigm for addressing the inverse kinematics of redundant manipulators is presented. The proposed methodology exploits the infinite local stability of terminal attractors - a new class of mathematical constructs which provide unique information processing capabilities to artificial neural systems. For robotic applications, synaptic elements of such networks can rapidly acquire the kinematic invariances embedded within the presented samples. Subsequently, joint-space configurations, required to follow arbitrary end-effector trajectories, can readily be computed. In a significant departure from prior neuromorphic learning algorithms, this methodology provides mechanisms for incorporating an in-training skew to handle kinematics and environmental constraints.
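A plain feed-forward sketch of the supervised set-up is given below for orientation. It is not the terminal-attractor network of the paper; the two-link planar arm, link lengths and network size are illustrative assumptions.

```python
# Minimal sketch: learn the inverse kinematics of a two-link planar arm from
# (end-effector position -> joint angle) samples with a generic feed-forward net.
import numpy as np
from sklearn.neural_network import MLPRegressor

L1, L2 = 1.0, 0.8
rng = np.random.default_rng(5)
theta = rng.uniform([0.1, 0.1], [np.pi - 0.1, np.pi - 0.1], size=(5000, 2))
xy = np.column_stack([L1 * np.cos(theta[:, 0]) + L2 * np.cos(theta[:, 0] + theta[:, 1]),
                      L1 * np.sin(theta[:, 0]) + L2 * np.sin(theta[:, 0] + theta[:, 1])])

net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0).fit(xy, theta)
q = net.predict(xy[:1])[0]                     # joint angles proposed for the first target
x_hat = L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1])
y_hat = L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])
print("target", xy[0], "reached", (x_hat, y_hat))
```

Restricting the second joint angle to (0, π) keeps the inverse map single-valued, so the regression problem is well posed in this toy setting.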
NASA Astrophysics Data System (ADS)
Guerrero Prado, Patricio; Nguyen, Mai K.; Dumas, Laurent; Cohen, Serge X.
2017-01-01
Characterization and interpretation of flat ancient material objects, such as those found in archaeology, paleoenvironments, paleontology, and cultural heritage, have remained a challenging task to perform by means of conventional x-ray tomography methods due to their anisotropic morphology and flattened geometry. To overcome the limitations of the mentioned methodologies for such samples, an imaging modality based on Compton scattering is proposed in this work. Classical x-ray tomography treats Compton scattering data as noise in the image formation process, while in Compton scattering tomography the conditions are set such that Compton data become the principal image contrasting agent. Under these conditions, we are able, first, to avoid relative rotations between the sample and the imaging setup, and second, to obtain three-dimensional data even when the object is supported by a dense material by exploiting backscattered photons. Mathematically this problem is addressed by means of a conical Radon transform and its inversion. The image formation process and object reconstruction model are presented. The feasibility of this methodology is supported by numerical simulations.
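For reference, the standard Compton relation that links the detected photon energy to the scattering angle, and hence fixes the cone integrated by a conical Radon transform, is recalled below; it is textbook physics rather than a result of the paper.

```latex
% Standard Compton relation between the detected photon energy E' and the
% scattering angle \omega for incident energy E_0; measuring E' selects a
% cone of possible scattering sites.
\[
  E'(\omega) \;=\; \frac{E_0}{1 + \dfrac{E_0}{m_e c^2}\,\bigl(1 - \cos\omega\bigr)},
  \qquad
  \cos\omega \;=\; 1 - \frac{m_e c^2\,(E_0 - E')}{E_0\,E'} .
\]
```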
NASA Astrophysics Data System (ADS)
Brookman, Tom; Whittaker, Thomas
2012-09-01
Stable isotope dendroclimatology using α-cellulose has unique potential to deliver multimillennial-scale, sub-annually resolved, terrestrial climate records. However, lengthy processing and analytical methods often preclude such reconstructions. Variants of the Brendel extraction method have reduced these limitations, providing fast, easy methods of isolating α-cellulose in some species. Here, we investigate application of Standard Brendel (SBrendel) variants to resinous soft-woods by treating samples of kauri (Agathis australis), ponderosa pine (Pinus ponderosa) and huon pine (Lagarastrobus franklinii), varying reaction vessel, temperature, boiling time and reagent volume. Numerous samples were visibly 'under-processed' and Fourier transform infrared (FTIR) spectroscopic investigation showed absorption peaks at 1520 cm⁻¹ and ~1600 cm⁻¹ in those fibers, suggesting residual lignin and retained resin, respectively. Replicate analyses of all samples processed at high temperature yielded consistent δ13C and δ18O despite color and spectral variations. Spectra and isotopic data revealed that α-cellulose δ13C can be altered during processing, most likely due to chemical contamination from insufficient acetone removal, but is not systematically affected by methodological variation. Reagent amount, temperature and extraction time all influence δ18O, however, and our results demonstrate that different species may require different processing methods. FTIR prior to isotopic analysis is a fast and cost-effective way to determine α-cellulose extract purity. Furthermore, a systematic isotopic test such as we present here can also determine sensitivity of isotopic values to methodological variables. Without these tests, isotopic variability introduced by the method could obscure or 'create' climatic signals within a data set.
Rapid development of xylanase assay conditions using Taguchi methodology.
Prasad Uday, Uma Shankar; Bandyopadhyay, Tarun Kanti; Bhunia, Biswanath
2016-11-01
The present investigation is mainly concerned with the rapid development of extracellular xylanase assay conditions by using Taguchi methodology. The extracellular xylanase was produced from Aspergillus niger (KP874102.1), a new strain isolated from a soil sample of the Baramura forest, Tripura West, India. Four physical parameters including temperature, pH, buffer concentration and incubation time were considered as key factors for xylanase activity and were optimized using Taguchi robust design methodology for enhanced xylanase activity. The main effect, interaction effects and optimal levels of the process factors were determined using signal-to-noise (S/N) ratio. The Taguchi method recommends the use of S/N ratio to measure quality characteristics. Based on analysis of the S/N ratio, optimal levels of the process factors were determined. Analysis of variance (ANOVA) was performed to evaluate statistically significant process factors. ANOVA results showed that temperature contributed the maximum impact (62.58%) on xylanase activity, followed by pH (22.69%), buffer concentration (9.55%) and incubation time (5.16%). Predicted results showed that enhanced xylanase activity (81.47%) can be achieved with pH 2, temperature 50°C, buffer concentration 50 mM and incubation time 10 min.
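The signal-to-noise ratio used in Taguchi analysis of a response to be maximised, such as enzyme activity, is the larger-the-better form recalled below; this is the standard definition, quoted here for reference rather than taken verbatim from the paper.

```latex
% Larger-the-better S/N ratio in Taguchi analysis, where y_1,...,y_n are the
% replicate responses (here, xylanase activities) at one factor-level setting:
\[
  \mathrm{S/N} \;=\; -10\,\log_{10}\!\Bigl(\frac{1}{n}\sum_{i=1}^{n}\frac{1}{y_i^{2}}\Bigr).
\]
```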
A time series model: First-order integer-valued autoregressive (INAR(1))
NASA Astrophysics Data System (ADS)
Simarmata, D. M.; Novkaniza, F.; Widyaningsih, Y.
2017-07-01
Nonnegative integer-valued time series arise in many applications. The first-order integer-valued autoregressive model (INAR(1)) is constructed with the binomial thinning operator to model such series; INAR(1) depends on the value of the process one period before. The model parameter can be estimated by conditional least squares (CLS), and the specification of INAR(1) follows that of AR(1). Forecasting in INAR(1) uses a median or a Bayesian forecasting methodology. The median forecast is the smallest integer s whose cumulative distribution function (CDF) value is at least 0.5. The Bayesian methodology forecasts h steps ahead by generating the model parameter and the innovation parameter using Adaptive Rejection Metropolis Sampling within Gibbs sampling (ARMS), and then finding the smallest integer s whose CDF value is at least u, where u is drawn from the Uniform(0,1) distribution. INAR(1) is applied to monthly pneumonia cases in Penjaringan, Jakarta Utara, from January 2008 to April 2016.
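A minimal sketch of the INAR(1) mechanics described above is given below, with illustrative parameters rather than the pneumonia data: binomial-thinning simulation, conditional least squares, and a one-step median forecast taken as the smallest count whose conditional CDF reaches 0.5.

```python
# Minimal sketch of INAR(1): X_t = alpha o X_{t-1} + eps_t with Poisson innovations,
# CLS estimation, and a one-step median forecast.
import numpy as np
from scipy.stats import binom, poisson

rng = np.random.default_rng(6)
alpha, lam, T = 0.6, 2.0, 300
x = np.zeros(T, dtype=int)
for t in range(1, T):
    x[t] = rng.binomial(x[t - 1], alpha) + rng.poisson(lam)   # binomial thinning plus innovation

# Conditional least squares: regress X_t on X_{t-1}
xp, xc = x[:-1], x[1:]
a_hat = np.cov(xp, xc, bias=True)[0, 1] / np.var(xp)
l_hat = xc.mean() - a_hat * xp.mean()

# One-step median forecast given the last observed count:
# conditional distribution = Binomial(x_T, a_hat) convolved with Poisson(l_hat)
support = np.arange(0, x[-1] + 60)
pmf = np.convolve(binom.pmf(np.arange(x[-1] + 1), x[-1], a_hat),
                  poisson.pmf(support, l_hat))[: support.size]
median_forecast = int(np.searchsorted(np.cumsum(pmf), 0.5))
print(f"alpha={a_hat:.2f}, lambda={l_hat:.2f}, median forecast={median_forecast}")
```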
Manterola, Carlos; Torres, Rodrigo; Burgos, Luis; Vial, Manuel; Pineda, Viviana
2006-07-01
Surgery is a curative treatment for gastric cancer (GC). As relapse is frequent, adjuvant therapies such as postoperative chemoradiotherapy have been tried. In Chile, some hospitals adopted Macdonald's study as a protocol for the treatment of GC. The aim was to determine the methodological quality and the internal and external validity of the Macdonald study. Three instruments that assess methodological quality were applied. A critical appraisal was done and the internal and external validity of the methodological quality was analyzed with two scales: MINCIR (Methodology and Research in Surgery), valid for therapy studies, and CONSORT (Consolidated Standards of Reporting Trials), valid for randomized controlled trials (RCT). Guides and scales were applied by 5 researchers with training in clinical epidemiology. The reader's guide verified that the Macdonald study was not directed to answer a clearly defined question. There was random assignment, but the method used is not described and the patients were not followed until the end of the study (36% of the group with surgery plus chemoradiotherapy did not complete treatment). The MINCIR scale confirmed a multicentric RCT, not blinded, with an unclear randomization sequence, erroneous sample size estimation, vague objectives and no exclusion criteria. The CONSORT system proved the lack of a working hypothesis and specific objectives, as well as an absence of exclusion criteria and identification of the primary variable, an imprecise estimation of sample size, ambiguities in the randomization process, no blinding, an absence of statistical adjustment and the omission of a subgroup analysis. The instruments applied demonstrated methodological shortcomings that compromise the internal and external validity of the study.
[Methodological Aspects of the Sampling Design for the 2015 National Mental Health Survey].
Rodríguez, Nelcy; Rodríguez, Viviana Alejandra; Ramírez, Eugenia; Cediel, Sandra; Gil, Fabián; Rondón, Martín Alonso
2016-12-01
The WHO has encouraged the development, implementation and evaluation of policies related to mental health all over the world. In Colombia, within this framework and promoted by the Ministry of Health and Social Protection, as well as being supported by Colciencias, the fourth National Mental Health Survey (NMHST) was conducted using an observational cross-sectional study design. According to this context and following the guidelines and sampling design, a summary of the methodology used for the sampling process is presented. The fourth NMHST used the Homes Master Sample for Studies in Health from the National System of Studies and Population Surveys for Health to calculate its sample. This Master Sample was developed and implemented in 2013 by the Ministry of Social Protection. This study included the non-institutionalised civilian population divided into four age groups: children 7-11 years, adolescents 12-17 years, adults 18-44 years, and 45 years or older. The sample size calculation was based on the prevalences reported in other studies for the outcomes of mental disorders, depression, suicide, associated morbidity, and alcohol use. A probabilistic, cluster, stratified and multistage selection process was used. Expansion factors to the total population were calculated. A total of 15,351 completed surveys were collected, distributed according to the age groups: 2727 for 7-11 years, 1754 for 12-17 years, 5889 for 18-44 years, and 4981 for ≥45 years. All the surveys were distributed in five regions: Atlantic, Oriental, Bogotá, Central and Pacific. A sufficient number of surveys were collected in this study to obtain a more precise approximation of mental problems and disorders at the regional and national level. Copyright © 2016 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.
NASA Technical Reports Server (NTRS)
Vangenderen, J. L. (Principal Investigator); Lock, B. F.
1976-01-01
The author has identified the following significant results. The scope of the preprocessing techniques was restricted to standard material from the EROS Data Center, accompanied by some enlarging procedures and the use of the diazo process. Investigation has shown that the most appropriate sampling strategy for this study is the stratified random technique. A viable sampling procedure, together with a method for determining the minimum number of sample points needed to test the results of any interpretation, is presented.
Nessen, Merel A; van der Zwaan, Dennis J; Grevers, Sander; Dalebout, Hans; Staats, Martijn; Kok, Esther; Palmblad, Magnus
2016-05-11
Proteomics methodology has seen increased application in food authentication, including tandem mass spectrometry of targeted species-specific peptides in raw, processed, or mixed food products. We have previously described an alternative principle that uses untargeted data acquisition and spectral library matching, essentially spectral counting, to compare and identify samples without the need for genomic sequence information in food species populations. Here, we present an interlaboratory comparison demonstrating how a method based on this principle performs in a realistic context. We also increasingly challenge the method by using data from different types of mass spectrometers, by trying to distinguish closely related and commercially important flatfish, and by analyzing heavily contaminated samples. The method was found to be robust in different laboratories, and 94-97% of the analyzed samples were correctly identified, including all processed and contaminated samples.
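The spectral-library-matching principle can be sketched as below. The libraries, bin count, similarity measure and threshold are illustrative assumptions, not the authors' pipeline; the point is that a sample is assigned to the species whose library collects the most matching spectra.

```python
# Minimal sketch of spectral library matching: each spectrum is a binned
# intensity vector, a match is a library spectrum whose cosine similarity
# exceeds a threshold, and matches are counted per species.
import numpy as np

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

def identify(query_spectra, libraries, threshold=0.9):
    """libraries: dict species -> array (n_spectra, n_bins); returns per-species match counts."""
    counts = {}
    for species, lib in libraries.items():
        counts[species] = sum(
            any(cosine(q, ref) > threshold for ref in lib) for q in query_spectra)
    return counts

rng = np.random.default_rng(7)
libs = {"sole": rng.random((50, 200)), "plaice": rng.random((50, 200))}
query = libs["sole"][:10] + rng.normal(0, 0.05, (10, 200))    # noisy spectra from a "sole" sample
print(identify(query, libs))                                  # most matches fall on the correct species
```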
Campos, Maria Doroteia; Valadas, Vera; Campos, Catarina; Morello, Laura; Braglia, Luca; Breviario, Diego; Cardoso, Hélia G
2018-01-01
Traceability of processed food and feed products has been gaining importance due to the impact that those products can have on human/animal health and to the associated economic and legal concerns, often related to adulteration and fraud, as can be the case for meat and milk. Despite mandatory traceability requirements for the analysis of feed composition, few reliable and accurate methods are presently available to enforce the legislative frame and allow the authentication of animal feeds. In this study, nine sensitive and species-specific real-time PCR TaqMan MGB assays are described for plant species detection in animal feed samples. The method is based on selective real-time qPCR (RT-qPCR) amplification of target genes belonging to the alternative oxidase (AOX) gene family. The plant species selected for detection in feed samples were wheat, maize, barley, soybean, rice and sunflower, as common components of feeds, and cotton, flax and peanut, as possible undesirable contaminants. The obtained results were compared with an end-point PCR methodology. The applicability of the AOX TaqMan assays was evaluated through the screening of commercial feed samples and by the analysis of plant mixtures with known composition. The RT-qPCR methodology allowed not only the detection of the most abundant species in feeds but also the identification of contaminant species present in lower amounts, down to 1% w/w. The AOX-based methodology provides a suitable molecular marker approach to ascertain the plant species composition of animal feed samples, thus supporting feed control and enforcement in the feed sector and animal production.
ERIC Educational Resources Information Center
Clayton, Berwyn; Meyers, Dave; Bateman, Andrea; Bluer, Robert
2010-01-01
This document supports the report "Practitioner Expectations and Experiences with the Certificate IV in Training and Assessment (TAA40104)". The first section outlines the methodology used to undertake the research and covers the design of the research, sample details, data collection processes and the strategy for data analysis and…
Measuring fuel moisture content in Alaska: standard methods and procedures.
Rodney A. Norum; Melanie Miller
1984-01-01
Methods and procedures are given for collecting and processing living and dead plant materials for the purpose of determining their water content. Wild-land fuels in Alaska are emphasized, but the methodology is applicable elsewhere. Guides are given for determining the number of samples needed to attain a chosen precision. Detailed procedures are presented for...
Going the Distance: Are There Common Factors in High Performance Distance Learning? Research Report.
ERIC Educational Resources Information Center
Hawksley, Rosemary; Owen, Jane
Common factors among high-performing distance learning (DL) programs were examined through case studies at 9 further education colleges and 2 nonsector organizations in the United Kingdom and a backup survey of a sample of 50 distance learners at 5 of the colleges. The study methodology incorporated numerous principles of process benchmarking. The…
ERIC Educational Resources Information Center
Lindstrom, Lauren; Doren, Bonnie; Miesch, Jennifer
2011-01-01
Youth with disabilities face many barriers in making the transition from high school to stable long-term employment. Researchers used case study methodology to examine the career development process and postschool employment outcomes for a sample of individuals with disabilities who were working in living wage occupations 7 to 10 years after…
The Relationship between Gender and Aspirations to Senior Management
ERIC Educational Resources Information Center
Litzky, Barrie; Greenhaus, Jeffrey
2007-01-01
Purpose: The purpose of this paper is to examine the relationship of gender, work factors, and non-work factors with aspirations to positions in senior management. A process model of senior management aspirations was developed and tested. Design/methodology/approach: Data were collected via an online survey that resulted in a sample of 368 working…
Ventham, Nicholas T; Gardner, Richard A; Kennedy, Nicholas A; Shubhakar, Archana; Kalla, Rahul; Nimmo, Elaine R; Fernandes, Daryl L; Satsangi, Jack; Spencer, Daniel I R
2015-01-01
Serum N-glycans have been identified as putative biomarkers for numerous diseases. The impact of different serum sample tubes and processing methods on N-glycan analysis has received relatively little attention. This study aimed to determine the effect of different sample tubes and processing methods on the whole serum N-glycan profile in both health and disease. A secondary objective was to describe a robot automated N-glycan release, labeling and cleanup process for use in a biomarker discovery system. 25 patients with active and quiescent inflammatory bowel disease and controls had three different serum sample tubes taken at the same draw. Two different processing methods were used for three types of tube (with and without gel-separation medium). Samples were randomised and processed in a blinded fashion. Whole serum N-glycan release, 2-aminobenzamide labeling and cleanup was automated using a Hamilton Microlab STARlet Liquid Handling robot. Samples were analysed using a hydrophilic interaction liquid chromatography/ethylene bridged hybrid(BEH) column on an ultra-high performance liquid chromatography instrument. Data were analysed quantitatively by pairwise correlation and hierarchical clustering using the area under each chromatogram peak. Qualitatively, a blinded assessor attempted to match chromatograms to each individual. There was small intra-individual variation in serum N-glycan profiles from samples collected using different sample processing methods. Intra-individual correlation coefficients were between 0.99 and 1. Unsupervised hierarchical clustering and principal coordinate analyses accurately matched samples from the same individual. Qualitative analysis demonstrated good chromatogram overlay and a blinded assessor was able to accurately match individuals based on chromatogram profile, regardless of disease status. The three different serum sample tubes processed using the described methods cause minimal inter-individual variation in serum whole N-glycan profile when processed using an automated workstream. This has important implications for N-glycan biomarker discovery studies using different serum processing standard operating procedures.
An ultrasonic methodology for muscle cross section measurement of support space flight
NASA Astrophysics Data System (ADS)
Hatfield, Thomas R.; Klaus, David M.; Simske, Steven J.
2004-09-01
The number one priority for any manned space mission is the health and safety of its crew. The study of the short and long term physiological effects on humans is paramount to ensuring crew health and mission success. One of the challenges associated in studying the physiological effects of space flight on humans, such as loss of bone and muscle mass, has been that of readily attaining the data needed to characterize the changes. The small sampling size of astronauts, together with the fact that most physiological data collection tends to be rather tedious, continues to hinder elucidation of the underlying mechanisms responsible for the observed changes that occur in space. Better characterization of the muscle loss experienced by astronauts requires that new technologies be implemented. To this end, we have begun to validate a 360° ultrasonic scanning methodology for muscle measurements and have performed empirical sampling of a limb surrogate for comparison. Ultrasonic wave propagation was simulated using 144 stations of rotated arm and calf MRI images. These simulations were intended to provide a preliminary check of the scanning methodology and data analysis before its implementation with hardware. Pulse-echo waveforms were processed for each rotation station to characterize fat, muscle, bone, and limb boundary interfaces. The percentage error between MRI reference values and calculated muscle areas, as determined from reflection points for calf and arm cross sections, was -2.179% and +2.129%, respectively. These successful simulations suggest that ultrasound pulse scanning can be used to effectively determine limb cross-sectional areas. Cross-sectional images of a limb surrogate were then used to simulate signal measurements at several rotation angles, with ultrasonic pulse-echo sampling performed experimentally at the same stations on the actual limb surrogate to corroborate the results. The objective of the surrogate sampling was to compare the signal output of the simulation tool used as a methodology validation for actual tissue signals. The disturbance patterns of the simulated and sampled waveforms were consistent. Although only discussed as a small part of the work presented, the sampling portion also helped identify important considerations such as tissue compression and transducer positioning for future work involving tissue scanning with this methodology.
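The step from per-angle boundary radii to a cross-sectional area can be sketched with the polygon formula below; the 144 rotation stations echo the simulation described above, while the elliptical boundary is a synthetic stand-in for real echo data.

```python
# Minimal sketch: turn per-angle boundary radii (as would be derived from
# pulse-echo reflection points) into a cross-sectional area.
import numpy as np

def area_from_radii(radii, angles):
    """Shoelace-style area of a closed boundary sampled at (angle, radius) pairs."""
    r_next = np.roll(radii, -1)
    dtheta = np.diff(np.append(angles, angles[0] + 2 * np.pi))
    return 0.5 * np.sum(radii * r_next * np.sin(dtheta))

angles = np.linspace(0, 2 * np.pi, 144, endpoint=False)      # 144 rotation stations
a, b = 4.0, 3.0                                               # ellipse semi-axes (cm), a toy "muscle" boundary
radii = a * b / np.sqrt((b * np.cos(angles)) ** 2 + (a * np.sin(angles)) ** 2)
print("estimated area:", round(area_from_radii(radii, angles), 3), "cm^2")
print("true ellipse area:", round(np.pi * a * b, 3), "cm^2")
```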
Zimmermann, Boris; Kohler, Achim
2014-01-01
Background It is imperative to have reliable and timely methodologies for analysis and monitoring of seed plants in order to determine climate-related plant processes. Moreover, impact of environment on plant fitness is predominantly based on studies of female functions, while the contribution of male gametophytes is mostly ignored due to missing data on pollen quality. We explored the use of infrared spectroscopy of pollen for an inexpensive and rapid characterization of plants. Methodology The study was based on measurement of pollen samples by two Fourier transform infrared techniques: single reflectance attenuated total reflectance and transmission measurement of sample pellets. The experimental set, with a total of 813 samples, included five pollination seasons and 300 different plant species belonging to all principal spermatophyte clades (conifers, monocotyledons, eudicots, and magnoliids). Results The spectroscopic-based methodology enables detection of phylogenetic variations, including the separation of confamiliar and congeneric species. Furthermore, the methodology enables measurement of phenotypic plasticity by the detection of inter-annual variations within the populations. The spectral differences related to environment and taxonomy are interpreted biochemically, specifically variations of pollen lipids, proteins, carbohydrates, and sporopollenins. The study shows large variations of absolute content of nutrients for congeneric species pollinating in the same environmental conditions. Moreover, clear correlation between carbohydrate-to-protein ratio and pollination strategy has been detected. Infrared spectral database with respect to biochemical variation among the range of species, climate and biogeography will significantly improve comprehension of plant-environment interactions, including impact of global climate change on plant communities. PMID:24748390
GEOTHERMAL EFFLUENT SAMPLING WORKSHOP
This report outlines the major recommendations resulting from a workshop to identify gaps in existing geothermal effluent sampling methodologies, define needed research to fill those gaps, and recommend strategies to lead to a standardized sampling methodology.
2dFLenS and KiDS: determining source redshift distributions with cross-correlations
NASA Astrophysics Data System (ADS)
Johnson, Andrew; Blake, Chris; Amon, Alexandra; Erben, Thomas; Glazebrook, Karl; Harnois-Deraps, Joachim; Heymans, Catherine; Hildebrandt, Hendrik; Joudaki, Shahab; Klaes, Dominik; Kuijken, Konrad; Lidman, Chris; Marin, Felipe A.; McFarland, John; Morrison, Christopher B.; Parkinson, David; Poole, Gregory B.; Radovich, Mario; Wolf, Christian
2017-03-01
We develop a statistical estimator to infer the redshift probability distribution of a photometric sample of galaxies from its angular cross-correlation in redshift bins with an overlapping spectroscopic sample. This estimator is a minimum-variance weighted quadratic function of the data: a quadratic estimator. This extends and modifies the methodology presented by McQuinn & White. The derived source redshift distribution is degenerate with the source galaxy bias, which must be constrained via additional assumptions. We apply this estimator to constrain source galaxy redshift distributions in the Kilo-Degree imaging survey through cross-correlation with the spectroscopic 2-degree Field Lensing Survey, presenting results first as a binned step-wise distribution in the range z < 0.8, and then building a continuous distribution using a Gaussian process model. We demonstrate the robustness of our methodology using mock catalogues constructed from N-body simulations, and comparisons with other techniques for inferring the redshift distribution.
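As a rough schematic of the logic underlying this class of cross-correlation (clustering-redshift) estimators, and not the specific minimum-variance quadratic estimator derived in the paper, the angular cross-correlation between the photometric sample p and a spectroscopic redshift bin i scales with the fraction of photometric galaxies lying in that bin:

```latex
w_{ps,i}(\theta) \;\approx\; f_{p,i}\, b_{p,i}\, b_{s,i}\, w_{\mathrm{DM}}(\theta, z_i),
```

where f_{p,i} is the fraction of the photometric sample in bin i, b_{p,i} and b_{s,i} are the galaxy bias factors of the two samples in that bin, and w_DM is the dark-matter angular correlation. Inverting this relation for f_{p,i} is what makes the recovered redshift distribution degenerate with the source galaxy bias, as noted in the abstract.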
Saturno-Hernández, Pedro J; Gutiérrez-Reyes, Juan Pablo; Vieyra-Romero, Waldo Ivan; Romero-Martínez, Martín; O'Shea-Cuevas, Gabriel Jaime; Lozano-Herrera, Javier; Tavera-Martínez, Sonia; Hernández-Ávila, Mauricio
2016-01-01
To describe the conceptual framework and methods for implementation and analysis of the satisfaction survey of the Mexican System for Social Protection in Health. We analyze the methodological elements of the 2013, 2014 and 2015 surveys, including the instrument, sampling method and study design, conceptual framework, and characteristics and indicators of the analysis. The survey captures information on perceived quality and satisfaction. The sampling has national and state-level representation. Simple and composite indicators (index of satisfaction and rate of reported quality problems) are built and described. The analysis is completed using Pareto diagrams, correlation between indicators, and association with satisfaction by means of multivariate models. The measurement of satisfaction and perceived quality is a complex but necessary process to comply with regulations and to identify strategies for improvement. The described survey presents a rigorous design and analysis focused on its utility for driving improvement.
Accuracy or precision: Implications of sample design and methodology on abundance estimation
Kowalewski, Lucas K.; Chizinski, Christopher J.; Powell, Larkin A.; Pope, Kevin L.; Pegg, Mark A.
2015-01-01
Sampling by spatially replicated counts (point-count) is an increasingly popular method of estimating population size of organisms. Challenges exist when sampling by the point-count method: it is often impractical to sample the entire area of interest and impossible to detect every individual present. Ecologists encounter logistical limitations that force them to sample either a few large sample units or many small sample units, introducing biases to sample counts. We generated a computer environment and simulated sampling scenarios to test the role of number of samples, sample unit area, number of organisms, and distribution of organisms in the estimation of population sizes using N-mixture models. Many sample units of small area provided estimates that were consistently closer to true abundance than sample scenarios with few sample units of large area. However, sample scenarios with few sample units of large area provided more precise abundance estimates than abundance estimates derived from sample scenarios with many sample units of small area. It is important to consider accuracy and precision of abundance estimates during the sample design process, with study goals and objectives fully recognized; too often, and with consequence, consideration of accuracy and precision of abundance estimates is an afterthought that occurs during the data analysis process.
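The N-mixture framework referred to above models latent abundance at each sample unit as Poisson and the replicated counts as binomial thinning of that abundance. A schematic of the standard likelihood, under the usual closure and constant-detection assumptions (not the specific parameterization used in this study), is:

```latex
N_i \sim \mathrm{Poisson}(\lambda), \qquad y_{it} \mid N_i \sim \mathrm{Binomial}(N_i, p),
\qquad
L(\lambda, p \mid \{y_{it}\}) \;=\; \prod_{i=1}^{R} \; \sum_{N_i=\max_{t} y_{it}}^{\infty}
\left[ \prod_{t=1}^{T} \binom{N_i}{y_{it}} p^{y_{it}} (1-p)^{N_i-y_{it}} \right]
\frac{\lambda^{N_i} e^{-\lambda}}{N_i!},
```

where R is the number of sample units, T the number of replicate counts per unit, and abundance estimates follow from the fitted λ.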
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thompson, B.G.J.; Grindrod, P.
Her Majesty's Inspectorate of Pollution (HMIP) of the United Kingdom has developed a procedure for the post-closure assessment of the underground disposal of radioactive waste. In this paper the method of using theory and ideas from the mathematical sciences for assessment is described. The system simulation methodology seeks to discover key combinations of processes or effects which may yield behaviour of interest by sampling across functional and parametric uncertainties, and treating the systems within a probabilistic framework. This paper also discusses how the HMIP assessment methodology has been presented, independent of any current application, for review by leading scientists who are independent of the performance assessment field.
Evaluation of ridesharing programs in Michigan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kulp, G.; Tsao, H.J.; Webber, R.E.
1982-10-01
The design, implementation, and results of a carpool and vanpool evaluation are described. Objectives of the evaluation were: to develop credible estimates of the energy savings attributable to the ridesharing program, to provide information for improving the performance of the ridesharing program, and to add to a general understanding of the ridesharing process. Previous evaluation work is critiqued and the research methodology adopted for this study is discussed. The ridesharing program in Michigan is described and the basis for selecting Michigan as the evaluation site is discussed. The evaluation methodology is presented, including research design, sampling procedure, data collection, and data validation. Evaluation results are analyzed.
Protein Aggregation Measurement through Electrical Impedance Spectroscopy
NASA Astrophysics Data System (ADS)
Affanni, A.; Corazza, A.; Esposito, G.; Fogolari, F.; Polano, M.
2013-09-01
The paper presents a novel methodology to measure fibril formation in protein solutions. We designed a bench consisting of a sensor with interdigitated electrodes, a hermetic PDMS reservoir and an impedance meter automatically driven by a computer. The impedance data are fitted with a lumped-element model, and their change over time can provide information on the aggregation process. Encouraging results have been obtained by testing the methodology on K-casein, a milk protein, with and without the addition of a drug inhibiting aggregation. The amount of sample needed to perform this measurement is far lower than the amount needed for fluorescence analysis.
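A minimal sketch of how impedance spectra can be fitted to a simple lumped-element model follows; a series resistance with a parallel RC branch is assumed here purely for illustration, since the abstract does not specify the circuit used, and all numbers are placeholders.

```python
import numpy as np
from scipy.optimize import curve_fit

def z_model(omega, rs, rp, cp):
    """Series resistance Rs plus a parallel Rp-Cp branch (illustrative model only)."""
    return rs + rp / (1.0 + 1j * omega * rp * cp)

def z_stacked(omega, rs, rp, cp):
    """curve_fit needs real-valued output, so stack real and imaginary parts."""
    z = z_model(omega, rs, rp, cp)
    return np.concatenate([z.real, z.imag])

# Synthetic spectrum standing in for measured impedance data
omega = 2 * np.pi * np.logspace(1, 5, 50)        # angular frequencies (rad/s)
z_true = z_model(omega, 150.0, 2.0e4, 5.0e-9)    # "measured" spectrum
y_obs = np.concatenate([z_true.real, z_true.imag])

popt, _ = curve_fit(z_stacked, omega, y_obs, p0=[100.0, 1.0e4, 1.0e-9])
rs_fit, rp_fit, cp_fit = popt
print(f"Rs={rs_fit:.1f} ohm, Rp={rp_fit:.3g} ohm, Cp={cp_fit:.3g} F")
```

Tracking fitted parameters such as Rp and Cp over time would then mirror the paper's idea of following aggregation through changes in the lumped-element values.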
Flexible sampling large-scale social networks by self-adjustable random walk
NASA Astrophysics Data System (ADS)
Xu, Xiao-Ke; Zhu, Jonathan J. H.
2016-12-01
Online social networks (OSNs) have become an increasingly attractive gold mine for academic and commercial researchers. However, research on OSNs faces a number of difficult challenges. One bottleneck lies in the massive quantity and often unavailability of OSN population data. Sampling is perhaps the only feasible solution to these problems. How to draw samples that can represent the underlying OSNs has remained a formidable task for a number of conceptual and methodological reasons. In particular, most empirically driven studies on network sampling are confined to simulated data or sub-graph data, which are fundamentally different from real and complete-graph OSNs. In the current study, we propose a flexible sampling method, called Self-Adjustable Random Walk (SARW), and test it against the population data of a real large-scale OSN. We evaluate the strengths of the sampling method in comparison with four prevailing methods, including uniform, breadth-first search (BFS), random walk (RW), and revised RW (i.e., MHRW) sampling. We mix both induced-edge and external-edge information of sampled nodes together in the same sampling process. Our results show that the SARW sampling method is able to generate unbiased samples of OSNs with maximal precision and minimal cost. The study is helpful for the practice of OSN research by providing a highly needed sampling tool, for the methodological development of large-scale network sampling through comparative evaluation of existing sampling methods, and for the theoretical understanding of human networks by highlighting discrepancies and contradictions between existing knowledge/assumptions and large-scale real OSN data.
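For readers unfamiliar with the baseline the authors compare against, a plain random-walk node sampler looks like the sketch below; the self-adjustable re-weighting that defines SARW is not specified in the abstract and is not implemented here.

```python
import random
import networkx as nx

def random_walk_sample(graph, n_nodes, seed_node=None, rng=random):
    """Collect a sample of nodes by a simple random walk (unique nodes, visit order kept)."""
    current = seed_node if seed_node is not None else rng.choice(list(graph.nodes))
    visited = [current]
    while len(set(visited)) < n_nodes:
        neighbors = list(graph.neighbors(current))
        if not neighbors:                          # dead end: restart from a random node
            current = rng.choice(list(graph.nodes))
        else:
            current = rng.choice(neighbors)
        visited.append(current)
    return list(dict.fromkeys(visited))[:n_nodes]

# Toy usage on a scale-free graph standing in for an OSN
G = nx.barabasi_albert_graph(10_000, 3, seed=1)
sample = random_walk_sample(G, 500)
print(len(sample), "nodes sampled")
```

Note that a plain random walk oversamples high-degree nodes, which is exactly the bias that re-weighted or adjusted walks such as MHRW and SARW aim to correct.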
A Validated Methodology for Genetic Identification of Tuna Species (Genus Thunnus)
Viñas, Jordi; Tudela, Sergi
2009-01-01
Background Tuna species of the genus Thunnus, such as the bluefin tunas, are some of the most important and yet most endangered trade fish in the world. Identification of these species in traded forms, however, may be difficult depending on the presentation of the products, which may hamper conservation efforts on trade control. In this paper, we validated a genetic methodology that can fully distinguish between the eight Thunnus species from any kind of processed tissue. Methodology After testing several genetic markers, a complete discrimination of the eight tuna species was achieved using Forensically Informative Nucleotide Sequencing based primarily on the sequence variability of the hypervariable genetic marker mitochondrial DNA control region (mtDNA CR), followed, in some specific cases, by a second validation by a nuclear marker rDNA first internal transcribed spacer (ITS1). This methodology was able to distinguish all tuna species, including those belonging to the subgenus Neothunnus that are very closely related, and in consequence can not be differentiated with other genetic markers of lower variability. This methodology also took into consideration the presence of introgression that has been reported in past studies between T. thynnus, T. orientalis and T. alalunga. Finally, we applied the methodology to cross-check the species identity of 26 processed tuna samples. Conclusions Using the combination of two genetic markers, one mitochondrial and another nuclear, allows a full discrimination between all eight tuna species. Unexpectedly, the genetic marker traditionally used for DNA barcoding, cytochrome oxidase 1, could not differentiate all species, thus its use as a genetic marker for tuna species identification is questioned. PMID:19898615
Qualitative case study methodology in nursing research: an integrative review.
Anthony, Susan; Jack, Susan
2009-06-01
This paper is a report of an integrative review conducted to critically analyse the contemporary use of qualitative case study methodology in nursing research. Increasing complexity in health care and increasing use of case study in nursing research support the need for current examination of this methodology. In 2007, a search for case study research (published 2005-2007) indexed in the CINAHL, MEDLINE, EMBASE, PsychINFO, Sociological Abstracts and SCOPUS databases was conducted. A sample of 42 case study research papers met the inclusion criteria. Whittemore and Knafl's integrative review method guided the analysis. Confusion exists about the name, nature and use of case study. This methodology, including terminology and concepts, is often invisible in qualitative study titles and abstracts. Case study is an exclusive methodology and an adjunct to exploring particular aspects of phenomena under investigation in larger or mixed-methods studies. A high quality of case study exists in nursing research. Judicious selection and diligent application of literature review methods promote the development of nursing science. Case study is becoming entrenched in the nursing research lexicon as a well-accepted methodology for studying phenomena in health and social care, and its growing use warrants continued appraisal to promote nursing knowledge development. Attention to all case study elements, process and publication is important in promoting authenticity, methodological quality and visibility.
Measuring e-Commerce service quality from online customer review using sentiment analysis
NASA Astrophysics Data System (ADS)
Kencana Sari, Puspita; Alamsyah, Andry; Wibowo, Sulistyo
2018-03-01
The biggest challenge for an e-Commerce firm in understanding its market is to chart its level of service quality according to customer perception. Collecting user perceptions from online reviews is considered a faster methodology than conducting direct sampling. To understand the service quality level, sentiment analysis is used to classify the reviews into positive and negative sentiment for the five dimensions of electronic service quality (e-Servqual). As a case study in this research, we use Tokopedia, one of the biggest e-Commerce services in Indonesia. We obtained online review comments about Tokopedia's service quality over several months of observation. The Naïve Bayes classification methodology is applied because of its high accuracy and its support for large-scale data processing. The results reveal that the personalization and reliability dimensions require more attention because they carry high negative sentiment. Meanwhile, the trust and web design dimensions show high positive sentiment, indicating very good service. The responsiveness dimension shows a balance of positive and negative sentiment.
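A minimal sketch of the kind of Naïve Bayes sentiment classifier described above follows; it is illustrative only, and the actual features, preprocessing, and Indonesian-language review corpus used for Tokopedia are not reproduced.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Tiny hypothetical labelled reviews (1 = positive, 0 = negative sentiment)
reviews = [
    "delivery was fast and the seller was reliable",
    "website is easy to use and looks trustworthy",
    "my order never arrived and support did not respond",
    "payment failed twice, very disappointing service",
]
labels = [1, 1, 0, 0]

# Bag-of-words features feeding a multinomial Naive Bayes classifier
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(reviews, labels)

new_reviews = ["refund process was slow and frustrating"]
print(model.predict(new_reviews))   # predicted sentiment label(s)
```

In the study's framing, each classified review would additionally be assigned to one of the five e-Servqual dimensions so that positive and negative counts can be tallied per dimension.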
Auditing of chromatographic data.
Mabie, J T
1998-01-01
During a data audit, it is important to ensure that there is clear documentation and an audit trail. The Quality Assurance Unit should review all areas, including the laboratory, during the conduct of the sample analyses. The analytical methodology that is developed should be documented prior to sample analyses. This is an important document for the auditor, as it is the instrumental piece used by the laboratory personnel to maintain integrity throughout the process. It is expected that this document will give insight into the sample analysis, run controls, run sequencing, instrument parameters, and acceptance criteria for the samples. The sample analysis and all supporting documentation should be audited in conjunction with this written analytical method and any supporting Standard Operating Procedures to ensure the quality and integrity of the data.
Rat sperm motility analysis: methodologic considerations
The objective of these studies was to optimize conditions for computer-assisted sperm analysis (CASA) of rat epididymal spermatozoa. Methodologic issues addressed include sample collection technique, sampling region within the epididymis, type of diluent medium used, and sample c...
Autoverification process improvement by Six Sigma approach: Clinical chemistry & immunoassay.
Randell, Edward W; Short, Garry; Lee, Natasha; Beresford, Allison; Spencer, Margaret; Kennell, Marina; Moores, Zoë; Parry, David
2018-05-01
This study examines the effectiveness of a project to enhance an autoverification (AV) system through application of Six Sigma (DMAIC) process improvement strategies. Similar AV systems set up at three sites underwent examination and modification to produce improved systems while monitoring the proportions of samples autoverified, the time required for manual review and verification, sample processing time, and the characteristics of tests not autoverified. This information was used to identify areas for improvement and monitor the impact of changes. Use of reference-range-based criteria had the greatest impact on the proportion of tests autoverified. To improve the AV process, reference-range-based criteria were replaced with extreme value limits based on a 99.5% test result interval, delta check criteria were broadened, and new specimen consistency rules were implemented. Decision guidance tools were also developed to assist staff using the AV system. The mean proportion of tests and samples autoverified improved from <62% for samples and <80% for tests, to >90% for samples and >95% for tests across all three sites. The new AV system significantly decreased turn-around time and total sample review time (to about a third); however, time spent on manual review of held samples almost tripled. There was no evidence of compromise to the quality of the testing process, and <1% of samples held for exceeding delta check or extreme limits required corrective action. The Six Sigma (DMAIC) process improvement methodology was successfully applied to AV systems, resulting in an increase in overall test and sample AV to >90%, improved turn-around time, reduced time for manual verification, and no obvious compromise to quality or error detection. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
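A schematic of the rule logic described above (extreme-value limits plus a delta check) might look like the following sketch; the specific limits, analytes, and middleware used at the study sites are not given in the abstract, so all numbers are placeholders.

```python
def autoverify(result, previous_result, extreme_low, extreme_high, delta_limit):
    """Return True if a result can be released without manual review.

    extreme_low/extreme_high: limits derived from, e.g., a 99.5% result interval.
    delta_limit: maximum allowed change from the patient's previous result.
    """
    if result < extreme_low or result > extreme_high:
        return False                       # hold: outside extreme-value limits
    if previous_result is not None and abs(result - previous_result) > delta_limit:
        return False                       # hold: delta check failed
    return True                            # release automatically

# Placeholder example for a hypothetical analyte (units arbitrary)
print(autoverify(result=5.2, previous_result=4.9,
                 extreme_low=1.0, extreme_high=9.0, delta_limit=2.0))   # True
print(autoverify(result=12.4, previous_result=4.9,
                 extreme_low=1.0, extreme_high=9.0, delta_limit=2.0))   # False
```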
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-27
... sampling method. Section 771(4)(A) of the Act defines the ``industry'' as the producers as a whole of a... the PRC. At this time, given the unique nature of the alleged subsidy and the complex methodological... process, such as aluminum products produced by a method of casting. Cast aluminum products are properly...
ERIC Educational Resources Information Center
Leisey, Sandra A.; Guinn, Nancy
At the request of the Air Force School of Aviation Medicine, a project was initiated to evaluate the current screening process used for entry into three medical technical training courses: Aeromedical Specialist, Environmental Health Specialist, and Physiological Training Specialist. A sample of 1,003 students was administered the General…
ERIC Educational Resources Information Center
Spaniol, Julia; Davidson, Patrick S. R.; Kim, Alice S. N.; Han, Hua; Moscovitch, Morris; Grady, Cheryl L.
2009-01-01
The recent surge in event-related fMRI studies of episodic memory has generated a wealth of information about the neural correlates of encoding and retrieval processes. However, interpretation of individual studies is hampered by methodological differences, and by the fact that sample sizes are typically small. We submitted results from studies of…
ERIC Educational Resources Information Center
Fusilier, Marcelline; Durlabhji, Subhash
2005-01-01
Purpose: The purpose of this paper is to explore behavioral processes involved in internet technology acceptance and use with a sample in India, a developing country that can potentially benefit from greater participation in the web economy. Design/methodology/approach - User experience was incorporated into the technology acceptance model (TAM)…
Quarterly Update: July-September 1990
1990-09-01
drafted a report that defines the process and the products of a feature-oriented domain analysis (FODA) and provides a sample domain analysis. This ... receive ... training to offer assessment services commercially. This section provides ... NASA adopted the rate monotonic scheduling analysis ... that the rate monotonic scheduling analysis approach will be the July-September 1990 baseline methodology for its hard real-time operating system
Inferring Molecular Processes Heterogeneity from Transcriptional Data.
Gogolewski, Krzysztof; Wronowska, Weronika; Lech, Agnieszka; Lesyng, Bogdan; Gambin, Anna
2017-01-01
RNA microarrays and RNA-seq are nowadays standard technologies to study the transcriptional activity of cells. Most studies focus on tracking transcriptional changes caused by specific experimental conditions. Information on gene up- and downregulation is evaluated by analyzing the behaviour of a relatively large population of cells and averaging its properties. However, even assuming perfect sample homogeneity, different subpopulations of cells can exhibit diverse transcriptomic profiles, as they may follow different regulatory/signaling pathways. The purpose of this study is to provide a novel methodological scheme to account for possible internal, functional heterogeneity in homogeneous cell lines, including cancer ones. We propose a novel computational method to infer the proportion between subpopulations of cells that manifest various functional behaviour in a given sample. Our method was validated using two datasets from RNA microarray experiments. Both experiments aimed to examine cell viability in specific experimental conditions. The presented methodology can be easily extended to RNA-seq data as well as other molecular processes. Moreover, it complements standard tools to indicate the most important networks from transcriptomic data and in particular could be useful in the analysis of cancer cell lines affected by biologically active compounds or drugs.
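The paper's own inference algorithm is not reproduced here; as a generic illustration of the underlying idea of estimating subpopulation proportions from a bulk expression profile, a non-negative least-squares deconvolution against assumed reference profiles can be sketched as follows (all data are synthetic placeholders).

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical reference expression profiles (genes x subpopulations)
# and a bulk sample that mixes them in unknown proportions.
rng = np.random.default_rng(0)
profiles = rng.gamma(shape=2.0, scale=1.0, size=(200, 3))   # 200 genes, 3 subpopulations
true_props = np.array([0.6, 0.3, 0.1])
bulk = profiles @ true_props + rng.normal(0, 0.05, size=200)

weights, _ = nnls(profiles, bulk)          # non-negative mixing weights
proportions = weights / weights.sum()      # normalise to proportions
print(np.round(proportions, 2))            # approximately [0.6, 0.3, 0.1]
```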
Revisiting the PLUMBER Experiments from a Process-Diagnostics Perspective
NASA Astrophysics Data System (ADS)
Nearing, G. S.; Ruddell, B. L.; Clark, M. P.; Nijssen, B.; Peters-Lidard, C. D.
2017-12-01
The PLUMBER benchmarking experiments [1] showed that some of the most sophisticated land models (CABLE, CH-TESSEL, COLA-SSiB, ISBA-SURFEX, JULES, Mosaic, Noah, ORCHIDEE) were outperformed - in simulations of half-hourly surface energy fluxes - by instantaneous, out-of-sample, and globally-stationary regressions with no state memory. One criticism of PLUMBER is that the benchmarking methodology was not derived formally, so that applying a similar methodology with different performance metrics can result in qualitatively different results. Another common criticism of model intercomparison projects in general is that they offer little insight into process-level deficiencies in the models, and therefore are of marginal value for helping to improve the models. We address both of these issues by proposing a formal benchmarking methodology that also yields a formal and quantitative method for process-level diagnostics. We apply this to the PLUMBER experiments to show that (1) the PLUMBER conclusions were generally correct - the models use only a fraction of the information available to them from met forcing data (<50% by our analysis), and (2) all of the land models investigated by PLUMBER have similar process-level error structures, and therefore together do not represent a meaningful sample of structural or epistemic uncertainty. We conclude by suggesting two ways to improve the experimental design of model intercomparison and/or model benchmarking studies like PLUMBER. First, PLUMBER did not report model parameter values, and it is necessary to know these values to separate parameter uncertainty from structural uncertainty. This is a first order requirement if we want to use intercomparison studies to provide feedback to model development. Second, technical documentation of land models is inadequate. Future model intercomparison projects should begin with a collaborative effort by model developers to document specific differences between model structures. This could be done in a reproducible way using a unified, process-flexible system like SUMMA [2]. [1] Best, M.J. et al. (2015) 'The plumbing of land surface models: benchmarking model performance', J. Hydrometeor. [2] Clark, M.P. et al. (2015) 'A unified approach for process-based hydrologic modeling: 1. Modeling concept', Water Resour. Res.
77 FR 15092 - U.S. Energy Information Administration; Proposed Agency Information Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-14
... conducted under this clearance will generally be methodological studies of 500 cases or less. The samples... conducted under this clearance will generally be methodological studies of 500 cases or less, but will... the methodological design, sampling procedures (where possible) and questionnaires of the full scale...
Updated methodology for nuclear magnetic resonance characterization of shales
NASA Astrophysics Data System (ADS)
Washburn, Kathryn E.; Birdwell, Justin E.
2013-08-01
Unconventional petroleum resources, particularly in shales, are expected to play an increasingly important role in the world's energy portfolio in the coming years. Nuclear magnetic resonance (NMR), particularly at low-field, provides important information in the evaluation of shale resources. Most of the low-field NMR analyses performed on shale samples rely heavily on standard T1 and T2 measurements. We present a new approach using solid echoes in the measurement of T1 and T1-T2 correlations that addresses some of the challenges encountered when making NMR measurements on shale samples compared to conventional reservoir rocks. Combining these techniques with standard T1 and T2 measurements provides a more complete assessment of the hydrogen-bearing constituents (e.g., bitumen, kerogen, clay-bound water) in shale samples. These methods are applied to immature and pyrolyzed oil shale samples to examine the solid and highly viscous organic phases present during the petroleum generation process. The solid echo measurements produce additional signal in the oil shale samples compared to the standard methodologies, indicating the presence of components undergoing homonuclear dipolar coupling. The results presented here include the first low-field NMR measurements performed on kerogen as well as detailed NMR analysis of highly viscous thermally generated bitumen present in pyrolyzed oil shale.
Plot-scale field experiment of surface hydrologic processes with EOS implications
NASA Technical Reports Server (NTRS)
Laymon, Charles A.; Macari, Emir J.; Costes, Nicholas C.
1992-01-01
Plot-scale hydrologic field studies were initiated at NASA Marshall Space Flight Center to a) investigate the spatial and temporal variability of surface and subsurface hydrologic processes, particularly as affected by vegetation, and b) develop experimental techniques and associated instrumentation methodology to study hydrologic processes at increasingly large spatial scales. About 150 instruments, most of which are remotely operated, have been installed at the field site to monitor ground atmospheric conditions, precipitation, interception, soil-water status, and energy flux. This paper describes the nature of the field experiment, instrumentation and sampling rationale, and presents preliminary findings.
Estarellas Martin, Carolina; Seira Castan, Constantí; Luque Garriga, F Javier; Bidon-Chanal Badia, Axel
2015-10-01
Residue conformational changes and internal cavity migration processes play a key role in regulating the kinetics of ligand migration and binding events in globins. Molecular dynamics simulations have demonstrated their value in the study of these processes in different haemoglobins, but derivation of kinetic data demands the use of more complex techniques like enhanced sampling molecular dynamics methods. This review discusses the different methodologies that are currently applied to study the ligand migration process in globins and highlight those specially developed to derive kinetic data. Copyright © 2015 Elsevier Ltd. All rights reserved.
A methodology for the semi-automatic digital image analysis of fragmental impactites
NASA Astrophysics Data System (ADS)
Chanou, A.; Osinski, G. R.; Grieve, R. A. F.
2014-04-01
A semi-automated digital image analysis method is developed for the comparative textural study of impact melt-bearing breccias. This method uses the freeware software ImageJ developed by the National Institute of Health (NIH). Digital image analysis is performed on scans of hand samples (10-15 cm across), based on macroscopic interpretations of the rock components. All image processing and segmentation are done semi-automatically, with the least possible manual intervention. The areal fraction of components is estimated and modal abundances can be deduced, where the physical optical properties (e.g., contrast, color) of the samples allow it. Other parameters that can be measured include, for example, clast size, clast-preferred orientations, average box-counting dimension or fragment shape complexity, and nearest neighbor distances (NnD). This semi-automated method allows the analysis of a larger number of samples in a relatively short time. Textures, granulometry, and shape descriptors are of considerable importance in rock characterization. The methodology is used to determine the variations of the physical characteristics of some examples of fragmental impactites.
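The workflow described uses ImageJ; a rough Python analogue of the core step (segmenting a scanned hand-sample image and estimating areal fractions and clast sizes), assuming a simple intensity threshold separates bright clasts from matrix, could look like:

```python
import numpy as np
from skimage import filters, measure

def clast_statistics(gray_image):
    """Segment bright clasts by Otsu threshold and report areal fraction and clast areas."""
    threshold = filters.threshold_otsu(gray_image)
    mask = gray_image > threshold                 # assumed: clasts brighter than matrix
    labels = measure.label(mask)
    regions = measure.regionprops(labels)
    areal_fraction = mask.sum() / mask.size
    clast_areas = [r.area for r in regions]
    return areal_fraction, clast_areas

# Synthetic image standing in for a hand-sample scan
rng = np.random.default_rng(1)
img = rng.normal(0.3, 0.05, size=(512, 512))
img[100:180, 100:220] += 0.5                      # one bright "clast"
frac, areas = clast_statistics(img)
print(f"areal fraction: {frac:.3f}, clasts found: {len(areas)}")
```

Shape descriptors such as clast orientation or box-counting dimension would follow from the same labelled regions, but are not sketched here.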
Measurement of testosterone in human sexuality research: methodological considerations.
van Anders, Sari M; Goldey, Katherine L; Bell, Sarah N
2014-02-01
Testosterone (T) and other androgens are incorporated into an increasingly wide array of human sexuality research, but there are a number of issues that can affect or confound research outcomes. This review addresses various methodological issues relevant to research design in human studies with T; unaddressed, these issues may introduce unwanted noise, error, or conceptual barriers to interpreting results. Topics covered are (1) social and demographic factors (gender and sex; sexual orientations and sexual diversity; social/familial connections and processes; social location variables), (2) biological rhythms (diurnal variation; seasonality; menstrual cycles; aging and menopause), (3) sample collection, handling, and storage (saliva vs. blood; sialogogues, saliva, and tubes; sampling frequency, timing, and context; shipping samples), (4) health, medical issues, and the body (hormonal contraceptives; medications and nicotine; health conditions and stress; body composition, weight, and exercise), and (5) incorporating multiple hormones. Detailing a comprehensive set of important issues and relevant empirical evidence, this review provides a starting point for best practices in human sexuality research with T and other androgens that may be especially useful for those new to hormone research.
Pérez-Palacios, T; Petisca, C; Melo, A; Ferreira, I M P L V O
2012-12-01
The validation of a method for the simultaneous quantification of furanic compounds in coated deep-fried samples processed and handled as usually consumed is presented. The deep-fried food was ground using a device that simulates mastication, and immediately analysed by headspace solid phase microextraction coupled to gas chromatography-mass spectrometry. Parameters affecting the efficiency of the HS-SPME procedure were selected by response surface methodology, using a 2³ full-factorial central composite design. Optimal conditions were achieved using 2 g of sample, 3 g of NaCl and 40 min of absorption time at 37 °C. Consistency between predicted and experimental values was observed and quality parameters of the method were established. As a result, furan, 2-furfural, furfuryl alcohol and 2-pentylfuran were, for the first time, simultaneously detected and quantified (5.59, 0.27, 10.48 and 1.77 μg g⁻¹ sample, respectively) in coated deep-fried fish, contributing to a better understanding of the amounts of these compounds in food. Copyright © 2012 Elsevier Ltd. All rights reserved.
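As an illustration of the response-surface step, the sketch below builds a face-centred central composite design in coded units for the three factors named in the abstract (sample mass, NaCl amount, absorption time) and fits a second-order model; the design levels and responses are hypothetical placeholders, not values from the paper.

```python
import itertools
import numpy as np

# Face-centred central composite design in coded units for 3 factors
factorial = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)
axial = np.array([[a if i == j else 0 for j in range(3)]
                  for i in range(3) for a in (-1, 1)], dtype=float)
center = np.zeros((3, 3))
X = np.vstack([factorial, axial, center])          # 8 + 6 + 3 = 17 runs

# Hypothetical responses (e.g., total peak area) at each design point
y = np.array([5.1, 5.8, 6.0, 6.9, 5.5, 6.2, 6.4, 7.3,
              5.9, 6.8, 6.1, 6.6, 6.0, 6.7,
              6.5, 6.4, 6.6])

def design_matrix(x):
    """Second-order model: intercept, linear, interaction, and quadratic terms."""
    a, b, c = x[:, 0], x[:, 1], x[:, 2]
    return np.column_stack([np.ones(len(x)), a, b, c, a*b, a*c, b*c, a*a, b*b, c*c])

coefs, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)
print(np.round(coefs, 3))   # fitted response-surface coefficients
```

The optimum extraction conditions would then be read off the fitted quadratic surface, which is the role the response surface methodology plays in the study.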
Juck, D F; Whissell, G; Steven, B; Pollard, W; McKay, C P; Greer, C W; Whyte, L G
2005-02-01
Fluorescent microspheres were applied in a novel fashion during subsurface drilling of permafrost and ground ice in the Canadian High Arctic to monitor the exogenous microbiological contamination of core samples obtained during the drilling process. Prior to each drill run, a concentrated fluorescent microsphere (0.5-microm diameter) solution was applied to the interior surfaces of the drill bit, core catcher, and core tube and allowed to dry. Macroscopic examination in the field demonstrated reliable transfer of the microspheres to core samples, while detailed microscopic examination revealed penetration levels of less than 1 cm from the core exterior. To monitor for microbial contamination during downstream processing of the permafrost and ground ice cores, a Pseudomonas strain expressing the green fluorescent protein (GFP) was painted on the core exterior prior to processing. Contamination of the processed core interiors with the GFP-expressing strain was not detected by culturing the samples or by PCR to detect the gfp marker gene. These methodologies were quick, were easy to apply, and should help to monitor the exogenous microbiological contamination of pristine permafrost and ground ice samples for downstream culture-dependent and culture-independent microbial analyses.
Juck, D. F.; Whissell, G.; Steven, B.; Pollard, W.; McKay, C. P.; Greer, C. W.; Whyte, L. G.
2005-01-01
Fluorescent microspheres were applied in a novel fashion during subsurface drilling of permafrost and ground ice in the Canadian High Arctic to monitor the exogenous microbiological contamination of core samples obtained during the drilling process. Prior to each drill run, a concentrated fluorescent microsphere (0.5-μm diameter) solution was applied to the interior surfaces of the drill bit, core catcher, and core tube and allowed to dry. Macroscopic examination in the field demonstrated reliable transfer of the microspheres to core samples, while detailed microscopic examination revealed penetration levels of less than 1 cm from the core exterior. To monitor for microbial contamination during downstream processing of the permafrost and ground ice cores, a Pseudomonas strain expressing the green fluorescent protein (GFP) was painted on the core exterior prior to processing. Contamination of the processed core interiors with the GFP-expressing strain was not detected by culturing the samples or by PCR to detect the gfp marker gene. These methodologies were quick, were easy to apply, and should help to monitor the exogenous microbiological contamination of pristine permafrost and ground ice samples for downstream culture-dependent and culture-independent microbial analyses. PMID:15691963
Sediment sampling and processing methods in Hungary, and possible improvements
NASA Astrophysics Data System (ADS)
Tamas, Eniko Anna; Koch, Daniel; Varga, Gyorgy
2016-04-01
The importance of the monitoring of sediment processes is unquestionable: sediment balance of regulated rivers suffered substantial alterations in the past century, affecting navigation, energy production, fish habitats and floodplain ecosystems alike; infiltration times to our drinking water wells have shortened, exposing them to an eventual pollution event and making them vulnerable; and sediment-attached contaminants accumulate in floodplains and reservoirs, threatening our healthy environment. The changes in flood characteristics and rating curves of our rivers are regularly being researched and described, involving state-of-the-art measurement methods, modeling tools and traditional statistics. Sediment processes however, are much less known. Unlike the investigation of flow processes, sediment-related research is scarce, which is partly due to the outdated methodology and poor database background in the specific field. Sediment-related data, information and analyses form an important and integral part of Civil engineering in relation to rivers all over the world. In relation to the second largest river of Europe, the Danube, it is widely known in expert community and for long discussed at different expert forums that the sediment balance of the river Danube has changed drastically over the past century. Sediment monitoring on the river Danube started as early as the end of the 19th century, with scattered measurements carried out. Regular sediment sampling was developed in the first half of the 20th century all along the river, with different station density and monitoring frequencies in different countries. After the first few decades of regular sampling, the concept of (mainly industrial) development changed along the river and data needs changed as well, furthermore the complicated and inexact methods of sampling bed load on the alluvial reach of the river were not developed further. Frequency of suspended sediment sampling is very low along the river, best organized in the upstream countries, where also on tributaries like the Drau/Drava monitoring stations are in operation. Sampling frequency of suspended load is 3 to 7 per year in Hungary, and even lower downstream. Sediment management is a major challenge, as most methods developed until now are unsustainable, require continuous intervention and are expensive as well. However, there is a new focus on the subject in the 21st century, which still lacks uniform methodological recommendations for measurements and analyses, and the number of engineers with sediment expertise and experience is alarmingly low. Data related to sediment quantity are unreliable and often contradictory. It is difficult to produce high quality long-term databases that could support and enable the mathematical calibration of sediment transport models. Sediment measurements are different in different countries in Europe. Even in Hungary, sampling and laboratory techniques have changed several times in the past. Also, sediment sampling was never really systhematic, and the sampling campaigns did not follow the hydrological processes. That is how sediment data can hardly be compared; and the data series are inhomogeneous and they cannot be statistically analysed. The majority of the existing sediment data in Hungary are not suitable for the data supply needs of state-of-the-art numerical modeling. It is even problematic to describe the connections between water flow (discharge) and sediment transport, because data are scarce and irregular. 
Even the most modern measurement methods (Acoustic Doppler Current Profiler [ADCP], or Laser In Situ Scattering and Transmissometry [LISST]) need calibration, which means field sampling and laboratory processing. For these reasons we need (both quantitatively and qualitatively) appropriate sampling of sediment. In the frame of projects and programs of the Institute for Hydraulic Engineering and Water Management of Eötvös József College, we developed the methodology of field-data collection campaigns in relation to sediment data in order to meet the calibration and verification needs of state-of-the-art numerical modeling, and to be able to collect comparable data series for statistical analyses.
Majumdar, Angshul; Gogna, Anupriya; Ward, Rabab
2014-08-25
We address the problem of acquiring and transmitting EEG signals in Wireless Body Area Networks (WBAN) in an energy efficient fashion. In WBANs, the energy is consumed by three operations: sensing (sampling), processing and transmission. Previous studies only addressed the problem of reducing the transmission energy. For the first time, in this work, we propose a technique to reduce sensing and processing energy as well: this is achieved by randomly under-sampling the EEG signal. We depart from previous Compressed Sensing based approaches and formulate signal recovery (from under-sampled measurements) as a matrix completion problem. A new algorithm to solve the matrix completion problem is derived here. We test our proposed method and find that the reconstruction accuracy of our method is significantly better than state-of-the-art techniques; and we achieve this while saving sensing, processing and transmission energy. Simple power analysis shows that our proposed methodology consumes considerably less power compared to previous CS based techniques.
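The specific recovery algorithm derived in the paper is not given in the abstract; as a generic stand-in for matrix completion from randomly under-sampled entries, a basic singular-value-thresholding iteration looks like the following sketch (synthetic low-rank data, parameters set to the usual textbook recommendations).

```python
import numpy as np

def svt_complete(observed, mask, tau=None, step=None, n_iters=500):
    """Recover a low-rank matrix from sampled entries via singular value thresholding."""
    n1, n2 = observed.shape
    m = mask.sum()
    tau = 5 * np.sqrt(n1 * n2) if tau is None else tau        # shrinkage level
    step = 1.2 * n1 * n2 / m if step is None else step        # gradient step size
    Y = np.zeros_like(observed, dtype=float)
    for _ in range(n_iters):
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        X = (U * np.maximum(s - tau, 0.0)) @ Vt                # soft-threshold singular values
        Y += step * mask * (observed - X)                      # correct only observed entries
    return X

# Synthetic low-rank "multichannel EEG-like" matrix, 30% of entries observed
rng = np.random.default_rng(2)
M = rng.standard_normal((64, 5)) @ rng.standard_normal((5, 256))
mask = rng.random(M.shape) < 0.3
M_hat = svt_complete(M * mask, mask)
print(f"relative error: {np.linalg.norm(M_hat - M) / np.linalg.norm(M):.3f}")
```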
Stakeholder analysis methodologies resource book
DOE Office of Scientific and Technical Information (OSTI.GOV)
Babiuch, W.M.; Farhar, B.C.
1994-03-01
Stakeholder analysis allows analysts to identify how parties might be affected by government projects. This process involves identifying the likely impacts of a proposed action and stakeholder groups affected by that action. Additionally, the process involves assessing how these groups might be affected and suggesting measures to mitigate any adverse effects. Evidence suggests that the efficiency and effectiveness of government actions can be increased and adverse social impacts mitigated when officials understand how a proposed action might affect stakeholders. This report discusses how to conduct useful stakeholder analyses for government officials making decisions on energy-efficiency and renewable-energy technologies and their commercialization. It discusses methodological issues that may affect the validity and reliability of findings, including sampling, generalizability, validity, "uncooperative" stakeholder groups, using social indicators, and the effect of government regulations. The Appendix contains resource directories and a list of specialists in stakeholder analysis and involvement.
NASA Astrophysics Data System (ADS)
Kantiani, Lina; Farré, Marinella; Asperger, Danijela; Rubio, Fernando; González, Susana; López de Alda, Maria J.; Petrović, Mira; Shelver, Weilin L.; Barceló, Damià
2008-10-01
Summary: For the first time, the occurrence of triclosan and its metabolite methyl-triclosan was investigated in a typical Mediterranean area using a two-step methodology based on screening using a magnetic particle immunoassay (IA) and confirmatory analysis by solid phase extraction (SPE) followed by gas chromatography-mass spectrometry (GC-MS). In this study, 95 environmental samples were analyzed. A commercial immunoassay was assessed for use in the different types of water selected for this study. A large monitoring study was performed on the influent and the effluent of eight wastewater treatment plants (WWTPs), water samples from the Ebro and Llobregat rivers, and drinking water. All wastewater samples tested in this study (influents and effluents) showed the presence of triclosan, with concentrations in raw influents being high (10 μg/L as average value). The percentages of triclosan removal for the WWTPs were evaluated (30-70%) along the different treatment processes, showing that the best removal rates were obtained by the processes equipped with membrane bioreactors (MBRs). However, important concentrations of triclosan were detected even after treatment by MBRs. The presence of this biocide was confirmed in 50% of the river samples analyzed. Twenty-two drinking water samples from the Barcelona city area were investigated, and in this case no triclosan was detected. Due to its properties and the widespread usage of triclosan, there is a need for monitoring and controlling the amounts present in wastewater effluents, river water, drinking water catchment areas, and drinking water. To this end, we present a feasible methodology using a magnetic particle-based immunoassay as a screening, followed by confirmatory analysis using solid phase extraction-gas chromatography-mass spectrometry (SPE-GC-MS).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carmichael, Joshua Daniel; Carr, Christina; Pettit, Erin C.
We apply a fully autonomous icequake detection methodology to a single day of high-sample-rate (200 Hz) seismic network data recorded from the terminus of Taylor Glacier, ANT, that temporally coincided with a brine release episode near Blood Falls (May 13, 2014). We demonstrate a statistically validated procedure to assemble waveforms triggered by icequakes into populations of clusters linked by intra-event waveform similarity. Our processing methodology implements a noise-adaptive power detector coupled with a complete-linkage clustering algorithm and noise-adaptive correlation detector. This detector chain reveals a population of 20 multiplet sequences that includes ~150 icequakes and produces zero false alarms on the concurrent, diurnally variable noise. Our results are very promising for identifying changes in background seismicity associated with the presence or absence of brine release episodes. We thereby suggest that our methodology could be applied to longer time periods to establish a brine-release monitoring program for Blood Falls based on icequake detections.
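A minimal sketch of the clustering stage described above (grouping detected waveforms into multiplets by complete-linkage clustering on correlation distance) is shown below; the detector, threshold value, and waveform data are placeholders, not the authors' settings.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def cluster_waveforms(waveforms, max_distance=0.3):
    """Group similar waveforms (rows) into clusters by complete-linkage clustering.

    Distance between two waveforms is 1 - correlation coefficient;
    max_distance=0.3 is a placeholder threshold, not the study's value.
    """
    corr = np.corrcoef(waveforms)                 # pairwise waveform correlations
    dist = 1.0 - corr
    np.fill_diagonal(dist, 0.0)
    condensed = squareform(dist, checks=False)    # condensed distance vector
    tree = linkage(condensed, method="complete")
    return fcluster(tree, t=max_distance, criterion="distance")

# Toy data: two repeating "multiplet" templates plus noise
rng = np.random.default_rng(3)
t = np.linspace(0, 1, 400)
template_a = np.sin(2 * np.pi * 12 * t) * np.exp(-4 * t)
template_b = np.sin(2 * np.pi * 25 * t) * np.exp(-6 * t)
waves = np.vstack([template_a + 0.1 * rng.standard_normal(400) for _ in range(5)] +
                  [template_b + 0.1 * rng.standard_normal(400) for _ in range(5)])
print(cluster_waveforms(waves))   # two groups of five expected
```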
NASA Astrophysics Data System (ADS)
Kamalraj, Devaraj; Yuvaraj, Selvaraj; Yoganand, Coimbatore Paramasivam; Jaffer, Syed S.
2018-01-01
Here, we propose a new synthetic methodology for silver nanocluster preparation using a double-stranded DNA (ds-DNA) template, which has not been reported before. A new calculative method was formulated to determine the size of the nanoclusters and their band gaps by using the steady-state 3D contour fluorescence technique together with the Brus model. Generally, the structure and size of nanoclusters are determined by High Resolution Transmission Electron Microscopy (HR-TEM). Before imaging by HR-TEM, samples are subjected to a drying process, which causes aggregation and forms bigger polycrystalline particles; the approach is also time-consuming and expensive. In the current methodology, we determined the size and band gap of the nanoclusters in liquid form, without any polycrystalline aggregation, using the 3D contour fluorescence technique as an alternative approach to the HR-TEM method.
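The Brus model mentioned above relates a cluster's optical band gap to its radius through quantum confinement; in its common form (bulk gap plus confinement term minus a Coulomb term) it reads:

```latex
E(R) \;=\; E_g^{\mathrm{bulk}} \;+\; \frac{\hbar^2 \pi^2}{2R^2}\left(\frac{1}{m_e^*} + \frac{1}{m_h^*}\right) \;-\; \frac{1.8\, e^2}{4\pi \varepsilon_r \varepsilon_0 R},
```

where R is the cluster radius, m_e* and m_h* are the effective electron and hole masses, and ε_r is the material's dielectric constant. Solving this relation for R, given a band gap estimated from the fluorescence contour, is presumably the calculative step the abstract refers to.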
Finch, Tracy L; Mair, Frances S; O'Donnell, Catherine; Murray, Elizabeth; May, Carl R
2012-05-17
Although empirical and theoretical understanding of processes of implementation in health care is advancing, translation of theory into structured measures that capture the complex interplay between interventions, individuals and context remain limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field. A 30-item instrument (Technology Adoption Readiness Scale (TARS)) for measuring normalisation processes in the context of e-health service interventions was developed on the basis on Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals. The developed instrument was pre-tested in two professional samples (N=46; N=231). Ratings of items representing normalisation 'processes' were significantly related to staff members' perceptions of whether or not e-health had become 'routine'. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts. To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made relating to (1) greater attention to underlying theoretical assumptions and extent of translation work required; (2) the need for appropriate but flexible approaches to outcomes measurement; (3) representation of multiple perspectives and collaborative nature of work; and (4) emphasis on generic measurement approaches that can be flexibly tailored to particular contexts of study.
NASA Technical Reports Server (NTRS)
Parada, N. D. J. (Principal Investigator); Moreira, M. A.
1983-01-01
Using digitally processed MSS/LANDSAT data as the auxiliary variable, a methodology to estimate wheat (Triticum aestivum L.) area by means of sampling techniques was developed. To perform this research, aerial photographs covering 720 sq km of the Cruz Alta test site in the NW of Rio Grande do Sul State were visually analyzed. LANDSAT digital data were analyzed using non-supervised and supervised classification algorithms; as post-processing, the classification was submitted to spatial filtering. To estimate wheat area, the regression estimation method was applied and different sample sizes and various sampling units (10, 20, 30, 40 and 60 sq km) were tested. Based on the four decision criteria established for this research, it was concluded that: (1) as the size of sampling units decreased, the percentage of sampled area required to obtain similar estimation performance also decreased; (2) the lowest percentage of the area sampled for wheat estimation with relatively high precision and accuracy through regression estimation was 90% using 10 sq km as the sampling unit; and (3) wheat area estimation by direct expansion (using only aerial photographs) was less precise and accurate when compared to that obtained by means of regression estimation.
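The regression estimator referred to here is, in its classical survey-sampling form, an adjustment of the photo-interpreted wheat area (y) in sampled units by the auxiliary LANDSAT-classified area (x) known for the whole region; schematically:

```latex
\hat{\bar{Y}}_{\mathrm{reg}} \;=\; \bar{y} \;+\; b\,(\bar{X} - \bar{x}),
\qquad
b \;=\; \frac{\sum_{i=1}^{n}(x_i-\bar{x})(y_i-\bar{y})}{\sum_{i=1}^{n}(x_i-\bar{x})^{2}},
```

where \bar{y} and \bar{x} are sample means over the n sampling units, \bar{X} is the population mean of the auxiliary (LANDSAT) variable, and the estimated total follows by scaling \hat{\bar{Y}}_{reg} to the full study area. The exact variant used in the study is not detailed in the abstract, so this is only the standard textbook form.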
Identifying the starting point of a spreading process in complex networks.
Comin, Cesar Henrique; Costa, Luciano da Fontoura
2011-11-01
When dealing with the dissemination of epidemics, one important question that can be asked is the location where the contamination began. In this paper, we analyze three spreading schemes and propose and validate an effective methodology for the identification of the source nodes. The method is based on the calculation of the centrality of the nodes on the sampled network, expressed here by degree, betweenness, closeness, and eigenvector centrality. We show that the source node tends to have the highest measurement values. The potential of the methodology is illustrated with respect to three theoretical complex network models as well as a real-world network, the email network of the University Rovira i Virgili.
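A minimal sketch of the identification step described above (compute several centralities on the observed network and flag the node with the highest values), using networkx on a toy graph:

```python
import networkx as nx

def rank_source_candidates(sampled_graph):
    """Rank nodes of the observed (sampled) network by four centrality measures."""
    measures = {
        "degree": nx.degree_centrality(sampled_graph),
        "betweenness": nx.betweenness_centrality(sampled_graph),
        "closeness": nx.closeness_centrality(sampled_graph),
        "eigenvector": nx.eigenvector_centrality(sampled_graph, max_iter=1000),
    }
    # The paper's finding: the true source tends to score highest on these measures.
    return {name: max(values, key=values.get) for name, values in measures.items()}

# Toy example: a small-world graph standing in for a sampled contact network
G = nx.watts_strogatz_graph(200, 6, 0.1, seed=4)
print(rank_source_candidates(G))   # top-ranked node under each centrality
```

In practice the graph passed in would be the subnetwork reached by the spreading process, and agreement among the four rankings would strengthen the identification.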
Yoshikawa, Hirokazu; Weisner, Thomas S; Kalil, Ariel; Way, Niobe
2008-03-01
Multiple methods are vital to understanding development as a dynamic, transactional process. This article focuses on the ways in which quantitative and qualitative methodologies can be combined to enrich developmental science and the study of human development, focusing on the practical questions of "when" and "how." Research situations that may be especially suited to mixing qualitative and quantitative approaches are described. The authors also discuss potential choices for using mixed quantitative- qualitative approaches in study design, sampling, construction of measures or interview protocols, collaborations, and data analysis relevant to developmental science. Finally, they discuss some common pitfalls that occur in mixing these methods and include suggestions for surmounting them.
Impact of Processing Method on Recovery of Bacteria from Wipes Used in Biological Surface Sampling
Olson, Nathan D.; Filliben, James J.; Morrow, Jayne B.
2012-01-01
Environmental sampling for microbiological contaminants is a key component of hygiene monitoring and risk characterization practices utilized across diverse fields of application. However, confidence in surface sampling results, both in the field and in controlled laboratory studies, has been undermined by large variation in sampling performance results. Sources of variation include controlled parameters, such as sampling materials and processing methods, which often differ among studies, as well as random and systematic errors; however, the relative contributions of these factors remain unclear. The objective of this study was to determine the relative impacts of sample processing methods, including extraction solution and physical dissociation method (vortexing and sonication), on recovery of Gram-positive (Bacillus cereus) and Gram-negative (Burkholderia thailandensis and Escherichia coli) bacteria from directly inoculated wipes. This work showed that target organism had the largest impact on extraction efficiency and recovery precision, as measured by traditional colony counts. The physical dissociation method (PDM) had negligible impact, while the effect of the extraction solution was organism dependent. Overall, however, extraction of organisms from wipes using phosphate-buffered saline with 0.04% Tween 80 (PBST) resulted in the highest mean recovery across all three organisms. The results from this study contribute to a better understanding of the factors that influence sampling performance, which is critical to the development of efficient and reliable sampling methodologies relevant to public health and biodefense. PMID:22706055
ERIC Educational Resources Information Center
Lordán, Eva; Solé, Isabel; Beltran, Francesc S.
2017-01-01
The aim of this research was to develop a new questionnaire for exploring the reading beliefs of undergraduate students, because the only currently available instrument has conceptual and methodological limitations. The paper describes the process of developing the instrument and presents a range of psychometric data obtained from a sample of 558…
Automatic Target Recognition Classification System Evaluation Methodology
2002-09-01
[List-of-figures excerpt: Testing Set of Two-Class XOR Data (250 Samples); Decision Analysis Process Flow Chart] ... ROC curve meta-analysis, which is the estimation of the true ROC curve of a given diagnostic system through ROC analysis across many studies or ... technique can be very effective in sensitivity analysis; trying to determine which data points have the most effect on the solution, and in
NASA Technical Reports Server (NTRS)
Eugenbrode, J.; Glavin, D.; Dworkin, J.; Conrad, P.; Mahaffy, P.
2011-01-01
Organic chemicals, when present in extraterrestrial samples, afford precious insight into past and modern conditions elsewhere in the Solar System . No single technology identifies all molecular components because naturally occurring molecules have different chemistries (e.g., polar vs. non-polar, low to high molecular weight) and interface with the ambient sample chemistry in a variety of modes (i.e., organics may be bonded, absorbed or trapped by minerals, liquids, gases, or other organics). More than 90% of organic matter in most natural samples on Earth and in meteorites is composed of complex macromolecules (e.g. biopolymers, complex biomolecules, humic substances, kerogen) because the processes that tend to break down organic molecules also tend towards complexation of the more recalcitrant components. Thus, methodologies that tap the molecular information contained within macromolecules may be critical to detecting extraterrestrial organic matter and assessing the sources and processes influencing its nature.
Haith-Cooper, Melanie
2003-01-01
The use of problem-based learning (PBL) in health professional curricula is becoming more widespread. Although the way in which the tutor facilitates PBL can have a major impact on students' learning (Andrews and Jones 1996), the literature provides little consistency as to how the tutor can effectively facilitate PBL (Haith-Cooper 2000). It is therefore important to examine the facilitation role to promote effective learning through the use of PBL. This article is the first of two parts exploring a study that was undertaken to investigate tutors' experiences of facilitating PBL. This part focuses on the methodology and the combination of innovative processes with established philosophical traditions to develop a systematic educational research methodology. The study was undertaken respecting the philosophy of hermeneutic phenomenology but utilised alternative data collection and analysis techniques. Video conferencing and e-mail were used in conjunction with more traditional processes to access a worldwide sample. This paper explores some of the issues that arose when undertaking such a study. The second article then focuses on exploring the findings of the study and their implications for the facilitation of PBL.
[Use of the Six Sigma methodology for the preparation of parenteral nutrition mixtures].
Silgado Bernal, M F; Basto Benítez, I; Ramírez García, G
2014-04-01
To apply the tools of the Six Sigma methodology to the statistical control of parenteral nutrition mixture preparation at the critical checkpoint of specific density. Between August 2010 and September 2013, specific density analysis was performed on 100% of the samples, and the data were divided into two groups, adults and neonates. The percentage of acceptance, the trend graphs, and the sigma level were determined. A normality analysis was carried out using the Shapiro-Wilk test, and the total percentage of mixtures within the specification limits was calculated. The specific density data between August 2010 and September 2013 comply with the normality test (W = 0.94) and show improvement in sigma level over time, reaching 6/6 in adults and 3.8/6 in neonates. 100% of the mixtures comply with the specification limits for adults and neonates, always within the control limits during the process. The improvement plans, together with the Six Sigma methodology, allow the process to be controlled and guarantee agreement between the medical prescription and the content of the mixture. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
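For readers unfamiliar with how a sigma level is derived from process data and specification limits, the Python sketch below shows one conventional calculation: a Shapiro-Wilk normality check, the out-of-specification fraction from a fitted normal distribution, and a sigma level quoted with the customary 1.5-sigma shift. The density values and specification limits are hypothetical and are not the study's data.

```python
# Hedged sketch of a Six Sigma "sigma level" estimate for a density checkpoint.
import numpy as np
from scipy import stats

densities = np.array([1.042, 1.045, 1.043, 1.047, 1.044, 1.046, 1.041, 1.045])
lsl, usl = 1.035, 1.055   # hypothetical lower/upper specification limits

# Normality check (the study reports a Shapiro-Wilk W statistic).
w_stat, p_value = stats.shapiro(densities)

mu, sigma = densities.mean(), densities.std(ddof=1)

# Fraction of the fitted normal distribution falling outside the spec limits.
p_out = stats.norm.cdf(lsl, mu, sigma) + stats.norm.sf(usl, mu, sigma)

# Long-term sigma level is often quoted with the conventional 1.5-sigma shift.
sigma_level = stats.norm.isf(p_out) + 1.5

print(f"Shapiro-Wilk W = {w_stat:.3f} (p = {p_value:.3f})")
print(f"Estimated sigma level: {sigma_level:.2f}")
```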
Emery, Sherry; Lee, Jungwha; Curry, Susan J; Johnson, Tim; Sporer, Amy K; Mermelstein, Robin; Flay, Brian; Warnecke, Richard
2010-02-01
Surveys of community-based programs are difficult to conduct when there is virtually no information about the number or locations of the programs of interest. This article describes the methodology used by the Helping Young Smokers Quit (HYSQ) initiative to identify and profile community-based youth smoking cessation programs in the absence of a defined sample frame. We developed a two-stage sampling design, with counties as the first-stage probability sampling units. The second stage used snowball sampling to saturation, to identify individuals who administered youth smoking cessation programs across three economic sectors in each county. Multivariate analyses modeled the relationship between program screening, eligibility, and response rates and economic sector and stratification criteria. Cumulative logit models analyzed the relationship between the number of contacts in a county and the number of programs screened, eligible, or profiled in a county. The snowball process yielded 9,983 unique and traceable contacts. Urban and high-income counties yielded significantly more screened program administrators; urban counties produced significantly more eligible programs, but there was no significant association between the county characteristics and program response rate. There is a positive relationship between the number of informants initially located and the number of programs screened, eligible, and profiled in a county. Our strategy to identify youth tobacco cessation programs could be used to create a sample frame for other nonprofit organizations that are difficult to identify due to a lack of existing directories, lists, or other traditional sample frames.
Brandão, Marcelo L L; Almeida, Davi O; Bispo, Fernanda C P; Bricio, Silvia M L; Marin, Victor A; Miagostovich, Marize P
2014-05-01
This study aimed to assess the microbiological contamination of lettuces commercialized in Rio de Janeiro, Brazil, by investigating the detection of norovirus genogroup II (NoV GII), Salmonella spp., and total and fecal coliforms, including Escherichia coli. For NoV detection, samples were processed using the adsorption-elution concentration method associated with real-time quantitative polymerase chain reaction (qPCR). A total of 90 samples of lettuce, including 30 whole fresh lettuces, 30 minimally processed (MP) lettuces, and 30 raw ready-to-eat (RTE) lettuce salads, were randomly collected from different supermarkets (fresh and MP lettuce samples), food services, and self-service restaurants (RTE lettuce salads), all located in Rio de Janeiro, Brazil, from October 2010 to December 2011. NoV GII was not detected, and the PP7 bacteriophage used as an internal control process (ICP) was recovered in 40.0%, 86.7%, and 76.7% of those samples, respectively. Salmonella spp. was not detected, although fecal contamination was observed, with fecal coliform concentrations higher than 10² most probable number/g. E. coli was detected in 70.0%, 6.7%, and 30.0% of fresh, MP, and RTE samples, respectively. This study highlights the need to improve hygiene procedures at all stages of vegetable production and demonstrates the usefulness of the PP7 bacteriophage as an ICP in methods for recovering RNA viruses from MP and RTE lettuce samples, encouraging the evaluation of new protocols that facilitate the establishment of methodologies for NoV detection in a greater number of food microbiology laboratories. The PP7 bacteriophage can be used as an internal control process in methods for recovering RNA viruses from minimally processed and ready-to-eat lettuce samples. © 2014 Institute of Food Technologists®
Single point aerosol sampling: evaluation of mixing and probe performance in a nuclear stack.
Rodgers, J C; Fairchild, C I; Wood, G O; Ortiz, C A; Muyshondt, A; McFarland, A R
1996-01-01
Alternative reference methodologies have been developed for sampling of radionuclides from stacks and ducts, which differ from the methods previously required by the United States Environmental Protection Agency. These alternative reference methodologies have recently been approved by the U.S. EPA for use in lieu of the current standard techniques. The standard EPA methods are prescriptive in the selection of sampling locations and in the design of sampling probes, whereas the alternative reference methodologies are performance driven. Tests were conducted in a stack at Los Alamos National Laboratory to demonstrate the efficacy of some aspects of the alternative reference methodologies. Coefficients of variation of velocity, tracer gas, and aerosol particle profiles were determined at three sampling locations. Results showed that the numerical criteria placed upon the coefficients of variation by the alternative reference methodologies were met at sampling stations located 9 and 14 stack diameters from the flow entrance, but not at a location 1.5 diameters downstream from the inlet. Experiments were conducted to characterize the transmission of 10-μm aerodynamic diameter liquid aerosol particles through three types of sampling probes. The transmission ratio (ratio of aerosol concentration at the probe exit plane to the concentration in the free stream) was 107% for a 113 L min⁻¹ (4-cfm) anisokinetic shrouded probe, but only 20% for an isokinetic probe that follows the existing EPA standard requirements. A specially designed isokinetic probe showed a transmission ratio of 63%. The shrouded probe performance would conform to the criteria of the alternative reference methodologies; however, the isokinetic probes would not.
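A hedged sketch of the two quantities the performance-based criteria rely on follows: the coefficient of variation of a measured profile across the sampling plane, and a probe transmission ratio. The profile readings and concentrations below are illustrative only, not measurements from the Los Alamos tests.

```python
# Minimal sketch of the CV and transmission-ratio calculations (illustrative data).
import numpy as np

def coefficient_of_variation(profile):
    """CV (%) of e.g. velocity, tracer gas, or particle concentration readings
    taken at multiple points across the stack sampling plane."""
    profile = np.asarray(profile, dtype=float)
    return 100.0 * profile.std(ddof=1) / profile.mean()

def transmission_ratio(c_probe_exit, c_free_stream):
    """Ratio (%) of aerosol concentration at the probe exit plane to the
    free-stream concentration."""
    return 100.0 * c_probe_exit / c_free_stream

velocity_profile = [11.8, 12.1, 12.4, 12.0, 11.9, 12.3]   # m/s, hypothetical
print(f"Velocity CV: {coefficient_of_variation(velocity_profile):.1f}%")
print(f"Shrouded probe transmission: {transmission_ratio(1.07, 1.00):.0f}%")
```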
A validated methodology for genetic identification of tuna species (genus Thunnus).
Viñas, Jordi; Tudela, Sergi
2009-10-27
Tuna species of the genus Thunnus, such as the bluefin tunas, are some of the most important and yet most endangered trade fish in the world. Identification of these species in traded forms, however, may be difficult depending on the presentation of the products, which may hamper conservation efforts on trade control. In this paper, we validated a genetic methodology that can fully distinguish between the eight Thunnus species from any kind of processed tissue. After testing several genetic markers, a complete discrimination of the eight tuna species was achieved using Forensically Informative Nucleotide Sequencing based primarily on the sequence variability of the hypervariable mitochondrial DNA control region (mtDNA CR), followed, in some specific cases, by a second validation with a nuclear marker, the rDNA first internal transcribed spacer (ITS1). This methodology was able to distinguish all tuna species, including those belonging to the subgenus Neothunnus, which are very closely related and consequently cannot be differentiated with other genetic markers of lower variability. This methodology also took into consideration the presence of introgression that has been reported in past studies between T. thynnus, T. orientalis and T. alalunga. Finally, we applied the methodology to cross-check the species identity of 26 processed tuna samples. Using the combination of two genetic markers, one mitochondrial and one nuclear, allows a full discrimination between all eight tuna species. Unexpectedly, the genetic marker traditionally used for DNA barcoding, cytochrome oxidase 1, could not differentiate all species, thus its use as a genetic marker for tuna species identification is questioned.
DEVELOPMENT OF A SUB-SLAB AIR SAMPLING PROTOCOL TO SUPPORT ASSESSMENT OF VAPOR INTRUSION
The primary purpose of this research effort is to develop a methodology for sub-slab sampling to support the EPA guidance and vapor intrusion investigations after vapor intrusion has been established at a site. Methodologies for sub-slab air sampling are currently lacking in ref...
Methodological Choices in Rating Speech Samples
ERIC Educational Resources Information Center
O'Brien, Mary Grantham
2016-01-01
Much pronunciation research critically relies upon listeners' judgments of speech samples, but researchers have rarely examined the impact of methodological choices. In the current study, 30 German native listeners and 42 German L2 learners (L1 English) rated speech samples produced by English-German L2 learners along three continua: accentedness,…
Elhanan, Gai; Ochs, Christopher; Mejino, Jose L V; Liu, Hao; Mungall, Christopher J; Perl, Yehoshua
2017-06-01
To examine whether disjoint partial-area taxonomy, a semantically based evaluation methodology that has been successfully tested in SNOMED CT, will perform with similar effectiveness on Uberon, an anatomical ontology that belongs to the same structurally similar family of ontologies as SNOMED CT. A disjoint partial-area taxonomy was generated for Uberon. One hundred randomly selected test concepts that overlap between partial-areas were matched to a same-size control sample of non-overlapping concepts. The samples were blindly inspected for non-critical issues and presumptive errors, first by a general domain expert whose results were then confirmed or rejected by a highly experienced anatomical ontology domain expert. Reported issues were subsequently reviewed by Uberon's curators. Overlapping concepts in Uberon's disjoint partial-area taxonomy exhibited a significantly higher rate of all issues. Clear-cut presumptive errors trended similarly but did not reach statistical significance. A sub-analysis of overlapping concepts with three or more relationship types indicated a much higher rate of issues. Overlapping concepts from Uberon's disjoint abstraction network are quite likely (up to 28.9%) to exhibit issues. The results suggest that the methodology can transfer well between ontologies of the same family. Although Uberon exhibited relatively few overlapping concepts, the methodology can be combined with other semantic indicators to extend the process to other concepts within the ontology and generate high yields of discovered issues. Copyright © 2017 Elsevier B.V. All rights reserved.
Alberdi-Cedeño, Jon; Ibargoitia, María L; Cristillo, Giovanna; Sopelana, Patricia; Guillén, María D
2017-04-15
The possibilities offered by a new methodology to determine minor components in edible oils are described. This is based on immersion of a solid-phase microextraction fiber of PDMS/DVB into the oil matrix, followed by Gas Chromatography/Mass Spectrometry. It enables characterization and differentiation of edible oils in a simple way, without either solvents or sample modification. This methodology allows simultaneous identification and quantification of sterols, tocols, hydrocarbons of different natures, fatty acids, esters, monoglycerides, fatty amides, aldehydes, ketones, alcohols, epoxides, furans, pyrans and terpenic oxygenated derivatives. The broad information provided by this methodology is useful for different areas of interest such as nutritional value, oxidative stability, technological performance, quality, processing, safety and even the prevention of fraudulent practices. Furthermore, for the first time, certain fatty amides, gamma- and delta-lactones of high molecular weight, and other aromatic compounds such as some esters derived from cinnamic acid have been detected in edible oils. Copyright © 2016 Elsevier Ltd. All rights reserved.
Elliston, Adam; Wood, Ian P; Soucouri, Marie J; Tantale, Rachelle J; Dicks, Jo; Roberts, Ian N; Waldron, Keith W
2015-01-01
High-throughput (HTP) screening is becoming an increasingly useful tool for collating biological data which would otherwise require the employment of excessive resources. Second generation biofuel production is one such process. HTP screening allows the investigation of large sample sets to be undertaken with increased speed and cost effectiveness. This paper outlines a methodology that enables solid lignocellulosic substrates to be hydrolyzed and fermented at a 96-well plate scale, facilitating HTP screening of ethanol production whilst maintaining repeatability similar to that achieved at a larger scale. The results showed that utilizing sheets of biomass of consistent density (handbills) for paper, and pipettable slurries of pretreated biomass, allowed standardized and accurate transfers to 96-well plates to be achieved (±3.1 and 1.7%, respectively). Processing these substrates by simultaneous saccharification and fermentation (SSF) at various volumes showed no significant difference in final ethanol yields, whether at standard shake flask (200 mL), universal bottle (10 mL) or 96-well plate (1 mL) scales. Substrate concentrations of up to 10% (w/v) were trialed successfully for SSFs at 1 mL volume. The methodology was successfully tested by showing the effects of steam explosion pretreatment on both oilseed rape and wheat straws. This methodology could be used to replace large shake flask reactions with comparatively fast 96-well plate SSF assays, allowing for HTP experimentation. Additionally, this method is compatible with a number of standardized assay techniques such as simple colorimetric assays, high-performance liquid chromatography (HPLC) and nuclear magnetic resonance (NMR) spectroscopy. Furthermore, this research has practical uses in the biorefining of biomass substrates for second generation biofuels and novel biobased chemicals by allowing HTP SSF screening, which should allow selected samples to be scaled up or studied in more detail.
Enantiomer fractions of polychlorinated biphenyls in three selected Standard Reference Materials.
Morrissey, Joshua A; Bleackley, Derek S; Warner, Nicholas A; Wong, Charles S
2007-01-01
The enantiomer composition of six chiral polychlorinated biphenyls (PCBs) were measured in three different certified Standard Reference Materials (SRMs) from the US National Institute of Standards and Technology (NIST): SRM 1946 (Lake Superior fish tissue), SRM 1939a (PCB Congeners in Hudson River Sediment), and SRM 2978 (organic contaminants in mussel tissue--Raritan Bay, New Jersey) to aid in quality assurance/quality control methodologies in the study of chiral pollutants in sediments and biota. Enantiomer fractions (EFs) of PCBs 91, 95, 136, 149, 174, and 183 were measured using a suite of chiral columns by gas chromatography/mass spectrometry. Concentrations of target analytes were in agreement with certified values. Target analyte EFs in reference materials were measured precisely (<2% relative standard deviation), indicating the utility of SRM in quality assurance/control methodologies for analyses of chiral compounds in environmental samples. Measured EFs were also in agreement with previously published analyses of similar samples, indicating that similar enantioselective processes were taking place in these environmental matrices.
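As a hedged illustration, the sketch below computes an enantiomer fraction (EF) and its relative standard deviation from replicate chromatographic peak areas. The convention assumed here (EF = first enantiomer over the sum of both) and the replicate values are illustrative, not results from the SRM analyses.

```python
# Minimal sketch of an enantiomer fraction (EF) calculation with precision.
import numpy as np

def enantiomer_fraction(area_e1, area_e2):
    # EF defined here as first-eluting enantiomer / (sum of both); assumption.
    return area_e1 / (area_e1 + area_e2)

# Replicate peak-area pairs for one chiral PCB congener (hypothetical).
replicates = [(1520.0, 1490.0), (1505.0, 1478.0), (1533.0, 1512.0)]
efs = np.array([enantiomer_fraction(a, b) for a, b in replicates])

print(f"Mean EF = {efs.mean():.3f}, RSD = {100 * efs.std(ddof=1) / efs.mean():.2f}%")
```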
Arden, Sarah V; Pentimonti, Jill M; Cooray, Rochana; Jackson, Stephanie
2017-07-01
This investigation employs categorical content analysis processes as a mechanism to examine trends and issues in a sampling of highly cited (100+) literature in special education journals. The authors had two goals: (a) broadly identifying trends across publication type, content area, and methodology and (b) specifically identifying articles with disaggregated outcomes for students with learning disabilities (LD). Content analyses were conducted across highly cited (100+) articles published during a 20-year period (1992-2013) in a sample (n = 3) of journals focused primarily on LD, and in one broad, cross-categorical journal recognized for its impact in the field. Results indicated trends in the article type (i.e., commentary and position papers), content (i.e., reading and behavior), and methodology (i.e., small proportions of experimental and quasi-experimental designs). Results also revealed stability in the proportion of intervention research studies when compared to previous analyses and a decline in the proportion of those that disaggregated data specifically for students with LD.
Advanced Machine Learning Emulators of Radiative Transfer Models
NASA Astrophysics Data System (ADS)
Camps-Valls, G.; Verrelst, J.; Martino, L.; Vicent, J.
2017-12-01
Physically-based model inversion methodologies are based on physical laws and established cause-effect relationships. A plethora of remote sensing applications rely on the physical inversion of a Radiative Transfer Model (RTM), which leads to physically meaningful bio-geo-physical parameter estimates. The process is, however, computationally expensive and needs expert knowledge for the selection of the RTM, its parametrization, the look-up table generation, and its inversion. Mimicking complex codes with statistical nonlinear machine learning algorithms has very recently become the natural alternative. Emulators are statistical constructs able to approximate the RTM at a fraction of the computational cost, providing an estimation of uncertainty and estimations of the gradient or finite integral forms. We review the field and recent advances in the emulation of RTMs with machine learning models. We posit Gaussian processes (GPs) as the proper framework to tackle the problem. Furthermore, we introduce an automatic methodology to construct emulators for costly RTMs. The Automatic Gaussian Process Emulator (AGAPE) methodology combines the interpolation capabilities of GPs with the accurate design of an acquisition function that favours sampling in low density regions and flatness of the interpolation function. We illustrate the good capabilities of our emulators in toy examples, the leaf- and canopy-level PROSPECT and PROSAIL RTMs, and the construction of an optimal look-up table for atmospheric correction based on MODTRAN5.
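The sketch below illustrates the general active-learning idea in the spirit of such GP emulators, not the authors' AGAPE implementation: a GP is fit to a small design, and new evaluation points are chosen where predictive uncertainty is high and existing samples are sparse. The stand-in "RTM" function, kernel choice, and acquisition weighting are assumptions for illustration.

```python
# Hedged sketch of an adaptively sampled Gaussian process emulator of a costly model.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def expensive_rtm(x):                      # cheap placeholder for a costly RTM run
    return np.sin(3 * x) + 0.5 * x ** 2

rng = np.random.default_rng(0)
X_train = rng.uniform(0, 2, size=(5, 1))   # small initial design
y_train = expensive_rtm(X_train).ravel()

X_cand = np.linspace(0, 2, 200).reshape(-1, 1)   # candidate sampling locations

for _ in range(10):                        # add 10 points adaptively
    gp = GaussianProcessRegressor(ConstantKernel() * RBF(), normalize_y=True)
    gp.fit(X_train, y_train)
    _, std = gp.predict(X_cand, return_std=True)
    # Acquisition: predictive uncertainty weighted by distance to the nearest
    # existing sample, favouring sparsely sampled regions.
    dist = np.min(np.abs(X_cand - X_train.T), axis=1)
    acq = std * dist
    x_new = X_cand[np.argmax(acq)].reshape(1, 1)
    X_train = np.vstack([X_train, x_new])
    y_train = np.append(y_train, expensive_rtm(x_new).ravel())

print(f"Emulator built from {len(X_train)} model evaluations")
```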
DOE Office of Scientific and Technical Information (OSTI.GOV)
Estevez, Ivan; Concept Scientific Instruments, ZA de Courtaboeuf, 2 rue de la Terre de Feu, 91940 Les Ulis; Chrétien, Pascal
2014-02-24
On the basis of a home-made nanoscale impedance measurement device associated with a commercial atomic force microscope, a specific operating process is proposed in order to improve absolute (in the sense of “nonrelative”) capacitance imaging by drastically reducing the parasitic effects due to stray capacitance, surface topography, and sample tilt. The method, combining a two-pass image acquisition with the exploitation of approach curves, has been validated on sets of calibration samples consisting of square parallel-plate capacitors for which theoretical capacitance values were numerically calculated.
Cappione, Amedeo; Mabuchi, Masaharu; Briggs, David; Nadler, Timothy
2015-04-01
Protein immuno-detection encompasses a broad range of analytical methodologies, including western blotting, flow cytometry, and microscope-based applications. These assays, which detect, quantify, and/or localize expression of one or more proteins in complex biological samples, rely upon fluorescent or enzyme-tagged target-specific antibodies. While small molecule labeling kits are available with a range of detection moieties, the workflow is hampered by a requirement for multiple dialysis-based buffer exchange steps that are both time-consuming and subject to sample loss. In a previous study, we briefly described an alternative method for small-scale protein labeling with small molecule dyes whereby all phases of the conjugation workflow could be performed in a single centrifugal diafiltration device. Here, we expand on this foundational work, addressing the functionality of the device at each step in the workflow (sample cleanup, labeling, unbound dye removal, and buffer exchange/concentration) and the implications for optimizing labeling efficiency. When compared to other common buffer exchange methodologies, centrifugal diafiltration offered superior performance as measured by four key parameters (process time, desalting capacity, protein recovery, and retention of functional integrity). Originally designed for resin-based affinity purification, the device also provides a platform for up-front antibody purification or albumin carrier removal. Most significantly, by exploiting the rapid kinetics of NHS-based labeling reactions, the process of continuous diafiltration minimizes reaction time and long exposure to excess dye, guaranteeing maximal target labeling while limiting the risks associated with over-labeling. Overall, the device offers a simplified workflow with reduced processing time and hands-on requirements, without sacrificing labeling efficiency, final yield, or conjugate performance. Copyright © 2015 Elsevier B.V. All rights reserved.
da Silva Filho, Manoel; Santos, Daniel Valle Vasconcelos; Costa, Kauê Machado
2013-01-01
Analyzing cell morphology is crucial in the fields of cell biology and neuroscience. One of the main methods for evaluating cell morphology is by using intracellular fluorescent markers, including various commercially available dyes and genetically encoded fluorescent proteins. These markers can be used as free radical sources in photooxidation reactions, which in the presence of diaminobenzidine (DAB) forms an opaque and electron-dense precipitate that remains localized within the cellular and organelle membranes. This method confers many methodological advantages for the investigator, including absence of photo-bleaching, high visual contrast and the possibility of correlating optical imaging with electron microscopy. However, current photooxidation techniques require the continuous use of fluorescent or confocal microscopes, which wastes valuable mercury lamp lifetime and limits the conversion process to a few cells at a time. We developed a low cost optical apparatus for performing photooxidation reactions and propose a new procedure that solves these methodological restrictions. Our “photooxidizer” consists of a high power light emitting diode (LED) associated with a custom aluminum and acrylic case and a microchip-controlled current source. We demonstrate the efficacy of our method by converting intracellular DiI in samples of developing rat neocortex and post-mortem human retina. DiI crystals were inserted in the tissue and allowed to diffuse for 20 days. The samples were then processed with the new photooxidation technique and analyzed under optical microscopy. The results show that our protocols can unveil the fine morphology of neurons in detail. Cellular structures such as axons, dendrites and spine-like appendages were well defined. In addition to its low cost, simplicity and reliability, our method precludes the use of microscope lamps for photooxidation and allows the processing of many labeled cells simultaneously in relatively large tissue samples with high efficacy. PMID:23441199
Lewandowska, Aleksandra E; Macur, Katarzyna; Czaplewska, Paulina; Liss, Joanna; Łukaszuk, Krzysztof; Ołdziej, Stanisław
2017-08-04
Human follicular fluid (hFF) is the natural environment of oocyte maturation, and some components of hFF could be used to judge oocyte capability for fertilization and further development. In our pilot small-scale study, three samples from each of four donors (12 samples in total) were analyzed to determine which hFF proteins/peptides could be used to differentiate individual oocytes and which are patient-specific. Ultrafiltration was used to fractionate hFF into a high-molecular-weight (HMW) proteome (>10 kDa) fraction and a low-molecular-weight (LMW) peptidome (<10 kDa) fraction. HMW and LMW compositions were analyzed using LC-MS with the SWATH data acquisition and processing methodology. In total we were able to identify 158 proteins, of which 59 had never been reported before as hFF components. 55 (45 not reported before) proteins were found by analyzing the LMW fraction, 67 (14 not reported before) were found by analyzing the HMW fraction, and 36 were identified in both fractions of hFF. We were able to perform quantitative analysis for 72 proteins from the HMW fraction of hFF. We found that the concentrations of 11 proteins varied substantially among hFF samples from single donors, and those proteins are promising targets to identify biomarkers useful in oocyte quality assessment.
NASA Astrophysics Data System (ADS)
Li, Z.; Che, W.; Frey, H. C.; Lau, A. K. H.
2016-12-01
Portable air monitors are currently being developed and used to enable a move towards exposure monitoring as opposed to fixed-site monitoring. Reliable methods are needed for capturing spatial and temporal variability in exposure concentration in order to obtain credible data from which to develop efficient exposure mitigation measures. However, there are few studies that quantify the validity and repeatability of the collected data. The objective of this study is to present and evaluate a collocated exposure monitoring (CEM) methodology, including the calibration of portable air monitors against stationary reference equipment, side-by-side comparison of portable air monitors, personal or microenvironmental exposure monitoring, and the processing and interpretation of the collected data. The CEM methodology was evaluated based on its application to the portable monitors TSI DustTrak II Aerosol Monitor 8530 for fine particulate matter (PM2.5) and TSI Q-Trak model 7575 with probe model 982 for CO, CO2, temperature and relative humidity. Taking a school sampling campaign in Hong Kong in January and June 2015 as an example, the calibrated side-by-side 1 Hz PM2.5 measurements showed good consistency between two sets of portable air monitors. Confidence in the side-by-side comparison, in which PM2.5 concentrations agreed within 2 percent most of the time, enabled robust inference regarding differences when the monitors measured classroom and pedestrian microenvironments during school hours. The proposed CEM methodology can be widely applied in sampling campaigns with the objective of simultaneously characterizing pollutant concentrations in two or more locations or microenvironments. The further application of the CEM methodology to transportation exposure will be presented and discussed.
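A hedged sketch of the two numerical steps in such a collocated workflow follows: fitting a linear calibration of a portable monitor against collocated reference readings, then checking side-by-side agreement between two calibrated monitors as a percent difference. All readings and the linear-calibration assumption are illustrative only.

```python
# Minimal sketch of collocated calibration and side-by-side comparison.
import numpy as np

# 1-Hz PM2.5 readings (ug/m3) from a portable monitor collocated with a
# stationary reference instrument (hypothetical values).
portable = np.array([38.0, 41.5, 44.0, 47.2, 50.1, 53.0])
reference = np.array([30.0, 33.0, 35.5, 38.0, 40.2, 42.5])

slope, intercept = np.polyfit(portable, reference, 1)   # linear calibration
calibrate = lambda x: slope * np.asarray(x) + intercept

# Side-by-side comparison of two calibrated portable monitors.
monitor_a = calibrate([45.0, 46.3, 47.8])
monitor_b = calibrate([45.6, 46.1, 48.4])
pct_diff = 100.0 * np.abs(monitor_a - monitor_b) / ((monitor_a + monitor_b) / 2)

print(f"Calibration: ref = {slope:.2f} * portable + {intercept:.2f}")
print(f"Side-by-side differences (%): {np.round(pct_diff, 1)}")
```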
Gonzalez-Sanchez, M Beatriz; Lopez-Valeiras, Ernesto; Morente, Manuel M; Fernández Lago, Orlando
2013-10-01
Current economic conditions and budget constraints in publicly funded biomedical research have brought about a renewed interest in analyzing the cost and economic viability of research infrastructures. However, there are no proposals for specific cost accounting models for these types of organizations in the international scientific literature. The aim of this paper is to present the basis of a cost analysis model useful for any biobank regardless of the human biological samples that it stores for biomedical research. The development of a unique cost model for biobanks can be a complicated task due to the diversity of the biological samples they store. Different types of samples (DNA, tumor tissues, blood, serum, etc.) require different production processes. Nonetheless, the common basic steps of the production process can be identified. Thus, the costs incurred in each step can be analyzed in detail to provide cost information. Six stages and four cost objects were obtained by taking the production processes of biobanks belonging to the Spanish National Biobank Network as a starting point. Templates and examples are provided to help managers to identify and classify the costs involved in their own biobanks to implement the model. The application of this methodology will provide accurate information on cost objects, along with useful information to give an economic value to the stored samples, to analyze the efficiency of the production process and to evaluate the viability of some sample collections.
Gonçalves, C; Alpendurada, M F
2005-03-15
In order to reduce the amount of sample to be collected and the time consumed in the analytical process, a broad range of analytes should preferably be considered in the same analytical procedure. A suitable methodology for pesticide residue analysis in soil samples was developed based on ultrasonic extraction (USE) and gas chromatography-mass spectrometry (GC-MS). For this study, different classes of pesticides were selected, both recent and old persistent molecules: parent compounds and degradation products, namely organochlorine, organophosphorous and pyrethroid insecticides, triazine and acetanilide herbicides and other miscellaneous pesticides. Pesticide residues could be detected in the low- to sub-ppb range (0.05-7.0 μg kg⁻¹) with good precision (7.5-20.5%, average 13.7% R.S.D.) and extraction efficiency (69-118%, average 88%) for the great majority of analytes. This methodology has been applied in a monitoring program of soil samples from an intensive horticulture area in Póvoa de Varzim, in the north of Portugal. The pesticides detected in four sampling programs (2001/2002) were the following: lindane, dieldrin, endosulfan, endosulfan sulfate, 4,4'-DDE, 4,4'-DDD, atrazine, desethylatrazine, alachlor, dimethoate, chlorpyrifos, pendimethalin, procymidone and chlorfenvinphos. Pesticide contamination was investigated at three depths and in different soil and crop types to assess the influence of soil characteristics and trends over time.
QESA: Quarantine Extraterrestrial Sample Analysis Methodology
NASA Astrophysics Data System (ADS)
Simionovici, A.; Lemelle, L.; Beck, P.; Fihman, F.; Tucoulou, R.; Kiryukhina, K.; Courtade, F.; Viso, M.
2018-04-01
Our nondestructive, nanometer-scale, hyperspectral analysis methodology, combining X-ray, Raman, and IR probes under BSL4 quarantine, renders our patented mini-sample holder ideal for detecting extraterrestrial life. Our Stardust and Archean results validate it.
Code of Federal Regulations, 2011 CFR
2011-01-01
.... agricultural and rural economy. (2) Administering a methodological research program to improve agricultural... design and data collection methodologies to the agricultural statistics program. Major functions include...) Designing, testing, and establishing survey techniques and standards, including sample design, sample...
Code of Federal Regulations, 2010 CFR
2010-01-01
.... agricultural and rural economy. (2) Administering a methodological research program to improve agricultural... design and data collection methodologies to the agricultural statistics program. Major functions include...) Designing, testing, and establishing survey techniques and standards, including sample design, sample...
Code of Federal Regulations, 2012 CFR
2012-01-01
.... agricultural and rural economy. (2) Administering a methodological research program to improve agricultural... design and data collection methodologies to the agricultural statistics program. Major functions include...) Designing, testing, and establishing survey techniques and standards, including sample design, sample...
Code of Federal Regulations, 2013 CFR
2013-01-01
.... agricultural and rural economy. (2) Administering a methodological research program to improve agricultural... design and data collection methodologies to the agricultural statistics program. Major functions include...) Designing, testing, and establishing survey techniques and standards, including sample design, sample...
Code of Federal Regulations, 2014 CFR
2014-01-01
.... agricultural and rural economy. (2) Administering a methodological research program to improve agricultural... design and data collection methodologies to the agricultural statistics program. Major functions include...) Designing, testing, and establishing survey techniques and standards, including sample design, sample...
NASA Astrophysics Data System (ADS)
Hardiman, Nigel; Dietz, Kristina Charlotte; Bride, Ian; Passfield, Louis
2017-01-01
Land managers of natural areas are under pressure to balance demands for increased recreation access with protection of the natural resource. Unintended dispersal of seeds by visitors to natural areas has high potential for weedy plant invasions, with initial seed attachment an important step in the dispersal process. Although walking and mountain biking are popular nature-based recreation activities, there are few studies quantifying propensity for seed attachment and transport rate on boot soles and none for bike tires. Attachment and transport rate can potentially be affected by a wide range of factors for which field testing can be time-consuming and expensive. We pilot tested a sampling methodology for measuring seed attachment and transport rate in a soil matrix carried on boot soles and bike tires traversing a known quantity and density of a seed analog (beads) over different distances and soil conditions. We found that the percentage attachment rate on boot soles was much lower overall than previously reported, but that boot soles had a higher propensity for seed attachment than bike tires in almost all conditions. We believe our methodology offers a cost-effective option for researchers seeking to manipulate and test the effects of different influencing factors on these two dispersal vectors.
Magalhaes, Sandra; Banwell, Brenda; Bar-Or, Amit; Fortier, Isabel; Hanwell, Heather E; Lim, Ming; Matt, Georg E; Neuteboom, Rinze F; O'Riordan, David L; Schneider, Paul K; Pugliatti, Maura; Shatenstein, Bryna; Tansey, Catherine M; Wassmer, Evangeline; Wolfson, Christina
2018-06-01
While studying the etiology of multiple sclerosis (MS) in children has several methodological advantages over studying etiology in adults, studies are limited by small sample sizes. Using a rigorous methodological process, we developed the Pediatric MS Tool-Kit, a measurement framework that includes a minimal set of core variables to assess etiological risk factors. We solicited input from the International Pediatric MS Study Group to select three risk factors: environmental tobacco smoke (ETS) exposure, sun exposure, and vitamin D intake. To develop the Tool-Kit, we used a Delphi study involving a working group of epidemiologists, neurologists, and content experts from North America and Europe. The Tool-Kit includes six core variables to measure ETS, six to measure sun exposure, and six to measure vitamin D intake. The Tool-Kit can be accessed online (www.maelstrom-research.org/mica/network/tool-kit). The goals of the Tool-Kit are to enhance exposure measurement in newly designed pediatric MS studies and the comparability of results across studies, and in the longer term to facilitate harmonization of studies, a methodological approach that can be used to circumvent issues of small sample sizes. We believe the Tool-Kit will prove to be a valuable resource to guide pediatric MS researchers in developing study-specific questionnaires.
Sorting Olive Batches for the Milling Process Using Image Processing
Puerto, Daniel Aguilera; Martínez Gila, Diego Manuel; Gámez García, Javier; Gómez Ortega, Juan
2015-01-01
The quality of virgin olive oil obtained in the milling process is directly bound to the characteristics of the olives. Hence, the correct classification of the different incoming olive batches is crucial to reach the maximum quality of the oil. The aim of this work is to provide an automatic inspection system, based on computer vision, and to classify automatically different batches of olives entering the milling process. The classification is based on the differentiation between ground and tree olives. For this purpose, three different species have been studied (Picudo, Picual and Hojiblanco). The samples have been obtained by picking the olives directly from the tree or from the ground. The feature vector of the samples has been obtained on the basis of the olive image histograms. Moreover, different image preprocessing has been employed, and two classification techniques have been used: these are discriminant analysis and neural networks. The proposed methodology has been validated successfully, obtaining good classification results. PMID:26147729
Guerra, J G; Rubiano, J G; Winter, G; Guerra, A G; Alonso, H; Arnedo, M A; Tejera, A; Gil, J M; Rodríguez, R; Martel, P; Bolivar, J P
2015-11-01
Determining the activity concentration of a specific radionuclide in a sample by gamma spectrometry requires knowledge of the full energy peak efficiency (FEPE) at the energy of interest. The difficulties related to experimental calibration make it advisable to have alternative methods for FEPE determination, such as simulation of the transport of photons in the crystal by the Monte Carlo method, which requires an accurate knowledge of the characteristics and geometry of the detector. The characterization process is mainly carried out by Canberra Industries Inc. using proprietary techniques and methodologies developed by that company. It is a costly procedure (due to shipping and to the cost of the process itself) and for some research laboratories an alternative in situ procedure can be very useful. The main goal of this paper is to find an alternative to this costly characterization process by establishing a method for optimizing the detector characterization parameters through a computational procedure that can be reproduced at a standard research laboratory. This method consists of determining the detector geometric parameters by using Monte Carlo simulation in parallel with an optimization process, based on evolutionary algorithms, starting from a set of reference FEPEs determined experimentally or computationally. The proposed method has proven to be effective and simple to implement. It provides a set of characterization parameters that has been successfully validated for different source-detector geometries, and also for a wide range of environmental samples and certified materials. Copyright © 2015 Elsevier Ltd. All rights reserved.
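The sketch below conveys the general optimization idea only: search for geometry parameters whose simulated efficiencies reproduce a set of reference FEPEs. A toy parametric efficiency model stands in for the Monte Carlo photon-transport simulation, scipy's differential evolution stands in for the authors' evolutionary algorithm, and all parameter names and reference values are hypothetical.

```python
# Hedged sketch: fit detector geometry parameters to reference FEPEs.
import numpy as np
from scipy.optimize import differential_evolution

energies = np.array([60.0, 122.0, 662.0, 1332.0])           # keV
reference_fepe = np.array([0.027, 0.023, 0.008, 0.003])     # "measured" FEPEs

def simulated_fepe(params, energy):
    # Toy stand-in for a Monte Carlo transport simulation of the detector.
    crystal_radius, dead_layer = params
    return (0.002 * crystal_radius ** 2) * np.exp(-dead_layer * energy / 500.0) \
        / (1.0 + energy / 400.0)

def misfit(params):
    sim = simulated_fepe(params, energies)
    return np.sum(((sim - reference_fepe) / reference_fepe) ** 2)

result = differential_evolution(misfit, bounds=[(1.0, 5.0), (0.0, 1.0)], seed=1)
print("Best-fit geometry parameters:", np.round(result.x, 3))
print("Relative misfit:", round(result.fun, 4))
```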
Ciapała, Szymon; Adamski, Paweł
2015-01-01
Intensification of pedestrian tourism causes damage to trees near tourist tracks and likewise changes the soil structure. As a result, one may expect that the annual increment of trees growing near tracks is significantly lower than that of trees deeper in the forest. However, during the study of the long-term impact of tourism on the environment (determined from tree increment dynamics), some methodological problems may occur. This is particularly important in protected areas, where legal and administrative regulations related to nature conservation force research to be conducted on small samples. In this paper we have analyzed data collected in the Polish part of the Tatra National Park in two study plots, each divided into two zones: the area directly under the influence of tourist trampling and a control zone. The aim of such analyses was to present the potential effects of the factors which may affect the results of dendrochronological analysis: (i) the small size of samples, which affects their representativeness, (ii) spatial differences in the rates of the process, as a result of spatial variability of environmental factors, and (iii) temporal differences in the rates of the process. This study confirms that the factors mentioned above could significantly influence the results and should be taken into consideration during the analysis.
Code of Federal Regulations, 2010 CFR
2010-10-01
... by ACF statistical staff from the Adoption and Foster Care Analysis and Reporting System (AFCARS... primary review utilizing probability sampling methodologies. Usually, the chosen methodology will be simple random sampling, but other probability samples may be utilized, when necessary and appropriate. (3...
Lunven, Catherine; Turpault, Sandrine; Beyer, Yann-Joel; O'Brien, Amy; Delfolie, Astrid; Boyanova, Neli; Sanderink, Ger-Jan; Baldinetti, Francesca
2016-01-01
Background: Teriflunomide, a once-daily oral immunomodulator approved for treatment of relapsing-remitting multiple sclerosis, is eliminated slowly from plasma. If it is necessary to rapidly lower plasma concentrations of teriflunomide, an accelerated elimination procedure using cholestyramine or activated charcoal may be used. The current bioanalytical assay for determination of plasma teriflunomide concentration requires laboratory facilities for blood centrifugation and plasma storage. An alternative method, with potential for greater convenience, is dried blood spot (DBS) methodology. Analytical and clinical validations are required to switch from plasma to DBS (finger-prick sampling) methodology. Methods: Using blood samples from healthy subjects, an LC-MS/MS assay for quantification of teriflunomide in DBS over a range of 0.01–10 mcg/mL was developed and validated for specificity, selectivity, accuracy, precision, reproducibility, and stability. Results were compared with those from the current assay for determination of plasma teriflunomide concentration. Results: The method was specific and selective relative to endogenous compounds, with process efficiency ∼88%, and no matrix effect. Inaccuracy and imprecision for intraday and interday analyses were <15% at all concentrations tested. Quantification of teriflunomide in the DBS assay was not affected by blood deposit volume or punch position within the spot, and hematocrit level had a limited but acceptable effect on measurement accuracy. Teriflunomide was stable for at least 4 months at room temperature, and for at least 24 hours at 37°C with and without 95% relative humidity, to cover sampling, drying, and shipment conditions in the field. The correlation between DBS and plasma concentrations (R2 = 0.97), with an average blood to plasma ratio of 0.59, was concentration independent and constant over time. Conclusions: DBS sampling is a simple and practical method for monitoring teriflunomide concentrations. PMID:27015245
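As a hedged illustration of how the reported blood-to-plasma ratio might be used in practice, the sketch below converts a DBS reading to an estimated plasma concentration by dividing by the average ratio of 0.59. The sample values are hypothetical, and a real workflow would also account for hematocrit within the validated range.

```python
# Minimal sketch: DBS concentration -> estimated plasma concentration.
def dbs_to_plasma(dbs_conc_mcg_ml, blood_to_plasma_ratio=0.59):
    """Estimate plasma concentration (mcg/mL) from a DBS measurement,
    assuming the reported average blood-to-plasma ratio applies."""
    return dbs_conc_mcg_ml / blood_to_plasma_ratio

for dbs in (0.05, 1.2, 8.5):       # within the validated 0.01-10 mcg/mL range
    print(f"DBS {dbs:5.2f} mcg/mL  ->  plasma ~ {dbs_to_plasma(dbs):5.2f} mcg/mL")
```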
NASA Astrophysics Data System (ADS)
Brodic, D.
2011-01-01
Text line segmentation represents the key element in the optical character recognition process. Hence, testing of text line segmentation algorithms has substantial relevance. All previously proposed testing methods deal mainly with a text database as a template, used both for testing and for the evaluation of the text segmentation algorithm. In this manuscript, a methodology for the evaluation of text segmentation algorithms based on extended binary classification is proposed. It is established on various multiline text samples linked with text segmentation, whose results are distributed according to binary classification. The final result is obtained by comparative analysis of the cross-linked data. Its suitability for different types of scripts represents its main advantage.
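A hedged sketch of the general idea of scoring a segmentation result with binary classification counts follows; it illustrates standard precision/recall/F-measure arithmetic on hypothetical counts, not the specific extended classification and cross-linked analysis proposed in the paper.

```python
# Minimal sketch: binary-classification scores for a text line segmentation result.
def segmentation_scores(true_positive, false_positive, false_negative):
    precision = true_positive / (true_positive + false_positive)
    recall = true_positive / (true_positive + false_negative)
    f_measure = 2 * precision * recall / (precision + recall)
    return precision, recall, f_measure

# Hypothetical counts for one multiline text sample:
# correctly segmented lines vs. spurious and missed/merged lines.
p, r, f = segmentation_scores(true_positive=46, false_positive=3, false_negative=5)
print(f"precision={p:.2f}  recall={r:.2f}  F-measure={f:.2f}")
```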
Advances in bioanalytical techniques to measure steroid hormones in serum.
French, Deborah
2016-06-01
Steroid hormones are measured clinically to determine whether a patient has a pathological process occurring in the adrenal gland or other hormone-responsive organs. They are very similar in structure, making them analytically challenging to measure. Additionally, these hormones have vast concentration differences in human serum, adding to the measurement complexity. GC-MS was the gold standard methodology used to measure steroid hormones clinically, followed by radioimmunoassay, which in turn was replaced by immunoassay due to ease of use. LC-MS/MS has now become a popular alternative owing to simpler sample preparation than GC-MS and increased specificity and sensitivity over immunoassay. This review will discuss these methodologies and some new developments that could simplify and improve steroid hormone analysis in serum.
Oliver, Penelope; Cicerale, Sara; Pang, Edwin; Keast, Russell
2018-04-01
Temporal dominance of sensations (TDS) is a rapid descriptive method that offers a different magnitude of information to traditional descriptive analysis methodologies. This methodology considers the dynamic nature of eating, assessing sensory perception of foods as it changes throughout the eating event. Limited research has applied the TDS methodology to strawberries and subsequently validated the results against Quantitative Descriptive Analysis (QDA™). The aim of this research is to compare the TDS methodology using an untrained consumer panel with the results obtained via QDA™ with a trained sensory panel. The trained panelists (n = 12, minimum 60 hr each panelist) were provided with six strawberry samples (three cultivars at two maturation levels) and applied QDA™ techniques to profile each strawberry sample. Untrained consumers (n = 103) were provided with six strawberry samples (three cultivars at two maturation levels) and required to use the TDS methodology to assess the dominant sensations for each sample as they change over time. Results revealed moderately comparable product configurations produced via TDS in comparison to QDA™ (RV coefficient = 0.559), as well as similar application of the sweet attribute (correlation coefficient of 0.895 at first bite). The TDS methodology, however, was not in agreement with the QDA™ methodology regarding more complex flavor terms. These findings support the notion that the lack of training on the definition of terms, together with the methodology's requirement to ignore all attributes other than the dominant one, provides a different magnitude of information than the QDA™ methodology. A comparison of TDS to traditional descriptive analysis indicates that TDS provides additional information to QDA™ regarding the lingering component of eating. The QDA™ results, however, provide more precise detail regarding singular attributes. Therefore, the TDS methodology has an application in industry when it is important to understand the lingering profile of products. However, this methodology should not be employed as a replacement for traditional descriptive analysis methods. © 2018 Institute of Food Technologists®.
The impact of temporal sampling resolution on parameter inference for biological transport models.
Harrison, Jonathan U; Baker, Ruth E
2018-06-25
Imaging data has become an essential tool to explore key biological questions at various scales, for example the motile behaviour of bacteria or the transport of mRNA, and it has the potential to transform our understanding of important transport mechanisms. Often these imaging studies require us to compare biological species or mutants, and to do this we need to quantitatively characterise their behaviour. Mathematical models offer a quantitative description of a system that enables us to perform this comparison, but to relate mechanistic mathematical models to imaging data, we need to estimate their parameters. In this work we study how collecting data at different temporal resolutions impacts our ability to infer parameters of biological transport models; performing exact inference for simple velocity jump process models in a Bayesian framework. The question of how best to choose the frequency with which data is collected is prominent in a host of studies because the majority of imaging technologies place constraints on the frequency with which images can be taken, and the discrete nature of observations can introduce errors into parameter estimates. In this work, we mitigate such errors by formulating the velocity jump process model within a hidden states framework. This allows us to obtain estimates of the reorientation rate and noise amplitude for noisy observations of a simple velocity jump process. We demonstrate the sensitivity of these estimates to temporal variations in the sampling resolution and extent of measurement noise. We use our methodology to provide experimental guidelines for researchers aiming to characterise motile behaviour that can be described by a velocity jump process. In particular, we consider how experimental constraints resulting in a trade-off between temporal sampling resolution and observation noise may affect parameter estimates. Finally, we demonstrate the robustness of our methodology to model misspecification, and then apply our inference framework to a dataset that was generated with the aim of understanding the localization of RNA-protein complexes.
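The toy sketch below illustrates the theme of the study, how temporal sampling resolution affects a naive estimate of switching behaviour in a one-dimensional velocity jump process. It is only an illustration of the sensitivity to sampling frequency, not the Bayesian hidden-states inference developed in the paper; the rate, speed, and observation intervals are arbitrary choices.

```python
# Hedged sketch: simulate a 1-D velocity jump process and estimate the rate of
# observed direction switches at different temporal sampling resolutions.
import numpy as np

rng = np.random.default_rng(42)
lam, t_end = 1.0, 200.0                    # true reorientation rate (1/s), duration

# Exact simulation: exponential waiting times between reorientation events,
# with the direction resampled uniformly from {-1, +1} at each event.
times, directions = [0.0], [rng.choice([-1, 1])]
while times[-1] < t_end:
    times.append(times[-1] + rng.exponential(1.0 / lam))
    directions.append(rng.choice([-1, 1]))

def naive_switch_rate(dt):
    """Rate of observed direction switches when sampling every dt seconds."""
    obs_t = np.arange(0.0, t_end, dt)
    idx = np.searchsorted(times, obs_t, side="right") - 1
    obs_dir = np.array(directions)[idx]
    switches = np.sum(obs_dir[1:] != obs_dir[:-1])
    return switches / (len(obs_t) * dt)

for dt in (0.1, 1.0, 5.0):
    print(f"dt = {dt:4.1f} s  ->  observed switch rate ~ {naive_switch_rate(dt):.2f} /s")
```

Coarser sampling misses reorientation events that cancel each other out between observations, which is one way the discreteness of observations biases simple estimators and motivates the hidden-states formulation.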
Kent, D J; Chauhan, K; Boor, K J; Wiedmann, M; Martin, N H
2016-07-01
United States dairy industry exports have steadily risen in importance over the last 10yr, with dairy powders playing a particularly critical role. Currently, approximately half of US-produced nonfat dry milk and skim milk powder is exported. Reaching new and expanding existing export markets relies in part on the control of endospore-forming bacteria in dairy powders. This study reports baseline mesophilic and thermophilic spore counts and spore populations from 55 raw material samples (primarily raw milk) and 33 dairy powder samples from dairy powder processors across the United States. Samples were evaluated using various spore testing methodologies and included initial heat treatments of (1) 80°C for 12 min; (2) 100°C for 30 min; and (3) 106°C for 30 min. Results indicate that significant differences in both the level and population of spores were found for both raw milk and dairy powders with the various testing methods. Additionally, on average, spore counts were not found to increase significantly from the beginning to the end of dairy powder processing, most likely related to the absence of biofilm formation by processing plant-associated sporeformers (e.g., Anoxybacillus sp.) in the facilities sampled. Finally, in agreement with other studies, Bacillus licheniformis was found to be the most prevalent sporeformer in both raw materials and dairy powders, highlighting the importance of this organism in developing strategies for control and reduction of spore counts in dairy powders. Overall, this study emphasizes the need for standardization of spore enumeration methodologies in the dairy powder industry. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
How much is enough? An analysis of CD measurement amount for mask characterization
NASA Astrophysics Data System (ADS)
Ullrich, Albrecht; Richter, Jan
2009-10-01
The demands on CD (critical dimension) metrology, in terms of measurement amount as well as reproducibility and measurement uncertainty, increase steadily from node to node. Different mask characterization requirements have to be addressed, such as very small features, unevenly distributed features, contacts, and semi-dense structures, to name only a few. Usually this enhanced need is met by an increasing number of CD measurements, where the new CD requirements are added to the well-established CD characterization recipe. This leads straightforwardly to prolonged cycle times and highly complex evaluation routines. At the same time, mask processes are continuously improved to become more stable. The enhanced stability offers potential to actually reduce the number of measurements. Thus, in this work we start to address the fundamental question of how many CD measurements are needed for mask characterization at a given confidence level. We used analysis of variances (ANOVA) to distinguish various contributors such as the mask making process, measurement tool stability and measurement methodology. These contributions have been investigated for classical photomask CD specifications, e.g., mean to target, CD uniformity, target offset tolerance and x-y bias. We found that, depending on the specification, the relative importance of the contributors changes. Interestingly, short- and long-term metrology contributions are not the only dominant ones. The number of measurements and their spatial distribution on the mask layout (sampling methodology) can also be the most important part of the variance. The knowledge of these contributions can be used to optimize the sampling plan. As a major finding, we conclude that there is potential to reduce a significant number of measurements without losing confidence at all. Here, full sampling in x and y as well as full sampling for different features can be shortened substantially, by almost up to 50%.
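As a hedged illustration of the kind of variance decomposition ANOVA provides here, the sketch below splits hypothetical CD measurements into a between-mask (process) component and a within-mask (metrology/sampling) component using standard one-way random-effects formulas. The paper's full analysis separates more contributors (short- and long-term tool stability, layout sampling), which this toy example does not attempt.

```python
# Minimal sketch: one-way random-effects variance components for CD data.
import numpy as np

# rows = masks, columns = repeated CD measurements (nm), hypothetical values
cd = np.array([
    [100.2, 100.5, 100.1, 100.4],
    [101.1, 101.3, 100.9, 101.2],
    [ 99.6,  99.9,  99.7,  99.8],
])
k, n = cd.shape                                  # groups, replicates per group

grand_mean = cd.mean()
ms_between = n * np.sum((cd.mean(axis=1) - grand_mean) ** 2) / (k - 1)
ms_within = np.sum((cd - cd.mean(axis=1, keepdims=True)) ** 2) / (k * (n - 1))

var_process = max((ms_between - ms_within) / n, 0.0)   # between-mask component
var_metrology = ms_within                              # within-mask component

print(f"process variance ~ {var_process:.4f} nm^2, "
      f"metrology/sampling variance ~ {var_metrology:.4f} nm^2")
```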
Why minimally invasive skin sampling techniques? A bright scientific future.
Wang, Christina Y; Maibach, Howard I
2011-03-01
There is increasing interest in minimally invasive skin sampling techniques to assay markers of molecular biology and biochemical processes. This overview examines methodology strengths and limitations, and exciting developments pending in the scientific community. Publications were searched via PubMed, the U.S. Patent and Trademark Office Website, the DermTech Website and the CuDerm Website. The keywords used were noninvasive skin sampling, skin stripping, skin taping, detergent method, ring method, mechanical scrub, reverse iontophoresis, glucose monitoring, buccal smear, hair root sampling, mRNA, DNA, RNA, and amino acid. There is strong interest in finding methods to access internal biochemical, molecular, and genetic processes through noninvasive and minimally invasive external means. Minimally invasive techniques include the widely used skin tape stripping, the abrasion method that includes scraping and detergent, and reverse iontophoresis. The first 2 methods harvest largely the stratum corneum. Hair root sampling (material deeper than the epidermis), buccal smear, shave biopsy, punch biopsy, and suction blistering are also methods used to obtain cellular material for analysis, but involve some degree of increased invasiveness and thus are only briefly mentioned. Existing and new sampling methods are being refined and validated, offering exciting, different noninvasive means of quickly and efficiently obtaining molecular material with which to monitor bodily functions and responses, assess drug levels, and follow disease processes without subjecting patients to unnecessary discomfort and risk.
Alvarez, D.A.; Stackelberg, P.E.; Petty, J.D.; Huckins, J.N.; Furlong, E.T.; Zaugg, S.D.; Meyer, M.T.
2005-01-01
Four water samples collected using standard depth and width water-column sampling methodology were compared to an innovative passive, in situ sampler (the polar organic chemical integrative sampler, or POCIS) for the detection of 96 organic wastewater-related contaminants (OWCs) in a stream that receives agricultural, municipal, and industrial wastewaters. Thirty-two OWCs were identified in POCIS extracts, whereas 9-24 were identified in individual water-column samples, demonstrating the utility of POCIS for identifying contaminants whose occurrence is transient or whose concentrations are below routine analytical detection limits. Overall, 10 OWCs were identified exclusively in the POCIS extracts and only six solely in the water-column samples; moreover, repetitive water samples taken with the standard method during the POCIS deployment period required multiple trips to the sampling site and an increased number of samples to store, process, and analyze. Given the greater number of OWCs detected in the POCIS extracts compared with individual water-column samples, the ease of performing a single deployment rather than collecting and processing multiple water samples, the greater mass of chemical residues sequestered, and the ability to detect chemicals that dissipate quickly, the passive sampling technique offers an efficient and effective alternative for detecting wastewater-related contaminants in our waterways.
Intelligent systems/software engineering methodology - A process to manage cost and risk
NASA Technical Reports Server (NTRS)
Friedlander, Carl; Lehrer, Nancy
1991-01-01
A systems development methodology is discussed that has been successfully applied to the construction of a number of intelligent systems. This methodology is a refinement of both evolutionary and spiral development methodologies. It is appropriate for development of intelligent systems. The application of advanced engineering methodology to the development of software products and intelligent systems is an important step toward supporting the transition of AI technology into aerospace applications. A description of the methodology and the process model from which it derives is given. Associated documents and tools are described which are used to manage the development process and record and report the emerging design.
Hayashi, Shigehiko; Uchida, Yoshihiro; Hasegawa, Taisuke; Higashi, Masahiro; Kosugi, Takahiro; Kamiya, Motoshi
2017-05-05
Many remarkable molecular functions of proteins use their characteristic global and slow conformational dynamics through coupling of local chemical states in reaction centers with global conformational changes of proteins. To theoretically examine the functional processes of proteins in atomic detail, a methodology of quantum mechanical/molecular mechanical (QM/MM) free-energy geometry optimization is introduced. In the methodology, a geometry optimization of a local reaction center is performed with a quantum mechanical calculation on a free-energy surface constructed with conformational samples of the surrounding protein environment obtained by a molecular dynamics simulation with a molecular mechanics force field. Geometry optimizations on extensive free-energy surfaces by a QM/MM reweighting free-energy self-consistent field method designed to be variationally consistent and computationally efficient have enabled examinations of the multiscale molecular coupling of local chemical states with global protein conformational changes in functional processes and analysis and design of protein mutants with novel functional properties.
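As a rough illustration of the idea of optimizing a reaction-centre geometry on an ensemble-averaged (free-energy) surface, here is a one-dimensional toy sketch; it is a caricature under simplifying assumptions (a fixed environment ensemble and a hand-made coupling energy), not the QM/MM reweighting free-energy self-consistent field implementation described in the abstract.

```python
# Toy caricature: steepest descent of a "reaction-centre" coordinate q on a
# surface obtained by averaging a coupling energy over sampled environment
# configurations, mimicking MD sampling of the MM region.
import numpy as np

rng = np.random.default_rng(1)
x_env = rng.normal(0.0, 1.0, 5000)          # stand-in for the MM conformational ensemble

def mean_gradient(q, x):
    # toy coupling energy E(q, x) = (q - 1)^2 + 0.3 * q * x; ensemble-averaged dE/dq
    return np.mean(2.0 * (q - 1.0) + 0.3 * x)

q = 0.0                                     # "reaction-centre" coordinate
for _ in range(200):                        # steepest-descent geometry optimization
    q -= 0.05 * mean_gradient(q, x_env)
print(f"optimized reaction-centre coordinate q* ~ {q:.3f}")
```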
Photothermal heating as a methodology for post processing of polymeric nanofibers
NASA Astrophysics Data System (ADS)
Gorga, Russell; Clarke, Laura; Bochinski, Jason; Viswanath, Vidya; Maity, Somsubhra; Dong, Ju; Firestone, Gabriel
2015-03-01
Metal nanoparticles embedded within polymeric systems can be made to act as localized heat sources thereby aiding in-situ polymer processing. This is made possible by the surface plasmon resonance (SPR) mediated photothermal effect of metal (in this case gold) nanoparticles, wherein incident light absorbed by the nanoparticle generates a non-equilibrium electron distribution which subsequently transfers this energy into the surrounding medium, resulting in a temperature increase in the immediate region around the particle. Here we demonstrate this effect in polymer nanocomposite systems, specifically electrospun polyethylene oxide nanofibrous mats, which have been annealed at temperatures above the glass transition. A non-contact temperature measurement technique utilizing embedded fluorophores (perylene) has been used to monitor the average temperature within samples. The effect of annealing methods (conventional and photothermal) and annealing conditions (temperature and time) on the fiber morphology, overall crystallinity, and mechanical properties is discussed. This methodology is further utilized in core-sheath nanofibers to crosslink the core material, which is a pre-cured epoxy thermoset. NSF Grant CMMI-1069108.
Cerqueira, Marcos Rodrigues Facchini; Grasseschi, Daniel; Matos, Renato Camargo; Angnes, Lucio
2014-08-01
Different materials such as glass, silicon and poly(methyl methacrylate) (PMMA) are being used to immobilise enzymes in microchannels. PMMA shows advantages such as its low price, biocompatibility and attractive mechanical and chemical properties. Despite this, the introduction of reactive functional groups on PMMA is still problematic, either because of the complex chemistry or the extended reaction time involved. In this paper, a new methodology was developed to immobilise glucose oxidase (GOx) in PMMA microchannels, with the benefit of a rapid immobilisation process and a very simple route. The new procedure involves only two steps, based on the reaction of 5.0% (w/w) polyethyleneimine (PEI) with PMMA in a dimethyl sulphoxide medium, followed by the immobilisation of glucose oxidase using a solution containing 100 U of enzyme and 1.0% (v/v) glutaraldehyde. The reactors prepared in this way were evaluated in a flow system with amperometric detection (+0.60 V) based on the oxidation of the H2O2 produced by the reactor. The microreactor proposed here was able to work with high bioconversion and a frequency of 60 samples h(-1), with detection and quantification limits of 0.50 and 1.66 µmol L(-1), respectively. Michaelis-Menten parameters (Vmax and KM) were calculated as 449 ± 47.7 nmol min(-1) and 7.79 ± 0.98 mmol. Statistical evaluations were done to validate the proposed methodology. The glucose content of natural and commercial coconut water samples was evaluated using the developed method. Comparison with spectrophotometric measurements showed that both methodologies have a very good correlation (tcalculated, 0.05, 4 = 1.35
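For readers unfamiliar with how Vmax and KM values such as those reported above are typically obtained, the sketch below fits the Michaelis-Menten equation to initial-rate data by nonlinear least squares; the substrate concentrations and rates are synthetic placeholders, not the authors' measurements.

```python
# Hedged sketch: nonlinear least-squares fit of v = Vmax * S / (KM + S).
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(S, Vmax, KM):
    return Vmax * S / (KM + S)

S = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0, 40.0])            # mmol L-1 (synthetic)
v = np.array([25.0, 45.0, 90.0, 175.0, 250.0, 330.0, 380.0])    # nmol min-1 (synthetic)

popt, pcov = curve_fit(michaelis_menten, S, v, p0=[400.0, 8.0])
perr = np.sqrt(np.diag(pcov))
print(f"Vmax = {popt[0]:.1f} +/- {perr[0]:.1f} nmol min-1, "
      f"KM = {popt[1]:.2f} +/- {perr[1]:.2f} mmol L-1")
```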
Focus: a robust workflow for one-dimensional NMR spectral analysis.
Alonso, Arnald; Rodríguez, Miguel A; Vinaixa, Maria; Tortosa, Raül; Correig, Xavier; Julià, Antonio; Marsal, Sara
2014-01-21
One-dimensional (1)H NMR represents one of the most commonly used analytical techniques in metabolomic studies. The increase in the number of samples analyzed as well as the technical improvements involving instrumentation and spectral acquisition demand increasingly accurate and efficient high-throughput data processing workflows. We present FOCUS, an integrated and innovative methodology that provides a complete data analysis workflow for one-dimensional NMR-based metabolomics. This tool will allow users to easily obtain a NMR peak feature matrix ready for chemometric analysis as well as metabolite identification scores for each peak that greatly simplify the biological interpretation of the results. The algorithm development has been focused on solving the critical difficulties that appear at each data processing step and that can dramatically affect the quality of the results. As well as method integration, simplicity has been one of the main objectives in FOCUS development, requiring very little user input to perform accurate peak alignment, peak picking, and metabolite identification. The new spectral alignment algorithm, RUNAS, allows peak alignment with no need of a reference spectrum, and therefore, it reduces the bias introduced by other alignment approaches. Spectral alignment has been tested against previous methodologies obtaining substantial improvements in the case of moderate or highly unaligned spectra. Metabolite identification has also been significantly improved, using the positional and correlation peak patterns in contrast to a reference metabolite panel. Furthermore, the complete workflow has been tested using NMR data sets from 60 human urine samples and 120 aqueous liver extracts, reaching a successful identification of 42 metabolites from the two data sets. The open-source software implementation of this methodology is available at http://www.urr.cat/FOCUS.
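The sketch below illustrates only the elementary operation that spectral-alignment workflows build on: shifting one spectral segment to maximize its cross-correlation with another. It is not the RUNAS algorithm (which is reference-free); the signal shapes and shift range are assumptions.

```python
# Illustrative sketch only: integer-shift alignment of one spectral segment to another.
import numpy as np

def best_shift(segment, target, max_shift=50):
    shifts = list(range(-max_shift, max_shift + 1))
    scores = [np.dot(np.roll(segment, s), target) for s in shifts]
    return shifts[int(np.argmax(scores))]

ppm = np.linspace(0, 1, 2000)
def peak(centre):                                 # synthetic Gaussian-like NMR peak
    return np.exp(-((ppm - centre) ** 2) / (2 * 0.005 ** 2))

target = peak(0.50)                               # reference-like segment
segment = peak(0.52)                              # same peak, slightly shifted
s = best_shift(segment, target)
aligned = np.roll(segment, s)
print(f"applied shift: {s} points; residual misalignment: "
      f"{abs(int(np.argmax(aligned)) - int(np.argmax(target)))} points")
```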
Uncertainty assessment method for the Cs-137 fallout inventory and penetration depth.
Papadakos, G N; Karangelos, D J; Petropoulos, N P; Anagnostakis, M J; Hinis, E P; Simopoulos, S E
2017-05-01
Within the presented study, soil samples were collected in 2007 at 20 different locations of the Greek terrain, both from the surface and from depths down to 26 cm. Sampling locations were selected primarily from areas where high levels of 137Cs deposition after the Chernobyl accident had already been identified by the Nuclear Engineering Laboratory of the National Technical University of Athens during and after 1986. At one location of relatively higher deposition, soil core samples were collected following a 60 m by 60 m Cartesian grid with a 20 m node-to-node distance. Single or paired core samples were also collected from the remaining 19 locations. Sample measurements and analysis were used to estimate the 137Cs inventory and the corresponding depth migration twenty years after the deposition on Greek terrain. Based on these data, the uncertainty components of the whole sampling-to-results procedure were investigated. A cause-and-effect assessment was used to apply the law of error propagation and demonstrate that the dominant component of the combined uncertainty is that due to the spatial variability of the contemporary (2007) 137Cs inventory. A secondary, yet still significant, component was identified as the activity measurement process itself. Less significant uncertainty parameters were the sampling methods, the variation of the soil field density with depth and the preparation of samples for measurement. The sampling-grid experiment allowed a quantitative evaluation of the uncertainty due to spatial variability, aided by semivariance analysis. A denser, optimized grid could return more accurate values for this component, but at a significantly higher laboratory cost in terms of both human and material resources. Using the collected data, and for the case of a single-core soil sampling performed under a well-defined, quality-assured sampling methodology, the uncertainty component due to spatial variability was evaluated at about 19% for the 137Cs inventory and up to 34% for the 137Cs penetration depth. Based on the presented results and on related literature, it is argued that such high uncertainties should be anticipated for single-core samplings conducted with a similar methodology and employed as estimators of 137Cs inventory and penetration depth. Copyright © 2017 Elsevier Ltd. All rights reserved.
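The quadrature combination behind a cause-and-effect (propagation-of-uncertainty) budget can be sketched as follows. Only the roughly 19% spatial-variability figure comes from the text; the other component magnitudes are illustrative placeholders.

```python
# Hedged sketch: combining independent relative uncertainty components in quadrature.
import math

components = {
    "spatial variability": 0.19,       # reported for the 137Cs inventory
    "activity measurement": 0.05,      # placeholder
    "sampling / field density": 0.03,  # placeholder
    "sample preparation": 0.02,        # placeholder
}
u_combined = math.sqrt(sum(u ** 2 for u in components.values()))
print(f"combined relative standard uncertainty ~ {100 * u_combined:.1f}%")
```

Because the components add in quadrature, the largest term (here the spatial variability) dominates the combined result, which is why reducing the smaller laboratory terms buys little.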
NASA Astrophysics Data System (ADS)
Hafizzal, Y.; Nurulhuda, A.; Izman, S.; Khadir, AZA
2017-08-01
POM-copolymer bond breaking leads to changes that depend on the processing methodology and the material geometry. This paper presents the effect of different geometries and processing methodologies on material integrity. Thermo-analytical methods were used to examine thermomechanical degradation: Thermogravimetric Analysis (TGA) was used to judge the thermal stability of each sample from its major decomposition temperature, and a Differential Scanning Calorimetry (DSC) investigation was performed to identify the thermal behaviour and thermal properties of the materials. The results show that plastic gear geometries injection molded on a higher-tonnage machine are more thermally stable than the resin geometry. Plastic gear geometries injection molded on a low-tonnage machine showed major decomposition temperatures at 313.61 °C, 305.76 °C and 307.91 °C, while the higher-tonnage samples were fully decomposed only at 890 °C, significantly higher than the low-tonnage condition and the resin geometry specimen at 398 °C. The chemical compositions of the plastic gear geometries injection molded at higher and lower tonnage were compared on the basis of their moisture and volatile organic compound (VOC) content, polymeric material content and the absence of filler. Moisture and VOC contents are reported for the resin geometry (0.120%) and for the higher-tonnage injection-molded gear geometry (1.264%). The higher-tonnage injection-molded gear geometry is less sensitive to thermo-mechanical degradation owing to the polymer chain length and molecular weight underlying material properties such as tensile strength, flexural strength, fatigue strength and creep resistance.
Interdisciplinary team processes within an in-home service delivery organization.
Gantert, Thomas W; McWilliam, Carol L
2004-01-01
Interdisciplinary teamwork is particularly difficult to achieve in the community context where geographical separateness and solo practices impede face to face contact and collaborative practice. Understanding the processes that occur within interdisciplinary teams is imperative, since client outcomes are influenced by interdisciplinary teamwork. The purpose of this exploratory study was to describe the processes that occur within interdisciplinary teams that deliver in-home care. Applying grounded theory methodology, the researcher conducted unstructured in-depth interviews with a purposeful sample of healthcare providers and used constant comparative analysis to elicit the findings. Findings revealed three key team processes: networking, navigating, and aligning. The descriptions afford several insights that are applicable to in-home healthcare agencies attempting to achieve effective interdisciplinary team functioning.
Processing and Characterization of Porous Ti2AlC with Controlled Porosity and Pore Size
2012-09-11
Samples pressureless-sintered without NaCl pore former, or fabricated by spark plasma sintering (SPS), were also characterized, and the effects of porosity and/or pore size on the room-temperature elastic moduli were examined for these as well as for several samples sintered using SPS. Furthermore, we demonstrate that the developed methodology can be implemented
Measuring Substance Use and Misuse via Survey Research: Unfinished Business.
Johnson, Timothy P
2015-01-01
This article reviews unfinished business regarding the assessment of substance use behaviors by using survey research methodologies, a practice that dates back to the earliest years of this journal's publication. Six classes of unfinished business are considered, including errors of sampling, coverage, non-response, measurement, processing, and ethics. It may be that there is more now that we do not know than when this work began some 50 years ago.
A call to improve sampling methodology and reporting in young novice driver research.
Scott-Parker, B; Senserrick, T
2017-02-01
Young drivers continue to be over-represented in road crash fatalities despite a multitude of research, communication and intervention. Evidence-based improvement depends to a great extent upon research methodology quality and its reporting, with known limitations in the peer-review process. The aim of the current research was to review the scope of research methodologies applied in 'young driver' and 'teen driver' research and their reporting in four peer-review journals in the field between January 2006 and December 2013. In total, 806 articles were identified and assessed. Reporting omissions included participant gender (11% of papers), response rates (49%), retention rates (39%) and information regarding incentives (44%). Greater breadth and specific improvements in study designs and reporting are thereby identified as a means to further advance the field. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Ramírez-Silva, Ivonne; Jiménez-Aguilar, Alejandra; Valenzuela-Bravo, Danae; Martinez-Tapia, Brenda; Rodríguez-Ramírez, Sonia; Gaona-Pineda, Elsa Berenice; Angulo-Estrada, Salomón; Shamah-Levy, Teresa
2016-01-01
To describe the methodology used to clean up and estimate dietary intake (DI) data from the Semi-Quantitative Food Frequency Questionnaire (SFFQ) of the Mexican National Health and Nutrition Survey 2012. DI was collected through a short-term SFFQ covering 140 foods (from October 2011 to May 2012). Energy and nutrient intake was calculated according to a nutrient database constructed specifically for the SFFQ. A total of 133 nutrients, including energy and fiber, were generated from SFFQ data. Between 4.8 and 9.6% of the survey sample was excluded as a result of the cleaning process. Valid DI data were obtained regarding energy and nutrients consumed by 1 212 preschool children, 1 323 school children, 1 961 adolescents, 2 027 adults and 526 older adults. We documented the methodology used to clean up and estimate DI from the SFFQ used in national dietary assessments in Mexico.
Effect of Temperature, Time, and Material Thickness on the Dehydration Process of Tomato
Correia, A. F. K.; Loro, A. C.; Zanatta, S.; Spoto, M. H. F.; Vieira, T. M. F. S.
2015-01-01
This study aimed to evaluate the effects of temperature, time, and sample thickness on tomato fruits during an adiabatic drying process. Dehydration, a simple and inexpensive process compared with other conservation methods, is widely used in the food industry to ensure a long shelf life for the product owing to the low water activity. The goal was to obtain the best processing conditions to avoid losses and maintain product quality. Factorial design and response surface methodology were applied to fit predictive mathematical models. In the dehydration of tomatoes through the adiabatic process, temperature, time, and sample thickness, which greatly contribute to the physicochemical and sensory characteristics of the final product, were evaluated. The optimum drying conditions were 60°C with the lowest thickness level and the shortest time. PMID:26904666
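As a sketch of the model-fitting step that factorial design and response surface methodology rely on, the code below fits a generic second-order polynomial in temperature, time and thickness by least squares; the design points, factor ranges and responses are synthetic placeholders, not the study's data.

```python
# Hedged sketch: second-order (quadratic) response-surface model fit.
import numpy as np

def quad_design_matrix(X):                      # X columns: temperature, time, thickness
    t, m, h = X[:, 0], X[:, 1], X[:, 2]
    return np.column_stack([np.ones(len(X)), t, m, h,
                            t * m, t * h, m * h, t ** 2, m ** 2, h ** 2])

rng = np.random.default_rng(2)
X = rng.uniform([40, 60, 2], [70, 240, 8], size=(20, 3))                 # assumed factor ranges
y = 5 + 0.1 * X[:, 0] - 0.01 * X[:, 1] - 0.2 * X[:, 2] + rng.normal(0, 0.2, 20)  # synthetic response

beta, *_ = np.linalg.lstsq(quad_design_matrix(X), y, rcond=None)
print("fitted model coefficients:", np.round(beta, 3))
```

The fitted surface can then be searched for the factor combination that optimizes the response, which is how optimum drying conditions are typically selected in RSM studies.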
Mihanović, Frane; Jerković, Ivan; Kružić, Ivana; Anđelinović, Šimun; Janković, Stipan; Bašić, Željana
2017-09-01
In the identification of historical figures, and especially in cases of saints' bodies or mummified remains, any method that involves physical encroachment or sampling is often not allowed. In these cases, one of the few remaining possibilities is the application of nondestructive radiographical and anthropological methods. However, although there have been a few attempts at such analyses, no systematic standard methodology has been developed until now. In this study, we developed a methodological approach that was used to test the authenticity of the alleged body of Saint Paul the Confessor. After imaging the remains on MSCT and post-processing, the images were analyzed by an interdisciplinary team to explore the contents beneath the binding media (e.g., the remains) and to obtain osteobiographical data for comparison with historical biological data. The obtained results (ancestry, sex, age, occupation, and social status) were consistent with the historical data. Although the methodological approach proved appropriate in this case, identity could not be fully confirmed owing to the discrepancy in the amount of data. Nonetheless, the hypothesis that the remains do not belong to St. Paul was rejected, while positive identification receives support. Anat Rec, 300:1535-1546, 2017. © 2017 Wiley Periodicals, Inc.
Novel methodology to isolate microplastics from vegetal-rich samples.
Herrera, Alicia; Garrido-Amador, Paloma; Martínez, Ico; Samper, María Dolores; López-Martínez, Juan; Gómez, May; Packard, Theodore T
2018-04-01
Microplastics are small plastic particles, globally distributed throughout the oceans. To study them properly, all the methodologies for their sampling, extraction, and measurement should be standardized. For heterogeneous samples containing sediments, animal tissues and zooplankton, several procedures have been described. However, definitive methodologies for samples rich in algae and plant material have not yet been developed. The aim of this study was to find the best extraction protocol for vegetal-rich samples by comparing the efficacies of five previously described digestion methods and a novel density separation method. A protocol using 96% ethanol for density separation performed better than the five digestion methods tested, even better than H2O2 digestion. As it was the most efficient, simple, safe and inexpensive method for isolating microplastics from vegetal-rich samples, we recommend it as a standard separation method. Copyright © 2018 Elsevier Ltd. All rights reserved.
Automation in high-content flow cytometry screening.
Naumann, U; Wand, M P
2009-09-01
High-content flow cytometric screening (FC-HCS) is a 21st Century technology that combines robotic fluid handling, flow cytometric instrumentation, and bioinformatics software, so that relatively large numbers of flow cytometric samples can be processed and analysed in a short period of time. We revisit a recent application of FC-HCS to the problem of cellular signature definition for acute graft-versus-host-disease. Our focus is on automation of the data processing steps using recent advances in statistical methodology. We demonstrate that effective results, on par with those obtained via manual processing, can be achieved using our automatic techniques. Such automation of FC-HCS has the potential to drastically improve diagnosis and biomarker identification.
Microstructures and Mechanical Properties of Two-Phase Alloys Based on NbCr(2)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cady, C.M.; Chen, K.C.; Kotula, P.G.
A two-phase Nb-Cr-Ti alloy (bcc + C15 Laves phase) has been developed using several alloy design methodologies. In an effort to understand processing-microstructure-property relationships, different processing routes were employed. The resulting microstructures and mechanical properties are discussed and compared. Plasma arc-melted samples served to establish baseline, as-cast properties. In addition, a novel processing technique, involving decomposition of a supersaturated and metastable precursor phase during hot isostatic pressing (HIP), was used to produce a refined, equilibrium two-phase microstructure. Quasi-static compression tests as a function of temperature were performed on both alloy types. Different deformation mechanisms were encountered depending on temperature and microstructure.
ERIC Educational Resources Information Center
Lauckner, Heidi; Paterson, Margo; Krupa, Terry
2012-01-01
Often, research projects are presented as final products with the methodologies cleanly outlined and little attention paid to the decision-making processes that led to the chosen approach. Limited attention paid to these decision-making processes perpetuates a sense of mystery about qualitative approaches, particularly for new researchers who will…
Grisales, Jaiver Osorio; Arancibia, Juan A; Castells, Cecilia B; Olivieri, Alejandro C
2012-12-01
In this report, we demonstrate how chiral liquid chromatography combined with multivariate chemometric techniques, specifically unfolded-partial least-squares regression (U-PLS), provides a powerful analytical methodology. Using U-PLS, strongly overlapped enantiomer profiles in a sample could be successfully processed and enantiomeric purity could be accurately determined without requiring baseline enantioresolution between peaks. The samples were partially enantioseparated with a permethyl-β-cyclodextrin chiral column under reversed-phase conditions. Signals detected with a diode-array detector within a wavelength range from 198 to 241 nm were recorded, and the data were processed by a second-order multivariate algorithm to decrease detection limits. The R-(-)-enantiomer of ibuprofen in tablet formulation samples could be determined at the level of 0.5 mg L⁻¹ in the presence of 99.9% of the S-(+)-enantiomorph with relative prediction error within ±3%. Copyright © 2012 Elsevier B.V. All rights reserved.
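A minimal sketch of the unfolding idea behind U-PLS is shown below: each sample's elution-time by wavelength data matrix is flattened into a single vector and regressed against concentration with PLS. The data, dimensions and number of latent variables are assumptions, not the reported ibuprofen calibration.

```python
# Hedged sketch of the U-PLS unfolding step on synthetic LC-DAD-like data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(3)
n_samples, n_times, n_wavelengths = 15, 60, 44
profiles = rng.random((n_samples, n_times, n_wavelengths))   # stand-in chromatographic-spectral data
conc = rng.uniform(0.5, 10.0, n_samples)                     # analyte concentration, mg L-1 (synthetic)

X = profiles.reshape(n_samples, -1)          # "unfolding": (time x wavelength) matrix -> vector
model = PLSRegression(n_components=3).fit(X, conc)
print("calibration R^2 on synthetic data:", round(model.score(X, conc), 3))
```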
Bonadio, Federica; Margot, Pierre; Delémont, Olivier; Esseiva, Pierre
2008-11-20
Headspace solid-phase microextraction (HS-SPME) is assessed as an alternative to the liquid-liquid extraction (LLE) currently used for 3,4-methylenedioxymethamphetamine (MDMA) profiling. Both methods were compared by evaluating their performance in discriminating and classifying samples. For this purpose, 62 different seizures were analysed using both extraction techniques followed by gas chromatography-mass spectrometry (GC-MS). A previously validated method provided data for HS-SPME, whereas LLE data were collected applying a harmonized methodology developed and used in the European project CHAMP. After suitable pre-treatment, similarities between sample pairs were studied using the Pearson correlation. Both methods make it possible to distinguish samples coming from the same pre-tabletting batch from samples coming from different pre-tabletting batches. This finding supports the use of HS-SPME as an effective alternative to LLE, with additional advantages in terms of sample preparation and a solvent-free process.
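A minimal sketch of the comparison step, under the assumption of a simple z-score pre-treatment, is given below: the Pearson correlation between two normalized impurity profiles is used as the linkage score. The profiles are synthetic placeholders, not data from the CHAMP project.

```python
# Hedged sketch: Pearson similarity between pre-treated impurity profiles.
import numpy as np

def pearson_similarity(profile_a, profile_b):
    a = (profile_a - profile_a.mean()) / profile_a.std()
    b = (profile_b - profile_b.mean()) / profile_b.std()
    return float(np.mean(a * b))                      # Pearson r of the two profiles

rng = np.random.default_rng(4)
batch = rng.random(12)                                # impurity profile of one batch (synthetic)
same_batch = batch + rng.normal(0, 0.02, 12)          # replicate from the same batch
other_batch = rng.random(12)                          # unrelated seizure
print("linked  :", round(pearson_similarity(batch, same_batch), 3))
print("unlinked:", round(pearson_similarity(batch, other_batch), 3))
```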
Manies, Kristen L.; Harden, Jennifer W.; Holingsworth, Teresa N.
2014-01-01
This report describes the collection and processing methodologies for samples obtained at two sites within Interior Alaska: (1) a location within the 2001 Survey Line burn, and (2) an unburned location, selected as a control. In 2002 and 2004 U.S. Geological Survey investigators measured soil properties including, but not limited to, bulk density, volumetric water content, carbon content, and nitrogen content from samples obtained from these sites. Stand properties, such as tree density, the amount of woody debris, and understory vegetation, were also measured and are presented in this report.
Numerical study of the process parameters in spark plasma sintering (sps)
NASA Astrophysics Data System (ADS)
Chowdhury, Redwan Jahid
Spark plasma sintering (SPS) is one of the most widely used sintering techniques; it applies a pulsed direct current together with uniaxial pressure to consolidate a wide variety of materials. The unique mechanisms of SPS enable it to sinter powder compacts at a lower temperature and in a shorter time than conventional hot pressing, hot isostatic pressing and vacuum sintering processes. One of the limitations of SPS is the presence of temperature gradients inside the sample, which can result in non-uniform physical and microstructural properties. A detailed study of the temperature and current distributions inside the sintered sample is necessary to minimize the temperature gradients and achieve the desired properties. In the present study, a coupled thermal-electric model was developed using finite element codes in the ABAQUS software to investigate the temperature and current distributions inside conductive and non-conductive samples. An integrated experimental-numerical methodology was implemented to determine the system contact resistances accurately. The developed sintering model was validated by a series of experiments, which showed good agreement with the simulation results. The temperature distribution inside the sample depends on process parameters such as the sample and tool geometry, punch and die position, applied current and thermal insulation around the die. The role of these parameters in the sample temperature distribution was systematically analyzed. The findings of this research could prove very useful for the reliable production of large sintered samples with controlled and tailored properties.
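As a much-simplified illustration of a coupled thermal-electric calculation, the toy below solves one-dimensional transient heat conduction with a uniform Joule-heating source by explicit finite differences; the geometry, material constants, heating rate and boundary conditions are all assumptions, and this is not the ABAQUS model described in the abstract.

```python
# Toy 1-D explicit finite-difference sketch of conduction with a Joule-heating source.
import numpy as np

length, n = 0.02, 41                  # 20 mm sample stack, 41 nodes (assumed)
dx = length / (n - 1)
k, rho, cp = 30.0, 3000.0, 800.0      # placeholder conductivity, density, heat capacity
q_joule = 5.0e7                       # W m-3, placeholder volumetric Joule heating
dt = 0.2 * rho * cp * dx ** 2 / k     # stable explicit time step

T = np.full(n, 293.0)                 # initial temperature, K
for _ in range(2000):
    Tn = T.copy()
    T[1:-1] = Tn[1:-1] + dt * (k * (Tn[2:] - 2 * Tn[1:-1] + Tn[:-2]) / dx ** 2 + q_joule) / (rho * cp)
    T[0] = T[-1] = 293.0              # ends held at ambient (assumed boundary condition)
print(f"centre-to-edge temperature difference ~ {T[n // 2] - T[0]:.1f} K")
```

Even this crude sketch reproduces the qualitative point of the study: a current-dependent heat source combined with cooled boundaries produces a temperature gradient across the sample.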
Improving EEG-Based Motor Imagery Classification for Real-Time Applications Using the QSA Method.
Batres-Mendoza, Patricia; Ibarra-Manzano, Mario A; Guerra-Hernandez, Erick I; Almanza-Ojeda, Dora L; Montoro-Sanjose, Carlos R; Romero-Troncoso, Rene J; Rostro-Gonzalez, Horacio
2017-01-01
We present an improvement to the quaternion-based signal analysis (QSA) technique to extract electroencephalography (EEG) signal features with a view to developing real-time applications, particularly in motor imagery (IM) cognitive processes. The proposed methodology (iQSA, improved QSA) extracts features such as the average, variance, homogeneity, and contrast of EEG signals related to motor imagery in a more efficient manner (i.e., by reducing the number of samples needed to classify the signal and improving the classification percentage) compared to the original QSA technique. Specifically, we can sample the signal in variable time periods (from 0.5 s to 3 s, in half-a-second intervals) to determine the relationship between the number of samples and their effectiveness in classifying signals. In addition, to strengthen the classification process a number of boosting-technique-based decision trees were implemented. The results show an 82.30% accuracy rate for 0.5 s samples and 73.16% for 3 s samples. This is a significant improvement compared to the original QSA technique that offered results from 33.31% to 40.82% without sampling window and from 33.44% to 41.07% with sampling window, respectively. We can thus conclude that iQSA is better suited to develop real-time applications.
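A hedged sketch of the windowed feature-extraction idea is shown below; the four features are loose stand-ins for the paper's average, variance, homogeneity and contrast, the sampling rate and labels are placeholders, and the classifier is a generic boosted tree rather than the iQSA implementation.

```python
# Hedged sketch: per-window EEG features fed to a boosted decision-tree classifier.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

fs = 128                                              # assumed sampling rate, Hz
rng = np.random.default_rng(5)

def window_features(signal, win_seconds):
    win = int(fs * win_seconds)
    chunks = [signal[i:i + win] for i in range(0, len(signal) - win + 1, win)]
    # mean, variance, mean absolute first difference, range: loose stand-ins for
    # the paper's average / variance / homogeneity / contrast features
    return np.array([[c.mean(), c.var(), np.abs(np.diff(c)).mean(), np.ptp(c)] for c in chunks])

eeg = rng.normal(size=fs * 60)                        # 60 s of synthetic single-channel EEG
X = window_features(eeg, 0.5)                         # 0.5 s windows, the shortest case reported
labels = rng.integers(0, 2, size=len(X))              # placeholder motor-imagery labels
clf = GradientBoostingClassifier().fit(X, labels)
print("training accuracy on synthetic data:", round(clf.score(X, labels), 2))
```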
Chandarana, Keval; Drew, Megan E; Emmanuel, Julian; Karra, Efthimia; Gelegen, Cigdem; Chan, Philip; Cron, Nicholas J; Batterham, Rachel L
2009-06-01
Gut hormones represent attractive therapeutic targets for the treatment of obesity and type 2 diabetes. However, controversy surrounds the effects that adiposity, dietary manipulations, and bariatric surgery have on their circulating concentrations. We sought to determine whether these discrepancies are due to methodologic differences. Ten normal-weight males participated in a 4-way crossover study investigating whether fasting appetite scores, plasma acyl-ghrelin, active glucagon-like peptide-1 (GLP-1), and peptide YY3-36 (PYY3-36) levels are altered by study-induced stress, prior food consumption, and sample processing. Study visit order affected anxiety, plasma cortisol, and temporal profiles of appetite and plasma PYY3-36, with increased anxiety and cortisol concentrations on the first study day. Plasma cortisol area under the curve (AUC) correlated positively with plasma PYY3-36 AUC. Despite a 14-hour fast, baseline hunger, PYY3-36 concentrations, temporal appetite profiles, PYY3-36 AUC, and active GLP-1 were affected by the previous evening's meal. Sample processing studies revealed that sample acidification and esterase inhibition are required when measuring acyl-ghrelin and dipeptidyl-peptidase IV inhibitor addition for active GLP-1. However, plasma PYY3-36 concentrations were unaffected by addition of dipeptidyl-peptidase IV. Accurate assessment of appetite, feeding behavior, and gut hormone concentrations requires standardization of prior food consumption and subject acclimatization to the study protocol. Moreover, because of the labile nature of acyl-ghrelin and active GLP-1, specialized sample processing needs to be undertaken.
Sheehan, Emma V.; Stevens, Timothy F.; Attrill, Martin J.
2010-01-01
Following governments' policies to tackle global climate change, the development of offshore renewable energy sites is likely to increase substantially over coming years. All such developments interact with the seabed to some degree and so a key need exists for suitable methodology to monitor the impacts of large-scale Marine Renewable Energy Installations (MREIs). Many of these will be situated on mixed or rocky substrata, where conventional methods to characterise the habitat are unsuitable. Traditional destructive sampling is also inappropriate in conservation terms, particularly as safety zones around (MREIs) could function as Marine Protected Areas, with positive benefits for biodiversity. Here we describe a technique developed to effectively monitor the impact of MREIs and report the results of its field testing, enabling large areas to be surveyed accurately and cost-effectively. The methodology is based on a high-definition video camera, plus LED lights and laser scale markers, mounted on a “flying array” that maintains itself above the seabed grounded by a length of chain, thus causing minimal damage. Samples are taken by slow-speed tows of the gear behind a boat (200 m transects). The HD video and randomly selected frame grabs are analysed to quantify species distribution. The equipment was tested over two years in Lyme Bay, UK (25 m depth), then subsequently successfully deployed in demanding conditions at the deep (>50 m) high-energy Wave Hub site off Cornwall, UK, and a potential tidal stream energy site in Guernsey, Channel Islands (1.5 ms−1 current), the first time remote samples from such a habitat have been achieved. The next stage in the monitoring development process is described, involving the use of Remote Operated Vehicles to survey the seabed post-deployment of MREI devices. The complete methodology provides the first quantitative, relatively non-destructive method for monitoring mixed-substrate benthic communities beneath MPAs and MREIs pre- and post-device deployment. PMID:21206748
Kim, In-Ah; den-Hollander, Elyn; Lee, Hye-Seong
2018-03-01
Descriptive analysis with a trained sensory panel has thus far been the most well defined methodology to characterize various products. However, in practical terms, intensive training in descriptive analysis has been recognized as a serious defect. To overcome this limitation, various novel rapid sensory profiling methodologies have been suggested in the literature. Among these, attribute-based methodologies such as check-all-that-apply (CATA) questions showed results comparable to those of conventional sensory descriptive analysis. Kim, Hopkinson, van Hout, and Lee (2017a, 2017b) have proposed a novel attribute-based methodology termed the two-step rating-based 'double-faced applicability' test with a novel output measure of applicability magnitude (d' A ) for measuring consumers' product usage experience throughout various product usage stages. In this paper, the potential of the two-step rating-based 'double-faced applicability' test with d' A was investigated as an alternative to conventional sensory descriptive analysis in terms of sensory characterization and product discrimination. Twelve commercial spread products were evaluated using both conventional sensory descriptive analysis with a trained sensory panel and two-step rating-based 'double-faced applicability' test with an untrained sensory panel. The results demonstrated that the 'double-faced applicability' test can be used to provide a direct measure of the applicability magnitude of sensory attributes of the samples tested in terms of d' A for sensory characterization of individual samples and multiple sample comparisons. This suggests that when the appropriate list of attributes to be used in the questionnaire is already available, the two-step rating-based 'double-faced applicability' test with d' A can be used as a more efficient alternative to conventional descriptive analysis, without requiring any intensive training process. Copyright © 2017 Elsevier Ltd. All rights reserved.
Intelligibility assessment in developmental phonological disorders: accuracy of caregiver gloss.
Kwiatkowski, J; Shriberg, L D
1992-10-01
Fifteen caregivers each glossed a simultaneously videotaped and audiotaped sample of their child with speech delay engaged in conversation with a clinician. One of the authors generated a reference gloss for each sample, aided by (a) prior knowledge of the child's speech-language status and error patterns, (b) glosses from the child's clinician and the child's caregiver, (c) unlimited replays of the taped sample, and (d) the information gained from completing a narrow phonetic transcription of the sample. Caregivers glossed an average of 78% of the utterances and 81% of the words. A comparison of their glosses to the reference glosses suggested that they accurately understood an average of 58% of the utterances and 73% of the words. Discussion considers the implications of such findings for methodological and theoretical issues underlying children's moment-to-moment intelligibility breakdowns during speech-language processing.
Xiao, Yongli; Sheng, Zong-Mei; Taubenberger, Jeffery K.
2015-01-01
The vast majority of surgical biopsy and post-mortem tissue samples are formalin-fixed and paraffin-embedded (FFPE), but this process leads to RNA degradation that limits gene expression analysis. As an example, the viral RNA genome of the 1918 pandemic influenza A virus was previously determined in a 9-year effort by overlapping RT-PCR from post-mortem samples. Using the protocols described here, the full genome of the 1918 virus at high coverage was determined in one high-throughput sequencing run of a cDNA library derived from total RNA of a 1918 FFPE sample after duplex-specific nuclease treatments. This basic methodological approach should assist in the analysis of FFPE tissue samples isolated over the past century from a variety of infectious diseases. PMID:26344216
Advanced Experimental Methods for Low-temperature Magnetotransport Measurement of Novel Materials
Hagmann, Joseph A.; Le, Son T.; Richter, Curt A.; Seiler, David G.
2016-01-01
Novel electronic materials are often produced for the first time by synthesis processes that yield bulk crystals (in contrast to single crystal thin film synthesis) for the purpose of exploratory materials research. Certain materials pose a challenge wherein the traditional bulk Hall bar device fabrication method is insufficient to produce a measureable device for sample transport measurement, principally because the single crystal size is too small to attach wire leads to the sample in a Hall bar configuration. This can be, for example, because the first batch of a new material synthesized yields very small single crystals or because flakes of samples of one to very few monolayers are desired. In order to enable rapid characterization of materials that may be carried out in parallel with improvements to their growth methodology, a method of device fabrication for very small samples has been devised to permit the characterization of novel materials as soon as a preliminary batch has been produced. A slight variation of this methodology is applicable to producing devices using exfoliated samples of two-dimensional materials such as graphene, hexagonal boron nitride (hBN), and transition metal dichalcogenides (TMDs), as well as multilayer heterostructures of such materials. Here we present detailed protocols for the experimental device fabrication of fragments and flakes of novel materials with micron-sized dimensions onto substrate and subsequent measurement in a commercial superconducting magnet, dry helium close-cycle cryostat magnetotransport system at temperatures down to 0.300 K and magnetic fields up to 12 T. PMID:26863449
Mallette, Jennifer R; Casale, John F
2014-10-17
The isomeric truxillines are a group of minor alkaloids present in all illicit cocaine samples. The relative amount of truxillines in cocaine is indicative of the variety of coca used for cocaine processing and is thus useful in source determination. Previously, the determination of isomeric truxillines in cocaine was performed with a gas chromatography/electron capture detection method. However, owing to the tedious sample preparation as well as the expense and maintenance required of electron capture detectors, the protocol was converted to a gas chromatography/flame-ionization detection method. Ten truxilline isomers (alpha-, beta-, delta-, epsilon-, gamma-, omega-, zeta-, peri-, neo-, and epi-) were quantified relative to a structurally related internal standard, 4',4″-dimethyl-α-truxillic acid dimethyl ester. The method was shown to have a linear response from 0.001 to 1.00 mg/mL and a lower detection limit of 0.001 mg/mL. In this method, the truxillines are directly reduced with lithium aluminum hydride and then acylated with heptafluorobutyric anhydride prior to analysis. The analysis of more than 100 cocaine hydrochloride samples is presented and compared to data obtained by the previous methodology. Authentic cocaine samples obtained from the source countries of Colombia, Bolivia, and Peru were also analyzed, and comparative data on more than 23,000 samples analyzed over the past 10 years with the previous methodology are presented. Published by Elsevier B.V.
Baumes, Laurent A
2006-01-01
One of the main problems in high-throughput research for materials is still the design of experiments. At early stages of discovery programs, purely exploratory methodologies coupled with fast screening tools should be employed. This should lead to opportunities to find unexpected catalytic results and to identify the "groups" of catalyst outputs, providing well-defined boundaries for future optimizations. However, very few recent papers deal with strategies that guide exploratory studies. Mostly, traditional designs, homogeneous coverings, or simple random samplings are exploited. Typical catalytic output distributions exhibit unbalanced datasets on which efficient learning is hard to carry out, and interesting but rare classes usually go unrecognized. Here a new iterative algorithm is suggested for characterizing the structure of the search space, working independently of any learning process. It enhances recognition rates by transferring catalysts to be screened from "performance-stable" zones of the space to "unsteady" ones, which require more experiments to be well modeled. Benchmark evaluation of new algorithms is compulsory because no prior proof of their efficiency exists. The method is detailed and thoroughly tested with mathematical functions exhibiting different levels of complexity. The strategy is not only evaluated empirically; the effect of the sampling on future machine-learning performance is also quantified. The minimum sample size required for the algorithm to be statistically distinguishable from simple random sampling is also investigated.
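The general idea (not Baumes' algorithm itself) can be sketched as an iterative loop that re-fits a small committee of models and sends the next experiment to the candidate where the committee disagrees most; the toy response function and all settings below are assumptions.

```python
# Hedged sketch: committee-disagreement-driven sampling of an "unsteady" region.
import numpy as np

rng = np.random.default_rng(6)
def response(x):                                   # toy search space with a sharp feature
    return np.sin(6 * x) + 2.0 * (x > 0.7)

pool = np.linspace(0, 1, 400)                      # candidate "catalysts"
xs = list(rng.choice(pool, 15, replace=False))     # initial random screening
ys = [float(response(x)) for x in xs]

for _ in range(30):                                # iterative enrichment
    committee = []
    for _ in range(5):                             # bootstrap committee of polynomial fits
        idx = rng.integers(0, len(xs), len(xs))
        committee.append(np.poly1d(np.polyfit(np.array(xs)[idx], np.array(ys)[idx], 5)))
    candidates = np.setdiff1d(pool, np.array(xs))  # not-yet-screened candidates
    disagreement = np.std([m(candidates) for m in committee], axis=0)
    x_new = float(candidates[int(np.argmax(disagreement))])   # most "unsteady" candidate
    xs.append(x_new); ys.append(float(response(x_new)))

print("samples placed at x > 0.6:", sum(x > 0.6 for x in xs), "of", len(xs))
```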
Pastor, Antoni; Farré, Magí; Fitó, Montserrat; Fernandez-Aranda, Fernando; de la Torre, Rafael
2014-05-01
The analysis of peripheral endocannabinoids (ECs) is a good biomarker of the EC system. Their concentrations, from clinical studies, strongly depend on sample collection and time processing conditions taking place in clinical and laboratory settings. The analysis of 2-monoacylglycerols (MGs) (i.e., 2-arachidonoylglycerol or 2-oleoylglycerol) is a particularly challenging issue because of their ex vivo formation and chemical isomerization that occur after blood sample collection. We provide evidence that their ex vivo formation can be minimized by adding Orlistat, an enzymatic lipase inhibitor, to plasma. Taking into consideration the low cost of Orlistat, we recommend its addition to plasma collecting tubes while maintaining sample cold chain until storage. We have validated a method for the determination of the EC profile of a range of MGs and N-acylethanolamides in plasma that preserves the original isomer ratio of MGs. Nevertheless, the chemical isomerization of 2-MGs can only be avoided by an immediate processing and analysis of samples due to their instability during conservation. We believe that this new methodology can aid in the harmonization of the measurement of ECs and related compounds in clinical samples.
Chakraborty, Snehasis; Rao, Pavuluri Srinivasa; Mishra, Hari Niwas
2015-08-01
The high-pressure processing conditions were optimized for pineapple puree within the domain of 400-600 MPa, 40-60 °C, and 10-20 min using response surface methodology (RSM). The target was to maximize the inactivation of polyphenoloxidase (PPO) with a minimal loss of beneficial bromelain (BRM) activity, ascorbic acid (AA) content, antioxidant capacity, and color in the sample. The optimum condition was 600 MPa, 50 °C, and 13 min, which had the highest desirability of 0.604 and resulted in 44% PPO and 47% BRM activities. However, 93% of the antioxidant activity and 85% of the AA were retained in the optimized sample, with a total color change (∆E*) value of less than 2.5. A 10-fold reduction in PPO activity was obtained at 600 MPa/70 °C/20 min; however, the thermal degradation of nutrients was severe at this condition. A fuzzy mathematical approach confirmed that the sensory acceptance of the optimized sample was close to that of the fresh sample, whereas the thermally pasteurized sample (treated at 0.1 MPa and 95 °C for 12 min) had the lowest sensory score. © 2015 Institute of Food Technologists®
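For readers unfamiliar with desirability-based multi-response optimization, here is a hedged sketch of a Derringer-type overall desirability score; the response values and acceptable limits are placeholders loosely inspired by the reported optimum, and this is not the study's actual desirability model.

```python
# Hedged sketch: individual desirabilities combined as a geometric mean.
import numpy as np

def d_max(y, lo, hi):   # larger-is-better response
    return float(np.clip((y - lo) / (hi - lo), 0, 1))

def d_min(y, lo, hi):   # smaller-is-better response
    return float(np.clip((hi - y) / (hi - lo), 0, 1))

responses = {                                   # placeholder values and limits
    "PPO inactivation (%)": d_max(56, 0, 100),
    "bromelain retained (%)": d_max(47, 0, 100),
    "ascorbic acid retained (%)": d_max(85, 0, 100),
    "antioxidant retained (%)": d_max(93, 0, 100),
    "colour change dE*": d_min(2.4, 0, 12),
}
overall = np.prod(list(responses.values())) ** (1 / len(responses))
print(f"overall desirability ~ {overall:.3f}")
```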
Microbial Characterization and Comparison of Isolates During the Mir and ISS Missions
NASA Technical Reports Server (NTRS)
Fontenot, Sondra L.; Castro, Victoria; Bruce, Rebekah; Ott, C. Mark; Pierson, Duane L.
2004-01-01
Spacecraft represent a semi-closed ecosystem that provides a unique model of microbial interaction with other microbes, potential hosts, and their environment. Environmental samples from the Mir Space Station (1995-1998) and the International Space Station (ISS) (2000-Present) were collected and processed to provide insight into the characterization of microbial diversity aboard spacecraft over time and assess any potential health risks to the crew. All microbiota were isolated using standard media-based methodologies. Isolates from Mir and ISS were processed using various methods of analysis, including VITEK biochemical analysis, 16s ribosomal identification, and fingerprinting using rep-PCR analysis. Over the first 41 months of habitation, the diversity of the microbiota from air and surface samples aboard ISS increased from an initial six to 53 different bacterial species. During the same period, fungal diversity increased from 2 to 24 species. Based upon rep-PCR analysis, the majority of isolates were unique suggesting the need for increased sampling frequency and a more thorough analysis of samples to properly characterize the ISS microbiota. This limited fungal and bacterial data from environmental samples acquired during monitoring currently do not indicate a microbial hazard to ISS or any trends suggesting potential health risks.
Eye-Tracking as a Tool in Process-Oriented Reading Test Validation
ERIC Educational Resources Information Center
Solheim, Oddny Judith; Uppstad, Per Henning
2011-01-01
The present paper addresses the continuous need for methodological reflection on how to validate inferences made on the basis of test scores. Validation is a process that requires many lines of evidence. In this article we discuss the potential of eye tracking methodology in process-oriented reading test validation. Methodological considerations…
Catchment-wide impacts on water quality: the use of 'snapshot' sampling during stable flow
NASA Astrophysics Data System (ADS)
Grayson, R. B.; Gippel, C. J.; Finlayson, B. L.; Hart, B. T.
1997-12-01
Water quality is usually monitored on a regular basis at only a small number of locations in a catchment, generally focused at the catchment outlet. This integrates the effect of all the point and non-point source processes occurring throughout the catchment. However, effective catchment management requires data which identify major sources and processes. As part of a wider study aimed at providing technical information for the development of integrated catchment management plans for a 5000 km2 catchment in south-eastern Australia, a 'snapshot' of water quality was undertaken during stable summer flow conditions. These low flow conditions exist for long periods, so water quality at these flow levels is an important constraint on the health of in-stream biological communities. Over a 4 day period, a study of the low flow water quality characteristics throughout the Latrobe River catchment was undertaken. Sixty-four sites were chosen to enable a longitudinal profile of water quality to be established. All tributary junctions and sites along major tributaries, as well as all major industrial inputs, were included. Samples were analysed for a range of parameters including total suspended solids concentration, pH, dissolved oxygen, electrical conductivity, turbidity, flow rate and water temperature. Filtered and unfiltered samples were taken from 27 sites along the main stream and tributary confluences for analysis of total N, NH4, oxidised N, total P and dissolved reactive P concentrations. The data are used to illustrate the utility of this sampling methodology for establishing specific sources and estimating non-point source loads of phosphorus, total suspended solids and total dissolved solids. The methodology enabled several new insights into system behaviour including quantification of unknown point discharges, identification of key in-stream sources of suspended material and the extent to which biological activity (phytoplankton growth) affects water quality. The costs and benefits of the sampling exercise are reviewed.
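A minimal sketch of the load arithmetic that such a snapshot supports is given below (an assumed calculation, not the study's code): the instantaneous load at each site is concentration times discharge, and the increase between adjacent main-stem sites points to reach- or tributary-scale sources. Site names and values are hypothetical.

```python
# Hedged sketch: instantaneous loads along a longitudinal profile.
sites = ["S1", "S2", "S3", "S4"]                    # hypothetical downstream order
flow_m3s = [1.2, 1.3, 2.1, 2.2]                     # placeholder discharges, m3/s
tss_mgL = [5.0, 6.0, 18.0, 17.5]                    # placeholder total suspended solids, mg/L

# mg/L * m3/s = g/s; times 86.4 converts to kg/day
loads = [q * c * 86.4 for q, c in zip(flow_m3s, tss_mgL)]
for name, prev, cur in zip(sites[1:], loads, loads[1:]):
    print(f"reach ending at {name}: load increase ~ {cur - prev:.0f} kg/day")
```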
Frías-De-León, María Guadalupe; Ramírez-Bárcenas, José Antonio; Rodríguez-Arellanes, Gabriela; Velasco-Castrejón, Oscar; Taylor, Maria Lucia; Reyes-Montes, María Del Rocío
2017-03-01
Histoplasmosis is considered the most important systemic mycosis in Mexico, and its diagnosis requires fast and reliable methodologies. The present study evaluated the usefulness of PCR using Hcp100 and 1281-1283 (220) molecular markers in detecting Histoplasma capsulatum in occupational and recreational outbreaks. Seven clinical serum samples of infected individuals from three different histoplasmosis outbreaks were processed by enzyme-linked immunosorbent assay (ELISA) to titre anti-H. capsulatum antibodies and to extract DNA. Fourteen environmental samples were also processed for H. capsulatum isolation and DNA extraction. Both clinical and environmental DNA samples were analysed by PCR with Hcp100 and 1281-1283 (220) markers. Antibodies to H. capsulatum were detected by ELISA in all serum samples using specific antigens, and in six of these samples, the PCR products of both molecular markers were amplified. Four environmental samples amplified one of the two markers, but only one sample amplified both markers and an isolate of H. capsulatum was cultured from this sample. All PCR products were sequenced, and the sequences for each marker were analysed using the Basic Local Alignment Search Tool (BLASTn), which revealed 95-98 and 98-100 % similarities with the reference sequences deposited in the GenBank for Hcp100 and 1281-1283 (220) , respectively. Both molecular markers proved to be useful in studying histoplasmosis outbreaks because they are matched for pathogen detection in either clinical or environmental samples.
NASA Technical Reports Server (NTRS)
Chen, Y.; Nguyen, D.; Guertin, S.; Berstein, J.; White, M.; Menke, R.; Kayali, S.
2003-01-01
This paper presents a reliability evaluation methodology to obtain the statistical reliability information of memory chips for space applications when the test sample size needs to be kept small because of the high cost of radiation-hardened memories.
THE IMPACT OF PASSIVE SAMPLING METHODOLOGIES USED IN THE DEARS
This abstract details the use of passive sampling methodologies in the Detroit Exposure and Aerosol Research Study (DEARS). A discussion about the utility of various gas-phase passive samplers used in the study will be described along with examples of field data measurements empl...
Xu, Jie; Hu, Feng-Lin; Wang, Wei; Wan, Xiao-Chun; Bao, Guan-Hu
2015-11-01
Fu brick tea (FBT) is a unique post-fermented tea product which is fermented with fungi during the manufacturing process. In this study, we investigated the biochemical compositional changes occurring during the microbial fermentation process (MFP) of FBT based on non-targeted LC-MS, which is a comprehensive and unbiased methodology. Our data analysis took a two-phase approach: (1) comparison of FBT with other tea products using PCA to exhibit the characteristic effect of the MFP on the formation of Fu brick tea and (2) comparison of tea samples throughout the MFP of FBT to elucidate the possible key metabolic pathways produced by the fungi. Non-targeted LC-MS analysis clearly distinguished FBT from other tea samples and highlighted some interesting metabolic pathways during the MFP, including B-ring fission of catechins. Our study demonstrated that those fungi had a significant influence on the biochemical profiles of the FBT and consequently contributed to its unique quality. Copyright © 2014 Elsevier Ltd. All rights reserved.
Development of economic consequence methodology for process risk analysis.
Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed
2015-04-01
A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies. © 2014 Society for Risk Analysis.
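As an illustration of the quantification and integration steps of this methodology, the following Python sketch evaluates several loss categories with a generic inverted normal loss function and sums them into a total. It is a minimal sketch rather than the authors' implementation; the function name, loss categories and all numerical parameters are illustrative assumptions.

    import numpy as np

    def inverted_normal_loss(deviation, max_loss, scale):
        """Generic inverted normal loss function: zero loss at the target
        (deviation = 0), rising towards an asymptotic maximum economic loss."""
        return max_loss * (1.0 - np.exp(-deviation**2 / (2.0 * scale**2)))

    # Illustrative scenario: a single process deviation mapped to loss categories.
    # All figures below are invented for demonstration only.
    deviation = 12.0                       # deviation of the monitored variable
    losses = {
        "production_loss": inverted_normal_loss(deviation, max_loss=5.0e5, scale=20.0),
        "asset_damage":    inverted_normal_loss(deviation, max_loss=2.0e6, scale=35.0),
        "environmental":   inverted_normal_loss(deviation, max_loss=8.0e5, scale=50.0),
    }

    # Final step of the methodology: integrate the individual losses into a total.
    total_loss = sum(losses.values())
    for name, value in losses.items():
        print(f"{name:16s}: {value:12.0f}")
    print(f"{'total':16s}: {total_loss:12.0f}")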
Assessment of ecologic regression in the study of lung cancer and indoor radon.
Stidley, C A; Samet, J M
1994-02-01
Ecologic regression studies conducted to assess the cancer risk of indoor radon to the general population are subject to methodological limitations, and they have given seemingly contradictory results. The authors use simulations to examine the effects of two major methodological problems that affect these studies: measurement error and misspecification of the risk model. In a simulation study of the effect of measurement error caused by the sampling process used to estimate radon exposure for a geographic unit, both the effect of radon and the standard error of the effect estimate were underestimated, with greater bias for smaller sample sizes. In another simulation study, which addressed the consequences of uncontrolled confounding by cigarette smoking, even small negative correlations between county geometric mean annual radon exposure and the proportion of smokers resulted in negative average estimates of the radon effect. A third study considered consequences of using simple linear ecologic models when the true underlying model relation between lung cancer and radon exposure is nonlinear. These examples quantify potential biases and demonstrate the limitations of estimating risks from ecologic studies of lung cancer and indoor radon.
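The first simulation described above can be illustrated with a short, self-contained sketch: county mean radon is estimated from a small sample of homes, and the ecologic regression slope is attenuated relative to the true effect as the per-county sample shrinks. This is a hedged toy version of the idea, not the authors' simulation; all distributions and parameter values are assumptions chosen only to show the attenuation.

    import numpy as np

    rng = np.random.default_rng(0)
    n_counties, true_beta, baseline, reps = 100, 0.5, 50.0, 50

    # "True" county geometric-mean radon exposures and lung cancer rates.
    true_radon = rng.lognormal(mean=0.5, sigma=0.6, size=n_counties)
    rates = baseline + true_beta * true_radon + rng.normal(0, 2.0, n_counties)

    def fitted_slope(homes_per_county):
        # County exposure estimated as the geometric mean of a small home sample,
        # which introduces measurement error into the ecologic regression.
        est = np.array([np.exp(rng.normal(np.log(r), 0.8, homes_per_county).mean())
                        for r in true_radon])
        return np.polyfit(est, rates, 1)[0]

    for n_homes in (5, 25, 100):
        slopes = [fitted_slope(n_homes) for _ in range(reps)]
        print(f"homes per county = {n_homes:3d}: mean fitted slope = "
              f"{np.mean(slopes):.3f} (true effect = {true_beta})")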
Effective Pb2+ removal from water using nanozerovalent iron stored 10 months
NASA Astrophysics Data System (ADS)
Ahmed, M. A.; Bishay, Samiha T.; Ahmed, Fatma M.; El-Dek, S. I.
2017-10-01
Heavy metal removal from water requires reliable and cost-effective approaches, fast separation and a simple methodology. In this work, nanozerovalent iron (NZVI) was prepared as an ideal sorbent for Pb2+ removal. The sample was characterized using X-ray diffraction (XRD), high-resolution transmission electron microscopy (HRTEM), and atomic force microscopy (AFM-SPM). Batch experiments examined the effect of pH value and contact time on the adsorption process. The same NZVI was stored for a shelf time of 10 months and the batch experiment was repeated. The results showed that NZVI exhibited an extraordinarily large metal uptake (98%) after a short contact time (10 h). The stored sample revealed the same effectiveness in Pb2+ removal under the same conditions. The results of the physical properties, magnetic susceptibility, and conductance were correlated with the adsorption efficiency. This work offers evidence that these NZVI particles, prepared by a simple, green and cost-effective methodology and stored for a long time, could be potential candidates for large-scale Pb2+ removal and represent a practical option for wastewater treatment.
Harries, Bruce; Filiatrault, Lyne; Abu-Laban, Riyad B
2018-05-30
Quality improvement (QI) analytic methodology is rarely encountered in the emergency medicine literature. We sought to comparatively apply QI design and analysis techniques to an existing data set, and discuss these techniques as an alternative to standard research methodology for evaluating a change in a process of care. We used data from a previously published randomized controlled trial on triage-nurse initiated radiography using the Ottawa ankle rules (OAR). QI analytic tools were applied to the data set from this study and evaluated comparatively against the original standard research methodology. The original study concluded that triage nurse-initiated radiographs led to a statistically significant decrease in mean emergency department length of stay. Using QI analytic methodology, we applied control charts and interpreted the results using established methods that preserved the time sequence of the data. This analysis found a compelling signal of a positive treatment effect that would have been identified after the enrolment of 58% of the original study sample, and in the 6th month of this 11-month study. Our comparative analysis demonstrates some of the potential benefits of QI analytic methodology. We found that had this approach been used in the original study, insights regarding the benefits of nurse-initiated radiography using the OAR would have been achieved earlier, and thus potentially at a lower cost. In situations where the overarching aim is to accelerate implementation of practice improvement to benefit future patients, we believe that increased consideration should be given to the use of QI analytic methodology.
Reliability Centered Maintenance - Methodologies
NASA Technical Reports Server (NTRS)
Kammerer, Catherine C.
2009-01-01
Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.
2016-05-01
and Kroeger (2002) provide details on sampling and weighting. Following the summary of the survey methodology is a description of the survey analysis... description of priority, for the ADDRESS file). At any given time, the current address used corresponded to the address number with the highest priority...types of address updates provided by the postal service. They are detailed below; each includes a description of the processing steps. 1. Postal Non
LMI design method for networked-based PID control
NASA Astrophysics Data System (ADS)
Souza, Fernando de Oliveira; Mozelli, Leonardo Amaral; de Oliveira, Maurício Carvalho; Palhares, Reinaldo Martinez
2016-10-01
In this paper, we propose a methodology for the design of networked PID controllers for second-order delayed processes using linear matrix inequalities. The proposed procedure takes into account time-varying delay on the plant, time-varying delays induced by the network and packet dropouts. The design is carried out entirely using a continuous-time model of the closed-loop system where time-varying delays are used to represent sampling and holding occurring in a discrete-time digital PID controller.
Saletti, Dominique
2017-01-01
Rapid progress in ultra-high-speed imaging has allowed material properties to be studied at high strain rates by applying full-field measurements and inverse identification methods. Nevertheless, the sensitivity of these techniques still requires a better understanding, since various extrinsic factors present during an actual experiment make it difficult to separate different sources of errors that can significantly affect the quality of the identified results. This study presents a methodology using simulated experiments to investigate the accuracy of the so-called spalling technique (used to study tensile properties of concrete subjected to high strain rates) by numerically simulating the entire identification process. The experimental technique uses the virtual fields method and the grid method. The methodology consists of reproducing the recording process of an ultra-high-speed camera by generating sequences of synthetically deformed images of a sample surface, which are then analysed using the standard tools. The investigation of the uncertainty of the identified parameters, such as Young's modulus along with the stress–strain constitutive response, is addressed by introducing the most significant user-dependent parameters (i.e. acquisition speed, camera dynamic range, grid sampling, blurring), proving that the used technique can be an effective tool for error investigation. This article is part of the themed issue ‘Experimental testing and modelling of brittle materials at high strain rates’. PMID:27956505
Ciapała, Szymon; Adamski, Paweł
2015-01-01
Intensification of pedestrian tourism causes damage to trees near tourist tracks and likewise changes the soil structure. As a result, one may expect that the annual growth increment of trees growing near tracks is significantly lower than that of trees deeper in the forest. However, during the study of the long-term impact of tourism on the environment (determined from tree increment dynamics), some methodological problems may occur. This is particularly important in protected areas where law and administrative regulations related to nature conservation force research to be conducted using small samples. In this paper we have analyzed the data collected in the Polish part of the Tatra National Park in two study plots, each divided into two zones: the area directly under the influence of tourists' trampling and the control group. The aim of such analyses was to present the potential effects of the factors which may affect the results of dendrochronological analysis: (i) the small size of samples, which affects their representativeness, (ii) spatial differences in the rates of the process, as a result of spatial variability of environmental factors and (iii) temporal differences in the rates of the process. This study confirms that the factors mentioned above could significantly influence the results and should be taken into consideration during the analysis. PMID:26325062
Information technology security system engineering methodology
NASA Technical Reports Server (NTRS)
Childs, D.
2003-01-01
A methodology is described for system engineering security into large information technology systems under development. The methodology is an integration of a risk management process and a generic system development life cycle process. The methodology is to be used by Security System Engineers to effectively engineer and integrate information technology security into a target system as it progresses through the development life cycle. The methodology can also be used to re-engineer security into a legacy system.
Reviewing the methodology of an integrative review.
Hopia, Hanna; Latvala, Eila; Liimatainen, Leena
2016-12-01
Whittemore and Knafl's updated description of methodological approach for integrative review was published in 2005. Since then, the five stages of the approach have been regularly used as a basic conceptual structure of the integrative reviews conducted by nursing researchers. However, this methodological approach is seldom examined from the perspective of how systematically and rigorously the stages are implemented in the published integrative reviews. To appraise the selected integrative reviews on the basis of the methodological approach according to the five stages published by Whittemore and Knafl in 2005. A literature review was used in this study. CINAHL (Cumulative Index to Nursing and Allied Health), PubMed, OVID (Journals@Ovid) and the Cochrane Library databases were searched for integrative reviews published between 2002 and 2014. Papers were included if they used the methodological approach described by Whittemore and Knafl, were published in English and were focused on nursing education or nursing expertise. A total of 259 integrative review publications for potential inclusion were identified. Ten integrative reviews fulfilled the inclusion criteria. Findings from the studies were extracted and critically examined according to the five methodological stages. The reviews assessed followed the guidelines of the stated methodology approach to different extents. The stages of literature search, data evaluation and data analysis were fairly poorly formulated and only partially implemented in the studies included in the sample. The other two stages, problem identification and presentation, followed those described in the methodological approach quite well. Increasing use of research in clinical practice is inevitable, and therefore, integrative reviews can play a greater role in developing evidence-based nursing practices. Because of this, nurse researchers should pay more attention to sound integrative nursing research to systematise the review process and make it more rigorous. © 2016 Nordic College of Caring Science.
[Theoretical and methodological uses of research in Social and Human Sciences in Health].
Deslandes, Suely Ferreira; Iriart, Jorge Alberto Bernstein
2012-12-01
The current article aims to map and critically reflect on the current theoretical and methodological uses of research in the subfield of social and human sciences in health. A convenience sample was used to select three Brazilian public health journals. Based on a reading of 1,128 abstracts published from 2009 to 2010, 266 articles were selected that presented the empirical base of research stemming from social and human sciences in health. The sample was classified thematically as "theoretical/ methodological reference", "study type/ methodological design", "analytical categories", "data production techniques", and "analytical procedures". We analyze the sample's emic categories, drawing on the authors' literal statements. All the classifications and respective variables were tabulated in Excel. Most of the articles were self-described as qualitative and used more than one data production technique. There was a wide variety of theoretical references, in contrast with the almost total predominance of a single type of data analysis (content analysis). In several cases, important gaps were identified in expounding the study methodology and instrumental use of the qualitative research techniques and methods. However, the review did highlight some new objects of study and innovations in theoretical and methodological approaches.
Modeling Psychological Contract Violation using Dual Regime Models: An Event-based Approach
Hofmans, Joeri
2017-01-01
A good understanding of the dynamics of psychological contract violation requires theories, research methods and statistical models that explicitly recognize that violation feelings follow from an event that violates one's acceptance limits, after which interpretative processes are set into motion, determining the intensity of these violation feelings. Whereas theories—in the form of the dynamic model of the psychological contract—and research methods—in the form of daily diary research and experience sampling research—are available by now, the statistical tools to model such a two-stage process are still lacking. The aim of the present paper is to fill this gap in the literature by introducing two statistical models—the Zero-Inflated model and the Hurdle model—that closely mimic the theoretical process underlying the elicitation of violation feelings via two model components: a binary distribution that models whether violation has occurred or not, and a count distribution that models how severe the negative impact is. Moreover, covariates can be included for both model components separately, which yields insight into their unique and shared antecedents. By doing this, the present paper offers a methodological-substantive synergy, showing how sophisticated methodology can be used to examine an important substantive issue. PMID:29163316
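A minimal, hand-rolled version of the hurdle idea described above can be written as a two-part likelihood: a logistic component for whether any violation occurred and a zero-truncated Poisson component for its intensity, each with its own covariates. The sketch below is an illustrative, assumption-laden implementation on synthetic data with arbitrary coefficients, not the authors' specification.

    import numpy as np
    from scipy.optimize import minimize
    from scipy.special import expit, gammaln

    def hurdle_negloglik(params, y, X):
        """Hurdle count model: logistic 'did violation occur?' part plus a
        zero-truncated Poisson intensity part, each with its own coefficients."""
        k = X.shape[1]
        b_zero, b_count = params[:k], params[k:]
        p = expit(X @ b_zero)                        # P(violation occurred)
        lam = np.exp(np.clip(X @ b_count, -20, 20))  # intensity given occurrence
        zero = y == 0
        ll = np.sum(np.log1p(-p[zero]))              # contribution of the zeros
        yp, lp, pp = y[~zero], lam[~zero], p[~zero]
        # zero-truncated Poisson: y*log(lam) - lam - log(y!) - log(1 - exp(-lam))
        ll += np.sum(np.log(pp) + yp * np.log(lp) - lp - gammaln(yp + 1)
                     - np.log1p(-np.exp(-lp)))
        return -ll

    rng = np.random.default_rng(1)
    n = 500
    X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + 1 covariate
    occurred = rng.random(n) < expit(X @ np.array([-0.5, 1.0]))
    counts = rng.poisson(np.exp(X @ np.array([0.8, 0.3])))
    y = np.where(occurred, np.maximum(counts, 1), 0)        # crude hurdle-type data

    res = minimize(hurdle_negloglik, x0=np.zeros(4), args=(y, X), method="BFGS")
    print("logistic part:", res.x[:2].round(2), " count part:", res.x[2:].round(2))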
Espiñeira, Montserrat; Vieites, Juan M
2012-12-15
The TaqMan real-time PCR has the highest potential for automation, therefore representing the currently most suitable method for screening, allowing the detection of fraudulent or unintentional mislabeling of species. This work describes the development of a real-time polymerase chain reaction (RT-PCR) system for the detection and identification of common octopus (Octopus vulgaris) and main substitute species (Eledone cirrhosa and Dosidicus gigas). This technique is notable for the combination of simplicity, speed, sensitivity and specificity in a homogeneous assay. The method can be applied to all kinds of products: fresh, frozen and processed, including those undergoing intensive processes of transformation. This methodology was validated to check how the degree of food processing affects the method and the detection of each species. Moreover, it was applied to 34 commercial samples to evaluate the labeling of products made from them. The methodology herein developed is useful to check the fulfillment of labeling regulations for seafood products and to verify traceability in commercial trade and for fisheries control. Copyright © 2012 Elsevier Ltd. All rights reserved.
77 FR 4002 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-26
... the methodological research previously included in the original System of Record Notice (SORN). This... methodological research on improving various aspects of surveys authorized by Title 13, U.S.C. 8(b), 182, and 196, such as: survey sampling frame design; sample selection algorithms; questionnaire development, design...
An Approach to the Use of Depth Cameras for Weed Volume Estimation
Andújar, Dionisio; Dorado, José; Fernández-Quintanilla, César; Ribeiro, Angela
2016-01-01
The use of depth cameras in precision agriculture is increasing day by day. This type of sensor has been used for the plant structure characterization of several crops. However, the discrimination of small plants, such as weeds, is still a challenge within agricultural fields. Improvements in the new Microsoft Kinect v2 sensor can capture the details of plants. The use of a dual methodology using height selection and RGB (Red, Green, Blue) segmentation can separate crops, weeds, and soil. This paper explores the possibilities of this sensor by using Kinect Fusion algorithms to reconstruct 3D point clouds of weed-infested maize crops under real field conditions. The processed models showed good consistency among the 3D depth images and soil measurements obtained from the actual structural parameters. Maize plants were identified in the samples by height selection of the connected faces and showed a correlation of 0.77 with maize biomass. The lower height of the weeds made RGB recognition necessary to separate them from the soil microrelief of the samples, achieving a good correlation of 0.83 with weed biomass. In addition, weed density showed good correlation with volumetric measurements. The canonical discriminant analysis showed promising results for classification into monocots and dicots. These results suggest that estimating volume using the Kinect methodology can be a highly accurate method for crop status determination and weed detection. It offers several possibilities for the automation of agricultural processes by the construction of a new system integrating these sensors and the development of algorithms to properly process the information provided by them. PMID:27347972
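The dual height-plus-RGB logic can be illustrated with a short sketch on synthetic arrays standing in for a registered depth/RGB sample: pixels above a height threshold are labelled maize, remaining green pixels (excess-green index) are labelled weeds, and the rest soil. The thresholds and the excess-green rule are illustrative assumptions, not the parameters or the Kinect Fusion pipeline used in the study.

    import numpy as np

    rng = np.random.default_rng(2)

    # Synthetic stand-ins for one registered sample: per-pixel height above the
    # reconstructed soil plane (m) and a matching RGB image (floats in 0..1).
    height = np.abs(rng.normal(0.02, 0.01, size=(240, 320)))
    height[60:120, 80:160] += 0.60                      # a maize plant
    gray = rng.uniform(0.3, 0.5, size=(240, 320))       # bare-soil background
    rgb = np.stack([gray, gray, gray], axis=-1) + rng.normal(0, 0.02, (240, 320, 3))
    rgb[150:200, 40:100, 1] += 0.3                      # a low green weed patch

    MAIZE_HEIGHT_M = 0.30            # illustrative threshold, not taken from the paper
    maize_mask = height > MAIZE_HEIGHT_M

    # Excess-green index separates low green vegetation (weeds) from bare soil.
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    weed_mask = (2 * g - r - b > 0.15) & ~maize_mask
    soil_mask = ~maize_mask & ~weed_mask

    for name, mask in (("maize", maize_mask), ("weed", weed_mask), ("soil", soil_mask)):
        print(f"{name:5s}: {mask.mean():6.1%} of pixels")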
Robust model selection and the statistical classification of languages
NASA Astrophysics Data System (ADS)
García, J. E.; González-López, V. A.; Viola, M. L. L.
2012-10-01
In this paper we address the problem of model selection for the set of finite memory stochastic processes with finite alphabet, when the data is contaminated. We consider m independent samples, with more than half of them being realizations of the same stochastic process with law Q, which is the one we want to retrieve. We devise a model selection procedure such that for a sample size large enough, the selected process is the one with law Q. Our model selection strategy is based on estimating relative entropies to select a subset of samples that are realizations of the same law. Although the procedure is valid for any family of finite order Markov models, we will focus on the family of variable length Markov chain models, which include the fixed order Markov chain model family. We define the asymptotic breakdown point (ABDP) for a model selection procedure, and we show the ABDP for our procedure. This means that if the proportion of contaminated samples is smaller than the ABDP, then, as the sample size grows our procedure selects a model for the process with law Q. We also use our procedure in a setting where we have one sample formed by the concatenation of sub-samples of two or more stochastic processes, with most of the subsamples having law Q. We conducted a simulation study. In the application section we address the question of the statistical classification of languages according to their rhythmic features using speech samples. This is an important open problem in phonology. A persistent difficulty with this problem is that the speech samples correspond to several sentences produced by diverse speakers, corresponding to a mixture of distributions. The usual procedure to deal with this problem has been to choose a subset of the original sample which seems to best represent each language. The selection is made by listening to the samples. In our application we use the full dataset without any preselection of samples. We apply our robust methodology, estimating a model which represents the main law for each language. Our findings agree with the linguistic conjecture related to the rhythm of the languages included in our dataset.
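The core selection step, estimating relative entropies and keeping the majority group of mutually close samples, can be sketched for the simplest (order-0, i.i.d.) case as follows; the paper itself works with variable length Markov chains, so the distributions, tolerance and sample sizes below are illustrative assumptions only.

    import numpy as np

    def empirical_dist(sample, alphabet_size, alpha=0.5):
        counts = np.bincount(sample, minlength=alphabet_size) + alpha  # smoothed
        return counts / counts.sum()

    def sym_kl(p, q):
        # Symmetrized relative entropy between two finite distributions.
        return np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p))

    rng = np.random.default_rng(3)
    A = 4
    law_q = np.array([0.1, 0.2, 0.3, 0.4])     # majority law Q (to be retrieved)
    law_c = np.array([0.4, 0.3, 0.2, 0.1])     # contaminating law
    samples = [rng.choice(A, 2000, p=law_q) for _ in range(6)] + \
              [rng.choice(A, 2000, p=law_c) for _ in range(3)]

    dists = [empirical_dist(s, A) for s in samples]
    D = np.array([[sym_kl(p, q) for q in dists] for p in dists])

    # For each sample, collect the samples within a tolerance; the largest group
    # (more than half of the samples) is taken as the realizations of law Q.
    tol = 0.02
    group = max((np.flatnonzero(row) for row in D < tol), key=len)
    print("samples retained as realizations of Q:", group)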
NASA Astrophysics Data System (ADS)
Knight, Travis Warren
Nuclear thermal propulsion (NTP) and space nuclear power are two enabling technologies for the manned exploration of space and the development of research outposts in space and on other planets such as Mars. Advanced carbide nuclear fuels have been proposed for application in space nuclear power and propulsion systems. This study examined the processing technologies and optimal parameters necessary to fabricate samples of single phase, solid solution, mixed uranium/refractory metal carbides. In particular, the pseudo-ternary carbide, UC-ZrC-NbC, system was examined with uranium metal mole fractions of 5% and 10% and corresponding uranium densities of 0.8 to 1.8 gU/cc. Efforts were directed to those methods that could produce simple geometry fuel elements or wafers such as those used to fabricate a Square Lattice Honeycomb (SLHC) fuel element and reactor core. Methods of cold uniaxial pressing, sintering by induction heating, and hot pressing by self-resistance heating were investigated. Solid solution, high density (low porosity) samples greater than 95% TD were processed by cold pressing at 150 MPa and sintering above 2600 K for times longer than 90 min. Some impurity oxide phases were noted in some samples, attributed to residual gases in the furnace during processing. Also, secondary phases of carbon and UC2 were noted in some samples due to some hyperstoichiometric powder mixtures having carbon-to-metal ratios greater than one. In all, 33 mixed carbide samples were processed and analyzed, with half bearing uranium as ternary carbides of UC-ZrC-NbC. Scanning electron microscopy, x-ray diffraction, and density measurements were used to characterize samples. Samples were processed from powders of the refractory mono-carbides and UC/UC2 or from powders of uranium hydride (UH3), graphite, and refractory metal carbides to produce hypostoichiometric mixed carbides. Samples processed from the constituent carbide powders and sintered at temperatures above the melting point of UC showed signs of liquid phase sintering and were shown to be largely solid solutions. Pre-compaction of mixed carbide powders prior to sintering was shown to be necessary to achieve high densities. Hypostoichiometric samples processed at 2500 K exhibited only the initial stage of sintering and solid solution formation. Based on these findings, a suggested processing methodology is proposed for producing high density, solid solution, mixed carbide fuels. Pseudo-binary, refractory carbide samples hot pressed at 3100 K and 6 MPa showed comparable densities (approximately 85% of the theoretical value) to samples processed by cold pressing and sintering at temperatures of 2800 K.
Eye-gaze determination of user intent at the computer interface
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldberg, J.H.; Schryver, J.C.
1993-12-31
Determination of user intent at the computer interface through eye-gaze monitoring can significantly aid applications for the disabled, as well as telerobotics and process control interfaces. Whereas current eye-gaze control applications are limited to object selection and x/y gazepoint tracking, a methodology was developed here to discriminate a more abstract interface operation: zooming-in or out. This methodology first collects samples of eye-gaze location looking at controlled stimuli, at 30 Hz, just prior to a user's decision to zoom. The sample is broken into data frames, or temporal snapshots. Within a data frame, all spatial samples are connected into a minimum spanning tree, then clustered, according to user defined parameters. Each cluster is mapped to one in the prior data frame, and statistics are computed from each cluster. These characteristics include cluster size, position, and pupil size. A multiple discriminant analysis uses these statistics both within and between data frames to formulate optimal rules for assigning the observations into zoom-in, zoom-out, or no zoom conditions. The statistical procedure effectively generates heuristics for future assignments, based upon these variables. Future work will enhance the accuracy and precision of the modeling technique, and will empirically test users in controlled experiments.
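The within-frame clustering step can be sketched with standard SciPy tools: build a minimum spanning tree over the gaze samples of one data frame, cut edges longer than a user-defined threshold, and summarize the resulting clusters. The frame contents and the cut-off value are illustrative assumptions, not the original system's parameters.

    import numpy as np
    from scipy.sparse.csgraph import minimum_spanning_tree, connected_components
    from scipy.spatial import distance_matrix

    rng = np.random.default_rng(4)
    # One data frame: ~1 s of 30 Hz gaze samples containing two fixation groups (px).
    frame = np.vstack([rng.normal([300, 200], 15, (15, 2)),
                       rng.normal([600, 450], 15, (15, 2))])

    EDGE_CUT_PX = 80.0                             # user-defined clustering parameter
    mst = minimum_spanning_tree(distance_matrix(frame, frame)).toarray()
    mst[mst > EDGE_CUT_PX] = 0.0                   # cut long edges to form clusters
    n_clusters, labels = connected_components(mst, directed=False)

    for c in range(n_clusters):
        pts = frame[labels == c]
        print(f"cluster {c}: size = {len(pts):2d}, centroid = {pts.mean(axis=0).round(1)}")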
PepsNMR for 1H NMR metabolomic data pre-processing.
Martin, Manon; Legat, Benoît; Leenders, Justine; Vanwinsberghe, Julien; Rousseau, Réjane; Boulanger, Bruno; Eilers, Paul H C; De Tullio, Pascal; Govaerts, Bernadette
2018-08-17
In the analysis of biological samples, control over experimental design and data acquisition procedures alone cannot ensure well-conditioned 1H NMR spectra with maximal information recovery for data analysis. A third major element affects the accuracy and robustness of results: the data pre-processing/pre-treatment, to which not enough attention is usually devoted, in particular in metabolomic studies. The usual approach is to use proprietary software provided by the analytical instruments' manufacturers to conduct the entire pre-processing strategy. This widespread practice has a number of advantages such as a user-friendly interface with graphical facilities, but it involves non-negligible drawbacks: a lack of methodological information and automation, a dependency on subjective human choices, only standard processing possibilities and an absence of objective quality criteria to evaluate pre-processing quality. This paper introduces PepsNMR to meet these needs, an R package dedicated to the whole processing chain prior to multivariate data analysis, including, among other tools, solvent signal suppression, internal calibration, phase, baseline and misalignment corrections, bucketing and normalisation. Methodological aspects are discussed and the package is compared to the gold standard procedure with two metabolomic case studies. The use of PepsNMR on these data shows better information recovery and predictive power based on objective and quantitative quality criteria. Other key assets of the package are workflow processing speed, reproducibility, reporting and flexibility, graphical outputs and documented routines. Copyright © 2018 Elsevier B.V. All rights reserved.
Flow Cytometry: Impact on Early Drug Discovery.
Edwards, Bruce S; Sklar, Larry A
2015-07-01
Modern flow cytometers can make optical measurements of 10 or more parameters per cell at tens of thousands of cells per second and more than five orders of magnitude dynamic range. Although flow cytometry is used in most drug discovery stages, "sip-and-spit" sampling technology has restricted it to low-sample-throughput applications. The advent of HyperCyt sampling technology has recently made possible primary screening applications in which tens of thousands of compounds are analyzed per day. Target-multiplexing methodologies in combination with extended multiparameter analyses enable profiling of lead candidates early in the discovery process, when the greatest numbers of candidates are available for evaluation. The ability to sample small volumes with negligible waste reduces reagent costs, compound usage, and consumption of cells. Improved compound library formatting strategies can further extend primary screening opportunities when samples are scarce. Dozens of targets have been screened in 384- and 1536-well assay formats, predominantly in academic screening lab settings. In concert with commercial platform evolution and trending drug discovery strategies, HyperCyt-based systems are now finding their way into mainstream screening labs. Recent advances in flow-based imaging, mass spectrometry, and parallel sample processing promise dramatically expanded single-cell profiling capabilities to bolster systems-level approaches to drug discovery. © 2015 Society for Laboratory Automation and Screening.
Rutty, Guy N; Barber, Jade; Amoroso, Jasmin; Morgan, Bruno; Graham, Eleanor A M
2013-12-01
Post-mortem computed tomography angiography (PMCTA) involves the injection of contrast agents. This could both dilute biological fluid samples and affect subsequent post-contrast analytical laboratory processes. We undertook a small sample study of 10 targeted and 10 whole body PMCTA cases to consider whether or not these two methods of PMCTA could affect post-PMCTA cadaver blood based DNA identification. We used standard methodology to examine DNA from blood samples obtained before and after the PMCTA procedure. We illustrate that neither of these PMCTA methods had an effect on the alleles called following short tandem repeat based DNA profiling, nor therefore on the ability to undertake post-PMCTA blood based DNA identification.
NASA Astrophysics Data System (ADS)
Wiśniewska, Paulina; Boqué, Ricard; Borràs, Eva; Busto, Olga; Wardencki, Waldemar; Namieśnik, Jacek; Dymerski, Tomasz
2017-02-01
Headspace mass-spectrometry (HS-MS), mid infrared (MIR) and UV-vis spectroscopy were used to authenticate whisky samples from different origins and ways of production (Irish, Spanish, Bourbon, Tennessee Whisky and Scotch). The collected spectra were processed with partial least-squares discriminant analysis (PLS-DA) to build the classification models. In all cases the five groups of whiskies were distinguished, but the best results were obtained by HS-MS, which indicates that the biggest differences between different types of whisky are due to their aroma. Differences were also found inside groups, showing that not only the raw material but also the way of production is important to discriminate samples. The methodology is quick, easy and does not require sample preparation.
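A minimal PLS-DA of the kind used for these classification models can be sketched with scikit-learn by regressing a one-hot class matrix on the spectra and assigning each test sample to the class with the largest predicted score; the synthetic data below merely stands in for the whisky spectra and is not related to the study's measurements.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(5)
    n_per_class, n_vars = 20, 200
    classes = ["groupA", "groupB", "groupC"]           # stand-ins for whisky types

    # Synthetic 'spectra': each class gets a small shift on its own block of variables.
    X = np.vstack([rng.normal(0, 1, (n_per_class, n_vars))
                   + 0.7 * (np.arange(n_vars) // 70 == i)
                   for i in range(len(classes))])
    y = np.repeat(np.arange(len(classes)), n_per_class)
    Y = np.eye(len(classes))[y]                        # one-hot response for PLS-DA

    X_tr, X_te, Y_tr, Y_te, y_tr, y_te = train_test_split(X, Y, y, random_state=0)
    pls = PLSRegression(n_components=5).fit(X_tr, Y_tr)
    pred = pls.predict(X_te).argmax(axis=1)            # class = column with largest score
    print("test accuracy:", round(float((pred == y_te).mean()), 3))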
NASA Astrophysics Data System (ADS)
Chappell, N. A.; Jones, T.; Young, P.; Krishnaswamy, J.
2015-12-01
There is increasing awareness that under-sampling may have resulted in the omission of important physicochemical information present in water quality signatures of surface waters - thereby affecting interpretation of biogeochemical processes. For dissolved organic carbon (DOC) and nitrogen this under-sampling can now be avoided using UV-visible spectroscopy measured in-situ and continuously at a fine-resolution e.g. 15 minutes ("real time"). Few methods are available to extract biogeochemical process information directly from such high-frequency data. Jones, Chappell & Tych (2014 Environ Sci Technol: 13289-97) developed one such method using optically-derived DOC data based upon a sophisticated time-series modelling tool. Within this presentation we extend the methodology to quantify the minimum sampling interval required to avoid distortion of model structures and parameters that describe fundamental biogeochemical processes. This shifting of parameters which results from under-sampling is called "aliasing". We demonstrate that storm dynamics at a variety of sites dominate over diurnal and seasonal changes and that these must be characterised by sampling that may be sub-hourly to avoid aliasing. This is considerably shorter than that used by other water quality studies examining aliasing (e.g. Kirchner 2005 Phys Rev: 069902). The modelling approach presented is being developed into a generic tool to calculate the minimum sampling for water quality monitoring in systems driven primarily by hydrology. This is illustrated with fine-resolution, optical data from watersheds in temperate Europe through to the humid tropics.
Optimizing cord blood sample cryopreservation.
Harris, David T
2012-03-01
Cord blood (CB) banking is becoming more and more commonplace throughout the medical community, both in the USA and elsewhere. It is now generally recognized that storage of CB samples in multiple aliquots is the preferred approach to banking because it allows the greatest number of uses of the sample. However, it is unclear which are the best methodologies for cryopreservation and storage of the sample aliquots. In the current study we analyzed variables that could affect these processes. CB were processed into mononuclear cells (MNC) and frozen in commercially available human serum albumin (HSA) or autologous CB plasma using cryovials of various sizes and cryobags. The bacteriophage phiX174 was used as a model virus to test for cross-contamination. We observed that cryopreservation of CB in HSA, undiluted autologous human plasma and 50% diluted plasma was equivalent in terms of cell recovery and cell viability. We also found that cryopreservation of CB samples in either cryovials or cryobags displayed equivalent thermal characteristics. Finally, we demonstrated that overwrapping the CB storage container in an impermeable plastic sheathing was sufficient to prevent cross-sample viral contamination during prolonged storage in the liquid phase of liquid nitrogen dewar storage. CB may be cryopreserved in either vials or bags without concern for temperature stability. Sample overwrapping is sufficient to prevent microbiologic contamination of the samples while in liquid-phase liquid nitrogen storage.
Leitner, Simon; Reichenauer, Thomas G; Watzinger, Andrea
2018-02-15
The evaluation of groundwater contaminant (e.g. tetrachloroethene, PCE) degradation processes requires complete quantification and pathway analysis of the contaminant under investigation. For example, the reduction of PCE concentrations in the groundwater by unknown dissolution and/or sorption processes will impede interpretation of the fate and behaviour of such contaminants. In the present study PCE dissolution and sorption processes during anaerobic microbial degradation of chlorinated ethenes were investigated. For this purpose, microcosms were prepared using sediment samples from a PCE-contaminated aquifer, which in previous studies had demonstrated anaerobic organohalide respiration of PCE. Solid/water distribution coefficients (kd) of PCE were determined and validated by loss-on-ignition (LOI) and PCE sorption experiments. The determined kd magnitudes indicated methodological congruency, yielding values for sediment samples within a range of 1.15±0.02 to 5.93±0.34 L·kg-1. The microcosm experiment showed lower PCE concentrations than expected, based on spiked PCE and observed anaerobic microbial degradation processes. Nevertheless the amount of PCE spike added was completely recovered, albeit in the form of lower chlorinated metabolites. A delay due to dissolution processes was not responsible for this phenomenon. Sorption to sediments could only partially explain the reduction of PCE in the water phase. Accordingly, the results point to reversible sorption processes of PCE, possibly onto bacterial cell compartments and/or exopolymeric substances. Copyright © 2017 Elsevier B.V. All rights reserved.
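The solid/water distribution coefficient reported above follows from a standard batch-test mass balance, sketched below; the numerical inputs are illustrative and not taken from the study.

    def batch_kd(c_initial, c_equilibrium, volume_l, sediment_mass_kg):
        """Solid/water distribution coefficient from a batch sorption test:
        kd = (mass sorbed per kg of sediment) / (equilibrium water concentration)."""
        sorbed_mg_per_kg = (c_initial - c_equilibrium) * volume_l / sediment_mass_kg
        return sorbed_mg_per_kg / c_equilibrium        # L/kg

    # Illustrative batch test (concentrations in mg/L; values are not from the study):
    kd = batch_kd(c_initial=1.0, c_equilibrium=0.4, volume_l=0.1, sediment_mass_kg=0.02)
    print(f"kd = {kd:.2f} L/kg")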
Approaches to advance scientific understanding of macrosystems ecology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Levy, Ofir; Ball, Becky; Bond-Lamberty, Benjamin
Macrosystem ecological studies inherently investigate processes that interact across multiple spatial and temporal scales, requiring intensive sampling and massive amounts of data from diverse sources to incorporate complex cross-scale and hierarchical interactions. Inherent challenges associated with these characteristics include high computational demands, data standardization and assimilation, identification of important processes and scales without prior knowledge, and the need for large, cross-disciplinary research teams that conduct long-term studies. Therefore, macrosystem ecology studies must utilize a unique set of approaches that are capable of encompassing these methodological characteristics and associated challenges. Several case studies demonstrate innovative methods used in current macrosystem ecology studies.
Validation of a sampling plan to generate food composition data.
Sammán, N C; Gimenez, M A; Bassett, N; Lobo, M O; Marcoleri, M E
2016-02-15
A methodology to develop systematic plans for food sampling was proposed. Long-life whole and skimmed milk and sunflower oil were selected to validate the methodology in Argentina. Fatty acid profiles in all foods, proximate composition, and calcium content in milk were determined with AOAC methods. The number of samples (n) was calculated applying Cochran's formula with variation coefficients ⩽12% and a maximum permissible estimation error (r) ⩽5% for calcium content in milks and unsaturated fatty acids in oil. The calculated n was 9, 11 and 21 for long-life whole milk, skimmed milk and sunflower oil, respectively. Sample units were randomly collected from production sites and sent to labs. The r calculated with experimental data was ⩽10%, indicating high accuracy in the determination of the analyte content with the greatest variability and the reliability of the proposed sampling plan. The methodology is an adequate and useful tool to develop sampling plans for food composition analysis. Copyright © 2015 Elsevier Ltd. All rights reserved.
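A common Cochran-style calculation for estimating a mean with a given relative error when the coefficient of variation is known is sketched below; the exact variant and inputs used by the authors may differ, so the numbers produced here are only indicative of the order of magnitude of n.

    import math

    def cochran_n(cv, rel_error, z=1.96, population=None):
        """Sample size for estimating a mean to within a relative error 'rel_error'
        when the coefficient of variation is 'cv' (both expressed as fractions)."""
        n0 = (z * cv / rel_error) ** 2
        if population is not None:                     # finite population correction
            n0 = n0 / (1 + (n0 - 1) / population)
        return math.ceil(n0)

    # With CV = 12% and a maximum relative error of 5% (the thresholds quoted above):
    print(cochran_n(cv=0.12, rel_error=0.05))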
Mhaskar, Rahul; Djulbegovic, Benjamin; Magazin, Anja; Soares, Heloisa P.; Kumar, Ambuj
2011-01-01
Objectives: To assess whether the reported methodological quality of randomized controlled trials (RCTs) reflects the actual methodological quality, and to evaluate the association of effect size (ES) and sample size with methodological quality. Study design: Systematic review. Setting: Retrospective analysis of all consecutive phase III RCTs published by 8 National Cancer Institute Cooperative Groups until year 2006. Data were extracted from protocols (actual quality) and publications (reported quality) for each study. Results: 429 RCTs met the inclusion criteria. Overall reporting of methodological quality was poor and did not reflect the actual high methodological quality of RCTs. The results showed no association between sample size and actual methodological quality of a trial. Poor reporting of allocation concealment and blinding exaggerated the ES by 6% (ratio of hazard ratio [RHR]: 0.94, 95%CI: 0.88, 0.99) and 24% (RHR: 1.24, 95%CI: 1.05, 1.43), respectively. However, actual quality assessment showed no association between ES and methodological quality. Conclusion: The largest study to date shows that poor quality of reporting does not reflect the actual high methodological quality. Assessment of the impact of quality on the ES based on reported quality can produce misleading results. PMID:22424985
Robust Learning Control Design for Quantum Unitary Transformations.
Wu, Chengzhi; Qi, Bo; Chen, Chunlin; Dong, Daoyi
2017-12-01
Robust control design for quantum unitary transformations has been recognized as a fundamental and challenging task in the development of quantum information processing due to unavoidable decoherence or operational errors in the experimental implementation of quantum operations. In this paper, we extend the systematic methodology of sampling-based learning control (SLC) approach with a gradient flow algorithm for the design of robust quantum unitary transformations. The SLC approach first uses a "training" process to find an optimal control strategy robust against certain ranges of uncertainties. Then a number of randomly selected samples are tested and the performance is evaluated according to their average fidelity. The approach is applied to three typical examples of robust quantum transformation problems including robust quantum transformations in a three-level quantum system, in a superconducting quantum circuit, and in a spin chain system. Numerical results demonstrate the effectiveness of the SLC approach and show its potential applications in various implementation of quantum unitary transformations.
NASA Astrophysics Data System (ADS)
Zabolotna, Natalia I.; Dovhaliuk, Rostyslav Y.
2013-09-01
We present a novel measurement method for optic axis orientation distributions which uses a relatively simple measurement setup. The principal difference of our method from other well-known methods lies in the direct approach for measuring the orientation of the optical axis of polycrystalline networks of biological crystals. Our test polarimetry setup consists of a HeNe laser, a quarter wave plate, two linear polarizers and a CCD camera. We also propose a methodology for processing the measured optic axis orientation distribution which consists of evaluation of statistical, correlational and spectral moments. Such processing of the obtained data can be used to classify a particular tissue sample as "healthy" or "pathological". For our experiment we use thin histological sections of normal and muscular dystrophy tissue. It is shown that the difference between the values of these moments for normal and pathological samples can be quite noticeable, with a relative difference of up to 6.26.
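The statistical, correlational and spectral moments mentioned above can be computed from a two-dimensional optic-axis orientation map along the following lines; the sketch uses a synthetic random map rather than polarimetry data, and the particular moment definitions are illustrative assumptions.

    import numpy as np

    def statistical_moments(theta):
        m = theta.mean()
        c = theta - m
        var = np.mean(c**2)
        skew = np.mean(c**3) / var**1.5
        kurt = np.mean(c**4) / var**2
        return m, var, skew, kurt

    def correlation_moments(theta):
        c = theta - theta.mean()
        # Circular 2-D autocovariance via the Wiener-Khinchin relation.
        acf = np.fft.ifft2(np.abs(np.fft.fft2(c))**2).real / c.size
        return acf[0, 0], acf.std()

    def spectral_moments(theta):
        # Assumes a square map; moments of the normalised 2-D power spectrum.
        f = np.fft.fftfreq(theta.shape[0])
        fy, fx = np.meshgrid(f, f, indexing="ij")
        freq = np.hypot(fy, fx)
        power = np.abs(np.fft.fft2(theta - theta.mean()))**2
        power /= power.sum()
        m1 = np.sum(freq * power)                      # mean spatial frequency
        m2 = np.sum((freq - m1)**2 * power)            # spread of the spectrum
        return m1, m2

    rng = np.random.default_rng(6)
    theta = rng.uniform(0, np.pi, size=(128, 128))     # synthetic orientation map (rad)
    print("statistical :", np.round(statistical_moments(theta), 4))
    print("correlation :", np.round(correlation_moments(theta), 4))
    print("spectral    :", np.round(spectral_moments(theta), 4))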
Tucker, Jalie A; Simpson, Cathy A; Chandler, Susan D; Borch, Casey A; Davies, Susan L; Kerbawy, Shatomi J; Lewis, Terri H; Crawford, M Scott; Cheong, JeeWon; Michael, Max
2016-01-01
Emerging adulthood often entails heightened risk-taking with potential life-long consequences, and research on risk behaviors is needed to guide prevention programming, particularly in under-served and difficult to reach populations. This study evaluated the utility of Respondent Driven Sampling (RDS), a peer-driven methodology that corrects limitations of snowball sampling, to reach at-risk African American emerging adults from disadvantaged urban communities. Initial "seed" participants from the target group recruited peers, who then recruited their peers in an iterative process (110 males, 234 females; M age = 18.86 years). Structured field interviews assessed common health risk factors, including substance use, overweight/obesity, and sexual behaviors. Established gender-and age-related associations with risk factors were replicated, and sample risk profiles and prevalence estimates compared favorably with matched samples from representative U.S. national surveys. Findings supported the use of RDS as a sampling method and grassroots platform for research and prevention with community-dwelling risk groups.
Hosia, Aino; Falkenhaug, Tone; Baxter, Emily J; Pagès, Francesc
2017-01-01
The diversity and distribution of gelatinous zooplankton were investigated along the northern Mid-Atlantic Ridge (MAR) from June to August 2004. Here, we present results from macrozooplankton trawl sampling, as well as comparisons made between five different methodologies that were employed during the MAR-ECO survey. In total, 16 species of hydromedusae, 31 species of siphonophores and four species of scyphozoans were identified to species level from macrozooplankton trawl samples. Additional taxa were identified to higher taxonomic levels and a single ctenophore genus was observed. Samples were collected at 17 stations along the MAR between the Azores and Iceland. A divergence in the species assemblages was observed at the southern limit of the Subpolar Frontal Zone. The catch composition of gelatinous zooplankton is compared between different sampling methodologies including: a macrozooplankton trawl; a Multinet; a ringnet attached to bottom trawl; and optical platforms (Underwater Video Profiler (UVP) & Remotely Operated Vehicle (ROV)). Different sampling methodologies are shown to exhibit selectivity towards different groups of gelatinous zooplankton. Only ~21% of taxa caught during the survey were caught by both the macrozooplankton trawl and the Multinet when deployed at the same station. The estimates of gelatinous zooplankton abundance calculated using these two gear types also varied widely: 1.4 ± 0.9 individuals 1000 m-3 (mean ± s.d.) estimated by the macrozooplankton trawl vs. 468.3 ± 315.4 individuals 1000 m-3 estimated by the Multinet when used at the same stations (n = 6). While it appears that traditional net sampling can generate useful data on pelagic cnidarians, comparisons with results from the optical platforms suggest that ctenophore diversity and abundance are consistently underestimated, particularly when net sampling is conducted in combination with formalin fixation. The results emphasise the importance of considering sampling methodology both when planning surveys, as well as when interpreting existing data.
Sánchez-Ribas, Jordi; Oliveira-Ferreira, Joseli; Rosa-Freitas, Maria Goreti; Trilla, Lluís; Silva-do-Nascimento, Teresa Fernandes
2015-09-01
Here we present the first in a series of articles about the ecology of immature stages of anophelines in the Brazilian Yanomami area. We propose a new larval habitat classification and a new larval sampling methodology. We also report some preliminary results illustrating the applicability of the methodology based on data collected in the Brazilian Amazon rainforest in a longitudinal study of two remote Yanomami communities, Parafuri and Toototobi. In these areas, we mapped and classified 112 natural breeding habitats located in low-order river systems based on their association with river flood pulses, seasonality and exposure to sun. Our classification rendered seven types of larval habitats: lakes associated with the river, which are subdivided into oxbow lakes and nonoxbow lakes, flooded areas associated with the river, flooded areas not associated with the river, rainfall pools, small forest streams, medium forest streams and rivers. The methodology for larval sampling was based on the accurate quantification of the effective breeding area, taking into account the area of the perimeter and subtypes of microenvironments present per larval habitat type using a laser range finder and a small portable inflatable boat. The new classification and new sampling methodology proposed herein may be useful in vector control programs.
Kahl, Johannes; Bodroza-Solarov, Marija; Busscher, Nicolaas; Hajslova, Jana; Kneifel, Wolfgang; Kokornaczyk, Maria Olga; van Ruth, Saskia; Schulzova, Vera; Stolz, Peter
2014-10-01
Organic food quality determination needs multi-dimensional evaluation tools. The main focus is on authentication as an analytical verification of the certification process. New fingerprinting approaches such as ultra-performance liquid chromatography-mass spectrometry, gas chromatography-mass spectrometry, direct analysis in real time-high-resolution mass spectrometry, as well as crystallization with and without the presence of additives, seem to be promising methods in terms of time of analysis and detection of organic system-related parameters. For further methodological development, a system approach is recommended, which also takes into account food structure aspects. Furthermore, the authentication of processed organic samples needs more attention, since most organic food is complex and processed. © 2013 Society of Chemical Industry.
Gonzalez, Edurne; Tollan, Christopher; Chuvilin, Andrey; Barandiaran, Maria J; Paulis, Maria
2012-08-01
A new methodology for quantitative characterization of the coalescence process of waterborne polymer dispersion (latex) particles by environmental scanning electron microscopy (ESEM) is proposed. The experimental setup has been developed to provide reproducible latex monolayer depositions, optimized contrast of the latex particles, and a reliable readout of the sample temperature. Quantification of the coalescence process under dry conditions has been performed by image processing based on evaluation of the image autocorrelation function. As a proof of concept the coalescence of two latexes with known and differing glass transition temperatures has been measured. It has been shown that a reproducibility of better than 1.5 °C can be obtained for the measurement of the coalescence temperature.
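The autocorrelation-based quantification described above lends itself to a compact numerical illustration. The following sketch is not the authors' implementation: the function names, the FFT-based estimator and the half-height correlation-length criterion are assumptions made here for illustration only. It computes a normalized 2D image autocorrelation and tracks a characteristic correlation length, which would be expected to change as latex particles lose contrast during coalescence.

```python
import numpy as np

def autocorrelation_2d(image):
    """Normalized 2D autocorrelation of an ESEM frame via FFT (Wiener-Khinchin)."""
    img = image - image.mean()
    f = np.fft.fft2(img)
    acf = np.fft.ifft2(f * np.conj(f)).real
    acf = np.fft.fftshift(acf)
    return acf / acf.max()

def correlation_length(acf, threshold=0.5):
    """Radius (in pixels) at which the radially averaged ACF first drops below threshold.
    As particles coalesce and lose contrast, this length is expected to grow."""
    cy, cx = np.array(acf.shape) // 2
    y, x = np.indices(acf.shape)
    r = np.hypot(y - cy, x - cx).astype(int)
    counts = np.bincount(r.ravel())
    sums = np.bincount(r.ravel(), weights=acf.ravel())
    radial = sums / np.maximum(counts, 1)
    below = np.where(radial < threshold)[0]
    return int(below[0]) if below.size else len(radial)

# usage idea: track the correlation length frame by frame while ramping the stage temperature
# frames = [frame_at_T1, frame_at_T2, ...]   # hypothetical 2D numpy arrays from the ESEM
# lengths = [correlation_length(autocorrelation_2d(f)) for f in frames]
```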
Primary care research conducted in networks: getting down to business.
Mold, James W
2012-01-01
This seventh annual practice-based research theme issue of the Journal of the American Board of Family Medicine highlights primary care research conducted in practice-based research networks (PBRNs). The issue includes discussion of (1) theoretical and methodological research, (2) health care research (studies addressing primary care processes), (3) clinical research (studies addressing the impact of primary care on patients), and (4) health systems research (studies of health system issues impacting primary care including the quality improvement process). We had a noticeable increase in submissions from PBRN collaborations, that is, studies that involved multiple networks. As PBRNs cooperate to recruit larger and more diverse patient samples, greater generalizability and applicability of findings lead to improved primary care processes.
Comparative study of submerged and surface culture acetification process for orange vinegar.
Cejudo-Bastante, Cristina; Durán-Guerrero, Enrique; García-Barroso, Carmelo; Castro-Mejías, Remedios
2018-02-01
The two main acetification methodologies generally employed in the production of vinegar (surface and submerged cultures) were studied and compared for the production of orange vinegar. Polyphenols (UPLC/DAD) and volatile compounds (SBSE-GC/MS) were considered as the main variables in the comparative study. Sensory characteristics of the obtained vinegars were also evaluated. Seventeen polyphenols and 24 volatile compounds were determined in the samples during both acetification processes. For phenolic compounds, analysis of variance showed significantly higher concentrations when surface culture acetification was employed. However, for the majority of volatile compounds higher contents were observed for the submerged culture acetification process, which was also reflected in the sensory analysis, with higher scores for the different descriptors. Multivariate statistical analysis such as principal component analysis demonstrated the possibility of discriminating the samples according to the type of acetification process. Polyphenols such as an apigenin derivative or ferulic acid, and volatile compounds such as 4-vinylguaiacol, decanoic acid, nootkatone, trans-geraniol, β-citronellol or α-terpineol, among others, were the compounds that contributed most to the discrimination of the samples. The acetification process employed in the production of orange vinegar has been demonstrated to be very significant for the final characteristics of the vinegar obtained, so it must be carefully controlled to obtain high-quality products. © 2017 Society of Chemical Industry.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilzbach, K. E.; Stetter, J. R.; Reilly, Jr., C. A.
1982-02-01
A collaborative environmental research program to provide information needed to assess the health and environmental effects associated with large-scale coal gasification technology is being conducted by Argonne National Laboratory (ANL) and the Grand Forks Energy Technology Center (GFETC). The objectives are to: investigate the toxicology and chemical composition of coal gasification by-products as a function of process variables and coal feed; compare the characteristics of isokinetic side-stream samples with those of process stream samples; identify the types of compounds responsible for toxicity; evaluate the chemical and toxicological effectiveness of various wastewater treatment operations; refine methodology for the collection and measurement of organic vapors and particulates in workplace air; and obtain preliminary data on workplace air quality. So far the toxicities of a set of process stream samples (tar, oil, and gas liquor) and side-stream condensates from the GFETC gasifier have been measured in a battery of cellular screening tests for mutagenicity and cytotoxicity. Preliminary data on the effects of acute and chronic exposures of laboratory animals to process tar have been obtained. The process tar has been chemically fractionated and the distribution of mutagenicity and compound types among the fractions has been determined. Organic vapors and particulates collected at various times and locations in the gasifier building have been characterized.
Generating or developing grounded theory: methods to understand health and illness.
Woods, Phillip; Gapp, Rod; King, Michelle A
2016-06-01
Grounded theory is a qualitative research methodology that aims to explain social phenomena, e.g. why particular motivations or patterns of behaviour occur, at a conceptual level. Developed in the 1960s by Glaser and Strauss, the methodology has been reinterpreted by Strauss and Corbin in more recent times, resulting in different schools of thought. Differences arise from different philosophical perspectives concerning knowledge (epistemology) and the nature of reality (ontology), demanding that researchers make clear theoretical choices at the commencement of their research when choosing this methodology. Compared to other qualitative methods, it has the ability to achieve an understanding of a social phenomenon rather than simply describing it. Achieving understanding, however, requires theoretical sampling to choose interviewees that can contribute most to the research and understanding of the phenomenon, and constant comparison of interviews to evaluate the same event or process in different settings or situations. Sampling continues until conceptual saturation is reached, i.e. when no new concepts emerge from the data. Data analysis focusses on categorising data (finding the main elements of what is occurring and why), and describing those categories in terms of properties (conceptual characteristics that define the category and give meaning) and dimensions (the variations within properties which produce specificity and range). Ultimately a core category which theoretically explains how all other categories are linked together is developed from the data. While achieving theoretical abstraction, the core category should be logical and capture all of the variation within the data. Theory development requires understanding of the methodology, not just working through a set of procedures. This article provides a basic overview, set in the literature surrounding grounded theory, for those wanting to increase their understanding and the quality of their research output.
77 FR 7109 - Establishment of User Fees for Filovirus Testing of Nonhuman Primate Liver Samples
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-10
... assay (ELISA) or other appropriate methodology. Each specimen will be held for six months. After six... loss of the only commercially available antigen-detection ELISA filovirus testing facility. Currently... current methodology (ELISA) used to test NHP liver samples. This cost determines the amount of the user...
Petraco, Ricardo; Dehbi, Hakim-Moulay; Howard, James P; Shun-Shin, Matthew J; Sen, Sayan; Nijjer, Sukhjinder S; Mayet, Jamil; Davies, Justin E; Francis, Darrel P
2018-01-01
Diagnostic accuracy is widely accepted by researchers and clinicians as an optimal expression of a test's performance. The aim of this study was to evaluate the effects of disease severity distribution on values of diagnostic accuracy as well as propose a sample-independent methodology to calculate and display accuracy of diagnostic tests. We evaluated the diagnostic relationship between two hypothetical methods to measure serum cholesterol (Chol_rapid and Chol_gold) by generating samples with statistical software and (1) keeping the numerical relationship between methods unchanged and (2) changing the distribution of cholesterol values. Metrics of categorical agreement were calculated (accuracy, sensitivity and specificity). Finally, a novel methodology to display and calculate accuracy values was presented (the V-plot of accuracies). No single value of diagnostic accuracy can be used to describe the relationship between tests, as accuracy is a metric heavily affected by the underlying sample distribution. Our novel proposed methodology, the V-plot of accuracies, can be used as a sample-independent measure of a test's performance against a reference gold standard.
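The dependence of accuracy on the underlying sample distribution is straightforward to reproduce numerically. The sketch below is not the authors' V-plot code; it simply simulates a fixed "rapid vs. gold-standard" relationship (hypothetical names, cutoff and noise level) and shows the accuracy figure moving when only the distribution of cholesterol values in the sample changes.

```python
import numpy as np

rng = np.random.default_rng(0)

def accuracy(gold, rapid, cutoff=200.0):
    """Fraction of subjects classified the same way (above/below cutoff) by both methods."""
    return np.mean((gold > cutoff) == (rapid > cutoff))

def simulate(mean, sd, n=10_000, noise_sd=15.0):
    """Gold-standard cholesterol values plus a fixed, unchanged measurement relationship."""
    gold = rng.normal(mean, sd, n)
    rapid = gold + rng.normal(0.0, noise_sd, n)   # same numerical relationship in every sample
    return gold, rapid

# Same test, same noise; only the sample distribution (distance from the cutoff) changes.
for mean, sd in [(200, 10), (200, 40), (240, 40)]:
    g, r = simulate(mean, sd)
    print(f"mean={mean}, sd={sd}: accuracy={accuracy(g, r):.3f}")
```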
Jesús, Florencia; Hladki, Ricardo; Gérez, Natalia; Besil, Natalia; Niell, Silvina; Fernández, Grisel; Heinzen, Horacio; Cesio, María Verónica
2018-02-01
The impacts of the modern, agrochemical-based agriculture that threatens overall system sustainability need to be monitored and evaluated. In the search for agroecosystem monitors, the present article focuses on the occurrence and abundance of aquatic macroinvertebrates, which have frequently been used as bioindicators of water quality due to their relationship with land use. Some of these organisms are at the top of the food chain, where bioaccumulation and biomagnification processes can be observed, and they can become secondary pollution sources for these systems and for terrestrial organisms as well. Odonate nymphs, which belong to the functional group of predators, were selected for this study. A methodology to determine 73 pesticide residues in odonate nymphs by LC-MS/MS and GC-MS/MS was developed. A QuEChERS sample preparation strategy was adapted. As it is complex to obtain samples, especially in disturbed ecosystems, the method was minimized to a sample size of 200 mg of fresh nymphs. The method was validated and good recoveries (71-120%) with RSDs below 20% for the majority of the studied pesticides were obtained at least at two of the assayed levels (1, 10 and 50 µg kg-1). For 32 analytes the limit of quantitation was 1 µg kg-1, and 10 µg kg-1 for the others. The linear range was observed between 1-100 µg kg-1 in matrix-matched and solvent calibration curves for most of the assessed pesticides. LC-MS/MS matrix effects were evaluated; 40% of the analytes presented low or no signal suppression. Only flufenoxuron presented high matrix effects. The obtained methodology is adequate for pesticide multiresidue analysis in aquatic macroinvertebrates (odonates), aiming to contribute to the ecological state evaluation of freshwater ecosystems. Copyright © 2017 Elsevier B.V. All rights reserved.
Zhang, Fengrui; Adeola, Olayiwola
2017-12-01
Sound feed formulation depends on precise evaluation of the energy and nutrient values of feed ingredients. Hence, the methodology used to determine the digestibility of energy and nutrients in feedstuffs should be chosen carefully before conducting experiments. The direct and difference procedures are widely used to determine the digestibility of energy and nutrients in feedstuffs. The direct procedure is normally considered when the test feedstuff can be formulated as the sole source of the component of interest in the test diet. However, in cases where test ingredients can only be formulated to replace a portion of the basal diet to provide the component of interest, the difference procedure can be applied to obtain equally robust values. Depending on the component of interest, ileal digesta or feces can be collected, and different sample collection processes can be used. For example, for amino acids (AA), to avoid the interference of fermentation in the hindgut, ileal digesta samples are collected to determine ileal digestibility, and the simple T-cannula and the index method are commonly used techniques for AA digestibility analysis. For energy, phosphorus, and calcium, fecal samples are normally collected to determine total tract digestibility, and therefore the total collection method is recommended to obtain more accurate estimates. Concerns with the use of apparent digestibility values include different estimated values at different inclusion levels and non-additivity in mixtures of feed ingredients. These concerns can be overcome by using standardized digestibility, or true digestibility, obtained by correcting apparent digestibility values for endogenous losses of the components. In this review, methodologies used to determine energy and nutrient digestibility in pigs are discussed. It is suggested that the methodology should be carefully selected based on the component of interest, the feed ingredients, and the available experimental facilities.
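As a worked illustration of the correction described in the review's final point, apparent digestibility can be computed from intake and ileal (or fecal) output, and a standardized value obtained by crediting back an assumed basal endogenous loss. The numbers and function names below are hypothetical and serve only to make the arithmetic explicit.

```python
def apparent_digestibility(intake_g, output_g):
    """Apparent digestibility coefficient: fraction of the ingested component not recovered."""
    return (intake_g - output_g) / intake_g

def standardized_digestibility(intake_g, output_g, basal_endogenous_g):
    """Correct apparent digestibility for basal endogenous losses (e.g. endogenous AA at the ileum)."""
    return (intake_g - (output_g - basal_endogenous_g)) / intake_g

# hypothetical example: 20 g lysine intake, 3 g recovered at the ileum, 0.5 g basal endogenous loss
aid = apparent_digestibility(20.0, 3.0)           # 0.850
sid = standardized_digestibility(20.0, 3.0, 0.5)  # 0.875
print(f"AID = {aid:.3f}, SID = {sid:.3f}")
```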
Heller, Melina; Vitali, Luciano; Oliveira, Marcone Augusto Leal; Costa, Ana Carolina O; Micke, Gustavo Amadeu
2011-07-13
The present study aimed to develop a methodology using capillary electrophoresis for the determination of sinapaldehyde, syringaldehyde, coniferaldehyde, and vanillin in whiskey samples. The main objective was to obtain a screening method to differentiate authentic samples from seized samples suspected of being counterfeit, using the phenolic aldehydes as chemical markers. The optimized background electrolyte was composed of 20 mmol L-1 sodium tetraborate with 10% MeOH at pH 9.3. The study examined two kinds of sample stacking, using a long-end injection mode: normal sample stacking (NSM) and sample stacking with matrix removal (SWMR). In SWMR, the optimized injection time of the samples was 42 s (SWMR42); at this time, no matrix effects were observed. Values of r were >0.99 for both methods. The LOD and LOQ were better than 100 and 330 mg mL-1 for NSM and better than 22 and 73 mg L-1 for SWMR. The reliability of CE-UV for aldehyde analysis in real samples was compared statistically with an LC-MS/MS methodology, and no significant differences were found between the methodologies at a 95% confidence interval.
Kinetics and thermodynamics of gas diffusion in a NiFe hydrogenase.
Topin, Jérémie; Rousset, Marc; Antonczak, Serge; Golebiowski, Jérôme
2012-03-01
We have investigated O₂ and H₂ transport across a NiFe hydrogenase at the atomic scale by means of computational methods. The wild-type protein has been compared with the V74Q mutant. Two distinct methodologies have been applied to study gas access to the active site. Temperature locally enhanced sampling simulations have emphasized the importance of protein dynamics on gas diffusion. The O₂ diffusion free energy profiles, obtained by umbrella sampling, are in agreement with the known kinetic data and show that in the V74Q mutant the inhibition process is lowered from both a kinetic and a thermodynamic point of view. Copyright © 2011 Wiley Periodicals, Inc.
Modeling Amorphous Microporous Polymers for CO2 Capture and Separations.
Kupgan, Grit; Abbott, Lauren J; Hart, Kyle E; Colina, Coray M
2018-06-13
This review concentrates on the advances of atomistic molecular simulations to design and evaluate amorphous microporous polymeric materials for CO2 capture and separations. A description of atomistic molecular simulations is provided, including simulation techniques, structural generation approaches, relaxation and equilibration methodologies, and considerations needed for validation of simulated samples. The review provides general guidelines and a comprehensive update of the recent literature (since 2007) to promote the acceleration of the discovery and screening of amorphous microporous polymers for CO2 capture and separation processes.
NASA Technical Reports Server (NTRS)
Buntine, Wray L.
1995-01-01
Intelligent systems require software incorporating probabilistic reasoning, and often learning as well. Networks provide a framework and methodology for creating this kind of software. This paper introduces network models based on chain graphs with deterministic nodes. Chain graphs are defined as a hierarchical combination of Bayesian and Markov networks. To model learning, plates on chain graphs are introduced to model independent samples. The paper concludes by discussing various operations that can be performed on chain graphs with plates, either as a simplification process or to generate learning algorithms.
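A minimal, dictionary-based sketch of the kind of object being described, i.e. a chain graph whose directed part is Bayesian, whose undirected part is Markov, with a deterministic node and a plate over N independent samples, is given below. The encoding and node names are assumptions for illustration only; they do not reproduce the paper's formal notation.

```python
# A toy, dictionary-based encoding of a chain graph with a plate over N i.i.d. samples.
# Node roles, edge lists and the plate are illustrative; this is not Buntine's notation.
chain_graph = {
    "nodes": {
        "theta": {"kind": "stochastic"},      # global parameter
        "x":     {"kind": "stochastic"},      # observed input, replicated per sample
        "h":     {"kind": "deterministic"},   # deterministic node, e.g. h = f(x, theta)
        "y":     {"kind": "stochastic"},      # observed output, replicated per sample
    },
    "directed_edges":   [("theta", "h"), ("x", "h"), ("h", "y")],  # Bayesian-network part
    "undirected_edges": [("x", "y")],                              # Markov-network part
    "plates": [{"over": ["x", "h", "y"], "size": "N"}],            # N independent samples
}

def parents(graph, node):
    """Directed parents of a node (used when compiling learning algorithms from the graph)."""
    return [u for (u, v) in graph["directed_edges"] if v == node]

print(parents(chain_graph, "h"))   # ['theta', 'x']
```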
2010-01-01
Background Histologic samples all funnel through the H&E microtomy staining area. Here, manual processes intersect with semi-automated processes, creating a bottleneck. We compare alternate work processes in anatomic pathology, primarily in the H&E staining work cell. Methods We established a baseline measure of H&E process impact on personnel, information management and sample flow from historical workload and production data and direct observation. We compared this to performance after implementing initial Lean process modifications, including workstation reorganization, equipment relocation and workflow levelling, and the Ventana Symphony stainer, to assess the impact on productivity in the H&E staining work cell. Results Average time from gross station to assembled case decreased by 2.9 hours (12%). Total process turnaround time (TAT), exclusive of processor schedule changes, decreased 48 minutes/case (4%). Mean quarterly productivity increased 8.5% with the new methods. Process redesign reduced the number of manual steps from 219 to 182, a 17% reduction. Specimen travel distance was reduced from 773 ft/case to 395 ft/case (49%) overall, and from 92 to 53 ft/case in the H&E cell (42% improvement). Conclusions Implementation of Lean methods in the H&E work cell of histology can result in improved productivity, improved throughput and better case availability parameters, including TAT. PMID:20181123
Scott, Anna Mae; Hofmann, Björn; Gutiérrez-Ibarluzea, Iñaki; Bakke Lysdahl, Kristin; Sandman, Lars; Bombard, Yvonne
2017-01-01
Introduction: Assessment of ethics issues is an important part of health technology assessments (HTA). However, in terms of the existence of quality assessment tools, ethics for HTA is methodologically underdeveloped in comparison to other areas of HTA, such as clinical or cost effectiveness. Objective: To methodologically advance ethics for HTA by: (1) proposing and elaborating Q-SEA, the first instrument for quality assessment of ethics analyses, and (2) applying Q-SEA to a sample systematic review of ethics for HTA, in order to illustrate and facilitate its use. Methods: To develop a list of items for the Q-SEA instrument, we systematically reviewed the literature on methodology in ethics for HTA, reviewed HTA organizations’ websites, and solicited views from 32 experts in the field of ethics for HTA at two 2-day workshops. We subsequently refined Q-SEA through its application to an ethics analysis conducted for HTA. Results: The Q-SEA instrument consists of two domains – the process domain and the output domain. The process domain consists of 5 elements: research question, literature search, inclusion/exclusion criteria, perspective, and ethics framework. The output domain consists of 5 elements: completeness, bias, implications, conceptual clarification, and conflicting values. Conclusion: Q-SEA is the first instrument for quality assessment of ethics analyses in HTA. Further refinements to the instrument to enhance its usability continue. PMID:28326147
Mehl, Matthias R.; Robbins, Megan L.; Deters, Fenne große
2012-01-01
This article introduces a novel, observational ambulatory monitoring method called the Electronically Activated Recorder or EAR. The EAR is a digital audio recorder that runs on a handheld computer and periodically and unobtrusively records snippets of ambient sounds from participants’ momentary environments. In tracking moment-to-moment ambient sounds, it yields acoustic logs of people’s days as they naturally unfold. In sampling only a fraction of the time, it protects participants’ privacy and makes large observational studies feasible. As a naturalistic observation method, it provides an observer’s account of daily life and is optimized for the objective assessment of audible aspects of social environments, behaviors, and interactions (e.g., habitual preferences for social settings, idiosyncratic interaction styles, and subtle emotional expressions). The article discusses the EAR method conceptually and methodologically, reviews prior research with it, and identifies three concrete ways in which it can enrich psychosomatic research. Specifically, it can (a) calibrate psychosocial effects on health against frequencies of real-world behavior, (b) provide ecological, observational measures of health-related social processes that are independent of self-report, and (c) help with the assessment of subtle and habitual social behaviors that evade self-report but have important health implications. An important avenue for future research lies in merging traditional, self-report based ambulatory monitoring methods with observational approaches such as the EAR to allow for the simultaneous yet methodologically independent assessment of inner, experiential (e.g., loneliness) and outer, observable aspects (e.g., social isolation) of real-world social processes to reveal their unique effects on health. PMID:22582338
An evaluation of total starch and starch gelatinization methodologies in pelleted animal feed.
Zhu, L; Jones, C; Guo, Q; Lewis, L; Stark, C R; Alavi, S
2016-04-01
The quantification of total starch content (TS) or degree of starch gelatinization (DG) in animal feed is always challenging because of the potential interference from other ingredients. In this study, the differences in TS or DG measurement in pelleted swine feed due to variations in analytical methodology were quantified. Pelleted swine feed was used to create 6 different diets manufactured with various processing conditions in a 2 × 3 factorial design (2 conditioning temperatures, 77 or 88°C, and 3 conditioning retention times, 15, 30, or 60 s). Samples at each processing stage (cold mash, hot mash, hot pelletized feed, and final cooled pelletized feed) were collected for each of the 6 treatments and analyzed for TS and DG. Two different methodologies were evaluated for TS determination (the AOAC International method 996.11 vs. the modified glucoamylase method) and DG determination (the modified glucoamylase method vs. differential scanning calorimetry [DSC]). For TS determination, the AOAC International method 996.11 measured lower TS values in cold pellets compared with the modified glucoamylase method. The AOAC International method resulted in lower TS in cold mash than cooled pelletized feed, whereas the modified glucoamylase method showed no significant differences in TS content before or after pelleting. For DG, the modified glucoamylase method demonstrated increased DG with each processing step. Furthermore, increasing the conditioning temperature and time resulted in a greater DG when evaluated by the modified glucoamylase method. However, results demonstrated that DSC is not suitable as a quantitative tool for determining DG in multicomponent animal feeds due to interferences from nonstarch transformations, such as protein denaturation.
Mehl, Matthias R; Robbins, Megan L; Deters, Fenne Große
2012-05-01
This article introduces a novel observational ambulatory monitoring method called the electronically activated recorder (EAR). The EAR is a digital audio recorder that runs on a handheld computer and periodically and unobtrusively records snippets of ambient sounds from participants' momentary environments. In tracking moment-to-moment ambient sounds, it yields acoustic logs of people's days as they naturally unfold. In sampling only a fraction of the time, it protects participants' privacy and makes large observational studies feasible. As a naturalistic observation method, it provides an observer's account of daily life and is optimized for the objective assessment of audible aspects of social environments, behaviors, and interactions (e.g., habitual preferences for social settings, idiosyncratic interaction styles, subtle emotional expressions). This article discusses the EAR method conceptually and methodologically, reviews prior research with it, and identifies three concrete ways in which it can enrich psychosomatic research. Specifically, it can (a) calibrate psychosocial effects on health against frequencies of real-world behavior; (b) provide ecological observational measures of health-related social processes that are independent of self-report; and (c) help with the assessment of subtle and habitual social behaviors that evade self-report but have important health implications. An important avenue for future research lies in merging traditional self-report-based ambulatory monitoring methods with observational approaches such as the EAR to allow for the simultaneous yet methodologically independent assessment of inner, experiential aspects (e.g., loneliness) and outer, observable aspects (e.g., social isolation) of real-world social processes to reveal their unique effects on health.
Concurrent analysis: towards generalisable qualitative research.
Snowden, Austyn; Martin, Colin R
2011-10-01
This study develops an original method of qualitative analysis coherent with its interpretivist principles. The objective is to increase the likelihood of achieving generalisability and so improve the chance of the findings being translated into practice. Good qualitative research depends on coherent analysis of different types of data. The limitations of existing methodologies are first discussed to justify the need for a novel approach. To illustrate this approach, primary evidence is presented using the new methodology. The primary evidence consists of a constructivist grounded theory of how mental health nurses with prescribing authority integrate prescribing into practice. This theory is built concurrently from interviews, reflective accounts and case study data from the literature. Concurrent analysis. Ten research articles and 13 semi-structured interviews were sampled purposively and then theoretically and analysed concurrently using constructivist grounded theory. A theory of the process of becoming competent in mental health nurse prescribing was generated through this process. This theory was validated by 32 practising mental health nurse prescribers as an accurate representation of their experience. The methodology generated a coherent and generalisable theory. It is therefore claimed that concurrent analysis engenders consistent and iterative treatment of different sources of qualitative data in a manageable manner. This process supports facilitation of the highest standard of qualitative research. Concurrent analysis removes the artificial delineation of relevant literature from other forms of constructed data. This gives researchers clear direction to treat qualitative data consistently raising the chances of generalisability of the findings. Raising the generalisability of qualitative research will increase its chances of informing clinical practice. © 2010 Blackwell Publishing Ltd.
Measuring attitudes towards the dying process: A systematic review of tools.
Groebe, Bernadette; Strupp, Julia; Eisenmann, Yvonne; Schmidt, Holger; Schlomann, Anna; Rietz, Christian; Voltz, Raymond
2018-04-01
At the end of life, anxious attitudes concerning the dying process are common in patients in Palliative Care. Measurement tools can identify vulnerabilities, resources and the need for subsequent treatment to relieve suffering and support well-being. To systematically review available tools measuring attitudes towards dying, their operationalization, the method of measurement and the methodological quality, including generalizability to different contexts. Systematic review according to the PRISMA Statement. Methodological quality of tools was assessed by standardized review criteria. MEDLINE, PsycINFO, PsyndexTests and the Health and Psychosocial Instruments were searched from their inception to April 2017. A total of 94 identified studies reported the development and/or validation of 44 tools. Of these, 37 were questionnaires and 7 were alternative measurement methods (e.g. projective measures). In 34 of 37 questionnaires, the emotional evaluation (e.g. anxiety) of dying is measured. Dying is operationalized in general items (n = 20), in several specific aspects of dying (n = 34) and as the dying of others (n = 14). Methodological quality of tools was reported inconsistently. Nine tools reported good internal consistency. Of the 37 tools, 4 were validated in a clinical sample (e.g. terminal cancer; Huntington disease), indicating questionable generalizability to clinical contexts for most tools. Many tools exist to measure attitudes towards the dying process using different endpoints. This overview can serve as a decision framework on which tool to apply in which context. For clinical application, only a few tools were available. Further validation of existing tools and potential alternative methods in various populations is needed.
Sampling of tar from sewage sludge gasification using solid phase adsorption.
Ortiz González, Isabel; Pérez Pastor, Rosa Ma; Sánchez Hervás, José Ma
2012-06-01
Sewage sludge is a residue from wastewater treatment plants which is considered to be harmful to the environment and all living organisms. Gasification technology is a potential source of renewable energy that converts the sewage sludge into gases that can be used to generate energy or as raw material in chemical synthesis processes. However, tar produced during gasification is one of the problems for the implementation of gasification technology. Tar can condense on pipes and filters and may cause blockage and corrosion in engines and turbines. Consequently, to minimize the tar content in syngas, the ability to quantify tar levels in process streams is essential. The aim of this work was to develop an accurate tar sampling and analysis methodology using solid phase adsorption (SPA) in order to apply it to tar sampling from sewage sludge gasification gases. Four types of commercial SPA cartridges were tested to determine the most suitable one for the sampling of individual tar compounds in such streams. Afterwards, the capacity, breakthrough volume and sample stability of the Supelclean™ ENVI-Carb/NH2 cartridge, identified as the most suitable, were determined. No significant influences from water, H2S or NH3 were detected. The cartridge was used for sampling real samples, and comparable results were obtained with the present and traditional methods.
NASA Astrophysics Data System (ADS)
Torremorell, Maria Carme Boqué; de Nicolás, Montserrat Alguacil; Valls, Mercè Pañellas
Teacher training at the Blanquerna Faculty of Psychology and Educational and Sports Sciences (FPCEE), in Barcelona, has a long pedagogical tradition based on teaching innovation. Its educational style is characterised by methods focused on the students' involvement and on close collaboration with teaching practice centres. Within a core subject in the Teacher Training diploma course, students were asked to assess different methodological proposals aimed at promoting the development of their personal, social, and professional competences. In the assessment surveys, from a sample of 145 students, the proportion of scores rated very satisfactory or satisfactory ranged from 83.4% to 95.8% across the entire set of methodological actions under analysis. Data obtained in this first research phase were very useful for designing basic training modules for the new Teacher Training Degree. In the second phase (in progress), active teachers are asked for their perception of the orientation of the practicum, its connection with the end-of-course assignment, and the impact of in-service students on innovation processes at school.
Peptide biomarkers as a way to determine meat authenticity.
Sentandreu, Miguel Angel; Sentandreu, Enrique
2011-11-01
Meat fraud encompasses many illegal procedures affecting the composition of meat and meat products, commonly carried out with the aim of increasing profit. These practices need to be controlled by legal authorities by means of robust, accurate and sensitive methodologies capable of ensuring that fraudulent or accidental mislabelling does not arise. Common strategies traditionally used to assess meat authenticity have been based on methods such as chemometric analysis of large data sets, immunoassays or DNA analysis. The identification of peptide biomarkers specific to a particular meat species, tissue or ingredient by proteomic technologies constitutes an interesting and promising alternative to existing methodologies due to its high discriminating power, robustness and sensitivity. The possibility of developing standardized protein extraction protocols, together with the considerably higher resistance of peptide sequences to food processing as compared to DNA sequences, would overcome some of the limitations currently existing for quantitative determinations in highly processed food samples. The use of routine mass spectrometry equipment would make the technology suitable for control laboratories. Copyright © 2011 Elsevier Ltd. All rights reserved.
Ajala, E O; Aberuagba, F; Olaniyan, A M; Onifade, K R
2016-01-01
Shea butter (SB) was extracted from its kernel using n-hexane as solvent in an optimization study. The aims were to determine the optimal operating variables that would give the optimum yield of SB and to study the effect of the solvent on the physico-chemical properties and chemical composition of SB extracted using n-hexane. A Box-Behnken response surface methodology (RSM) was used for the optimization study, while statistical analysis using ANOVA was used to test the significance of the variables for the process. The variables considered for this study were: sample weight (g), solvent volume (ml) and extraction time (min). The physico-chemical properties of the extracted SB were determined using standard methods, and Fourier Transform Infrared Spectroscopy (FTIR) was used for the chemical composition. The results of the RSM analysis showed that the three variables investigated have a significant effect (p < 0.05) on the % yield of SB, with R2 = 0.8989, which indicated a good fit of the second-order model. Based on this model, the optimal operating variables for the extraction process were established as: sample weight of 30.04 g, solvent volume of 346.04 ml and extraction time of 40 min, which gave a 66.90% yield of SB. Furthermore, the physico-chemical properties obtained for the shea butter extracted using the traditional method (SBT) showed that it is a more suitable raw material for food, biodiesel production, cosmetics, medicinal and pharmaceutical purposes than the shea butter extracted using the solvent extraction method (SBS). The FTIR results obtained for the two samples were similar to those of other vegetable oils.
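The Box-Behnken/RSM workflow summarized above (fit a second-order model to the three factors, then locate the optimum inside the experimental region) can be sketched as follows. The design points, response values and factor bounds in this example are hypothetical placeholders, not the study's data; only the structure of the calculation is illustrated.

```python
import numpy as np
from scipy.optimize import minimize

def design_matrix(X):
    """Second-order (quadratic + two-way interaction) model terms for 3 factors."""
    w, v, t = X[:, 0], X[:, 1], X[:, 2]
    return np.column_stack([np.ones(len(X)), w, v, t,
                            w*v, w*t, v*t, w**2, v**2, t**2])

# hypothetical Box-Behnken runs: (sample weight g, solvent volume mL, time min) -> % yield
X = np.array([[20, 250, 30], [40, 250, 30], [20, 450, 30], [40, 450, 30],
              [20, 350, 20], [40, 350, 20], [20, 350, 40], [40, 350, 40],
              [30, 250, 20], [30, 450, 20], [30, 250, 40], [30, 450, 40],
              [30, 350, 30], [30, 350, 30], [30, 350, 30]], dtype=float)
y = np.array([55, 58, 60, 61, 52, 54, 63, 64, 53, 59, 62, 66, 65, 66, 65], dtype=float)

beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)
predict = lambda x: design_matrix(np.atleast_2d(x)) @ beta

# maximise the fitted surface inside the (assumed) experimental region
res = minimize(lambda x: -predict(x)[0], x0=[30, 350, 30],
               bounds=[(20, 40), (250, 450), (20, 40)], method="L-BFGS-B")
print("predicted optimum (weight, volume, time):", res.x, "yield:", -res.fun)
```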
Methods of human body odor sampling: the effect of freezing.
Lenochova, Pavlina; Roberts, S Craig; Havlicek, Jan
2009-02-01
Body odor sampling is an essential tool in human chemical ecology research. However, methodologies of individual studies vary widely in terms of sampling material, length of sampling, and sample processing. Although these differences might have a critical impact on results obtained, almost no studies test validity of current methods. Here, we focused on the effect of freezing samples between collection and use in experiments involving body odor perception. In 2 experiments, we tested whether axillary odors were perceived differently by raters when presented fresh or having been frozen and whether several freeze-thaw cycles affected sample quality. In the first experiment, samples were frozen for 2 weeks, 1 month, or 4 months. We found no differences in ratings of pleasantness, attractiveness, or masculinity between fresh and frozen samples. Similarly, almost no differences between repeatedly thawed and fresh samples were found. We found some variations in intensity; however, this was unrelated to length of storage. The second experiment tested differences between fresh samples and those frozen for 6 months. Again no differences in subjective ratings were observed. These results suggest that freezing has no significant effect on perceived odor hedonicity and that samples can be reliably used after storage for relatively long periods.
NASA Astrophysics Data System (ADS)
Torregrosa, A. J.; Broatch, A.; Margot, X.; García-Tíscar, J.
2016-08-01
An experimental methodology is proposed to assess the noise emission of centrifugal turbocompressors like those of automotive turbochargers. A step-by-step procedure is detailed, starting from the theoretical considerations of sound measurement in flow ducts and examining specific experimental setup guidelines and signal processing routines. Special care is taken regarding some limiting factors that adversely affect the measuring of sound intensity in ducts, namely calibration, sensor placement and frequency ranges and restrictions. In order to provide illustrative examples of the proposed techniques and results, the methodology has been applied to the acoustic evaluation of a small automotive turbocharger in a flow bench. Samples of raw pressure spectra, decomposed pressure waves, calibration results, accurate surge characterization and final compressor noise maps and estimated spectrograms are provided. The analysis of selected frequency bands successfully shows how different, known noise phenomena of particular interest such as mid-frequency "whoosh noise" and low-frequency surge onset are correlated with operating conditions of the turbocharger. Comparison against external inlet orifice intensity measurements shows good correlation and improvement with respect to alternative wave decomposition techniques.
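One elementary step in such a signal-processing chain, estimating an in-duct pressure spectrum and integrating a band of interest (for instance a mid-frequency "whoosh" band) into a single level that can be mapped against the compressor operating point, might look like the sketch below. The sampling rate, band edges and synthetic test signal are assumptions; this is not the authors' routine.

```python
import numpy as np
from scipy.signal import welch

def band_level_db(pressure_pa, fs, f_lo, f_hi, p_ref=20e-6):
    """Sound pressure level (dB re 20 uPa) integrated over [f_lo, f_hi],
    from a Welch PSD estimate of an in-duct pressure signal."""
    f, psd = welch(pressure_pa, fs=fs, nperseg=4096)      # PSD in Pa^2/Hz
    band = (f >= f_lo) & (f <= f_hi)
    p2 = np.sum(psd[band]) * (f[1] - f[0])                # approximate mean-square pressure in band
    return 10.0 * np.log10(p2 / p_ref**2)

# hypothetical example: 100 kHz-sampled duct pressure with a 1.5 kHz tone plus broadband noise
fs = 100_000
t = np.arange(0, 1.0, 1.0 / fs)
pressure = 5.0 * np.sin(2 * np.pi * 1500 * t) + 0.5 * np.random.randn(t.size)
print(f"1-3 kHz band SPL: {band_level_db(pressure, fs, 1000, 3000):.1f} dB")
```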
A negotiation methodology and its application to cogeneration planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, S.M.; Liu, C.C.; Luu, S.
Power system planning has become a complex process in utilities today. This paper presents a methodology for integrated planning with multiple objectives. The methodology uses a graphical representation (Goal-Decision Network) to capture the planning knowledge. The planning process is viewed as a negotiation process that applies three negotiation operators to search for beneficial decisions in a GDN. Also, the negotiation framework is applied to the problem of planning for cogeneration interconnection. The simulation results are presented to illustrate the cogeneration planning process.
Accounting for Uncertainties in Strengths of SiC MEMS Parts
NASA Technical Reports Server (NTRS)
Nemeth, Noel; Evans, Laura; Beheim, Glen; Trapp, Mark; Jadaan, Osama; Sharpe, William N., Jr.
2007-01-01
A methodology has been devised for accounting for uncertainties in the strengths of silicon carbide structural components of microelectromechanical systems (MEMS). The methodology enables prediction of the probabilistic strengths of complexly shaped MEMS parts using data from tests of simple specimens. This methodology is intended to serve as a part of a rational basis for designing SiC MEMS, supplementing methodologies that have been borrowed from the art of designing macroscopic brittle material structures. The need for this or a similar methodology arises as a consequence of the fundamental nature of MEMS and the brittle silicon-based materials of which they are typically fabricated. When tested to fracture, MEMS and structural components thereof show wide part-to-part scatter in strength. The methodology involves the use of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) software in conjunction with the ANSYS Probabilistic Design System (PDS) software to simulate or predict the strength responses of brittle material components while simultaneously accounting for the effects of variability of geometrical features on the strength responses. As such, the methodology involves the use of an extended version of the ANSYS/CARES/PDS software system described in Probabilistic Prediction of Lifetimes of Ceramic Parts (LEW-17682-1/4-1), Software Tech Briefs supplement to NASA Tech Briefs, Vol. 30, No. 9 (September 2006), page 10. The ANSYS PDS software enables the ANSYS finite-element-analysis program to account for uncertainty in the design and analysis process. The ANSYS PDS software accounts for uncertainty in material properties, dimensions, and loading by assigning probabilistic distributions to user-specified model parameters and performing simulations using various sampling techniques.
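The general idea of combining part-to-part strength scatter with geometric variability can be illustrated with a toy Monte Carlo, shown below. This is emphatically not CARES/Life or ANSYS PDS: the Weibull parameters, dimensional scatter and simple tensile-stress model are invented here purely to illustrate the probabilistic-design concept.

```python
import numpy as np

rng = np.random.default_rng(1)

def failure_probability(load_n, n_trials=100_000):
    """Toy Monte Carlo: probability that applied stress exceeds Weibull-distributed strength
    when the cross-section width also varies (illustrative parameters, not SiC MEMS data)."""
    m, sigma0 = 8.0, 1.2e9                          # Weibull modulus, characteristic strength (Pa)
    width = rng.normal(10e-6, 0.3e-6, n_trials)     # beam width (m) with fabrication scatter
    thickness = 2e-6                                # nominal thickness (m)
    stress = load_n / (width * thickness)           # applied tensile stress per trial
    strength = sigma0 * rng.weibull(m, n_trials)    # part-to-part strength scatter
    return np.mean(stress > strength)

print(failure_probability(0.015))   # estimated probability of failure at a 15 mN tensile load
```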
29 CFR 1926.64 - Process safety management of highly hazardous chemicals.
Code of Federal Regulations, 2011 CFR
2011-07-01
... analysis methodology being used. (5) The employer shall establish a system to promptly address the team's... the decision as to the appropriate PHA methodology to use. All PHA methodologies are subject to... be developed in conjunction with the process hazard analysis in sufficient detail to support the...
29 CFR 1926.64 - Process safety management of highly hazardous chemicals.
Code of Federal Regulations, 2010 CFR
2010-07-01
... analysis methodology being used. (5) The employer shall establish a system to promptly address the team's... the decision as to the appropriate PHA methodology to use. All PHA methodologies are subject to... be developed in conjunction with the process hazard analysis in sufficient detail to support the...
Automated high-throughput protein purification using an ÄKTApurifier and a CETAC autosampler.
Yoo, Daniel; Provchy, Justin; Park, Cynthia; Schulz, Craig; Walker, Kenneth
2014-05-30
As the pace of drug discovery accelerates there is an increased focus on screening larger numbers of protein therapeutic candidates to identify those that are functionally superior and to assess manufacturability earlier in the process. Although there have been advances toward high throughput (HT) cloning and expression, protein purification is still an area where improvements can be made to conventional techniques. Current methodologies for purification often involve a tradeoff between HT automation or capacity and quality. We present an ÄKTA combined with an autosampler, the ÄKTA-AS, which has the capability of purifying up to 240 samples in two chromatographic dimensions without the need for user intervention. The ÄKTA-AS has been shown to be reliable with sample volumes between 0.5 mL and 100 mL, and the innovative use of a uniquely configured loading valve ensures reliability by efficiently removing air from the system as well as preventing sample cross contamination. Incorporation of a sample pump flush minimizes sample loss and enables recoveries ranging from the low tens of micrograms to milligram quantities of protein. In addition, when used in an affinity capture-buffer exchange format the final samples are formulated in a buffer compatible with most assays without requirement of additional downstream processing. The system is designed to capture samples in 96-well microplate format allowing for seamless integration of downstream HT analytic processes such as microfluidic or HPLC analysis. Most notably, there is minimal operator intervention to operate this system, thereby increasing efficiency, sample consistency and reducing the risk of human error. Copyright © 2014 Elsevier B.V. All rights reserved.
Methodological Considerations for Hair Cortisol Measurements in Children
Slominski, Radomir; Rovnaghi, Cynthia R.; Anand, Kanwaljeet J. S.
2015-01-01
Background Hair cortisol levels are used increasingly as a measure for chronic stress in young children. We propose modifications to the current methods used for hair cortisol analysis to more accurately determine reference ranges for hair cortisol across different populations and age groups. Methods The authors compared standard (finely cutting hair) vs. milled methods for hair processing (n=16), developed a 4-step extraction process for hair protein and cortisol (n=16), and compared liquid chromatography-mass spectrometry (LCMS) vs. ELISA assays for measuring hair cortisol (n=28). The extraction process included sequential incubations in methanol and acetone, repeated twice. Hair protein was measured via spectrophotometric ratios at 260/280 nm to indicate the hair dissolution state using a BioTek® plate reader and dedicated software. Hair cortisol was measured using an ELISA assay kit. Individual (n=13), pooled hair samples (n=12) with high, intermediate, and low cortisol values and the ELISA assay internal standards (n=3) were also evaluated by LCMS. Results Milled and standard methods showed highly correlated hair cortisol (rs=0.951, p<0.0001) and protein values (rs=0.902, p=0.0002), although higher yields of cortisol and protein were obtained from the standard method in 13/16 and 14/16 samples respectively (p<0.05). Four sequential extractions yielded additional amounts of protein (36.5%, 27.5%, 30.5%, 3.1%) and cortisol (45.4%, 31.1%, 15.1%, 0.04%) from hair samples. Cortisol values measured by LCMS and ELISA were correlated (rs=0.737; p<0.0001), although cortisol levels (median [IQR]) detected in the same samples by LCMS (38.7 [14.4, 136] ng/ml) were lower than by ELISA (172.2 [67.9, 1051] ng/ml). LCMS also detected cortisone, which comprised 13.4% (3.7%, 25.9%) of the steroids detected. Conclusion Methodological studies suggest that finely cutting hair with sequential incubations in methanol and acetone, repeated twice, extracts greater yields of cortisol than does milled hair. Based on these findings, at least three incubations may be required to extract most of the cortisol in human hair samples. In addition, ELISA-based assays showed greater sensitivity for measuring hair cortisol levels than LCMS-based assays. PMID:25811341
Credit risk migration rates modeling as open systems: A micro-simulation approach
NASA Astrophysics Data System (ADS)
Landini, S.; Uberti, M.; Casellina, S.
2018-05-01
The last financial crisis of 2008 stimulated the development of new Regulatory Criteria (commonly known as Basel III) that pushed banking activity to become more prudential, in both the short and the long run. As is well known, in 2014 the International Accounting Standards Board (IASB) promulgated the new International Financial Reporting Standard 9 (IFRS 9) for financial instruments that will become effective in January 2018. Since the delayed recognition of credit losses on loans was identified as a weakness in existing accounting standards, the IASB has introduced an Expected Loss model that requires more timely recognition of credit losses. Specifically, the new standards require entities to account both for expected losses from when impairments are recognized for the first time and over the full loan lifetime; moreover, a clear preference toward forward-looking models is expressed. In this new framework, a re-thinking of the widespread standard theoretical approach on which the well-known prudential model is founded is necessary. The aim of this paper is then to define an original methodological approach to migration rates modeling for credit risk which is innovative with respect to the standard method, from the point of view of a bank as well as from a regulatory perspective. Accordingly, the proposed non-standard approach considers a portfolio as an open sample, allowing for entries, migrations of stayers and exits as well. While being consistent with the empirical observations, this open-sample approach contrasts with the standard closed-sample method. In particular, this paper offers a methodology to integrate the outcomes of the standard closed-sample method within the open-sample perspective while removing some of the assumptions of the standard method. Three main conclusions can be drawn in terms of economic capital provision: (a) based on the Markovian hypothesis with an a-priori absorbing state at default, the standard closed-sample method is to be abandoned so as not to predict lenders' bankruptcy by construction; (b) to obtain more reliable estimates in line with the new regulatory standards, the sample used to estimate migration rate matrices for credit risk should include both entries and exits; (c) the static eigen-decomposition standard procedure to forecast migration rates should be replaced with a stochastic process dynamics methodology, conditioning forecasts on macroeconomic scenarios.
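The contrast between the closed-sample and open-sample estimates of a migration matrix can be made concrete with a small numerical sketch. The rating classes, transition counts, entries and exits below are hypothetical; the point is only that keeping exits (and, in a full treatment, entries) explicit changes the estimated rates relative to the standard row-normalization over stayers.

```python
import numpy as np

# Hypothetical one-period counts for three rating classes (A, B, C), plus defaults (D),
# exits (exposures leaving the portfolio) and new entries into each class.
ratings = ["A", "B", "C"]
counts = {           # counts[i][j] = obligors moving from i to j during the period
    "A": {"A": 900, "B": 60,  "C": 10,  "D": 2,  "EXIT": 28},
    "B": {"A": 40,  "B": 800, "C": 70,  "D": 10, "EXIT": 80},
    "C": {"A": 5,   "B": 50,  "C": 600, "D": 45, "EXIT": 100},
}
entries = {"A": 120, "B": 90, "C": 30}   # exogenous inflow, not used in either matrix below

def closed_sample_matrix(counts, ratings):
    """Standard approach: exits dropped, rows renormalised over the surviving states only."""
    states = ratings + ["D"]
    M = np.array([[counts[i][j] for j in states] for i in ratings], float)
    return M / M.sum(axis=1, keepdims=True)

def open_sample_matrix(counts, ratings):
    """Open-system approach: exits kept as an explicit destination, so each row reflects
    everything that can happen to an exposure, not only migrations among stayers."""
    states = ratings + ["D", "EXIT"]
    M = np.array([[counts[i][j] for j in states] for i in ratings], float)
    return M / M.sum(axis=1, keepdims=True)

print(np.round(closed_sample_matrix(counts, ratings), 3))
print(np.round(open_sample_matrix(counts, ratings), 3))
# 'entries' would enter the portfolio dynamics as an exogenous inflow vector per period.
```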
Yuksel, Ferhat; Karaman, Safa; Kayacier, Ahmed
2014-02-15
In the present study, wheat chips enriched with flaxseed flour were produced and response surface methodology was used to study the simultaneous effects of flaxseed level (10-20%), frying temperature (160-180 °C) and frying time (40-60 s) on some physicochemical, textural and sensorial properties and the fatty acid composition of the wheat chips. Ridge analysis was conducted to determine the optimum levels of the processing variables. Predictive regression equations with adequate coefficients of determination (R² ≥ 0.705) were constructed to explain the effect of the processing variables. Addition of flaxseed flour increased the dry matter and protein content of the samples, and increasing the frying temperature decreased the hardness values of the wheat chip samples. Increasing the flaxseed level increased the unsaturated fatty acid content, namely omega-3 fatty acids, of the wheat chip samples. Overall acceptability of the chips increased with increasing frying temperature. Ridge analysis showed that the maximum taste score would be obtained at a flaxseed level of 10%, a frying temperature of 180 °C and a frying time of 50 s. Copyright © 2013 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Di Anibal, Carolina V.; Marsal, Lluís F.; Callao, M. Pilar; Ruisánchez, Itziar
2012-02-01
Raman spectroscopy combined with multivariate analysis was evaluated as a tool for detecting Sudan I dye in culinary spices. Three Raman modalities were studied: normal Raman, FT-Raman and SERS. The results show that SERS is the most appropriate modality, capable of providing a proper Raman signal when a complex matrix is analyzed. To remove spectral noise and background, Savitzky-Golay smoothing with polynomial baseline correction and wavelet transform were applied. Finally, to check whether unadulterated samples can be differentiated from samples adulterated with Sudan I dye, an exploratory analysis, principal component analysis (PCA), was applied to the raw data and to data processed with the two aforementioned strategies. The results obtained by PCA show that the Raman spectra need to be properly treated if useful information is to be obtained, and that both spectral treatments are appropriate for processing the Raman signal. The proposed methodology shows that SERS combined with appropriate spectral treatment can be used as a practical screening tool to flag samples suspected of being adulterated with Sudan I dye.
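A minimal version of the preprocessing-plus-exploration chain described above (smoothing, a simple baseline correction and PCA on the treated spectra) is sketched below using standard scipy/scikit-learn calls. The random array stands in for real SERS spectra, and the window length, polynomial order and linear baseline are placeholder choices rather than the parameters used in the study.

```python
import numpy as np
from scipy.signal import savgol_filter
from sklearn.decomposition import PCA

def preprocess(spectra, window=11, polyorder=3):
    """Savitzky-Golay smoothing followed by a crude linear baseline subtraction per spectrum."""
    smoothed = savgol_filter(spectra, window_length=window, polyorder=polyorder, axis=1)
    x = np.arange(spectra.shape[1])
    baselines = np.array([np.polyval(np.polyfit(x, s, 1), x) for s in smoothed])
    return smoothed - baselines

# spectra: rows = samples (adulterated and unadulterated spices), columns = Raman shifts
spectra = np.random.rand(20, 500)          # placeholder for real SERS spectra
scores = PCA(n_components=2).fit_transform(preprocess(spectra))
print(scores.shape)                        # (20, 2): PC1-PC2 scores for an exploratory score plot
```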
Mhaskar, Rahul; Djulbegovic, Benjamin; Magazin, Anja; Soares, Heloisa P; Kumar, Ambuj
2012-06-01
To assess whether the reported methodological quality of randomized controlled trials (RCTs) reflects the actual methodological quality and to evaluate the association of effect size (ES) and sample size with methodological quality. Systematic review. This is a retrospective analysis of all consecutive phase III RCTs published by eight National Cancer Institute Cooperative Groups up to 2006. Data were extracted from protocols (actual quality) and publications (reported quality) for each study. Four hundred twenty-nine RCTs met the inclusion criteria. Overall reporting of methodological quality was poor and did not reflect the actual high methodological quality of RCTs. The results showed no association between sample size and actual methodological quality of a trial. Poor reporting of allocation concealment and blinding exaggerated the ES by 6% (ratio of hazard ratio [RHR]: 0.94; 95% confidence interval [CI]: 0.88, 0.99) and 24% (RHR: 1.24; 95% CI: 1.05, 1.43), respectively. However, actual quality assessment showed no association between ES and methodological quality. The largest study to date shows that poor quality of reporting does not reflect the actual high methodological quality. Assessment of the impact of quality on the ES based on reported quality can produce misleading results. Copyright © 2012 Elsevier Inc. All rights reserved.
Methodological quality of behavioural weight loss studies: a systematic review
Lemon, S. C.; Wang, M. L.; Haughton, C. F.; Estabrook, D. P.; Frisard, C. F.; Pagoto, S. L.
2018-01-01
Summary This systematic review assessed the methodological quality of behavioural weight loss intervention studies conducted among adults and associations between quality and statistically significant weight loss outcome, strength of intervention effectiveness and sample size. Searches for trials published between January 2009 and December 2014 were conducted using PUBMED, MEDLINE and PSYCINFO and identified ninety studies. Methodological quality indicators included study design, anthropometric measurement approach, sample size calculations, intent-to-treat (ITT) analysis, loss to follow-up rate, missing data strategy, sampling strategy, report of treatment receipt and report of intervention fidelity (mean = 6.3). Indicators most commonly utilized included randomized design (100%), objectively measured anthropometrics (96.7%), ITT analysis (86.7%) and reporting treatment adherence (76.7%). Most studies (62.2%) had a follow-up rate >75% and reported a loss to follow-up analytic strategy or minimal missing data (69.9%). Describing intervention fidelity (34.4%) and sampling from a known population (41.1%) were least common. Methodological quality was not associated with reporting a statistically significant result, effect size or sample size. This review found the published literature on behavioural weight loss trials to be of high quality for specific indicators, including study design and measurement. Areas identified for improvement include utilization of more rigorous statistical approaches to loss to follow-up and better fidelity reporting. PMID:27071775
Auditing as part of the terminology design life cycle.
Min, Hua; Perl, Yehoshua; Chen, Yan; Halper, Michael; Geller, James; Wang, Yue
2006-01-01
To develop and test an auditing methodology for detecting errors in medical terminologies satisfying systematic inheritance. This methodology is based on various abstraction taxonomies that provide high-level views of a terminology and highlight potentially erroneous concepts. Our auditing methodology is based on dividing concepts of a terminology into smaller, more manageable units. First, we divide the terminology's concepts into areas according to their relationships/roles. Then each multi-rooted area is further divided into partial-areas (p-areas) that are singly-rooted. Each p-area contains a set of structurally and semantically uniform concepts. Two kinds of abstraction networks, called the area taxonomy and p-area taxonomy, are derived. These taxonomies form the basis for the auditing approach. Taxonomies tend to highlight potentially erroneous concepts in areas and p-areas. Human reviewers can focus their auditing efforts on the limited number of problematic concepts following two hypotheses on the probable concentration of errors. A sample of the area taxonomy and p-area taxonomy for the Biological Process (BP) hierarchy of the National Cancer Institute Thesaurus (NCIT) was derived from the application of our methodology to its concepts. These views led to the detection of a number of different kinds of errors that are reported, and to confirmation of the hypotheses on error concentration in this hierarchy. Our auditing methodology based on area and p-area taxonomies is an efficient tool for detecting errors in terminologies satisfying systematic inheritance of roles, and thus facilitates their maintenance. This methodology concentrates a domain expert's manual review on portions of the concepts with a high likelihood of errors.
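The area/partial-area partitioning the authors describe can be sketched in a few lines. The toy concepts, roles and parent links below are invented purely for illustration; this is not the authors' implementation, only a minimal sketch of the logic (areas by identical role sets, partial-areas anchored at roots whose parents lie outside the area):

```python
# Partition concepts into areas (same set of roles) and find p-area roots.
from collections import defaultdict

concepts = {
    # name: {"roles": frozenset(...), "parents": set(...)}  -- toy data
    "A": {"roles": frozenset({"r1"}), "parents": set()},
    "B": {"roles": frozenset({"r1"}), "parents": {"A"}},
    "C": {"roles": frozenset({"r1", "r2"}), "parents": {"B"}},
    "D": {"roles": frozenset({"r1", "r2"}), "parents": {"C"}},
}

# 1) Areas: concepts sharing exactly the same set of roles
areas = defaultdict(list)
for name, c in concepts.items():
    areas[c["roles"]].append(name)

# 2) Partial-areas: within an area, a root is a concept none of whose parents
#    belongs to the same area; each root anchors one singly-rooted p-area
for roles, members in areas.items():
    member_set = set(members)
    roots = [m for m in members if not (concepts[m]["parents"] & member_set)]
    print(sorted(roles), "area:", members, "roots:", roots)
```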
Olkowska, Ewa; Polkowska, Żaneta; Namieśnik, Jacek
2013-11-15
A new analytical procedure for the simultaneous determination of individual cationic surfactants (alkyl benzyl dimethyl ammonium chlorides) in surface water samples has been developed. We describe this methodology for the first time: it involves the application of solid phase extraction (SPE, for sample preparation) coupled with ion chromatography with conductivity detection (IC-CD, for the final determination). Mean recoveries of analytes between 79% and 93%, and overall method quantification limits in the range from 0.0018 to 0.038 μg/mL for surface water and CRM samples, were achieved. The methodology was applied to the determination of individual alkyl benzyl quaternary ammonium compounds in environmental samples (reservoir water) and enables their presence in such waters to be confirmed. In addition, it is simpler, less time-consuming, less labour-intensive and significantly less expensive than previously described approaches (liquid-liquid extraction coupled with liquid chromatography-mass spectrometry), and it avoids the use of toxic chloroform. Copyright © 2013 Elsevier B.V. All rights reserved.
ESS Cryogenic System Process Design
NASA Astrophysics Data System (ADS)
Arnold, P.; Hees, W.; Jurns, J.; Su, X. T.; Wang, X. L.; Weisend, J. G., II
2015-12-01
The European Spallation Source (ESS) is a neutron-scattering facility funded and supported in collaboration with 17 European countries in Lund, Sweden. Cryogenic cooling at ESS is vital particularly for the linear accelerator, the hydrogen target moderators, a test stand for cryomodules, the neutron instruments and their sample environments. The paper will focus on specific process design criteria, design decisions and their motivations for the helium cryoplants and auxiliary equipment. Key issues for all plants and their process concepts are energy efficiency, reliability, smooth turn-down behaviour and flexibility. The accelerator cryoplant (ACCP) and the target moderator cryoplant (TMCP) in particular need to be prepared for a range of refrigeration capacities due to the intrinsic uncertainties regarding heat load definitions. Furthermore the paper addresses questions regarding process arrangement, 2 K cooling methodology, LN2 precooling, helium storage, helium purification and heat recovery.
Persisting mathematics and science high school teachers: A Q-methodology study
NASA Astrophysics Data System (ADS)
Robbins-Lavicka, Michelle M.
There is a lack of qualified mathematics and science teachers at all levels of education in Arkansas. Lasting teaching initiative programs are needed to address retention so qualified teachers remain in the classroom. The dearth of studies regarding why mathematics and science teachers persist in the classroom beyond the traditional 5-year attrition period led this Q-methodological study to evaluate the subjective perceptions of persistent mathematics and science teachers to determine what makes them stay. This study sought to understand what factors persisting mathematics and science teachers used to explain their persistence in the classroom beyond 5 years and what educational factors contributed to that persistence. Q-methodology combines qualitative and quantitative techniques and provided a systematic means to investigate personal beliefs by collecting a concourse, developing a Q-sample and a person-sample, conducting a Q-sorting process, and analyzing the data. The results indicated that to encourage longevity within mathematics and science classrooms (a) teachers should remain cognizant of their ability to influence student attitudes toward teaching; (b) administrators should provide support for teachers and emphasize the role and importance of professional development; and (c) policy makers should focus their efforts and resources on developing recruitment plans, including mentorship programs, while providing and improving financial compensation. Significantly, the findings indicate that providing mentorship and role models at every level of mathematics and science education will likely encourage qualified teachers to remain in the mathematics and science classrooms, thus increasing the chance of positive social change.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bekar, Kursat B; Miller, Thomas Martin; Patton, Bruce W
The characteristic X-rays produced by the interactions of the electron beam with the sample in a scanning electron microscope (SEM) are usually captured with a variable-energy detector, a process termed energy dispersive spectrometry (EDS). The purpose of this work is to exploit inverse simulations of SEM-EDS spectra to enable rapid determination of sample properties, particularly elemental composition. This is accomplished using penORNL, a modified version of PENELOPE, and a modified version of the traditional Levenberg-Marquardt nonlinear optimization algorithm, which together are referred to as MOZAIK-SEM. The overall conclusion of this work is that MOZAIK-SEM is a promising method for performing inverse analysis of X-ray spectra generated within a SEM. As this methodology exists now, MOZAIK-SEM has been shown to calculate the elemental composition of an unknown sample within a few percent of the actual composition.
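The inverse-analysis idea can be hedged into a small sketch: a toy linear forward model stands in for penORNL/PENELOPE, and SciPy's Levenberg-Marquardt solver stands in for the modified optimizer. All spectra, response matrices and compositions below are invented.

```python
# Adjust assumed elemental weight fractions until a forward-simulated spectrum
# matches a "measured" one (toy stand-in for the SEM-EDS inverse problem).
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)
n_channels, n_elements = 200, 3
response = rng.random((n_channels, n_elements))       # toy per-element spectra
true_w = np.array([0.6, 0.3, 0.1])                    # "unknown" composition
measured = response @ true_w + rng.normal(0, 0.01, n_channels)

def residuals(w):
    return response @ w - measured                    # forward simulation - data

fit = least_squares(residuals, x0=np.full(n_elements, 1 / n_elements),
                    method="lm")                      # Levenberg-Marquardt
estimate = fit.x / fit.x.sum()                        # renormalise to fractions
print(np.round(estimate, 3))                          # close to [0.6, 0.3, 0.1]
```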
Validation of Rapid Radiochemical Method for Californium ...
Technical Brief In the event of a radiological/nuclear contamination event, the response community would need tools and methodologies to rapidly assess the nature and the extent of contamination. To characterize a radiologically contaminated outdoor area and to inform risk assessment, large numbers of environmental samples would be collected and analyzed over a short period of time. To address the challenge of quickly providing analytical results to the field, the U.S. EPA developed a robust analytical method. This method allows response officials to characterize contaminated areas and to assess the effectiveness of remediation efforts, both rapidly and accurately, in the intermediate and late phases of environmental cleanup. Improvement in sample processing and analysis leads to increased laboratory capacity to handle the analysis of a large number of samples following the intentional or unintentional release of a radiological/nuclear contaminant.
Wiśniewska, Paulina; Boqué, Ricard; Borràs, Eva; Busto, Olga; Wardencki, Waldemar; Namieśnik, Jacek; Dymerski, Tomasz
2017-02-15
Headspace mass spectrometry (HS-MS), mid-infrared (MIR) and UV-vis spectroscopy were used to authenticate whisky samples of different origins and production methods (Irish, Spanish, Bourbon, Tennessee Whisky and Scotch). The collected spectra were processed with partial least-squares discriminant analysis (PLS-DA) to build the classification models. In all cases the five groups of whiskies were distinguished, but the best results were obtained with HS-MS, which indicates that the largest differences between whisky types lie in their aroma. Differences were also found within groups, showing that not only the raw material but also the production method is important for discriminating samples. The methodology is quick, easy and does not require sample preparation. Copyright © 2016 Elsevier B.V. All rights reserved.
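A schematic PLS-DA workflow of the kind described (assumed, not the authors' code) regresses a one-hot class matrix on the spectra and assigns each sample to the class with the largest predicted response; the random matrices below merely stand in for the HS-MS, MIR or UV-vis data, so the printed accuracy will be near chance:

```python
# PLS-DA via PLS regression on a one-hot class matrix.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 300))            # spectra: samples x variables (simulated)
y = rng.integers(0, 5, size=100)           # 5 whisky groups
Y = np.eye(5)[y]                           # one-hot class membership

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, random_state=0)

plsda = PLSRegression(n_components=10).fit(X_tr, Y_tr)
pred = plsda.predict(X_te).argmax(axis=1)  # class = largest predicted column
print("accuracy:", (pred == Y_te.argmax(axis=1)).mean())
```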
NASA Astrophysics Data System (ADS)
Reddy, Vijeth V.; Vedantha Krishna, Amogh; Schultheiss, Fredrik; Rosén, B.-G.
2017-06-01
Manufactured surfaces usually consist of topographical features that include both those imparted by the manufacturing process and micro-features caused by disturbances during this process. Surface characterization involves the study of these features, which influence the functionality of the surface. This article focuses on characterizing the surface topography of machined leaded brass and lead-free brass. The adverse effects of lead on human health and the environment have led the manufacturing sector to focus on the sustainable manufacture of lead-free brass, and on maintaining control of surface integrity when the lead content of the brass is substituted with silicon. The investigation includes defined areal surface parameters measured on turned samples of leaded and lead-free brass using an optical coherence scanning interferometer (CSI). The topographical characteristics of the brass samples are the intermediate link between the manufacturing process variables and the functional behaviour of the surface. To evaluate the samples’ surface topography numerically and to validate the measurements for a significant study, a general statistical methodology is implemented. The results indicate higher surface roughness in turned samples of leaded brass compared to lead-free brass.
Hensman, James; Lawrence, Neil D; Rattray, Magnus
2013-08-20
Time course data from microarrays and high-throughput sequencing experiments require simple, computationally efficient and powerful statistical models to extract meaningful biological signal, and for tasks such as data fusion and clustering. Existing methodologies fail to capture either the temporal or replicated nature of the experiments, and often impose constraints on the data collection process, such as regularly spaced samples, or similar sampling schema across replications. We propose hierarchical Gaussian processes as a general model of gene expression time-series, with application to a variety of problems. In particular, we illustrate the method's capacity for missing data imputation, data fusion and clustering. The method can impute data which is missing both systematically and at random: in a hold-out test on real data, performance is significantly better than commonly used imputation methods. The method's ability to model inter- and intra-cluster variance leads to more biologically meaningful clusters. The approach removes the necessity for evenly spaced samples, an advantage illustrated on a developmental Drosophila dataset with irregular replications. The hierarchical Gaussian process model provides an excellent statistical basis for several gene-expression time-series tasks. It has only a few additional parameters over a regular GP, has negligible additional complexity, is easily implemented and can be integrated into several existing algorithms. Our experiments were implemented in Python, and are available from the authors' website: http://staffwww.dcs.shef.ac.uk/people/J.Hensman/.
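The covariance structure behind a hierarchical GP of this kind can be sketched directly in NumPy (a minimal illustration with invented kernels, sampling times and noise level, not the authors' implementation): all observations share a gene-level kernel, observations from the same replicate add a replicate-level kernel, and the posterior mean of the shared function follows from standard GP algebra.

```python
# Hierarchical GP sketch: y_r(t) = g(t) + f_r(t) + noise,
# so cov(y_ri, y_r'j) = k_g(t_i, t_j) + [r == r'] * k_f(t_i, t_j) (+ noise).
import numpy as np

def rbf(x1, x2, variance, lengthscale):
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(3)
t = np.concatenate([np.sort(rng.uniform(0, 10, 8)) for _ in range(3)])  # irregular times
rep = np.repeat([0, 1, 2], 8)                                           # replicate labels
y = np.sin(t) + 0.3 * rng.normal(size=t.size)                           # toy expression

K = (rbf(t, t, 1.0, 2.0)                                      # shared gene-level kernel k_g
     + (rep[:, None] == rep[None, :]) * rbf(t, t, 0.3, 1.0)   # replicate kernel k_f
     + 0.1 * np.eye(t.size))                                  # observation noise

t_star = np.linspace(0, 10, 50)
g_mean = rbf(t_star, t, 1.0, 2.0) @ np.linalg.solve(K, y)     # posterior mean of g
print(g_mean[:5])
```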
Bovea, María D; Ibáñez-Forés, Valeria; Pérez-Belis, Victoria; Quemades-Beltrán, Pilar
2016-07-01
This study proposes a general methodology for assessing and estimating the potential reuse of small waste electrical and electronic equipment (sWEEE), focusing on devices classified as domestic appliances. Specific tests for visual inspection, function and safety have been defined for ten different types of household appliances (vacuum cleaner, iron, microwave, toaster, sandwich maker, hand blender, juicer, boiler, heater and hair dryer). After applying the tests, reuse protocols have been defined in the form of easy-to-apply checklists for each of the ten types of appliance evaluated. This methodology could be useful for reuse enterprises, since there is a lack of specific protocols, adapted to each type of appliance, for testing reuse potential. After applying the methodology, electrical and electronic appliances (used or waste) can be segregated into three categories: the appliance works properly and can be classified for direct reuse (items can be used by a second consumer without prior repair operations); the appliance requires a later evaluation of its potential for refurbishment and repair (restoration of products to working order, although with possible loss of quality); or the appliance needs to be discarded from the reuse process and goes directly to recycling. Results after applying the methodology to a sample of 87.7 kg (96 units) show that 30.2% of the appliances have no potential for reuse and should be diverted for recycling, while 67.7% require a subsequent evaluation of their potential for refurbishment and repair, and only 2.1% could be directly reused with minor cleaning operations. This study represents a first approach to the "preparation for reuse" strategy that the European Directive on Waste Electrical and Electronic Equipment encourages. However, more research needs to be done as an extension of this study, mainly related to identifying the feasibility of repair or refurbishment operations. Copyright © 2016 Elsevier Ltd. All rights reserved.
Direct PCR amplification of forensic touch and other challenging DNA samples: A review.
Cavanaugh, Sarah E; Bathrick, Abigail S
2018-01-01
DNA evidence sample processing typically involves DNA extraction, quantification, and STR amplification; however, DNA loss can occur at both the DNA extraction and quantification steps, which is not ideal for forensic evidence containing low levels of DNA. Direct PCR amplification of forensic unknown samples has been suggested as a means to circumvent extraction and quantification, thereby retaining the DNA typically lost during those procedures. Direct PCR amplification is a method in which a sample is added directly to an amplification reaction without being subjected to prior DNA extraction, purification, or quantification. It allows for maximum quantities of DNA to be targeted, minimizes opportunities for error and contamination, and reduces the time and monetary resources required to process samples, although data analysis may take longer as the increased DNA detection sensitivity of direct PCR may lead to more instances of complex mixtures. ISO 17025 accredited laboratories have successfully implemented direct PCR for limited purposes (e.g., high-throughput databanking analysis), and recent studies indicate that direct PCR can be an effective method for processing low-yield evidence samples. Despite its benefits, direct PCR has yet to be widely implemented across laboratories for the processing of evidentiary items. While forensic DNA laboratories are always interested in new methods that will maximize the quantity and quality of genetic information obtained from evidentiary items, there is often a lag between the advent of useful methodologies and their integration into laboratories. Delayed implementation of direct PCR of evidentiary items can be attributed to a variety of factors, including regulatory guidelines that prevent laboratories from omitting the quantification step when processing forensic unknown samples, as is the case in the United States, and, more broadly, a reluctance to validate a technique that is not widely used for evidence samples. The advantages of direct PCR of forensic evidentiary samples justify a re-examination of the factors that have delayed widespread implementation of this method and of the evidence supporting its use. In this review, the current and potential future uses of direct PCR in forensic DNA laboratories are summarized. Copyright © 2017 Elsevier B.V. All rights reserved.
In situ 2D diffraction as a tool to characterize ferroelectric and piezoelectric thin films
NASA Astrophysics Data System (ADS)
Khamidy, N. I.; Kovacova, V.; Bernasconi, A.; Le Rhun, G.; Vaxelaire, N.
2017-08-01
In this paper the application of two-dimensional x-ray diffraction (XRD2) as a technique to characterize the properties of ferroelectric and piezoelectric thin films in situ during electrical cycling is discussed. XRD2 is a type of XRD in which a 2D detector is used instead of a point detector. This technique enables much more sample information to be recorded simultaneously, and in a much shorter time, than conventional XRD. The discussion focuses especially on techniques for processing the large volume of data acquired. The methodology to calculate an effective piezoelectric coefficient, analyze phase and texture, and estimate domain size and shape is described. This methodology is then applied to a lead zirconate titanate (PZT) thin film at the morphotropic phase boundary (MPB) composition (i.e. Pb[Zr0.52Ti0.48]O3) with a preferred (1 0 0) orientation. The in situ XRD2 characterization was conducted at the European Synchrotron Radiation Facility (ESRF) in Grenoble, France. Since a high-energy beam with a vertical resolution as small as 100 nm was used, a cross-sectional scan of the sample was performed over the entire thickness of the film. From these experimental results, a better understanding of the piezoelectric phenomena in PZT thin films at the MPB composition was achieved, providing original feedback between the elaboration processes and the functional properties of the film.
Quality Assessment of TPB-Based Questionnaires: A Systematic Review
Oluka, Obiageli Crystal; Nie, Shaofa; Sun, Yi
2014-01-01
Objective This review is aimed at assessing the quality of questionnaires and their development process based on the theory of planned behavior (TPB) change model. Methods A systematic literature search for studies with the primary aim of TPB-based questionnaire development was conducted in relevant databases between 2002 and 2012 using selected search terms. Ten of 1,034 screened abstracts met the inclusion criteria and were assessed for methodological quality using two different appraisal tools: one for the overall methodological quality of each study and the other developed for the appraisal of the questionnaire content and development process. Both appraisal tools consisted of items regarding the likelihood of bias in each study and were eventually combined to give the overall quality score for each included study. Results 8 of the 10 included studies showed low risk of bias in the overall quality assessment of each study, while 9 of the studies were of high quality based on the quality appraisal of questionnaire content and development process. Conclusion Quality appraisal of the questionnaires in the 10 reviewed studies was successfully conducted, highlighting the top problem areas (including: sample size estimation; inclusion of direct and indirect measures; and inclusion of questions on demographics) in the development of TPB-based questionnaires and the need for researchers to provide a more detailed account of their development process. PMID:24722323
Eggenreich, Britta; Rajamanickam, Vignesh; Wurm, David Johannes; Fricke, Jens; Herwig, Christoph; Spadiut, Oliver
2017-08-01
Cell disruption is a key unit operation for making valuable, intracellular target products accessible for further downstream unit operations. Independent of the applied cell disruption method, each cell disruption process must be evaluated with respect to disruption efficiency and potential product loss. Current state-of-the-art methods, such as measuring the total amount of released protein and plating-out assays, are usually time-delayed and involve manual intervention, making them error-prone. An automated method to monitor cell disruption efficiency at-line has not been available to date. In the current study we implemented a methodology, originally developed to monitor E. coli cell integrity during bioreactor cultivations, to automatically monitor and evaluate cell disruption of a recombinant E. coli strain by high-pressure homogenization. We compared our tool with a library of state-of-the-art methods, analyzed the effect of freezing the biomass before high-pressure homogenization and finally investigated this unit operation in more detail with a multivariate approach. The combination of HPLC and automated data analysis provides a valuable, novel tool for monitoring and evaluating cell disruption processes: it can be used in both upstream (USP) and downstream processing (DSP), can be implemented at-line, gives results within minutes of sampling and requires no manual intervention.
Clausen, J L; Georgian, T; Gardner, K H; Douglas, T A
2018-01-01
This study compares conventional grab sampling to incremental sampling methodology (ISM) for characterizing metal contamination at a military small-arms range. Grab sample results had large variances, positively skewed non-normal distributions, extreme outliers, and poor agreement between duplicate samples even when samples were co-located within tens of centimeters of each other. The extreme outliers strongly influenced the grab sample means for the primary contaminants lead (Pb) and antimony (Sb). In contrast, median and mean metal concentrations were similar for the ISM samples. ISM significantly reduced the measurement uncertainty of estimates of the mean, increasing data quality (e.g., for environmental risk assessments) with fewer samples (e.g., decreasing total project costs). Based on Monte Carlo resampling simulations, grab sampling resulted in highly variable means and upper confidence limits of the mean relative to ISM.
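A hedged Monte Carlo sketch of the grab-versus-ISM contrast (all distribution parameters and sample counts are invented) illustrates why compositing many increments stabilizes the estimated mean:

```python
# Compare the spread of grab-sample means with ISM means on a skewed "site".
import numpy as np

rng = np.random.default_rng(4)
site = rng.lognormal(mean=4.0, sigma=1.5, size=100_000)   # skewed Pb-like field

def simulate_means(n_results, samples_per_result):
    draws = rng.choice(site, size=(n_results, samples_per_result))
    return draws.mean(axis=1)

grab_means = simulate_means(1000, 10)    # 10 discrete grab samples per event
ism_means = simulate_means(1000, 100)    # 100 increments composited per ISM sample

print("grab mean CV:", grab_means.std() / grab_means.mean())
print("ISM  mean CV:", ism_means.std() / ism_means.mean())   # much smaller spread
```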
Tierney, Edel; McEvoy, Rachel; O'Reilly-de Brún, Mary; de Brún, Tomas; Okonkwo, Ekaterina; Rooney, Michelle; Dowrick, Chris; Rogers, Anne; MacFarlane, Anne
2016-06-01
There have been recent important advances in conceptualizing and operationalizing involvement in health research and health-care service development. However, problems persist in the field that impact on the scope for meaningful involvement to become a routine - normalized - way of working in primary care. In this review, we focus on current practice to critically interrogate factors known to be relevant for normalization - definition, enrolment, enactment and appraisal. Ours was a multidisciplinary, interagency team, with community representation. We searched EBSCO host for papers from 2007 to 2011 and engaged in an iterative, reflexive approach to sampling, appraising and analysing the literature following the principles of a critical interpretive synthesis approach and using Normalization Process Theory. Twenty-six papers were chosen from 289 papers, as a purposeful sample of work that is reported as service user involvement in the field. Few papers provided a clear working definition of service user involvement. The dominant identified rationale for enrolling service users in primary care projects was linked with policy imperatives for co-governance and emancipatory ideals. The majority of methodologies employed were standard health services research methods that do not qualify as research with service users. This indicates a lack of congruence between the stated aims and methods. Most studies only reported positive outcomes, raising questions about the balance or completeness of the published appraisals. To improve normalization of meaningful involvement in primary care, it is necessary to encourage explicit reporting of definitions, methodological innovation to enhance co-governance and dissemination of research processes and findings. © 2014 The Authors Health Expectations Published by John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Mueller, A. V.; Hemond, H.
2009-12-01
The capability for comprehensive, real-time, in-situ characterization of the chemical constituents of natural waters is a powerful tool for the advancement of the ecological and geochemical sciences, e.g. by facilitating rapid high-resolution adaptive sampling campaigns and avoiding the potential errors and high costs related to traditional grab sample collection, transportation and analysis. Portable field-ready instrumentation also promotes the goals of large-scale monitoring networks, such as CUASHI and WATERS, without the financial and human resources overhead required for traditional sampling at this scale. Problems of environmental remediation and monitoring of industrial waste waters would additionally benefit from such instrumental capacity. In-situ measurement of all major ions contributing to the charge makeup of natural fresh water is thus pursued via a combined multi-sensor/multivariate signal processing architecture. The instrument is based primarily on commercial electrochemical sensors, e.g. ion selective electrodes (ISEs) and ion selective field-effect transistors (ISFETs), to promote low cost as well as easy maintenance and reproduction. The system employs a novel architecture of multivariate signal processing to extract accurate information from in-situ data streams via an "unmixing" process that accounts for sensor non-linearities at low concentrations, as well as sensor cross-reactivities. Conductivity, charge neutrality and temperature are applied as additional mathematical constraints on the chemical state of the system. Including such non-ionic information assists in obtaining accurate and useful calibrations even in the non-linear portion of the sensor response curves, and measurements can be made without the traditionally-required standard additions or ionic strength adjustment. Initial work demonstrates the effectiveness of this methodology at predicting inorganic cations (Na+, NH4+, H+, Ca2+, and K+) in a simplified system containing only a single anion (Cl-) in addition to hydroxide, thus allowing charge neutrality to be easily and explicitly invoked. Calibration of every probe relative to each of the five cations present is undertaken, and the resulting curves are used to create a representative environmental data set based on USGS data for New England waters. Signal processing methodologies, specifically artificial neural networks (ANNs), are extended to use a feedback architecture based on conductivity measurements and charge neutrality calculations. The algorithms are then tuned to optimize performance at predicting actual concentrations from these simulated signals. Results are compared to the use of the component probes as stand-alone sensors. Future extension of this instrument for multiple anions (including carbonate and bicarbonate, nitrate, and sulfate) will ultimately provide rapid, accurate field measurements of the entire charge balance of natural waters at high resolution, improving sampling abilities while reducing costs and errors related to transport and analysis of grab samples.
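A minimal sketch of the multivariate idea, assuming simulated sensor signals and a generic scikit-learn network rather than the instrument's actual feedback architecture, maps voltages plus conductivity to cation concentrations and then checks the charge balance against the known anion:

```python
# Toy "unmixing": learn concentrations from cross-sensitive sensor signals,
# then use charge neutrality against Cl- as a consistency check.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
n = 500
conc = rng.uniform(10.0, 1000.0, size=(n, 3))            # Na+, K+, Ca2+ (umol/L)
charge = conc @ np.array([1.0, 1.0, 2.0])                # total cationic charge (ueq/L)
cl = charge                                              # single anion, Cl-, balances it

# Toy Nernstian-ish responses with mild cross-sensitivity and noise
volts = (0.059 * np.log10(conc) + 0.005 * np.log10(conc[:, ::-1])
         + 0.002 * rng.normal(size=conc.shape))
conductivity = 0.01 * charge + rng.normal(0, 0.1, n)     # arbitrary units
X = np.column_stack([volts, conductivity])

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(32, 32),
                                   max_iter=5000, random_state=0))
model.fit(X, conc)
pred = model.predict(X)

# Charge-neutrality residual against the known anion
residual = pred @ np.array([1.0, 1.0, 2.0]) - cl
print("median relative charge imbalance:", np.median(np.abs(residual) / cl))
```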
1987-08-01
out. To use each animal as its own control, arterial blood was sampled by means of chronically implanted aortic cannulas [12,13,14]. This simple... APPENDIX B: STATISTICAL METHODOLOGY. The balanced design of this experiment (requiring that 25 animals from each... protocol in that, in numerous cases, samples were collected at odd intervals (invalidating the orthogonality of the design) and the number of samples taken
Methodology of Diagnostics of Interethnic Relations and Ethnosocial Processes
ERIC Educational Resources Information Center
Maximova, Svetlana G.; Noyanzina, Oksana Ye.; Omelchenko, Daria A.; Maximov, Maxim B.; Avdeeva, Galina C.
2016-01-01
The purpose of this study was to research the methodological approaches to the study of interethnic relations and ethnosocial processes. The analysis of the literature was conducted in three main areas: 1) the theoretical and methodological issues of organizing research on inter-ethnic relations, allowing us to highlight the current…
Applying Statistical Process Quality Control Methodology to Educational Settings.
ERIC Educational Resources Information Center
Blumberg, Carol Joyce
A subset of Statistical Process Control (SPC) methodology known as Control Charting is introduced. SPC methodology is a collection of graphical and inferential statistics techniques used to study the progress of phenomena over time. The types of control charts covered are the X-bar (mean), R (range), X (individual observations), MR (moving…
A Matrix Approach to Software Process Definition
NASA Technical Reports Server (NTRS)
Schultz, David; Bachman, Judith; Landis, Linda; Stark, Mike; Godfrey, Sally; Morisio, Maurizio; Powers, Edward I. (Technical Monitor)
2000-01-01
The Software Engineering Laboratory (SEL) is currently engaged in a Methodology and Metrics program for the Information Systems Center (ISC) at Goddard Space Flight Center (GSFC). This paper addresses the Methodology portion of the program. The purpose of the Methodology effort is to assist a software team lead in selecting and tailoring a software development or maintenance process for a specific GSFC project. It is intended that this process will also be compliant with both ISO 9001 and the Software Engineering Institute's Capability Maturity Model (CMM). Under the Methodology program, we have defined four standard ISO-compliant software processes for the ISC, and three tailoring criteria that team leads can use to categorize their projects. The team lead would select a process and appropriate tailoring factors, from which a software process tailored to the specific project could be generated. Our objective in the Methodology program is to present software process information in a structured fashion, to make it easy for a team lead to characterize the type of software engineering to be performed, and to apply tailoring parameters to search for an appropriate software process description. This will enable the team lead to follow a proven, effective software process and also satisfy NASA's requirement for compliance with ISO 9001 and the anticipated requirement for CMM assessment. This work is also intended to support the deployment of sound software processes across the ISC.
[An approach to a methodology of scientific research for assistant-students].
Novak, Ivón T C; Bejarano, Paola Antón; Rodríguez, Fernando Marcos
2007-01-01
This work is presented from a "problematic" perspective in an attempt to establish a dialogic relationship between the educator and the student, mediated by the object of knowledge. It is oriented to the integral education of student assistants through a closer approach to scientific research. The work was carried out by a teacher and two student assistants. The project was developed in line with the profile required for the medical degree at the Faculty of Medicine of the National University of Cordoba (UNC), which, among other aspects, stresses the importance of "adopting a positive attitude towards research based on knowledge and the application of the scientific methodology" and of "the development of responsible self-learning and continuous improvement" (sic); this work seeks to be aligned with these perspectives. I. Characterization of the scientific methodology: search for bibliography and discussion of scientific works. II. Optimization of the methodology for the observation of leucocytes: blood samples donated by healthy people, anticoagulated with citrate or EDTA (UNC blood bank), n = 20; (a) blood smear of whole blood; (b) centrifugation at 200 g of plasma and aspirated leucocytes after erythrosedimentation, resuspension of the cell pellet and cytodispersion; cytological and cytochemical techniques. I. A deeper knowledge of the blood field was achieved; an appropriate atmosphere for scientific questioning was generated and the activities involved in the process were carried out responsibly. II. Better results were achieved using EDTA for the observation and analysis of leucocytes. The objectives of an approach to scientific research, as well as a contribution towards responsible development of continuous learning, were attained.
Gómez, Javier B; Gimeno, María J; Auqué, Luis F; Acero, Patricia
2014-01-15
This paper presents the mixing modelling results for the hydrogeochemical characterisation of groundwaters in the Laxemar area (Sweden). This area is one of the two sites that have been investigated, under the financial patronage of the Swedish Nuclear Fuel and Waste Management Co. (SKB), as possible candidates for hosting the proposed repository for the long-term storage of spent nuclear fuel. The classical geochemical modelling, interpreted in the light of the palaeohydrogeological history of the system, has shown that the driving process in the geochemical evolution of this groundwater system is the mixing between four end-member waters: a deep and old saline water, a glacial meltwater, an old marine water, and a meteoric water. In this paper we focus on mixing and its effects on the final chemical composition of the groundwaters using a comprehensive methodology that combines principal component analysis with mass balance calculations. This methodology allows us to test several combinations of end-member waters and several combinations of compositional variables in order to find optimal solutions in terms of mixing proportions. We have applied this methodology to a dataset of 287 groundwater samples from the Laxemar area collected and analysed by SKB. The best model found uses four conservative elements (Cl, Br, oxygen-18 and deuterium), and computes mixing proportions with respect to three end-member waters (saline, glacial and meteoric). Once the first-order effect of mixing has been taken into account, water-rock interaction can be used to explain the remaining variability. In this way, the chemistry of each water sample can be obtained by using the mixing proportions for the conservative elements, only affected by mixing, or by combining the mixing proportions and the chemical reactions for the non-conservative elements in the system, establishing the basis for predictive calculations. © 2013 Elsevier B.V. All rights reserved.
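The mixing-proportion step can be illustrated with a small constrained least-squares sketch (the end-member compositions and the sample's tracer values are invented; this is neither the SKB dataset nor the authors' code). The sum-to-one condition is imposed softly through a heavily weighted extra row, and non-negativity through NNLS:

```python
# Solve for non-negative mixing proportions of three end members from four
# conservative tracers (Cl, Br, delta-18O, delta-2H).
import numpy as np
from scipy.optimize import nnls

# Columns = end members (saline, glacial, meteoric); rows = Cl, Br, d18O, d2H
E = np.array([[45000.0,    5.0,   50.0],
              [  320.0,    0.02,   0.2],
              [   -9.0,  -21.0,  -11.0],
              [  -65.0, -158.0,  -80.0]])
sample = np.array([9026.5, 64.0, -13.6, -100.4])   # invented sample tracer values

scale = np.abs(E).max(axis=1)                      # per-tracer scaling
w = 1e3                                            # weight enforcing sum(p) = 1
A = np.vstack([E / scale[:, None], w * np.ones((1, 3))])
b = np.concatenate([sample / scale, [w]])

p, _ = nnls(A, b)
print("mixing proportions (saline, glacial, meteoric):", np.round(p, 3))
# -> approximately [0.2, 0.3, 0.5] for this invented sample
```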
A non-linear dimension reduction methodology for generating data-driven stochastic input models
NASA Astrophysics Data System (ADS)
Ganapathysubramanian, Baskar; Zabaras, Nicholas
2008-06-01
Stochastic analysis of random heterogeneous media (polycrystalline materials, porous media, functionally graded materials) provides information of significance only if realistic input models of the topology and property variations are used. This paper proposes a framework to construct such input stochastic models for the topology and thermal diffusivity variations in heterogeneous media using a data-driven strategy. Given a set of microstructure realizations (input samples) generated from given statistical information about the medium topology, the framework constructs a reduced-order stochastic representation of the thermal diffusivity. This problem of constructing a low-dimensional stochastic representation of property variations is analogous to the problem of manifold learning and parametric fitting of hyper-surfaces encountered in image processing and psychology. Denote by M the set of microstructures that satisfy the given experimental statistics. A non-linear dimension reduction strategy is utilized to map M to a low-dimensional region, A. We first show that M is a compact manifold embedded in a high-dimensional input space R^n. An isometric mapping F from M to a low-dimensional, compact, connected set A ⊂ R^d (d ≪ n) is constructed. Given only a finite set of samples of the data, the methodology uses arguments from graph theory and differential geometry to construct the isometric transformation F: M → A. Asymptotic convergence of the representation of M by A is shown. This mapping F serves as an accurate, low-dimensional, data-driven representation of the property variations. The reduced-order model of the material topology and thermal diffusivity variations is subsequently used as an input in the solution of stochastic partial differential equations that describe the evolution of dependent variables. A sparse grid collocation strategy (Smolyak algorithm) is utilized to solve these stochastic equations efficiently. We showcase the methodology by constructing low-dimensional input stochastic models to represent thermal diffusivity in two-phase microstructures. This model is used in analyzing the effect of topological variations of two-phase microstructures on the evolution of temperature in heat conduction processes.
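As a rough stand-in for the non-linear reduction step (using scikit-learn's Isomap rather than the authors' own graph-theoretic construction, with invented indicator-field samples), the mapping from the microstructure set M to a low-dimensional set A might be sketched as follows; mapping new low-dimensional points back to microstructures would require the inverse transformation developed in the paper:

```python
# Isomap-style reduction of (toy) microstructure samples to a low-dimensional set A.
import numpy as np
from sklearn.manifold import Isomap

rng = np.random.default_rng(6)
latent = rng.uniform(0.2, 0.6, size=(200, 1))       # hidden volume fraction per sample
noise = rng.random((200, 32 * 32))
samples = (noise < latent).astype(float)            # flattened 32x32 indicator fields

embedding = Isomap(n_neighbors=20, n_components=3)  # M -> A, d = 3 << n = 1024
A = embedding.fit_transform(samples)

# Sample uniformly inside the bounding box of A as a crude surrogate input model
new_points = rng.uniform(A.min(axis=0), A.max(axis=0), size=(5, 3))
print(A.shape, new_points.round(2))
```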
Study of jojoba oil aging by FTIR.
Le Dréau, Y; Dupuy, N; Gaydou, V; Joachim, J; Kister, J
2009-05-29
As jojoba oil is used in the cosmetic, pharmaceutical, dietetic food, animal feed, lubrication, polishing and biodiesel fields, it is important to study its aging at high temperature under oxidative conditions. In this work an FT-MIR methodology was developed for monitoring the accelerated oxidative degradation of jojoba oils. Principal component analysis (PCA) was used to differentiate samples according to their origin and production process, and to differentiate the oxidative conditions applied to the oils. Two spectroscopic indices were calculated to report the oxidation phenomenon simply. The results were confirmed and extended by multivariate curve resolution-alternating least squares (MCR-ALS), which allowed the identification of chemical species produced or degraded during the thermal treatment, following a SIMPLISMA pretreatment.
NASA Technical Reports Server (NTRS)
1980-01-01
The U.S./Canada wheat/barley exploratory experiment is discussed with emphasis on labeling, machine processing using P1A, and the crop calendar. Classification and the simulated aggregation test used in the U.S. corn/soybean exploratory experiment are also considered. Topics covered regarding the foreign commodity production forecasting project include the acquisition, handling, and processing of both U.S. and foreign agricultural data, as well as meteorological data. The accuracy assessment methodology, multicrop sampling and aggregation technology development, frame development, the yield project interface, and classification for area estimation are also examined.
Amy L. Sheaffer; Jay Beaman; Joseph T. O' Leary; Rebecca L. Williams; Doran M. Mason
2001-01-01
Sampling for research in recreation settings is an ongoing challenge. Often certain groups of users are more likely to be sampled. In measuring public support for resource conservation and in understanding the use of natural resources for recreation, it is important to evaluate issues of bias in survey methodologies. Important methodological issues emerged from a statewide...
Dehbi, Hakim-Moulay; Howard, James P; Shun-Shin, Matthew J; Sen, Sayan; Nijjer, Sukhjinder S; Mayet, Jamil; Davies, Justin E; Francis, Darrel P
2018-01-01
Background Diagnostic accuracy is widely accepted by researchers and clinicians as an optimal expression of a test’s performance. The aim of this study was to evaluate the effects of disease severity distribution on values of diagnostic accuracy as well as propose a sample-independent methodology to calculate and display accuracy of diagnostic tests. Methods and findings We evaluated the diagnostic relationship between two hypothetical methods to measure serum cholesterol (Chol_rapid and Chol_gold) by generating samples with statistical software and (1) keeping the numerical relationship between methods unchanged and (2) changing the distribution of cholesterol values. Metrics of categorical agreement were calculated (accuracy, sensitivity and specificity). Finally, a novel methodology to display and calculate accuracy values was presented (the V-plot of accuracies). Conclusion No single value of diagnostic accuracy can be used to describe the relationship between tests, as accuracy is a metric heavily affected by the underlying sample distribution. Our novel proposed methodology, the V-plot of accuracies, can be used as a sample-independent measure of a test performance against a reference gold standard. PMID:29387424
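The paper's central point, that accuracy moves with the underlying distribution even when the relationship between the two methods is fixed, can be reproduced with a short simulation (threshold, noise level and distributions are invented):

```python
# Same measurement error model, two different value distributions, two accuracies.
import numpy as np

rng = np.random.default_rng(7)
threshold = 5.0                                  # diagnostic cut-off (assumed)

def accuracy(true_values):
    rapid = true_values + rng.normal(0, 0.3, true_values.size)   # fixed error model
    return np.mean((rapid > threshold) == (true_values > threshold))

narrow = rng.normal(5.0, 0.4, 10_000)    # values clustered near the cut-off
wide = rng.normal(5.0, 2.0, 10_000)      # values spread far from the cut-off

print("accuracy, narrow distribution:", round(accuracy(narrow), 3))
print("accuracy, wide distribution  :", round(accuracy(wide), 3))   # much higher
```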
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-23
... collection: Extension of the time frame required to complete approved and ongoing methodological research on... methodological research on the National Crime Victimization Survey. (2) Title of the Form/Collection: National.... This generic clearance will cover methodological research that will use existing or new sampled...
Critical Thinking: Comparing Instructional Methodologies in a Senior-Year Learning Community
ERIC Educational Resources Information Center
Zelizer, Deborah A.
2013-01-01
This quasi-experimental, nonequivalent control group study compared the impact of Ennis's (1989) mixed instructional methodology to the immersion methodology on the development of critical thinking in a multicultural, undergraduate senior-year learning community. A convenience sample of students (n =171) were selected from four sections of a…
Multi-laboratory survey of qPCR enterococci analysis method performance
Quantitative polymerase chain reaction (qPCR) has become a frequently used technique for quantifying enterococci in recreational surface waters, but there are several methodological options. Here we evaluated how three method permutations (type of mastermix, sample extract dilution and use of controls in results calculation) affect method reliability among multiple laboratories with respect to sample interference. Multiple samples from each of 22 sites representing an array of habitat types were analyzed using EPA Method 1611 and 1609 reagents with full-strength and five-fold diluted extracts. The presence of interference was assessed in three ways: using sample processing and PCR amplification controls; consistency of results across extract dilutions; and relative recovery of target genes from spiked enterococci in water samples compared to control matrices, with acceptable recovery defined as 50 to 200%. Method 1609, which is based on an environmental mastermix, was found to be superior to Method 1611, which is based on a universal mastermix. Method 1611 had over a 40% control assay failure rate with undiluted extracts and a 6% failure rate with diluted extracts. Method 1609 failed in only 11% and 3% of undiluted and diluted extract analyses. Use of sample processing control assay results in the delta-delta Ct method for calculating relative target gene recoveries increased the number of acceptable recovery results. Delta-delta tended to bias recoveries fr
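As a worked sketch of the delta-delta Ct recovery calculation referred to above (Ct values are invented, and the exact calibrator scheme of the EPA methods may differ):

```python
# Relative recovery = 2^-[(Ct_target - Ct_spc)_sample - (Ct_target - Ct_spc)_calibrator]
def relative_recovery(ct_target_sample, ct_spc_sample, ct_target_cal, ct_spc_cal):
    ddct = (ct_target_sample - ct_spc_sample) - (ct_target_cal - ct_spc_cal)
    return 100 * 2 ** (-ddct)          # percent recovery relative to calibrator

# Example: sample amplifies one cycle later than the calibrator after SPC correction
print(relative_recovery(31.0, 24.0, 30.0, 24.0))   # -> 50.0 (%), within 50-200% limit
```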
DREAM: An Efficient Methodology for DSMC Simulation of Unsteady Processes
NASA Astrophysics Data System (ADS)
Cave, H. M.; Jermy, M. C.; Tseng, K. C.; Wu, J. S.
2008-12-01
A technique called the DSMC Rapid Ensemble Averaging Method (DREAM) for reducing the statistical scatter in the output from unsteady DSMC simulations is introduced. During post-processing by DREAM, the DSMC algorithm is re-run multiple times over a short period before the temporal point of interest thus building up a combination of time- and ensemble-averaged sampling data. The particle data is regenerated several mean collision times before the output time using the particle data generated during the original DSMC run. This methodology conserves the original phase space data from the DSMC run and so is suitable for reducing the statistical scatter in highly non-equilibrium flows. In this paper, the DREAM-II method is investigated and verified in detail. Propagating shock waves at high Mach numbers (Mach 8 and 12) are simulated using a parallel DSMC code (PDSC) and then post-processed using DREAM. The ability of DREAM to obtain the correct particle velocity distribution in the shock structure is demonstrated and the reduction of statistical scatter in the output macroscopic properties is measured. DREAM is also used to reduce the statistical scatter in the results from the interaction of a Mach 4 shock with a square cavity and for the interaction of a Mach 12 shock on a wedge in a channel.
Ebshish, Ali; Yaakob, Zahira; Taufiq-Yap, Yun Hin; Bshish, Ahmed
2014-03-19
In this work, a response surface methodology (RSM) was implemented to investigate the process variables in a hydrogen production system. The effects of five independent variables, namely the temperature (X₁), the flow rate (X₂), the catalyst weight (X₃), the catalyst loading (X₄) and the glycerol-water molar ratio (X₅), on the H₂ yield (Y₁) and the conversion of glycerol to gaseous products (Y₂) were explored. Using multiple regression analysis, the experimental results for the H₂ yield and the glycerol conversion to gases were fit to quadratic polynomial models. The proposed mathematical models correlated the dependent factors well within the limits examined. The best values of the process variables were a temperature of approximately 600 °C, a feed flow rate of 0.05 mL/min, a catalyst weight of 0.2 g, a catalyst loading of 20% and a glycerol-water molar ratio of approximately 12, where the H₂ yield was predicted to be 57.6% and the conversion of glycerol was predicted to be 75%. To validate the proposed models, statistical analysis using a two-sample t-test was performed, and the results showed that the models could predict the responses satisfactorily within the limits of the variables studied.
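A generic response-surface fit of this kind, quadratic polynomial terms in coded factors fit by ordinary least squares, can be sketched as follows (the data, coefficients and optimum are invented and unrelated to the authors' experiments):

```python
# Quadratic response-surface model in five coded process variables.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(8)
X = rng.uniform(-1, 1, size=(40, 5))                       # coded factor levels X1..X5
true = 50 + 6*X[:, 0] - 4*X[:, 1]**2 + 3*X[:, 0]*X[:, 4]   # invented response surface
y = true + rng.normal(0, 1.0, 40)                          # H2 yield (%) with noise

quad = PolynomialFeatures(degree=2, include_bias=False)    # linear, square, interaction terms
model = LinearRegression().fit(quad.fit_transform(X), y)

x_opt = np.array([[0.8, -0.5, 0.2, 0.0, 0.6]])             # hypothetical optimum (coded)
print("predicted yield at optimum:", model.predict(quad.transform(x_opt)))
```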
Gervais, Gaël; Bichon, Emmanuelle; Antignac, Jean-Philippe; Monteau, Fabrice; Leroy, Gaëla; Barritaud, Lauriane; Chachignon, Mathilde; Ingrand, Valérie; Roche, Pascal; Le Bizec, Bruno
2011-06-01
The detection and structural elucidation of by-products formed during the treatment of micropollutants are major issues when estimating the efficiency of the processes employed for drinking water production against contamination by endocrine-disrupting compounds. This issue has mainly been investigated at the laboratory scale and under high-concentration conditions. However, the by-products generated after chlorination can be influenced by the dilution factor encountered under real conditions. The present study proposes a new methodology, borrowed from metabolomics and using liquid chromatography coupled to high-resolution mass spectrometry, to reveal potential chlorination by-products of ethinylestradiol (EE2) in spiked real water samples at the part-per-billion level (5 μg L⁻¹). Conventional targeted measurements first demonstrated that chlorination with sodium hypochlorite (0.8 mg L⁻¹) led to removal of ethinylestradiol above 97%. The differential global profiling approach developed here then revealed eight chlorination by-products of EE2, six of which are described for the first time. Among these eight halogenated compounds, five have been structurally identified, demonstrating the potential of this new methodology applied to environmental samples. Copyright © 2011 Elsevier Ltd. All rights reserved.
Klink, Vincent P.; Overall, Christopher C.; Alkharouf, Nadim W.; MacDonald, Margaret H.; Matthews, Benjamin F.
2010-01-01
Background. A comparative microarray investigation was performed using detection call methodology (DCM) and differential expression analyses. The goal was to identify genes found in specific cell populations that were eliminated by differential expression analysis due to the nature of differential expression methods. Laser capture microdissection (LCM) was used to isolate nearly homogeneous populations of plant root cells. Results. The analyses identified 13,291 transcripts across the 4 different sample types, which filtered down to a total of 6,267 detected as present in one or more sample types. A comparative analysis of DCM and differential expression methods revealed a group of genes that were not differentially expressed but were expressed at detectable levels within specific cell types. Conclusion. DCM identified patterns of gene expression not shown by differential expression analyses, including genes that are possibly cell-type specific and/or involved in important aspects of plant-nematode interactions during the resistance response, revealing the uniqueness of a particular cell population at a particular point during its differentiation process. PMID:20508855
Airport surveys at travel destinations--underutilized opportunities in travel medicine research?
Bauer, Irmgard L
2015-01-01
Research in destination airports, especially in resource-poor areas, allows unique immediate access to travelers at the conclusion of their trip. Response rates are high and the recall gap small. Trip-related health matters can be elicited relatively easily. An insight into travelers' decision-making processes on location would fill large gaps in our knowledge regarding travel health advice provision; yet, this approach is still much underutilized. Using PubMed, ScienceDirect, Google Scholar, and ProQuest, a review of the literature on airport surveys was conducted to determine where they were used, their response rates and purpose, and location-relevant methodological information. The lack of methodological guidelines in the reviewed literature resulted in recommendations for planning and conducting an airport survey at a destination airport. Millions of travelers in airports around the world represent an underutilized sample of potential study participants for topics that cannot be studied adequately in other settings. Benefiting from close cooperation between travel health professionals and airport authorities, researchers can expect not only large-scale convenience samples for surveys, but also opportunities to explore exciting and creative research topics to broaden our understanding of travel medicine and health. © 2014 International Society of Travel Medicine.
Rodríguez-González, Alejandro; Torres-Niño, Javier; Valencia-Garcia, Rafael; Mayer, Miguel A; Alor-Hernandez, Giner
2013-09-01
This paper proposes a new methodology for assessing the efficiency of medical diagnostic systems and clinical decision support systems by using the feedback/opinions of medical experts. The methodology is based on a comparison between the expert feedback used to solve different clinical cases and the expert system's evaluation of these same cases. Once the results are returned, an arbitration process is carried out in order to ensure the correctness of the results provided by both methods. Once this process has been completed, the results are analyzed using Precision, Recall, Accuracy, Specificity and Matthews Correlation Coefficient (MCC) (PRAS-M) metrics. When the methodology is applied, the results obtained from a real diagnostic system allow researchers to establish the accuracy of the system based on objective facts. The methodology returns enough information to analyze the system's behavior for each disease in the knowledge base or across the entire knowledge base. It also returns data on the efficiency of the different assessors involved in the evaluation process, analyzing their behavior in the diagnostic process. The proposed work facilitates the evaluation of medical diagnostic systems, providing a reliable process based on objective facts. The methodology presented in this research makes it possible to identify the main characteristics that define a medical diagnostic system and their values, allowing for system improvement. A good example of the results provided by applying the methodology is shown in this paper: a diagnosis system was evaluated by means of this methodology, yielding positive (statistically significant) results when comparing the system with the assessors who participated in its evaluation through metrics such as recall (+27.54%) and MCC (+32.19%). These results demonstrate the real applicability of the methodology. Copyright © 2013 Elsevier Ltd. All rights reserved.
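The PRAS-M metrics named above are standard confusion-matrix quantities; a short sketch with invented counts:

```python
# Precision, Recall, Accuracy, Specificity and MCC from a binary confusion matrix.
import math

tp, fp, fn, tn = 42, 7, 5, 46   # invented counts

precision   = tp / (tp + fp)
recall      = tp / (tp + fn)                       # sensitivity
accuracy    = (tp + tn) / (tp + fp + fn + tn)
specificity = tn / (tn + fp)
mcc = ((tp * tn - fp * fn) /
       math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)))

print(f"P={precision:.2f} R={recall:.2f} A={accuracy:.2f} "
      f"S={specificity:.2f} MCC={mcc:.2f}")
```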
Methodological quality and descriptive characteristics of prosthodontic-related systematic reviews.
Aziz, T; Compton, S; Nassar, U; Matthews, D; Ansari, K; Flores-Mir, C
2013-04-01
Ideally, healthcare systematic reviews (SRs) should be beneficial to practicing professionals in making evidence-based clinical decisions. However, the conclusions drawn from SRs are directly related to the quality of the SR and of the included studies. The aim was to investigate the methodological quality and key descriptive characteristics of SRs published in prosthodontics. Methodological quality was analysed using the Assessment of Multiple Reviews (AMSTAR) tool. Several electronic resources (MEDLINE, EMBASE, Web of Science and the American Dental Association's Evidence-based Dentistry website) were searched. In total, 106 SRs were located. Key descriptive characteristics and methodological quality features were gathered and assessed, and descriptive and inferential statistical testing was performed. Most SRs in this sample originated from the European continent, followed by North America. Two to five authors conducted most SRs; the majority were affiliated with academic institutions and had prior experience publishing SRs. The majority of SRs were published in specialty dentistry journals, with implant or implant-related topics the primary topics of interest. According to AMSTAR, most quality aspects were adequately fulfilled by fewer than half of the reviews. Publication bias assessment and grey literature searches were the most poorly addressed components. Overall, the methodological quality of the prosthodontic-related SRs was deemed limited. Future recommendations include that authors have prior training in conducting SRs and that journals adopt a universal checklist addressing all key characteristics of an unbiased SR process. © 2013 Blackwell Publishing Ltd.
Evaluation of glucose controllers in virtual environment: methodology and sample application.
Chassin, Ludovic J; Wilinska, Malgorzata E; Hovorka, Roman
2004-11-01
Adaptive systems to deliver medical treatment in humans are safety-critical systems and require particular care in both the testing and the evaluation phase, which are time-consuming, costly, and confounded by ethical issues. The objective of the present work is to develop a methodology to test glucose controllers of an artificial pancreas in a simulated (virtual) environment. A virtual environment comprising a model of the carbohydrate metabolism and models of the insulin pump and the glucose sensor is employed to simulate individual glucose excursions in subjects with type 1 diabetes. The performance of the control algorithm within the virtual environment is evaluated by considering treatment and operational scenarios. The developed methodology includes two dimensions: testing in relation to specific life style conditions, i.e. fasting, post-prandial, and life style (metabolic) disturbances; and testing in relation to various operating conditions, i.e. expected operating conditions, adverse operating conditions, and system failure. We define safety and efficacy criteria and describe the measures to be taken prior to clinical testing. The use of the methodology is exemplified by tuning and evaluating a model predictive glucose controller being developed for a wearable artificial pancreas focused on fasting conditions. Our methodology to test glucose controllers in a virtual environment is instrumental in anticipating the results of real clinical tests for different physiological conditions and for different operating conditions. The thorough testing in the virtual environment reduces costs and speeds up the development process.
RAS testing in metastatic colorectal cancer: advances in Europe.
Van Krieken, J Han J M; Rouleau, Etienne; Ligtenberg, Marjolijn J L; Normanno, Nicola; Patterson, Scott D; Jung, Andreas
2016-04-01
Personalized medicine shows promise for maximizing efficacy and minimizing toxicity of anti-cancer treatment. KRAS exon 2 mutations are predictive of resistance to epidermal growth factor receptor-directed monoclonal antibodies in patients with metastatic colorectal cancer. Recent studies have shown that broader RAS testing (KRAS and NRAS) is needed to select patients for treatment. While Sanger sequencing is still used, approaches based on various methodologies are available. Few CE-approved kits, however, detect the full spectrum of RAS mutations. More recently, "next-generation" sequencing has been developed for research use, including parallel semiconductor sequencing and reversible termination. These techniques have high technical sensitivities for detecting mutations, although the ideal threshold is currently unknown. Finally, liquid biopsy has the potential to become an additional tool to assess tumor-derived DNA. For accurate and timely RAS testing, appropriate sampling and prompt delivery of material is critical. Processes to ensure efficient turnaround from sample request to RAS evaluation must be implemented so that patients receive the most appropriate treatment. Given the variety of methodologies, external quality assurance programs are important to ensure a high standard of RAS testing. Here, we review technical and practical aspects of RAS testing for pathologists working with metastatic colorectal cancer tumor samples. The extension of markers from KRAS to RAS testing is the new paradigm for biomarker testing in colorectal cancer.
Multiple ligand simultaneous docking: orchestrated dancing of ligands in binding sites of protein.
Li, Huameng; Li, Chenglong
2010-07-30
Present docking methodologies simulate only a single ligand at a time during the docking process. In reality, the molecular recognition process always involves multiple molecular species. Typical protein-ligand interactions are, for example, substrate and cofactor in the catalytic cycle; metal ion coordination together with ligand(s); and ligand binding with water molecules. To simulate the real molecular binding processes, we propose a novel multiple ligand simultaneous docking (MLSD) strategy, which can deal with all the above processes, vastly improving docking sampling and binding free energy scoring. The work also compares two search strategies: Lamarckian genetic algorithm and particle swarm optimization, which have respective advantages depending on the specific systems. The methodology proves robust through systematic testing against several diverse model systems: E. coli purine nucleoside phosphorylase (PNP) in complex with two substrates, SHP2 N-SH2 in complex with two peptides and Bcl-xL in complex with ABT-737 fragments. In all cases, the final correct docking poses and relative binding free energies were obtained. In the PNP case, the simulations also capture the binding intermediates and reveal the binding dynamics during the recognition processes, which are consistent with the proposed enzymatic mechanism. In the other two cases, conventional single-ligand docking fails due to energetic and dynamic coupling among ligands, whereas MLSD results in the correct binding modes. These three cases also represent potential applications in the areas of exploring enzymatic mechanisms, interpreting noisy X-ray crystallographic maps, and aiding fragment-based drug design, respectively. 2010 Wiley Periodicals, Inc.
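Since particle swarm optimization is one of the two search strategies compared above, a bare-bones PSO sketch is shown below; the two-dimensional quadratic objective is only a stand-in for a real docking scoring function, and the inertia and acceleration constants are generic textbook values rather than the authors' settings.

```python
# Minimal particle swarm optimization sketch: particles explore a search space,
# each remembers its personal best, and all are attracted to the global best.
import numpy as np

def pso(score, dim=2, n_particles=30, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))       # positions (e.g. docking poses)
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.apply_along_axis(score, 1, x)
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = x + v
        vals = np.apply_along_axis(score, 1, x)
        better = vals < pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

# Toy "scoring function" with a single minimum at (1, -2).
print(pso(lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2))
```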
Nagarajan, Mahesh B.; Huber, Markus B.; Schlossbauer, Thomas; Leinsinger, Gerda; Krol, Andrzej; Wismüller, Axel
2014-01-01
Objective While dimension reduction has been previously explored in computer aided diagnosis (CADx) as an alternative to feature selection, previous implementations of its integration into CADx do not ensure strict separation between training and test data required for the machine learning task. This compromises the integrity of the independent test set, which serves as the basis for evaluating classifier performance. Methods and Materials We propose, implement and evaluate an improved CADx methodology where strict separation is maintained. This is achieved by subjecting the training data alone to dimension reduction; the test data is subsequently processed with out-of-sample extension methods. Our approach is demonstrated in the research context of classifying small diagnostically challenging lesions annotated on dynamic breast magnetic resonance imaging (MRI) studies. The lesions were dynamically characterized through topological feature vectors derived from Minkowski functionals. These feature vectors were then subject to dimension reduction with different linear and non-linear algorithms applied in conjunction with out-of-sample extension techniques. This was followed by classification through supervised learning with support vector regression. Area under the receiver-operating characteristic curve (AUC) was evaluated as the metric of classifier performance. Results Of the feature vectors investigated, the best performance was observed with Minkowski functional ’perimeter’ while comparable performance was observed with ’area’. Of the dimension reduction algorithms tested with ’perimeter’, the best performance was observed with Sammon’s mapping (0.84 ± 0.10) while comparable performance was achieved with exploratory observation machine (0.82 ± 0.09) and principal component analysis (0.80 ± 0.10). Conclusions The results reported in this study with the proposed CADx methodology present a significant improvement over previous results reported with such small lesions on dynamic breast MRI. In particular, non-linear algorithms for dimension reduction exhibited better classification performance than linear approaches, when integrated into our CADx methodology. We also note that while dimension reduction techniques may not necessarily provide an improvement in classification performance over feature selection, they do allow for a higher degree of feature compaction. PMID:24355697
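The train/test separation described above can be illustrated with a short sketch: dimension reduction is fitted on the training split only, held-out data are mapped through the fitted transform (the out-of-sample extension, shown here for the simple linear PCA case), and the support vector regression output is scored with AUC. The synthetic feature matrix merely stands in for the Minkowski-functional vectors.

```python
# Sketch of strict train/test separation: dimension reduction is fitted on
# training data only; test data are mapped with the fitted transform; SVR
# output is used as a classification score. Data here are synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVR
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 50))                  # stand-in for topological feature vectors
y = (X[:, :5].sum(axis=1) + rng.normal(size=120) > 0).astype(int)   # lesion labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

pca = PCA(n_components=10).fit(X_tr)            # fitted on training data only
Z_tr, Z_te = pca.transform(X_tr), pca.transform(X_te)

svr = SVR(kernel="rbf", C=1.0).fit(Z_tr, y_tr)  # regression output used as a score
auc = roc_auc_score(y_te, svr.predict(Z_te))
print(f"AUC on the untouched test set: {auc:.2f}")
```

Non-linear reducers such as Sammon's mapping need an explicit out-of-sample extension step in place of the `transform` call, which is exactly the point the authors make about preserving test-set integrity.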
Botero-Coy, A M; Ibáñez, M; Sancho, J V; Hernández, F
2013-05-31
The determination of glyphosate (GLY) in soils is of great interest due to the widespread use of this herbicide and the need to assess its impact on the soil/water environment. However, its residue determination is very problematic, especially in soils with high organic matter content, where strong interferences are normally observed, and because of the particular physico-chemical characteristics of this polar/ionic herbicide. In the present work, we have improved previous LC-MS/MS analytical methodology reported for GLY and its main metabolite AMPA in order to be applied to "difficult" soils, like those commonly found in South America, where this herbicide is extensively used in large areas devoted to soya or maize, among other crops. The method is based on derivatization with FMOC followed by LC-MS/MS analysis, using triple quadrupole. After extraction with potassium hydroxide, a combination of extract dilution, adjustment to appropriate pH, and solid phase extraction (SPE) clean-up was applied to minimize the strong interferences observed. Despite the clean-up performed, the use of isotope-labelled glyphosate as internal standard (ILIS) was necessary for the correction of matrix effects and to compensate for any error occurring during sample processing. The analytical methodology was satisfactorily validated in four soils from Colombia and Argentina fortified at 0.5 and 5 mg/kg. In contrast to most LC-MS/MS methods, where the acquisition of two transitions is recommended, monitoring all available transitions was required for confirmation of positive samples, as some of them were interfered with by unknown soil components. This was observed not only for GLY and AMPA but also for the ILIS. Analysis by QTOF MS was useful to confirm the presence of interfering compounds that shared the same nominal mass as the analytes as well as some of their main product ions. Therefore, the selection of specific transitions was crucial to avoid interferences. The methodology developed was applied to the analysis of 26 soils from different areas of Colombia and Argentina, and the method robustness was demonstrated by analysis of quality control samples over 4 months. Copyright © 2012 Elsevier B.V. All rights reserved.
Process improvement for regulatory analyses of custom-blend fertilizers.
Wegner, Keith A
2014-01-01
Chemical testing of custom-blend fertilizers is essential to ensure that the products meet the formulation requirements. For purposes of proper crop nutrition and consumer protection, regulatory oversight promotes compliance and particular attention to blending and formulation specifications. Analyses of custom-blend fertilizer products must be performed and reported within a very narrow window in order to be effective. The Colorado Department of Agriculture's Biochemistry Laboratory is an ISO 17025 accredited facility and conducts analyses of custom-blend fertilizer products primarily during the spring planting season. Using the Lean Six Sigma (LSS) process, the Biochemistry Laboratory has reduced turnaround times from as much as 45 days to as little as 3 days. The LSS methodology focuses on waste reduction through identifying non-value-added steps, unneeded process reviews, opportunities to optimize screening and confirmatory analyses, equipment utilization, nonessential reporting requirements, and inefficient personnel deployment. Eliminating these non-value-added activities helped the laboratory significantly shorten turnaround time and reduce costs. Key improvement elements discovered during the LSS process included: focused sample tracking, equipment redundancy, strategic supply stocking, batch size optimization, critical sample paths, elimination of nonessential QC reviews, and more efficient personnel deployment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barefield Ii, James E; Clegg, Samuel M; Lopez, Leon N
2010-01-01
Advanced methodologies and improvements to current measurement techniques are needed to strengthen the effectiveness and efficiency of international safeguards. This need was recognized and discussed at a Technical Meeting on 'The Application of Laser Spectrometry Techniques in IAEA Safeguards' held at IAEA headquarters (September 2006). One of the principal recommendations from that meeting was the need to pursue the development of novel complementary access instrumentation based on Laser Induced Breakdown Spectroscopy (LIBS) for the detection of gaseous and solid signatures and indicators of nuclear fuel cycle processes and associated materials. Pursuant to this recommendation, the Department of Safeguards (SG), under the Division of Technical Support (SGTS), convened the 'Experts and Users Advisory Meeting on Laser Induced Breakdown Spectroscopy (LIBS) for Safeguards Applications', also held at IAEA headquarters (July 2008). This meeting was attended by 12 LIBS experts from the Czech Republic, the European Commission, France, the Republic of South Korea, the United States of America, Germany, the United Kingdom of Great Britain and Northern Ireland, and Canada. Following a presentation of the needs of the IAEA inspectors, the LIBS experts agreed that the needs as presented could be partially or fully fulfilled using LIBS instrumentation. Inspectors' needs were grouped into the following broad categories: (1) improvements to in-field measurements/environmental sampling; (2) monitoring the status of activities in hot cells; (3) verifying the status of activity at a declared facility via process monitoring; and (4) pre-screening of environmental samples before analysis. The primary tool employed by the IAEA to detect undeclared processes and activities at special nuclear material facilities and sites is environmental sampling. One of the objectives of the Next Generation Safeguards Initiative (NGSI) Program Plan calls for the development of advanced tools and methodologies to detect and analyze undeclared processing or production of special nuclear material. Los Alamos National Laboratory is currently investigating potential uses of LIBS for safeguards applications, including (1) a user-friendly man-portable LIBS system to characterize samples in real to near-real time (typical analysis times are on the order of minutes) across a wide range of elements in the periodic table, from hydrogen up to heavy elements like plutonium and uranium; (2) a LIBS system that can be deployed in harsh environments such as hot cells and glove boxes, providing relative compositional analysis of process streams, for example ratios like Cm/Pu and Cm/U; (3) an inspector field-deployable system that can be used to analyze the elemental composition of microscopic quantities of samples containing plutonium and uranium; and (4) a high-resolution LIBS system that can be used to determine the isotopic composition of samples containing, for example, uranium and plutonium. In this paper, we describe our current development and performance testing results for LIBS instrumentation, both in a fixed laboratory and in field-deployable configurations.
Springer, Jan; White, P Lewis; Hamilton, Shanna; Michel, Denise; Barnes, Rosemary A; Einsele, Hermann; Löffler, Juergen
2016-03-01
Standardized methodologies for the molecular detection of invasive aspergillosis (IA) have been established by the European Aspergillus PCR Initiative for the testing of whole blood, serum, and plasma. While some comparison of the performance of Aspergillus PCR when testing these different sample types has been performed, no single study has evaluated all three using the recommended protocols. Standardized Aspergillus PCR was performed on 423 whole-blood pellets (WBP), 583 plasma samples, and 419 serum samples obtained from hematology patients according to the recommendations. This analysis formed a bicenter retrospective anonymous case-control study, with diagnosis according to the revised European Organization for Research and Treatment of Cancer/Invasive Fungal Infections Cooperative Group and National Institute of Allergy and Infectious Diseases Mycoses Study Group (EORTC/MSG) consensus definitions (11 probable cases and 36 controls). Values for clinical performance using individual and combined samples were calculated. For all samples, PCR positivity was significantly associated with cases of IA (for plasma, P = 0.0019; for serum, P = 0.0049; and for WBP, P = 0.0089). Plasma PCR generated the highest sensitivity (91%); the sensitivities for serum and WBP PCR were 80% and 55%, respectively. The highest specificity was achieved when testing WBP (96%), which was significantly superior to the specificities achieved when testing serum (69%, P = 0.0238) and plasma (53%, P = 0.0002). No cases were PCR negative in all specimen types, and no controls were PCR positive in all specimens. This study confirms that Aspergillus PCR testing of plasma provides robust performance while utilizing commercial automated DNA extraction processes. Combining PCR testing of different blood fractions allows IA to be both confidently diagnosed and excluded. A requirement for multiple PCR-positive plasma samples provides similar diagnostic utility and is technically less demanding. Time to diagnosis may be enhanced by testing multiple contemporaneously obtained sample types. Copyright © 2016, American Society for Microbiology. All Rights Reserved.
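As a sketch of the clinical-performance calculation for a single specimen type, the snippet below derives sensitivity and specificity from case-control counts and tests the association between PCR positivity and IA with Fisher's exact test; the counts used are illustrative rather than the study's data.

```python
# Minimal sketch: sensitivity/specificity from case-control counts plus the
# association test. The counts below are illustrative, not the study data.
from scipy.stats import fisher_exact

cases_pos, cases_neg = 10, 1         # PCR result among probable IA cases
controls_pos, controls_neg = 17, 19  # PCR result among controls

sensitivity = cases_pos / (cases_pos + cases_neg)
specificity = controls_neg / (controls_pos + controls_neg)
odds_ratio, p_value = fisher_exact([[cases_pos, cases_neg],
                                    [controls_pos, controls_neg]])

print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f} p={p_value:.4f}")
```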
Biostatistical analysis of quantitative immunofluorescence microscopy images.
Giles, C; Albrecht, M A; Lam, V; Takechi, R; Mamo, J C
2016-12-01
Semiquantitative immunofluorescence microscopy has become a key methodology in biomedical research. Typical statistical workflows are considered in the context of avoiding pseudo-replication and marginalising experimental error. However, immunofluorescence microscopy naturally generates hierarchically structured data that can be leveraged to improve statistical power and enrich biological interpretation. Herein, we describe a robust distribution fitting procedure and compare several statistical tests, outlining their potential advantages/disadvantages in the context of biological interpretation. Further, we describe tractable procedures for power analysis that incorporates the underlying distribution, sample size and number of images captured per sample. The procedures outlined have significant potential for increasing understanding of biological processes and decreasing both ethical and financial burden through experimental optimization. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
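A short sketch of how the hierarchical structure (biological samples, several images per sample) can enter a simulation-based power analysis, analysing per-sample means so images are not treated as independent replicates; the effect size and variance components are invented for illustration and do not come from the paper.

```python
# Simulation-based power analysis respecting the hierarchy:
# samples -> images per sample, analysed at the sample level.
import numpy as np
from scipy.stats import ttest_ind

def power(n_samples, images_per_sample, effect=0.4, sd_sample=0.5,
          sd_image=0.8, n_sim=2000, alpha=0.05, seed=1):
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sim):
        def group(shift):
            sample_means = shift + rng.normal(0, sd_sample, n_samples)
            images = sample_means[:, None] + rng.normal(
                0, sd_image, (n_samples, images_per_sample))
            return images.mean(axis=1)          # one value per biological sample
        _, p = ttest_ind(group(0.0), group(effect))
        hits += p < alpha
    return hits / n_sim

for n, k in [(5, 5), (5, 20), (10, 5)]:
    print(f"n_samples={n:2d} images/sample={k:2d} power={power(n, k):.2f}")
```

Running the three settings makes the trade-off explicit: adding biological samples raises power far more than adding images per sample once the image-level noise has been averaged down.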
Determination of aflatoxins in by-products of industrial processing of cocoa beans.
Copetti, Marina V; Iamanaka, Beatriz T; Pereira, José Luiz; Lemes, Daniel P; Nakano, Felipe; Taniwaki, Marta H
2012-01-01
This study has examined the occurrence of aflatoxins in 168 samples of different fractions obtained during the processing of cocoa in manufacturing plants (shell, nibs, mass, butter, cake and powder) using an optimised methodology for cocoa by-products. The method validation was based on selectivity, linearity, limit of detection and recovery. The method was shown to be adequate for use in quantifying the contamination of cocoa by aflatoxins B1, B2, G1 and G2. Furthermore, the method was easier to use than other methods available in the literature. For aflatoxin extraction from cocoa samples, a methanol-water solution was used, and then immunoaffinity columns were employed for clean-up before the determination by high-performance liquid chromatography. A survey demonstrated a widespread occurrence of aflatoxins in cocoa by-products, although in general the levels of aflatoxins present in the fractions from industrial processing of cocoa were low. A maximum aflatoxin contamination of 13.3 ng g⁻¹ was found in a nib sample. The lowest contamination levels were found in cocoa butter. Continued monitoring of aflatoxins in cocoa by-products is nevertheless necessary because these toxins have a high toxicity to humans and cocoa is widely consumed by children through cocoa-containing products, like candies.
Santos, Sara; Almeida, Inês; Oliveiros, Bárbara; Castelo-Branco, Miguel
2016-01-01
Faces play a key role in signaling social cues such as signals of trustworthiness. Although several studies identify the amygdala as a core brain region in social cognition, quantitative approaches evaluating its role are scarce. This review aimed to assess the role of the amygdala in the processing of facial trustworthiness, by analyzing its amplitude BOLD response polarity to untrustworthy versus trustworthy facial signals under fMRI tasks through a Meta-analysis of effect sizes (MA). Activation Likelihood Estimation (ALE) analyses were also conducted. Articles were retrieved from MEDLINE, ScienceDirect and Web-of-Science in January 2016. Following the PRISMA statement guidelines, a systematic review of original research articles in English language using the search string "(face OR facial) AND (trustworthiness OR trustworthy OR untrustworthy OR trustee) AND fMRI" was conducted. The MA concerned amygdala responses to facial trustworthiness for the contrast Untrustworthy vs. trustworthy faces, and included whole-brain and ROI studies. To prevent potential bias, results were considered even when at the single study level they did not survive correction for multiple comparisons or provided non-significant results. ALE considered whole-brain studies, using the same methodology to prevent bias. A summary of the methodological options (design and analysis) described in the articles was finally used to get further insight into the characteristics of the studies and to perform a subgroup analysis. Data were extracted by two authors and checked independently. Twenty fMRI studies were considered for systematic review. An MA of effect sizes with 11 articles (12 studies) showed high heterogeneity between studies [Q(11) = 265.68, p < .0001; I2 = 95.86%, 94.20% to 97.05%, with 95% confidence interval, CI]. Random effects analysis [RE(183) = 0.851, .422 to .969, 95% CI] supported the evidence that the (right) amygdala responds preferentially to untrustworthy faces. Moreover, two ALE analyses performed with 6 articles (7 studies) identified the amygdala, insula and medial dorsal nuclei of thalamus as structures with negative correlation with trustworthiness. Six articles/studies showed that posterior cingulate and medial frontal gyrus present positive correlations with increasing facial trustworthiness levels. Significant effects considering subgroup analysis based on methodological criteria were found for experiments using spatial smoothing, categorization of trustworthiness in 2 or 3 categories and paradigms which involve both explicit and implicit tasks. Significant heterogeneity between studies was found in MA, which might have arisen from inclusion of studies with smaller sample sizes and differences in methodological options. Studies using ROI analysis / small volume correction methods were more often devoted specifically to the amygdala region, with some results reporting uncorrected p-values based on mainly clinical a priori evidence of amygdala involvement in these processes. Nevertheless, we did not find significant evidence for publication bias. Our results support the role of amygdala in facial trustworthiness judgment, emphasizing its predominant role during processing of negative social signals in (untrustworthy) faces. This systematic review suggests that little consistency exists among studies' methodology, and that larger sample sizes should be preferred.
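The effect-size pooling reported above can be sketched with a standard DerSimonian-Laird random-effects calculation, including Cochran's Q and the I² heterogeneity index; the per-study effects and variances in the snippet are fabricated for illustration and are not the extracted amygdala data.

```python
# DerSimonian-Laird random-effects meta-analysis sketch with Q and I^2.
import numpy as np

effects = np.array([0.9, 0.4, 1.2, 0.7, -0.1, 0.8])    # per-study effect sizes
variances = np.array([0.10, 0.08, 0.15, 0.12, 0.20, 0.09])

w = 1.0 / variances                                    # fixed-effect weights
fixed = np.sum(w * effects) / np.sum(w)
Q = np.sum(w * (effects - fixed) ** 2)                 # Cochran's Q
k = len(effects)
I2 = max(0.0, (Q - (k - 1)) / Q) * 100                 # heterogeneity (%)

# Between-study variance (tau^2), random-effects weights, pooled effect and CI
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
w_re = 1.0 / (variances + tau2)
pooled = np.sum(w_re * effects) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))

print(f"Q={Q:.2f}  I2={I2:.1f}%  pooled={pooled:.2f} "
      f"(95% CI {pooled - 1.96*se:.2f} to {pooled + 1.96*se:.2f})")
```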
Rapid Detection of Ebola Virus with a Reagent-Free, Point-of-Care Biosensor
Baca, Justin T.; Severns, Virginia; Lovato, Debbie; Branch, Darren W.; Larson, Richard S.
2015-01-01
Surface acoustic wave (SAW) sensors can rapidly detect Ebola antigens at the point-of-care without the need for added reagents, sample processing, or specialized personnel. This preliminary study demonstrates SAW biosensor detection of the Ebola virus in a concentration-dependent manner. The detection limit with this methodology is below the average level of viremia detected on the first day of symptoms by PCR. We observe a log-linear sensor response for highly fragmented Ebola viral particles, with a detection limit corresponding to 1.9 × 10⁴ PFU/mL prior to virus inactivation. We predict greatly improved sensitivity for intact, infectious Ebola virus. This point-of-care methodology has the potential to detect Ebola viremia prior to symptom onset, greatly enabling infection control and rapid treatment. This biosensor platform is powered by disposable AA batteries and can be rapidly adapted to detect other emerging diseases in austere conditions. PMID:25875186
Modeling Common-Sense Decisions in Artificial Intelligence
NASA Technical Reports Server (NTRS)
Zak, Michail
2010-01-01
A methodology has been conceived for efficient synthesis of dynamical models that simulate common-sense decision-making processes. This methodology is intended to contribute to the design of artificial-intelligence systems that could imitate human common-sense decision making or assist humans in making correct decisions in unanticipated circumstances. This methodology is a product of continuing research on mathematical models of the behaviors of single- and multi-agent systems known in biology, economics, and sociology, ranging from a single-cell organism at one extreme to the whole of human society at the other extreme. Earlier results of this research were reported in several prior NASA Tech Briefs articles, the three most recent and relevant being Characteristics of Dynamics of Intelligent Systems (NPO-21037), NASA Tech Briefs, Vol. 26, No. 12 (December 2002), page 48; Self-Supervised Dynamical Systems (NPO-30634), NASA Tech Briefs, Vol. 27, No. 3 (March 2003), page 72; and Complexity for Survival of Living Systems (NPO-43302), NASA Tech Briefs, Vol. 33, No. 7 (July 2009), page 62. The methodology involves the concepts reported previously, albeit viewed from a different perspective. One of the main underlying ideas is to extend the application of physical first principles to the behaviors of living systems. Models of motor dynamics are used to simulate the observable behaviors of systems or objects of interest, and models of mental dynamics are used to represent the evolution of the corresponding knowledge bases. For a given system, the knowledge base is modeled in the form of probability distributions and the mental dynamics is represented by models of the evolution of the probability densities or, equivalently, models of flows of information. Autonomy is imparted to the decision-making process by feedback from mental to motor dynamics. This feedback replaces unavailable external information with information stored in the internal knowledge base. Representation of the dynamical models in a parameterized form reduces the task of common-sense-based decision making to the solution of the following hetero-associative-memory problem: store a set of m predetermined stochastic processes given by their probability distributions in such a way that, when presented with an unexpected change in the form of an input out of the set of M inputs, the coupled motor-mental dynamics converges to the corresponding one of the m pre-assigned stochastic processes, and a sample of this process represents the decision.
Development of an Optimization Methodology for the Aluminum Alloy Wheel Casting Process
NASA Astrophysics Data System (ADS)
Duan, Jianglan; Reilly, Carl; Maijer, Daan M.; Cockcroft, Steve L.; Phillion, Andre B.
2015-08-01
An optimization methodology has been developed for the aluminum alloy wheel casting process. The methodology is focused on improving the timing of cooling processes in a die to achieve improved casting quality. This methodology utilizes (1) a casting process model, which was developed within the commercial finite element package ABAQUS™ (a trademark of Dassault Systèmes); (2) a Python-based results extraction procedure; and (3) a numerical optimization module from the open-source Python library SciPy. To achieve optimal casting quality, a set of constraints has been defined to ensure directional solidification, and an objective function, based on the solidification cooling rates, has been defined to either maximize, or target a specific, cooling rate. The methodology has been applied to a series of casting and die geometries with different cooling system configurations, including a 2-D axisymmetric wheel and die assembly generated from a full-scale prototype wheel. The results show that, with properly defined constraint and objective functions, solidification conditions can be improved and optimal cooling conditions can be achieved, leading to process productivity and product quality improvements.
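A toy sketch of this optimization loop is given below, with the ABAQUS run and Python results extraction collapsed into a cheap surrogate function; the decision variables (two cooling-channel activation times), the directional-solidification constraint, the target cooling rate, and all numbers are illustrative assumptions rather than the paper's actual model.

```python
# Constrained optimization of cooling timing with SciPy. The surrogate stands
# in for "run the ABAQUS casting model and extract results"; all functions and
# numbers are illustrative.
import numpy as np
from scipy.optimize import minimize

TARGET_RATE = 2.0   # desired cooling rate, K/s (illustrative)

def surrogate_model(x):
    """Stand-in for the simulation + results-extraction step."""
    t_rim_on, t_hub_on = x
    t_sol_rim = 30.0 + 0.8 * t_rim_on            # solidification times (s)
    t_sol_hub = 45.0 + 0.8 * t_hub_on
    cooling_rate = 4.0 / (1.0 + 0.05 * t_rim_on) # K/s at a monitored location
    return t_sol_rim, t_sol_hub, cooling_rate

def objective(x):
    _, _, rate = surrogate_model(x)
    return (rate - TARGET_RATE) ** 2             # target a specific cooling rate

def directionality(x):
    t_sol_rim, t_sol_hub, _ = surrogate_model(x)
    return t_sol_hub - t_sol_rim                 # outer section solidifies first (>= 0)

res = minimize(objective, x0=[10.0, 20.0], method="SLSQP",
               bounds=[(0.0, 120.0), (0.0, 120.0)],
               constraints=[{"type": "ineq", "fun": directionality}])
print(res.x, surrogate_model(res.x))
```

In the real workflow each objective/constraint evaluation is a full thermal simulation, so gradient-free or low-iteration-count settings and result caching matter far more than they do in this toy.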
Using continuous process improvement methodology to standardize nursing handoff communication.
Klee, Kristi; Latta, Linda; Davis-Kirsch, Sallie; Pecchia, Maria
2012-04-01
The purpose of this article was to describe the use of continuous performance improvement (CPI) methodology to standardize nurse shift-to-shift handoff communication. The goals of the process were to standardize the content and process of shift handoff, improve patient safety, increase patient and family involvement in the handoff process, and decrease end-of-shift overtime. This article will describe process changes made over a 4-year period as a result of the application of the plan-do-check-act procedure, which is an integral part of the CPI methodology, and discuss further work needed to continue to refine this critical nursing care process. Copyright © 2012 Elsevier Inc. All rights reserved.
Agile Software Development in the Department of Defense Environment
2017-03-31
This study examines how the defense acquisition framework can enable greater adoption of Agile methodologies. Chapter 3 defines the research methodology and processes used in the study, including the research hypothesis and the overall research strategy.
Documentation of indigenous Pacific agroforestry systems: a review of methodologies
Bill Raynor
1993-01-01
Recent interest in indigenous agroforestry has led to a need for documentation of these systems. However, previous work is very limited, and few methodologies are well-known or widely accepted. This paper outlines various methodologies (including sampling methods, data to be collected, and considerations in analysis) for documenting structure and productivity of...
ERIC Educational Resources Information Center
Vázquez-Alonso, Ángel; Manassero-Mas, María-Antonia; García-Carmona, Antonio; Montesano de Talavera, Marisa
2016-01-01
This study applies a new quantitative methodological approach to diagnose epistemology conceptions in a large sample. The analyses use seven multiple-rating items on the epistemology of science drawn from the item pool Views on Science-Technology-Society (VOSTS). The bases of the new methodological diagnostic approach are the empirical…
ERIC Educational Resources Information Center
Ndirangu, Caroline
2017-01-01
This study aims to evaluate teachers' attitude towards implementation of learner-centered methodology in science education in Kenya. The study used a survey design methodology, adopting the purposive, stratified random and simple random sampling procedures and hypothesised that there was no significant relationship between the head teachers'…
Brower, Kevin P; Ryakala, Venkat K; Bird, Ryan; Godawat, Rahul; Riske, Frank J; Konstantinov, Konstantin; Warikoo, Veena; Gamble, Jean
2014-01-01
Downstream sample purification for quality attribute analysis is a significant bottleneck in process development for non-antibody biologics. Multi-step chromatography process train purifications are typically required prior to many critical analytical tests. This prerequisite leads to limited throughput, long lead times to obtain purified product, and significant resource requirements. In this work, immunoaffinity purification technology has been leveraged to achieve single-step affinity purification of two different enzyme biotherapeutics (Fabrazyme® [agalsidase beta] and Enzyme 2) with polyclonal and monoclonal antibodies, respectively, as ligands. Target molecules were rapidly isolated from cell culture harvest in sufficient purity to enable analysis of critical quality attributes (CQAs). Most importantly, this is the first study that demonstrates the application of predictive analytics techniques to predict critical quality attributes of a commercial biologic. The data obtained using the affinity columns were used to generate appropriate models to predict quality attributes that would be obtained after traditional multi-step purification trains. These models empower process development decision-making with drug substance-equivalent product quality information without generation of actual drug substance. Optimization was performed to ensure maximum target recovery and minimal target protein degradation. The methodologies developed for Fabrazyme were successfully reapplied for Enzyme 2, indicating platform opportunities. The impact of the technology is significant, including reductions in time and personnel requirements, rapid product purification, and substantially increased throughput. Applications are discussed, including upstream and downstream process development support to achieve the principles of Quality by Design (QbD) as well as integration with bioprocesses as a process analytical technology (PAT). © 2014 American Institute of Chemical Engineers.
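The predictive-analytics step can be pictured with a minimal regression sketch that maps a quality attribute measured on single-step affinity-purified material to the value expected after the full multi-step purification train; the historical pairs and the choice of a plain linear model are illustrative assumptions, not the modelling approach reported by the authors.

```python
# Sketch: predict a drug-substance-equivalent quality attribute from a
# measurement on affinity-purified material. Data and model choice are
# illustrative assumptions.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
affinity_cqa = rng.uniform(2.0, 8.0, size=30)                    # e.g. % of a variant
train_cqa = 0.85 * affinity_cqa + 0.3 + rng.normal(0, 0.2, 30)   # historical paired data

model = LinearRegression().fit(affinity_cqa.reshape(-1, 1), train_cqa)
new_measurement = np.array([[5.1]])
print("predicted drug-substance value:", model.predict(new_measurement)[0])
```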
Pham-Tuan, Hai; Kaskavelis, Lefteris; Daykin, Clare A; Janssen, Hans-Gerd
2003-06-15
"Metabonomics" has in the past decade demonstrated enormous potential in furthering the understanding of, for example, disease processes, toxicological mechanisms, and biomarker discovery. The same principles can also provide a systematic and comprehensive approach to the study of food ingredient impact on consumer health. However, "metabonomic" methodology requires the development of rapid, advanced analytical tools to comprehensively profile biofluid metabolites within consumers. Until now, NMR spectroscopy has been used for this purpose almost exclusively. Chromatographic techniques and in particular HPLC, have not been exploited accordingly. The main drawbacks of chromatography are the long analysis time, instabilities in the sample fingerprint and the rigorous sample preparation required. This contribution addresses these problems in the quest to develop generic methods for high-throughput profiling using HPLC. After a careful optimization process, stable fingerprints of biofluid samples can be obtained using standard HPLC equipment. A method using a short monolithic column and a rapid gradient with a high flow-rate has been developed that allowed rapid and detailed profiling of larger numbers of urine samples. The method can be easily translated into a slow, shallow-gradient high-resolution method for identification of interesting peaks by LC-MS/NMR. A similar approach has been applied for cell culture media samples. Due to the much higher protein content of such samples non-porous polymer-based small particle columns yielded the best results. The study clearly shows that HPLC can be used in metabonomic fingerprinting studies.
Genesis Sample Return Capsule Overview
NASA Technical Reports Server (NTRS)
Willcockson, Bill
2005-01-01
I. Simple Entry Capsule Concept: a) Spin-Stabilized/No Active Control Systems; b) Ballistic Entry for 11.04 km/sec Velocity; c) No Heatshield Separation During Entry; d) Parachute Deploy via g-Switch + Timer. II. Stardust Design Inheritance: a) Forebody Shape; b) Seal Concepts; c) Parachute Deploy Control; d) Utah Landing Site (UTTR). III. TPS Systems: a) Heatshield - Carbon-Carbon - First Planetary Entry; b) Backshell - SLA-561V - Flight Heritage from Pathfinder, MER; c) Forebody Structural Penetrations. Aerothermal and TPS Design Process has the Same Methodology as Used for Pathfinder, MER Flight Vehicles.
A Robust Framework for Microbial Archaeology
Warinner, Christina; Herbig, Alexander; Mann, Allison; Yates, James A. Fellows; Weiß, Clemens L.; Burbano, Hernán A.; Orlando, Ludovic; Krause, Johannes
2017-01-01
Microbial archaeology is flourishing in the era of high-throughput sequencing, revealing the agents behind devastating historical plagues, identifying the cryptic movements of pathogens in prehistory, and reconstructing the ancestral microbiota of humans. Here, we introduce the fundamental concepts and theoretical framework of the discipline, then discuss applied methodologies for pathogen identification and microbiome characterization from archaeological samples. We give special attention to the process of identifying, validating, and authenticating ancient microbes using high-throughput DNA sequencing data. Finally, we outline standards and precautions to guide future research in the field. PMID:28460196
Worldwide Husbanding Process Improvement: Comparative Analysis of Contracting Methodologies
2007-06-01
The ports selected as representative samples are: Dubai, Jebel Ali, Manama, Aksaz, Valletta, Souda Bay, and Augusta.
de Laat, Sonya; Schwartz, Lisa
2016-01-01
Introduction Prospective informed consent is required for most research involving human participants; however, this is impracticable under some circumstances. The Tri-Council Policy Statement: Ethical Conduct for Research Involving Humans (TCPS) outlines the requirements for research involving human participants in Canada. The need for an exception to consent (deferred consent) is recognised and endorsed in the TCPS for research in individual medical emergencies; however, little is known about substitute decision-maker (SDM) experiences. A paediatric resuscitation trial (SQUEEZE) (NCT01973907) using an exception to consent process began enrolling at McMaster Children's Hospital in January 2014. This qualitative research study aims to generate new knowledge on SDM experiences with the exception to consent process as implemented in a randomised controlled trial. Methods and analysis The SDMs of children enrolled into the SQUEEZE pilot trial will be the sampling frame from which ethics study participants will be derived. Design: Qualitative research study involving individual interviews and grounded theory methodology. Participants: SDMs for children enrolled into the SQUEEZE pilot trial. Sample size: Up to 25 SDMs. Qualitative methodology: SDMs will be invited to participate in the qualitative ethics study. Interviews with consenting SDMs will be conducted in person or by telephone, taped and professionally transcribed. Participants will be encouraged to elaborate on their experience of being asked to consent after the fact and how this process occurred. Analysis: Data gathering and analysis will be undertaken simultaneously. The investigators will collaborate in developing the coding scheme, and data will be coded using NVivo. Emerging themes will be identified. Ethics and dissemination This research represents a rare opportunity to interview parents/guardians of critically ill children enrolled into a resuscitation trial without their knowledge or prior consent. Findings will inform implementation of the exception to consent process in the planned definitive SQUEEZE trial and support development of evidence-based ethics guidelines. PMID:27625066
Barazzetti Barbieri, Cristina; de Souza Sarkis, Jorge Eduardo
2018-07-01
The forensic interpretation of environmental analytical data is usually challenging due to the high geospatial variability of these data. Measurement uncertainty includes contributions from sampling and from sample handling and preparation, contributions that are often disregarded in the quality assurance of analytical results. A pollution crime investigation case was used to develop a methodology able to address these uncertainties in two different environmental compartments, freshwater sediments and landfill leachate. The uncertainty was estimated with the duplicate method (which replicates predefined steps of the measurement procedure in order to assess its precision), and the parameters used to investigate the pollution were metals (Cr, Cu, Ni, and Zn) in the leachate, the suspected source, and in the sediment, the possible sink. The metal analysis results were compared to statutory limits, and it was demonstrated that Cr and Ni concentrations in sediment samples exceeded the threshold levels at all sites downstream of the pollution sources, considering the expanded uncertainty U of the measurements and a probability of contamination >0.975 at most sites. Cu and Zn concentrations were above the statutory limits at two sites, but the classification was inconclusive considering the uncertainties of the measurements. Metal analyses in leachate revealed that Cr concentrations were above the statutory limits with a probability of contamination >0.975 in all leachate ponds, while the Cu, Ni and Zn probability of contamination was below 0.025. The results demonstrated that the estimation of the sampling uncertainty, which was the dominant component of the combined uncertainty, is required for a comprehensive interpretation of environmental analysis results, particularly in forensic cases. Copyright © 2018 Elsevier B.V. All rights reserved.
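A compact sketch of the duplicate-method calculation is shown below: duplicate samples taken at each site, each analysed in duplicate, split the measurement uncertainty into sampling and analytical components, and the combined standard uncertainty then gives the probability that a site genuinely exceeds a statutory threshold under a normal error model. The concentrations, the threshold, and the simple duplicate-based variance estimates are illustrative assumptions, not the case data.

```python
# Duplicate-method sketch: split measurement uncertainty into sampling and
# analytical components, then classify sites against a threshold.
import numpy as np
from scipy.stats import norm

# data[site] = [[S1A1, S1A2], [S2A1, S2A2]]  (mg/kg), invented values
data = np.array([
    [[92.0, 95.0], [104.0, 101.0]],
    [[61.0, 63.0], [58.0, 60.0]],
    [[130.0, 126.0], [141.0, 138.0]],
])

analysis_var = np.mean(np.diff(data, axis=2) ** 2 / 2.0)       # within-sample duplicates
sample_means = data.mean(axis=2)                               # per physical sample
between_sample = np.mean(np.diff(sample_means, axis=1) ** 2 / 2.0)
sampling_var = max(0.0, between_sample - analysis_var / 2.0)

s_meas = np.sqrt(sampling_var + analysis_var)                  # combined (1 s.d.)
U = 2.0 * s_meas                                               # expanded, k = 2
print(f"s_sampling={np.sqrt(sampling_var):.1f}  s_analysis={np.sqrt(analysis_var):.1f}  U={U:.1f} mg/kg")

threshold = 90.0
for site_mean in sample_means.mean(axis=1):
    p_exceed = norm.sf(threshold, loc=site_mean, scale=s_meas)
    print(f"site mean {site_mean:6.1f} mg/kg -> P(true value > threshold) = {p_exceed:.3f}")
```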
García-Guerra, Romualdo B; Montesdeoca-Esponda, Sarah; Sosa-Ferrera, Zoraida; Kabir, Abuzar; Furton, Kenneth G; Santana-Rodríguez, José Juan
2016-12-01
Benzotriazole UV stabilizers (BUVSs) are a group of compounds added to personal care products such as sunscreens, hair dyes, make-up formulations, soaps or shampoos, among others. Direct input from beaches or other aquatic recreational areas is the main source of BUVSs incorporation into the environment, where they can be mutagenic, toxic, pseudo-persistent and bioaccumulative. Due to the low concentration levels of these compounds found in environmental samples, an extraction process is required prior to their determination. Fabric phase sorptive extraction integrates the advanced material properties of sol-gel hybrid inorganic-organic sorbents with flexible, permeable and functionally active fabric substrates, being a highly responsive, efficient and cheap device that can also be reused. In this paper, we applied fabric phase sorptive extraction methodology to analyse six BUVSs in twenty-four seawater samples from different coastal areas of Gran Canaria Island (Spain). It was coupled to ultra-high-performance liquid chromatography with tandem mass spectrometry in order to achieve a fast, reliable and sensitive separation and determination of the analytes from different sample matrices, regardless of their complexity and composition. Under the optimum conditions, the proposed method provided enrichment factors of 25 times with limits of detection from 1.06 to 8.96 ng L⁻¹ and limits of quantification from 3.54 to 29.9 ng L⁻¹ for the analytes under study in spiked samples. Intra- and inter-day relative standard deviations were between 3.97 and 20.8% for all compounds. The application of the optimized methodology to non-spiked seawater samples allowed the detection and quantification of UV-360 in the range from 41.12 to 544.9 ng L⁻¹. Copyright © 2016 Elsevier Ltd. All rights reserved.
Algorithm for cellular reprogramming.
Ronquist, Scott; Patterson, Geoff; Muir, Lindsey A; Lindsly, Stephen; Chen, Haiming; Brown, Markus; Wicha, Max S; Bloch, Anthony; Brockett, Roger; Rajapakse, Indika
2017-11-07
The day we understand the time evolution of subcellular events at a level of detail comparable to physical systems governed by Newton's laws of motion seems far away. Even so, quantitative approaches to cellular dynamics add to our understanding of cell biology. With data-guided frameworks we can develop better predictions about, and methods for, control over specific biological processes and system-wide cell behavior. Here we describe an approach for optimizing the use of transcription factors (TFs) in cellular reprogramming, based on a device commonly used in optimal control. We construct an approximate model for the natural evolution of a cell-cycle-synchronized population of human fibroblasts, based on data obtained by sampling the expression of 22,083 genes at several time points during the cell cycle. To arrive at a model of moderate complexity, we cluster gene expression based on division of the genome into topologically associating domains (TADs) and then model the dynamics of TAD expression levels. Based on this dynamical model and additional data, such as known TF binding sites and activity, we develop a methodology for identifying the top TF candidates for a specific cellular reprogramming task. Our data-guided methodology identifies a number of TFs previously validated for reprogramming and/or natural differentiation and predicts some potentially useful combinations of TFs. Our findings highlight the immense potential of dynamical models, mathematics, and data-guided methodologies for improving strategies for control over biological processes. Copyright © 2017 the Author(s). Published by PNAS.
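A toy sketch of the data-guided idea follows: fit a linear model of TAD-level expression dynamics from time-series snapshots, then rank candidate TFs by how well the TAD profile of their binding targets aligns with the expression change needed to move the current cell state toward the target state. The matrices, TF target profiles, and the linear-dynamics simplification are illustrative assumptions, not the paper's fitted model or candidate list.

```python
# Fit x_{t+1} ~ A x_t at the TAD level, then score candidate TFs by alignment
# of their target profile with the remaining gap to the target state.
import numpy as np

rng = np.random.default_rng(7)
n_tads, n_times = 40, 8
X = rng.normal(size=(n_tads, n_times))           # TAD expression over the cell cycle

# Least-squares fit of the linear dynamics
A, *_ = np.linalg.lstsq(X[:, :-1].T, X[:, 1:].T, rcond=None)
A = A.T

current_state = X[:, -1]
target_state = rng.normal(size=n_tads)           # desired (reprogrammed) profile
needed_change = target_state - A @ current_state # gap after natural evolution

tf_targets = {f"TF{i}": rng.normal(size=n_tads) for i in range(1, 6)}
scores = {tf: float(np.dot(profile, needed_change) /
                    (np.linalg.norm(profile) * np.linalg.norm(needed_change)))
          for tf, profile in tf_targets.items()}

for tf, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{tf}: alignment score {score:+.2f}")
```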
Joseph, Adrian; Goldrick, Stephen; Mollet, Michael; Turner, Richard; Bender, Jean; Gruber, David; Farid, Suzanne S; Titchener-Hooker, Nigel
2017-05-01
Continuous disk-stack centrifugation is typically used for the removal of cells and cellular debris from mammalian cell culture broths at manufacturing scale. The use of scale-down methods to characterise disk-stack centrifugation performance enables substantial reductions in material requirements and allows a much wider design space to be tested than is currently possible at pilot scale. The process of scaling down centrifugation has historically been challenging due to the difficulties in mimicking the Energy Dissipation Rates (EDRs) in typical machines. This paper describes an alternative and easy-to-assemble automated capillary-based methodology to generate levels of EDRs consistent with those found in a continuous disk-stack centrifuge. Variations in EDR were achieved through changes in capillary internal diameter and the flow rate of operation through the capillary. The EDRs found to match the levels of shear in the feed zone of a pilot-scale centrifuge using the experimental method developed in this paper (2.4×10⁵ W/kg) are consistent with those obtained through previously published computational fluid dynamic (CFD) studies (2.0×10⁵ W/kg). Furthermore, this methodology can be incorporated into existing scale-down methods to model the process performance of continuous disk-stack centrifuges. This was demonstrated through the characterisation of the effects of culture hold time, culture temperature and EDR on centrate quality. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
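As a rough illustration of how capillary diameter and flow rate set the mean energy dissipation rate, the snippet below uses a laminar Hagen-Poiseuille estimate, EDR = Q·ΔP/(ρ·V) with ΔP = 128μLQ/(πd⁴); this simple formula and the example dimensions are assumptions for illustration only, since the EDR values quoted in the paper come from CFD and experimental characterisation rather than this expression.

```python
# Back-of-the-envelope mean EDR in a capillary, assuming laminar
# Hagen-Poiseuille flow (only valid at low Reynolds number).
import math

def capillary_edr(d, L, Q, mu=1.0e-3, rho=1000.0):
    """d, L in m; Q in m^3/s; mu in Pa*s; rho in kg/m^3; returns mean EDR in W/kg."""
    dP = 128.0 * mu * L * Q / (math.pi * d**4)     # pressure drop, Pa
    V = math.pi * (d / 2.0) ** 2 * L               # capillary volume, m^3
    return Q * dP / (rho * V)                      # dissipated power per unit mass

# Example: 0.25 mm ID, 50 mm long capillary at 6 mL/min (all values illustrative)
Q = 6e-6 / 60.0
print(f"mean EDR ~ {capillary_edr(d=0.25e-3, L=50e-3, Q=Q):.2e} W/kg")
```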
Using Fourier transform IR spectroscopy to analyze biological materials
Baker, Matthew J; Trevisan, Júlio; Bassan, Paul; Bhargava, Rohit; Butler, Holly J; Dorling, Konrad M; Fielden, Peter R; Fogarty, Simon W; Fullwood, Nigel J; Heys, Kelly A; Hughes, Caryn; Lasch, Peter; Martin-Hirsch, Pierre L; Obinaju, Blessing; Sockalingum, Ganesh D; Sulé-Suso, Josep; Strong, Rebecca J; Walsh, Michael J; Wood, Bayden R; Gardner, Peter; Martin, Francis L
2015-01-01
IR spectroscopy is an excellent method for biological analyses. It enables the nonperturbative, label-free extraction of biochemical information and images toward diagnosis and the assessment of cell functionality. Although not strictly microscopy in the conventional sense, it allows the construction of images of tissue or cell architecture by the passing of spectral data through a variety of computational algorithms. Because such images are constructed from fingerprint spectra, the notion is that they can be an objective reflection of the underlying health status of the analyzed sample. One of the major difficulties in the field has been determining a consensus on spectral pre-processing and data analysis. This manuscript brings together as coauthors some of the leaders in this field to allow the standardization of methods and procedures for adapting a multistage approach to a methodology that can be applied to a variety of cell biological questions or used within a clinical setting for disease screening or diagnosis. We describe a protocol for collecting IR spectra and images from biological samples (e.g., fixed cytology and tissue sections, live cells or biofluids) that assesses the instrumental options available, appropriate sample preparation, different sampling modes as well as important advances in spectral data acquisition. After acquisition, data processing consists of a sequence of steps including quality control, spectral pre-processing, feature extraction and classification of the supervised or unsupervised type. A typical experiment can be completed and analyzed within hours. Example results are presented on the use of IR spectra combined with multivariate data processing. PMID:24992094
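The post-acquisition sequence named above (quality control, spectral pre-processing, feature extraction, classification) might be prototyped along the lines of the sketch below, using synthetic spectra, a crude intensity-based quality filter, linear detrending as a stand-in for baseline correction, vector normalisation, and a PCA-LDA classifier; these are common choices for illustration, not the protocol's prescribed algorithms.

```python
# Quality control -> pre-processing -> feature extraction -> classification,
# on synthetic IR-like spectra.
import numpy as np
from scipy.signal import detrend
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
wavenumbers = np.linspace(900, 1800, 450)                     # fingerprint region
def spectrum(shift):
    peak = np.exp(-((wavenumbers - 1655 - shift) / 15) ** 2)  # amide-I-like band
    return peak + 0.2 * rng.normal(size=wavenumbers.size) + 0.001 * wavenumbers

X = np.array([spectrum(0) for _ in range(40)] + [spectrum(8) for _ in range(40)])
y = np.array([0] * 40 + [1] * 40)

mask = X.max(axis=1) > 0.5                                    # crude quality control
X, y = X[mask], y[mask]
X = detrend(X, axis=1)                                        # baseline correction
X = X / np.linalg.norm(X, axis=1, keepdims=True)              # vector normalisation

clf = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```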
Polley, Spencer D.; Mori, Yasuyoshi; Watson, Julie; Perkins, Mark D.; González, Iveth J.; Notomi, Tsugunori; Chiodini, Peter L.; Sutherland, Colin J.
2010-01-01
Loop-mediated isothermal amplification (LAMP) of DNA offers the ability to detect very small quantities of pathogen DNA following minimal tissue sample processing and is thus an attractive methodology for point-of-care diagnostics. Previous attempts to diagnose malaria by the use of blood samples and LAMP have targeted the parasite small-subunit rRNA gene, with a resultant sensitivity for Plasmodium falciparum of around 100 parasites per μl. Here we describe the use of mitochondrial targets for LAMP-based detection of any Plasmodium genus parasite and of P. falciparum specifically. These new targets allow routine amplification from samples containing as few as five parasites per μl of blood. Amplification is complete within 30 to 40 min and is assessed by real-time turbidimetry, thereby offering rapid diagnosis with greater sensitivity than is achieved by the most skilled microscopist or antigen detection using lateral flow immunoassays. PMID:20554824
Rapid processing of 85Kr/Kr ratios using Atom Trap Trace Analysis
Zappala, J. C.; Bailey, K.; Mueller, P.; ...
2017-03-11
In this paper, we report a methodology for measuring 85Kr/Kr isotopic abundances using Atom Trap Trace Analysis (ATTA) that increases sample measurement throughput by over an order of magnitude to six samples per 24 h. The noble gas isotope 85Kr (half-life = 10.7 years) is a useful tracer for young groundwater in the age range of 5–50 years. ATTA, an efficient and selective laser-based atom counting method, has recently been applied to 85Kr/Kr isotopic abundance measurements, requiring 5–10 μL of krypton gas at STP extracted from 50 to 100 L of water. Previously, a single such measurement required 48 h. In conclusion, our new method demonstrates that we can measure 85Kr/Kr ratios with 3–5% relative uncertainty every 4 h, on average, with the same sample requirements.
NASA Technical Reports Server (NTRS)
Korram, S.
1977-01-01
The design of general remote sensing-aided methodologies was studied to provide the estimates of several important inputs to water yield forecast models. These input parameters are snow area extent, snow water content, and evapotranspiration. The study area is the Feather River Watershed (780,000 hectares), Northern California. The general approach involved a stepwise sequence of identification of the required information, sample design, measurement/estimation, and evaluation of results. All the relevant and available information types needed in the estimation process were defined. These include Landsat, meteorological satellite, and aircraft imagery, topographic and geologic data, ground truth data, and climatic data from ground stations. A cost-effective multistage sampling approach was employed in the quantification of all the required parameters. The physical and statistical models for both snow quantification and evapotranspiration estimation were developed. These models use the information obtained by aerial and ground data through an appropriate statistical sampling design.
Sedimentation in mountain streams: A review of methods of measurement
Hedrick, Lara B.; Anderson, James T.; Welsh, Stuart A.; Lin, Lian-Shin
2013-01-01
The goal of this review paper is to provide a list of methods and devices used to measure sediment accumulation in wadeable streams dominated by cobble and gravel substrate. Quantitative measures of stream sedimentation are useful to monitor and study anthropogenic impacts on stream biota, and stream sedimentation is measurable with multiple sampling methods. Evaluation of sedimentation can be made by measuring the concentration of suspended sediment, or turbidity, and by determining the amount of deposited sediment, or sedimentation on the streambed. Measurements of deposited sediments are more time consuming and labor intensive than measurements of suspended sediments. Traditional techniques for characterizing sediment composition in streams include core sampling, the shovel method, visual estimation along transects, and sediment traps. This paper provides a comprehensive review of methodology, devices that can be used, and techniques for processing and analyzing samples collected to aid researchers in choosing study design and equipment.
Fly Ash Porous Material using Geopolymerization Process for High Temperature Exposure
Abdullah, Mohd Mustafa Al Bakri; Jamaludin, Liyana; Hussin, Kamarudin; Bnhussain, Mohamed; Ghazali, Che Mohd Ruzaidi; Ahmad, Mohd Izzat
2012-01-01
This paper presents the results of a study on the effect of temperature on geopolymers manufactured using pozzolanic materials (fly ash). In this paper, we report on our investigation of the performance of porous geopolymers made with fly ash after exposure to temperatures from 600 °C up to 1000 °C. In the research methodology, the pozzolanic material (fly ash) was synthesized with a mixture of sodium hydroxide and sodium silicate solution as an alkaline activator. A foaming agent solution was added to the geopolymer paste. The geopolymer paste samples were cured at 60 °C for one day, and the geopolymer samples were then sintered at 600 °C to 1000 °C to evaluate strength loss due to thermal damage. We also studied their phase formation and microstructure. The heated geopolymer samples were tested for compressive strength after three days. The results showed that the porous geopolymers exhibited strength increases after temperature exposure. PMID:22605984
Particle Engulfment and Pushing by Solidification Interfaces. Part 1; Ground Experiments
NASA Technical Reports Server (NTRS)
Juretzko, Frank R.; Dhindaw, Brij K.; Stefanescu, Doru M.; Sen, subhayu; Curreri, Peter A.
1998-01-01
Directional solidification experiments have been carried out to determine the pushing/engulfment transition for two different metal/particle systems. The systems chosen were aluminum/zirconia particles and zinc/zirconia particles. Pure metals (99.999% Al and 99.95% Zn) and spherical particles (500 microns in diameter) were used. The particles were non-reactive with the matrices within the temperature range of interest. The experiments were conducted so as to ensure a planar solid/liquid interface during solidification. Particle location before and after processing was evaluated by X-ray transmission microscopy for the Al/ZrO2 samples. All samples were characterized by optical metallography after processing. A clear methodology for the experiment evaluation was developed to unambiguously interpret the occurrence of the pushing/engulfment transition. It was found that the critical velocity for engulfment ranges from 1.9 to 2.4 microns/s for Al/ZrO2 and from 1.9 to 2.9 microns/s for Zn/ZrO2.
Liu, Zaizhi; Zu, Yuangang; Yang, Lei
2017-06-01
A microwave pretreatment method was developed to preserve pectin, naringin, and limonin contents in pomelo flavedo to allow for longer storage times and subsequent extraction of pomelo essential oil. In terms of the essential oil, microwave pretreatment performed better than hydrodistillation with respect to extraction efficiency (1.88 ± 0.06% in 24 min versus 1.91 ± 0.08% in 240 min), oxygenation fraction (48.59 ± 1.32% versus 29.63 ± 1.02%), energy consumption (0.15 kWh versus 1.54 kWh), and environmental impact (123.20 g CO₂ versus 1232 g CO₂). Microwave-pretreated samples retained higher amounts of pectin, naringin, and limonin compared with non-pretreated samples. No obvious change in the degree of pectin esterification was observed. This study shows that the proposed process is a promising methodology for both preserving valuable compounds in pomelo flavedo during storage and acquiring essential oils. Copyright © 2016 Elsevier Ltd. All rights reserved.
Frankl, Andri; Mari, Muriel; Reggiori, Fulvio
2015-01-01
The yeast Saccharomyces cerevisiae is a key model system for studying a multitude of cellular processes because of its amenability to genetics, molecular biology and biochemical procedures. Ultrastructural examinations of this organism, though, are traditionally difficult because of the presence of a thick cell wall and the high density of cytoplasmic proteins. A series of recent methodological and technical developments, however, has revived interest in morphological analyses of yeast (e.g. 1-3). Here we present a review of established and new methods, from sample preparation to imaging, for the ultrastructural analysis of S. cerevisiae. We include information for the use of different fixation methods, embedding procedures, approaches for contrast enhancement, and sample visualization techniques, with references to successful examples. The goal of this review is to guide researchers that want to investigate a particular process at the ultrastructural level in yeast by aiding in the selection of the most appropriate approach to visualize a specific structure or subcellular compartment. PMID:28357267
Practical remarks on the heart rate and saturation measurement methodology
NASA Astrophysics Data System (ADS)
Kowal, M.; Kubal, S.; Piotrowski, P.; Staniec, K.
2017-05-01
A surface reflection-based method for measuring heart rate and saturation has been introduced as one having a significant advantage over legacy methods in that it lends itself for use in special applications such as those where a person’s mobility is of prime importance (e.g. during a miner’s work), excluding the use of traditional clips. Then, a complete ATmega1281-based microcontroller platform has been described for performing computational tasks of signal processing and wireless transmission. In the next section remarks have been provided regarding the basic signal processing rules, beginning with raw voltage samples of converted optical signals, their acquisition, storage and smoothing. This section ends with practical remarks demonstrating an exponential dependence between the minimum measurable heart rate and the readout resolution at different sampling frequencies for different cases of averaging depth (in bits). The following section is devoted strictly to the heart rate and hemoglobin oxygenation (saturation) measurement with the use of the presented platform, referenced to measurements obtained with a stationary certified pulse oximeter.
Plasma-induced graft-polymerization of polyethylene glycol acrylate on polypropylene substrates
NASA Astrophysics Data System (ADS)
Zanini, S.; Orlandi, M.; Colombo, C.; Grimoldi, E.; Riccardi, C.
2009-08-01
A detailed study of argon plasma-induced graft-polymerization of polyethylene glycol acrylate (PEGA) on polypropylene (PP) substrates (membranes and films) is presented. The process consists of four steps: (a) plasma pre-activation of the PP substrates; (b) immersion in a PEGA solution; (c) argon plasma-induced graft-polymerization; (d) washing and drying of the samples. The influence of the solution and plasma parameters on the process efficiency, evaluated in terms of the amount of grafted polymer, coverage uniformity and substrate wettability, is investigated. The plasma-induced graft-polymerization of PEGA is then followed by sample weighing, water droplet adsorption time and contact angle measurements, attenuated total reflection infrared spectroscopy (ATR-IR), X-ray photoelectron spectroscopy (XPS) and atomic force microscopy (AFM) analyses. The stability of the obtained thin films was evaluated in water and in phosphate buffer saline (PBS) at 37 °C. Results clearly indicate that plasma-induced graft-polymerization of PEGA is a practical methodology for anti-fouling surface modification of materials.
Auditing as Part of the Terminology Design Life Cycle
Min, Hua; Perl, Yehoshua; Chen, Yan; Halper, Michael; Geller, James; Wang, Yue
2006-01-01
Objective: To develop and test an auditing methodology for detecting errors in medical terminologies satisfying systematic inheritance. This methodology is based on various abstraction taxonomies that provide high-level views of a terminology and highlight potentially erroneous concepts. Design: Our auditing methodology is based on dividing concepts of a terminology into smaller, more manageable units. First, we divide the terminology’s concepts into areas according to their relationships/roles. Then each multi-rooted area is further divided into partial-areas (p-areas) that are singly-rooted. Each p-area contains a set of structurally and semantically uniform concepts. Two kinds of abstraction networks, called the area taxonomy and p-area taxonomy, are derived. These taxonomies form the basis for the auditing approach. Taxonomies tend to highlight potentially erroneous concepts in areas and p-areas. Human reviewers can focus their auditing efforts on the limited number of problematic concepts following two hypotheses on the probable concentration of errors. Results: A sample of the area taxonomy and p-area taxonomy for the Biological Process (BP) hierarchy of the National Cancer Institute Thesaurus (NCIT) was derived from the application of our methodology to its concepts. These views led to the detection of a number of different kinds of errors that are reported, and to confirmation of the hypotheses on error concentration in this hierarchy. Conclusion: Our auditing methodology based on area and p-area taxonomies is an efficient tool for detecting errors in terminologies satisfying systematic inheritance of roles, and thus facilitates their maintenance. This methodology concentrates a domain expert’s manual review on portions of the concepts with a high likelihood of errors. PMID:16929044
Williams, M S; Ebel, E D; Cao, Y
2013-01-01
The fitting of statistical distributions to microbial sampling data is a common application in quantitative microbiology and risk assessment applications. An underlying assumption of most fitting techniques is that data are collected with simple random sampling, which is oftentimes not the case. This study develops a weighted maximum likelihood estimation framework that is appropriate for microbiological samples that are collected with unequal probabilities of selection. Two examples, based on the collection of food samples during processing, are provided to demonstrate the method and highlight the magnitude of biases in the maximum likelihood estimator when data are inappropriately treated as a simple random sample. Failure to properly weight samples to account for how data are collected can introduce substantial biases into inferences drawn from the data. The proposed methodology will reduce or eliminate an important source of bias in inferences drawn from the analysis of microbial data. This will also make comparisons between studies and the combination of results from different studies more reliable, which is important for risk assessment applications. © 2012 No claim to US Government works.
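The weighted-likelihood idea described above can be illustrated with a short sketch. The snippet below is a minimal illustration, not the authors' implementation: it assumes lognormally distributed concentrations and uses inverse-selection-probability weights inside the log-likelihood; the data and parameter values are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import lognorm

rng = np.random.default_rng(0)

# Hypothetical microbial concentrations and the selection probability under
# which each sample entered the data set (unequal-probability sampling).
conc = np.exp(rng.normal(1.2, 0.8, size=200))
p_select = rng.uniform(0.2, 1.0, size=conc.size)
weights = 1.0 / p_select                            # inverse-probability weights

def neg_weighted_loglik(theta, x, w):
    """Weighted negative log-likelihood of a lognormal(mu, sigma)."""
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)                       # keep sigma positive
    return -np.sum(w * lognorm.logpdf(x, s=sigma, scale=np.exp(mu)))

# Weighted MLE versus the naive (unweighted) MLE.
fit_w = minimize(neg_weighted_loglik, x0=[0.0, 0.0], args=(conc, weights))
fit_u = minimize(neg_weighted_loglik, x0=[0.0, 0.0], args=(conc, np.ones_like(conc)))

print("weighted   mu, sigma:", fit_w.x[0], np.exp(fit_w.x[1]))
print("unweighted mu, sigma:", fit_u.x[0], np.exp(fit_u.x[1]))
```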
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wen, Haiming; Lin, Yaojun; Seidman, David N.
The preparation of transmission electron microscopy (TEM) samples from powders with particle sizes larger than ~100 nm poses a challenge. The existing methods are complicated and expensive, or have a low probability of success. Herein, we report a modified methodology for preparation of TEM samples from powders, which is efficient, cost-effective, and easy to perform. This method involves mixing powders with an epoxy on a piece of weighing paper, curing the powder–epoxy mixture to form a bulk material, grinding the bulk to obtain a thin foil, punching TEM discs from the foil, dimpling the discs, and ion milling the dimpled discs to electron transparency. Compared with the well established and robust grinding–dimpling–ion-milling method for TEM sample preparation for bulk materials, our modified approach for preparing TEM samples from powders only requires two additional simple steps. In this article, step-by-step procedures for our methodology are described in detail, and important strategies to ensure success are elucidated. Furthermore, our methodology has been applied successfully for preparing TEM samples with large thin areas and high quality for many different mechanically milled metallic powders.
Measuring solids concentration in stormwater runoff: comparison of analytical methods.
Clark, Shirley E; Siu, Christina Y S
2008-01-15
Stormwater suspended solids typically are quantified using one of two methods: aliquot/subsample analysis (total suspended solids [TSS]) or whole-sample analysis (suspended solids concentration [SSC]). Interproject comparisons are difficult because of inconsistencies in the methods and in their application. To address this concern, the suspended solids content has been measured using both methodologies in many current projects, but the question remains about how to compare these values with historical water-quality data where the analytical methodology is unknown. This research was undertaken to determine the effect of analytical methodology on the relationship between these two methods of determination of the suspended solids concentration, including the effect of aliquot selection/collection method and of particle size distribution (PSD). The results showed that SSC was best able to represent the known sample concentration and that the results were independent of the sample's PSD. Correlations between the results and the known sample concentration could be established for TSS samples, but they were highly dependent on the sample's PSD and on the aliquot collection technique. These results emphasize the need to report not only the analytical method but also the particle size information on the solids in stormwater runoff.
Polyphenols excreted in urine as biomarkers of total polyphenol intake.
Medina-Remón, Alexander; Tresserra-Rimbau, Anna; Arranz, Sara; Estruch, Ramón; Lamuela-Raventos, Rosa M
2012-11-01
Nutritional biomarkers have several advantages in acquiring data for epidemiological and clinical studies over traditional dietary assessment tools, such as food frequency questionnaires. While food frequency questionnaires constitute a subjective methodology, biomarkers can provide a less biased and more accurate measure of specific nutritional intake. A precise estimation of polyphenol consumption requires blood or urine sample biomarkers, although their association is usually highly complex. This article reviews recent research on urinary polyphenols as potential biomarkers of polyphenol intake, focusing on clinical and epidemiological studies. We also report a potentially useful methodology to assess total polyphenols in urine samples, which allows a rapid, simultaneous determination of total phenols in a large number of samples. This methodology can be applied in studies evaluating the utility of urinary polyphenols as markers of polyphenol intake, bioavailability and accumulation in the body.
Chip-LC-MS for label-free profiling of human serum.
Horvatovich, Peter; Govorukhina, Natalia I; Reijmers, Theo H; van der Zee, Ate G J; Suits, Frank; Bischoff, Rainer
2007-12-01
The discovery of biomarkers in easily accessible body fluids such as serum is one of the most challenging topics in proteomics requiring highly efficient separation and detection methodologies. Here, we present the application of a microfluidics-based LC-MS system (chip-LC-MS) to the label-free profiling of immunodepleted, trypsin-digested serum in comparison to conventional capillary LC-MS (cap-LC-MS). Both systems proved to have a repeatability of approximately 20% RSD for peak area, all sample preparation steps included, while repeatability of the LC-MS part by itself was less than 10% RSD for the chip-LC-MS system. Importantly, the chip-LC-MS system had a two times higher resolution in the LC dimension and resulted in a lower average charge state of the tryptic peptide ions generated in the ESI interface when compared to cap-LC-MS while requiring approximately 30 times less (~5 pmol) sample. In order to characterize both systems for their capability to find discriminating peptides in trypsin-digested serum samples, five out of ten individually prepared, identical sera were spiked with horse heart cytochrome c. A comprehensive data processing methodology was applied including 2-D smoothing, resolution reduction, peak picking, time alignment, and matching of the individual peak lists to create an aligned peak matrix amenable for statistical analysis. Statistical analysis by supervised classification and variable selection showed that both LC-MS systems could discriminate the two sample groups. However, the chip-LC-MS system allowed 55% of the overall signal to be assigned to selected peaks, against 32% for the cap-LC-MS system.
El Hussein, Mohamed; Hirst, Sandra; Osuji, Joseph
2017-08-01
Delirium is an acute disorder of attention and cognition. It affects half of older adults in acute care settings and is a cause of increasing mortality and costs. Registered nurses (RNs) and licensed practical nurses (LPNs) frequently fail to recognize delirium. The goals of this research were to identify the reasoning processes that RNs and LPNs use to recognize delirium, to compare their reasoning processes, and to generate a theory that explains their clinical reasoning processes. Theoretical sampling was employed to elicit data from 28 participants using grounded theory methodology. Theoretical coding culminated in the emergence of Professional Socialization as the substantive theory. Professional Socialization emerged from participants' responses and was based on two social processes, specifically reasoning to uncover and reasoning to report. Professional Socialization makes explicit the similarities and variations in the clinical reasoning processes between RNs and LPNs and highlights their main concerns when interacting with delirious patients.
D'Amato, Marilena; Turrini, Aida; Aureli, Federica; Moracci, Gabriele; Raggi, Andrea; Chiaravalle, Eugenio; Mangiacotti, Michele; Cenci, Telemaco; Orletti, Roberta; Candela, Loredana; di Sandro, Alessandra; Cubadda, Francesco
2013-01-01
This article presents the methodology of the Italian Total Diet Study 2012-2014 aimed at assessing the dietary exposure of the general Italian population to selected nonessential trace elements (Al, inorganic As, Cd, Pb, methyl-Hg, inorganic Hg, U) and radionuclides (40K, 134Cs, 137Cs, 90Sr). The establishment of the TDS food list, the design of the sampling plan, and details about the collection of food samples, their standardized culinary treatment, pooling into analytical samples and subsequent sample treatment are described. Analytical techniques and quality assurance are discussed, with emphasis on the need for speciation data and for minimizing the percentage of left-censored data so as to reduce uncertainties in exposure assessment. Finally the methodology for estimating the exposure of the general population and of population subgroups according to age (children, teenagers, adults, and the elderly) and gender, both at the national level and for each of the four main geographical areas of Italy, is presented.
NASA Astrophysics Data System (ADS)
Wang, Ting; Plecháč, Petr
2017-12-01
Stochastic reaction networks that exhibit bistable behavior are common in systems biology, materials science, and catalysis. Sampling of stationary distributions is crucial for understanding and characterizing the long-time dynamics of bistable stochastic dynamical systems. However, simulations are often hindered by the insufficient sampling of rare transitions between the two metastable regions. In this paper, we apply the parallel replica method for a continuous time Markov chain in order to improve sampling of the stationary distribution in bistable stochastic reaction networks. The proposed method uses parallel computing to accelerate the sampling of rare transitions. Furthermore, it can be combined with the path-space information bounds for parametric sensitivity analysis. With the proposed methodology, we study three bistable biological networks: the Schlögl model, the genetic switch network, and the enzymatic futile cycle network. We demonstrate the algorithmic speedup achieved in these numerical benchmarks. More significant acceleration is expected when multi-core or graphics processing unit computer architectures and programming tools such as CUDA are employed.
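As an illustration of the parallel-replica idea described above, the following sketch runs independent stochastic simulation (Gillespie) replicas of a bistable birth-death network and accumulates the simulated time until the first escape from a metastable region. It is a simplified toy (no dephasing or decorrelation stages, a sequential stand-in for parallel workers, and Schlögl-like rate constants chosen purely for illustration), not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative Schlögl-like birth/death propensities (bistable in copy number x).
def propensities(x):
    k1, k2, k3, k4 = 3e-7, 1e-4, 1e-3, 3.5
    A, B = 1e5, 2e5
    birth = k1 / 2 * A * x * (x - 1) + k3 * B          # A+2X -> 3X  and  B -> X
    death = k2 / 6 * x * (x - 1) * (x - 2) + k4 * x    # 3X -> A+2X  and  X -> B
    return birth, death

def ssa_until_escape(x0, in_basin, t_max=100.0):
    """Gillespie SSA; return (simulated_time, escaped?) for one replica."""
    x, t = x0, 0.0
    while t < t_max:
        b, d = propensities(x)
        a0 = b + d
        t += rng.exponential(1.0 / a0)
        x += 1 if rng.random() * a0 < b else -1
        if not in_basin(x):
            return t, True
    return t_max, False

# Parallel-replica accounting: replicas explore the low-copy-number basin and the
# effective escape time is the summed replica time up to the first escape.
# Escapes from this basin are rare, which is exactly why replicas are used.
R = 4
in_low_basin = lambda x: x < 250
elapsed, escaped = 0.0, False
for _ in range(R):                       # sequential stand-in for parallel workers
    t_r, escaped = ssa_until_escape(x0=100, in_basin=in_low_basin)
    elapsed += t_r
    if escaped:
        break

status = "escaped" if escaped else "no escape within budget"
print(status, "- accumulated simulated time:", elapsed)
```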
Wastewater Biosolid Composting Optimization Based on UV-VNIR Spectroscopy Monitoring
Temporal-Lara, Beatriz; Melendez-Pastor, Ignacio; Gómez, Ignacio; Navarro-Pedreño, Jose
2016-01-01
Conventional wastewater treatment generates large amounts of organic matter–rich sludge that requires adequate treatment to avoid public health and environmental problems. The mixture of wastewater sludge and some bulking agents produces a biosolid to be composted at adequate composting facilities. The composting process is chemically and microbiologically complex and requires an adequate aeration of the biosolid (e.g., with a turner machine) for proper maturation of the compost. Adequate (near) real-time monitoring of the compost maturity process is highly difficult and the operation of composting facilities is not as automatized as other industrial processes. Spectroscopic analysis of compost samples has been successfully employed for compost maturity assessment but the preparation of the solid compost samples is difficult and time-consuming. This manuscript presents a methodology based on a combination of a less time-consuming compost sample preparation and ultraviolet, visible and short-wave near-infrared spectroscopy. Spectroscopic measurements were performed with liquid compost extract instead of solid compost samples. Partial least square (PLS) models were developed to quantify chemical fractions commonly employed for compost maturity assessment. Effective regression models were obtained for total organic matter (residual predictive deviation—RPD = 2.68), humification ratio (RPD = 2.23), total exchangeable carbon (RPD = 2.07) and total organic carbon (RPD = 1.66) with a modular and cost-effective visible and near infrared (VNIR) spectroradiometer. This combination of a less time-consuming compost sample preparation with a versatile sensor system provides an easy-to-implement, efficient and cost-effective protocol for compost maturity assessment and near-real-time monitoring. PMID:27854280
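The PLS-based calibration of spectra against wet-chemistry reference values can be sketched in a few lines. The example below uses synthetic spectra with scikit-learn's PLSRegression and computes the residual predictive deviation (RPD = SD of reference values / RMSE of prediction) reported above; the data, component count and wavelength grid are assumptions for illustration only.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)

# Synthetic stand-in for UV-VNIR spectra of liquid compost extracts:
# 60 samples x 300 "wavelengths", with total organic matter (TOM) as reference.
n_samples, n_wavelengths = 60, 300
tom = rng.uniform(30, 70, n_samples)                       # reference values (%)
basis = np.sin(np.linspace(0, 3 * np.pi, n_wavelengths))   # one spectral feature
spectra = np.outer(tom, basis) + rng.normal(0, 2.0, (n_samples, n_wavelengths))

# PLS calibration with cross-validated predictions.
pls = PLSRegression(n_components=5)
tom_pred = cross_val_predict(pls, spectra, tom, cv=10).ravel()

rmse = np.sqrt(np.mean((tom - tom_pred) ** 2))
rpd = np.std(tom, ddof=1) / rmse    # residual predictive deviation
print(f"RMSECV = {rmse:.2f}, RPD = {rpd:.2f}")   # RPD > 2 is usually considered good
```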
Design of experiments enhanced statistical process control for wind tunnel check standard testing
NASA Astrophysics Data System (ADS)
Phillips, Ben D.
The current wind tunnel check standard testing program at NASA Langley Research Center is focused on increasing data quality, uncertainty quantification and overall control and improvement of wind tunnel measurement processes. The statistical process control (SPC) methodology employed in the check standard testing program allows for the tracking of variations in measurements over time as well as an overall assessment of facility health. While the SPC approach can and does provide researchers with valuable information, it has certain limitations in the areas of process improvement and uncertainty quantification. It is thought that by utilizing design of experiments methodology in conjunction with the current SPC practices, one can efficiently and more robustly characterize uncertainties and develop enhanced process improvement procedures. In this research, methodologies were developed to generate regression models for wind tunnel calibration coefficients, balance force coefficients and wind tunnel flow angularities. The coefficients of these regression models were then tracked in statistical process control charts, giving a higher level of understanding of the processes. The methodology outlined is sufficiently generic such that this research is applicable to any wind tunnel check standard testing program.
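A minimal version of tracking a regression coefficient on a control chart, in the spirit described above, is sketched below. It uses an individuals/moving-range (I-MR) chart with the conventional 2.66·MR-bar limits; the coefficient series is simulated and the whole example is an illustration, not the NASA Langley procedure.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated check-standard results: one calibration-coefficient estimate per test entry.
coeff = rng.normal(loc=1.000, scale=0.004, size=30)
coeff[25:] += 0.015          # inject a shift to illustrate an out-of-control signal

center = coeff.mean()
moving_range = np.abs(np.diff(coeff))
mr_bar = moving_range.mean()

# Individuals-chart control limits (3-sigma via d2 = 1.128, i.e. the 2.66 factor).
ucl = center + 2.66 * mr_bar
lcl = center - 2.66 * mr_bar

out_of_control = np.where((coeff > ucl) | (coeff < lcl))[0]
print(f"center={center:.4f}  LCL={lcl:.4f}  UCL={ucl:.4f}")
print("out-of-control test entries:", out_of_control.tolist())
```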
NASA Astrophysics Data System (ADS)
Guler Yigitoglu, Askin
In the context of long operation of nuclear power plants (NPPs) (i.e., 60-80 years, and beyond), investigation of the aging of passive systems, structures and components (SSCs) is important to assess safety margins and to decide on reactor life extension as indicated within the U.S. Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) Program. In the traditional probabilistic risk assessment (PRA) methodology, evaluating the potential significance of aging of passive SSCs on plant risk is challenging. Although passive SSC failure rates can be added as initiating event frequencies or basic event failure rates in the traditional event-tree/fault-tree methodology, these failure rates are generally based on generic plant failure data which means that the true state of a specific plant is not reflected in a realistic manner on aging effects. Dynamic PRA methodologies have gained attention recently due to their capability to account for the plant state and thus address the difficulties in the traditional PRA modeling of aging effects of passive components using physics-based models (and also in the modeling of digital instrumentation and control systems). Physics-based models can capture the impact of complex aging processes (e.g., fatigue, stress corrosion cracking, flow-accelerated corrosion, etc.) on SSCs and can be utilized to estimate passive SSC failure rates using realistic NPP data from reactor simulation, as well as considering effects of surveillance and maintenance activities. The objectives of this dissertation are twofold: The development of a methodology for the incorporation of aging modeling of passive SSC into a reactor simulation environment to provide a framework for evaluation of their risk contribution in both the dynamic and traditional PRA; and the demonstration of the methodology through its application to pressurizer surge line pipe weld and steam generator tubes in commercial nuclear power plants. In the proposed methodology, a multi-state physics based model is selected to represent the aging process. The model is modified via sojourn time approach to reflect the operational and maintenance history dependence of the transition rates. Thermal-hydraulic parameters of the model are calculated via the reactor simulation environment and uncertainties associated with both parameters and the models are assessed via a two-loop Monte Carlo approach (Latin hypercube sampling) to propagate input probability distributions through the physical model. The effort documented in this thesis towards this overall objective consists of : i) defining a process for selecting critical passive components and related aging mechanisms, ii) aging model selection, iii) calculating the probability that aging would cause the component to fail, iv) uncertainty/sensitivity analyses, v) procedure development for modifying an existing PRA to accommodate consideration of passive component failures, and, vi) including the calculated failure probability in the modified PRA. The proposed methodology is applied to pressurizer surge line pipe weld aging and steam generator tube degradation in pressurized water reactors.
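As a toy illustration of the Latin hypercube sampling step mentioned above, the sketch below propagates uncertain inputs through a simple crack-growth-style model and estimates the probability that a critical crack size is exceeded within a given service life. The model form, parameter ranges and failure criterion are illustrative assumptions, not the dissertation's physics models.

```python
import numpy as np
from scipy.stats import qmc

# Uncertain inputs: initial crack size a0 (mm) and a growth-rate coefficient C.
sampler = qmc.LatinHypercube(d=2, seed=7)
unit = sampler.random(n=2000)
lower, upper = [0.5, 1e-6], [2.0, 5e-6]          # assumed parameter ranges
a0, growth_c = qmc.scale(unit, lower, upper).T

# Simple deterministic "physics" model: crack size after n_years of cycling.
n_years, cycles_per_year = 40, 1.0e4
a_final = a0 * np.exp(growth_c * cycles_per_year * n_years)

a_critical = 5.0                                  # mm, assumed failure threshold
p_fail = np.mean(a_final > a_critical)
print(f"estimated probability of exceeding critical crack size: {p_fail:.4f}")
```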
DB4US: A Decision Support System for Laboratory Information Management.
Carmona-Cejudo, José M; Hortas, Maria Luisa; Baena-García, Manuel; Lana-Linati, Jorge; González, Carlos; Redondo, Maximino; Morales-Bueno, Rafael
2012-11-14
Until recently, laboratory automation has focused primarily on improving hardware. Future advances are concentrated on intelligent software since laboratories performing clinical diagnostic testing require improved information systems to address their data processing needs. In this paper, we propose DB4US, an application that automates the management of information related to laboratory quality indicators. Currently, there is a lack of ready-to-use management quality measures. This application addresses this deficiency through the extraction, consolidation, statistical analysis, and visualization of data related to the use of demographics, reagents, and turn-around times. The design and implementation issues, as well as the technologies used for the implementation of this system, are discussed in this paper. The objective was to develop a general methodology that integrates the computation of ready-to-use management quality measures and a dashboard to easily analyze the overall performance of a laboratory, as well as automatically detect anomalies or errors. The novelty of our approach lies in the application of integrated web-based dashboards as an information management system in hospital laboratories. We propose a new methodology for laboratory information management based on the extraction, consolidation, statistical analysis, and visualization of data related to demographics, reagents, and turn-around times, offering a dashboard-like user web interface to the laboratory manager. The methodology comprises a unified data warehouse that stores and consolidates multidimensional data from different data sources. The methodology is illustrated through the implementation and validation of DB4US, a novel web application based on this methodology that constructs an interface to obtain ready-to-use indicators, and offers the possibility to drill down from high-level metrics to more detailed summaries. The offered indicators are calculated beforehand so that they are ready to use when the user needs them. The design is based on a set of different parallel processes to precalculate indicators. The application displays information related to tests, requests, samples, and turn-around times. The dashboard is designed to show the set of indicators on a single screen. DB4US was deployed for the first time in the Hospital Costa del Sol in 2008. In our evaluation we show the positive impact of this methodology for laboratory professionals, since the use of our application has reduced the time needed for the elaboration of the different statistical indicators and has also provided information that has been used to optimize the usage of laboratory resources by the discovery of anomalies in the indicators. DB4US users benefit from Internet-based communication of results, since this information is available from any computer without having to install any additional software. The proposed methodology and the accompanying web application, DB4US, automate the processing of information related to laboratory quality indicators and offer a novel approach for managing laboratory-related information, benefiting from an Internet-based communication mechanism. The application of this methodology has been shown to improve the usage of time, as well as other laboratory resources.
Structural health monitoring methodology for aircraft condition-based maintenance
NASA Astrophysics Data System (ADS)
Saniger, Jordi; Reithler, Livier; Guedra-Degeorges, Didier; Takeda, Nobuo; Dupuis, Jean Pierre
2001-06-01
Reducing maintenance costs while keeping a constant level of safety is a major issue for Air Forces and airlines. The long term perspective is to implement condition based maintenance to guarantee a constant safety level while decreasing maintenance costs. For this purpose, the development of a generalized Structural Health Monitoring System (SHMS) is needed. The objective of such a system is to localize the damages and to assess their severity, with enough accuracy to allow low cost corrective actions. The present paper describes a SHMS based on acoustic emission technology. This choice was driven by its reliability and wide use in the aerospace industry. The described SHMS uses a new learning methodology which relies on the generation of artificial acoustic emission events on the structure and an acoustic emission sensor network. The calibrated acoustic emission events picked up by the sensors constitute the knowledge set that the system relies on. With this methodology, the anisotropy of composite structures is taken into account, thus avoiding the major cause of errors of classical localization methods. Moreover, it is adaptive to different structures as it does not rely on any particular model but on measured data. The acquired data is processed and the event's location and corrected amplitude are computed. The methodology has been demonstrated and experimental tests on elementary samples presented a degree of accuracy of 1 cm.
O'Donnell, S; Cheung, R; Bennett, K; Lagacé, C
2016-12-01
There is a paucity of information about the impact of mood and anxiety disorders on Canadians and the approaches used to manage them. To address this gap, the 2014 Survey on Living with Chronic Diseases in Canada-Mood and Anxiety Disorders Component (SLCDC-MA) was developed. The purpose of this paper is to describe the methodology of the 2014 SLCDC-MA and examine the sociodemographic characteristics of the final sample. The 2014 SLCDC-MA is a cross-sectional follow-up survey that includes Canadians from the 10 provinces aged 18 years and older with mood and/or anxiety disorders diagnosed by a health professional that are expected to last, or have already lasted, six months or more. The survey was developed by the Public Health Agency of Canada (PHAC) through an iterative, consultative process with Statistics Canada and external experts. Statistics Canada performed content testing, designed the sampling frame and strategies and collected and processed the data. PHAC used descriptive analyses to describe the respondents' sociodemographic characteristics, produced nationally representative estimates using survey weights provided by Statistics Canada, and generated variance estimates using bootstrap methodology. The final 2014 SLCDC-MA sample consists of a total of 3361 respondents (68.9% response rate). Among Canadian adults with mood and/or anxiety disorders, close to two-thirds (64%) were female, over half (56%) were married/in a common-law relationship and 60% obtained a post-secondary education. Most were young or middle-aged (85%), Canadian born (88%), of non-Aboriginal status (95%), and resided in an urban setting (82%). Household income was fairly evenly distributed between the adequacy quintiles; however, individuals were more likely to report a household income adequacy within the lowest (23%) versus highest (17%) quintile. Forty-five percent reported having a mood disorder only, 24% an anxiety disorder only and 31% both kinds of disorder. The 2014 SLCDC-MA is the only national household survey to collect information on the experiences of Canadians living with a professionally diagnosed mood and/or anxiety disorder. The information collected offers insights into areas where additional support or interventions may be needed and provides baseline information for future public health research in the area of mental illness.
Fusion And Inference From Multiple And Massive Disparate Distributed Dynamic Data Sets
2017-07-01
Accomplishments include a principled methodology for two-sample graph testing and a provably almost-surely perfect vertex clustering algorithm for block model graphs, together with semi-supervised clustering and robust hypothesis testing methodology. Embedding the graphs into Euclidean space allows the full arsenal of statistical and machine learning methodology for multivariate Euclidean data to be deployed.
Axial Magneto-Inductive Effect in Soft Magnetic Microfibers, Test Methodology, and Experiments
2016-03-24
NUWC-NPT Technical Report 12,186, 24 March 2016. Report sections include Measurements and Experimental Apparatus, and Sample Preparation.
77 FR 6971 - Establishment of User Fees for Filovirus Testing of Nonhuman Primate Liver Samples
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-10
... (ELISA) or other appropriate methodology. Each specimen will be held for six months. After six months.../CDC's analysis of costs to the Government is based on the current methodology (ELISA) used to test NHP... different methodology or changes in the availability of ELISA reagents will affect the amount of the user...
2010 CEOS Field Reflectance Intercomparisons Lessons Learned
NASA Technical Reports Server (NTRS)
Thome, Kurtis; Fox, Nigel
2011-01-01
This paper summarizes lessons learned from the 2009 and 2010 joint field campaigns to Tuz Golu, Turkey. Emphasis is placed on the 2010 campaign related to understanding the equipment and measurement protocols, processing schemes, and traceability to SI quantities. Participants in both 2009 and 2010 used an array of measurement approaches to determine surface reflectance. One lesson learned is that even with all of the differences in collection between groups, the differences in reflectance are currently dominated by instrumental artifacts including knowledge of the white reference. Processing methodology plays a limited role once the bi-directional reflectance of the white reference is used rather than a hemispheric-directional value. The lack of a basic set of measurement protocols, or best practices, limits a group's ability to ensure SI traceability and the development of proper error budgets. Finally, rigorous attention to sampling methodology and its impact on instrument behavior is needed. The results of the 2009 and 2010 joint campaigns clearly demonstrate both the need and utility of such campaigns and such comparisons must continue in the future to ensure a coherent set of data that can span multiple sensor types and multiple decades.
NASA Astrophysics Data System (ADS)
Pelizardi, Flavia; Bea, Sergio A.; Carrera, Jesús; Vives, Luis
2017-07-01
Mixing calculations (i.e., the calculation of the proportions in which end-members are mixed in a sample) are essential for hydrological research and water management. However, they typically require the use of conservative species, a condition that may be difficult to meet due to chemical reactions. Mixing calculations also require identifying end-member waters, which is usually achieved through End Member Mixing Analysis (EMMA). We present a methodology to help in the identification of both end-members and such reactions, so as to improve mixing ratio calculations. The proposed approach consists of: (1) identifying the potential chemical reactions with the help of EMMA; (2) defining decoupled conservative chemical components consistent with those reactions; (3) repeating EMMA with the decoupled (i.e., conservative) components, so as to identify end-member waters; and (4) computing mixing ratios using the new set of components and end-members. The approach is illustrated by application to two synthetic mixing examples involving mineral dissolution and cation exchange reactions. Results confirm that the methodology can be successfully used to identify geochemical processes affecting the mixtures, thus improving the accuracy of mixing ratio calculations and relaxing the need for conservative species.
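Step (4) above, computing mixing ratios from conservative components, amounts to a small constrained least-squares problem. The sketch below solves it with non-negative least squares, enforcing that the proportions sum to one through an appended constraint row; the end-member concentrations are invented for illustration.

```python
import numpy as np
from scipy.optimize import nnls

# Columns: end-member waters; rows: conservative (decoupled) components.
end_members = np.array([
    [250.0, 20.0, 5.00],    # component 1
    [  0.9,  0.1, 0.02],    # component 2
    [ 12.0,  3.0, 1.00],    # component 3
])

# Observed mixture: 60% of end-member 1, 30% of 2, 10% of 3 (plus a little noise).
true_f = np.array([0.6, 0.3, 0.1])
sample = end_members @ true_f + np.array([1.0, 0.01, 0.1])

# Append a heavily weighted row of ones so the recovered fractions sum to ~1.
w = 100.0
A = np.vstack([end_members, w * np.ones(3)])
b = np.append(sample, w * 1.0)

fractions, _ = nnls(A, b)
print("estimated mixing ratios:", np.round(fractions, 3))
```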
Şakıyan, Özge
2015-05-01
The aim of the present work is to optimize the formulation of a functional cake (soy-cake) to be baked in an infrared-microwave combination oven. For this optimization process, response surface methodology was utilized. It was also aimed to optimize the processing conditions of the combination baking. The independent variables were the baking time (8, 9, 10 min), the soy flour concentration (30, 40, 50%) and the DATEM (diacetyltartaric acid esters of monoglycerides) concentration (0.4, 0.6 and 0.8%). The quality parameters that were examined in the study were specific volume, weight loss, total color change and firmness of the cake samples. The results were analyzed by multiple regression, and the significant linear, quadratic, and interaction terms were used in the second order mathematical model. The optimum baking time, soy-flour concentration and DATEM concentration were found to be 9.5 min, 30% and 0.72%, respectively. The corresponding responses of the optimum points were almost comparable with those of conventionally baked soy-cakes. So it may be concluded that it is possible to produce high quality soy cakes in a very short time by using an infrared-microwave combination oven.
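Fitting a second-order response surface of the kind described above can be done with ordinary least squares on an expanded design matrix. The sketch below fits a two-factor quadratic model (time and soy-flour level) to made-up specific-volume data and reads off the fitted optimum; the data and coefficients are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical design points: baking time (min) and soy flour concentration (%).
time = np.repeat([8.0, 9.0, 10.0], 3)
soy = np.tile([30.0, 40.0, 50.0], 3)
# Made-up response (specific volume) with a maximum inside the design region.
specific_volume = (2.5 - 0.05 * (time - 9.5) ** 2 - 0.002 * (soy - 33) ** 2
                   + rng.normal(0, 0.01, time.size))

# Second-order model: b0 + b1*t + b2*s + b3*t^2 + b4*s^2 + b5*t*s
X = np.column_stack([np.ones_like(time), time, soy, time**2, soy**2, time * soy])
beta, *_ = np.linalg.lstsq(X, specific_volume, rcond=None)

# Stationary point of the fitted quadratic surface (solve gradient = 0).
H = np.array([[2 * beta[3], beta[5]], [beta[5], 2 * beta[4]]])
g = -np.array([beta[1], beta[2]])
t_opt, s_opt = np.linalg.solve(H, g)
print(f"fitted optimum: time ~ {t_opt:.2f} min, soy ~ {s_opt:.1f} %")
```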
Mutlu, Selime; Kahraman, Kevser; Öztürk, Serpil
2017-02-01
The effects of microwave irradiation on resistant starch (RS) formation and functional properties in high-amylose corn starch, Hylon VII, by applying microwave-storing cycles and drying processes were investigated. The Response Surface Methodology (RSM) was used to optimize the reaction conditions, microwave time (2-4 min) and power (20-100%), for RS formation. The starch:water (1:10) mixtures were cooked and autoclaved and then different microwave-storing cycles and drying (oven or freeze drying) processes were applied. The RS contents of the samples increased with increasing microwave-storing cycle. The highest RS (43.4%) was obtained by oven drying after 3 cycles of microwave treatment at 20% power for 2 min. The F, p (<0.05) and R² values indicated that the selected models were consistent. Linear equations were obtained for oven-dried samples applied by 1 and 3 cycles of microwave with regression coefficients of 0.65 and 0.62, respectively. Quadratic equation was obtained for freeze-dried samples applied by 3 cycles of microwave with a regression coefficient of 0.83. The solubility, water binding capacity (WBC) and RVA viscosity values of the microwave applied samples were higher than those of native Hylon VII. The WBC and viscosity values of the freeze-dried samples were higher than those of the oven-dried ones. Copyright © 2016 Elsevier B.V. All rights reserved.
Byun, Min Soo; Yi, Dahyun; Lee, Jun Ho; Choe, Young Min; Sohn, Bo Kyung; Lee, Jun-Young; Choi, Hyo Jung; Baek, Hyewon; Kim, Yu Kyeong; Lee, Yun-Sang; Sohn, Chul-Ho; Mook-Jung, Inhee; Choi, Murim; Lee, Yu Jin; Lee, Dong Woo; Ryu, Seung-Ho; Kim, Shin Gyeom; Kim, Jee Wook; Woo, Jong Inn; Lee, Dong Young
2017-11-01
The Korean Brain Aging Study for the Early Diagnosis and Prediction of Alzheimer's disease (KBASE) aimed to recruit 650 individuals, aged from 20 to 90 years, to search for new biomarkers of Alzheimer's disease (AD) and to investigate how multi-faceted lifetime experiences and bodily changes contribute to the brain changes or brain pathologies related to the AD process. All participants received comprehensive clinical and neuropsychological evaluations, multi-modal brain imaging, including magnetic resonance imaging, magnetic resonance angiography, [11C]Pittsburgh compound B-positron emission tomography (PET), and [18F]fluorodeoxyglucose-PET, blood and genetic marker analyses at baseline, and a subset of participants underwent actigraph monitoring and completed a sleep diary. Participants are to be followed annually with clinical and neuropsychological assessments, and biannually with the full KBASE assessment, including neuroimaging and laboratory tests. As of March 2017, in total, 758 individuals had volunteered for this study. Among them, in total, 591 participants-291 cognitively normal (CN) old-aged individuals, 74 CN young- and middle-aged individuals, 139 individuals with mild cognitive impairment (MCI), and 87 individuals with AD dementia (ADD)-were enrolled at baseline, after excluding 162 individuals. A subset of participants (n=275) underwent actigraph monitoring. The KBASE cohort is a prospective, longitudinal cohort study that recruited participants with a wide age range and a wide distribution of cognitive status (CN, MCI, and ADD) and it has several strengths in its design and methodologies. Details of the recruitment, study methodology, and baseline sample characteristics are described in this paper.
NASA Astrophysics Data System (ADS)
Schmidt, S.; Heyns, P. S.; de Villiers, J. P.
2018-02-01
In this paper, a fault diagnostic methodology is developed which is able to detect, locate and trend gear faults under fluctuating operating conditions when only vibration data from a single transducer, measured on a healthy gearbox are available. A two-phase feature extraction and modelling process is proposed to infer the operating condition and based on the operating condition, to detect changes in the machine condition. Information from optimised machine and operating condition hidden Markov models are statistically combined to generate a discrepancy signal which is post-processed to infer the condition of the gearbox. The discrepancy signal is processed and combined with statistical methods for automatic fault detection and localisation and to perform fault trending over time. The proposed methodology is validated on experimental data and a tacholess order tracking methodology is used to enhance the cost-effectiveness of the diagnostic methodology.
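The healthy-data modelling step described above can be approximated with an off-the-shelf hidden Markov model: train on features from the healthy gearbox, then score new windows so that a drop in log-likelihood acts as the discrepancy signal. The sketch assumes the third-party hmmlearn package and synthetic one-dimensional features; it is a schematic analogue, not the authors' optimised machine and operating-condition models.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM   # assumed dependency: pip install hmmlearn

rng = np.random.default_rng(2)

# Synthetic condition-indicator feature (e.g., an order-domain energy per revolution).
healthy = rng.normal(0.0, 1.0, size=(2000, 1))
faulty = rng.normal(0.8, 1.6, size=(2000, 1))     # shifted/noisier "damaged" data

# Train an HMM on healthy data only (states loosely mimic operating conditions).
model = GaussianHMM(n_components=3, covariance_type="diag", n_iter=50, random_state=0)
model.fit(healthy)

def discrepancy(signal, window=200):
    """Negative average log-likelihood per window: higher means further from healthy."""
    scores = []
    for start in range(0, len(signal) - window + 1, window):
        seg = signal[start:start + window]
        scores.append(-model.score(seg) / window)
    return np.array(scores)

print("healthy windows:", np.round(discrepancy(healthy), 2))
print("faulty windows :", np.round(discrepancy(faulty), 2))
```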
Frosini, Francesco; Miniati, Roberto; Grillone, Saverio; Dori, Fabrizio; Gentili, Guido Biffi; Belardinelli, Andrea
2016-11-14
The following study proposes and tests an integrated methodology involving Health Technology Assessment (HTA) and Failure Modes, Effects and Criticality Analysis (FMECA) for the assessment of specific aspects related to robotic surgery involving safety, process and technology. The integrated methodology consists of the application of specific techniques coming from the HTA joined to the aid of the most typical models from reliability engineering such as FMEA/FMECA. The study has also included in-site data collection and interviews to medical personnel. The total number of robotic procedures included in the analysis was 44: 28 for urology and 16 for general surgery. The main outcomes refer to the comparative evaluation between robotic, laparoscopic and open surgery. Risk analysis and mitigation interventions come from FMECA application. The small sample size available for the study represents an important bias, especially for the clinical outcomes reliability. Despite this, the study seems to confirm the better trend for robotics' surgical times with comparison to the open technique as well as confirming the robotics' clinical benefits in urology. More complex situation is observed for general surgery, where robotics' clinical benefits directly measured are the lowest blood transfusion rate.
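The FMECA part of such an assessment often boils down to scoring each failure mode for severity, occurrence and detectability and ranking by risk priority number (RPN = S x O x D). The sketch below shows that bookkeeping with invented failure modes and scores; the threshold and entries are illustrative, not those used in the study.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    description: str
    severity: int      # 1 (negligible) .. 10 (catastrophic)
    occurrence: int    # 1 (rare) .. 10 (frequent)
    detection: int     # 1 (always detected) .. 10 (undetectable)

    @property
    def rpn(self) -> int:
        return self.severity * self.occurrence * self.detection

# Hypothetical failure modes for a robotic-surgery workflow.
modes = [
    FailureMode("Instrument arm collision during docking", 6, 4, 3),
    FailureMode("Loss of camera image intra-operatively", 8, 2, 2),
    FailureMode("Delayed conversion to open surgery", 9, 2, 4),
]

RPN_THRESHOLD = 60   # assumed action threshold
for fm in sorted(modes, key=lambda m: m.rpn, reverse=True):
    flag = "MITIGATE" if fm.rpn >= RPN_THRESHOLD else "monitor"
    print(f"RPN={fm.rpn:3d}  [{flag}]  {fm.description}")
```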
Kaiser, Marie; Kuwert, Philipp; Glaesmer, Heide
2015-01-01
To date the experiences of German occupation children (GOC) have been described solely in historical studies; empirical research on the psychosocial consequences of growing up as a German occupation child has been missing. This paper provides an introduction to the background, methodological approaches and descriptive information on a sample for the first German-based empirical study on this topic. It also touches on methodological challenges and solution processes. Children born of war represent a target group that is difficult to reach (hidden population). Therefore, an investigation requires consultation with both people from the target group and scientific experts (participatory approach) as well as specific methodological approaches. The questionnaire utilized contains adaptations of established and psychometrically validated instruments as well as adapted self-developed items. N = 146 occupation children were surveyed (mean age 63.4, 63.0% women), recruited via press releases and contact with platforms of children born of war. Despite methodological challenges, an instrument to assess the target group was developed through participatory methods. The instrument shows high relevance for the target group and is highly accepted. High rates of American and French participants show the influence of networking in platforms on successful recruitment.
When is good, good enough? Methodological pragmatism for sustainable guideline development.
Browman, George P; Somerfield, Mark R; Lyman, Gary H; Brouwers, Melissa C
2015-03-06
Continuous escalation in methodological and procedural rigor for evidence-based processes in guideline development is associated with increasing costs and production delays that threaten sustainability. While health research methodologists are appropriately responsible for promoting increasing rigor in guideline development, guideline sponsors are responsible for funding such processes. This paper acknowledges that other stakeholders in addition to methodologists should be more involved in negotiating trade-offs between methodological procedures and efficiency in guideline production to produce guidelines that are 'good enough' to be trustworthy and affordable under specific circumstances. The argument for reasonable methodological compromise to meet practical circumstances is consistent with current implicit methodological practice. This paper proposes a conceptual tool as a framework to be used by different stakeholders in negotiating, and explicitly reporting, reasonable compromises for trustworthy as well as cost-worthy guidelines. The framework helps fill a transparency gap in how methodological choices in guideline development are made. The principle, 'when good is good enough' can serve as a basis for this approach. The conceptual tool 'Efficiency-Validity Methodological Continuum' acknowledges trade-offs between validity and efficiency in evidence-based guideline development and allows for negotiation, guided by methodologists, of reasonable methodological compromises among stakeholders. Collaboration among guideline stakeholders in the development process is necessary if evidence-based guideline development is to be sustainable.
Process redesign for time-based emergency admission targets.
G Leggat, Sandra; Gough, Richard; Bartram, Timothy; Stanton, Pauline; Bamber, Greg J; Ballardie, Ruth; Sohal, Amrik
2016-09-19
Purpose: Hospitals have used process redesign to increase the efficiency of the emergency department (ED) to cope with increasing demand. While there are published studies suggesting a positive outcome, recent reviews have reported that it is difficult to conclude that these approaches are effective as a result of substandard research methodology. The purpose of this paper is to explore the perceptions of hospital staff on the impact of a process redesign initiative on quality of care. Design/methodology/approach: A retrospective qualitative case study examining a Lean Six Sigma (LSS) initiative in a large metropolitan hospital from 2009 to 2010. Non-probability sampling identified interview subjects who, through their participation in the redesign initiative, had a detailed understanding of the implementation and outcomes of the initiative. Between April 2012 and January 2013, 26 in-depth semi-structured interviews were conducted and analysed with thematic content analysis. Findings: There were four important findings. First, when asked to comment on the impact of the LSS implementation, without prompting the staff spoke of quality of care. Second, there was little agreement among the participants as to whether the project had been successful. Third, despite the recognition of the need for a coordinated effort across the hospital to improve ED access, the redesign process was not successful in reducing existing divides among clinicians and among managers and clinicians. Finally, staff expressed tension between production processes to move patients more quickly and their duty of care to their patients as individuals. Originality/value: One of the first studies to explore the impact of process redesign through in-depth interviews with participating staff, this study adds further evidence that organisations implementing process redesign must ensure the supporting management practices are in place.
Reengineering the Acquisition/Procurement Process: A Methodology for Requirements Collection
NASA Technical Reports Server (NTRS)
Taylor, Randall; Vanek, Thomas
2011-01-01
This paper captures the systematic approach taken by JPL's Acquisition Reengineering Project team, the methodology used, challenges faced, and lessons learned. It provides pragmatic "how-to" techniques and tools for collecting requirements and for identifying areas of improvement in an acquisition/procurement process or other core process of interest.
Prediction and standard error estimation for a finite universe total when a stratum is not sampled
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wright, T.
1994-01-01
In the context of a universe of trucks operating in the United States in 1990, this paper presents statistical methodology for estimating a finite universe total on a second occasion when a part of the universe is sampled and the remainder of the universe is not sampled. Prediction is used to compensate for the lack of data from the unsampled portion of the universe. The sample is assumed to be a subsample of an earlier sample where stratification is used on both occasions before sample selection. Accounting for births and deaths in the universe between the two points in time, the detailed sampling plan, estimator, standard error, and optimal sample allocation are presented with a focus on the second occasion. If prior auxiliary information is available, the methodology is also applicable to a first occasion.
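A stripped-down version of the estimation idea above: totals for the sampled strata come from the usual expansion estimator, while the unsampled stratum's total is predicted from its prior-occasion total scaled by a growth ratio estimated in the sampled strata. All numbers and the particular prediction rule are illustrative assumptions, not the paper's estimator.

```python
import numpy as np

rng = np.random.default_rng(11)

# Prior-occasion stratum totals (known) and current-occasion samples for strata 0 and 1.
prior_totals = np.array([50_000.0, 80_000.0, 30_000.0])   # stratum 2 is not sampled now
stratum_sizes = np.array([1_000, 1_600, 600])              # current universe sizes N_h

samples = {
    0: rng.normal(55.0, 10.0, size=40),   # per-truck values, stratum 0
    1: rng.normal(52.0, 12.0, size=60),   # per-truck values, stratum 1
}

# Expansion (N_h * sample mean) totals for the sampled strata.
sampled_totals = {h: stratum_sizes[h] * y.mean() for h, y in samples.items()}

# Growth ratio estimated from sampled strata, then applied to the unsampled stratum.
growth = sum(sampled_totals.values()) / prior_totals[[0, 1]].sum()
predicted_unsampled = growth * prior_totals[2]

grand_total = sum(sampled_totals.values()) + predicted_unsampled
print(f"growth ratio ~ {growth:.3f}")
print(f"predicted total for unsampled stratum ~ {predicted_unsampled:,.0f}")
print(f"estimated universe total ~ {grand_total:,.0f}")
```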
EJ IWG Promising Practices for EJ Methodologies in NEPA Reviews
Report of methodologies gleaned from current agency practices identified by the NEPA Committee. These methodologies concern the integration of environmental justice considerations into NEPA processes.
In-situ monitoring of β phase transformation in Ti-6Al-6V-2Sn using laser ultrasonics
NASA Astrophysics Data System (ADS)
Hinterlechner, Irina; Barriobero-Vila, Pere; Reitinger, Bernhard; Fromherz, Thomas; Requena, Guillermo; Burgholzer, Peter
2018-04-01
Titanium is of great interest for metal processing industries due to its superior material properties, but it is also quite expensive. Therefore, a detailed knowledge of the β phase transformation, and consequently of the distribution of the α and β phases in titanium alloys, is crucial for their material properties and, as a consequence, for further processing steps. By measuring the ultrasonic velocity and attenuation with laser ultrasonics technology (LUS), a non-destructive and non-contact technique, it is possible to qualitatively monitor the phase transformation in-situ while heating the sample from room temperature up to the β phase field. We validate the LUS methodology against high energy X-ray diffraction as well as against conventional metallurgical measurements and obtain excellent agreement between the results of these methods.
Williams, Calum; Rughoobur, Girish; Flewitt, Andrew J; Wilkinson, Timothy D
2016-11-10
A single-step fabrication method is presented for ultra-thin, linearly variable optical bandpass filters (LVBFs) based on a metal-insulator-metal arrangement using modified evaporation deposition techniques. This alternate process methodology offers reduced complexity and cost in comparison to conventional techniques for fabricating LVBFs. We are able to achieve linear variation of insulator thickness across a sample, by adjusting the geometrical parameters of a typical physical vapor deposition process. We demonstrate LVBFs with spectral selectivity from 400 to 850 nm based on Ag (25 nm) and MgF2 (75-250 nm). Maximum spectral transmittance is measured at ∼70% with a Q-factor of ∼20.
Meanings of care in health promotion.
Falcón, Gladys Carmela Santos; Erdmann, Alacoque Lorenzini; Backes, Dirce Stein
2008-01-01
The objective of the study is to understand the meaning built by students and professors on health promotion in the teaching and learning process of health care in Nursing. It is a qualitative study using grounded theory as a methodological reference. Data were collected through interviews with three sample groups, 13 students and four professors, by classroom observation, and through meetings with nursing professors. The central subject resulting from this analysis was: constructing teaching and learning in order, disorder and self-organization for a new way of caring promoting health. The teaching/learning process directed at health promotion develops in a stage of crisis, going from a state of order to a state of disorder that is uncertain and contradictory regarding what society understands about health.
The Statistical point of view of Quality: the Lean Six Sigma methodology
Bertolaccini, Luca; Viti, Andrea; Terzi, Alberto
2015-01-01
Six Sigma and Lean are two quality improvement methodologies. The Lean Six Sigma methodology is applicable to repetitive procedures. Therefore, the use of this methodology in the health-care arena has focused mainly on areas of business operations, throughput, and case management and has focused on efficiency outcomes. After the revision of methodology, the paper presents a brief clinical example of the use of Lean Six Sigma as a quality improvement method in the reduction of the complications during and after lobectomies. Using Lean Six Sigma methodology, the multidisciplinary teams could identify multiple modifiable points across the surgical process. These process improvements could be applied to different surgical specialties and could result in a measurement, from statistical point of view, of the surgical quality. PMID:25973253
Porras, Mauricio A; Villar, Marcelo A; Cubitto, María A
2018-05-01
The presence of intracellular polyhydroxyalkanoates (PHAs) is usually studied using Sudan black dye solution (SB). In a previous work it was shown that the PHA could be directly quantified using the absorbance of SB fixed by PHA granules in wet cell samples. In the present paper, the optimum SB amount and the optimum conditions to be used for SB assays were determined following an experimental design based on hybrid response surface methodology and a desirability function. In addition, a new methodology was developed in which it is shown that the amount of SB fixed by PHA granules can also be determined indirectly through the absorbance of the supernatant obtained from the stained cell samples. This alternative methodology allows a faster determination of the PHA content (involving 23 and 42 min for indirect and direct determinations, respectively), and can be undertaken by means of basic laboratory equipment and reagents. The correlation between PHA content in wet cell samples and the spectra of the SB stained supernatant was determined by means of multivariate and linear regression analysis. The best calibration adjustment (R² = 0.91, RSE: 1.56%) and the good PHA prediction obtained (RSE = 1.81%) show that the proposed methodology constitutes a reasonably precise way for PHA content determination. Thus, this methodology could anticipate the probable results of the above-mentioned direct PHA determination. Compared with the most widely used techniques described in the scientific literature, the combined implementation of these two methodologies seems to be one of the most economical and environmentally friendly, suitable for rapid monitoring of the intracellular PHA content. Copyright © 2018 Elsevier B.V. All rights reserved.
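The calibration step described above (relating supernatant absorbance to PHA content) is essentially a univariate linear regression with R² and a relative standard error reported on top. The short sketch below reproduces that arithmetic on made-up absorbance/PHA pairs; the numbers are not the paper's data, and the RSE formula is one common convention.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical calibration set: supernatant absorbance vs. PHA content (% w/w).
absorbance = np.linspace(0.1, 1.0, 12)
pha = 5.0 + 40.0 * absorbance + rng.normal(0, 1.2, absorbance.size)

slope, intercept = np.polyfit(absorbance, pha, deg=1)
pha_hat = slope * absorbance + intercept

ss_res = np.sum((pha - pha_hat) ** 2)
ss_tot = np.sum((pha - pha.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot

# Residual standard error expressed relative to the mean response (one convention).
rse_percent = np.sqrt(ss_res / (pha.size - 2)) / pha.mean() * 100
print(f"PHA ~ {slope:.2f} * A + {intercept:.2f}   R² = {r_squared:.3f}   RSE ~ {rse_percent:.2f}%")
```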
A methodology to assess performance of human-robotic systems in achievement of collective tasks
NASA Technical Reports Server (NTRS)
Howard, Ayanna M.
2005-01-01
In this paper, we present a methodology to assess system performance of human-robotic systems in achievement of collective tasks such as habitat construction, geological sampling, and space exploration.
Varas, Lautaro R; Pontes, F C; Santos, A C F; Coutinho, L H; de Souza, G G B
2015-09-15
The ion-ion-coincidence mass spectroscopy technique brings useful information about the fragmentation dynamics of doubly and multiply charged ionic species. We advocate the use of a matrix-parameter methodology in order to represent and interpret the entire ion-ion spectra associated with the ionic dissociation of doubly charged molecules. This method makes it possible, among other things, to infer fragmentation processes and to extract information about overlapped ion-ion coincidences. This important piece of information is difficult to obtain from other previously described methodologies. A Wiley-McLaren time-of-flight mass spectrometer was used to discriminate the positively charged fragment ions resulting from the sample ionization by a pulsed 800 eV electron beam. We exemplify the application of this methodology by analyzing the fragmentation and ionic dissociation of the dimethyl disulfide (DMDS) molecule as induced by fast electrons. The doubly charged dissociation was analyzed using the Multivariate Normal Distribution. The ion-ion spectrum of the DMDS molecule was obtained at an incident electron energy of 800 eV and was matrix represented using the Multivariate Distribution theory. The proposed methodology allows us to distinguish information among [CHnSHn]+/[CH3]+ (n = 1-3) fragment ions in the ion-ion coincidence spectra using ion-ion coincidence data. Using the momenta balance methodology for the inferred parameters, a secondary decay mechanism is proposed for the [CHS]+ ion formation. As an additional check on the methodology, previously published data on the SiF4 molecule was re-analyzed with the present methodology and the results were shown to be statistically equivalent. The use of a Multivariate Normal Distribution allows for the representation of the whole ion-ion mass spectrum of doubly or multiply ionized molecules as a combination of parameters and the extraction of information among overlapped data. We have successfully applied this methodology to the analysis of the fragmentation of the DMDS molecule. Copyright © 2015 John Wiley & Sons, Ltd.
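As a rough illustration of the Multivariate Normal representation described above, the sketch below models a single ion-ion coincidence island as a bivariate Gaussian in the (TOF1, TOF2) plane and estimates its parameters from simulated events; the flight times and covariance are hypothetical and unrelated to the DMDS data.

```python
# Sketch: represent one ion-ion coincidence island as a bivariate normal
# distribution in the (TOF1, TOF2) plane. All numbers are hypothetical.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)

# Simulate coincidence events for one fragment pair (hypothetical flight times, ns)
true_mean = np.array([2150.0, 3420.0])
true_cov = np.array([[400.0, -250.0],    # negative correlation from momentum conservation
                     [-250.0, 380.0]])
events = rng.multivariate_normal(true_mean, true_cov, size=5000)

# Estimate the distribution parameters from the events (the "matrix parameters")
mean_est = events.mean(axis=0)
cov_est = np.cov(events, rowvar=False)
island = multivariate_normal(mean_est, cov_est)

# The slope of the island (from the covariance) is what momentum-balance
# arguments use to infer fragmentation mechanisms.
slope = cov_est[0, 1] / cov_est[0, 0]
print("estimated mean:", mean_est)
print("island slope (dTOF2/dTOF1):", round(slope, 3))
print("density at the centroid:", island.pdf(mean_est))
```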
Nácher-Mestre, Jaime; Ibáñez, María; Serrano, Roque; Boix, Clara; Bijlsma, Lubertus; Lunestad, Bjørn Tore; Hannisdal, Rita; Alm, Martin; Hernández, Félix; Berntssen, Marc H G
2016-07-01
There is an on-going trend for developing more sustainable salmon feed in which traditionally applied marine feed ingredients are replaced with alternatives. Processed animal products (PAPs) were re-authorized as novel high-quality protein ingredients in 2013. These PAPs may harbor undesirable substances, such as pharmaceuticals and metabolites, which have not previously been associated with salmon farming but might pose a potential risk to feed and food safety. To control these contaminants, an analytical strategy based on a generic extraction followed by ultra-high performance liquid chromatography coupled to high resolution mass spectrometry (UHPLC-HRMS) using a quadrupole time-of-flight mass analyzer (QTOF MS) was applied for wide-scope screening. Quality control samples, consisting of PAP commodities spiked at 0.02, 0.1 and 0.2 mg/kg with 150 analytes, were injected in every sample batch to verify the overall method performance. The methodology was applied to 19 commercially available PAP samples from six different types of matrices from the EU animal rendering industry. This strategy allows assessment of the salmon farming industry's possible exposure to 1005 undesirable substances, including pharmaceuticals, several dyes and relevant metabolites. Copyright © 2016 Elsevier Ltd. All rights reserved.
Spectral Interferometry with Electron Microscopes
Talebi, Nahid
2016-01-01
Interference patterns are not only a defining characteristic of waves but also have several applications, such as the characterization of coherent processes and holography. Spatial holography with electron waves has paved the way towards space-resolved characterization of magnetic domains and electrostatic potentials with angstrom spatial resolution. Another impetus in electron microscopy has come from ultrafast electron microscopy, which uses pulses of sub-picosecond duration to probe a laser-induced excitation of the sample. However, attosecond temporal resolution has not yet been reported, mainly because of the statistical distribution of arrival times of electrons at the sample with respect to the laser time reference. This, however, is the very time resolution that will be needed for performing time-frequency analysis. These difficulties are addressed here by proposing a new methodology to improve the synchronization between electron and optical excitations through an efficient electron-driven photon source. We use focused transition radiation of the electron as a pump for the sample. Due to the nature of transition radiation, the process is coherent. This technique allows us to perform spectral interferometry with electron microscopes, with applications in retrieving the phase of electron-induced polarizations and reconstructing the dynamics of the induced vector potential. PMID:27649932
NASA Astrophysics Data System (ADS)
Green, Martin L.; Takeuchi, Ichiro; Hattrick-Simpers, Jason R.
2013-06-01
High throughput (combinatorial) materials science methodology is a relatively new research paradigm that offers the promise of rapid and efficient materials screening, optimization, and discovery. The paradigm started in the pharmaceutical industry but was rapidly adopted to accelerate materials research in a wide variety of areas. High throughput experiments are characterized by synthesis of a "library" sample that contains the materials variation of interest (typically composition), and rapid and localized measurement schemes that result in massive data sets. Because the data are collected at the same time on the same "library" sample, they can be highly uniform with respect to fixed processing parameters. This article critically reviews the literature pertaining to applications of combinatorial materials science for electronic, magnetic, optical, and energy-related materials. It is expected that high throughput methodologies will facilitate commercialization of novel materials for these critically important applications. Despite the overwhelming evidence presented in this paper that high throughput studies can effectively inform commercial practice, in our perception, it remains an underutilized research and development tool. Part of this perception may be due to the inaccessibility of proprietary industrial research and development practices, but clearly the initial cost and availability of high throughput laboratory equipment plays a role. Combinatorial materials science has traditionally been focused on materials discovery, screening, and optimization to combat the extremely high cost and long development times for new materials and their introduction into commerce. Going forward, combinatorial materials science will also be driven by other needs such as materials substitution and experimental verification of materials properties predicted by modeling and simulation, which have recently received much attention with the advent of the Materials Genome Initiative. Thus, the challenge for combinatorial methodology will be the effective coupling of synthesis, characterization and theory, and the ability to rapidly manage large amounts of data in a variety of formats.
Northern Marshall Islands radiological survey: sampling and analysis summary
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robison, W.L.; Conrado, C.L.; Eagle, R.J.
1981-07-23
A radiological survey was conducted in the Northern Marshall Islands to document remaining external gamma exposures from nuclear tests conducted at Enewetak and Bikini Atolls. An additional program was later included to obtain terrestrial and marine samples for radiological dose assessment for current or potential atoll inhabitants. This report is the first of a series summarizing the results from the terrestrial and marine surveys. The sample collection and processing procedures and the general survey methodology are discussed; a summary of the collected samples and radionuclide analyses is presented. Over 5400 samples were collected from the 12 atolls and 2 islands and prepared for analysis, including 3093 soil, 961 vegetation, 153 animal, 965 fish composite samples (average of 30 fish per sample), 101 clam, 50 lagoon water, 15 cistern water, 17 groundwater, and 85 lagoon sediment samples. A complete breakdown by sample type, atoll, and island is given here. The numbers of analyses by radionuclide are 8840 for ²⁴¹Am, 6569 for ¹³⁷Cs, 4535 for ²³⁹⁺²⁴⁰Pu, 4431 for ⁹⁰Sr, 1146 for ²³⁸Pu, 269 for ²⁴¹Pu, and 114 each for ²³⁹Pu and ²⁴⁰Pu. A complete breakdown by sample category, atoll or island, and radionuclide is also included.
Białk-Bielińska, Anna; Kumirska, Jolanta; Borecka, Marta; Caban, Magda; Paszkiewicz, Monika; Pazdro, Ksenia; Stepnowski, Piotr
2016-03-20
Recent developments and improvements in advanced instruments and analytical methodologies have made the detection of pharmaceuticals at low concentration levels in different environmental matrices possible. As a result of these advances, over the last 15 years residues of these compounds and their metabolites have been detected in different environmental compartments, and pharmaceuticals have now become recognized as so-called 'emerging' contaminants. To date, many papers have been published presenting the development of analytical methodologies for the determination of pharmaceuticals in aqueous and solid environmental samples. Many papers have also been published on the application of the new methodologies, mainly to the assessment of the environmental fate of pharmaceuticals. Although impressive improvements have undoubtedly been made, numerous methodological challenges still need to be overcome in order to fully understand the behavior of these chemicals in the environment. The aim of this paper, therefore, is to present a review of selected recent improvements and challenges in the determination of pharmaceuticals in environmental samples. Special attention has been paid to the strategies used and the current challenges (also in terms of Green Analytical Chemistry) that exist in the analysis of these chemicals in soils, marine environments and drinking waters. There is a particular focus on the applicability of modern sorbents such as carbon nanotubes (CNTs) in sample preparation techniques to overcome some of the problems that exist in the analysis of pharmaceuticals in different environmental samples. Copyright © 2016 Elsevier B.V. All rights reserved.
Sequencing CYP2D6 for the detection of poor-metabolizers in post-mortem blood samples with tramadol.
Fonseca, Suzana; Amorim, António; Costa, Heloísa Afonso; Franco, João; Porto, Maria João; Santos, Jorge Costa; Dias, Mário
2016-08-01
Tramadol concentrations and analgesic effect are dependent on CYP2D6 enzymatic activity. It is well known that some genetic polymorphisms are responsible for the variability in the expression of this enzyme and in the individual drug response. The detection of allelic variants described as non-functional can help explain the circumstances of death in post-mortem cases involving tramadol. A Sanger sequencing methodology was developed for the detection of genetic variants that cause absent or reduced CYP2D6 activity, such as the *3, *4, *6, *8, *10 and *12 alleles. This methodology, as well as the GC/MS method for the detection and quantification of tramadol and its main metabolites in blood samples, was fully validated in accordance with international guidelines. Both methodologies were successfully applied to 100 post-mortem blood samples, and the relation between toxicological and genetic results was evaluated. Tramadol metabolism, expressed as the metabolite concentration ratio (N-desmethyltramadol/O-desmethyltramadol), was shown to be correlated with the poor-metabolizer phenotype based on genetic characterization. The importance of identifying enzyme inhibitors in toxicological analysis was also demonstrated. To our knowledge, this is the first study in Portugal in which a CYP2D6 sequencing methodology has been validated and applied to post-mortem samples. The developed methodology allows the collection of data from post-mortem cases, which is of primary importance to enhance the application of these genetic tools to forensic toxicology and pathology. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Pereira, Jorge; Câmara, José S; Colmsjö, Anders; Abdel-Rehim, Mohamed
2014-06-01
Sample preparation is an important analytical step for the isolation and concentration of desired components from complex matrices and greatly influences their reliable and accurate analysis and data quality. It is the most labor-intensive and error-prone process in analytical methodology and, therefore, may influence the analytical performance of target analyte quantification. Many conventional sample preparation methods are relatively complicated, involving time-consuming procedures and requiring large volumes of organic solvents. Recent trends in sample preparation include miniaturization, automation, high-throughput performance, on-line coupling with analytical instruments and low-cost operation through extremely low or no solvent consumption. Micro-extraction techniques, such as micro-extraction by packed sorbent (MEPS), have these advantages over traditional techniques. This paper gives an overview of the MEPS technique, including the role of sample preparation in bioanalysis; a description of MEPS, namely its formats (on- and off-line), sorbents and experimental protocols; factors that affect MEPS performance; and the major advantages and limitations of MEPS compared with other sample preparation techniques. We also summarize recent MEPS applications in bioanalysis. Copyright © 2014 John Wiley & Sons, Ltd.
Redesigning flow injection after 40 years of development: Flow programming.
Ruzicka, Jaromir Jarda
2018-01-01
Automation of reagent-based assays by means of Flow Injection (FI) is based on sample processing in which a sample flows continuously towards and through a detector for quantification of the target analyte. The Achilles heel of this methodology, a legacy of the Auto Analyzer®, is continuous reagent consumption and continuous generation of chemical waste. However, flow programming, assisted by recent advances in precise pumping and combined with the lab-on-valve technique, allows the FI manifold to be designed around a single confluence point through which sample and reagents are sequentially directed by means of a series of flow reversals. This approach results in sample/reagent mixing analogous to traditional FI, reduces sample and reagent consumption, and uses the stop-flow technique to enhance the yield of chemical reactions. The feasibility of programmable Flow Injection (pFI) is documented by examples of commonly used spectrophotometric assays of phosphate, nitrate, nitrite and glucose. Experimental details and additional information are available in the online tutorial http://www.flowinjectiontutorial.com/. Copyright © 2017 Elsevier B.V. All rights reserved.
Liu, Xiaowen; Pervez, Hira; Andersen, Lars W; Uber, Amy; Montissol, Sophia; Patel, Parth; Donnino, Michael W
2015-01-01
Pyruvate dehydrogenase (PDH) activity is altered in many human disorders. Current methods require tissue samples and yield inconsistent results. We describe a modified method for measuring PDH activity from isolated human peripheral blood mononuclear cells (PBMCs). RESULTS/METHODOLOGY: We found that PDH activity and quantity can be successfully measured in human PBMCs. Freeze-thaw cycles cannot efficiently disrupt the mitochondrial membrane. Processing times of up to 20 h do not affect PDH activity when a proteinase inhibitor is added, and a detergent concentration of 3.3% showed maximum yield. Sample protein concentration is correlated with PDH activity and quantity in human PBMCs from healthy subjects. Measuring PDH activity from PBMCs is a novel, easy and less invasive way to further understand the role of PDH in human disease.
Silva, Catarina; Cavaco, Carina; Perestrelo, Rosa; Pereira, Jorge; Câmara, José S.
2014-01-01
For a long time, sample preparation was unrecognized as a critical issue in the analytical methodology, thus limiting the performance that could be achieved. However, the improvement of microextraction techniques, particularly microextraction by packed sorbent (MEPS) and solid-phase microextraction (SPME), completely modified this scenario by introducing unprecedented control over this process. Urine is a biological fluid that is very interesting for metabolomics studies, allowing human health and disease characterization in a minimally invasive form. In this manuscript, we will critically review the most relevant and promising works in this field, highlighting how the metabolomic profiling of urine can be an extremely valuable tool for the early diagnosis of highly prevalent diseases, such as cardiovascular, oncologic and neurodegenerative ones. PMID:24958388
NASA Astrophysics Data System (ADS)
Rios-Corripio, M. A.; Rios-Leal, E.; Rojas-López, M.; Delgado-Macuil, R.
2011-01-01
A chemometric analysis of the adulteration of Mexican honey by sugar syrups, such as corn syrup and cane sugar syrup, was carried out. Fourier transform infrared spectroscopy (FTIR) was used to measure the absorption of a group of bee honey samples from the central region of Mexico. Principal component analysis (PCA) was used to process the FTIR spectra and determine the adulteration of bee honey. In addition, the content of individual sugars in the honey samples (glucose, fructose, sucrose and monosaccharides) was determined using PLS-FTIR analysis validated by HPLC measurements. This analytical methodology, based on infrared spectroscopy and chemometrics, can be an alternative technique for characterizing and determining the purity and authenticity of nutritional products such as bee honey and other natural products.
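A minimal sketch of the PCA step described above, assuming a matrix of FTIR absorbance spectra with one row per honey sample; the spectra here are synthetic stand-ins, not the measured data.

```python
# Sketch of PCA applied to FTIR absorbance spectra to separate pure from
# adulterated honey samples. The spectra below are synthetic placeholders.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n_wavenumbers = 600                       # resampled spectral points (hypothetical)
pure = rng.normal(0.50, 0.05, size=(20, n_wavenumbers))
adulterated = rng.normal(0.58, 0.05, size=(15, n_wavenumbers))  # shifted band intensities
spectra = np.vstack([pure, adulterated])
labels = np.array([0] * 20 + [1] * 15)    # 0 = pure, 1 = syrup-adulterated

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(spectra))

# Separation of the two groups along PC1 would indicate that adulteration
# dominates the spectral variance.
print("mean PC1, pure:", scores[labels == 0, 0].mean())
print("mean PC1, adulterated:", scores[labels == 1, 0].mean())
```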
DESIGNING PROCESSES FOR ENVIRONMENTAL PROBLEMS
Designing for the environment requires consideration of environmental impacts. The Generalized WAR Algorithm is the methodology that allows the user to evaluate the potential environmental impact of the design of a chemical process. In this methodology, chemicals are assigned val...
What about N? A methodological study of sample-size reporting in focus group studies.
Carlsen, Benedicte; Glenton, Claire
2011-03-11
Focus group studies are increasingly published in health-related journals, but we know little about how researchers use this method, particularly how they determine the number of focus groups to conduct. The methodological literature commonly advises researchers to follow principles of data saturation, although practical advice on how to do this is lacking. Our objectives were, first, to describe the current status of sample-size reporting in focus group studies published in health journals and, second, to assess whether and how researchers explain the number of focus groups they carry out. We searched PubMed for studies that had used focus groups and that had been published in open access journals during 2008, and extracted data on the number of focus groups and on any explanation authors gave for this number. We also made a qualitative assessment of the papers with regard to how the number of groups was explained and discussed. We identified 220 papers published in 117 journals. Insufficient reporting of sample sizes was common in these papers. The number of focus groups conducted varied greatly (mean 8.4, median 5, range 1 to 96). Thirty-seven (17%) studies attempted to explain the number of groups. Six studies referred to rules of thumb in the literature, three stated that they were unable to organize more groups for practical reasons, and 28 stated that they had reached a point of saturation. Among those stating that they had reached saturation, several appeared not to have followed principles from grounded theory, where data collection and analysis form an iterative process until saturation is reached. Studies with high numbers of focus groups did not offer explanations for the number of groups. Too much data as a study weakness was not an issue discussed in any of the reviewed papers. Based on these findings we suggest that journals adopt more stringent requirements for the reporting of focus group methods. The often poor and inconsistent reporting seen in these studies may also reflect the lack of clear, evidence-based guidance on deciding sample size. More empirical research is needed to develop focus group methodology.
Automated Methodologies for the Design of Flow Diagrams for Development and Maintenance Activities
NASA Astrophysics Data System (ADS)
Shivanand M., Handigund; Shweta, Bhat
The Software Requirements Specification (SRS) of an organization is a text document prepared by strategic management that incorporates the requirements of the organization. These requirements of the ongoing business/project development process involve software tools, hardware devices, manual procedures, application programs and communication commands. These components are appropriately ordered to achieve the mission of the concerned process, both in project development and in ongoing business processes, in different flow diagrams, viz. the activity chart, workflow diagram, activity diagram, component diagram and deployment diagram. This paper proposes two generic, automatic methodologies for the design of the various flow diagrams of (i) project development activities and (ii) ongoing business processes. The methodologies also resolve the ensuing deadlocks in the flow diagrams and determine the critical paths for the activity chart. Though the two methodologies are independent, each complements the other in authenticating its correctness and completeness.
ERIC Educational Resources Information Center
Bethel, James; Green, James L.; Nord, Christine; Kalton, Graham; West, Jerry
2005-01-01
This report is Volume 2 of the methodology report that provides information about the development, design, and conduct of the 9-month data collection of the Early Childhood Longitudinal Study, Birth Cohort (ECLS-B). This volume begins with a brief overview of the ECLS-B, but focuses on the sample design, calculation of response rates, development…
McCarthy, Bridie; Andrews, Tom; Hegarty, Josephine
2015-04-01
To explore family members' experiences when their loved one is undergoing chemotherapy treatment as an outpatient for newly diagnosed colorectal cancer and to develop an explanatory theory of how they process their main concern. Most individuals with cancer are now treated as outpatients and cared for by family members. International research highlights the many side effects of chemotherapy, which in the absence of specific information and/or experience can be difficult for family members to deal with. Unmet needs can have an impact on the health of both patients and family members. Classic grounded theory methodology was used for this study. Using classic grounded theory methodology, family members (n = 35) of patients undergoing chemotherapy treatment for cancer were interviewed (June 2010-July 2011). Data were analysed using the concurrent processes of constant comparative analysis, data collection, theoretical sampling and memo writing. The main concern that emerged for participants was fear of emotional collapse. This fear was dealt with through a process conceptualized as 'Emotional Resistance Building'. This is a basic social process with three phases: 'Figuring out', 'Getting on with it' and 'Uncertainty adjustment'. The phases are not linear, but interrelated as participants can be in any one or more of the phases at any one time. This theory has the potential to be used by healthcare professionals working in oncology to support family members of patients undergoing chemotherapy. New ways of supporting family members through this most difficult and challenging period are articulated within this theory. © 2014 John Wiley & Sons Ltd.
Gianfranceschi, Monica Virginia; Rodriguez-Lazaro, David; Hernandez, Marta; González-García, Patricia; Comin, Damiano; Gattuso, Antonietta; Delibato, Elisabetta; Sonnessa, Michele; Pasquali, Frederique; Prencipe, Vincenza; Sreter-Lancz, Zuzsanna; Saiz-Abajo, María-José; Pérez-De-Juan, Javier; Butrón, Javier; Kozačinski, Lidija; Tomic, Danijela Horvatek; Zdolec, Nevijo; Johannessen, Gro S; Jakočiūnė, Džiuginta; Olsen, John Elmerdahl; De Santis, Paola; Lovari, Sarah; Bertasi, Barbara; Pavoni, Enrico; Paiusco, Antonella; De Cesare, Alessandra; Manfreda, Gerardo; De Medici, Dario
2014-08-01
The classical microbiological method for the detection of Listeria monocytogenes requires around 7 days for final confirmation, and due to the perishable nature of RTE food products, there is a clear need for an alternative methodology for the detection of this pathogen. This study presents an international (European-level) ISO 16140-based validation trial of a non-proprietary real-time PCR-based methodology that can generate final results by the day after the analysis. This methodology is based on an ISO-compatible enrichment coupled to a bacterial DNA extraction and a consolidated real-time PCR assay. Twelve laboratories from six European countries participated in this trial, and soft cheese was selected as the food model since it can represent a difficult matrix for bacterial DNA extraction and real-time PCR amplification. The limit of detection observed was down to 10 CFU per 25 g of sample, showing excellent concordance and accordance values between samples and laboratories (>75%). In addition, excellent values were obtained for relative accuracy, specificity and sensitivity (82.75%, 96.70% and 97.62%, respectively) when the results obtained for the real-time PCR-based method were compared to those of the ISO 11290-1 standard method. An interesting observation was that L. monocytogenes detection by the real-time PCR method was less affected by the presence of Listeria innocua in the contaminated samples, proving to be more reliable than the reference method in this respect. The results of this international trial demonstrate that the evaluated real-time PCR-based method represents an excellent alternative to the ISO standard, since it shows higher performance and shortens the analytical process, and it can easily be implemented routinely by competent authorities and food industry laboratories. Copyright © 2014 Elsevier B.V. All rights reserved.
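For readers unfamiliar with the performance figures quoted above, the sketch below computes relative accuracy, sensitivity and specificity from a hypothetical 2x2 agreement table between an alternative method and the ISO reference method; the counts are invented, and the formulas follow the usual ISO 16140-style definitions rather than the trial's raw data.

```python
# Sketch of the relative performance figures used in ISO 16140-style
# validations, computed from a 2x2 agreement table between an alternative
# (real-time PCR) method and the ISO reference method. Counts are hypothetical.
def relative_performance(pa, na, fn, pd):
    """pa: positive by both methods, na: negative by both,
    fn: positive by reference only (missed by the alternative),
    pd: positive by alternative only (positive deviation)."""
    total = pa + na + fn + pd
    accuracy = 100.0 * (pa + na) / total
    sensitivity = 100.0 * pa / (pa + fn)          # against reference positives
    specificity = 100.0 * na / (na + pd)          # against reference negatives
    return accuracy, sensitivity, specificity

acc, sens, spec = relative_performance(pa=82, na=90, fn=2, pd=3)
print(f"relative accuracy {acc:.1f}%, sensitivity {sens:.1f}%, specificity {spec:.1f}%")
```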
ERIC Educational Resources Information Center
Brown, Robert D.; Gortmaker, Valerie J.
2009-01-01
Methodological and political issues arise during the designing, conducting, and reporting of campus-climate studies for LGBT students. These issues interact; making a decision about a methodological issue (e.g., sample size) has an impact on a political issue (e.g., how well the findings will be received). Ten key questions that must be addressed…
A Data Preparation Methodology in Data Mining Applied to Mortality Population Databases.
Pérez, Joaquín; Iturbide, Emmanuel; Olivares, Víctor; Hidalgo, Miguel; Martínez, Alicia; Almanza, Nelva
2015-11-01
It is known that the data preparation phase is the most time-consuming in the data mining process, consuming up to 50%, or even up to 70%, of the total project time. Currently, data mining methodologies are general purpose, and one of their limitations is that they do not provide guidance on which particular tasks to develop in a specific domain. This paper presents a new data preparation methodology oriented to the epidemiological domain, in which we have identified two sets of tasks: General Data Preparation and Specific Data Preparation. For both sets, the Cross-Industry Standard Process for Data Mining (CRISP-DM) is adopted as a guideline. The main contribution of our methodology is fourteen specialized tasks for this domain. To validate the proposed methodology, we developed a data mining system and applied the entire process to real mortality databases. The results were encouraging: the use of the methodology reduced some of the time-consuming tasks, and the data mining system revealed unknown and potentially useful patterns for the public health services in Mexico.
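A minimal sketch of a few general and domain-specific preparation tasks of the kind the methodology describes, using pandas on a handful of hypothetical mortality records; the column names, codes and age groupings are illustrative assumptions, not the paper's fourteen tasks.

```python
# Sketch of general (deduplication, missing values) and domain-specific
# (code and age-group derivation) data preparation on hypothetical mortality
# records. Column names and groupings are illustrative assumptions.
import pandas as pd

df = pd.DataFrame({
    "age": ["0", "34", "34", "67", "81", None, "150"],
    "sex": ["F", "M", "M", "F", "M", "F", "M"],
    "cause_of_death": ["P07", "C34", "C34", "I21", "J18", "I64", "X59"],
})

# General preparation: drop exact duplicates and records missing key fields
df = df.drop_duplicates().dropna(subset=["age", "sex", "cause_of_death"])

# Specific preparation for the epidemiological domain: validate ages,
# derive the ICD-10 chapter letter and the age groups used by the mining step
df["age"] = pd.to_numeric(df["age"], errors="coerce")
df = df[df["age"].between(0, 120)]
df["icd10_chapter"] = df["cause_of_death"].str[0]
df["age_group"] = pd.cut(df["age"], bins=[-1, 1, 15, 45, 65, 120],
                         labels=["infant", "child", "young adult", "adult", "older adult"])

print(df[["age", "age_group", "icd10_chapter"]])
```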
Scherer, Sebastian; Kowal, Julia; Chami, Mohamed; Dandey, Venkata; Arheit, Marcel; Ringler, Philippe; Stahlberg, Henning
2014-05-01
The introduction of direct electron detectors (DED) to cryo-electron microscopy has tremendously increased the signal-to-noise ratio (SNR) and quality of the recorded images. We discuss the optimal use of DEDs for cryo-electron crystallography, introduce a new automatic image processing pipeline, and demonstrate the vast improvement in the resolution achieved by the use of both together, especially for highly tilted samples. The new processing pipeline (now included in the software package 2dx) exploits the high SNR and frame readout frequency of DEDs to automatically correct for beam-induced sample movement, and reliably processes individual crystal images without human interaction as data are being acquired. A new graphical user interface (GUI) condenses all information required for quality assessment in one window, allowing the imaging conditions to be verified and adjusted during the data collection session. With this new pipeline an automatically generated unit cell projection map of each recorded 2D crystal is available less than 5 min after the image was recorded. The entire processing procedure yielded a three-dimensional reconstruction of the 2D-crystallized ion-channel membrane protein MloK1 with a much-improved resolution of 5Å in-plane and 7Å in the z-direction, within 2 days of data acquisition and simultaneous processing. The results obtained are superior to those delivered by conventional photographic film-based methodology of the same sample, and demonstrate the importance of drift-correction. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
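The drift correction mentioned above relies on estimating the shift between successive detector frames. The sketch below illustrates the basic idea with a plain FFT cross-correlation on synthetic frames; it is not the 2dx implementation, and real pipelines add refinements such as dose weighting and sub-pixel alignment.

```python
# Sketch of frame drift estimation via FFT cross-correlation, the basic idea
# behind beam-induced motion correction (not the 2dx code itself).
import numpy as np

def estimate_shift(reference, frame):
    """Return the integer (dy, dx) shift that best aligns `frame` to `reference`."""
    cross_power = np.fft.fft2(reference) * np.conj(np.fft.fft2(frame))
    correlation = np.fft.ifft2(cross_power).real
    dy, dx = np.unravel_index(np.argmax(correlation), correlation.shape)
    ny, nx = reference.shape
    if dy > ny // 2:
        dy -= ny
    if dx > nx // 2:
        dx -= nx
    return dy, dx

def align_and_sum(frames):
    """Shift every frame onto the first one and sum (a crude dose-fractionation average)."""
    reference = frames[0]
    total = reference.astype(float).copy()
    for frame in frames[1:]:
        dy, dx = estimate_shift(reference, frame)
        total += np.roll(frame, shift=(dy, dx), axis=(0, 1))
    return total

rng = np.random.default_rng(2)
base = rng.normal(size=(128, 128))
frames = [np.roll(base, (k, 2 * k), axis=(0, 1)) + 0.1 * rng.normal(size=base.shape)
          for k in range(5)]     # synthetic drifting frames
print(align_and_sum(frames).shape)
```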
Performance in physiology evaluation: possible improvement by active learning strategies.
Montrezor, Luís H
2016-12-01
The evaluation process is complex and extremely important in the teaching/learning process. Evaluations are constantly employed in the classroom to assist students in the learning process and to help teachers improve the teaching process. The use of active methodologies encourages students to participate in the learning process, encourages interaction with their peers, and stimulates thinking about physiological mechanisms. This study examined the performance of medical students on physiology over four semesters with and without active engagement methodologies. Four activities were used: a puzzle, a board game, a debate, and a video. The results show that engaging in activities with active methodologies before a physiology cognitive monitoring test significantly improved student performance compared with not performing the activities. We integrate the use of these methodologies with classic lectures, and this integration appears to improve the teaching/learning process in the discipline of physiology and improves the integration of physiology with cardiology and neurology. In addition, students enjoy the activities and perform better on their evaluations when they use them. Copyright © 2016 The American Physiological Society.
A New Method for Generating Probability Tables in the Unresolved Resonance Region
Holcomb, Andrew M.; Leal, Luiz C.; Rahnema, Farzad; ...
2017-04-18
A new method for constructing probability tables in the unresolved resonance region (URR) has been developed. This new methodology is an extensive modification of the single-level Breit-Wigner (SLBW) pseudo-resonance pair sequence method commonly used to generate probability tables in the URR. The new method uses a Monte Carlo process to generate many pseudo-resonance sequences by first sampling the average resonance parameter data in the URR and then converting the sampled resonance parameters to the more robust R-matrix limited (RML) format. Furthermore, for each sampled set of pseudo-resonance sequences, the temperature-dependent cross sections are reconstructed on a small grid around the energy of reference using the Reich-Moore formalism and the Leal-Hwang Doppler broadening methodology. We then use the effective cross sections calculated at the energies of reference to construct probability tables in the URR. The RML cross-section reconstruction algorithm has been rigorously tested for a variety of isotopes, including ¹⁶O, ¹⁹F, ³⁵Cl, ⁵⁶Fe, ⁶³Cu, and ⁶⁵Cu. The new URR method also produced normalized cross-section factor probability tables for ²³⁸U that were found to be in agreement with current standards. The modified ²³⁸U probability tables were shown to produce results in excellent agreement with several standard benchmarks, including the IEU-MET-FAST-007 (BIG TEN), IEU-MET-FAST-003, and IEU-COMP-FAST-004 benchmarks.
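As a simplified illustration of the final step, condensing many sampled cross-section values at a reference energy into an equiprobable-band probability table, the sketch below uses a lognormal stand-in for the full pseudo-resonance/Reich-Moore reconstruction; all numbers are hypothetical.

```python
# Sketch of probability-table construction: sample an effective cross section
# many times at a reference energy and condense the samples into equiprobable
# bands (probability, band-average cross section). The lognormal draw is a
# stand-in for the full resonance reconstruction described in the abstract.
import numpy as np

rng = np.random.default_rng(3)
n_samples, n_bands = 20000, 10

# Stand-in for "reconstruct the cross section at the reference energy for each
# sampled pseudo-resonance sequence" (barns, hypothetical scale)
sigma_samples = rng.lognormal(mean=np.log(12.0), sigma=0.35, size=n_samples)

edges = np.quantile(sigma_samples, np.linspace(0.0, 1.0, n_bands + 1))
probabilities = np.full(n_bands, 1.0 / n_bands)
band_averages = np.array([
    sigma_samples[(sigma_samples >= lo) & (sigma_samples <= hi)].mean()
    for lo, hi in zip(edges[:-1], edges[1:])
])

for p, s in zip(probabilities, band_averages):
    print(f"P = {p:.2f}, <sigma> = {s:6.2f} b")
```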
Pretend Play: Antecedent of Adult Creativity.
Russ, Sandra W
2016-01-01
This article reviews the theoretical and empirical literature in the area of pretend play as a predictor of adult creativity. There is strong evidence that processes expressed in pretend play are associated with measures of creativity, especially with divergent thinking. There is some evidence from longitudinal studies that this association is stable over time. Converging evidence suggests that cognitive and affective processes in pretend play are involved in adult creative production. However, there is a lack of consensus in the field as to whether engaging in pretend play actually facilitates creative thinking. In addition, many other variables (opportunity, tolerance for failure, motivation, work ethic, etc.) determine whether children with creative potential are actually creative in adulthood. In spite of the many methodological challenges in conducting research in the play area, it is important to continue investigating specific processes expressed in play and their developmental trajectories. Large samples in multisite studies would be ideal in investigating the ability of specific play processes to predict these creative processes and creative productivity in adulthood. © 2016 Wiley Periodicals, Inc.
Trajectory Dispersed Vehicle Process for Space Launch System
NASA Technical Reports Server (NTRS)
Statham, Tamara; Thompson, Seth
2017-01-01
The Space Launch System (SLS) vehicle is part of NASA's deep space exploration plans, which include manned missions to Mars. Manufacturing uncertainties in design parameters are key considerations throughout SLS development, as they have significant effects on focus parameters such as lift-off thrust-to-weight ratio, vehicle payload, maximum dynamic pressure, and compression loads. This presentation discusses how the SLS program captures these uncertainties using a 3-degree-of-freedom (DOF) process called Trajectory Dispersed (TD) analysis. This analysis biases nominal trajectories to identify extremes in the design parameters for various potential SLS configurations and missions. The process uses a Design of Experiments (DOE) and response surface methodologies (RSM) to statistically sample uncertainties, and develops the resulting vehicles using a Maximum Likelihood Estimate (MLE) process to target uncertainty biases. These vehicles represent various missions and configurations and are used as key inputs to a variety of analyses in the SLS design process, including 6-DOF dispersions, separation clearances, and engine-out failure studies.
Modeling Business Processes in Public Administration
NASA Astrophysics Data System (ADS)
Repa, Vaclav
During more than 10 years of its existence, business process modeling has become a regular part of organization management practice. It is mostly regarded as a part of information system development, or even as a way to implement some supporting technology (for instance, a workflow system). Although I do not agree with such a reduction of the real meaning of a business process, it is necessary to admit that information technologies play an essential role in business processes (see [1] for more information). Consequently, an information system is inseparable from a business process itself, because it is a cornerstone of the general basic infrastructure of a business. This fact impacts all dimensions of business process management. One of these dimensions is methodology, which postulates that information systems development provide business process management with exact methods and tools for modeling business processes. The methodology underlying the approach presented in this paper also has its roots in information systems development methodology.
Gene network analysis: from heart development to cardiac therapy.
Ferrazzi, Fulvia; Bellazzi, Riccardo; Engel, Felix B
2015-03-01
Networks offer a flexible framework to represent and analyse the complex interactions between components of cellular systems. In particular gene networks inferred from expression data can support the identification of novel hypotheses on regulatory processes. In this review we focus on the use of gene network analysis in the study of heart development. Understanding heart development will promote the elucidation of the aetiology of congenital heart disease and thus possibly improve diagnostics. Moreover, it will help to establish cardiac therapies. For example, understanding cardiac differentiation during development will help to guide stem cell differentiation required for cardiac tissue engineering or to enhance endogenous repair mechanisms. We introduce different methodological frameworks to infer networks from expression data such as Boolean and Bayesian networks. Then we present currently available temporal expression data in heart development and discuss the use of network-based approaches in published studies. Collectively, our literature-based analysis indicates that gene network analysis constitutes a promising opportunity to infer therapy-relevant regulatory processes in heart development. However, the use of network-based approaches has so far been limited by the small amount of samples in available datasets. Thus, we propose to acquire high-resolution temporal expression data to improve the mathematical descriptions of regulatory processes obtained with gene network inference methodologies. Especially probabilistic methods that accommodate the intrinsic variability of biological systems have the potential to contribute to a deeper understanding of heart development.
NASA Astrophysics Data System (ADS)
Asaithambi, Perumal; Beyene, Dejene; Aziz, Abdul Raman Abdul; Alemayehu, Esayas
2018-05-01
Treatment of landfill leachate wastewater by an electrocoagulation process using aluminium electrodes was investigated in a batch electrochemical cell reactor. Response surface methodology based on a central composite design was used to optimize the operating parameters for color and total organic carbon (TOC) removal together with power consumption. The effects of three important independent parameters, current density (X1), inter-electrode distance (X2) and solution pH (X3) of the landfill leachate sample, on the percentage color and TOC removal and the power consumption were investigated. A quadratic model was used to predict the color and TOC removal and the power consumption under different experimental conditions. The significance of each independent variable was assessed by analysis of variance. To achieve maximum color and TOC removal with minimum power consumption, the optimum conditions were a current density (X1) of 5.25 A/dm2, an inter-electrode distance (X2) of 1 cm and an initial effluent pH (X3) of 7.83, yielding 74.57% color removal and 51.75% TOC removal with a power consumption of 14.80 kWh/m3. The electrocoagulation process could thus be applied to remove pollutants from industrial effluents and wastewater.
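A minimal sketch of the kind of quadratic (central-composite-design) response-surface fit described above, using ordinary least squares on hypothetical coded factor settings and responses rather than the study's data.

```python
# Sketch of fitting a three-factor quadratic response-surface model (as used
# with a central composite design) by ordinary least squares. Design points
# and responses are hypothetical.
import numpy as np

def quadratic_terms(X):
    """Columns: 1, x1, x2, x3, x1^2, x2^2, x3^2, x1x2, x1x3, x2x3."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1**2, x2**2, x3**2,
                            x1*x2, x1*x3, x2*x3])

rng = np.random.default_rng(4)
X = rng.uniform(-1, 1, size=(20, 3))                                   # coded factor levels
y = 70 + 5*X[:, 0] - 3*X[:, 2] - 4*X[:, 0]**2 + rng.normal(0, 1, 20)   # % color removal (synthetic)

beta, *_ = np.linalg.lstsq(quadratic_terms(X), y, rcond=None)
y_hat = quadratic_terms(X) @ beta
r2 = 1 - np.sum((y - y_hat)**2) / np.sum((y - y.mean())**2)
print("coefficients:", np.round(beta, 2))
print("R^2 of the quadratic model:", round(r2, 3))
```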
Ebshish, Ali; Yaakob, Zahira; Taufiq-Yap, Yun Hin; Bshish, Ahmed
2014-01-01
In this work, a response surface methodology (RSM) was implemented to investigate the process variables in a hydrogen production system. The effects of five independent variables, namely the temperature (X1), the flow rate (X2), the catalyst weight (X3), the catalyst loading (X4) and the glycerol-water molar ratio (X5), on the H2 yield (Y1) and the conversion of glycerol to gaseous products (Y2) were explored. Using multiple regression analysis, the experimental results of the H2 yield and the glycerol conversion to gases were fit to quadratic polynomial models. The proposed mathematical models correlated the dependent factors well within the limits being examined. The best values of the process variables were a temperature of approximately 600 °C, a feed flow rate of 0.05 mL/min, a catalyst weight of 0.2 g, a catalyst loading of 20% and a glycerol-water molar ratio of approximately 12, where the H2 yield was predicted to be 57.6% and the conversion of glycerol was predicted to be 75%. To validate the proposed models, statistical analysis using a two-sample t-test was performed, and the results showed that the models could predict the responses satisfactorily within the limits of the variables that were studied. PMID:28788567
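The companion step to fitting such quadratic models is locating the optimum within the factor bounds. The sketch below does this for a hypothetical fitted five-factor quadratic in coded variables with scipy.optimize; the coefficients are invented, not the paper's model.

```python
# Sketch of locating the optimum of a fitted quadratic response surface within
# the coded factor bounds, the step that yields "best values" such as those
# reported above. All coefficient values are hypothetical.
import numpy as np
from scipy.optimize import minimize

# Hypothetical fitted model for H2 yield (%), in coded variables x in [-1, 1]^5
b0 = 50.0
b_lin = np.array([4.0, -1.5, 2.0, 1.0, 3.0])
b_quad = np.diag([-3.0, -0.5, -1.0, -0.8, -2.0])     # negative curvature: interior maximum

def predicted_yield(x):
    return b0 + b_lin @ x + x @ b_quad @ x

result = minimize(lambda x: -predicted_yield(x),      # maximize by minimizing the negative
                  x0=np.zeros(5),
                  bounds=[(-1.0, 1.0)] * 5)

print("optimal coded settings:", np.round(result.x, 2))
print("predicted H2 yield at optimum: %.1f%%" % predicted_yield(result.x))
```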
Cloud-scale genomic signals processing classification analysis for gene expression microarray data.
Harvey, Benjamin; Soo-Yeon Ji
2014-01-01
As microarray data available to scientists continue to increase in size and complexity, it has become overwhelmingly important to find multiple ways to draw inference, useful to scientists, through the analysis of DNA/mRNA sequence data. Though there have been many attempts to bring forth biological inference by means of wavelet preprocessing and classification, no research effort has focused on a cloud-scale classification analysis of microarray data that uses wavelet thresholding in a Cloud environment to identify significantly expressed features. This paper proposes a novel methodology that uses wavelet-based denoising to initialize a threshold for determining significantly expressed genes for classification. Additionally, this research was implemented within a cloud-based distributed processing environment. Cloud computing and wavelet thresholding were used for the classification of 14 tumor classes from the Global Cancer Map (GCM). The results proved to be more accurate than using a predefined p-value for differential expression classification. This novel methodology analyzed wavelet-based threshold features of gene expression in a Cloud environment and classified the expression of samples by analyzing gene patterns, which inform us of biological processes. Moreover, it enables researchers to face the present and forthcoming challenges that may arise in the analysis of large microarray datasets in functional genomics.
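A minimal sketch of the wavelet-thresholding ingredient described above, applied to a single synthetic expression profile with PyWavelets using the universal soft threshold; the cloud-scale distribution and the downstream classifier are omitted.

```python
# Sketch of wavelet-based denoising/thresholding of a single synthetic
# expression profile with PyWavelets (one ingredient of the methodology;
# the Cloud distribution and the classifier are not shown).
import numpy as np
import pywt

rng = np.random.default_rng(5)
signal = np.repeat([1.0, 4.0, 2.0, 6.0], 64)          # piecewise "expression" pattern
noisy = signal + rng.normal(0.0, 0.8, size=signal.size)

coeffs = pywt.wavedec(noisy, "db4", level=4)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise estimate from the finest scale
threshold = sigma * np.sqrt(2.0 * np.log(noisy.size)) # universal threshold
denoised_coeffs = [coeffs[0]] + [pywt.threshold(c, threshold, mode="soft")
                                 for c in coeffs[1:]]
denoised = pywt.waverec(denoised_coeffs, "db4")[: noisy.size]

print("noisy vs. denoised RMSE:",
      round(np.sqrt(np.mean((noisy - signal) ** 2)), 3),
      round(np.sqrt(np.mean((denoised - signal) ** 2)), 3))
```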
Digital-image processing and image analysis of glacier ice
Fitzpatrick, Joan J.
2013-01-01
This document provides a methodology for extracting grain statistics from 8-bit color and grayscale images of thin sections of glacier ice—a subset of physical properties measurements typically performed on ice cores. This type of analysis is most commonly used to characterize the evolution of ice-crystal size, shape, and intercrystalline spatial relations within a large body of ice sampled by deep ice-coring projects from which paleoclimate records will be developed. However, such information is equally useful for investigating the stress state and physical responses of ice to stresses within a glacier. The methods of analysis presented here go hand-in-hand with the analysis of ice fabrics (aggregate crystal orientations) and, when combined with fabric analysis, provide a powerful method for investigating the dynamic recrystallization and deformation behaviors of bodies of ice in motion. The procedures described in this document compose a step-by-step handbook for a specific image acquisition and data reduction system built in support of U.S. Geological Survey ice analysis projects, but the general methodology can be used with any combination of image processing and analysis software. The specific approaches in this document use the FoveaPro 4 plug-in toolset to Adobe Photoshop CS5 Extended but it can be carried out equally well, though somewhat less conveniently, with software such as the image processing toolbox in MATLAB, Image-Pro Plus, or ImageJ.
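A minimal sketch of the kind of grain segmentation and size statistics the handbook covers, using scikit-image on a synthetic grayscale image rather than the FoveaPro/Photoshop toolset described in the document.

```python
# Sketch of extracting basic grain statistics from a grayscale thin-section
# image with scikit-image (a free alternative to the FoveaPro/Photoshop
# workflow described in the handbook). The image here is synthetic.
import numpy as np
from skimage import filters, measure, morphology

rng = np.random.default_rng(6)
image = rng.normal(0.4, 0.05, size=(256, 256))
image[40:100, 60:140] += 0.4                      # two bright synthetic "grains"
image[150:220, 30:90] += 0.35

binary = image > filters.threshold_otsu(image)    # separate grains from boundaries
binary = morphology.remove_small_objects(binary, min_size=50)
labels = measure.label(binary)
props = measure.regionprops(labels)

areas = np.array([p.area for p in props])
print("number of grains:", len(props))
print("mean grain area (px):", areas.mean())
print("mean equivalent diameter (px):",
      np.mean([p.equivalent_diameter for p in props]))
```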
76 FR 62068 - Proposed Data Collections Submitted for Public Comment and Recommendations
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-06
... methodological objectives. The first objective is to test the feasibility of the proposed sampling frame and to... minutes. Results of the methodological component of the feasibility study will be used to assess the...
NASA Technical Reports Server (NTRS)
Baker, T. C. (Principal Investigator)
1982-01-01
A general methodology is presented for estimating a stratum's at-harvest crop acreage proportion for a given crop year (the target year) from the crop's estimated acreage proportion in sample segments from within the stratum. Sample segments from crop years other than the target year are usually required for use in conjunction with those from the target year. In addition, the stratum's (identifiable) crop acreage proportion may be estimated for times other than at-harvest in some situations. A by-product of the procedure is a methodology for estimating the change in the stratum's at-harvest crop acreage proportion from crop year to crop year. An implementation of the proposed procedure as a Statistical Analysis System (SAS) routine using the system's matrix language module, PROC MATRIX, is described and documented. Three examples illustrating the use of the methodology and algorithm are provided.
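A minimal sketch of a ratio-type stratum estimator in the spirit of the methodology described above, combining target-year segment proportions with a larger prior-year segment sample; this is an illustrative assumption, not the report's exact estimator or its PROC MATRIX implementation.

```python
# Sketch of a simple stratum-level estimator of at-harvest crop proportion
# from segment-level proportions, using prior-year segments as auxiliary
# information (a ratio-type adjustment). All proportions are hypothetical.
import numpy as np

# Segments observed in both the target year and an earlier crop year
target_year = np.array([0.32, 0.28, 0.41, 0.36])        # target-year segment proportions
prior_year_matched = np.array([0.30, 0.25, 0.38, 0.35]) # same segments, earlier year

# Larger sample of segments available only for the earlier crop year
prior_year_all = np.array([0.30, 0.25, 0.38, 0.35, 0.27, 0.33, 0.29, 0.40, 0.31])

# Ratio adjustment: scale the prior-year stratum mean by the observed change
change_ratio = target_year.mean() / prior_year_matched.mean()
stratum_estimate = change_ratio * prior_year_all.mean()
year_to_year_change = stratum_estimate - prior_year_all.mean()

print(f"estimated at-harvest proportion: {stratum_estimate:.3f}")
print(f"estimated change from prior year: {year_to_year_change:+.3f}")
```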