Sample records for defined pre-analytical quality

  1. Quality Measures in Pre-Analytical Phase of Tissue Processing: Understanding Its Value in Histopathology.

    PubMed

    Rao, Shalinee; Masilamani, Suresh; Sundaram, Sandhya; Duvuru, Prathiba; Swaminathan, Rajendiran

    2016-01-01

    Quality monitoring in a histopathology unit is categorized into three phases, pre-analytical, analytical and post-analytical, to cover the various steps of the entire test cycle. A review of the literature on quality evaluation studies in histopathology revealed that earlier reports focused mainly on analytical aspects, with limited studies assessing the pre-analytical phase. The pre-analytical phase encompasses several processing steps and handling of the specimen/sample by multiple individuals, leaving ample scope for errors. Given its critical nature and the limited studies assessing its quality in the past, the pre-analytical phase deserves more attention. This study was undertaken to analyse and assess the quality parameters of the pre-analytical phase in a histopathology laboratory. This was a retrospective study of pre-analytical parameters in the histopathology laboratory of a tertiary care centre, covering 18,626 tissue specimens received over 34 months. Registers and records were checked for efficiency and errors in the following pre-analytical quality variables: specimen identification, specimen receipt in appropriate fixatives, lost specimens, daily internal quality control performance on staining, performance in an inter-laboratory quality assessment program (External Quality Assurance Scheme, EQAS) and evaluation of internal non-conformities (NC) for other errors. The study revealed incorrect specimen labelling in 0.04%, 0.01% and 0.01% of specimens in 2007, 2008 and 2009 respectively. About 0.04%, 0.07% and 0.18% of specimens were not sent in fixatives in 2007, 2008 and 2009 respectively. There was no incidence of lost specimens. A total of 113 non-conformities were identified, of which 92.9% belonged to the pre-analytical phase. The predominant NC (any deviation from the normal standard which may generate an error and compromise quality standards) was wrong labelling of slides. Performance in EQAS for the pre-analytical phase was satisfactory in 6 of 9 cycles. The low incidence of errors in the pre-analytical phase implies that a satisfactory level of quality standards was being practised, though there is still scope for improvement.

  2. Review of Pre-Analytical Errors in Oral Glucose Tolerance Testing in a Tertiary Care Hospital.

    PubMed

    Nanda, Rachita; Patel, Suprava; Sahoo, Sibashish; Mohapatra, Eli

    2018-03-13

    The pre-pre-analytical and pre-analytical phases account for a major share of laboratory errors. This study examined a very common procedure, the oral glucose tolerance test, to identify pre-pre-analytical errors. Quality indicators provide evidence of quality, support accountability and help in the decision making of laboratory personnel. The aim of this research was to evaluate the pre-analytical performance of the oral glucose tolerance test procedure. An observational study was conducted over a period of three months in the phlebotomy and accessioning unit of our laboratory, using a questionnaire that examined pre-pre-analytical errors through a scoring system. The pre-analytical phase was analyzed for each sample collected as per seven quality indicators. About 25% of the population gave a wrong answer to the question that tested knowledge of patient preparation. QI-1, the appropriateness of the test result, had the most errors. Although QI-5, for sample collection, had a low error rate, it is a very important indicator, as any wrongly collected sample can alter the test result. Evaluating the pre-analytical and pre-pre-analytical phases is essential and must be conducted routinely, on a yearly basis, to identify errors, take corrective action and facilitate the gradual introduction of these evaluations into routine practice.

  3. How to conduct External Quality Assessment Schemes for the pre-analytical phase?

    PubMed

    Kristensen, Gunn B B; Aakre, Kristin Moberg; Kristoffersen, Ann Helen; Sandberg, Sverre

    2014-01-01

    In laboratory medicine, several studies have described the most frequent errors in the different phases of the total testing process, and a large proportion of these errors occur in the pre-analytical phase. Schemes for the registration of errors and subsequent feedback to participants have been conducted for decades for the analytical phase by External Quality Assessment (EQA) organizations operating in most countries. The aim of this paper is to present an overview of different types of EQA schemes for the pre-analytical phase and to give examples of some existing schemes. So far, very few EQA organizations have focused on the pre-analytical phase, and most do not offer pre-analytical EQA schemes (EQAS). Pre-analytical EQAS are more difficult to perform and standardize, and accreditation bodies do not ask laboratories for results from such schemes. However, some ongoing EQA programs for the pre-analytical phase do exist, and some examples are given in this paper. The methods used can be divided into three types: collecting information about pre-analytical laboratory procedures, circulating real samples to collect information about interferences that might affect the measurement procedure, or registering actual laboratory errors and relating these to quality indicators. These three types have different focuses and different implementation challenges, and a combination of the three is probably necessary to detect and monitor the wide range of errors occurring in the pre-analytical phase.

  4. Impact of Educational Activities in Reducing Pre-Analytical Laboratory Errors: A quality initiative.

    PubMed

    Al-Ghaithi, Hamed; Pathare, Anil; Al-Mamari, Sahimah; Villacrucis, Rodrigo; Fawaz, Naglaa; Alkindi, Salam

    2017-08-01

    Pre-analytic errors during diagnostic laboratory investigations can lead to increased patient morbidity and mortality. This study aimed to ascertain the effect of educational nursing activities on the incidence of pre-analytical errors resulting in non-conforming blood samples. This study was conducted between January 2008 and December 2015. All specimens received at the Haematology Laboratory of the Sultan Qaboos University Hospital, Muscat, Oman, during this period were prospectively collected and analysed. Similar data from 2007 were collected retrospectively and used as a baseline for comparison. Non-conforming samples were defined as either clotted samples, haemolysed samples, use of the wrong anticoagulant, insufficient quantities of blood collected, incorrect/lack of labelling on a sample or lack of delivery of a sample in spite of a sample request. From 2008 onwards, multiple educational training activities directed at the hospital nursing staff and nursing students primarily responsible for blood collection were implemented on a regular basis. After initiating corrective measures in 2008, a progressive reduction in the percentage of non-conforming samples was observed from 2009 onwards. Despite a 127.84% increase in the total number of specimens received, there was a significant reduction in non-conforming samples from 0.29% in 2007 to 0.07% in 2015, resulting in an improvement of 75.86% (P < 0.050). In particular, specimen identification errors decreased by 0.056%, with a 96.55% improvement. Targeted educational activities directed primarily towards hospital nursing staff had a positive impact on the quality of laboratory specimens by significantly reducing pre-analytical errors.
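The headline improvement figure above is a simple relative-change calculation; a minimal sketch (the helper name is ours, not the study's) reproduces it:

```python
def relative_improvement(baseline: float, final: float) -> float:
    """Percentage reduction of an error rate relative to its baseline."""
    return (baseline - final) / baseline * 100

# Non-conforming sample rate: 0.29% in 2007 down to 0.07% in 2015.
print(round(relative_improvement(0.29, 0.07), 2))  # → 75.86
```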

  5. Prevalence of Pre-Analytical Errors in Clinical Chemistry Diagnostic Labs in Sulaimani City of Iraqi Kurdistan

    PubMed Central

    2017-01-01

    Background Laboratory testing is roughly divided into three phases: a pre-analytical phase, an analytical phase and a post-analytical phase. Most analytical errors have been attributed to the analytical phase. However, recent studies have shown that up to 70% of analytical errors reflect the pre-analytical phase. The pre-analytical phase comprises all processes from the time a laboratory request is made by a physician until the specimen is analyzed at the lab. Generally, the pre-analytical phase includes patient preparation, specimen transportation, specimen collection and storage. In the present study, we report the first comprehensive assessment of the frequency and types of pre-analytical errors at the Sulaimani diagnostic labs in Iraqi Kurdistan. Materials and Methods Over 2 months, 5500 venous blood samples were observed in 10 public diagnostic labs of Sulaimani City. The percentages of rejected samples and the types of sample inappropriateness were evaluated. The percentage of each of the following pre-analytical errors was recorded: delay in sample transportation, clotted samples, expired reagents, hemolyzed samples, samples not on ice, incorrect sample identification, insufficient sample, tube broken in centrifuge, request procedure errors, sample mix-ups, communication conflicts, misinterpreted orders, lipemic samples, contaminated samples and missed physician's request orders. The difference between the relative frequencies of errors observed in the hospitals considered was tested using a proportional Z test. In particular, the survey aimed to discover whether analytical errors were recorded and to examine the types of platforms used in the selected diagnostic labs. Results The analysis showed a high prevalence of improper sample handling during the pre-analytical phase. The percentage of inappropriate samples was as high as 39%. The major reasons for rejection were hemolyzed samples (9%), incorrect sample identification (8%) and clotted samples (6%).
Most quality control schemes at Sulaimani hospitals focus only on the analytical phase, and none of the pre-analytical errors were recorded. Interestingly, none of the labs were internationally accredited; therefore, corrective actions are needed at these hospitals to ensure better health outcomes. Internal and External Quality Assessment Schemes (EQAS) for the pre-analytical phase at Sulaimani clinical laboratories should be implemented at public hospitals. Furthermore, lab personnel, particularly phlebotomists, need continuous training on the importance of sample quality to obtain accurate test results. PMID:28107395
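The "proportional Z test" named above is the standard two-proportion z test; a self-contained sketch with hypothetical counts (not the study's data):

```python
from math import sqrt

def two_proportion_z(x1: int, n1: int, x2: int, n2: int) -> float:
    """z statistic for comparing two error proportions, pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical: 45 haemolysed samples of 500 in lab A vs. 30 of 500 in lab B.
z = two_proportion_z(45, 500, 30, 500)  # z ≈ 1.80; compare against ±1.96 at α = 0.05
```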

  6. Prevalence of Pre-Analytical Errors in Clinical Chemistry Diagnostic Labs in Sulaimani City of Iraqi Kurdistan.

    PubMed

    Najat, Dereen

    2017-01-01

    Laboratory testing is roughly divided into three phases: a pre-analytical phase, an analytical phase and a post-analytical phase. Most analytical errors have been attributed to the analytical phase. However, recent studies have shown that up to 70% of analytical errors reflect the pre-analytical phase. The pre-analytical phase comprises all processes from the time a laboratory request is made by a physician until the specimen is analyzed at the lab. Generally, the pre-analytical phase includes patient preparation, specimen transportation, specimen collection and storage. In the present study, we report the first comprehensive assessment of the frequency and types of pre-analytical errors at the Sulaimani diagnostic labs in Iraqi Kurdistan. Over 2 months, 5500 venous blood samples were observed in 10 public diagnostic labs of Sulaimani City. The percentages of rejected samples and the types of sample inappropriateness were evaluated. The percentage of each of the following pre-analytical errors was recorded: delay in sample transportation, clotted samples, expired reagents, hemolyzed samples, samples not on ice, incorrect sample identification, insufficient sample, tube broken in centrifuge, request procedure errors, sample mix-ups, communication conflicts, misinterpreted orders, lipemic samples, contaminated samples and missed physician's request orders. The difference between the relative frequencies of errors observed in the hospitals considered was tested using a proportional Z test. In particular, the survey aimed to discover whether analytical errors were recorded and to examine the types of platforms used in the selected diagnostic labs. The analysis showed a high prevalence of improper sample handling during the pre-analytical phase. The percentage of inappropriate samples was as high as 39%. The major reasons for rejection were hemolyzed samples (9%), incorrect sample identification (8%) and clotted samples (6%).
Most quality control schemes at Sulaimani hospitals focus only on the analytical phase, and none of the pre-analytical errors were recorded. Interestingly, none of the labs were internationally accredited; therefore, corrective actions are needed at these hospitals to ensure better health outcomes. Internal and External Quality Assessment Schemes (EQAS) for the pre-analytical phase at Sulaimani clinical laboratories should be implemented at public hospitals. Furthermore, lab personnel, particularly phlebotomists, need continuous training on the importance of sample quality to obtain accurate test results.

  7. [Management of pre-analytical nonconformities].

    PubMed

    Berkane, Z; Dhondt, J L; Drouillard, I; Flourié, F; Giannoli, J M; Houlbert, C; Surgat, P; Szymanowicz, A

    2010-12-01

    The main nonconformities are enumerated to facilitate a consensual codification. For each, an action is defined: refusal to perform the examination with a request for a new sample, a request for information or correction, cancellation of results, or notification of the nurse or physician. Traceability of the curative, corrective and preventive actions is required. Methodology and indicators are then proposed to assess nonconformities and to follow quality improvements. The laboratory information system can be used instead of dedicated software. Tools for the follow-up of nonconformity scores are proposed. Finally, we propose an organization and tools allowing the management and control of the nonconformities occurring during the pre-examination phase.

  8. [Quality Management and Quality Specifications of Laboratory Tests in Clinical Studies--Challenges in Pre-Analytical Processes in Clinical Laboratories].

    PubMed

    Ishibashi, Midori

    2015-01-01

    Cost, speed and quality are the three important factors recently indicated by the Ministry of Health, Labour and Welfare (MHLW) for the purpose of accelerating clinical studies. Against this background, the importance of laboratory tests is increasing, especially in the evaluation of clinical study participants' entry and safety, and of drug efficacy. Providing high-quality laboratory tests is therefore mandatory, and adequate quality assurance requires quality control in the three fields of pre-analytical, analytical and post-analytical processes. There are, however, no detailed written requirements concerning specimen collection, handling, preparation, storage and shipping. Most laboratory tests for clinical studies are performed onsite in a local laboratory; some, however, are performed in offsite central laboratories after specimen shipping. Individual and inter-individual variations are well-known factors affecting laboratory tests. Beyond these, standardizing specimen collection, handling, preparation, storage and shipping may improve and maintain the high quality of clinical studies in general. Furthermore, the analytical method, units and reference interval are also important factors. It is concluded that, to overcome the problems arising from pre-analytical processes, it is necessary to standardize specimen handling in a broad sense.

  9. Irregular analytical errors in diagnostic testing - a novel concept.

    PubMed

    Vogeser, Michael; Seger, Christoph

    2018-02-23

    In laboratory medicine, routine periodic analyses for internal and external quality control, interpreted by statistical methods, are mandatory for batch clearance. Data analysis of these process-oriented measurements provides insight into random analytical variation and systematic calibration bias over time. However, in such a setting, no individual sample is under individual quality control: the quality control measurements act only at the batch level. Many effects and interferences associated with an individual diagnostic sample, with quantitative or qualitative impact, can compromise any analyte, and a quality-control-sample-based approach to quality assurance is obviously not sensitive to such errors. To address the potential causes and nature of such analytical interference in individual samples more systematically, we suggest the introduction of a new term, the irregular (individual) analytical error. Practically, this term can be applied in any analytical assay that is traceable to a reference measurement system. For an individual sample, an irregular analytical error is defined as an inaccuracy (a deviation from the reference measurement procedure result) of a test result that is too large to be explained by the measurement uncertainty of the routine assay operating within the accepted limitations of the associated process quality control measurements; the explainable deviation can be defined as the linear combination of the process measurement uncertainty and the method bias relative to the reference measurement system. The measurement result is compromised either by an irregular effect associated with the individual composition (matrix) of the sample or by a single-sample processing error in the analytical process.
    Currently, the availability of reference measurement procedures is still highly limited, but LC-isotope-dilution mass spectrometry methods are increasingly used for pre-market validation of routine diagnostic assays (these validations also involve substantial sets of clinical samples). Based on this definition and terminology, we list recognized causes of irregular analytical error as a risk catalog for clinical chemistry in this article. These include reproducible individual analytical errors (e.g. caused by anti-reagent antibodies) and non-reproducible, sporadic errors (e.g. incorrect pipetting volume due to air bubbles in a sample), both of which can lead to inaccurate results and risks for patients.
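The definition above suggests a simple acceptance check: flag a result whose deviation from a reference-procedure value exceeds what the assay's uncertainty plus bias can explain. A hedged sketch (the threshold formula is our reading of the paper's "linear combination"; all numbers are hypothetical):

```python
def is_irregular_error(routine: float, reference: float,
                       u_routine: float, bias: float, k: float = 2.0) -> bool:
    """True if the deviation from the reference-procedure result exceeds the
    expanded measurement uncertainty (coverage factor k) plus the method bias."""
    return abs(routine - reference) > k * u_routine + abs(bias)

# Hypothetical glucose pair: routine 7.9 vs. reference 6.1 mmol/L,
# standard uncertainty 0.15 mmol/L, method bias 0.1 mmol/L.
flag = is_irregular_error(7.9, 6.1, 0.15, 0.1)  # deviation 1.8 > 0.4 → True
```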

  10. Quality specification in haematology: the automated blood cell count.

    PubMed

    Buttarello, Mauro

    2004-08-02

    Quality specifications for automated blood cell counts include topics that go beyond the traditional analytic stage (imprecision, inaccuracy, quality control) and extend to the pre- and post-analytic phases. In this review, pre-analytic aspects concerning the choice of anticoagulants, maximum conservation times and differences between storage at room temperature and at 4 degrees C are considered. For the analytic phase, goals for imprecision and bias obtained with various approaches (ratio to biologic variation, state of the art, specific clinical situations) are evaluated. For the post-analytic phase, medical review criteria (algorithm, decision limit and delta check) and the structure of the report (general part and comments), which constitutes the formal act through which a laboratory communicates with clinicians, are considered. K2EDTA is considered the anticoagulant of choice for automated cell counts. Regarding storage, specimens should be analyzed as soon as possible. Storage at 4 degrees C may stabilize specimens from 24 to 72 h when a complete blood count (CBC) and differential leucocyte count (DLC) are performed. For precision, analytical goals based on the state of the art are acceptable, while for bias this is satisfactory only for some parameters. In haematology, quality specifications for the pre-analytical and analytical phases are important, but the review criteria and the quality of the report play a central role in assuring definite clinical value.

  11. Pre-analytical and analytical aspects affecting clinical reliability of plasma glucose results.

    PubMed

    Pasqualetti, Sara; Braga, Federica; Panteghini, Mauro

    2017-07-01

    The measurement of plasma glucose (PG) plays a central role in recognizing disturbances in carbohydrate metabolism, with established decision limits that are globally accepted. This requires that PG results are reliable and unequivocally valid no matter where they are obtained. To control the pre-analytical variability of PG and prevent in vitro glycolysis, the use of citrate as a rapidly effective glycolysis inhibitor has been proposed. However, the commercial availability of several tubes, with studies showing different performance, has created confusion among users. Moreover, and more importantly, studies have shown that tubes promptly inhibiting glycolysis give PG results that are significantly higher than tubes containing sodium fluoride only, which were used in the majority of studies generating the current PG cut-points, resulting in a different clinical classification of subjects. From the analytical point of view, to be equivalent among different measuring systems, PG results should be traceable to a recognized higher-order reference via the implementation of an unbroken metrological hierarchy. In doing this, it is important that manufacturers of measuring systems consider the uncertainty accumulated through the different steps of the selected traceability chain. In particular, PG results should fulfil analytical performance specifications defined to fit the intended clinical application. Since PG is under tight homeostatic control, its biological variability may be used to define these limits. Alternatively, given the central diagnostic role of the analyte, an outcome model showing the impact of the analytical performance of the test on the clinical classification of subjects can be used. Using these specifications, performance assessment studies employing commutable control materials with values assigned by a reference procedure have shown that the quality of PG measurements is often far from desirable and that problems are exacerbated with point-of-care devices.

  12. A Six Sigma Trial For Reduction of Error Rates in Pathology Laboratory.

    PubMed

    Tosuner, Zeynep; Gücin, Zühal; Kiran, Tuğçe; Büyükpinarbaşili, Nur; Turna, Seval; Taşkiran, Olcay; Arici, Dilek Sema

    2016-01-01

    A major target of quality assurance is the minimization of error rates in order to enhance patient safety. Six Sigma is a method used in industry that targets near-zero error (3.4 errors per million events). The five main principles of Six Sigma are defining, measuring, analysing, improving and controlling. Using this methodology, the causes of errors can be examined and process improvement strategies can be identified. The aim of our study was to evaluate the utility of Six Sigma methodology for error reduction in our pathology laboratory. The errors encountered between April 2014 and April 2015 were recorded by the pathology personnel. Error follow-up forms were examined by the quality control supervisor, the administrative supervisor and the head of the department. Using Six Sigma methodology, the error rate was measured monthly and the distribution of errors across the pre-analytical, analytical and post-analytical phases was analysed. Improvement strategies were discussed in the monthly intradepartmental meetings, and units with high error rates were monitored. Fifty-six (52.4%) of the 107 recorded errors were in the pre-analytical phase; 45 errors (42%) were analytical and 6 (5.6%) post-analytical. Two of the 45 analytical errors were major, irrevocable errors. The error rate was 6.8 per million in the first half of the year and 1.3 per million in the second half, a decrease of 79.77%. The Six Sigma trial in our pathology laboratory reduced error rates, mainly in the pre-analytical and analytical phases.
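The per-million rates above are defects per million opportunities (DPMO), the standard Six Sigma error metric; a one-function sketch with hypothetical counts (not the study's raw data, which are not given in the abstract):

```python
def errors_per_million(errors: int, opportunities: int) -> float:
    """Defects per million opportunities (DPMO), the Six Sigma error metric."""
    return errors / opportunities * 1_000_000

# Hypothetical: 34 errors across 5 million test events.
rate = errors_per_million(34, 5_000_000)  # → 6.8 per million
```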

  13. Pre-analytical issues in the haemostasis laboratory: guidance for the clinical laboratories.

    PubMed

    Magnette, A; Chatelain, M; Chatelain, B; Ten Cate, H; Mullier, F

    2016-01-01

    Ensuring quality has become a daily requirement in laboratories. In haemostasis, even more than in other disciplines of laboratory medicine, quality is determined by a pre-analytical step that encompasses all procedures, starting with the formulation of the medical question, and including patient preparation, sample collection, handling, transportation, processing and storage until the time of analysis. This step, based on a variety of manual activities, is the most vulnerable part of the total testing process; it is a major determinant of the reliability and validity of results in haemostasis and constitutes the most important source of erroneous or uninterpretable results. Pre-analytical errors may occur throughout the testing process and arise from unsuitable, inappropriate or wrongly handled procedures. Problems may arise during the collection of blood specimens, such as misidentification of the sample, use of inadequate devices or needles, incorrect order of draw, prolonged tourniquet placement, unsuccessful attempts to locate the vein, incorrect use of additive tubes, collection of samples unsuitable in quality or quantity, or inappropriate mixing of a sample. Some factors can alter the result of a sample constituent after collection, during transportation, preparation and storage. Laboratory errors can often have serious adverse consequences, and the lack of standardized procedures for sample collection accounts for most of the errors encountered within the total testing process. These errors can also have clinical consequences and a significant impact on patient care, especially for specialized tests, which are often considered "diagnostic". Controlling pre-analytical variables is critical, since it has a direct influence on the quality of results and on their clinical reliability.
    Accurate standardization of the pre-analytical phase is of pivotal importance for achieving reliable coagulation test results and should reduce the effects of influencing factors. This review summarizes the most important recommendations regarding pre-analytical factors for coagulation testing and should serve as a tool to increase awareness of their importance.

  14. Practical solution for control of the pre-analytical phase in decentralized clinical laboratories for meeting the requirements of the medical laboratory accreditation standard DIN EN ISO 15189.

    PubMed

    Vacata, Vladimir; Jahns-Streubel, Gerlinde; Baldus, Mirjana; Wood, William Graham

    2007-01-01

    This report was written in response to the article by Wood published recently in this journal. It describes a practical solution to the problems of controlling the pre-analytical phase in the clinical diagnostic laboratory. As an indicator of quality in the pre-analytical phase of sample processing, a target analyte was chosen which is sensitive to delay in centrifugation and/or analysis. The results of analyses of the samples sent by satellite medical practitioners were compared with those from an on-site hospital laboratory with a controllable optimized pre-analytical phase. The aim of the comparison was: (a) to identify those medical practices whose mean/median sample values significantly deviate from those of the control situation in the hospital laboratory due to the possible problems in the pre-analytical phase; (b) to aid these laboratories in the process of rectifying these problems. A Microsoft Excel-based Pre-Analytical Survey tool (PAS tool) has been developed which addresses the above mentioned problems. It has been tested on serum potassium which is known to be sensitive to delay and/or irregularities in sample treatment. The PAS tool has been shown to be one possibility for improving the quality of the analyses by identifying the sources of problems within the pre-analytical phase, thus allowing them to be rectified. Additionally, the PAS tool has an educational value and can also be adopted for use in other decentralized laboratories.
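The comparison logic the PAS tool applies can be sketched as follows (a minimal, hypothetical reimplementation in Python rather than Excel; practice names, tolerance and potassium values are illustrative, not from the report):

```python
from statistics import median

def flag_deviating_practices(values_by_practice: dict, hospital_median: float,
                             tolerance: float) -> list:
    """Return the practices whose median analyte value deviates from the
    on-site hospital median by more than the chosen tolerance."""
    return [name for name, values in sorted(values_by_practice.items())
            if abs(median(values) - hospital_median) > tolerance]

# Hypothetical serum potassium values (mmol/L); delayed centrifugation
# typically raises measured potassium.
samples = {
    "practice_A": [4.1, 4.3, 4.2, 4.4],
    "practice_B": [5.0, 5.2, 4.9, 5.1],  # suspiciously high median
}
flagged = flag_deviating_practices(samples, hospital_median=4.2, tolerance=0.4)
# → ["practice_B"]
```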

  15. Towards a full integration of optimization and validation phases: An analytical-quality-by-design approach.

    PubMed

    Hubert, C; Houari, S; Rozet, E; Lebrun, P; Hubert, Ph

    2015-05-22

    When using an analytical method, defining an analytical target profile (ATP) focused on quantitative performance represents a key input, and this will drive the method development process. In this context, two case studies were selected in order to demonstrate the potential of a quality-by-design (QbD) strategy when applied to two specific phases of the method lifecycle: the pre-validation study and the validation step. The first case study focused on the improvement of a liquid chromatography (LC) coupled to mass spectrometry (MS) stability-indicating method by the means of the QbD concept. The design of experiments (DoE) conducted during the optimization step (i.e. determination of the qualitative design space (DS)) was performed a posteriori. Additional experiments were performed in order to simultaneously conduct the pre-validation study to assist in defining the DoE to be conducted during the formal validation step. This predicted protocol was compared to the one used during the formal validation. A second case study based on the LC/MS-MS determination of glucosamine and galactosamine in human plasma was considered in order to illustrate an innovative strategy allowing the QbD methodology to be incorporated during the validation phase. An operational space, defined by the qualitative DS, was considered during the validation process rather than a specific set of working conditions as conventionally performed. Results of all the validation parameters conventionally studied were compared to those obtained with this innovative approach for glucosamine and galactosamine. Using this strategy, qualitative and quantitative information were obtained. Consequently, an analyst using this approach would be able to select with great confidence several working conditions within the operational space rather than a given condition for the routine use of the method. This innovative strategy combines both a learning process and a thorough assessment of the risk involved. 

  16. Impact of Educational Activities in Reducing Pre-Analytical Laboratory Errors

    PubMed Central

    Al-Ghaithi, Hamed; Pathare, Anil; Al-Mamari, Sahimah; Villacrucis, Rodrigo; Fawaz, Naglaa; Alkindi, Salam

    2017-01-01

    Objectives Pre-analytic errors during diagnostic laboratory investigations can lead to increased patient morbidity and mortality. This study aimed to ascertain the effect of educational nursing activities on the incidence of pre-analytical errors resulting in non-conforming blood samples. Methods This study was conducted between January 2008 and December 2015. All specimens received at the Haematology Laboratory of the Sultan Qaboos University Hospital, Muscat, Oman, during this period were prospectively collected and analysed. Similar data from 2007 were collected retrospectively and used as a baseline for comparison. Non-conforming samples were defined as either clotted samples, haemolysed samples, use of the wrong anticoagulant, insufficient quantities of blood collected, incorrect/lack of labelling on a sample or lack of delivery of a sample in spite of a sample request. From 2008 onwards, multiple educational training activities directed at the hospital nursing staff and nursing students primarily responsible for blood collection were implemented on a regular basis. Results After initiating corrective measures in 2008, a progressive reduction in the percentage of non-conforming samples was observed from 2009 onwards. Despite a 127.84% increase in the total number of specimens received, there was a significant reduction in non-conforming samples from 0.29% in 2007 to 0.07% in 2015, resulting in an improvement of 75.86% (P <0.050). In particular, specimen identification errors decreased by 0.056%, with a 96.55% improvement. Conclusion Targeted educational activities directed primarily towards hospital nursing staff had a positive impact on the quality of laboratory specimens by significantly reducing pre-analytical errors. PMID:29062553

  17. Errors in clinical laboratories or errors in laboratory medicine?

    PubMed

    Plebani, Mario

    2006-01-01

    Laboratory testing is a highly complex process and, although laboratory services are relatively safe, they are not as safe as they could or should be. Clinical laboratories have long focused their attention on quality control methods and quality assessment programs dealing with analytical aspects of testing. However, a growing body of evidence accumulated in recent decades demonstrates that quality in clinical laboratories cannot be assured by merely focusing on purely analytical aspects. The more recent surveys on errors in laboratory medicine conclude that in the delivery of laboratory testing, mistakes occur more frequently before (pre-analytical) and after (post-analytical) the test has been performed. Most errors are due to pre-analytical factors (46-68.2% of total errors), while a high error rate (18.5-47% of total errors) has also been found in the post-analytical phase. Errors due to analytical problems have been significantly reduced over time, but there is evidence that, particularly for immunoassays, interference may have a serious impact on patients. A description of the most frequent and risky pre-, intra- and post-analytical errors and advice on practical steps for measuring and reducing the risk of errors is therefore given in the present paper. Many mistakes in the Total Testing Process are called "laboratory errors", although these may be due to poor communication, action taken by others involved in the testing process (e.g., physicians, nurses and phlebotomists), or poorly designed processes, all of which are beyond the laboratory's control. Likewise, there is evidence that laboratory information is only partially utilized. A recent document from the International Organization for Standardization (ISO) recommends a new, broader definition of the term "laboratory error" and a classification of errors according to different criteria. 
In a modern approach to total quality, centered on patients' needs and satisfaction, the risk of errors and mistakes in pre- and post-examination steps must be minimized to guarantee the total quality of laboratory services.

  18. Exhaled breath condensate – from an analytical point of view

    PubMed Central

    Dodig, Slavica; Čepelak, Ivana

    2013-01-01

Over the past three decades, the goal of many researchers has been the analysis of exhaled breath condensate (EBC) as a noninvasively obtained sample. Total quality in the laboratory diagnostic process for EBC analysis was investigated across the pre-analytical (formation, collection and storage of EBC), analytical (sensitivity of applied methods, standardization) and post-analytical (interpretation of results) phases. EBC analysis is still used as a research tool. The limitations in the pre-analytical, analytical and post-analytical phases of EBC analysis are numerous, e.g. low concentrations of EBC constituents, single-analyte methods lacking sensitivity, multi-analyte methods not yet fully explored, and reference values not established. When all pre-analytical, analytical and post-analytical requirements are met, EBC biomarkers as well as biomarker patterns can be selected, and EBC analysis can hopefully be used in clinical practice, in both the diagnosis and the longitudinal follow-up of patients, resulting in better disease outcomes. PMID:24266297

  19. [Pre-analytical stage for biomarker assessment in breast cancer: 2014 update of the GEFPICS' guidelines in France].

    PubMed

    MacGrogan, Gaëtan; Mathieu, Marie-Christine; Poulet, Bruno; Penault-Llorca, Frédérique; Vincent-Salomon, Anne; Roger, Pascal; Treilleux, Isabelle; Valent, Alexander; Antoine, Martine; Becette, Véronique; Bor, Catherine; Brabencova, Eva; Charafe-Jauffret, Emmanuelle; Chenard, Marie-Pierre; Dauplat, Marie-Mélanie; Delrée, Paul; Devouassoux, Mojgan; Fiche, Maryse; Fondrevelle, Marie-Eve; Fridman, Viviana; Garbar, Christian; Genin, Pascal; Ghnassia, Jean-Pierre; Haudebourg, Juliette; Laberge-Le Couteulx, Sophie; Loussouarn, Delphine; Maran-Gonzalez, Aurélie; Marcy, Myriam; Michenet, Patrick; Sagan, Christine; Trassard, Martine; Verriele, Véronique; Arnould, Laurent; Lacroix-Triki, Magali

    2014-10-01

Biomarker assessment of breast cancer tumor samples is part of the routine workflow of pathology laboratories. International guidelines have recently been updated, with special regard to the pre-analytical steps that are critical for the quality of immunohistochemical and in situ hybridization procedures, whatever the biomarker analyzed. Fixation and specimen handling protocols must be standardized, validated and carefully tracked. Cooperation and training of the personnel involved in the specimen workflow (e.g. radiologists, surgeons, nurses, technicians and pathologists) are of paramount importance. The GEFPICS' update of the recommendations herein details and comments on the different steps of the pre-analytical process. Application of these guidelines and participation in quality assurance programs are mandatory to ensure the correct evaluation of oncotheranostic biomarkers. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  20. 7 CFR 90.2 - General terms defined.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... agency, or other agency, organization or person that defines in the general terms the basis on which the... analytical data using proficiency check sample or analyte recovery techniques. In addition, the certainty.... Quality control. The system of close examination of the critical details of an analytical procedure in...

  1. Assuring the Quality of Test Results in the Field of Nuclear Techniques and Ionizing Radiation. The Practical Implementation of Section 5.9 of the EN ISO/IEC 17025 Standard

    NASA Astrophysics Data System (ADS)

    Cucu, Daniela; Woods, Mike

    2008-08-01

The paper aims to present a practical approach for testing laboratories to ensure the quality of their test results. It is based on the experience gained in assessing a large number of testing laboratories, discussing with management and staff, reviewing results obtained in national and international PTs and ILCs, and exchanging information in the EA laboratory committee. According to EN ISO/IEC 17025, an accredited laboratory has to implement a programme to ensure the quality of its test results for each measurand. Pre-analytical, analytical and post-analytical measures shall be applied in a systematic manner, and shall include both quality control and quality assurance measures. When designing the quality assurance programme, a laboratory should consider pre-analytical activities (such as personnel training, selection and validation of test methods, and qualifying equipment), analytical activities (sampling, sample preparation and instrumental analysis) and post-analytical activities (such as decoding, calculation, use of statistical tests or packages, and management of results). Designed at different levels (analyst, quality manager and technical manager) and including a variety of measures, the programme shall ensure the validity and accuracy of test results and the adequacy of the management system, prove the laboratory's competence in performing tests under accreditation and, last but not least, show the comparability of test results. Laboratory management should establish performance targets and periodically review QC/QA results against them, implementing appropriate measures in case of non-compliance.

  2. Current projects in Pre-analytics: where to go?

    PubMed

    Sapino, Anna; Annaratone, Laura; Marchiò, Caterina

    2015-01-01

The current clinical practice of tissue handling and sample preparation is multifaceted and lacks strict standardisation: this scenario leads to significant variability in the quality of clinical samples. Poor tissue preservation has a detrimental effect, leading to morphological artefacts, hampering the reproducibility of immunocytochemical and molecular diagnostic results (protein expression, DNA gene mutations, RNA gene expression) and affecting research outcomes with irreproducible gene expression and post-transcriptional data. Altogether, this limits the opportunity to share and pool national databases into European common databases. At the European level, standardisation of pre-analytical steps is just beginning, and issues regarding bio-specimen collection and management are still debated. A joint (public-private) project on the standardisation of tissue handling in pre-analytical procedures has recently been funded in Italy, with the aim of proposing novel approaches to the neglected issue of pre-analytical procedures. In this chapter, we will show how investing in pre-analytics may impact both public health problems and practical innovation in solid tumour processing.

  3. [Pre-analytical quality in fluid samples cytopathology: Results of a survey from the French Society of Clinical Cytology].

    PubMed

    Courtade-Saïdi, Monique; Fleury Feith, Jocelyne

    2015-10-01

The pre-analytical step includes sample collection, preparation, transportation and storage in the pathology unit where the diagnosis is performed. The pathologist ensures that pre-analytical conditions are in line with expectations. The lack of standardisation in the handling of cytological samples makes this pre-analytical step difficult to harmonise. Moreover, this step depends on the nature of the sample: fresh liquid or fixed material, air-dried smears, or liquid-based cytology. The aim of the study was to review the practices of French pathology structures regarding the pre-analytical phase for cytological fluids such as broncho-alveolar lavage fluid (BALF), serous fluids and urine. A survey based on the pre-analytical chapter of ISO 15189 was sent to 191 French pathology structures (105 public and 86 private). Fifty-six laboratories replied to the survey. Ninety-five per cent have a computerised management system and 70% a manual on sample handling. The general instructions requested for patient and sample identification were correctly completed in most cases, with short routing times and prescription of additional tests. By contrast, practices varied concerning the clinical information requested, the type of tubes used for collecting fluids, the volumes required, and the actions taken in case of non-conformity. For the specific items concerning BALF, serous fluids and urine, the survey showed great heterogeneity in sample collection, fixation and clinical information. This survey demonstrates that the pre-analytical quality for BALF, serous fluids and urine is not optimal, and that corrections of practices are recommended, with standardisation of numerous steps, in order to increase the reproducibility of additional tests such as immunocytochemistry, cytogenetics and molecular biology. Some recommendations have been written. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  4. Tissue is alive: New technologies are needed to address the problems of protein biomarker pre-analytical variability.

    PubMed

    Espina, Virginia; Mueller, Claudius; Edmiston, Kirsten; Sciro, Manuela; Petricoin, Emanuel F; Liotta, Lance A

    2009-08-01

Instability of tissue protein biomarkers is a critical issue for molecular profiling. Pre-analytical variables during tissue procurement, such as time delays during which the tissue remains stored at room temperature, can cause significant variability and bias in downstream molecular analysis. Living tissue, ex vivo, goes through a defined stage of reactive changes that begin with oxidative, hypoxic and metabolic stress, and culminate in apoptosis. Depending on the ex vivo delay time and reactive stage, protein biomarkers such as signal pathway phosphoproteins will be elevated or suppressed in a manner which does not represent the biomarker levels at the time of excision. Proteomic data documenting reactive tissue protein changes post collection indicate the need to recognize and address tissue stability, preservation of post-translational modifications, and preservation of morphologic features for molecular analysis. Based on the analysis of phosphoproteins, among the most labile tissue protein biomarkers, we set forth tissue procurement guidelines for clinical research. We propose technical solutions for (i) assessing the state of protein analyte preservation and specimen quality via identification of a panel of natural proteins (surrogate stability markers), and (ii) using a multi-purpose fixative solution designed to stabilize, preserve and maintain proteins, nucleic acids, and tissue architecture.

  5. Tissue is alive: New technologies are needed to address the problems of protein biomarker pre-analytical variability

    PubMed Central

    Espina, Virginia; Mueller, Claudius; Edmiston, Kirsten; Sciro, Manuela; Petricoin, Emanuel F.; Liotta, Lance A.

    2010-01-01

Instability of tissue protein biomarkers is a critical issue for molecular profiling. Pre-analytical variables during tissue procurement, such as time delays during which the tissue remains stored at room temperature, can cause significant variability and bias in downstream molecular analysis. Living tissue, ex vivo, goes through a defined stage of reactive changes that begin with oxidative, hypoxic and metabolic stress, and culminate in apoptosis. Depending on the ex vivo delay time and reactive stage, protein biomarkers such as signal pathway phosphoproteins will be elevated or suppressed in a manner which does not represent the biomarker levels at the time of excision. Proteomic data documenting reactive tissue protein changes post collection indicate the need to recognize and address tissue stability, preservation of post-translational modifications, and preservation of morphologic features for molecular analysis. Based on the analysis of phosphoproteins, among the most labile tissue protein biomarkers, we set forth tissue procurement guidelines for clinical research. We propose technical solutions for (i) assessing the state of protein analyte preservation and specimen quality via identification of a panel of natural proteins (surrogate stability markers), and (ii) using a multi-purpose fixative solution designed to stabilize, preserve and maintain proteins, nucleic acids, and tissue architecture. PMID:20871745

  6. Importance of implementing an analytical quality control system in a core laboratory.

    PubMed

    Marques-Garcia, F; Garcia-Codesal, M F; Caro-Narros, M R; Contreras-SanFeliciano, T

    2015-01-01

The aim of the clinical laboratory is to provide useful information for the screening, diagnosis and monitoring of disease. The laboratory should ensure the quality of the extra-analytical and analytical processes, based on set criteria. To do this, it develops and implements a system of internal quality control designed to detect errors, and compares its data with those of other laboratories through external quality control. In this way it has a tool to verify fulfilment of the objectives set and, in case of errors, to take corrective actions and ensure the reliability of the results. This article describes the design and implementation of an internal quality control protocol, as well as its periodic assessment at 6-month intervals to determine compliance with pre-determined specifications (Stockholm Consensus(1)). A total of 40 biochemical and 15 immunochemical methods were evaluated using three different control materials. Next, a standard operating procedure was planned to develop a system of internal quality control that included calculating the error of the analytical process, setting quality specifications, and verifying compliance. The quality control data were then statistically expressed as means, standard deviations and coefficients of variation, as well as systematic, random and total errors. The quality specifications were then fixed and the operational rules to apply in the analytical process were calculated. Finally, our data were compared with those of other laboratories through an external quality assurance program. The development of an analytical quality control system is a highly structured process. It should be designed to detect errors that compromise the stability of the analytical process. The laboratory should review its quality indicators (systematic, random and total error) at regular intervals, in order to ensure that they meet pre-determined specifications and, if not, apply the appropriate corrective actions. 
Copyright © 2015 SECA. Published by Elsevier Espana. All rights reserved.
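The statistics listed in this abstract (mean, SD, CV, systematic, random and total error) can be sketched for one control material as follows. This is an illustration, not the authors' protocol; the function name and the control values are hypothetical, and the total-error rule TE = |bias| + 1.65 × CV is one common convention:

```python
import statistics

def qc_summary(results, target):
    """Summarise internal QC results for one control material.

    `results` are repeated measurements of a control with assigned value
    `target`. Returns mean, SD, CV% (random error), bias% (systematic
    error) and total error%, using the common TE = |bias| + 1.65*CV rule.
    """
    mean = statistics.mean(results)
    sd = statistics.stdev(results)
    cv = sd / mean * 100                   # random error (imprecision, %)
    bias = (mean - target) / target * 100  # systematic error (%)
    total_error = abs(bias) + 1.65 * cv
    return {"mean": mean, "sd": sd, "cv": cv,
            "bias": bias, "total_error": total_error}

# Hypothetical glucose control with an assigned value of 5.0 mmol/L
summary = qc_summary([5.1, 4.9, 5.0, 5.2, 4.8, 5.0], target=5.0)
```

The total error is then compared against the chosen quality specification (e.g. a Stockholm-consensus-derived allowable total error) to decide whether corrective action is needed.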

  7. Quality Indicators in Laboratory Medicine: from theory to practice. Preliminary data from the IFCC Working Group Project "Laboratory Errors and Patient Safety".

    PubMed

    Sciacovelli, Laura; O'Kane, Maurice; Skaik, Younis Abdelwahab; Caciagli, Patrizio; Pellegrini, Cristina; Da Rin, Giorgio; Ivanov, Agnes; Ghys, Timothy; Plebani, Mario

    2011-05-01

The adoption of Quality Indicators (QIs) has prompted the development of tools to measure and evaluate the quality and effectiveness of laboratory testing, first in the hospital setting and subsequently in ambulatory and other care settings. While Laboratory Medicine has an important role in the delivery of high-quality care, no consensus exists as yet on the use of QIs focussing on all steps of the laboratory total testing process (TTP), and further research in this area is required. In order to reduce errors in laboratory testing, the IFCC Working Group on "Laboratory Errors and Patient Safety" (WG-LEPS) developed a series of Quality Indicators specifically designed for clinical laboratories. In the first phase of the project, specific QIs for key processes of the TTP were identified, including all the pre-, intra- and post-analytic steps. The overall aim of the project is to create a common reporting system for clinical laboratories based on standardized data collection, and to define state-of-the-art and Quality Specifications (QSs) for each QI independent of: a) the size of the organization and type of activities; b) the complexity of the processes undertaken; and c) the different degrees of knowledge and ability of the staff. The aim of the present paper is to report the results collected from participating laboratories from February 2008 to December 2009 and to identify preliminary QSs. The results demonstrate that a Model of Quality Indicators managed as an External Quality Assurance Program can serve as a tool to monitor and control the pre-, intra- and post-analytical activities. It might also allow clinical laboratories to identify risks that lead to errors resulting in patient harm: identification and design of practices that eliminate medical errors; the sharing of information and education of clinical and laboratory teams on practices that reduce or prevent errors; the monitoring and evaluation of improvement activities.

  8. Performance criteria and quality indicators for the post-analytical phase.

    PubMed

    Sciacovelli, Laura; Aita, Ada; Padoan, Andrea; Pelloso, Michela; Antonelli, Giorgia; Piva, Elisa; Chiozza, Maria Laura; Plebani, Mario

    2016-07-01

Quality indicators (QIs) used as performance measurements are an effective tool in accurately estimating quality, identifying problems that may need to be addressed, and monitoring the processes over time. In Laboratory Medicine, QIs should cover all steps of the testing process, as error studies have confirmed that most errors occur in the pre- and post-analytical phases of testing. The aim of the present study is to provide preliminary results on QIs and related performance criteria in the post-analytical phase. This work was conducted according to a previously described study design based on the voluntary participation of clinical laboratories in the project on QIs of the Working Group "Laboratory Errors and Patient Safety" (WG-LEPS) of the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC). Overall, the data collected highlighted an improvement or stability in performance over time for all reported indicators, thus demonstrating that the use of QIs is effective in the quality improvement strategy. Moreover, QI data are an important source for defining the state of the art concerning the error rate in the total testing process. The definition of performance specifications based on the state of the art, as suggested by consensus documents, is a valuable benchmark in evaluating the performance of each laboratory. Laboratory tests play a relevant role in the monitoring and evaluation of patient outcomes, thus assisting clinicians in decision-making. Laboratory performance evaluation is therefore crucial to providing patients with safe, effective and efficient care.
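Performance specifications "based on the state of the art" are typically derived from the distribution of error rates reported by participating laboratories. A minimal sketch of one such convention (the quartile cut-offs, function name and rates below are illustrative assumptions, not the WG-LEPS definitions):

```python
import statistics

def performance_specs(error_rates):
    """Derive illustrative performance levels for one quality indicator
    from participating laboratories' error rates (%):
    optimum <= 25th percentile, minimum >= 75th percentile."""
    q1, q2, q3 = statistics.quantiles(error_rates, n=4)  # quartiles
    return {"optimum": q1, "desirable": q2, "minimum": q3}

# Hypothetical error rates (%) reported by eight laboratories
rates = [0.05, 0.10, 0.12, 0.20, 0.25, 0.30, 0.45, 0.60]
specs = performance_specs(rates)
```

A laboratory would then benchmark its own indicator value against these cut-offs to classify its performance.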

  9. Evaluation of analytical errors in a clinical chemistry laboratory: a 3 year experience.

    PubMed

    Sakyi, As; Laing, Ef; Ephraim, Rk; Asibey, Of; Sadique, Ok

    2015-01-01

Proficient laboratory service is the cornerstone of modern healthcare systems and has an impact on over 70% of medical decisions on admission, discharge, and medications. In recent years, there has been an increasing awareness of the importance of errors in laboratory practice and their possible negative impact on patient outcomes. We retrospectively analyzed data spanning a period of 3 years on analytical errors observed in our laboratory. The data covered errors over the whole testing cycle, including the pre-, intra-, and post-analytical phases, and we discuss strategies pertinent to our setting to minimize their occurrence. We describe the occurrence of pre-analytical, analytical and post-analytical errors observed at the Komfo Anokye Teaching Hospital clinical biochemistry laboratory during a 3-year period from January 2010 to December 2012. Data were analyzed with GraphPad Prism 5 (GraphPad Software Inc., CA, USA). A total of 589,510 tests were performed on 188,503 outpatients and hospitalized patients. The overall error rate for the 3 years was 4.7% (27,520/58,950). Pre-analytical, analytical and post-analytical errors contributed 3.7% (2210/58,950), 0.1% (108/58,950), and 0.9% (512/58,950), respectively. The number of tests decreased significantly over the 3-year period, but this did not correspond with a reduction in the overall error rate (P = 0.90) over the years. Analytical errors are embedded within our total process setup, especially the pre-analytical and post-analytical phases. Strategic measures, including quality assessment programs for staff involved in pre-analytical processes, should be intensified.
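The per-phase breakdown above is a simple ratio of error counts to test requests. A sketch with hypothetical counts (deliberately not the Komfo Anokye figures, whose denominators are reported differently in the abstract):

```python
def phase_error_rates(pre, analytical, post, total_requests):
    """Per-phase and overall error rates (%) over the testing cycle."""
    phases = {"pre-analytical": pre, "analytical": analytical,
              "post-analytical": post}
    rates = {name: count / total_requests * 100
             for name, count in phases.items()}
    rates["overall"] = sum(phases.values()) / total_requests * 100
    return rates

# Hypothetical counts: 470 errors in 10,000 requests -> 4.7% overall
rates = phase_error_rates(pre=370, analytical=10, post=90,
                          total_requests=10_000)
```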

  10. Dried blood spot specimen quality and validation of a new pre-analytical processing method for qualitative HIV-1 PCR, KwaZulu-Natal, South Africa

    PubMed Central

    Parboosing, Raveen; Siyaca, Ntombizandile; Moodley, Pravikrishnen

    2016-01-01

    Background Poor quality dried blood spot (DBS) specimens are usually rejected by virology laboratories, affecting early infant diagnosis of HIV. The practice of combining two incompletely-filled DBS in one specimen preparation tube during pre-analytical specimen processing (i.e., the two-spot method) has been implemented to reduce the number of specimens being rejected for insufficient volume. Objectives This study analysed laboratory data to describe the quality of DBS specimens and the use of the two-spot method over a one-year period, then validated the two-spot method against the standard (one-spot) method. Methods Data on HIV-1 PCR test requests submitted in 2014 to the Department of Virology at Inkosi Albert Luthuli Central Hospital in KwaZulu-Natal province, South Africa were analysed to describe reasons for specimen rejection, as well as results of the two-spot method. The accuracy, lower limit of detection and precision of the two-spot method were assessed. Results Of the 88 481 specimens received, 3.7% were rejected for pre-analytical problems. Of those, 48.9% were rejected as a result of insufficient specimen volume. Two health facilities had significantly more specimen rejections than other facilities. The two-spot method prevented 10 504 specimen rejections. The Pearson correlation coefficient comparing the standard to the two-spot method was 0.997. Conclusions The two-spot method was comparable with the standard method of pre-analytical specimen processing. Two health facilities were identified for targeted retraining on specimen quality. The two-spot method of DBS specimen processing can be used as an adjunct to retraining, to reduce the number of specimens rejected and improve linkage to care. PMID:28879108

  11. Dried blood spot specimen quality and validation of a new pre-analytical processing method for qualitative HIV-1 PCR, KwaZulu-Natal, South Africa.

    PubMed

    Govender, Kerusha; Parboosing, Raveen; Siyaca, Ntombizandile; Moodley, Pravikrishnen

    2016-01-01

    Poor quality dried blood spot (DBS) specimens are usually rejected by virology laboratories, affecting early infant diagnosis of HIV. The practice of combining two incompletely-filled DBS in one specimen preparation tube during pre-analytical specimen processing (i.e., the two-spot method) has been implemented to reduce the number of specimens being rejected for insufficient volume. This study analysed laboratory data to describe the quality of DBS specimens and the use of the two-spot method over a one-year period, then validated the two-spot method against the standard (one-spot) method. Data on HIV-1 PCR test requests submitted in 2014 to the Department of Virology at Inkosi Albert Luthuli Central Hospital in KwaZulu-Natal province, South Africa were analysed to describe reasons for specimen rejection, as well as results of the two-spot method. The accuracy, lower limit of detection and precision of the two-spot method were assessed. Of the 88 481 specimens received, 3.7% were rejected for pre-analytical problems. Of those, 48.9% were rejected as a result of insufficient specimen volume. Two health facilities had significantly more specimen rejections than other facilities. The two-spot method prevented 10 504 specimen rejections. The Pearson correlation coefficient comparing the standard to the two-spot method was 0.997. The two-spot method was comparable with the standard method of pre-analytical specimen processing. Two health facilities were identified for targeted retraining on specimen quality. The two-spot method of DBS specimen processing can be used as an adjunct to retraining, to reduce the number of specimens rejected and improve linkage to care.
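The validation above compares paired results from the two processing methods via the Pearson correlation coefficient. A self-contained sketch (the paired values are hypothetical, invented only to illustrate the computation):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between paired results from
    two pre-analytical processing methods."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical paired assay values: one-spot vs. two-spot preparation
one_spot = [24.1, 27.3, 30.2, 33.5, 36.0]
two_spot = [24.3, 27.1, 30.4, 33.3, 36.2]
r = pearson_r(one_spot, two_spot)  # close to 1 for comparable methods
```

A full method validation would also examine accuracy, precision and the lower limit of detection, as the study did; correlation alone does not establish agreement.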

  12. Evaluation of Analytical Errors in a Clinical Chemistry Laboratory: A 3 Year Experience

    PubMed Central

    Sakyi, AS; Laing, EF; Ephraim, RK; Asibey, OF; Sadique, OK

    2015-01-01

Background: Proficient laboratory service is the cornerstone of modern healthcare systems and has an impact on over 70% of medical decisions on admission, discharge, and medications. In recent years, there has been an increasing awareness of the importance of errors in laboratory practice and their possible negative impact on patient outcomes. Aim: We retrospectively analyzed data spanning a period of 3 years on analytical errors observed in our laboratory. The data covered errors over the whole testing cycle, including the pre-, intra-, and post-analytical phases, and we discuss strategies pertinent to our setting to minimize their occurrence. Materials and Methods: We describe the occurrence of pre-analytical, analytical and post-analytical errors observed at the Komfo Anokye Teaching Hospital clinical biochemistry laboratory during a 3-year period from January 2010 to December 2012. Data were analyzed with GraphPad Prism 5 (GraphPad Software Inc., CA, USA). Results: A total of 589,510 tests were performed on 188,503 outpatients and hospitalized patients. The overall error rate for the 3 years was 4.7% (27,520/58,950). Pre-analytical, analytical and post-analytical errors contributed 3.7% (2210/58,950), 0.1% (108/58,950), and 0.9% (512/58,950), respectively. The number of tests decreased significantly over the 3-year period, but this did not correspond with a reduction in the overall error rate (P = 0.90) over the years. Conclusion: Analytical errors are embedded within our total process setup, especially the pre-analytical and post-analytical phases. Strategic measures, including quality assessment programs for staff involved in pre-analytical processes, should be intensified. PMID:25745569

  13. Standard operating procedures for pre-analytical handling of blood and urine for metabolomic studies and biobanks.

    PubMed

    Bernini, Patrizia; Bertini, Ivano; Luchinat, Claudio; Nincheri, Paola; Staderini, Samuele; Turano, Paola

    2011-04-01

¹H NMR metabolic profiling of urine, serum and plasma has been used to monitor the impact of the pre-analytical steps on sample quality and stability, in order to propose standard operating procedures (SOPs) for deposition in biobanks. We analyzed the quality of serum and plasma samples as a function of the time elapsed (t = 0-4 h) between blood collection and processing, and of the time from processing to freezing (up to 24 h). The stability of the urine metabolic profile over time (up to 24 h) at various storage temperatures was monitored as a function of different pre-analytical treatments, such as pre-storage centrifugation, filtration, and addition of the bacteriostatic preservative sodium azide. Appreciable changes in the profiles, reflecting changes in the concentration of a number of metabolites, were detected and discussed in terms of chemical and enzymatic reactions for both blood and urine samples. Appropriate procedures for the collection of blood derivatives and for urine preservation/storage, which preserve the original metabolic profile of the fresh samples as far as possible, emerge from these data and are proposed as SOPs for biobanking.

  14. [Failure mode and effects analysis for the detection of errors in the transport of samples to the clinical laboratory].

    PubMed

    Parés-Pollán, L; Gonzalez-Quintana, A; Docampo-Cordeiro, J; Vargas-Gallego, C; García-Álvarez, G; Ramos-Rodríguez, V; Diaz Rubio-García, M P

    2014-01-01

Owing to the decrease in the values of the biochemical glucose parameter in some samples from external collection centres, and the risk this implies for patient safety, it was decided to apply an adaptation of the «Health Services Failure Mode and Effects Analysis» (HFMEA) to manage risk during the pre-analytical phase of sample transportation from external centres to clinical laboratories. A retrospective study of the glucose parameter was conducted over two consecutive months. The analysis was performed in its different phases: define the HFMEA topic, assemble the team, graphically describe the process, conduct a hazard analysis, design the intervention and indicators, and identify a person responsible for ensuring completion of each action. The results of the glucose parameter on one of the transport routes were significantly lower (P=.006). The errors and potential causes of this problem were analysed, and criteria of criticality and detectability were applied (score ≥ 8) in the decision tree. It was decided to: develop a document management system; reorganise collections and transport routes in some centres; and introduce quality control of the sample container ice-packs and of the time and temperature during transportation. This work proposes quality indicators for controlling the time and temperature of transported samples in the pre-analytical phase. Periodic review of certain laboratory parameters can help to detect problems in sample transport. The HFMEA technique is useful for the clinical laboratory. Copyright © 2013 SECA. Published by Elsevier Espana. All rights reserved.
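In HFMEA, each failure mode's hazard score is the product of its severity and probability ratings, and modes at or above a cut-off (the abstract's "score ≥ 8") proceed to the decision tree. A minimal sketch; the failure modes and ratings below are hypothetical examples for sample transport, not the study's actual analysis:

```python
def hazard_scores(failure_modes, threshold=8):
    """HFMEA-style hazard analysis: score = severity * probability
    (each typically rated 1-4). Modes scoring at or above `threshold`
    go on to the decision tree (criticality, detectability, controls)."""
    return {name: sev * prob
            for name, (sev, prob) in failure_modes.items()
            if sev * prob >= threshold}

# Hypothetical failure modes: (severity, probability)
modes = {
    "delayed transport, no refrigeration": (3, 3),
    "ice-pack not frozen": (2, 3),
    "tube mislabelled at collection": (4, 2),
    "transport log not filled in": (2, 2),
}
critical = hazard_scores(modes)  # only the modes scoring >= 8 remain
```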

  15. Recommendations for accreditation of laboratories in molecular biology of hematologic malignancies.

    PubMed

    Flandrin-Gresta, Pascale; Cornillet, Pascale; Hayette, Sandrine; Gachard, Nathalie; Tondeur, Sylvie; Mauté, Carole; Cayuela, Jean-Michel

    2015-01-01

    Over recent years, the development of molecular biology techniques has improved the diagnosis and follow-up of hematological diseases. Consequently, these techniques are widely used in the biological screening of these diseases; therefore, hemato-oncology molecular diagnostics laboratories must be actively involved in the accreditation process according to the ISO 15189 standard. The French group of molecular biologists (GBMHM) provides requirements for the implementation of quality assurance in medical molecular laboratories. This guideline states the recommendations for the pre-analytical, analytical (method validation procedures, quality controls, reagents), and post-analytical conditions. In addition, we state a strategy for internal quality control management. These recommendations will be regularly updated.

  16. Managing the Pre- and Post-analytical Phases of the Total Testing Process

    PubMed Central

    2012-01-01

    For many years, the clinical laboratory's focus on analytical quality has resulted in an error rate of 4-5 sigma, which surpasses most other areas in healthcare. However, greater appreciation of the prevalence of errors in the pre- and post-analytical phases and their potential for patient harm has led to increasing requirements for laboratories to take greater responsibility for activities outside their immediate control. Accreditation bodies such as the Joint Commission International (JCI) and the College of American Pathologists (CAP) now require clear and effective procedures for patient/sample identification and communication of critical results. There are a variety of free on-line resources available to aid in managing the extra-analytical phase and the recent publication of quality indicators and proposed performance levels by the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) working group on laboratory errors and patient safety provides particularly useful benchmarking data. Managing the extra-laboratory phase of the total testing cycle is the next challenge for laboratory medicine. By building on its existing quality management expertise, quantitative scientific background and familiarity with information technology, the clinical laboratory is well suited to play a greater role in reducing errors and improving patient safety outside the confines of the laboratory. PMID:22259773

  17. A Systematic Evaluation of Blood Serum and Plasma Pre-Analytics for Metabolomics Cohort Studies

    PubMed Central

    Jobard, Elodie; Trédan, Olivier; Postoly, Déborah; André, Fabrice; Martin, Anne-Laure; Elena-Herrmann, Bénédicte; Boyault, Sandrine

    2016-01-01

    The recent thriving development of biobanks and associated high-throughput phenotyping studies requires the elaboration of large-scale approaches for monitoring biological sample quality and compliance with standard protocols. We present a metabolomic investigation of human blood samples that delineates pitfalls and guidelines for the collection, storage and handling procedures for serum and plasma. A series of eight pre-processing technical parameters is systematically investigated along variable ranges commonly encountered across clinical studies. While metabolic fingerprints, as assessed by nuclear magnetic resonance, are not significantly affected by altered centrifugation parameters or delays between sample pre-processing (blood centrifugation) and storage, our metabolomic investigation highlights that both the delay and storage temperature between blood draw and centrifugation are the primary parameters impacting serum and plasma metabolic profiles. Storing the blood drawn at 4 °C is shown to be a reliable routine to confine variability associated with idle time prior to sample pre-processing. Based on their fine sensitivity to pre-analytical parameters and protocol variations, metabolic fingerprints could be exploited as valuable ways to determine compliance with standard procedures and quality assessment of blood samples within large multi-omic clinical and translational cohort studies. PMID:27929400

  18. Analytical Quality by Design in pharmaceutical quality assurance: Development of a capillary electrophoresis method for the analysis of zolmitriptan and its impurities.

    PubMed

    Orlandini, Serena; Pasquini, Benedetta; Caprini, Claudia; Del Bubba, Massimo; Pinzauti, Sergio; Furlanetto, Sandra

    2015-11-01

    A fast and selective CE method for the determination of zolmitriptan (ZOL) and its five potential impurities has been developed applying the analytical Quality by Design principles. Voltage, temperature, buffer concentration, and pH were investigated as critical process parameters that can influence the critical quality attributes, represented by critical resolution values between peak pairs, analysis time, and peak efficiency of ZOL-dimer. A symmetric screening matrix was employed for investigating the knowledge space, and a Box-Behnken design was used to evaluate the main, interaction, and quadratic effects of the critical process parameters on the critical quality attributes. Contour plots were drawn highlighting important interactions between buffer concentration and pH, and the gained information was merged into the sweet spot plots. Design space (DS) was established by the combined use of response surface methodology and Monte Carlo simulations, introducing a probability concept and thus allowing the quality of the analytical performances to be assured in a defined domain. The working conditions (with the interval defining the DS) were as follows: BGE, 138 mM (115-150 mM) phosphate buffer pH 2.74 (2.54-2.94); temperature, 25°C (24-25°C); voltage, 30 kV. A control strategy was planned based on method robustness and system suitability criteria. The main advantages of applying the Quality by Design concept consisted of a great increase of knowledge of the analytical system, obtained throughout multivariate techniques, and of the achievement of analytical assurance of quality, derived by probability-based definition of DS. The developed method was finally validated and applied to the analysis of ZOL tablets. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Useful measures and models for analytical quality management in medical laboratories.

    PubMed

    Westgard, James O

    2016-02-01

    The 2014 Milan Conference "Defining analytical performance goals 15 years after the Stockholm Conference" initiated a new discussion of issues concerning goals for precision, trueness or bias, total analytical error (TAE), and measurement uncertainty (MU). Goal-setting models are critical for analytical quality management, along with error models, quality-assessment models, quality-planning models, as well as comprehensive models for quality management systems. There are also critical underlying issues, such as an emphasis on MU to the possible exclusion of TAE and a corresponding preference for separate precision and bias goals instead of a combined total error goal. This opinion recommends careful consideration of the differences in the concepts of accuracy and traceability and the appropriateness of different measures, particularly TAE as a measure of accuracy and MU as a measure of traceability. TAE is essential to manage quality within a medical laboratory and MU and trueness are essential to achieve comparability of results across laboratories. With this perspective, laboratory scientists can better understand the many measures and models needed for analytical quality management and assess their usefulness for practical applications in medical laboratories.

  20. Tensions in Defining Quality Pre-School Education: The Singapore Context

    ERIC Educational Resources Information Center

    Lim-Ratnam, Christina

    2013-01-01

    Over the past decade, the government in Singapore has been introducing many initiatives in the early childhood sector to raise the quality of pre-school education. Educational reforms made without consideration of the perspectives and concerns of the participants in the socio-cultural milieu would only lead to superficial implementation. This…

  1. Valid analytical performance specifications for combined analytical bias and imprecision for the use of common reference intervals.

    PubMed

    Hyltoft Petersen, Per; Lund, Flemming; Fraser, Callum G; Sandberg, Sverre; Sölétormos, György

    2018-01-01

    Background: Many clinical decisions are based on comparison of patient results with reference intervals. Therefore, an estimation of the analytical performance specifications for the quality that would be required to allow sharing common reference intervals is needed. The International Federation of Clinical Chemistry (IFCC) recommended a minimum of 120 reference individuals to establish reference intervals. This number implies a certain level of quality, which could then be used for defining analytical performance specifications as the maximum combination of analytical bias and imprecision required for sharing common reference intervals, the aim of this investigation. Methods: Two methods were investigated for defining the maximum combination of analytical bias and imprecision that would give the same quality of common reference intervals as the IFCC recommendation. Method 1 is based on a formula for the combination of analytical bias and imprecision, and Method 2 is based on the Microsoft Excel formula NORMINV, including the fractional probability of reference individuals outside each limit and the Gaussian variables of mean and standard deviation. The combinations of normalized bias and imprecision are illustrated for both methods. The formulae are identical for Gaussian and log-Gaussian distributions. Results: Method 2 gives the correct results, with a constant percentage of 4.4% for all combinations of bias and imprecision. Conclusion: The Microsoft Excel formula NORMINV is useful for the estimation of analytical performance specifications for both Gaussian and log-Gaussian distributions of reference intervals.
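    The underlying calculation can be sketched in a few lines. The following Python sketch is illustrative only (it is not the authors' code): it computes the fraction of a Gaussian reference population falling outside a common reference interval when normalized analytical bias and imprecision are added. The 1.96 interval limit and the unit biological standard deviation are assumptions made for the example.

```python
from math import erf, sqrt

def norm_cdf(x):
    # Standard normal cumulative distribution via the error function
    # (plays the role of Excel's NORMDIST/NORMINV pair here).
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def fraction_outside(bias, analytical_sd, limit=1.96):
    """Fraction of a unit-variance Gaussian reference population falling
    outside the common interval [-limit, +limit] when results carry a
    normalized analytical bias and an added analytical SD."""
    total_sd = sqrt(1.0 + analytical_sd ** 2)
    low = norm_cdf((-limit - bias) / total_sd)
    high = 1.0 - norm_cdf((limit - bias) / total_sd)
    return low + high

# With no analytical error, ~5% of results fall outside a 95% interval:
print(round(fraction_outside(0.0, 0.0), 3))  # → 0.05
```

    Any combination of bias and imprecision increases the outside fraction above the 5% baseline, which is the quantity the performance specifications constrain.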

  2. Standardization and optimization of fluorescence in situ hybridization (FISH) for HER-2 assessment in breast cancer: A single center experience.

    PubMed

    Bogdanovska-Todorovska, Magdalena; Petrushevska, Gordana; Janevska, Vesna; Spasevska, Liljana; Kostadinova-Kunovska, Slavica

    2018-05-20

    Accurate assessment of human epidermal growth factor receptor 2 (HER-2) is crucial in selecting patients for targeted therapy. Commonly used methods for HER-2 testing are immunohistochemistry (IHC) and fluorescence in situ hybridization (FISH). Here we present the implementation, optimization and standardization of two FISH protocols using breast cancer samples, and assess the impact of pre-analytical and analytical factors on HER-2 testing. Formalin-fixed paraffin-embedded (FFPE) tissue samples from 70 breast cancer patients were tested for HER-2 using the PathVysion™ HER-2 DNA Probe Kit and two different paraffin pretreatment kits: the Vysis/Abbott Paraffin Pretreatment Reagent Kit (40 samples) and the DAKO Histology FISH Accessory Kit (30 samples). The concordance between FISH and IHC results was determined. Pre-analytical and analytical factors (i.e., fixation, baking, digestion, and post-hybridization washing) affected the efficiency and quality of hybridization. The overall hybridization success in our study was 98.6% (69/70); the failure rate was 1.4%. The DAKO pretreatment kit was more time-efficient and resulted in more uniform signals that were easier to interpret than those of the Vysis/Abbott kit. The overall concordance between IHC and FISH was 84.06%, kappa coefficient 0.5976 (p < 0.0001). The greatest discordance (82%) between IHC and FISH was observed in the IHC 2+ group. A standardized FISH protocol for HER-2 assessment, with high hybridization efficiency, is necessary due to variability in tissue processing and individual tissue characteristics. Differences in the pre-analytical and analytical steps can affect hybridization quality and efficiency. The use of the DAKO pretreatment kit is time-saving and cost-effective.

  3. [The taking and transport of biological samples].

    PubMed

    Kerwat, Klaus; Kerwat, Martina; Eberhart, Leopold; Wulf, Hinnerk

    2011-05-01

    The results of microbiological tests are the foundation for a targeted therapy and the basis for monitoring infections. The quality of each and every laboratory finding depends not only on an error-free analytical process. The pre-analytical handling procedures are of particular importance: they encompass all factors and influences prior to the actual analysis. These include the correct time point for sample taking, the packaging, and the rapid transport of the material to be investigated. Errors in pre-analytical processing are the most frequent reasons for inappropriate findings. © Georg Thieme Verlag Stuttgart · New York.

  4. Reverse transcription-polymerase chain reaction molecular testing of cytology specimens: Pre-analytic and analytic factors.

    PubMed

    Bridge, Julia A

    2017-01-01

    The introduction of molecular testing into cytopathology laboratory practice has expanded the types of samples considered feasible for identifying genetic alterations that play an essential role in cancer diagnosis and treatment. Reverse transcription-polymerase chain reaction (RT-PCR), a sensitive and specific technical approach for amplifying a defined segment of RNA after it has been reverse-transcribed into its DNA complement, is commonly used in clinical practice for the identification of recurrent or tumor-specific fusion gene events. Real-time RT-PCR (quantitative RT-PCR), a technical variation, also permits the quantitation of products generated during each cycle of the polymerase chain reaction process. This review addresses qualitative and quantitative pre-analytic and analytic considerations of RT-PCR as they relate to various cytologic specimens. An understanding of these aspects of genetic testing is central to attaining optimal results in the face of the challenges that cytology specimens may present. Cancer Cytopathol 2017;125:11-19. © 2016 American Cancer Society.

  5. Pre-analytical phase: The automated ProTube device supports quality assurance in the phlebotomy process.

    PubMed

    Piva, Elisa; Tosato, Francesca; Plebani, Mario

    2015-12-07

    Most errors in laboratory medicine occur in the pre-analytical phase of the total testing process. Phlebotomy, a crucial step in the pre-analytical phase influencing laboratory results and patient outcome, calls for quality assurance procedures and automation in order to prevent errors and ensure patient safety. We compared the performance of a new small, automated device designed for use in phlebotomy with complete traceability of the process, the ProTube (Inpeco), with a centralized automated system, BC ROBO. ProTube was used for 15,010 patients undergoing phlebotomy, with 48,776 tubes being labeled. The mean time and standard deviation (SD) for blood sampling was 3:03 (min:sec; SD ± 1:24) with ProTube, against 5:40 (min:sec; SD ± 1:57) with BC ROBO. The mean number of patients per hour managed at each phlebotomy point was 16 ± 3 with ProTube and 10 ± 2 with BC ROBO. No tubes were mislabeled, even though process failures occurred in 2.8% of cases when ProTube was used. Thanks to its cutting-edge technology, the ProTube has many advantages over BC ROBO, above all in verifying patient identity and in reducing both identification errors and tube mislabeling.

  6. Analytical quality goals derived from the total deviation from patients' homeostatic set points, with a margin for analytical errors.

    PubMed

    Bolann, B J; Asberg, A

    2004-01-01

    The deviation of test results from patients' homeostatic set points in steady-state conditions may complicate interpretation of the results and the comparison of results with clinical decision limits. In this study the total deviation from the homeostatic set point is defined as the maximum absolute deviation for 95% of measurements, and we present analytical quality requirements that prevent analytical error from increasing this deviation to more than about 12% above the value caused by biology alone. These quality requirements are: 1) The stable systematic error should be approximately 0, and 2) a systematic error that will be detected by the control program with 90% probability, should not be larger than half the value of the combined analytical and intra-individual standard deviation. As a result, when the most common control rules are used, the analytical standard deviation may be up to 0.15 times the intra-individual standard deviation. Analytical improvements beyond these requirements have little impact on the interpretability of measurement results.
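    The arithmetic behind these requirements can be illustrated with a simplified Python sketch. This is not the paper's full derivation (which also models the probability of detecting systematic error); it only shows how the 95% deviation from the set point grows when analytical imprecision is added to within-subject biological variation, with the within-subject SD normalized to 1.

```python
from math import sqrt

def total_deviation_95(bias, analytical_sd, within_subject_sd=1.0):
    """Simplified 95% absolute deviation from the homeostatic set point:
    |bias| plus 1.96 times the combined analytical and within-subject SD.
    (An illustration only; the paper's model is more elaborate.)"""
    combined_sd = sqrt(analytical_sd ** 2 + within_subject_sd ** 2)
    return abs(bias) + 1.96 * combined_sd

baseline = total_deviation_95(0.0, 0.0)     # deviation from biology alone
with_error = total_deviation_95(0.0, 0.15)  # analytical SD at 0.15 x within-subject SD
print(round(with_error / baseline - 1.0, 4))  # → 0.0112
```

    The sketch shows why an analytical SD of 0.15 times the within-subject SD is tolerable: by itself it inflates the 95% deviation by only about 1%, leaving most of the paper's ~12% allowance for the bias component.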

  7. Interim results of quality-control sampling of surface water for the Upper Colorado River National Water-Quality Assessment Study Unit, water years 1995-96

    USGS Publications Warehouse

    Spahr, N.E.; Boulger, R.W.

    1997-01-01

    Quality-control samples provide part of the information needed to estimate the bias and variability that result from sample collection, processing, and analysis. Quality-control samples of surface water collected for the Upper Colorado River National Water-Quality Assessment study unit for water years 1995-96 are presented and analyzed in this report. The types of quality-control samples collected include pre-processing split replicates, concurrent replicates, sequential replicates, post-processing split replicates, and field blanks. Analysis of the pre-processing split replicates, concurrent replicates, sequential replicates, and post-processing split replicates is based on differences between the analytical results of the environmental samples and those of the quality-control samples. Results of these comparisons indicate that variability introduced by sample collection, processing, and handling is low and will not affect interpretation of the environmental data. The differences for most water-quality constituents are on the order of plus or minus 1 or 2 lowest rounding units. A lowest rounding unit is equivalent to the magnitude of the least significant figure reported for analytical results. The use of lowest rounding units avoids some of the difficulty in comparing differences between pairs of samples when concentrations span orders of magnitude, and provides a measure of the practical significance of the effect of variability. Analysis of field-blank quality-control samples indicates that, with the exception of chloride and silica, no systematic contamination of samples is apparent. Chloride contamination probably was the result of incomplete rinsing of the dilute cleaning solution from the outlet ports of the decaport sample splitter. Silica contamination seems to have been introduced by the blank water. Sampling and processing procedures for water year 1997 have been modified as a result of these analyses.
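    The "lowest rounding unit" comparison described above is a one-line calculation; the following hedged Python sketch (the values and units are hypothetical, not from the report) shows how a replicate-pair difference would be expressed in those units.

```python
def diff_in_lru(env_value, qc_value, lowest_rounding_unit):
    """Express the difference between an environmental sample result and
    its quality-control replicate in lowest rounding units (LRUs), i.e.
    the magnitude of the least significant figure reported."""
    return (env_value - qc_value) / lowest_rounding_unit

# Hypothetical example: results reported to 0.01 mg/L, a pair differing
# by 0.02 mg/L is a difference of +2 LRUs.
print(round(diff_in_lru(1.27, 1.25, 0.01)))  # → 2
```

    Because the unit scales with the reported precision, a ±1-2 LRU difference means "within a rounding step or two" regardless of whether the constituent is reported in micrograms or milligrams per liter.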

  8. Post-analytical Issues in Hemostasis and Thrombosis Testing.

    PubMed

    Favaloro, Emmanuel J; Lippi, Giuseppe

    2017-01-01

    Analytical concerns within hemostasis and thrombosis testing are continuously decreasing. This is essentially attributable to modern instrumentation, improvements in test performance and reliability, and the application of appropriate internal quality control and external quality assurance measures. Pre-analytical issues are also being dealt with in some newer instruments, which are able to detect hemolysis, icterus and lipemia, and, in some cases, other issues related to sample collection, such as tube under-filling. Post-analytical issues are generally related to appropriate reporting and interpretation of test results, and these are the focus of the current overview, which provides a brief description of these events, as well as guidance for their prevention or minimization. In particular, we propose several strategies for improved post-analytical reporting of hemostasis assays and advise that this may provide the final opportunity to prevent serious clinical errors in diagnosis.

  9. The focus on sample quality: Influence of colon tissue collection on reliability of qPCR data

    PubMed Central

    Korenkova, Vlasta; Slyskova, Jana; Novosadova, Vendula; Pizzamiglio, Sara; Langerova, Lucie; Bjorkman, Jens; Vycital, Ondrej; Liska, Vaclav; Levy, Miroslav; Veskrna, Karel; Vodicka, Pavel; Vodickova, Ludmila; Kubista, Mikael; Verderio, Paolo

    2016-01-01

    Successful molecular analyses of human solid tissues require intact biological material with well-preserved nucleic acids, proteins, and other cell structures. Pre-analytical handling, comprising the collection of material in the operating theatre, is among the first critical steps that influence sample quality. The aim of this study was to compare the experimental outcomes obtained from samples collected and stored by the conventional means of snap freezing and by the PAXgene Tissue System (Qiagen). These approaches were evaluated by measuring the rRNA and mRNA integrity of the samples (RNA Quality Indicator and Differential Amplification Method) and by gene expression profiling. The collection procedures were implemented in two hospitals during colon cancer surgery in order to identify the impact of the collection method on the experimental outcome. Our study shows that pre-analytical sample handling has a significant effect on the quality of RNA and on the variability of qPCR data. The PAXgene collection mode proved easier to implement in the operating room, and moreover the quality of RNA obtained from human colon tissues by this method is superior to that obtained by snap freezing. PMID:27383461

  10. Quality indicators in laboratory medicine: a fundamental tool for quality and patient safety.

    PubMed

    Plebani, Mario; Sciacovelli, Laura; Marinova, Mariela; Marcuccitti, Jessica; Chiozza, Maria Laura

    2013-09-01

    The identification of reliable quality indicators (QIs) is a crucial step in enabling users to quantify the quality of laboratory services. The current lack of attention to extra-laboratory factors is in stark contrast with the body of evidence pointing to the multitude of errors that continue to occur in the pre- and post-analytical phases. Different QIs and terminologies are currently used and, therefore, there is the need to harmonize proposed QIs. A model of quality indicators (MQI) has been consensually developed by a group of clinical laboratories according to a project launched by a working group of the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC). The model includes 57 QIs related to key processes (35 pre-, 7 intra- and 15 post-analytical phases) and 3 to support processes. The developed MQI and the data collected provide evidence of the feasibility of the project to harmonize currently available QIs, but further efforts should be done to involve more clinical laboratories and to collect a more consistent amount of data. Copyright © 2012 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  11. Comparisons of modern United States and Canadian malting barley cultivars with those from pre-Prohibition: IV. Malting quality assessments using standard and nonstandard measures

    USDA-ARS?s Scientific Manuscript database

    This study was conducted to identify which traits or combination of traits associated with malting quality and mashing performance could best define the differences between and within a population of pre-Prohibition malting barley varieties and a population of modern elite malting barley cultivars. ...

  12. Can current analytical quality performance of UK clinical laboratories support evidence-based guidelines for diabetes and ischaemic heart disease?--A pilot study and a proposal.

    PubMed

    Jassam, Nuthar; Yundt-Pacheco, John; Jansen, Rob; Thomas, Annette; Barth, Julian H

    2013-08-01

    The implementation of national and international guidelines is beginning to standardise clinical practice. However, since many guidelines have decision limits based on laboratory tests, there is an urgent need to ensure that different laboratories obtain the same analytical result on any sample. A scientifically based quality control process will be a pre-requisite for this level of analytical performance, which will support evidence-based guidelines and movement of patients across boundaries while maintaining standardised outcomes. We discuss the findings of a pilot study performed to assess UK clinical laboratories' readiness to work to higher-grade quality specifications, such as those based on biological variation. Internal quality control (IQC) data for HbA1c, glucose, creatinine, cholesterol and high-density lipoprotein (HDL)-cholesterol were collected from UK laboratories participating in the Bio-Rad Unity QC programme. The median coefficient of variation (CV%) of the participating laboratories was evaluated against the CV% based on biological variation. Except for creatinine, the four remaining analytes showed a variable degree of compliance with the biological variation-based quality specifications. More than 75% of the laboratories met the biological variation-based quality specifications for glucose, cholesterol and HDL-cholesterol. Slightly over 50% of the laboratories met the analytical goal for HbA1c. Only one analyte (cholesterol) had a performance achieving the higher quality specifications consistent with 5σ. Our IQC data do not consistently demonstrate that the results from clinical laboratories meet evidence-based quality specifications. Therefore, we propose that a graded scale of quality specifications may be needed at this stage.

  13. Development of Range Design Elements and Quality Control/Quality Assurance Guidance to Reduce Maintenance Requirements on Training Ranges

    DTIC Science & Technology

    2006-11-01

    exercises. Potential Resolution: 1. Installations must ensure that they understand the composition of civilian populations outside of their...Installation trainers, SRP Support Agency trainers or contract training specialists should layout each range based on the composition defined in the...defined time limit to respond to submittals with a pre-defined team member composition so that changes could be reviewed consistently. Only mission

  14. Comparison of thermal analytic model with experimental test results for 30-centimeter-diameter engineering model mercury ion thruster

    NASA Technical Reports Server (NTRS)

    Oglebay, J. C.

    1977-01-01

    A thermal analytic model for a 30-cm engineering model mercury-ion thruster was developed and calibrated using experimental results from tests of a pre-engineering model 30-cm thruster. A series of tests, performed later, simulated a wide range of thermal environments on an operating 30-cm engineering model thruster, which was instrumented to measure the temperature distribution within it. The modified analytic model is described, and analytic and experimental results are compared for various operating conditions. Based on the comparisons, it is concluded that the analytic model can be used as a preliminary design tool to predict thruster steady-state temperature distributions for stage and mission studies and to define the thermal interface between the thruster and other elements of a spacecraft.

  15. Pre-Analytical Components of Risk in Four Branches of Clinical Laboratory in Romania--Prospective Study.

    PubMed

    David, Remona E; Dobreanu, Minodora

    2016-01-01

    Development of quality measurement principles is a strategic point for each clinical laboratory. The pre-examination process is the most critical and the most difficult to manage. The aim of this study was to identify, quantify, and monitor nonconformities of the pre-analytical process that can affect the patient's health safety, using quality indicators, in four different locations of a Romanian private clinical laboratory. The study group consisted of all the analysis requests received by the departments of biochemistry, hematology, and coagulation from January through March 2015. In order to collect the pre-analytical nonconformities, we created a "Risk Budget", using the entries from the "Evidence notebook--non-conform samples" from the above-mentioned departments. The laboratory established the quality indicators by means of the risk management technique FMEA (Failure Modes and Effects Analysis), implemented and monitored for its purposes and special needs, in order to identify and control the sources of errors. For the assessment of the level of control over the processes, the results were transformed onto the Six Sigma scale using the Westgard calculation method (https://www.westgard.com/six-sigma-calculators.htm), thereby obtaining the frequency with which an error may occur. The results show that the quantification and monitoring of the indicators can serve as a control instrument for pre-analytical activities. The calculation of the Six Sigma value adds extra information to the study because it allows the detection of processes that need improvement (a Sigma value higher than 4 represents a well-controlled process). The highest rates were observed for hemolyzed and lipemic samples in the department of biochemistry, and for hemolyzed, insufficient-volume, or clotted samples in the departments of hematology and coagulation. Statistically significant differences between the laboratories participating in the study were recorded for these indicators. The study, elaborated across the four branches of a Romanian private clinical laboratory, was a challenge, and it helped in making strategic decisions regarding the improvement of patient health safety in the institution, in accordance with the accreditation requirements of ISO 15189:2013.
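    The sigma-scale transformation referred to above reduces to a one-line formula in its standard form: sigma = (allowable total error − |bias|) / CV. The Python sketch below shows that common formula; the example figures are hypothetical, not taken from the study.

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Standard sigma-metric calculation used in laboratory QC:
    sigma = (allowable total error - |bias|) / CV, all in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Hypothetical figures: a 10% allowable total error with 2% bias and
# 1.6% CV yields a 5-sigma process.
print(round(sigma_metric(10.0, 2.0, 1.6), 2))  # → 5.0
```

    On this scale, a higher sigma means a lower expected defect (error) frequency, which is why the study treats sigma > 4 as a well-controlled process.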

  16. Bridging the gap: leveraging business intelligence tools in support of patient safety and financial effectiveness.

    PubMed

    Ferranti, Jeffrey M; Langman, Matthew K; Tanaka, David; McCall, Jonathan; Ahmad, Asif

    2010-01-01

    Healthcare is increasingly dependent upon information technology (IT), but the accumulation of data has outpaced our capacity to use it to improve operating efficiency, clinical quality, and financial effectiveness. Moreover, hospitals have lagged in adopting thoughtful analytic approaches that would allow operational leaders and providers to capitalize upon existing data stores. In this manuscript, we propose a fundamental re-evaluation of strategic IT investments in healthcare, with the goal of increasing efficiency, reducing costs, and improving outcomes through the targeted application of health analytics. We also present three case studies that illustrate the use of health analytics to leverage pre-existing data resources to support improvements in patient safety and quality of care, to increase the accuracy of billing and collection, and support emerging health issues. We believe that such active investment in health analytics will prove essential to realizing the full promise of investments in electronic clinical systems.

  18. Physical Property Analysis and Report for Sediments at 100-BC-5 Operable Unit, Boreholes C7505, C7506, C7507, and C7665

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lindberg, Michael J.

    2010-09-28

    Between October 14, 2009 and February 22, 2010, sediment samples were received from the 100-BC Decision Unit for geochemical studies. This is an analytical data report for sediments received from CHPRC at the 100-BC-5 Operable Unit. The analyses for this project were performed at the 325 Building, located in the 300 Area of the Hanford Site, according to Pacific Northwest National Laboratory (PNNL) approved procedures and/or nationally recognized test procedures. The data sets include the sample identification numbers, analytical results, estimated quantification limits (EQL), and quality control data. The preparatory and analytical quality control requirements, calibration requirements, acceptance criteria, and failure actions are defined in the on-line QA plan 'Conducting Analytical Work in Support of Regulatory Programs' (CAW). This QA plan implements the Hanford Analytical Services Quality Assurance Requirements Documents (HASQARD) for PNNL.

  19. Analysis of Pre-Analytic Factors Affecting the Success of Clinical Next-Generation Sequencing of Solid Organ Malignancies.

    PubMed

    Chen, Hui; Luthra, Rajyalakshmi; Goswami, Rashmi S; Singh, Rajesh R; Roy-Chowdhuri, Sinchita

    2015-08-28

    Application of next-generation sequencing (NGS) technology to routine clinical practice has enabled characterization of personalized cancer genomes to identify patients likely to have a response to targeted therapy. The proper selection of tumor sample for downstream NGS based mutational analysis is critical to generate accurate results and to guide therapeutic intervention. However, multiple pre-analytic factors come into play in determining the success of NGS testing. In this review, we discuss pre-analytic requirements for AmpliSeq PCR-based sequencing using Ion Torrent Personal Genome Machine (PGM) (Life Technologies), a NGS sequencing platform that is often used by clinical laboratories for sequencing solid tumors because of its low input DNA requirement from formalin fixed and paraffin embedded tissue. The success of NGS mutational analysis is affected not only by the input DNA quantity but also by several other factors, including the specimen type, the DNA quality, and the tumor cellularity. Here, we review tissue requirements for solid tumor NGS based mutational analysis, including procedure types, tissue types, tumor volume and fraction, decalcification, and treatment effects.

  20. Destructive analysis capabilities for plutonium and uranium characterization at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tandon, Lav; Kuhn, Kevin J; Drake, Lawrence R

    Los Alamos National Laboratory's (LANL) Actinide Analytical Chemistry (AAC) group has been in existence since the Manhattan Project. It maintains a complete set of analytical capabilities for performing full characterization (elemental assay; isotopic composition; metallic and non-metallic trace impurities) of uranium and plutonium samples in different forms. For the majority of customers there are strong quality assurance (QA) and quality control (QC) objectives, including the highest accuracy and precision with well-defined uncertainties associated with the analytical results. Los Alamos participates in various international and national programs, such as the Plutonium Metal Exchange Program, New Brunswick Laboratory's (NBL's) Safeguards Measurement Evaluation (SME) Program, and several other inter-laboratory round-robin exercises, to monitor and evaluate the data quality generated by AAC. These programs also provide independent verification of analytical measurement capabilities and allow any technical problems with analytical measurements to be identified and corrected. This presentation will focus on key analytical capabilities for destructive analysis in AAC and also on comparative data between LANL and peer groups for Pu assay and isotopic analysis.

  1. Pre-slaughter stress and pork quality

    NASA Astrophysics Data System (ADS)

    Stajković, S.; Teodorović, V.; Baltić, M.; Karabasil, N.

    2017-09-01

    Stress is an inevitable consequence of the handling of animals for slaughter. Stress conditions during transport, lairage and slaughter induce undesirable effects on the end quality of meat, such as pale, soft, exudative (PSE) meat and dark, firm, dry (DFD) meat. Hence, it is very important to define appropriate parameters for objective assessment of the level of stress. Attempts to define measures of stress have been difficult, and no single physiological parameter has been successfully used to evaluate stress situations. One physiological change in swine associated with animal handling stress and with pork quality is an increase in blood lactate concentration. Plasma cortisol was thought to be an appropriate indicator of stress, but its concentration was not consistently changed by different stressors. Therefore, finding alternative parameters that react to stressors, such as acute phase proteins, would be of great value for the objective evaluation of the level of stress and meat quality. As stress during pre-slaughter handling is unavoidable, the final goal is to improve transport and slaughter conditions for the animal and, as a consequence, meat quality and animal welfare.

  2. Errors in the Extra-Analytical Phases of Clinical Chemistry Laboratory Testing.

    PubMed

    Zemlin, Annalise E

    2018-04-01

    The total testing process consists of various phases, from the pre-preanalytical to the post-postanalytical phase, the so-called brain-to-brain loop. With improvements in analytical techniques and efficient quality control programmes, most laboratory errors now occur in the extra-analytical phases. There has been recent interest in these errors, with numerous publications highlighting their effect on service delivery, patient care and cost. This interest has led to the formation of various working groups whose mission is to develop standardized quality indicators that can be used to measure the performance of these phases of the service. This will eventually lead to the development of external quality assessment schemes to monitor these phases, in agreement with ISO 15189:2012 recommendations. This review focuses on potential errors in the extra-analytical phases of clinical chemistry laboratory testing, some of the studies performed to assess the severity and impact of these errors, and processes that are in place to address them. The aim of this review is to highlight the importance of these errors for the requesting clinician.

  3. Laboratory sample stability. Is it possible to define a consensus stability function? An example of five blood magnitudes.

    PubMed

    Gómez Rioja, Rubén; Martínez Espartosa, Débora; Segovia, Marta; Ibarz, Mercedes; Llopis, María Antonia; Bauça, Josep Miquel; Marzana, Itziar; Barba, Nuria; Ventura, Monserrat; García Del Pino, Isabel; Puente, Juan José; Caballero, Andrea; Gómez, Carolina; García Álvarez, Ana; Alsina, María Jesús; Álvarez, Virtudes

    2018-05-05

    The stability limit of an analyte in a biological sample can be defined as the time required until a measured property acquires a bias higher than a defined specification. Many studies assessing stability and presenting recommendations on stability limits are available, but differences among them are frequent. The aim of this study was to classify and grade a set of bibliographic studies on the stability of five common blood measurands and subsequently generate a consensus stability function. First, a bibliographic search was made for stability studies of five analytes in blood: alanine aminotransferase (ALT), glucose, phosphorus, potassium and prostate-specific antigen (PSA). The quality of every study was evaluated using an in-house grading tool. Second, the different stability conditions were uniformly defined, and the percent deviations (PD%) over time for each analyte and condition were plotted, unifying studies with similar conditions. From the 37 articles considered valid, up to 130 experiments were evaluated and 629 PD% data points were included (106 for ALT, 180 for glucose, 113 for phosphorus, 145 for potassium and 85 for PSA). Consensus stability equations were established for glucose, potassium, phosphorus and PSA, but not for ALT. Time is the main variable affecting stability in medical laboratory samples. Bibliographic studies differ in their recommendations on stability limits mainly because of different specifications for maximum allowable error. Definition of a consensus stability function under specific conditions can help laboratories define stability limits using their own quality specifications.
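
    Given a consensus stability function, a laboratory's own stability limit follows by inverting that function at the laboratory's maximum allowable deviation. A minimal sketch, assuming a linear PD%(t) and entirely hypothetical coefficients (the paper's actual consensus equations are not reproduced here):

```python
def stability_limit(slope_pd_per_hour: float, max_allowable_pd: float) -> float:
    """Time (hours) until the consensus percent deviation
    PD%(t) = slope * t exceeds a laboratory's maximum allowable
    deviation. Assumes a linear consensus stability function as a
    simple illustration; real consensus equations may differ."""
    return max_allowable_pd / abs(slope_pd_per_hour)

# Hypothetical: an analyte drifting at -0.6 %/h at room temperature,
# with a quality specification of 10 % maximum allowable deviation
limit_h = stability_limit(-0.6, 10.0)
print(f"stability limit = {limit_h:.1f} h")
```

    A tighter quality specification directly yields a shorter stability limit, which is the paper's point about laboratories applying their own specifications.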

  4. Preanalytical errors in medical laboratories: a review of the available methodologies of data collection and analysis.

    PubMed

    West, Jamie; Atherton, Jennifer; Costelloe, Seán J; Pourmahram, Ghazaleh; Stretton, Adam; Cornes, Michael

    2017-01-01

    Preanalytical errors have previously been shown to contribute a significant proportion of errors in laboratory processes and to contribute to a number of patient safety risks. Accreditation against ISO 15189:2012 requires that laboratory quality management systems consider the impact of preanalytical processes in areas such as the identification and control of non-conformances, continual improvement, internal audit and quality indicators. Previous studies have shown that there is wide variation in the definition, repertoire and collection methods for preanalytical quality indicators. The International Federation of Clinical Chemistry Working Group on Laboratory Errors and Patient Safety has defined a number of quality indicators for the preanalytical stage, and the adoption of harmonized definitions will support interlaboratory comparisons and continual improvement. There are a variety of data collection methods, including audit, manual recording processes, incident reporting mechanisms and laboratory information systems. Quality management processes such as benchmarking, statistical process control, Pareto analysis and failure mode and effects analysis can be used to review data and should be incorporated into clinical governance mechanisms. In this paper, the Association for Clinical Biochemistry and Laboratory Medicine PreAnalytical Specialist Interest Group reviews the various data collection methods available. Our recommendation is to use the laboratory information management system to record preanalytical errors, as this provides the easiest and most standardized means of data capture.
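
    Among the quality-management tools the group reviews, Pareto analysis is the most easily sketched in code: rank error categories by frequency and identify the "vital few" that account for most nonconformities. The category names and counts below are hypothetical, not data from the paper:

```python
def pareto(error_counts: dict, threshold: float = 0.8) -> list:
    """Return the smallest set of categories accounting for at least
    `threshold` of all recorded preanalytical errors (the "vital few")."""
    total = sum(error_counts.values())
    vital_few, cumulative = [], 0
    for category, count in sorted(error_counts.items(), key=lambda kv: -kv[1]):
        vital_few.append(category)
        cumulative += count
        if cumulative / total >= threshold:
            break
    return vital_few

# Hypothetical monthly tally of preanalytical nonconformities
counts = {"hemolyzed": 120, "clotted": 45, "mislabelled": 30,
          "insufficient volume": 25, "wrong tube": 10}
print(pareto(counts))  # the categories to target first
```

    In practice these counts would come from the laboratory information management system, in line with the paper's recommendation on data capture.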

  5. Analytical and pre-analytical performance characteristics of a novel cartridge-type blood gas analyzer for point-of-care and laboratory testing.

    PubMed

    Oyaert, Matthijs; Van Maerken, Tom; Bridts, Silke; Van Loon, Silvi; Laverge, Heleen; Stove, Veronique

    2018-03-01

    Point-of-care blood gas test results may benefit therapeutic decision making by their immediate impact on patient care. We evaluated the (pre-)analytical performance of a novel cartridge-type blood gas analyzer, the GEM Premier 5000 (Werfen), for the determination of pH, partial carbon dioxide pressure (pCO2), partial oxygen pressure (pO2), sodium (Na+), potassium (K+), chloride (Cl-), ionized calcium (iCa2+), glucose, lactate, and total hemoglobin (tHb). Total imprecision was estimated according to the CLSI EP5-A2 protocol. The estimated total error was calculated based on the mean of the range claimed by the manufacturer. Based on the CLSI EP9-A2 evaluation protocol, a method comparison with the Siemens RapidPoint 500 and Abbott i-STAT CG8+ was performed. The data obtained were compared against preset quality specifications. Interference of potential pre-analytical confounders with co-oximetry and electrolyte measurements was studied. The analytical performance was acceptable for all parameters tested. Method comparison demonstrated good agreement with the RapidPoint 500 and i-STAT CG8+, except for some parameters (RapidPoint 500: pCO2, K+, lactate and tHb; i-STAT CG8+: pO2, Na+, iCa2+ and tHb) for which significant differences between analyzers were recorded. No interference of lipemia or methylene blue with co-oximetry results was found. In contrast, significant interference of benzalkonium and hemolysis with electrolyte measurements was found, for which the user is notified by an interferent-specific flag. Identification of sample errors from pre-analytical sources, such as interferences, and automatic corrective actions, along with the analytical performance, ease of use and low maintenance time of the instrument, make the evaluated instrument a suitable blood gas analyzer for both POCT and laboratory use. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
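
    The abstract does not state the exact total-error model used, but a common (Westgard-style) linear estimate combines bias and imprecision as TE = |bias%| + z x CV%. A sketch under that assumption, with hypothetical replicate data:

```python
from statistics import mean, stdev

def total_error(measured: list, target: float, z: float = 1.65) -> float:
    """Estimate total analytical error (%) as |bias%| + z * CV%,
    a common linear model; z = 1.65 covers ~95 % of results on one
    side. Illustrative only; the evaluation may have used another model."""
    m, s = mean(measured), stdev(measured)
    bias_pct = abs(m - target) / target * 100
    cv_pct = s / m * 100
    return bias_pct + z * cv_pct

# Hypothetical replicate pO2 results (mmHg) against a target of 100
replicates = [98.0, 99.5, 101.0, 100.5, 99.0]
print(f"TE = {total_error(replicates, 100.0):.2f} %")
```

    The resulting estimate would then be compared against the preset quality specification for the measurand, as described in the abstract.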

  6. Engineering and Design: Chemical Data Quality Management for Hazardous, Toxic, Radioactive Waste Remedial Activities

    DTIC Science & Technology

    This regulation prescribes Chemical Data Quality Management (CDQM) responsibilities and procedures for projects involving hazardous, toxic and/or radioactive waste (HTRW) materials. Its purpose is to assure that the analytical data meet project data quality objectives. This is the umbrella regulation that defines CDQM activities and integrates all of the other U.S. Army Corps of Engineers (USACE) guidance on environmental data quality management.

  7. A Guide to Pathways through the Pre-Five Quality Process.

    ERIC Educational Resources Information Center

    Strathclyde Regional Council, Glasgow (Scotland).

    This guide describes a quality process for external and internal evaluation of the elementary school education department. The term "pathway" is used to define routes through the quality process that describe any school administrative activity in terms of the indicators and examples of good practice. There are five pathways: process…

  8. Quality and safety aspects in histopathology laboratory

    PubMed Central

    Adyanthaya, Soniya; Jose, Maji

    2013-01-01

    Histopathology is an art of analyzing and interpreting the shapes, sizes and architectural patterns of cells and tissues within a given specific clinical background and a science by which the image is placed in the context of knowledge of pathobiology, to arrive at an accurate diagnosis. To function effectively and safely, all the procedures and activities of histopathology laboratory should be evaluated and monitored accurately. In histopathology laboratory, the concept of quality control is applicable to pre-analytical, analytical and post-analytical activities. Ensuring safety of working personnel as well as environment is also highly important. Safety issues that may come up in a histopathology lab are primarily those related to potentially hazardous chemicals, biohazardous materials, accidents linked to the equipment and instrumentation employed and general risks from electrical and fire hazards. This article discusses quality management system which can ensure quality performance in histopathology laboratory. The hazards in pathology laboratories and practical safety measures aimed at controlling the dangers are also discussed with the objective of promoting safety consciousness and the practice of laboratory safety. PMID:24574660

  9. Defining Quality in Cardiovascular Imaging: A Scientific Statement From the American Heart Association.

    PubMed

    Shaw, Leslee J; Blankstein, Ron; Jacobs, Jill E; Leipsic, Jonathon A; Kwong, Raymond Y; Taqueti, Viviany R; Beanlands, Rob S B; Mieres, Jennifer H; Flamm, Scott D; Gerber, Thomas C; Spertus, John; Di Carli, Marcelo F

    2017-12-01

    The aims of the current statement are to refine the definition of quality in cardiovascular imaging and to propose novel methodological approaches to inform the demonstration of quality in imaging in future clinical trials and registries. We propose defining quality in cardiovascular imaging using an analytical framework put forth by the Institute of Medicine whereby quality was defined as testing being safe, effective, patient-centered, timely, equitable, and efficient. The implications of each of these components of quality health care are as essential for cardiovascular imaging as they are for other areas within health care. Our proposed statement may serve as the foundation for integrating these quality indicators into establishing designations of quality laboratory practices and developing standards for value-based payment reform for imaging services. We also include recommendations for future clinical research to fulfill quality aims within cardiovascular imaging, including clinical hypotheses of improving patient outcomes, the importance of health status as an end point, and deferred testing options. Future research should evolve to define novel methods optimized for the role of cardiovascular imaging for detecting disease and guiding treatment and to demonstrate the role of cardiovascular imaging in facilitating healthcare quality. © 2017 American Heart Association, Inc.

  10. 15 CFR 742.2 - Proliferation of chemical and biological weapons.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... medical, analytical, diagnostic, and food testing kits that consist of pre-packaged materials of defined... health purposes: (1) Test kits containing no more than 300 grams of any chemical controlled by ECCN 1C350... part 745 of the EAR). Such test kits are controlled by ECCN 1C395 for CB and CW reasons, to States not...

  11. 15 CFR 742.2 - Proliferation of chemical and biological weapons.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... medical, analytical, diagnostic, and food testing kits that consist of pre-packaged materials of defined... health purposes: (1) Test kits containing no more than 300 grams of any chemical controlled by ECCN 1C350... part 745 of the EAR). Such test kits are controlled by ECCN 1C395 for CB and CW reasons, to States not...

  12. 15 CFR 742.2 - Proliferation of chemical and biological weapons.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ..., analytical, diagnostic, and food testing kits that consist of pre-packaged materials of defined composition... purposes: (1) Test kits containing no more than 300 grams of any chemical controlled by ECCN 1C350.b or .c... the EAR). Such test kits are controlled by ECCN 1C395 for CB and CW reasons, to States not Party to...

  13. Quality Control of RNA Preservation and Extraction from Paraffin-Embedded Tissue: Implications for RT-PCR and Microarray Analysis

    PubMed Central

    Pichler, Martin; Zatloukal, Kurt

    2013-01-01

    Analysis of RNA isolated from fixed and paraffin-embedded tissues is widely used in biomedical research and molecular pathological diagnostics. We have performed a comprehensive and systematic investigation of the impact of factors in the pre-analytical workflow, such as different fixatives, fixation time, RNA extraction method and storage of tissues in paraffin blocks, on several downstream reactions including complementary DNA (cDNA) synthesis, quantitative reverse transcription polymerase chain reaction (qRT-PCR) and microarray hybridization. We compared the effects of routine formalin fixation with the non-crosslinking, alcohol-based Tissue Tek Xpress Molecular Fixative (TTXMF, Sakura Finetek), and cryopreservation as gold standard for molecular analyses. Formalin fixation introduced major changes into microarray gene expression data and led to marked gene-to-gene variations in delta-ct values of qRT-PCR. We found that qRT-PCR efficiency and gene-to-gene variations were mainly attributed to differences in the efficiency of cDNA synthesis as the most sensitive step. These differences could not be reliably detected by quality assessment of total RNA isolated from formalin-fixed tissues by electrophoresis or spectrophotometry. Although RNA from TTXMF fixed samples was as fragmented as RNA from formalin fixed samples, much higher cDNA yield and lower ct-values were obtained in qRT-PCR underlining the negative impact of crosslinking by formalin. In order to better estimate the impact of pre-analytical procedures such as fixation on the reliability of downstream analysis, we applied a qRT-PCR-based assay using amplicons of different length and an assay measuring the efficiency of cDNA generation. Together these two assays allowed better quality assessment of RNA extracted from fixed and paraffin-embedded tissues and should be used to supplement quality scores derived from automated electrophoresis. 
A better standardization of the pre-analytical workflow, application of additional quality controls and detailed sample information would markedly improve the comparability and reliability of molecular studies based on formalin-fixed and paraffin-embedded tissue samples. PMID:23936242

  14. IL8 and IL16 levels indicate serum and plasma quality.

    PubMed

    Kofanova, Olga; Henry, Estelle; Quesada, Rocio Aguilar; Bulla, Alexandre; Linares, Hector Navarro; Lescuyer, Pierre; Shea, Kathi; Stone, Mars; Tybring, Gunnel; Bellora, Camille; Betsou, Fay

    2018-02-09

    Longer pre-centrifugation times alter the quality of serum and plasma samples. Markers for such delays in sample processing and hence for the sample quality, have been identified. Twenty cytokines in serum, EDTA plasma and citrate plasma samples were screened for changes in concentration induced by extended blood pre-centrifugation delays at room temperature. The two cytokines that showed the largest changes were further validated for their "diagnostic performance" in identifying serum or plasma samples with extended pre-centrifugation times. In this study, using R&D Systems ELISA kits, EDTA plasma samples and serum samples with a pre-centrifugation delay longer than 24 h had an IL16 concentration higher than 313 pg/mL, and an IL8 concentration higher than 125 pg/mL, respectively. EDTA plasma samples with a pre-centrifugation delay longer than 48 h had an IL16 concentration higher than 897 pg/mL, citrate plasma samples had an IL8 concentration higher than 21.5 pg/mL and serum samples had an IL8 concentration higher than 528 pg/mL. These robust and accurate tools, based on simple and commercially available ELISA assays can greatly facilitate qualification of serum and plasma legacy collections with undocumented pre-analytics.
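
    The reported cut-offs amount to a simple decision rule for flagging samples with a suspected pre-centrifugation delay. A sketch using the concentrations quoted in the abstract (the function and data structure are our own illustration, not part of the study):

```python
# Cut-offs (pg/mL) reported in the abstract for R&D Systems ELISA kits
CUTOFFS = {
    ("edta_plasma", "IL16"): {">24h": 313.0, ">48h": 897.0},
    ("serum", "IL8"): {">24h": 125.0, ">48h": 528.0},
    ("citrate_plasma", "IL8"): {">48h": 21.5},
}

def flag_delay(sample_type: str, marker: str, conc_pg_ml: float) -> str:
    """Classify the likely pre-centrifugation delay of a sample from a
    single cytokine concentration, using the study's reported cut-offs.
    Illustrative only; real sample qualification needs validated assays."""
    rules = CUTOFFS.get((sample_type, marker), {})
    verdict = "no delay flagged"
    # Apply cut-offs in ascending order so the highest exceeded one wins
    for label, cutoff in sorted(rules.items(), key=lambda kv: kv[1]):
        if conc_pg_ml > cutoff:
            verdict = f"pre-centrifugation delay {label} suspected"
    return verdict

print(flag_delay("edta_plasma", "IL16", 450.0))
```

    Such a rule is what makes these markers useful for qualifying legacy collections whose pre-analytical handling was never documented.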

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gillespie, B.M.; Stromatt, R.W.; Ross, G.A.

    This data package contains the results obtained by Pacific Northwest Laboratory (PNL) staff in the characterization of samples for the 101-SY Hydrogen Safety Project. The samples were submitted for analysis by Westinghouse Hanford Company (WHC) under Technical Project Plan (TPP) 17667 and Quality Assurance Plan MCS-027. They came from a core taken during Window "C" after the May 1991 gas release event. The analytical procedures required for analysis were defined in the Test Instructions (TI) prepared by the PNL 101-SY Analytical Chemistry Laboratory (ACL) Project Management Office in accordance with the TPP and the QA Plan. The requested analysis for these samples was volatile organic analysis. The quality control (QC) requirements for each sample are defined in the Test Instructions for that sample. The QC requirements outlined in the procedures and requested in the WHC statement of work were followed.

  17. Pre-Analytical Considerations for Successful Next-Generation Sequencing (NGS): Challenges and Opportunities for Formalin-Fixed and Paraffin-Embedded Tumor Tissue (FFPE) Samples

    PubMed Central

    Arreaza, Gladys; Qiu, Ping; Pang, Ling; Albright, Andrew; Hong, Lewis Z.; Marton, Matthew J.; Levitan, Diane

    2016-01-01

    In cancer drug discovery, it is important to investigate the genetic determinants of response or resistance to cancer therapy as well as factors that contribute to adverse events in the course of clinical trials. Despite the emergence of new technologies and the ability to measure more diverse analytes (e.g., circulating tumor cell (CTC), circulating tumor DNA (ctDNA), etc.), tumor tissue is still the most common and reliable source for biomarker investigation. Because of its worldwide use and ability to preserve samples for many decades at ambient temperature, formalin-fixed, paraffin-embedded tumor tissue (FFPE) is likely to be the preferred choice for tissue preservation in clinical practice for the foreseeable future. Multiple analyses are routinely performed on the same FFPE samples (such as Immunohistochemistry (IHC), in situ hybridization, RNAseq, DNAseq, TILseq, Methyl-Seq, etc.). Thus, specimen prioritization and optimization of the isolation of analytes is critical to ensure successful completion of each assay. FFPE is notorious for producing suboptimal DNA quality and low DNA yield. However, commercial vendors tend to request higher DNA sample mass than what is actually required for downstream assays, which restricts the breadth of biomarker work that can be performed. We evaluated multiple genomics service laboratories to assess the current state of NGS pre-analytical processing of FFPE. Significant differences in pre-analytical capabilities were observed. Key aspects are highlighted and recommendations are made to improve the current practice in translational research. PMID:27657050

  18. Metabolic profiling of body fluids and multivariate data analysis.

    PubMed

    Trezzi, Jean-Pierre; Jäger, Christian; Galozzi, Sara; Barkovits, Katalin; Marcus, Katrin; Mollenhauer, Brit; Hiller, Karsten

    2017-01-01

    Metabolome analyses of body fluids are challenging due to pre-analytical variations, such as pre-processing delay and temperature, and the constant dynamic changes of biochemical processes within the samples. Therefore, proper sample handling, from the time of collection up to the analysis, is crucial to obtain high-quality samples and reproducible results. A metabolomics analysis is divided into four main steps: 1) sample collection, 2) metabolite extraction, 3) data acquisition and 4) data analysis. Here, we describe a protocol for gas chromatography coupled to mass spectrometry (GC-MS) based metabolic analysis of biological matrices, especially body fluids. This protocol can be applied to blood serum/plasma, saliva and cerebrospinal fluid (CSF) samples of humans and other vertebrates. It covers sample collection, sample pre-processing, metabolite extraction, GC-MS measurement and guidelines for the subsequent data analysis. Advantages of this protocol include: robust and reproducible metabolomics results that take pre-analytical variations during sampling into account; a small required sample volume; rapid and cost-effective processing of biological samples; and logistic regression-based determination of biomarker signatures for in-depth data analysis.

  19. State-of-Science Approaches to Determine Sensitive Taxa for Water Quality Criteria Derivation

    EPA Science Inventory

    Current Ambient Water Quality Criteria (AWQC) guidelines specify pre-defined taxa diversity requirements, which has limited chemical-specific criteria development in the U.S. to less than 100 chemicals. A priori knowledge of sensitive taxa to toxicologically similar groups of che...

  20. A New Approach to Standardize Multicenter Studies: Mobile Lab Technology for the German Environmental Specimen Bank

    PubMed Central

    Lermen, Dominik; Schmitt, Daniel; Bartel-Steinbach, Martina; Schröter-Kermani, Christa; Kolossa-Gehring, Marike; von Briesen, Hagen; Zimmermann, Heiko

    2014-01-01

    Technical progress has simplified tasks in laboratory diagnosis and improved the quality of test results. Errors occurring during the pre-analytical phase have a more negative impact on the quality of test results than errors encountered during the total analytical process. Different infrastructures at sampling sites can strongly influence the quality of samples and therewith of analytical results. Annually, the German Environmental Specimen Bank (ESB) collects, characterizes, and stores blood, plasma, and urine samples of 120–150 volunteers at each of four different sampling sites in Germany. The overarching goal is to investigate the exposure of non-occupationally exposed young adults to environmental pollutants by combining human biomonitoring with questionnaire data. We investigated the requirements of the study and the possibility of realizing a highly standardized sampling procedure on a mobile platform in order to increase the required quality of the pre-analytical phase. The results led to the development of a mobile epidemiologic laboratory (epiLab) in the project “Labor der Zukunft” (future’s lab technology). This laboratory includes a 14.7 m2 reception area to record medical history and exposure-relevant behavior, a 21.1 m2 examination room to record dental fillings and for blood withdrawal, a 15.5 m2 biological safety level 2 laboratory to process and analyze samples on site (including a 2.8 m2 personnel lock), and a 3.6 m2 cryofacility to immediately freeze samples. Frozen samples can be transferred to their final destination within the vehicle without breaking the cold chain. To our knowledge, we herewith describe for the first time the implementation of a biological safety level (BSL) 2 laboratory and an epidemiologic unit on a single mobile platform. Since 2013 we have been collecting up to 15,000 individual human samples annually under highly standardized conditions using the mobile laboratory.
Characterized and free of alterations, they are kept ready for retrospective analyses in their final archive, the German ESB. PMID:25141120

  1. A new approach to standardize multicenter studies: mobile lab technology for the German Environmental Specimen Bank.

    PubMed

    Lermen, Dominik; Schmitt, Daniel; Bartel-Steinbach, Martina; Schröter-Kermani, Christa; Kolossa-Gehring, Marike; von Briesen, Hagen; Zimmermann, Heiko

    2014-01-01

    Technical progress has simplified tasks in laboratory diagnostics and improved the quality of test results. Errors occurring during the pre-analytical phase have a greater negative impact on the quality of test results than errors encountered during the analytical process itself. Differences in the infrastructure of sampling sites can strongly influence the quality of samples and thereby of analytical results. Annually, the German Environmental Specimen Bank (ESB) collects, characterizes, and stores blood, plasma, and urine samples from 120-150 volunteers at each of four sampling sites in Germany. The overarching goal is to investigate the exposure of non-occupationally exposed young adults to environmental pollutants by combining human biomonitoring with questionnaire data. We investigated the requirements of the study and the possibility of realizing a highly standardized sampling procedure on a mobile platform in order to increase the quality of the pre-analytical phase. The results led to the development of a mobile epidemiologic laboratory (epiLab) in the project "Labor der Zukunft" (laboratory of the future). This laboratory includes a 14.7 m² reception area to record medical history and exposure-relevant behavior, a 21.1 m² examination room to record dental fillings and perform blood withdrawal, a 15.5 m² biological safety level 2 laboratory to process and analyze samples on site (including a 2.8 m² personnel lock), and a 3.6 m² cryofacility to immediately freeze samples. Frozen samples can be transferred to their final destination within the vehicle without breaking the cold chain. To our knowledge, this is the first description of a biological safety level (BSL) 2 laboratory and an epidemiologic unit implemented on a single mobile platform. Since 2013 we have been collecting up to 15,000 individual human samples annually under highly standardized conditions using the mobile laboratory. Characterized and free of alterations, they are kept ready for retrospective analyses in their final archive, the German ESB.

  2. Effect of storage duration on cytokine stability in human serum and plasma.

    PubMed

    Vincent, Fabien B; Nim, Hieu T; Lee, Jacinta P W; Morand, Eric F; Harris, James

    2018-06-14

    Quantification of analytes such as cytokines in serum samples is intrinsic to translational research in immune diseases. Optimising pre-analytical conditions, including evaluation of cytokine stability, is critical for ensuring study quality. We aimed to evaluate the effect of storage duration prior to freezing on cytokine stability in serum, and to compare these results with plasma samples obtained from patients with systemic lupus erythematosus (SLE). Protein stability was analysed by simultaneously quantifying 18 analytes using a custom multi-analyte profile in SLE patient serum and plasma samples that had been prospectively stored at 4 °C for pre-determined periods between 0 and 30 days, prior to freezing. Six analytes were excluded from analysis, because most tested samples were above or below the limit of detection. Amongst the 12 analysed proteins, 11 did not show significant signal degradation. Significant signal degradation was observed from the fourth day of storage for a single analyte, CCL19. Protein levels were more stable in unseparated serum than in plasma for most analytes, with the exception of IL-37, which appears slightly more stable in plasma. Based on this, a maximum of 3 days of storage at 4 °C for unseparated serum samples is recommended for biobanked samples intended for cytokine analysis in studies of human immune disease. Copyright © 2018 Elsevier Ltd. All rights reserved.
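    The stability assessment described above can be sketched as an acceptance-window test against the day-0 baseline. In the sketch below, the analyte names, signal values, and the ±20% limit are illustrative assumptions, not the study's data:

```python
# Hedged sketch: flag analytes whose signal at later storage days falls
# outside an acceptance window relative to the day-0 baseline.
# Values and the +/-20% limit are illustrative, not the study's data.

def percent_recovery(baseline, value):
    return 100.0 * value / baseline

def stable(series, limit=20.0):
    """True if every later time point stays within +/-limit % of day 0."""
    day0 = series[0]
    return all(abs(percent_recovery(day0, v) - 100.0) <= limit
               for v in series[1:])

ccl19 = [100, 96, 91, 78, 64]   # declines past the window from day 3 on
il6   = [100, 99, 101, 97, 98]  # stays within the window throughout
print(stable(ccl19), stable(il6))  # -> False True
```

In practice the study used significance testing rather than a fixed window; the fixed window here only illustrates the bookkeeping.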

  3. The CCLM contribution to improvements in quality and patient safety.

    PubMed

    Plebani, Mario

    2013-01-01

    Clinical laboratories play an important role in improving patient care. The past decades have seen remarkable, often unpredictable improvements in analytical performance. Although the seminal concept of the brain-to-brain laboratory loop was described more than four decades ago, there is now a growing awareness of the importance of extra-analytical aspects in laboratory quality. According to this concept, all phases and activities of the testing cycle should be assessed, monitored and improved in order to decrease total error rates, thereby improving patient safety. Clinical Chemistry and Laboratory Medicine (CCLM) has not only followed the shift in perception of quality in the discipline, but has been the catalyst for promoting a broad debate on this topic, underlining the value of papers dealing with errors in clinical laboratories and possible remedies, as well as new approaches to the definition of quality in pre-, intra-, and post-analytical steps. The celebration of the 50th anniversary of the CCLM journal offers the opportunity to recall some milestones in the approach to quality and patient safety and to inform our readers, as well as laboratory professionals, clinicians and all stakeholders, of the journal's commitment to keeping quality issues central to its interests in the future.

  4. User and System-Based Quality Criteria for Evaluating Information Resources and Services Available from Federal Websites: Final Report.

    ERIC Educational Resources Information Center

    Wyman, Steven K.; And Others

    This exploratory study establishes analytical tools (based on both technical criteria and user feedback) by which federal Web site administrators may assess the quality of their websites. The study combined qualitative and quantitative data collection techniques to achieve the following objectives: (1) identify and define key issues regarding…

  5. The Hazardous-Drums Project: A Multiweek Laboratory Exercise for General Chemistry Involving Environmental, Quality Control, and Cost Evaluation

    ERIC Educational Resources Information Center

    Hayes, David; Widanski, Bozena

    2013-01-01

    A laboratory experiment is described that introduces students to "real-world" hazardous waste management issues chemists face. The students are required to define an analytical problem, choose a laboratory analysis method, investigate cost factors, consider quality-control issues, interpret the meaning of results, and provide management…

  6. -Omic and Electronic Health Records Big Data Analytics for Precision Medicine

    PubMed Central

    Wu, Po-Yen; Cheng, Chih-Wen; Kaddi, Chanchala D.; Venugopalan, Janani; Hoffman, Ryan; Wang, May D.

    2017-01-01

    Objective Rapid advances of high-throughput technologies and wide adoption of electronic health records (EHRs) have led to fast accumulation of -omic and EHR data. These voluminous complex data contain abundant information for precision medicine, and big data analytics can extract such knowledge to improve the quality of health care. Methods In this article, we present -omic and EHR data characteristics, associated challenges, and data analytics including data pre-processing, mining, and modeling. Results To demonstrate how big data analytics enables precision medicine, we provide two case studies, including identifying disease biomarkers from multi-omic data and incorporating -omic information into EHR. Conclusion Big data analytics is able to address -omic and EHR data challenges for a paradigm shift towards precision medicine. Significance Big data analytics makes sense of -omic and EHR data to improve healthcare outcomes. It has a long-lasting societal impact. PMID:27740470

  7. The Social Reconstructionist Approach to Teacher Education: A Necessary Component to Achieving Excellence and Quality Education for All

    ERIC Educational Resources Information Center

    Mayne, Hope

    2014-01-01

    Improving all aspects of the quality of education is dependent on preparing teachers to become critical citizens. The social reconstructionist approach to teacher education is essential to transforming an education system defined by inequity, issues of quality, and issues of access. How do pre-service teachers perceive the mission of quality…

  8. Technology to improve quality and accountability.

    PubMed

    Kay, Jonathan

    2006-01-01

    A body of evidence has been accumulated to demonstrate that current practice is not sufficiently safe for several stages of central laboratory testing. In particular, while analytical and perianalytical steps that take place within the laboratory are subjected to quality control procedures, this is not the case for several pre- and post-analytical steps. The ubiquitous application of auto-identification technology seems to represent a valuable tool for reducing error rates. A series of projects in Oxford has attempted to improve processes which support several areas of laboratory medicine, including point-of-care testing, blood transfusion, delivery and interpretation of reports, and support of decision-making by clinicians. The key tools are auto-identification, Internet communication technology, process re-engineering, and knowledge management.

  9. Cerebrospinal fluid and blood biomarkers for neurodegenerative dementias: An update of the Consensus of the Task Force on Biological Markers in Psychiatry of the World Federation of Societies of Biological Psychiatry.

    PubMed

    Lewczuk, Piotr; Riederer, Peter; O'Bryant, Sid E; Verbeek, Marcel M; Dubois, Bruno; Visser, Pieter Jelle; Jellinger, Kurt A; Engelborghs, Sebastiaan; Ramirez, Alfredo; Parnetti, Lucilla; Jack, Clifford R; Teunissen, Charlotte E; Hampel, Harald; Lleó, Alberto; Jessen, Frank; Glodzik, Lidia; de Leon, Mony J; Fagan, Anne M; Molinuevo, José Luis; Jansen, Willemijn J; Winblad, Bengt; Shaw, Leslie M; Andreasson, Ulf; Otto, Markus; Mollenhauer, Brit; Wiltfang, Jens; Turner, Martin R; Zerr, Inga; Handels, Ron; Thompson, Alexander G; Johansson, Gunilla; Ermann, Natalia; Trojanowski, John Q; Karaca, Ilker; Wagner, Holger; Oeckl, Patrick; van Waalwijk van Doorn, Linda; Bjerke, Maria; Kapogiannis, Dimitrios; Kuiperij, H Bea; Farotti, Lucia; Li, Yi; Gordon, Brian A; Epelbaum, Stéphane; Vos, Stephanie J B; Klijn, Catharina J M; Van Nostrand, William E; Minguillon, Carolina; Schmitz, Matthias; Gallo, Carla; Lopez Mato, Andrea; Thibaut, Florence; Lista, Simone; Alcolea, Daniel; Zetterberg, Henrik; Blennow, Kaj; Kornhuber, Johannes

    2018-06-01

    In the 12 years since the publication of the first Consensus Paper of the WFSBP on biomarkers of neurodegenerative dementias, enormous advancement has taken place in the field, and the Task Force now takes the opportunity to extend and update the original paper. New concepts of Alzheimer's disease (AD) and the conceptual interactions between AD and dementia due to AD were developed, resulting in two sets of diagnostic/research criteria. Procedures for pre-analytical sample handling, biobanking, analyses and post-analytical interpretation of the results were intensively studied and optimised. A global quality control project was introduced to evaluate and monitor the inter-centre variability in measurements with the goal of harmonisation of results. Contexts of use and how to approach candidate biomarkers in biological specimens other than cerebrospinal fluid (CSF), e.g. blood, were precisely defined. Important developments were achieved in neuroimaging techniques, including studies comparing amyloid-β positron emission tomography results to fluid-based modalities. Similarly, developments in research laboratory technologies, such as ultra-sensitive methods, raise hopes of further improving the analytical and diagnostic accuracy of classic and novel candidate biomarkers. Synergistically, advancement in clinical trials of anti-dementia therapies energises and motivates the efforts to find and optimise the most reliable early diagnostic modalities. Finally, the first studies were published addressing the potential cost-effectiveness of biomarker-based diagnosis of neurodegenerative disorders.

  10. Pre-analytical and analytical variation of drug determination in segmented hair using ultra-performance liquid chromatography-tandem mass spectrometry.

    PubMed

    Nielsen, Marie Katrine Klose; Johansen, Sys Stybe; Linnet, Kristian

    2014-01-01

    Assessment of the total uncertainty of analytical methods for the measurement of drugs in human hair has mainly been derived from the analytical variation. However, in hair analysis several other sources of uncertainty contribute to the total uncertainty. Particularly in segmental hair analysis, pre-analytical variations associated with the sampling and segmentation may be significant factors in the assessment of the total uncertainty budget. The aim of this study was to develop and validate a method for the analysis of 31 common drugs in hair using ultra-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) with a focus on the assessment of both the analytical and pre-analytical sampling variations. The validated method was specific, accurate (80-120%), and precise (CV≤20%) across a wide linear concentration range from 0.025 to 25 ng/mg for most compounds. The analytical variation was estimated to be less than 15% for almost all compounds. The method was successfully applied to 25 segmented hair specimens from deceased drug addicts showing a broad pattern of poly-drug use. The pre-analytical sampling variation was estimated from the genuine duplicate measurements of two bundles of hair collected from each subject after subtraction of the analytical component. For the most frequently detected analytes, the pre-analytical variation was estimated to be 26-69%. Thus, the pre-analytical variation was 3-7-fold larger than the analytical variation (7-13%) and hence the dominant component in the total variation (29-70%). The present study demonstrated the importance of including the pre-analytical variation in the assessment of the total uncertainty budget and in the setting of the 95%-uncertainty interval (±2CV_T). Excluding the pre-analytical sampling variation could significantly affect the interpretation of results from segmental hair analysis. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
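    The variance bookkeeping in this abstract (a total CV estimated from genuine duplicates, the pre-analytical CV recovered by subtracting the analytical variance component, and a 95%-uncertainty interval of ±2CV_T) can be sketched as follows. The numeric values are illustrative, not the paper's results:

```python
import math

def cv_from_duplicates(pairs):
    """Total CV (%) from genuine duplicate measurements (x1, x2):
    Dahlberg-style estimate, var = sum(d_i^2) / (2n)."""
    n = len(pairs)
    var = sum((a - b) ** 2 for a, b in pairs) / (2 * n)
    mean = sum(a + b for a, b in pairs) / (2 * n)
    return 100.0 * math.sqrt(var) / mean

def preanalytical_cv(cv_total, cv_analytical):
    """Pre-analytical CV (%) left after subtracting the analytical
    variance component from the total variance (CVs add in quadrature)."""
    return math.sqrt(cv_total ** 2 - cv_analytical ** 2)

# Illustrative numbers (not the paper's): a total CV of 30% with an
# analytical CV of 10% leaves a pre-analytical CV of ~28.3%.
cv_pre = preanalytical_cv(30.0, 10.0)
half_width = 2 * 30.0   # 95%-uncertainty interval half-width: 2 * CV_T
print(round(cv_pre, 1), half_width)  # -> 28.3 60.0
```

Note how little the analytical component contributes once the pre-analytical CV dominates, which is the abstract's central point.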

  11. Mistakes in a stat laboratory: types and frequency.

    PubMed

    Plebani, M; Carraro, P

    1997-08-01

    Application of Total Quality Management concepts to laboratory testing requires that the total process, including preanalytical and postanalytical phases, be managed so as to reduce or, ideally, eliminate all defects within the process itself. Indeed a "mistake" can be defined as any defect during the entire testing process, from ordering tests to reporting results. We evaluated the frequency and types of mistakes found in the "stat" section of the Department of Laboratory Medicine of the University-Hospital of Padova by monitoring four different departments (internal medicine, nephrology, surgery, and intensive care unit) for 3 months. Among a total of 40,490 analyses, we identified 189 laboratory mistakes, a relative frequency of 0.47%. The distribution of mistakes was: preanalytical 68.2%, analytical 13.3%, and postanalytical 18.5%. Most of the laboratory mistakes (74%) did not affect patients' outcome. However, in 37 patients (19%), laboratory mistakes were associated with further inappropriate investigations, thus resulting in an unjustifiable increase in costs. Moreover, in 12 patients (6.4%) laboratory mistakes were associated with inappropriate care or inappropriate modification of therapy. The promotion of quality control and continuous improvement of the total testing process, including pre- and postanalytical phases, seems to be a prerequisite for an effective laboratory service.
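    The headline figure is easy to verify from the counts stated in the abstract alone:

```python
# Sanity check of the reported rate: 189 mistakes identified among
# 40,490 "stat" analyses monitored over 3 months.
total_tests = 40_490
mistakes = 189
rate_pct = 100 * mistakes / total_tests
print(f"relative frequency: {rate_pct:.2f}%")  # -> 0.47%, as reported
```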

  12. Quality Assurance in Biobanking for Pre-Clinical Research

    PubMed Central

    Simeon-Dubach, Daniel; Zeisberger, Steffen M.; Hoerstrup, Simon P.

    2016-01-01

    It is estimated that no less than USD 28 billion is spent each year in the USA alone on irreproducible pre-clinical research, which is not only a fundamental loss of investment and resources but also a strong inhibitor of efficiency for upstream processes in the translation towards clinical applications and therapies. The issues and costs of irreproducibility have mainly been reported for pre-clinical research. In contrast to pre-clinical research, test material is often transferred into humans in clinical research. To protect treated human subjects and guarantee a defined quality standard in the field of clinical research, the manufacturing and processing infrastructures have to strictly adhere to certain (inter-)national quality standards. The authors suggest that by implementing certain quality standards within the area of pre-clinical research, billions of USD might be saved and the translation of promising pre-clinical results towards clinical applications may be substantially improved. In this review, we discuss how the implementation of a quality assurance (QA) management system might improve sample quality and sustainability within pre-clinically focused biobank infrastructures. Biobanks are frequently positioned at the very beginning of the biomedical research value chain, and, since almost every research material has been stored in a biobank during the investigated life cycle, biobanking seems to be of substantial importance from this perspective. The role model of a QA-regulated biobank structure can be found in biobanks within the context of clinical research organizations, such as in regenerative medicine clusters. PMID:27781023

  13. Validation of an ultra-high-performance liquid chromatography-tandem mass spectrometry method to quantify illicit drug and pharmaceutical residues in wastewater using accuracy profile approach.

    PubMed

    Hubert, Cécile; Roosen, Martin; Levi, Yves; Karolak, Sara

    2017-06-02

    The analysis of biomarkers in wastewater has become a common approach to assess community behavior. This method is an interesting way to estimate illicit drug consumption in a given population: by using a back-calculation method, it is possible to quantify the amount of a specific drug used in a community and to assess the consumption variation at different times and locations. Such a method needs reliable analytical data, since the determination of a concentration in the ng L⁻¹ range in a complex matrix is difficult and not easily reproducible. The best analytical method is liquid chromatography-mass spectrometry after solid-phase extraction or on-line pre-concentration. Quality criteria are not specifically defined for this kind of determination. In this context, it was decided to develop a UHPLC-MS/MS method to analyze 10 illicit drugs and pharmaceuticals in wastewater treatment plant influent or effluent using an on-line pre-concentration system. A validation process was then carried out using the accuracy profile concept as an innovative tool to estimate the probability of getting prospective results within specified acceptance limits. Influent and effluent samples were spiked with known amounts of the 10 compounds and analyzed three times a day for three days in order to estimate intra-day and inter-day variations. The matrix effect was estimated for each compound. The developed method can provide at least 80% of results within ±25% limits, except for compounds that are degraded in influent. Copyright © 2017 Elsevier B.V. All rights reserved.
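    The intra-day/inter-day precision estimate underlying an accuracy profile (here, three analyses per day over three days) can be sketched as a balanced one-way nested ANOVA, replicates within days. The replicate recoveries below are invented for illustration:

```python
import statistics

def precision_components(days):
    """days: per-day lists of replicate recoveries (balanced design).
    Returns (intra-day SD, inter-day SD) via one-way ANOVA."""
    k = len(days)                   # number of days
    n = len(days[0])                # replicates per day
    day_means = [statistics.mean(d) for d in days]
    grand = statistics.mean(day_means)
    ms_within = sum((x - m) ** 2 for d, m in zip(days, day_means)
                    for x in d) / (k * (n - 1))
    ms_between = n * sum((m - grand) ** 2 for m in day_means) / (k - 1)
    var_inter = max(0.0, (ms_between - ms_within) / n)
    return ms_within ** 0.5, var_inter ** 0.5

spiked = [[98.0, 101.0, 99.0],    # day 1, % recovery (invented)
          [103.0, 104.0, 102.0],  # day 2
          [97.0, 99.0, 98.0]]     # day 3
intra, inter = precision_components(spiked)
print(round(intra, 2), round(inter, 2))  # -> 1.2 2.49
```

An accuracy profile then combines these components into a β-expectation tolerance interval per concentration level; the sketch stops at the variance split.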

  14. Inorganic chemical analysis of environmental materials—A lecture series

    USGS Publications Warehouse

    Crock, J.G.; Lamothe, P.J.

    2011-01-01

    At the request of the faculty of the Colorado School of Mines, Golden, Colorado, the authors prepared and presented a lecture series to the students of a graduate-level advanced instrumental analysis class. The slides and text presented in this report are a compilation and condensation of this series of lectures. The purpose of this report is to present the slides and notes and to emphasize the thought processes that should be used by a scientist submitting samples for analyses in order to procure analytical data to answer a research question. First and foremost, the analytical data generated can be no better than the samples submitted. The questions to be answered must first be well defined and the appropriate samples collected from the population that will answer the question. The proper methods of analysis, including proper sample preparation and digestion techniques, must then be applied. Care must be taken to achieve the required limits of detection of the critical analytes to yield detectable analyte concentrations (above "action" levels) for the majority of the study's samples, and to address what portion of those analytes answers the research question: total or partial concentrations. To guarantee a robust analytical result that answers the research question(s), a well-defined quality assurance and quality control (QA/QC) plan must be employed. This QA/QC plan must include the collection and analysis of field and laboratory blanks, sample duplicates, and matrix-matched standard reference materials (SRMs). The proper SRMs may include in-house materials and/or a selection of widely available commercial materials. A discussion of the preparation and applicability of in-house reference materials is also presented. Only when all these analytical issues are sufficiently addressed can the research questions be answered with known certainty.
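    One of the QA/QC elements mentioned, analysis of matrix-matched SRMs, reduces to a recovery check against the certified value. A minimal sketch, with an assumed (method-dependent) ±10% acceptance window:

```python
# Recovery check for a standard reference material: the measured value
# must fall within an acceptance window around the certified value.
# The 10% tolerance is an assumed, method-dependent choice.
def srm_recovery_ok(measured, certified, tolerance_pct=10.0):
    recovery = 100.0 * measured / certified
    return abs(recovery - 100.0) <= tolerance_pct

print(srm_recovery_ok(47.2, 50.0),   # 94.4% recovery -> within 10%
      srm_recovery_ok(38.0, 50.0))   # 76.0% recovery -> fails
```

The same comparison pattern applies to blanks (measured value below a reporting limit) and duplicates (relative difference within a limit).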

  15. Effects of pre-analytical variables on flow cytometric diagnosis of canine lymphoma: A retrospective study (2009-2015).

    PubMed

    Comazzi, S; Cozzi, M; Bernardi, S; Zanella, D R; Aresu, L; Stefanello, D; Marconato, L; Martini, V

    2018-02-01

    Flow cytometry (FC) is increasingly being used for immunophenotyping and staging of canine lymphoma. The aim of this retrospective study was to assess pre-analytical variables that might influence the diagnostic utility of FC of lymph node (LN) fine needle aspirate (FNA) specimens from dogs with lymphoproliferative diseases. The study included 987 cases with LN FNA specimens sent for immunophenotyping that were submitted to a diagnostic laboratory in Italy from 2009 to 2015. Cases were grouped into 'diagnostic' and 'non-diagnostic'. Pre-analytical factors analysed by univariate and multivariate analyses were animal-related factors (breed, age, sex, size), operator-related factors (year, season, shipping method, submitting veterinarian) and sample-related factors (type of sample material, cellular concentration, cytological smears, artefacts). The submitting veterinarian, sample material, sample cellularity and artefacts affected the likelihood of having a diagnostic sample. The availability of specimens from different sites and of cytological smears increased the odds of obtaining a diagnostic result. Major artefacts affecting diagnostic utility included poor cellularity and the presence of dead cells. Flow cytometry on LN FNA samples yielded conclusive results in more than 90% of cases with adequate sample quality and sampling conditions. Copyright © 2018 Elsevier Ltd. All rights reserved.
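    In a retrospective analysis like this, the univariate step for a binary factor often reduces to an odds ratio from a 2×2 table, e.g. diagnostic outcome versus availability of cytological smears. The counts below are invented, not the study's:

```python
# Odds ratio from a 2x2 table: diagnostic vs non-diagnostic outcome,
# factor present vs absent. Counts are illustrative only.
def odds_ratio(a, b, c, d):
    """a: diagnostic with factor, b: diagnostic without,
    c: non-diagnostic with factor, d: non-diagnostic without."""
    return (a * d) / (b * c)

# e.g. smears submitted: 520 diagnostic, 30 non-diagnostic;
#      no smears:        370 diagnostic, 67 non-diagnostic
print(round(odds_ratio(520, 370, 30, 67), 2))  # OR > 1: smears help
```

The multivariate step in the study fits all such factors jointly (e.g. by logistic regression) to control for confounding between them.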

  16. Standardizing in vitro diagnostics tasks in clinical trials: a call for action.

    PubMed

    Lippi, Giuseppe; Simundic, Ana-Maria; Rodriguez-Manas, Leocadio; Bossuyt, Patrick; Banfi, Giuseppe

    2016-05-01

    Translational research is defined as the process of applying ideas, insights and discoveries generated through basic scientific inquiry to the treatment or prevention of human diseases. Although precise information is lacking, several lines of evidence attest that up to 95% of early-phase studies may not translate into tangible outcomes for improving clinical management. Major theoretical hurdles exist in the translational process, but it is also undeniable that many studies may have failed for practical reasons, such as the use of inappropriate diagnostic testing for evaluating the efficacy, effectiveness or safety of a given medical intervention, or poor quality in laboratory testing. This can generate biased test results and misconceptions during data interpretation, eventually leading to no clinical benefit, possible harm, and a waste of valuable resources. From an economic perspective, it can be estimated that over 10 million euros of funding may be lost each year in clinical trials in the European Union due to preanalytical and analytical problems. These failures are mostly attributable to the heterogeneity of current guidelines and recommendations for the testing process, to the poor evidence base for basic pre-analytical, analytical and post-analytical requirements in clinical trials, and to the failure to thoughtfully integrate the perspectives of clinicians, patients, nurses and diagnostic companies in laboratory best practices. The most rational means for filling the gap between what we know and what we practice in clinical trials cannot discount the development of multidisciplinary teams, including research scientists, clinicians, nurses, patient associations and representatives of in vitro diagnostic (IVD) companies, who should actively collaborate with laboratory professionals to adapt and disseminate evidence-based recommendations about biospecimen collection and management into research settings, from preclinical to phase III studies.

  17. Application of Analytical Quality by Design concept for bilastine and its degradation impurities determination by hydrophilic interaction liquid chromatographic method.

    PubMed

    Terzić, Jelena; Popović, Igor; Stajić, Ana; Tumpa, Anja; Jančić-Stojanović, Biljana

    2016-06-05

    This paper deals with the development of a hydrophilic interaction liquid chromatographic (HILIC) method for the analysis of bilastine and its degradation impurities following the Analytical Quality by Design approach. This is the first method proposed for bilastine and its impurities. The main objective was to identify the conditions where an adequate separation in minimal analysis time could be achieved within a robust region. Critical process parameters with the most influence on method performance were defined as the acetonitrile content in the mobile phase, the pH of the aqueous phase and the ammonium acetate concentration in the aqueous phase. A Box-Behnken design was applied to establish a relationship between critical process parameters and critical quality attributes. The defined mathematical models and Monte Carlo simulations were used to identify the design space. A fractional factorial design was applied for experimental robustness testing, and the method was validated to verify the adequacy of the selected optimal conditions: analytical column Luna® HILIC (100 mm × 4.6 mm, 5 μm particle size); mobile phase consisting of acetonitrile-aqueous phase (50 mM ammonium acetate, pH adjusted to 5.3 with glacial acetic acid) (90.5:9.5, v/v); column temperature 30 °C; mobile phase flow rate 1 mL min⁻¹; detection wavelength 275 nm. Copyright © 2016 Elsevier B.V. All rights reserved.
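    The Monte Carlo design-space step can be illustrated by propagating residual noise through a fitted response model and estimating the probability that a critical quality attribute meets its target at a candidate factor setting. The model below is a toy quadratic surface with invented coefficients, centred on the reported optimum (90.5% ACN, pH 5.3, 50 mM buffer), not the published model:

```python
import random

random.seed(1)

def rs_model(acn, ph, buffer_mm):
    """Toy quadratic response surface for the resolution of the critical
    peak pair around the reported optimum (coefficients invented)."""
    return (2.1 - 0.04 * (acn - 90.5) ** 2
                + 0.30 * (ph - 5.3)
                - 0.002 * (buffer_mm - 50))

def p_success(acn, ph, buffer_mm, target=1.5, sigma=0.25, n=5000):
    """Monte Carlo estimate of P(resolution >= target) at one factor
    setting, propagating the model's residual noise."""
    hits = sum(rs_model(acn, ph, buffer_mm) + random.gauss(0, sigma) >= target
               for _ in range(n))
    return hits / n

print(p_success(90.5, 5.3, 50))  # close to 1 near the optimum
```

The design space is then the set of factor settings where this probability exceeds the chosen threshold; in the published work, the full set of critical quality attributes is evaluated jointly.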

  18. Analysis of potential genotoxic impurities in rabeprazole active pharmaceutical ingredient via Liquid Chromatography-tandem Mass Spectrometry, following quality-by-design principles for method development.

    PubMed

    Iliou, Katerina; Malenović, Anđelija; Loukas, Yannis L; Dotsikas, Yannis

    2018-02-05

    A novel liquid chromatography-tandem mass spectrometry (LC-MS/MS) method is presented for the quantitative determination of two potential genotoxic impurities (PGIs) in rabeprazole active pharmaceutical ingredient (API). In order to overcome the analytical challenges in the trace analysis of PGIs, a development procedure supported by Quality-by-Design (QbD) principles was evaluated. Efficient separation between rabeprazole and the two PGIs in the shortest analysis time was set as the defined analytical target profile (ATP); to this end, a switching valve allowed the flow to be sent to waste while rabeprazole was eluted. The selected critical quality attributes (CQAs) were the separation criterion s between the critical peak pair and the capacity factor k of the last eluted compound. The effect of the following critical process parameters (CPPs) on the CQAs was studied: the ACN content, the pH and the concentration of the buffer salt in the mobile phase, as well as the stationary phase of the analytical column. A D-optimal design was implemented to set the plan of experiments with a UV detector. In order to define the design space, Monte Carlo simulations with 5000 iterations were performed. Acceptance criteria were met for a C8 column (50 × 4 mm, 5 μm), and the region having probability π ≥ 95% of achieving satisfactory values of all defined CQAs was computed. The working point was selected with a mobile phase consisting of ACN and 11 mM ammonium formate at a ratio of 31:69 (v/v), the aqueous phase adjusted to pH 6.8. The LC protocol was transferred to LC-MS/MS and validated according to ICH guidelines. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Closing the brain-to-brain loop in laboratory testing.

    PubMed

    Plebani, Mario; Lippi, Giuseppe

    2011-07-01

    The delivery of laboratory services was described 40 years ago and defined by the foremost concept of the "brain-to-brain turnaround time loop". This concept consists of several processes, including the final step, which is the action undertaken on the patient based on laboratory information. Unfortunately, the need for systematic feedback to improve the value of laboratory services has been poorly understood and, more riskily, poorly applied in daily laboratory practice. Currently, major problems arise from the unavailability of consensually accepted quality specifications for the extra-analytical phase of laboratory testing. This, in turn, does not allow clinical laboratories to calculate a budget for the "patient-related total error". The definition and use of the term "total error" refers only to the analytical phase, and should be better defined as "total analytical error" to avoid any confusion and misinterpretation. According to the hierarchical approach to classifying strategies to set analytical quality specifications, the "assessment of the effect of analytical performance on specific clinical decision-making" is at the top and therefore should be applied as much as possible to direct analytical efforts towards effective goals. In addition, an increasing number of laboratories worldwide are adopting risk management strategies such as FMEA, FRACAS, LEAN and Six Sigma, since these techniques allow identification of the most critical steps in the total testing process and reduction of the patient-related risk of error. As a matter of fact, an increasing number of laboratory professionals recognize the importance of understanding and monitoring every step in the total testing process, including the appropriateness of the test request as well as the appropriate interpretation and utilization of test results.

  20. Analytical chemistry of PCBs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erickson, M.D.

    Analytical Chemistry of PCBs offers a review of the physical, chemical, commercial, environmental and biological properties of PCBs. It also defines and discusses six discrete steps of analysis: sampling, extraction, cleanup, determination, data reduction, and quality assurance. The final chapter provides a discussion of collaborative testing: the ultimate step in method evaluation. Dr. Erickson also provides a bibliography of over 1200 references, critical reviews of primary literature, and five appendices which present ancillary material on PCB nomenclature, physical properties, composition of commercial mixtures, mass spectra characteristics, and PGC/ECD chromatograms.

  1. Quality performance of laboratory testing in pharmacies: a collaborative evaluation.

    PubMed

    Zaninotto, Martina; Miolo, Giorgia; Guiotto, Adriano; Marton, Silvia; Plebani, Mario

    2016-11-01

    The quality performance of pharmacy point-of-care testing (POCT) and the comparability of its results with those of an institutional laboratory were evaluated. Eight pharmacies participated in the project: for each pharmacy customer (n=106), a capillary specimen collected by the pharmacist and, simultaneously, a lithium-heparin sample drawn by a laboratory medicine physician were analyzed in the pharmacy and in the laboratory, respectively. Glucose, cholesterol, HDL-cholesterol, triglycerides, creatinine, uric acid, aspartate aminotransferase and alanine aminotransferase were measured using Reflotron (n=5), Samsung (n=1), Cardiocheck PA (n=1) and Cholestech LDX (n=1) in the pharmacies, and Cobas 8000 in the laboratory. The POCT analytical performance alone (phase 2) was evaluated by testing, in the pharmacies and in the laboratory, lithium-heparin samples drawn daily for a week from one fasting female subject, together with a control sample containing high concentrations of glucose, cholesterol and triglycerides. For all parameters except triglycerides, the slopes showed a satisfactory correlation. For triglycerides, the median value was higher in POCT than in the laboratory (1.627 mmol/L vs. 0.950 mmol/L). Regarding agreement in subject classification, for glucose 70% of the subjects showed concentrations below the POCT recommended level (5.8-6.1 mmol/L), while 56% were below the laboratory limit (<5.6 mmol/L). Total cholesterol exhibited a similar trend, while POCT triglycerides showed a greater percentage of increased values (21% vs. 9%). The reduction in triglycerides bias in phase 2 suggests that the differences between POCT and the central laboratory are attributable to a pre-analytical problem. The results confirm the acceptable analytical performance of pharmacy POCT but also specific critical issues in the pre- and post-analytical phases.
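    The subject-classification comparison above amounts to counting results that fall below each decision cutoff. A minimal sketch with hypothetical paired results (the function and data are illustrative, not the study's data):

```python
def pct_below(values, cutoff):
    """Percentage of results strictly below a decision cutoff."""
    return 100.0 * sum(v < cutoff for v in values) / len(values)

# Hypothetical paired glucose results (mmol/L)
poct = [5.0, 5.5, 6.3, 4.9, 6.0]
lab = [5.2, 5.6, 6.5, 5.0, 6.2]

print(pct_below(poct, 5.8))  # share below the POCT cutoff -> 60.0
print(pct_below(lab, 5.6))   # share below the laboratory cutoff -> 40.0
```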

  2. Implementation of a new 'community' laboratory CD4 service in a rural health district in South Africa extends laboratory services and substantially improves local reporting turnaround time.

    PubMed

    Coetzee, L M; Cassim, N; Glencross, D K

    2015-12-16

    The CD4 integrated service delivery model (ITSDM) provides for reasonable access to pathology services across South Africa (SA) by offering three new service tiers that extend services into remote, under-serviced areas. ITSDM identified Pixley ka Seme as such an under-serviced district. To address the poor service delivery in this area, a new ITSDM community (tier 3) laboratory was established in De Aar, SA. Laboratory performance and turnaround time (TAT) were monitored post implementation to assess the impact on local service delivery. Using the National Health Laboratory Service Corporate Data Warehouse, CD4 data were extracted for the period April 2012-July 2013 (n=11,964). Total mean TAT (in hours) was calculated and its pre-analytical and analytical components assessed. Ongoing testing volumes, as well as external quality assessment performance across ten trials, were used to indicate post-implementation success. Data were analysed using Stata 12. Prior to the implementation of CD4 testing at De Aar, the total mean TAT was 20.5 hours. This fell to 8.2 hours post implementation, predominantly because the pre-analytical mean TAT fell from 18.9 to 1.8 hours. The analytical testing TAT remained unchanged after implementation, and monthly test volumes increased by up to 20%. External quality assessment indicated adequate performance. Although subjective, questionnaire responses from facilities reported improved service delivery. Establishing CD4 testing in a remote community laboratory substantially reduces overall TAT. Additional community CD4 laboratories should be established in under-serviced areas, especially where laboratory infrastructure is already in place.
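    The TAT decomposition described above can be computed directly from per-sample timestamps. A minimal sketch with hypothetical records (collection, laboratory receipt, result release), not the actual study data:

```python
from datetime import datetime

FMT = "%Y-%m-%d %H:%M"

def mean_hours(pairs):
    """Mean elapsed time in hours over (start, end) datetime pairs."""
    return sum((end - start).total_seconds() / 3600 for start, end in pairs) / len(pairs)

# Hypothetical records: collected, received in lab, result released
records = [
    ("2013-01-10 08:00", "2013-01-10 09:30", "2013-01-10 12:00"),
    ("2013-01-10 10:00", "2013-01-10 12:00", "2013-01-10 15:00"),
]
parsed = [tuple(datetime.strptime(t, FMT) for t in r) for r in records]

pre_tat = mean_hours([(c, r) for c, r, _ in parsed])         # pre-analytical component
analytical_tat = mean_hours([(r, d) for _, r, d in parsed])  # analytical component
print(pre_tat, analytical_tat, pre_tat + analytical_tat)     # -> 1.75 2.75 4.5
```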

  3. Measuring myokines with cardiovascular functions: pre-analytical variables affecting the analytical output.

    PubMed

    Lombardi, Giovanni; Sansoni, Veronica; Banfi, Giuseppe

    2017-08-01

    In the last few years, a growing number of molecules have been associated with an endocrine function of the skeletal muscle. Circulating myokine levels, in turn, have been associated with several pathophysiological conditions, including cardiovascular ones. However, data from different studies are often not completely comparable, or even discordant. This may be due, at least in part, to the whole set of conditions related to the preparation of the patient prior to blood sampling, the blood sampling procedure itself, and sample processing and/or storage. This entire process constitutes the pre-analytical phase. The importance of the pre-analytical phase is often underestimated; yet, in routine diagnostics, about 70% of errors occur in this phase. Moreover, errors arising in the pre-analytical phase are carried over into the analytical phase and affect the final output. In research, for example, when samples are collected over a long time and by different laboratories, standardized procedures for sample collection and correct procedures for sample storage are essential. In this review, we discuss the pre-analytical variables potentially affecting the measurement of myokines with cardiovascular functions.

  4. Is a pre-analytical process for urinalysis required?

    PubMed

    Petit, Morgane; Beaudeux, Jean-Louis; Majoux, Sandrine; Hennequin, Carole

    2017-10-01

    For the reliable urinary measurement of calcium, phosphate and uric acid, a pre-analytical step of adding acid or base to urine samples at the laboratory is recommended in order to dissolve precipitated solutes. Several studies on different kinds of samples and analysers have previously shown that such pre-analytical treatment is unnecessary. The objective was to study the necessity of pre-analytical treatment of urine samples collected using the V-Monovette® (Sarstedt) system and measured on the Architect C16000 analyser (Abbott Diagnostics). Sixty urinary samples from hospitalized patients were selected (n=30 for calcium and phosphate, and n=30 for uric acid). After acidification of urine samples for the measurement of calcium and phosphate, and alkalinisation for the measurement of uric acid, differences between results before and after the pre-analytical treatment were compared to the acceptable limits recommended by the French Society of Clinical Biology (SFBC). No difference in concentration between before and after pre-analytical treatment exceeded the SFBC acceptable limits for the measurement of calcium and uric acid. For phosphate, only one sample exceeded these limits, showing a result paradoxically lower after acidification. In conclusion, in agreement with previous studies, our results show that acidification or alkalinisation of urine samples, whether from 24-h collections or spot urine, is not a pre-analytical necessity for the measurement of calcium, phosphate and uric acid.
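    The comparison against SFBC acceptable limits reduces to checking the relative difference between paired pre- and post-treatment results. A minimal sketch; the values and the 10% limit are illustrative, not the actual SFBC figures:

```python
def within_limit(before, after, limit_pct):
    """True if the relative change introduced by the treatment
    stays within the acceptable limit (in percent)."""
    return abs(after - before) / before * 100 <= limit_pct

# Hypothetical urinary calcium results (mmol/L) before/after acidification
print(within_limit(3.2, 3.3, 10.0))  # -> True (change within limit)
print(within_limit(3.2, 4.0, 10.0))  # -> False (change exceeds limit)
```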

  5. Cost effectiveness of adopted quality requirements in hospital laboratories.

    PubMed

    Hamza, Alneil; Ahmed-Abakur, Eltayib; Abugroun, Elsir; Bakhit, Siham; Holi, Mohamed

    2013-01-01

    The present study used a quasi-experimental design to assess the adoption of the essential clauses of clinical laboratory quality management requirements based on the International Organization for Standardization standard ISO 15189 in hospital laboratories, and to evaluate the cost effectiveness of compliance with ISO 15189. The quality management intervention based on ISO 15189 proceeded through three phases: a pre-intervention phase, an intervention phase and a post-intervention phase. In the pre-intervention phase, compliance with ISO 15189 was 49% for the study group vs. 47% for the control group (P=0.48), while the post-intervention results were 54% vs. 79% for the study group and control group respectively, a statistically significant difference (P=0.00), with an effect size (Cohen's d) of 0.00 in the pre-intervention phase and 0.99 in the post-intervention phase. The annual average cost per test for the study group and control group was 1.80 ± 0.25 vs. 1.97 ± 0.39 respectively (P=0.39), whereas the post-intervention annual average total cost per test for the study group and control group was 1.57 ± 0.23 vs. 2.08 ± 0.38 (P=0.019), with a cost-effectiveness ratio of 0.88 in the pre-intervention phase and 0.52 in the post-intervention phase. The planned adoption of quality management system (QMS) requirements in clinical laboratories had a marked effect in increasing the percentage of compliance with the quality management system requirements, raising average total cost effectiveness, and improving the analytical process capability of the testing procedures.
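    The effect size reported above is Cohen's d, which standardizes a mean difference by the pooled standard deviation. A minimal sketch with toy compliance scores (illustrative, not the study's raw data):

```python
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Cohen's d: mean difference divided by the pooled standard deviation."""
    na, nb = len(group_a), len(group_b)
    sa, sb = stdev(group_a), stdev(group_b)
    pooled = (((na - 1) * sa**2 + (nb - 1) * sb**2) / (na + nb - 2)) ** 0.5
    return (mean(group_a) - mean(group_b)) / pooled

# Toy compliance percentages for two groups of laboratories
a = [79, 81, 78, 80]
b = [54, 56, 53, 55]
print(cohens_d(a, b))  # a large mean gap relative to spread gives a very large d
```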

  6. PepsNMR for 1H NMR metabolomic data pre-processing.

    PubMed

    Martin, Manon; Legat, Benoît; Leenders, Justine; Vanwinsberghe, Julien; Rousseau, Réjane; Boulanger, Bruno; Eilers, Paul H C; De Tullio, Pascal; Govaerts, Bernadette

    2018-08-17

    In the analysis of biological samples, control over experimental design and data acquisition procedures alone cannot ensure well-conditioned 1H NMR spectra with maximal information recovery for data analysis. A third major element affects the accuracy and robustness of results: data pre-processing/pre-treatment, to which not enough attention is usually devoted, particularly in metabolomic studies. The usual approach is to use the proprietary software provided by the analytical instrument manufacturers to conduct the entire pre-processing strategy. This widespread practice has a number of advantages, such as a user-friendly interface with graphical facilities, but it involves non-negligible drawbacks: a lack of methodological information and automation, a dependency on subjective human choices, only standard processing options, and an absence of objective quality criteria for evaluating pre-processing quality. This paper introduces PepsNMR, an R package dedicated to the whole processing chain prior to multivariate data analysis, including, among other tools, solvent signal suppression, internal calibration, phase, baseline and misalignment corrections, bucketing and normalisation. Methodological aspects are discussed and the package is compared to the gold-standard procedure with two metabolomic case studies. The use of PepsNMR on these data shows better information recovery and predictive power based on objective and quantitative quality criteria. Other key assets of the package are workflow processing speed, reproducibility, reporting and flexibility, graphical outputs and documented routines. Copyright © 2018 Elsevier B.V. All rights reserved.
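    As a hedged illustration of one of the simplest steps in the pre-processing chain described above (this is not PepsNMR's actual API, just a generic constant-sum normalisation sketched in Python):

```python
def normalise_constant_sum(spectrum, target=100.0):
    """Scale spectral intensities so that they sum to `target`,
    a simple normalisation often applied before multivariate analysis."""
    total = sum(spectrum)
    return [v / total * target for v in spectrum]

norm = normalise_constant_sum([2.0, 3.0, 5.0])
print(norm)  # -> [20.0, 30.0, 50.0]
```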

  7. Defining a roadmap for harmonizing quality indicators in Laboratory Medicine: a consensus statement on behalf of the IFCC Working Group "Laboratory Error and Patient Safety" and EFLM Task and Finish Group "Performance specifications for the extra-analytical phases".

    PubMed

    Sciacovelli, Laura; Panteghini, Mauro; Lippi, Giuseppe; Sumarac, Zorica; Cadamuro, Janne; Galoro, César Alex De Olivera; Pino Castro, Isabel Garcia Del; Shcolnik, Wilson; Plebani, Mario

    2017-08-28

    Improving the quality of laboratory testing requires a deep understanding of the many vulnerable steps involved in the total examination process (TEP), along with the identification of a hierarchy of risks and challenges that need to be addressed. From this perspective, the Working Group "Laboratory Errors and Patient Safety" (WG-LEPS) of the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) is focusing its activity on the implementation of an efficient tool for obtaining meaningful information on the risk of errors developing throughout the TEP, and for establishing reliable information about error frequencies and their distribution. More recently, the European Federation of Clinical Chemistry and Laboratory Medicine (EFLM) has created the Task and Finish Group "Performance specifications for the extra-analytical phases" (TFG-PSEP) for defining performance specifications for the extra-analytical phases. Both the IFCC and EFLM groups are working to provide laboratories with a system to evaluate their performance and recognize the critical aspects where improvement actions are needed. A Consensus Conference was organized in Padova, Italy, in 2016 to bring together the experts and interested parties to achieve a consensus for the effective harmonization of quality indicators (QIs). General agreement was achieved, and the main outcomes have been the release of a new version of the model of quality indicators (MQI), the approval of a criterion for establishing performance specifications, and the definition of the type of information that should be provided in the report to the clinical laboratories participating in the QIs project.

  8. Fifteen years of external quality assessment in leukemia/lymphoma immunophenotyping in The Netherlands and Belgium: A way forward.

    PubMed

    Preijers, Frank W M B; van der Velden, Vincent H J; Preijers, Tim; Brooimans, Rik A; Marijt, Erik; Homburg, Christa; van Montfort, Kees; Gratama, Jan W

    2016-05-01

    In 1985, external quality assurance was initiated in the Netherlands to reduce the between-laboratory variability of leukemia/lymphoma immunophenotyping and to improve diagnostic conclusions. This program consisted of regular distributions of test samples followed by biannual plenary participant meetings in which results were presented and discussed. A scoring system was developed in which the quality of results was rated by systematically reviewing the pre-analytical, analytical, and post-analytical assay stages using three scores, i.e., correct (A), minor fault (B), and major fault (C). Here, we report on 90 consecutive samples distributed to 40-61 participating laboratories between 1998 and 2012. Most samples contained >20% aberrant cells, mainly selected from mature lymphoid malignancies (B or T cell) and acute leukemias (myeloid or lymphoblastic). In 2002, minimally required monoclonal antibody (mAb) panels were introduced, whilst methodological guidelines for all three assay stages were implemented. Retrospectively, we divided the study into subsequent periods of 4 ("initial"), 4 ("learning"), and 7 years ("consolidation") to detect "learning effects." Uni- and multivariate models showed that analytical performance declined since 2002, but that post-analytical performance improved during the entire period. These results emphasized the need to improve technical aspects of the assay, and reflected improved interpretational skills of the participants. A strong effect of participant affiliation in all three assay stages was observed: laboratories in academic and large peripheral hospitals performed significantly better than those in small hospitals. © 2015 International Clinical Cytometry Society.

  9. International Council for Standardization in Haematology (ICSH) Recommendations for Laboratory Measurement of Direct Oral Anticoagulants.

    PubMed

    Gosselin, Robert C; Adcock, Dorothy M; Bates, Shannon M; Douxfils, Jonathan; Favaloro, Emmanuel J; Gouin-Thibault, Isabelle; Guillermo, Cecilia; Kawai, Yohko; Lindhoff-Last, Edelgard; Kitchen, Steve

    2018-03-01

    This guidance document was prepared on behalf of the International Council for Standardization in Haematology (ICSH), which provides haemostasis-related guidance documents for clinical laboratories. This inaugural coagulation ICSH document was developed by an ad hoc committee comprised of international clinical and laboratory direct acting oral anticoagulant (DOAC) experts. The committee developed consensus recommendations for the laboratory measurement of DOACs (dabigatran, rivaroxaban, apixaban and edoxaban), which would be germane for laboratories assessing DOAC anticoagulation. This guidance document addresses all phases of laboratory DOAC measurement, including pre-analytical (e.g. preferred time of sample collection, preferred sample type, sample stability), analytical (gold standard method, screening and quantifying methods) and post-analytical (e.g. reporting units, quality assurance). The committee addressed the use and limitations of screening tests such as prothrombin time and activated partial thromboplastin time, as well as viscoelastic measurements of clotting blood and point-of-care methods. Additionally, the committee provided recommendations for the proper validation or verification of the performance of laboratory assays prior to implementation for clinical use, and for external quality assurance to provide continuous assessment of testing and reporting methods. Schattauer GmbH Stuttgart.

  10. Pre-analytic and analytic sources of variations in thiopurine methyltransferase activity measurement in patients prescribed thiopurine-based drugs: A systematic review.

    PubMed

    Loit, Evelin; Tricco, Andrea C; Tsouros, Sophia; Sears, Margaret; Ansari, Mohammed T; Booth, Ronald A

    2011-07-01

    Low thiopurine S-methyltransferase (TPMT) enzyme activity is associated with increased thiopurine drug toxicity, particularly myelotoxicity. Pre-analytic and analytic variables for TPMT genotype and phenotype (enzyme activity) testing were reviewed. A systematic literature review was performed, and diagnostic laboratories were surveyed. Thirty-five studies reported relevant data for pre-analytic variables (patient age, gender, race, hematocrit, co-morbidity, co-administered drugs and specimen stability) and thirty-three for analytic variables (accuracy, reproducibility). TPMT is stable in blood when stored for up to 7 days at room temperature, and 3 months at -30°C. Pre-analytic patient variables do not affect TPMT activity. Fifteen drugs studied to date exerted no clinically significant effects in vivo. Enzymatic assay is the preferred technique. Radiochemical and HPLC techniques had intra- and inter-assay coefficients of variation (CVs) below 10%. TPMT is a stable enzyme, and its assay is not affected by age, gender, race or co-morbidity. Copyright © 2011. Published by Elsevier Inc.
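    The intra- and inter-assay coefficients of variation cited above follow the standard definition CV = SD / mean × 100. A minimal sketch with hypothetical replicate values:

```python
from statistics import mean, stdev

def cv_percent(replicates):
    """Coefficient of variation in percent: SD / mean * 100."""
    return stdev(replicates) / mean(replicates) * 100

# Hypothetical intra-assay TPMT activity replicates (arbitrary units)
runs = [12.1, 12.4, 11.9, 12.2, 12.0]
print(cv_percent(runs))         # about 1.6%
print(cv_percent(runs) < 10.0)  # meets a 10% CV acceptance criterion -> True
```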

  11. Thawing as a critical pre-analytical step in the lipidomic profiling of plasma samples: New standardized protocol.

    PubMed

    Pizarro, Consuelo; Arenzana-Rámila, Irene; Pérez-del-Notario, Nuria; Pérez-Matute, Patricia; González-Sáiz, José María

    2016-03-17

    Lipid profiling is a promising tool for the discovery and subsequent identification of biomarkers associated with various diseases. However, data quality is highly dependent on the pre-analytical methods employed. To date, potential confounding factors that may affect lipid metabolite levels after the thawing of plasma for biomarker exploration studies have not been thoroughly evaluated. In this study, by means of experimental design methodology, we performed the first in-depth examination of the ways in which thawing conditions affect lipid metabolite levels. After the optimization stage, we concluded that temperature, sample volume and the thawing method were the determining factors that had to be exhaustively controlled in the thawing process to ensure the quality of biomarker discovery. The best thawing conditions were found to be 4 °C, 0.25 mL of human plasma and ultrasound (US) thawing. The newly proposed US thawing method was quicker than the other methods we studied, allowed more features to be identified and increased the lipid signal. In view of its speed, efficiency and detectability, the US thawing method appears to be a simple, economical method for the thawing of plasma samples, which could easily be applied in clinical laboratories before lipid profiling studies. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. Feasibility and quality development of biomaterials in the pretest studies of the German National Cohort.

    PubMed

    Kühn, A; Nieters, A; Köttgen, A; Goek, O N; Michels, K; Nöthlings, U; Jacobs, G; Meisinger, C; Pessler, F; Akmatov, M F; Kühnisch, J; Moebus, S; Glocker, E; Naus, S; Keimling, M; Leitzmann, M; Linseisen, J; Sarioglu, H; von Toerne, C; Hauck, S M; Wallaschofski, H; Wichmann, H E; Illig, Thomas

    2014-11-01

    The German National Cohort (GNC) is designed to address research questions concerning a wide range of possible causes of major chronic diseases (e.g. cancer, diabetes, infectious, allergic, neurologic and cardiovascular diseases) as well as to identify risk factors and prognostic biomarkers for early diagnosis and prevention of these diseases. The collection of biomaterials, in combination with extensive information from questionnaires and medical examinations, represents one of the central study components. In two pretest studies of the German National Cohort conducted between 2011 and 2013, a range of biomaterials from a defined number of participants was collected. Ten study centres were involved in pretest 1 and 18 study centres in pretest 2. Standard operating procedures (SOPs) were developed and evaluated to minimize pre-analytical artefacts during biosample collection. Within the pretest studies, different aspects concerning the feasibility of sample collection/preparation [pretest 1 (a)] and the quality control of biomarker and proteome analyses [pretest 1 (b), (c)] were investigated. Additionally, the recruitment of study participants for specific projects and the examination procedures of all study centres in a defined time period according to common standards, as well as the transportation and decentralized storage of biological samples, were tested (pretest 2). These analyses will serve as the basis for the biomaterial collection in the main study of the GNC starting in 2014. Participants randomly chosen from the population (n = 1000 subjects recruited at ten study sites in pretest 1) were asked to donate blood, urine, saliva and stool samples. Additionally, nasal and oropharyngeal swabs were collected at the study sites, and nasal swabs were collected by the participants at home. SOPs for sample collection, preparation, storage and transportation were developed and adopted for pretest 2. In pretest 2, 18 study sites (n = 599 subjects) collected biomaterials mostly identical to pretest 1. Biomarker analyses were performed to test the quality of the biomaterials. In pretests 1 and 2, it was feasible to collect all biomaterials from nearly all invited participants without major problems; the mean response rate of the subjects was 95%. As one important result, we found, for example, that after blood draw the cellular fraction should be separated from the plasma and serum fractions within the first hour, with no significant variation for up to 6 h at 4 °C for all analysed biomarkers. Moreover, quality control of samples using a proteomics approach showed no significant clustering of proteins according to different storage conditions. All developed SOPs were validated for use in the main study after some adaptation and modification. Additionally, electronic and paper documentation sheets were developed and tested to record time stamps, volumes, freezing times, and aliquot numbers of the collected biomaterials. The collection of the biomaterials was feasible without major problems at all participating study sites; however, the processing times were in some cases too long. To avoid pre-analytical artefacts in sample collection, appropriate standardisation among the study sites is necessary. To achieve this, blood and urine collection will have to be adapted to the specific conditions of use of liquid-handling robots, which will be available at all participating study centres in the main study of the GNC. Strict compliance with the SOPs, thorough training of the staff and accurate documentation are mandatory to obtain high sample quality for later analyses. The biomaterials thus obtained represent a valuable resource for research on infectious and other common complex diseases in the GNC.

  13. Sleep state classification using pressure sensor mats.

    PubMed

    Baran Pouyan, M; Nourani, M; Pompeo, M

    2015-08-01

    Sleep state detection is valuable in assessing a patient's sleep quality and general in-bed behavior. In this paper, a novel approach to classifying sleep states (sleep, pre-wake, wake) is proposed that uses only surface pressure sensors. In our method, a mobility metric is defined based on successive body pressure maps. Suitable statistical features are then computed from the mobility metric. Finally, a customized random forest classifier is employed to identify the various classes, including a new class for the pre-wake state. Our algorithm achieves 96.1% and 88% accuracy for two-class (sleep, wake) and three-class (sleep, pre-wake, wake) identification, respectively.
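    The mobility metric is described only at a high level in the abstract; one plausible minimal sketch is the mean absolute cell-wise change between two successive pressure maps (an assumption for illustration, not the paper's exact definition):

```python
def mobility(prev_map, curr_map):
    """Mean absolute cell-wise change between successive pressure maps;
    larger values indicate more in-bed movement."""
    diffs = [abs(c - p)
             for prow, crow in zip(prev_map, curr_map)
             for p, c in zip(prow, crow)]
    return sum(diffs) / len(diffs)

still = mobility([[1, 1], [2, 2]], [[1, 1], [2, 2]])
moving = mobility([[1, 1], [2, 2]], [[3, 0], [4, 1]])
print(still, moving)  # -> 0.0 1.5 (movement scores higher)
```

    Statistical features of this metric over a time window (mean, variance, and so on) could then feed a random forest classifier, as the abstract describes.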

  14. Comprehensive quality evaluation of medical Cannabis sativa L. inflorescence and macerated oils based on HS-SPME coupled to GC-MS and LC-HRMS (q-exactive orbitrap®) approach.

    PubMed

    Calvi, Lorenzo; Pentimalli, Daniela; Panseri, Sara; Giupponi, Luca; Gelmini, Fabrizio; Beretta, Giangiacomo; Vitali, Davide; Bruno, Massimo; Zilio, Emanuela; Pavlovic, Radmila; Giorgi, Annamaria

    2018-02-20

    There are at least 554 identified compounds in C. sativa L., among them 113 phytocannabinoids and 120 terpenes. Differences in phytocomplex composition underlying the pharmaceutical properties of different medical cannabis chemotypes have been attributed to strict interactions between cannabinoids and terpenes, defined as the 'entourage effect', resulting from synergic action. The chemical complexity of its bioactive constituents highlights the need for standardised and well-defined analytical approaches able to characterise the plant chemotype and the herbal drug quality, as well as to monitor the quality of pharmaceutical cannabis extracts and preparations. Hence, in the first part of this study, analytical procedures involving the combination of headspace solid-phase microextraction (HS-SPME) coupled to GC-MS and high-resolution mass spectrometry LC-HRMS (Orbitrap®) were set up, validated and applied for the in-depth profiling and fingerprinting of cannabinoids and terpenes in two authorised medical-grade varieties of Cannabis sativa L. inflorescences (Bedrocan® and Bediol®) and in the macerated oils obtained from them. To better understand the trend of all volatile compounds and cannabinoids during oil storage, a new procedure for cannabis macerated oil preparation without any thermal step was tested and compared with the existing conventional methods to assess the potentially detrimental effect of heating on overall product quality. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. The Effects of Combinations of Cognitive Impairment and Pre-frailty on Adverse Outcomes from a Prospective Community-Based Cohort Study of Older Chinese People.

    PubMed

    Yu, Ruby; Morley, John E; Kwok, Timothy; Leung, Jason; Cheung, Osbert; Woo, Jean

    2018-01-01

    To examine how various combinations of cognitive impairment (overall performance and specific domains) and pre-frailty predict risks of adverse outcomes; and to determine whether cognitive frailty may be defined as the combination of cognitive impairment and the presence of pre-frailty. Community-based cohort study. Chinese men and women (n = 3,491) aged 65+ without dementia, Parkinson's disease and/or frailty at baseline. Frailty was characterized using the Cardiovascular Health Study criteria. Overall cognitive impairment was defined by a Cantonese Mini-Mental Status Examination (CMMSE) total score (<21/24/27, depending on participants' educational levels); delayed recall impairment by a CMMSE delayed recall score (<3); and language and praxis impairment by a CMMSE language and praxis score (<9). Adverse outcomes included poor quality of life, physical limitation, increased cumulative hospital stay, and mortality. Compared to those who were robust and cognitively intact at baseline, those who were robust but cognitively impaired were more likely to develop pre-frailty/frailty after 4 years (P < 0.01). Compared to participants who were robust and cognitively intact at baseline, those who were pre-frail and with overall cognitive impairment had lower grip strength (P < 0.05), lower gait speed (P < 0.01), poorer lower limb strength (P < 0.01), and poorer delayed recall at year 4 [OR, 1.6; 95% confidence interval (CI), 1.2-2.3]. They were also associated with increased risks of poor quality of life (OR, 1.5; 95% CI, 1.1-2.2) and incident physical limitation at year 4 (OR, 1.8; 95% CI, 1.3-2.5), increased cumulative hospital stay at year 7 (OR, 1.5; 95% CI, 1.1-2.1), and mortality over an average of 12 years (OR, 1.5; 95% CI, 1.0-2.1) after adjustment for covariates. There was no significant difference in the risks of adverse outcomes between participants who were pre-frail with or without cognitive impairment at baseline. Similar results were obtained with delayed recall and language and praxis impairments. Robust and cognitively impaired participants had higher risks of becoming pre-frail/frail over 4 years compared with those with normal cognition. Cognitive impairment characterized by the CMMSE overall score or its individual domain scores improved the predictive power of pre-frailty for poor quality of life, incident physical limitation, increased cumulative hospital stay, and mortality. Our findings support the concept that cognitive frailty may be defined as the occurrence of both cognitive impairment and pre-frailty, not necessarily progressing to dementia.

  16. Questa Baseline and Pre-Mining Ground-Water Quality Investigation. 1. Depth to Bedrock Determinations Using Shallow Seismic Data Acquired in the Straight Creek Drainage Near Red River, New Mexico

    USGS Publications Warehouse

    Powers, Michael H.; Burton, Bethany L.

    2004-01-01

    In late May and early June of 2002, the U.S. Geological Survey (USGS) acquired four P-wave seismic profiles across the Straight Creek drainage near Red River, New Mexico. The data were acquired to support a larger effort to investigate baseline and pre-mining ground-water quality in the Red River basin (Nordstrom and others, 2002). For ground-water flow modeling, knowledge of the thickness of the valley fill material above the bedrock is required. When curved-ray refraction tomography was used with the seismic first arrival times, the resulting images of interval velocity versus depth clearly show a sharp velocity contrast where the bedrock interface is expected. The images show that the interpreted buried bedrock surface is neither smooth nor sharp, but it is clearly defined across the valley along the seismic line profiles. The bedrock models defined by the seismic refraction images are consistent with the well data.

  17. Integrity, standards, and QC-related issues with big data in pre-clinical drug discovery.

    PubMed

    Brothers, John F; Ung, Matthew; Escalante-Chong, Renan; Ross, Jermaine; Zhang, Jenny; Cha, Yoonjeong; Lysaght, Andrew; Funt, Jason; Kusko, Rebecca

    2018-06-01

    The tremendous expansion of data analytics and of public and private big datasets presents an important opportunity for pre-clinical drug discovery and development. In the field of life sciences, the growth of genetic, genomic, transcriptomic and proteomic data is partly driven by a rapid decline in experimental costs as biotechnology improves throughput, scalability, and speed. Yet far too many researchers tend to underestimate the challenges and consequences involving data integrity and quality standards. Given the effect of data integrity on scientific interpretation, these issues have significant implications during preclinical drug development. We describe standardized approaches for maximizing the utility of publicly available or privately generated biological data and address some of the common pitfalls. We also discuss the increasing interest in integrating and interpreting cross-platform data. The principles outlined here should serve as a useful broad guide for existing analytical practices and pipelines, and as a tool for developing additional insights into therapeutics using big data. Copyright © 2018 Elsevier Inc. All rights reserved.

  18. Towards European urinalysis guidelines. Introduction of a project under European Confederation of Laboratory Medicine.

    PubMed

    Kouri, T T; Gant, V A; Fogazzi, G B; Hofmann, W; Hallander, H O; Guder, W G

    2000-07-01

    Improved standardized performance is needed because urinalysis continues to be one of the most frequently requested laboratory tests. Since 1997, the European Confederation of Laboratory Medicine (ECLM) has been supporting an interdisciplinary project aiming to produce European urinalysis guidelines. More than seventy clinical chemists, microbiologists and ward-based clinicians, as well as representatives of manufacturers are taking part. These guidelines aim to improve the quality and consistency of chemical urinalysis, particle counting and bacterial culture by suggesting optimal investigative processes that could be applied in Europe. The approach is based on medical needs for urinalysis. The importance of the pre-analytical stage for total quality is stressed by detailed illustrative advice for specimen collection. Attention is also given to emerging automated technology. For cost containment reasons, both optimum (ideal) procedures and minimum analytical approaches are suggested. Since urinalysis mostly lacks genuine reference methods (primary reference measurement procedures; Level 4), a novel classification of the methods is proposed: comparison measurement procedures (Level 3), quantitative routine procedures (Level 2), and ordinal scale examinations (Level 1). Stepwise strategies are suggested to save costs, applying different rules for general and specific patient populations. New analytical quality specifications have been created. After a consultation period, the final written text will be published in full as a separate document.

  19. Pre-analytical method for NMR-based grape metabolic fingerprinting and chemometrics.

    PubMed

    Ali, Kashif; Maltese, Federica; Fortes, Ana Margarida; Pais, Maria Salomé; Verpoorte, Robert; Choi, Young Hae

    2011-10-10

    Although metabolomics aims at profiling all the metabolites in organisms, data quality is highly dependent on the pre-analytical methods employed. In order to evaluate current methods, different pre-analytical methods were compared and used for the metabolic profiling of grapevine as a model plant. Five grape cultivars from Portugal were analyzed in this study in combination with chemometrics. A common extraction method with deuterated water and methanol was found effective in the case of amino acids, organic acids, and sugars. For secondary metabolites like phenolics, solid phase extraction with C-18 cartridges showed good results. Principal component analysis, in combination with NMR spectroscopy, was applied and showed clear distinction among the cultivars. Primary metabolites such as choline, sucrose, and leucine were found to discriminate 'Alvarinho', while elevated levels of alanine, valine, and acetate were found in 'Arinto' (white varieties). Among the red cultivars, higher signals for citrate and GABA in 'Touriga Nacional', succinate and fumarate in 'Aragonês', and malate, ascorbate, fructose and glucose in 'Trincadeira' were observed. Based on the phenolic profile, 'Arinto' had higher levels of phenolics than 'Alvarinho'. 'Trincadeira' showed the lowest phenolic content, while higher levels of flavonoids and phenylpropanoids were found in 'Aragonês' and 'Touriga Nacional', respectively. It is shown that the metabolite composition of the extract is highly affected by the extraction procedure, and this has to be taken into account in metabolomics studies. Copyright © 2011 Elsevier B.V. All rights reserved.
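    As a minimal illustration of the chemometric step described above, the sketch below runs a principal component analysis (PCA, via SVD of the mean-centred matrix) on a synthetic data matrix standing in for binned NMR intensities; the group means, noise level and matrix dimensions are invented for illustration, not taken from the study.

```python
# Sketch: PCA on a synthetic "spectral" matrix to separate two sample groups.
# The data here are random stand-ins, not real NMR spectra.
import numpy as np

rng = np.random.default_rng(0)
# 10 samples x 50 spectral bins: two groups with a shifted mean intensity
group_a = rng.normal(1.0, 0.1, size=(5, 50))
group_b = rng.normal(1.4, 0.1, size=(5, 50))
X = np.vstack([group_a, group_b])

# PCA via SVD of the mean-centred matrix
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * S  # sample coordinates on the principal components

# PC1 should separate the two groups (opposite-sign mean scores)
print(scores[:5, 0].mean() * scores[5:, 0].mean() < 0)
```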

  20. Cost Effectiveness of Adopted Quality Requirements in Hospital Laboratories

    PubMed Central

    HAMZA, Alneil; AHMED-ABAKUR, Eltayib; ABUGROUN, Elsir; BAKHIT, Siham; HOLI, Mohamed

    2013-01-01

    Background: The present study was designed as a quasi-experiment to assess adoption of the essential clauses of particular clinical laboratory quality management requirements, based on the International Organization for Standardization standard ISO 15189, in hospital laboratories, and to evaluate the cost effectiveness of compliance with ISO 15189. Methods: The quality management intervention based on ISO 15189 was conducted in three phases: a pre-intervention phase, an intervention phase and a post-intervention phase. Results: In the pre-intervention phase, compliance with ISO 15189 was 49% for the study group vs. 47% for the control group (P value 0.48), while the post-intervention results showed compliance of 54% vs. 79% for the study group and control group respectively, a statistically significant difference (P value 0.00), with an effect size (Cohen's d) of 0.00 in the pre-intervention phase and 0.99 in the post-intervention phase. The annual average cost per test for the study group and control group was 1.80 ± 0.25 vs. 1.97 ± 0.39 respectively (P value 0.39), whereas the post-intervention results showed an annual average total cost per test of 1.57 ± 0.23 vs. 2.08 ± 0.38 for the study group and control group respectively (P value 0.019), with a cost-effectiveness ratio of 0.88 in the pre-intervention phase and 0.52 in the post-intervention phase. Conclusion: The planned adoption of quality management system (QMS) requirements in clinical laboratories substantially increased the percentage of compliance with quality management system requirements, raised average total cost effectiveness, and improved the analytical process capability of the testing procedure. PMID:23967422
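    The effect-size statistic reported above (Cohen's d) is the difference between group means divided by the pooled standard deviation. In the sketch below the per-test cost means and SDs echo the figures in the abstract, but the group sizes are hypothetical, so the resulting d is illustrative only.

```python
# Illustrative sketch: Cohen's d using the pooled standard deviation.
# Means/SDs echo the reported per-test costs; sample sizes are hypothetical.
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    # Pooled standard deviation across the two groups
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# Hypothetical group sizes of 10 laboratories each
d = cohens_d(2.08, 0.38, 10, 1.57, 0.23, 10)
print(round(d, 2))
```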

  1. Developing a tool for the preparation of GMP audit of pharmaceutical contract manufacturer.

    PubMed

    Linna, Anu; Korhonen, Mirka; Mannermaa, Jukka-Pekka; Airaksinen, Marja; Juppo, Anne Mari

    2008-06-01

    Outsourcing is growing rapidly in the pharmaceutical industry. When manufacturing activities are outsourced, control of the product's quality has to be maintained. One way to confirm a contract manufacturer's GMP (Good Manufacturing Practice) compliance is auditing. Audits can be supported, for instance, by using GMP questionnaires. The objective of this study was to develop a tool for the audit preparation of pharmaceutical contract manufacturers and to validate its contents using the Delphi method. At this phase of the study the tool was developed for non-sterile finished product contract manufacturers. A modified Delphi method was used with an expert panel consisting of 14 experts from the pharmaceutical industry, authorities and universities. The content validity of the developed tool was assessed in a Delphi questionnaire round. The response rate in the Delphi questionnaire round was 86%. The tool consisted of 103 quality items, of which 90 (87%) achieved the pre-defined agreement rate level (75%). The thirteen quality items which did not achieve the pre-defined agreement rate were excluded from the tool. The expert panel suggested only minor changes to the tool. The results show that the content validity of the developed audit preparation tool was good.
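    The content-validation arithmetic described above (an item is retained only if the expert agreement rate reaches a pre-defined threshold, 75% in this study) can be sketched as follows; the item names and votes are hypothetical, not taken from the tool.

```python
# Sketch of the Delphi content-validation step: keep a quality item only if
# its expert agreement rate meets the pre-defined threshold (75%).
AGREEMENT_THRESHOLD = 0.75

def agreement_rate(votes):
    # Fraction of panel experts agreeing that an item belongs in the tool
    return sum(votes) / len(votes)

# Hypothetical items with 14 expert votes each (1 = agree, 0 = disagree)
items = {
    "batch record review": [1] * 12 + [0] * 2,   # 12 of 14 agree -> 86%
    "canteen hygiene":     [1] * 9 + [0] * 5,    # 9 of 14 agree -> 64%
}
retained = [name for name, votes in items.items()
            if agreement_rate(votes) >= AGREEMENT_THRESHOLD]
print(retained)
```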

  2. An analytical study of electric vehicle handling dynamics

    NASA Technical Reports Server (NTRS)

    Greene, J. E.; Segal, D. J.

    1979-01-01

    Hypothetical electric vehicle configurations were studied by applying available analytical methods. Elementary linearized models were used in addition to a highly sophisticated vehicle dynamics computer simulation technique. Physical properties of specific EV's were defined for various battery and powertrain packaging approaches applied to a range of weight distribution and inertial properties which characterize a generic class of EV's. Computer simulations of structured maneuvers were performed for predicting handling qualities in the normal driving range and during various extreme conditions related to accident avoidance. Results indicate that an EV with forward weight bias will possess handling qualities superior to a comparable EV that is rear-heavy or equally balanced. The importance of properly matching tires, suspension systems, and brake system front/rear torque proportioning to a given EV configuration during the design stage is demonstrated.

  3. Defining health-related quality of life for young wheelchair users: A qualitative health economics study

    PubMed Central

    2017-01-01

    Background Wheelchairs for children with impaired mobility provide health, developmental and psychosocial benefits, however there is limited understanding of how mobility aids affect the health-related quality of life of children with impaired mobility. Preference-based health-related quality of life outcome measures are used to calculate quality-adjusted life years; an important concept in health economics. The aim of this research was to understand how young wheelchair users and their parents define health-related quality of life in relation to mobility impairment and wheelchair use. Methods The sampling frame was children with impaired mobility (≤18 years) who use a wheelchair and their parents. Data were collected through semi-structured face-to-face interviews conducted in participants’ homes. Qualitative framework analysis was used to analyse the interview transcripts. An a priori thematic coding framework was developed. Emerging codes were grouped into categories, and refined into analytical themes. The data were used to build an understanding of how children with impaired mobility define health-related quality of life in relation to mobility impairment, and to assess the applicability of two standard measures of health-related quality of life. Results Eleven children with impaired mobility and 24 parents were interviewed across 27 interviews. Participants defined mobility-related quality of life through three distinct but interrelated concepts: 1) participation and positive experiences; 2) self-worth and feeling fulfilled; 3) health and functioning. A good degree of consensus was found between child and parent responses, although there was some evidence to suggest a shift in perception of mobility-related quality of life with child age. Conclusions Young wheelchair users define health-related quality of life in a distinct way as a result of their mobility impairment and adaptation use. 
Generic, preference-based measures of health-related quality of life lack sensitivity in this population. Development of a mobility-related quality of life outcome measure for children is recommended. PMID:28617820

  4. QADATA user's manual; an interactive computer program for the retrieval and analysis of the results from the external blind sample quality- assurance project of the U.S. Geological Survey

    USGS Publications Warehouse

    Lucey, K.J.

    1990-01-01

    The U.S. Geological Survey conducts an external blind sample quality-assurance project for its National Water Quality Laboratory in Denver, Colorado, based on the analysis of reference water samples. Reference samples containing selected inorganic and nutrient constituents are disguised as environmental samples at the Survey's office in Ocala, Florida, and are sent periodically through other Survey offices to the laboratory. The results of this blind sample project indicate the quality of analytical data produced by the laboratory. This report provides instructions on the use of QADATA, an interactive, menu-driven program that allows users to retrieve the results of the blind sample quality-assurance project. The QADATA program, which is available on the U.S. Geological Survey's national computer network, accesses a blind sample database that contains more than 50,000 determinations from the last five water years for approximately 40 constituents at various concentrations. The data can be retrieved from the database for any user-defined time period and for any or all available constituents. After the user defines the retrieval, the program prepares statistical tables, control charts, and precision plots and generates a report which can be transferred to the user's office through the computer network. A discussion of the interpretation of the program output is also included. This quality assurance information will permit users to document the quality of the analytical results received from the laboratory. The blind sample data are entered into the database within weeks after being produced by the laboratory and can be retrieved to meet the needs of specific projects or programs. (USGS)

  5. Ace Project as a Project Management Tool

    ERIC Educational Resources Information Center

    Cline, Melinda; Guynes, Carl S.; Simard, Karine

    2010-01-01

    The primary challenge of project management is to achieve the project goals and objectives while adhering to project constraints--usually scope, quality, time and budget. The secondary challenge is to optimize the allocation and integration of resources necessary to meet pre-defined objectives. Project management software provides an active…

  6. Research needs and prioritizations for studies linking dietary sugars and potentially related health outcomes

    USDA-ARS?s Scientific Manuscript database

    An approach developed by the Agency for Healthcare Research and Quality (AHRQ) for assessing future research needs (FRN) regarding dietary sugars was implemented. A panel of 14 stakeholders across 7 pre-defined areas of expertise (lay audience, policy maker, health provider, research funder, evidenc...

  7. Assessment of infant formula quality and composition using Vis-NIR, MIR and Raman process analytical technologies.

    PubMed

    Wang, Xiao; Esquerre, Carlos; Downey, Gerard; Henihan, Lisa; O'Callaghan, Donal; O'Donnell, Colm

    2018-06-01

    In this study, visible and near-infrared (Vis-NIR), mid-infrared (MIR) and Raman process analytical technologies were investigated for assessment of infant formula quality and compositional parameters, namely preheat temperature, storage temperature, storage time, fluorescence of advanced Maillard products and soluble tryptophan (FAST) index, soluble protein, fat and surface free fat (SFF) content. PLS-DA models developed using spectral data with appropriate data pre-treatment and significant variables selected using Martens' uncertainty test had good accuracy for the discrimination of preheat temperature (92.3-100%) and storage temperature (91.7-100%). The best PLS regression models developed yielded values for the ratio of prediction error to deviation (RPD) of 3.6-6.1, 2.1-2.7, 1.7-2.9, 1.6-2.6 and 2.5-3.0 for storage time, FAST index, soluble protein, fat and SFF content prediction, respectively. Vis-NIR, MIR and Raman were demonstrated to be potential PAT tools for process control and quality assurance applications in infant formula and dairy ingredient manufacture. Copyright © 2018 Elsevier B.V. All rights reserved.
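    The RPD metric used above to judge the PLS regression models is commonly computed as the standard deviation of the reference values divided by the root mean square error of prediction (RMSEP); a minimal sketch with synthetic reference/predicted values (not data from the study):

```python
# Sketch of the RPD metric: RPD = SD(reference values) / RMSEP.
# Reference and predicted values below are synthetic, for illustration only.
import math
import statistics

def rpd(reference, predicted):
    # Root mean square error of prediction
    rmsep = math.sqrt(sum((r - p) ** 2 for r, p in zip(reference, predicted))
                      / len(reference))
    return statistics.stdev(reference) / rmsep

reference = [10.0, 12.0, 14.0, 16.0, 18.0]
predicted = [10.5, 11.8, 14.2, 15.6, 18.3]
print(round(rpd(reference, predicted), 1))
```

Higher RPD means the prediction error is small relative to the natural spread of the reference data; values above roughly 3 are usually considered good for quantitative prediction.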

  8. Negotiating Identities for Mathematics Teaching in the Context of Professional Development

    ERIC Educational Resources Information Center

    Gresalfi, Melissa Sommerfeld; Cobb, Paul

    2011-01-01

    This article presents an analytical approach for documenting the identities for teaching that mathematics teachers negotiate as they participate in 2 or more communities that define high-quality teaching differently. Drawing on data from the first 2 years of a collaboration with a group of middle school mathematics teachers, the article focuses on…

  9. The U.S. Geological Survey's Sediment-bound Contaminant Resiliency and Response Strategy: A Tiered Multi-metric Approach to Environmental Health and Hazards in the Northeastern USA

    NASA Astrophysics Data System (ADS)

    Reilly, T. J.; Focazio, M. J.; Murdoch, P. S.; Benzel, W. M.; Fisher, S. C.; Griffin, D. W.; Iwanowicz, L. R.; Jones, D. K.; Loftin, K. A.

    2014-12-01

    Enhanced dispersion and concentration of contaminants such as trace metals and organic pollutants through storm-induced disturbances and sea level rise (SLR) are major factors that could adversely impact the health and resilience of communities and ecosystems in coming years. As part of the response to Hurricane Sandy, the U.S. Geological Survey collected data on the effects of contaminant source disturbance and dispersion. A major limitation of conducting pre- and post-Sandy comparisons was the lack of baseline data in locations proximal to potential contaminant sources and mitigation activities, sensitive ecosystems, and recreational facilities where human and ecological exposures are probable. To address this limitation, a Sediment-bound Contaminant Resiliency and Response (SCoRR) strategy with two operational modes, Resiliency (baseline) and Response (event-based), has been designed by leveraging existing interagency networks and resources. In Resiliency Mode, sites will be identified and sampled using standardized procedures prioritized to develop baseline data and to define sediment-quality based environmental health metrics. In Response Mode, a subset of sites within the network will be evaluated to ensure that adequate pre-event data exist at priority locations. If deficient, pre-event samples will be collected from priority locations. Crews will be deployed post-event to resample these locations allowing direct evaluation of impacts, as well as redefining baseline conditions for these areas. A tiered analytical and data integration strategy has been developed that will identify vulnerable human and environmental receptors, the sediment-bound contaminants present, and the biological activity and potential effects of exposure to characterized sediments. Communication mechanisms are in development to make resulting data available in a timely fashion and in a suitable format for informing event response and recovery efforts.

  10. Analysis of Environmental Contamination resulting from ...

    EPA Pesticide Factsheets

    Catastrophic incidents can generate a large number of samples with analytically diverse types including forensic, clinical, environmental, food, and others. Environmental samples include water, wastewater, soil, air, urban building and infrastructure materials, and surface residue. Such samples may arise not only from contamination from the incident but also from the multitude of activities surrounding the response to the incident, including decontamination. This document summarizes a range of activities to help build laboratory capability in preparation for analysis following a catastrophic incident, including selection and development of fit-for-purpose analytical methods for chemical, biological, and radiological contaminants. Fit-for-purpose methods are those which have been selected to meet project specific data quality objectives. For example, methods could be fit for screening contamination in the early phases of investigation of contamination incidents because they are rapid and easily implemented, but those same methods may not be fit for the purpose of remediating the environment to safe levels when a more sensitive method is required. While the exact data quality objectives defining fitness-for-purpose can vary with each incident, a governing principle of the method selection and development process for environmental remediation and recovery is based on achieving high throughput while maintaining high quality analytical results. This paper illu

  11. EC4 European Syllabus for Post-Graduate Training in Clinical Chemistry and Laboratory Medicine: version 3 - 2005.

    PubMed

    Zerah, Simone; McMurray, Janet; Bousquet, Bernard; Baum, Hannsjorg; Beastall, Graham H; Blaton, Vic; Cals, Marie-Josèphe; Duchassaing, Danielle; Gaudeau-Toussaint, Marie-Françoise; Harmoinen, Aimo; Hoffmann, Hans; Jansen, Rob T; Kenny, Desmond; Kohse, Klaus P; Köller, Ursula; Gobert, Jean-Gérard; Linget, Christine; Lund, Erik; Nubile, Giuseppe; Opp, Matthias; Pazzagli, Mario; Pinon, Georges; Queralto, José M; Reguengo, Henrique; Rizos, Demetrios; Szekeres, Thomas; Vidaud, Michel; Wallinder, Hans

    2006-01-01

    The EC4 Syllabus for Postgraduate Training is the basis for the European Register of Specialists in Clinical Chemistry and Laboratory Medicine. The syllabus: Indicates the level of requirements in postgraduate training to harmonise postgraduate education in the European Union (EU); Indicates the level of content of national training programmes to obtain adequate knowledge and experience; Is approved by all EU societies for clinical chemistry and laboratory medicine. The syllabus is not primarily meant to be a training guide, but on the basis of the overview given (common minimal programme), national societies should formulate programmes that indicate where knowledge and experience are needed. The main points of this programme are: Knowledge in biochemistry, haematology, immunology, etc.; Pre-analytical conditions; Evaluation of results; Interpretations (post-analytical phase); Laboratory management; and Quality assurance management. The aim of this version of the syllabus is to be in accordance with the Directive of Professional Qualifications published on 30 September 2005. To prepare the common platforms planned in this directive, the disciplines are divided into four categories: General chemistry, encompassing biochemistry, endocrinology, chemical (humoral) immunology, toxicology, and therapeutic drug monitoring; Haematology, covering cells, transfusion serology, coagulation, and cellular immunology; Microbiology, involving bacteriology, virology, parasitology, and mycology; and Genetics and IVF.

  12. Improving quality in the preanalytical phase through innovation, on behalf of the European Federation for Clinical Chemistry and Laboratory Medicine (EFLM) Working Group for Preanalytical Phase (WG-PRE).

    PubMed

    Lippi, Giuseppe; Baird, Geoffrey S; Banfi, Giuseppe; Bölenius, Karin; Cadamuro, Janne; Church, Stephen; Cornes, Michael P; Dacey, Anna; Guillon, Antoine; Hoffmann, Georg; Nybo, Mads; Premawardhana, Lakdasa Devananda; Salinas, María; Sandberg, Sverre; Slingerland, Robbert; Stankovic, Ana; Sverresdotter, Sylte Marit; Vermeersch, Pieter; Simundic, Ana-Maria

    2017-03-01

    It is now undeniable that laboratory testing is vital for the diagnosis, prognostication and therapeutic monitoring of human disease. Despite the many advances made in achieving a high degree of quality and safety in the analytical part of diagnostic testing, many hurdles in the total testing process remain, especially in the preanalytical phase, which ranges from test ordering to obtaining and managing the biological specimens. The Working Group for the Preanalytical Phase (WG-PRE) of the European Federation of Clinical Chemistry and Laboratory Medicine (EFLM) has planned many activities aimed at mitigating the vulnerability of the preanalytical phase, including the organization of three European meetings in the past 7 years. Hence, this collective article follows the three previous opinion papers published by the EFLM WG-PRE on the same topic, and brings together summaries of the presentations to be given at the 4th EFLM-BD meeting "Improving quality in the preanalytical phase through innovation" in Amsterdam, 24-25 March, 2017.

  13. Numerical Analysis of a Flexible Dual Loop Coil and its Experimental Validation for pre-Clinical Magnetic Resonance Imaging of Rodents at 7 T

    NASA Astrophysics Data System (ADS)

    Solis-Najera, S.; Vazquez, F.; Hernandez, R.; Marrufo, O.; Rodriguez, A. O.

    2016-12-01

    A surface radio frequency coil was developed for small animal image acquisition in a pre-clinical magnetic resonance imaging system at 7 T. A flexible coil composed of two circular loops was developed to closely cover the object to be imaged. Electromagnetic numerical simulations were performed to evaluate its performance before the coil construction. An analytical expression for the mutual inductance of the two circular loops as a function of the separation between them was derived and used to validate the simulations. The RF coil is composed of two circular loops with a 5 cm external diameter, tuned to 300 MHz and matched to 50 Ohms. The angle between the loops was varied and the Q factor was obtained from the S11 simulations for each angle. B1 homogeneity was also evaluated using the electromagnetic simulations. The coil prototype was designed and built considering the numerical simulation results. To demonstrate the feasibility and performance of the coil, images of a saline-solution phantom were acquired. A correlation of the simulations with the experimental imaging results showed a concordance of 0.88 for the B1 field. The best coil performance was obtained at a 90° aperture angle. A more realistic phantom was also built, using a formaldehyde-fixed rat for ex vivo imaging experiments. All images showed good quality, clearly revealing anatomical details of the ex vivo rat.
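    The paper derives a mutual-inductance expression for angled loops; as an illustration of the same physics under a simplifying assumption, the sketch below evaluates the classical Maxwell formula for two coaxial circular loops, using complete elliptic integrals. The radii match the 5 cm loop diameter mentioned above, while the separations are arbitrary example values.

```python
# Sketch: classical (Maxwell) mutual inductance of two coaxial circular loops
# of radii r1, r2 separated by axial distance d. This is the simpler coaxial
# case, not the angled-loop expression derived in the paper.
import math
from scipy.special import ellipk, ellipe  # complete elliptic integrals K(m), E(m)

MU0 = 4e-7 * math.pi  # vacuum permeability (H/m)

def mutual_inductance(r1, r2, d):
    k2 = 4 * r1 * r2 / ((r1 + r2) ** 2 + d ** 2)  # elliptic parameter m = k^2
    k = math.sqrt(k2)
    return MU0 * math.sqrt(r1 * r2) * ((2 / k - k) * ellipk(k2) - (2 / k) * ellipe(k2))

# Two 2.5 cm-radius loops (matching the 5 cm loop diameter in the paper)
for d_cm in (1.0, 2.0, 4.0):
    print(f"d = {d_cm} cm: M = {mutual_inductance(0.025, 0.025, d_cm / 100) * 1e9:.1f} nH")
```

As expected physically, the coupling falls off monotonically as the loop separation grows.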

  14. Dysferlin rescue by spliceosome-mediated pre-mRNA trans-splicing targeting introns harbouring weakly defined 3' splice sites.

    PubMed

    Philippi, Susanne; Lorain, Stéphanie; Beley, Cyriaque; Peccate, Cécile; Précigout, Guillaume; Spuler, Simone; Garcia, Luis

    2015-07-15

    The modification of the pre-mRNA cis-splicing process employing a pre-mRNA trans-splicing molecule (PTM) is an attractive strategy for the in situ correction of genes whose careful transcription regulation and full-length expression are determinative for protein function, as is the case for the dysferlin (DYSF, Dysf) gene. Loss-of-function mutations of DYSF result in different types of muscular dystrophy, mainly manifesting as limb girdle muscular dystrophy 2B (LGMD2B) and Miyoshi muscular dystrophy 1 (MMD1). We established a 3' replacement strategy for mutated DYSF pre-mRNAs induced by spliceosome-mediated pre-mRNA trans-splicing (SmaRT) through the use of a PTM. In contrast to previously established SmaRT strategies, we focused in particular on the identification of a suitable pre-mRNA target intron rather than on optimization of the PTM design. By targeting DYSF pre-mRNA introns harbouring differentially defined 3' splice sites (3' SS), we found that target introns encoding weakly defined 3' SSs were trans-spliced successfully in vitro in human LGMD2B myoblasts as well as in vivo in skeletal muscle of wild-type and Dysf(-/-) mice. For the first time, we demonstrate rescue of Dysf protein by SmaRT in vivo. Moreover, we identified qualities shared by the successfully targeted Dysf introns and the endogenous introns targeted in previously reported SmaRT approaches, which might facilitate the selection of target introns in future SmaRT strategies. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  15. A quality assessment tool for markup-based clinical guidelines.

    PubMed

    Shalom, Erez; Shahar, Yuval; Taieb-Maimon, Meirav; Lunenfeld, Eitan

    2008-11-06

    We introduce a tool for quality assessment of procedural and declarative knowledge. We developed this tool for evaluating the specification of mark-up-based clinical guidelines (GLs). Using this graphical tool, the expert physician and knowledge engineer collaborate to score each of the knowledge roles of the mark-ups against a gold standard, using a pre-defined scoring scale. The tool enables different users at different sites to score the mark-ups simultaneously.

  16. Novel bone metabolism-associated hormones: the importance of the pre-analytical phase for understanding their physiological roles.

    PubMed

    Lombardi, Giovanni; Barbaro, Mosè; Locatelli, Massimo; Banfi, Giuseppe

    2017-06-01

    The endocrine function of bone is now a recognized feature of this tissue. Bone-derived hormones that modulate whole-body homeostasis are being discovered as the effects on bone of novel and classic hormones produced by other tissues become known. Often, however, the data regarding these last-generation bone-derived or bone-targeting hormones do not give a clear picture of their physiological roles or concentration ranges. A certain degree of uncertainty could stem from differences in the pre-analytical management of biological samples. The pre-analytical phase comprises a series of decisions and actions (i.e., choice of sample matrix, methods of collection, transportation, treatment and storage) preceding analysis. Errors arising in this phase are inevitably carried over to the analytical phase, where they can reduce measurement accuracy, ultimately leading to discrepant results. While the pre-analytical phase is all-important in routine laboratory medicine, it is often not given due consideration in research and clinical trials. This is particularly true for novel molecules, such as the hormones regulating the endocrine function of bone. In this review we discuss the importance of the pre-analytical variables affecting the measurement of last-generation bone-associated hormones and describe their often debated and rarely clear physiological roles.

  17. Analysis of low molecular weight metabolites in tea using mass spectrometry-based analytical methods.

    PubMed

    Fraser, Karl; Harrison, Scott J; Lane, Geoff A; Otter, Don E; Hemar, Yacine; Quek, Siew-Young; Rasmussen, Susanne

    2014-01-01

    Tea is the second most consumed beverage in the world after water, and there are numerous reported health benefits from consuming tea, such as a reduced risk of cardiovascular disease and many types of cancer. Thus, there is much interest in the chemical composition of teas, for example: defining components responsible for the reported health benefits; defining quality characteristics such as product flavor; and monitoring for pesticide residues to comply with food safety import/export requirements. Covered in this review are some of the latest developments in mass spectrometry-based analytical techniques for measuring and characterizing low molecular weight components of tea, in particular primary and secondary metabolites. The methodology, more specifically the chromatography and detection mechanisms used in both targeted and non-targeted studies, is discussed along with its main advantages and disadvantages. Finally, we comment on the latest techniques that are likely to bring significant benefit to analysts in the future, not merely in the area of tea research, but in the analytical chemistry of low molecular weight compounds in general.

  18. Relationship between mathematical abstraction in learning parallel coordinates concept and performance in learning analytic geometry of pre-service mathematics teachers: an investigation

    NASA Astrophysics Data System (ADS)

    Nurhasanah, F.; Kusumah, Y. S.; Sabandar, J.; Suryadi, D.

    2018-05-01

    As one of the non-conventional mathematics concepts, Parallel Coordinates has the potential to be learned by pre-service mathematics teachers, giving them experience in constructing richer schemes and carrying out abstraction processes. Unfortunately, research related to this issue is still limited. This study addresses the research question "to what extent can the abstraction process of pre-service mathematics teachers in learning the concept of Parallel Coordinates indicate their performance in learning Analytic Geometry". This case study is part of a larger study examining the mathematical abstraction of pre-service mathematics teachers in learning a non-conventional mathematics concept. Descriptive statistics were used to analyze the scores from three different tests: Cartesian Coordinate, Parallel Coordinates, and Analytic Geometry. The participants in this study were 45 pre-service mathematics teachers. The results show a linear association between the scores on Cartesian Coordinate and Parallel Coordinates. Higher levels of the abstraction process in learning Parallel Coordinates were also linearly associated with higher student achievement in Analytic Geometry. These results show that the concept of Parallel Coordinates plays a significant role for pre-service mathematics teachers in learning Analytic Geometry.
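    The linear association between test scores mentioned above is typically quantified with a Pearson correlation coefficient; a minimal sketch, with hypothetical scores (the study's raw data are not reproduced here):

```python
# Sketch: Pearson correlation between two sets of test scores.
# The scores below are hypothetical, for illustration only.
import math

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

cartesian = [60, 70, 75, 80, 90]   # hypothetical Cartesian Coordinate scores
parallel  = [55, 68, 72, 85, 88]   # hypothetical Parallel Coordinates scores
print(round(pearson_r(cartesian, parallel), 2))
```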

  19. Implementation and application of moving average as continuous analytical quality control instrument demonstrated for 24 routine chemistry assays.

    PubMed

    Rossum, Huub H van; Kemperman, Hans

    2017-07-26

    General application of a moving average (MA) as continuous analytical quality control (QC) for routine chemistry assays has failed due to the lack of a simple method for optimizing MAs. A new method was applied to optimize MAs for routine chemistry and was evaluated in daily practice as a continuous analytical QC instrument. MA procedures were optimized using an MA bias detection simulation procedure, supported graphically by bias detection curves. Next, all optimal MA procedures that contributed to quality assurance were run for 100 consecutive days, and MA alarms generated during working hours were investigated. Optimized MA procedures were applied for 24 chemistry assays. During this evaluation, 303,871 MA values and 76 MA alarms were generated. Of all alarms, 54 (71%) were generated during office hours. Of these, 41 were further investigated; the causes were ion-selective electrode (ISE) failure (1), calibration failure not detected by QC due to improper QC settings (1), possible bias (significant difference from the other analyzer) (10), non-human materials analyzed (2), extreme result(s) of a single patient (2), pre-analytical error (1), no cause identified (20), and no conclusion possible (4). MA was implemented in daily practice as a continuous QC instrument for 24 routine chemistry assays. In our setup, a manageable number of MA alarms was generated, and those requiring follow-up proved valuable. For the management of MA alarms, several features in the MA management software would simplify the use of MA procedures.
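The core of a moving-average QC procedure like the one above is a windowed mean of patient results checked against control limits, with an alarm when the mean drifts outside them. A minimal sketch follows; the sodium values, window size, and limits are assumed for illustration, not the paper's optimized settings:

```python
from collections import deque

def ma_alarms(results, window, lower, upper):
    """Return indices at which the moving average falls outside [lower, upper]."""
    buf = deque(maxlen=window)
    alarms = []
    for i, value in enumerate(results):
        buf.append(value)
        if len(buf) == window:          # only evaluate once the window is full
            ma = sum(buf) / window
            if not (lower <= ma <= upper):
                alarms.append(i)
    return alarms

# Hypothetical sodium results (mmol/L); an upward shift after index 5 triggers alarms
sodium = [140, 141, 139, 140, 141, 146, 147, 148, 147, 149]
flags = ma_alarms(sodium, window=3, lower=137.0, upper=143.0)  # -> [6, 7, 8, 9]
```

In practice the window size and limits are what the paper's bias detection simulation optimizes per assay.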

  20. Population-Based Pediatric Reference Intervals in General Clinical Chemistry: A Swedish Survey.

    PubMed

    Ridefelt, Peter

    2015-01-01

    Very few high quality studies on pediatric reference intervals for general clinical chemistry and hematology analytes have been performed. Three recent prospective community-based projects utilising blood samples from healthy children in Sweden, Denmark and Canada have substantially improved the situation. The Swedish survey included 701 healthy children. Reference intervals for general clinical chemistry and hematology were defined.
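A nonparametric reference interval of the kind such surveys report is conventionally the central 95% of results from healthy subjects (2.5th to 97.5th percentiles). A simplified nearest-rank sketch follows, with stand-in data; this is not the Swedish survey's actual method or values:

```python
def reference_interval(values, central=0.95):
    """Nonparametric central-95% reference interval via a simple nearest-rank rule."""
    data = sorted(values)
    n = len(data)
    k = int(n * (1.0 - central) / 2.0)  # number of results cut from each tail
    return data[k], data[n - 1 - k]

# Stand-in data: 200 evenly spread hypothetical results from healthy children
values = list(range(1, 201))
low, high = reference_interval(values)  # -> (6, 195)
```

Guidelines typically recommend at least 120 reference subjects per partition (e.g. per age/sex group) for percentile estimates of this kind.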

  1. Illustrating the Steady-State Condition and the Single-Molecule Kinetic Method with the NMDA Receptor

    ERIC Educational Resources Information Center

    Kosman, Daniel J.

    2009-01-01

    The steady-state is a fundamental aspect of biochemical pathways in cells; indeed, the concept of steady-state is a definition of life itself. In a simple enzyme kinetic scheme, the steady-state condition is easy to define analytically but experimentally often difficult to capture because of its evanescent quality; the initial, constant velocity…

  2. An Artificial Intelligence System to Predict Quality of Service in Banking Organizations

    PubMed Central

    Popovič, Aleš

    2016-01-01

    Quality of service, that is, the waiting time that customers must endure in order to receive a service, is a critical performance aspect in private and public service organizations. Providing good service quality is particularly important in highly competitive sectors where similar services exist. In this paper, focusing on the banking sector, we propose an artificial intelligence system for building a model for the prediction of service quality. While the traditional approach used for building analytical models relies on theories and assumptions about the problem at hand, we propose a novel approach for learning models from actual data. Thus, the proposed approach is not biased by the knowledge that experts may have about the problem, but is based entirely on the available data. The system is based on a recently defined variant of genetic programming that allows practitioners to include the concept of semantics in the search process. This has beneficial effects on the search and produces analytical models that are based only on the data and not on domain-dependent knowledge. PMID:27313604

  3. An Artificial Intelligence System to Predict Quality of Service in Banking Organizations.

    PubMed

    Castelli, Mauro; Manzoni, Luca; Popovič, Aleš

    2016-01-01

    Quality of service, that is, the waiting time that customers must endure in order to receive a service, is a critical performance aspect in private and public service organizations. Providing good service quality is particularly important in highly competitive sectors where similar services exist. In this paper, focusing on the banking sector, we propose an artificial intelligence system for building a model for the prediction of service quality. While the traditional approach used for building analytical models relies on theories and assumptions about the problem at hand, we propose a novel approach for learning models from actual data. Thus, the proposed approach is not biased by the knowledge that experts may have about the problem, but is based entirely on the available data. The system is based on a recently defined variant of genetic programming that allows practitioners to include the concept of semantics in the search process. This has beneficial effects on the search and produces analytical models that are based only on the data and not on domain-dependent knowledge.

  4. A Framework for Integrating Environmental Justice in Regulatory Analysis

    PubMed Central

    Nweke, Onyemaechi C.

    2011-01-01

    With increased interest in integrating environmental justice into the process for developing environmental regulations in the United States, analysts and decision makers are confronted with the question of what methods and data can be used to assess disproportionate environmental health impacts. However, as a first step to identifying data and methods, it is important that analysts understand what information on equity impacts is needed for decision making. Such knowledge originates from clearly stated equity objectives and the reflection of those objectives throughout the analytical activities that characterize Regulatory Impact Analysis (RIA), a process that is traditionally used to inform decision making. The framework proposed in this paper advocates structuring analyses to explicitly provide pre-defined output on equity impacts. Specifically, the proposed framework emphasizes: (a) defining equity objectives for the proposed regulatory action at the onset of the regulatory process, (b) identifying specific and related sub-objectives for key analytical steps in the RIA process, and (c) developing explicit analytical/research questions to assure that stated sub-objectives and objectives are met. In proposing this framework, it is envisioned that information on equity impacts informs decision-making in regulatory development, and that this is achieved through a systematic and consistent approach that assures linkages between stated equity objectives, regulatory analyses, selection of policy options, and the design of compliance and enforcement activities. PMID:21776235

  5. The evolution of analytical chemistry methods in foodomics.

    PubMed

    Gallo, Monica; Ferranti, Pasquale

    2016-01-08

    The methodologies of food analysis have greatly evolved over the past 100 years, from basic assays based on solution chemistry to those relying on modern instrumental platforms. Today, the development and optimization of integrated analytical approaches based on different techniques to study the chemical composition of a food at the molecular level can allow analysts to define a 'food fingerprint', valuable for assessing the nutritional value, safety, quality, authenticity and security of foods. This comprehensive strategy, termed foodomics, includes emerging work areas such as food chemistry, phytochemistry, advanced analytical techniques, biosensors and bioinformatics. Integrated approaches can help to elucidate some critical issues in food analysis, but also to face the new challenges of a globalized world: security, sustainability and food production in response to world-wide environmental changes. They include the development of powerful analytical methods to ensure the origin and quality of food, as well as the discovery of biomarkers to identify potential food safety problems. In the area of nutrition, the future challenge is to identify, through specific biomarkers, individual peculiarities that allow early diagnosis and then a personalized prognosis and diet for patients with food-related disorders. Far from aiming at an exhaustive review of the abundant literature dedicated to the applications of omic sciences in food analysis, we explore how classical approaches, such as those used in chemistry and biochemistry, have evolved to intersect with the new omics technologies to advance our understanding of the complexity of foods. Perhaps most importantly, a key objective of the review is to explore the development of simple and robust methods for a fully applied use of omics data in food science. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Promoting clinical and laboratory interaction by harmonization.

    PubMed

    Plebani, Mario; Panteghini, Mauro

    2014-05-15

    The lack of interchangeable results in current practice among clinical laboratories has prompted greater attention to standardization and harmonization projects. Although the focus has mainly been on the standardization and harmonization of measurement procedures and their results, the scope of harmonization goes beyond methods and analytical results: it includes all other aspects of laboratory testing, including terminology and units, report formats, reference limits and decision thresholds, as well as test profiles and criteria for the interpretation of results. In particular, as evidence collected in recent decades demonstrates that the pre-pre- and post-post-analytical steps are more vulnerable to errors, harmonization initiatives should be undertaken to improve procedures and processes at the laboratory-clinical interface. Managing upstream demand and downstream interpretation of laboratory results, and taking subsequent appropriate action through close relationships between laboratorians and clinicians, remains a crucial issue in the laboratory testing process. Therefore, initiatives to improve test demand management on the one hand, and to harmonize procedures that improve physicians' acknowledgment and interpretation of laboratory data on the other, are needed to assure quality and safety in the total testing process. © 2013.

  7. The Flipped Classroom: Fertile Ground for Nursing Education Research.

    PubMed

    Bernard, Jean S

    2015-07-16

    In the flipped classroom (FC), students view pre-recorded lectures or complete pre-class assignments to learn foundational concepts, while class time involves problem-solving and application activities that cultivate higher-level cognitive skills. A systematic, analytical literature review was conducted to explore the FC's current state of the science within higher education. Examination of this model's definition, measures of student performance, and student and faculty perceptions revealed an ill-defined educational approach. Few studies confirmed FC effectiveness; many lacked rigorous design, randomized samples, or control of extraneous variables. Few researchers conducted longitudinal studies sufficient to determine trends related to FC practice. This study is relevant to nurse educators transitioning from traditional teaching paradigms to learner-centered models, and provides insight from faculty teaching across disciplines around the world. It reveals pertinent findings and identifies current knowledge gaps that call for further inquiry.

  8. Comparison of commercial analytical techniques for measuring chlorine dioxide in urban desalinated drinking water.

    PubMed

    Ammar, T A; Abid, K Y; El-Bindary, A A; El-Sonbati, A Z

    2015-12-01

    Most drinking water utilities are closely examining options to maintain a certain level of disinfectant residual throughout the entire distribution system. Chlorine dioxide is a promising disinfectant usually applied as a secondary disinfectant, and the selection of the proper analytical technique for monitoring it, to ensure disinfection and regulatory compliance, has been debated within the industry. This research objectively compared the performance of commercially available analytical techniques used for chlorine dioxide measurement (namely chronoamperometry, DPD (N,N-diethyl-p-phenylenediamine), Lissamine Green B (LGB WET), and amperometric titration) to determine the superior technique. The techniques were evaluated over a wide range of chlorine dioxide concentrations, and the superior technique was identified against pre-defined criteria. To discern the effectiveness of this technique, various factors that might influence its performance, such as sample temperature, high ionic strength, and other interferences, were examined. Among the four techniques, chronoamperometry showed a significant level of accuracy and precision, and the influencing factors studied did not diminish its performance, which was adequate in all matrices. This study is a step towards proper disinfection monitoring and assists engineers with chlorine dioxide disinfection system planning and management.

  9. Quantitative assessment of prevalence of pre-analytical variables and their effect on coagulation assay. Can intervention improve patient safety?

    PubMed

    Bhushan, Ravi; Sen, Arijit

    2017-04-01

    Very few Indian studies exist on the evaluation of pre-analytical variables affecting "Prothrombin Time", the commonest coagulation assay performed. This study was performed in an Indian tertiary care setting with the aim of quantitatively assessing the prevalence of pre-analytical variables and their effects on the results (and hence on patient safety) for the Prothrombin Time test, and of evaluating whether intervention corrected the results. The study first evaluated the prevalence of the various pre-analytical variables detected in samples sent for Prothrombin Time testing. Samples with detected variables were tested wherever possible and the results noted. Samples from the same patients were then recollected and retested, ensuring that no pre-analytical variable was present, and the results were again noted to check for the difference the intervention produced. The study evaluated 9989 samples received for PT/INR over a period of 18 months. The overall prevalence of pre-analytical variables was 862 (8.63%). The proportions of the various pre-analytical variables detected were: haemolysed samples 515 (5.16%), overfilled vacutainers 62 (0.62%), underfilled vacutainers 39 (0.39%), low values 205 (2.05%), clotted samples 11 (0.11%), wrong labelling 4 (0.04%), wrong vacutainer use 2 (0.02%), chylous samples 7 (0.07%), and samples with more than one variable 17 (0.17%). Error percentages were compared for four variables that could be tested both with and without the variable in place; the reduction in error percentage post-intervention was 91.5%, 69.2%, 81.5% and 95.4% for haemolysed, overfilled, underfilled, and excess-pressure-at-phlebotomy samples respectively. Correcting the variables reduced the error percentage to a great extent in these four variables; hence these variables affect Prothrombin Time testing and can hamper patient safety.
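The prevalence figures quoted above are simple proportions of affected samples out of the 9989 received. A small sketch reproducing the reported percentages from the stated counts (helper name illustrative):

```python
def prevalence_pct(count, total):
    """Prevalence of a pre-analytical variable as a percentage of all samples."""
    return round(100.0 * count / total, 2)

# Counts reported in the study (9989 PT/INR samples over 18 months)
TOTAL = 9989
counts = {"haemolysed": 515, "overfilled": 62, "underfilled": 39,
          "low values": 205, "clotted": 11, "wrong labelling": 4}
pcts = {name: prevalence_pct(n, TOTAL) for name, n in counts.items()}
overall = prevalence_pct(862, TOTAL)  # all affected samples combined -> 8.63
```

The computed values match the abstract: haemolysed 5.16%, overfilled 0.62%, and 8.63% overall.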

  10. Lean six sigma methodologies improve clinical laboratory efficiency and reduce turnaround times.

    PubMed

    Inal, Tamer C; Goruroglu Ozturk, Ozlem; Kibar, Filiz; Cetiner, Salih; Matyar, Selcuk; Daglioglu, Gulcin; Yaman, Akgun

    2018-01-01

    Organizing work flow is a major task of laboratory management. Recently, clinical laboratories have started to adopt methodologies such as Lean Six Sigma, and some successful implementations have been reported. This study used Lean Six Sigma to simplify the laboratory work process and decrease turnaround time by eliminating non-value-adding steps. The five-stage Six Sigma system known as define, measure, analyze, improve, and control (DMAIC) was used to identify and solve problems. The laboratory turnaround time for individual tests, the total delay time in the sample reception area, and the percentage of steps involving risks of medical errors and biological hazards in the overall process were measured. The pre-analytical process in the reception area was improved by eliminating 3 h and 22.5 min of non-value-adding work. Turnaround time for stat samples also improved, from 68 to 59 min, after applying Lean. Steps prone to medical errors and posing potential biological hazards to receptionists were reduced from 30% to 3%. Successful implementation of Lean Six Sigma significantly improved all of the selected performance metrics. This quality-improvement methodology has the potential to significantly improve clinical laboratories. © 2017 Wiley Periodicals, Inc.

  11. The effects of total laboratory automation on the management of a clinical chemistry laboratory. Retrospective analysis of 36 years.

    PubMed

    Sarkozi, Laszlo; Simson, Elkin; Ramanathan, Lakshmi

    2003-03-01

    Thirty-six years of data and history of laboratory practice at our institution have enabled us to follow the effects of analytical automation, and more recently of pre-analytical and post-analytical automation, on productivity, cost reduction and quality of service. In 1998, we began operating a pre- and post-analytical automation (robotics) system, together with an advanced laboratory information system, to process specimens prior to analysis and deliver them to various automated analytical instruments, specimen outlet racks and, finally, refrigerated stockyards. After 3 years of continuous operation, we compared the chemistry part of the system with the prior 33 years and quantified the financial impact of the various stages of automation. Between 1965 and 2000, the Consumer Price Index increased by a factor of 5.5 in the United States. Over the same 36 years, productivity in our institution's Chemistry Department (the number of reported test results per employee per year) increased from 10,600 to 104,558 (9.3-fold). Expressed in constant 1965 dollars, the total cost per test decreased from $0.79 to $0.15. Turnaround time for the availability of results on patient units decreased to the extent that stat specimens requiring a turnaround time of <1 h do not need to be separately prepared or prioritized on the system. Our experience shows that the introduction of a robotics system for perianalytical automation has brought a large improvement in productivity together with decreased operational cost. It enabled us to significantly increase our workload with a reduction in personnel. In addition, stats are handled easily, and there are benefits, such as safer working conditions and improved sample identification, that are difficult to quantify at this stage.

  12. Major advances in testing of dairy products: milk component and dairy product attribute testing.

    PubMed

    Barbano, D M; Lynch, J M

    2006-04-01

    Milk component analysis is relatively unusual in the field of quantitative analytical chemistry because an analytical test result determines the allocation of very large amounts of money between buyers and sellers of milk. Therefore, there is high incentive to develop and refine these methods to achieve a level of analytical performance rarely demanded of most methods or laboratory staff working in analytical chemistry. In the last 25 yr, well-defined statistical methods to characterize and validate analytical method performance combined with significant improvements in both the chemical and instrumental methods have allowed achievement of improved analytical performance for payment testing. A shift from marketing commodity dairy products to the development, manufacture, and marketing of value added dairy foods for specific market segments has created a need for instrumental and sensory approaches and quantitative data to support product development and marketing. Bringing together sensory data from quantitative descriptive analysis and analytical data from gas chromatography olfactometry for identification of odor-active compounds in complex natural dairy foods has enabled the sensory scientist and analytical chemist to work together to improve the consistency and quality of dairy food flavors.

  13. TU-D-201-06: HDR Plan Prechecks Using Eclipse Scripting API

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palaniswaamy, G; Morrow, A; Kim, S

    Purpose: Automate brachytherapy treatment plan quality checks using the Eclipse v13.6 scripting API, based on pre-configured rules, to minimize human error and maximize efficiency. Methods: The HDR Precheck system is developed using a rules-driven approach with the Eclipse scripting API. The system checks critical plan parameters such as channel length, first source position, source step size and channel mapping. The planned treatment time is verified independently using analytical methods: for interstitial or SAVI APBI treatment plans, a Patterson-Parker system calculation is performed, and for endobronchial treatments, an analytical formula from TG-59 is used. Acceptable tolerances were defined based on clinical experience in our department. The system was designed to show PASS/FAIL status levels; additional information, if necessary, is indicated in a separate comments field in the user interface. Results: The HDR Precheck system has been developed and tested to verify the treatment plan parameters that are routinely checked by the clinical physicist. The report also serves as a reminder or checklist for the planner to perform any additional critical checks, such as applicator digitization, or for scenarios where the channel mapping was intentionally changed. It is expected to reduce the current manual plan check time from 15 minutes to <1 minute. Conclusion: Automating brachytherapy plan prechecks significantly reduces treatment plan precheck time and reduces human error. When fully developed, this system will be able to perform a TG-43-based second check of the treatment planning system's dose calculation using random points in the target and critical structures; a histogram will be generated along with tabulated mean and standard deviation values for each structure. A knowledge database will also be developed for Brachyvision plans, which will then be used for knowledge-based plan quality checks to further reduce treatment planning errors and increase confidence in the planned treatment.
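A rules-driven precheck of the kind described can be sketched as a table of parameter tolerances evaluated to PASS/FAIL. The parameter names and tolerance values below are hypothetical, and the sketch is plain Python rather than the Eclipse scripting API:

```python
# Hypothetical tolerances (lo, hi) per plan parameter; not a real department's values.
RULES = {
    "channel_length_mm": (1295.0, 1305.0),
    "first_source_position_mm": (1290.0, 1300.0),
    "step_size_mm": (5.0, 5.0),  # exact expected step size
}

def precheck(plan):
    """Evaluate each configured rule against the plan; report PASS/FAIL per parameter."""
    report = {}
    for name, (lo, hi) in RULES.items():
        value = plan.get(name)
        report[name] = "PASS" if value is not None and lo <= value <= hi else "FAIL"
    return report

# Hypothetical plan: step size deviates from the expected 5.0 mm
plan = {"channel_length_mm": 1300.0, "first_source_position_mm": 1295.0, "step_size_mm": 5.5}
result = precheck(plan)  # step_size_mm fails, the rest pass
```

Keeping the tolerances in a configuration table, as the abstract describes, lets physicists adjust rules without touching the checking logic.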

  14. Video on Diet Before Outpatient Colonoscopy Does Not Improve Quality of Bowel Preparation: A Prospective, Randomized, Controlled Trial.

    PubMed

    Rice, Sean C; Higginbotham, Tina; Dean, Melanie J; Slaughter, James C; Yachimski, Patrick S; Obstein, Keith L

    2016-11-01

    Successful outpatient colonoscopy (CLS) depends on many factors including the quality of a patient's bowel preparation. Although education on consumption of the pre-CLS purgative can improve bowel preparation quality, no study has evaluated dietary education alone. We have created an educational video on pre-CLS dietary instructions to determine whether dietary education would improve outpatient bowel preparation quality. A prospective randomized, blinded, controlled study of patients undergoing outpatient CLS was performed. All patients received a 4 L polyethylene glycol-based split-dose bowel preparation and standard institutional pre-procedure instructions. Patients were then randomly assigned to an intervention arm or to a no intervention arm. A 4-min educational video detailing clear liquid diet restriction was made available to patients in the intervention arm, whereas those randomized to no intervention did not have access to the video. Patients randomized to the video were provided with the YouTube video link 48-72 h before CLS. An attending endoscopist blinded to randomization performed the CLS. Bowel preparation quality was scored using the Boston Bowel Preparation Scale (BBPS). Adequate preparation was defined as a BBPS total score of ≥6 with all segment scores ≥2. Wilcoxon rank-sum and Pearson's χ²-tests were performed to assess differences between groups. Ninety-two patients were randomized (video: n=42; control: n=50) with 47 total video views tallied. There were no demographic differences between groups. There was no statistically significant difference in adequate preparation between groups (video=74%; control=68%; P=0.54). The availability of a supplementary patient educational video on clear liquid diet alone was insufficient to improve bowel preparation quality when compared with standard pre-procedure instruction at our institution.
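The study's adequacy criterion (BBPS total score ≥6 with all segment scores ≥2) translates directly into a small check; a minimal sketch with an illustrative function name:

```python
def bbps_adequate(segment_scores):
    """Adequate prep per the study's definition:
    BBPS total score >= 6 AND every segment score >= 2."""
    return sum(segment_scores) >= 6 and all(s >= 2 for s in segment_scores)

# Three colon segments (right, transverse, left) are each scored 0-3 on the BBPS
good = bbps_adequate([3, 2, 2])          # total 7, all segments >= 2 -> adequate
poor_segment = bbps_adequate([3, 3, 1])  # total 7, but one segment < 2 -> inadequate
```

The two-part rule matters: a plan can meet the total-score threshold while a single poorly prepared segment still renders the preparation inadequate.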

  15. Kinematic analysis of the gait of adult sheep during treadmill locomotion: Parameter values, allowable total error, and potential for use in evaluating spinal cord injury.

    PubMed

    Safayi, Sina; Jeffery, Nick D; Shivapour, Sara K; Zamanighomi, Mahdi; Zylstra, Tyler J; Bratsch-Prince, Joshua; Wilson, Saul; Reddy, Chandan G; Fredericks, Douglas C; Gillies, George T; Howard, Matthew A

    2015-11-15

    We are developing a novel intradural spinal cord (SC) stimulator designed to improve the treatment of intractable pain and the sequelae of SC injury. In-vivo ovine models of neuropathic pain and moderate SC injury are being implemented for pre-clinical evaluations of this device, to be carried out via gait analysis before and after induction of the relevant condition. We extend previous studies on other quadrupeds to extract the three-dimensional kinematics of the limbs over the gait cycle of sheep walking on a treadmill. Quantitative measures of thoracic and pelvic limb movements were obtained from 17 animals. We calculated total-error values to define the analytical performance of our motion capture system for these kinematic variables. The post- vs. pre-injury time delay between contralateral thoracic- and pelvic-limb steps for normal and SC-injured sheep increased by ~24 s over 100 steps. The pelvic limb hoof velocity during swing phase decreased, while the range of pelvic hoof elevation and the distance between lateral pelvic hoof placements increased after SC injury. The kinematic measures in a single SC-injured sheep can be objectively defined as changed from the corresponding pre-injury values, implying the utility of this method for assessing new neuromodulation strategies for specific deficits exhibited by an individual. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. Model-Based Extracted Water Desalination System for Carbon Sequestration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dees, Elizabeth M.; Moore, David Roger; Li, Li

    Over the last 1.5 years, GE Global Research and Pennsylvania State University defined a model-based, scalable, multi-stage extracted water desalination system that yields clean water, concentrated brine and, optionally, salt. The team explored saline brines ranging across the expected range for extracted water from carbon sequestration reservoirs (40,000 up to 220,000 ppm total dissolved solids, TDS). In addition, the team validated the system performance at pilot scale with field-sourced water using GE's pre-pilot and lab facilities. This project encompassed four principal tasks, in addition to project management and planning: 1) identify a deep saline formation carbon sequestration site and a partner suitable for supplying extracted water; 2) conduct a techno-economic assessment and down-selection of pre-treatment and desalination technologies to identify a cost-effective system for extracted water recovery; 3) validate the down-selected processes at the lab/pre-pilot scale; and 4) define the scope of the pilot desalination project. Highlights from each task are described below. Deep saline formation characterization: The deep saline formations associated with the five DOE NETL 1260 Phase 1 projects were characterized with respect to their mineralogy and formation water composition. Sources of high-TDS feed water other than extracted water were explored for high-TDS desalination applications, including unconventional oil and gas and seawater reverse osmosis concentrate. Techno-economic analysis of desalination technologies: Techno-economic evaluations of alternative brine concentration technologies, including humidification-dehumidification (HDH), membrane distillation (MD), forward osmosis (FO), turboexpander-freeze, solvent extraction and high pressure reverse osmosis (HPRO), were conducted. These technologies were evaluated against conventional falling film-mechanical vapor recompression (FF-MVR) as a baseline desalination process. Furthermore, a quality function deployment (QFD) method was used to compare alternative high-TDS desalination technologies to FF-MVR. High pressure reverse osmosis was found to be a promising alternative desalination technology, and a deep-dive techno-economic analysis of HPRO was performed, including Capex and Opex estimates, for seawater RO (SWRO). Two additional cases were explored: 1) a comparison of an SWRO plus HPRO system to the option of doubling the size of a standard seawater RO system to achieve the same total pure water recovery rate; and 2) a flue gas desulfurization wastewater treatment zero-liquid discharge (ZLD) application, where preconcentration with RO (SWRO or SWRO + HPRO) before evaporation and crystallization was compared to FF-MVR and crystallization technologies without RO preconcentration. Pre-pilot process validation: Pre-pilot-scale tests were conducted using field production water to validate key process steps for extracted water pretreatment. Approximately 5,000 gallons of field produced water were processed through microfiltration, ultrafiltration and steam-regenerable sorbent operations; smaller quantities were processed through microclarification. In addition, analytical methods (purge-and-trap gas chromatography and Hach TOC analytical methods) were validated. Lab-scale HPRO elements were constructed and tested at high pressures to identify and mitigate technical risks of the technology. Lastly, improvements in RO membrane materials were identified as the necessary next step to achieve further improvement in element performance at high pressure. Scope of field pilot: A field pilot for extracted water pretreatment was designed.

  17. A quality improvement project sustainably decreased time to onset of active physical therapy intervention in patients with acute lung injury.

    PubMed

    Dinglas, Victor D; Parker, Ann M; Reddy, Dereddi Raja S; Colantuoni, Elizabeth; Zanni, Jennifer M; Turnbull, Alison E; Nelliot, Archana; Ciesla, Nancy; Needham, Dale M

    2014-10-01

    Rehabilitation started early during an intensive care unit (ICU) stay is associated with improved outcomes and is the basis for many quality improvement (QI) projects showing important changes in practice. However, little evidence exists regarding whether such changes are sustainable in real-world practice. To evaluate the sustained effect of a quality improvement project on the timing of initiation of active physical therapy intervention in patients with acute lung injury (ALI). This was a pre-post evaluation using prospectively collected data involving consecutive patients with ALI admitted pre-quality improvement (October 2004-April 2007, n = 120) versus post-quality improvement (July 2009-July 2012, n = 123) from a single medical ICU. The primary outcome was time to first active physical therapy intervention, defined as strengthening, mobility, or cycle ergometry exercises. Among ICU survivors, more patients in the post-quality improvement versus pre-quality improvement group received physical therapy in the ICU (89% vs. 24%, P < 0.001) and were able to stand, transfer, or ambulate during physical therapy in the ICU (64% vs. 7%, P < 0.001). Among all patients in the post-quality improvement versus pre-quality improvement group, there was a shorter median (interquartile range) time to first physical therapy (4 [2, 6] vs. 11 d [6, 29], P < 0.001) and a greater median (interquartile range) proportion of ICU days with physical therapy after initiation (50% [33, 67%] vs. 18% [4, 47%], P = 0.003). In multivariable regression analysis, the post-quality improvement period was associated with shorter time to physical therapy (adjusted hazard ratio [95% confidence interval], 8.38 [4.98, 14.11], P < 0.001), with this association significant for each of the 5 years during the post-quality improvement period. 
The following variables were independently associated with a longer time to physical therapy: higher Sequential Organ Failure Assessment score (0.93 [0.89, 0.97]), higher FiO2 (0.86 [0.75, 0.99] for each 10% increase), use of an opioid infusion (0.47 [0.25, 0.89]), and deep sedation (0.24 [0.12, 0.46]). In this single-site, pre-post analysis of patients with ALI, an early rehabilitation quality improvement project was independently associated with a substantial decrease in the time to initiation of active physical therapy intervention that was sustained over 5 years. Over the entire pre-post period, severity of illness and sedation were independently associated with a longer time to initiation of active physical therapy intervention in the ICU.
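
    The adjusted hazard ratios above can be given rough intuition with a toy exponential time-to-event model: under proportional hazards with a constant baseline hazard, scaling the hazard by HR divides the median time-to-event by HR. A minimal sketch (illustrative numbers only; this is not the study's Cox model):

```python
import math

def median_time(base_rate, hazard_ratio):
    """Median time-to-event for an exponential model whose constant
    hazard is base_rate scaled by hazard_ratio."""
    return math.log(2) / (base_rate * hazard_ratio)

# Illustrative numbers only: suppose the pre-QI group starts physical
# therapy at a median of 11 days (reported above), implying a baseline
# daily hazard of ln(2)/11.
base = math.log(2) / 11.0
hr_post = 8.38  # adjusted hazard ratio for the post-QI period (reported above)

print(round(median_time(base, 1.0), 1))      # 11.0 (pre-QI median, days)
print(round(median_time(base, hr_post), 1))  # 1.3 (implied post-QI median)
```

    The observed post-QI median of 4 days differs from the naive 1.3-day figure because real waiting times are not exponential and the adjusted HR conditions on covariates; the sketch only shows the direction and rough scale of the effect.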

  18. Multi-centre audit of VMAT planning and pre-treatment verification.

    PubMed

    Jurado-Bruggeman, Diego; Hernández, Victor; Sáez, Jordi; Navarro, David; Pino, Francisco; Martínez, Tatiana; Alayrach, Maria-Elena; Ailleres, Norbert; Melero, Alejandro; Jornet, Núria

    2017-08-01

    We performed a multi-centre intercomparison of VMAT dose planning and pre-treatment verification. The aims were to analyse the dose plans in terms of dosimetric quality and deliverability, and to validate whether in-house pre-treatment verification results agreed with those of an external audit. The nine participating centres encompassed different machines, equipment, and methodologies. Two mock cases (prostate and head and neck) were planned using one and two arcs. A plan quality index was defined to compare the plans and different complexity indices were calculated to check their deliverability. We compared gamma index pass rates using the centre's equipment and methodology to those of an external audit (global 3D gamma, absolute dose differences, 10% of maximum dose threshold). Log-file analysis was performed to look for delivery errors. All centres fulfilled the dosimetric goals but plan quality and delivery complexity were heterogeneous and uncorrelated, depending on the manufacturer and the planner's methodology. Pre-treatment verification results were within tolerance in all cases for the 3%-3mm gamma evaluation. Nevertheless, differences between the external audit and in-house measurements arose due to different equipment or methodology, especially for 2%-2mm criteria with differences up to 20%. No correlation was found between complexity indices and verification results amongst centres. All plans fulfilled dosimetric constraints, but plan quality and complexity did not correlate and were strongly dependent on the planner and the vendor. In-house measurements cannot completely replace external audits for credentialing. Copyright © 2017 Elsevier B.V. All rights reserved.
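
    The gamma analysis used for these pre-treatment comparisons combines a dose-difference criterion with a distance-to-agreement (DTA) criterion. A minimal 1-D global-gamma sketch (toy profiles; not the audit's equipment-specific implementation):

```python
import math

def gamma_pass_rate(ref, meas, spacing_mm, dose_tol, dta_mm):
    """Simplified 1-D global gamma analysis. dose_tol is a fraction of the
    reference maximum (e.g. 0.03 for 3%), dta_mm the distance-to-agreement
    criterion. Returns the fraction of reference points with gamma <= 1."""
    dmax = max(ref)
    passed = 0
    for i, d_ref in enumerate(ref):
        best = float("inf")
        for j, d_meas in enumerate(meas):
            dist = (i - j) * spacing_mm
            ddose = (d_meas - d_ref) / (dose_tol * dmax)
            best = min(best, math.hypot(dist / dta_mm, ddose))
        if best <= 1.0:
            passed += 1
    return passed / len(ref)

# Toy dose profiles (arbitrary units); "measured" deviates slightly.
ref  = [0.1, 0.4, 0.9, 1.0, 0.9, 0.4, 0.1]
meas = [0.1, 0.38, 0.88, 1.0, 0.92, 0.42, 0.1]
print(gamma_pass_rate(ref, meas, spacing_mm=1.0, dose_tol=0.03, dta_mm=3.0))  # 1.0
```

    Tightening the criteria (e.g. 2%-2mm) lowers pass rates, which is why the abstract reports the largest inter-centre discrepancies there.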

  19. Request Pattern, Pre-Analytical and Analytical Conditions of Urinalysis in Primary Care: Lessons from a One-Year Large-Scale Multicenter Study.

    PubMed

    Salinas, Maria; Lopez-Garrigos, Maite; Flores, Emilio; Leiva-Salinas, Carlos

    2018-06-01

    To study the urinalysis request, pre-analytical sample conditions, and analytical procedures. Laboratories were asked to provide the number of primary care urinalyses requested, and to fill out a questionnaire regarding pre-analytical conditions and analytical procedures. 110 laboratories participated in the study. 232.5 urinalyses/1,000 inhabitants were reported. 75.4% used the first morning urine. The sample reached the laboratory in less than 2 hours in 18.8%, between 2 - 4 hours in 78.3%, and between 4 - 6 hours in the remaining 2.9%. 92.5% combined the use of test strip and particle analysis, and only 7.5% used the strip exclusively. All participants except one performed automated particle analysis depending on strip results; in 16.2% the procedure was only manual. Urinalysis was highly requested. There was a lack of compliance with guidelines regarding time between micturition and analysis that usually involved the combination of strip followed by particle analysis.

  20. Strategies to define performance specifications in laboratory medicine: 3 years on from the Milan Strategic Conference.

    PubMed

    Panteghini, Mauro; Ceriotti, Ferruccio; Jones, Graham; Oosterhuis, Wytze; Plebani, Mario; Sandberg, Sverre

    2017-10-26

    Measurements in clinical laboratories produce results needed in the diagnosis and monitoring of patients. These results are always characterized by some uncertainty. What quality is needed and what measurement errors can be tolerated without jeopardizing patient safety should therefore be defined and specified for each analyte having clinical use. When these specifications are defined, the total examination process will be "fit for purpose" and the laboratory professionals should then set up rules to control the measuring systems to ensure they perform within specifications. The laboratory community has used different models to set performance specifications (PS). Recently, it was felt that there was a need to revisit different models and, at the same time, to emphasize the presuppositions for using the different models. Therefore, in 2014 the European Federation of Clinical Chemistry and Laboratory Medicine (EFLM) organized a Strategic Conference in Milan. It was felt that there was a need for more detailed discussions on, for instance, PS for EQAS, which measurands should use which models to set PS and how to set PS for the extra-analytical phases. There was also a need to critically evaluate the quality of data from biological variation studies and to further discuss the use of the total error (TE) concept. Consequently, EFLM established five Task and Finish Groups (TFGs) to address each of these topics. The TFGs are finishing their activity in 2017 and the content of this paper includes deliverables from these groups.
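
    One widely used formulation of the total error concept debated by the TFGs is the Westgard-style combination of bias and imprecision, TE = |bias| + 1.65 × CV, compared against an allowable specification. A sketch with hypothetical numbers:

```python
def total_error(bias_pct, cv_pct, z=1.65):
    """Total analytical error in the common Westgard-style formulation:
    |bias| + z * CV, with z = 1.65 covering ~95% one-sided."""
    return abs(bias_pct) + z * cv_pct

# Checking a measuring system against a performance specification
# (all numbers are hypothetical):
spec = 10.0                     # allowable total error, %
te = total_error(bias_pct=2.0, cv_pct=4.0)
print(round(te, 2), te <= spec)  # 8.6 True
```

    Whether TE should be used at all, and how PS should be set per measurand, is precisely what the abstract says the TFGs were established to examine.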

  1. An experimental study to evaluate the technological limitations in the understanding of the haemodynamic change in pre-eclampsia.

    PubMed

    Sengupta

    1998-08-01

    BACKGROUND: Conventional indices could not define the pathogenesis of pre-eclampsia and its predictability. It has also not been possible to record these indices from the local uteroplacental system where the pathology lies. OBJECTIVE: To investigate the limitations of the currently available blood pressure-flow measuring indices and techniques commonly used in pregnancy. METHOD: Blood pressure and velocity profiles were obtained under various pathophysiological conditions for pregnant and non-pregnant animals and human subjects. The data were analysed using both conventional and computer-based spectral methods. RESULTS: Continuous monitoring of blood pressure and velocity together with their spectral analysis appeared to be a useful sensitive indicator in pregnancy beyond the commonly available conventional analytical method. In high-resistance flow such as in hypertension and in pre-eclampsia, the power amplitude was relatively low at low frequency. Power amplitude remained high at low frequency in the normal low-resistance state of pregnancy. CONCLUSION: The results suggest the need to develop highly sensitive instrumentation whereby any minute variation in mean arterial pressure that is of clinical significance can be measured. Alternatively, analytical advancement, such as use of power spectrum analysers, might prove to be useful and sensitive. Variability of heart rate is an important determinant of the underlying pathophysiology in pregnancy. It is concluded that the heart rate of pre-eclamptics and hypertensives has to increase in order to maintain a constant organ blood flow whereas in normal pregnancy blood flow can rise even without an increase in heart rate. Future research should be directed towards blood flow mapping, power spectral analysis and image processing of the blood pressure-flow profile obtained from local and systemic compartments under different pathophysiological conditions of pregnancy.
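
    The power spectral analysis advocated here reduces to taking the discrete Fourier transform of a pressure or velocity trace and inspecting where the power concentrates. A naive DFT sketch on a synthetic signal (the trace and bin numbers are illustrative, not the study's recordings):

```python
import cmath
import math

def power_spectrum(x):
    """Power spectrum via a naive O(n^2) discrete Fourier transform,
    adequate for short illustrative signals."""
    n = len(x)
    spectrum = []
    for k in range(n // 2 + 1):
        s = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        spectrum.append(abs(s) ** 2)
    return spectrum

# Synthetic "pressure" trace: a slow oscillation (2 cycles per record)
# riding on a constant mean level.
n = 64
signal = [10.0 + math.sin(2 * math.pi * 2 * t / n) for t in range(n)]
p = power_spectrum(signal)
dominant = max(range(1, len(p)), key=lambda k: p[k])  # skip the DC bin
print(dominant)  # 2: the low-frequency component carries the power
```

    In practice an FFT would replace the O(n^2) loop, but the interpretation is the same: low-versus high-frequency power distribution distinguishes the flow states described above.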

  2. Liquid chromatography-mass spectrometry in metabolomics research: mass analyzers in ultra high pressure liquid chromatography coupling.

    PubMed

    Forcisi, Sara; Moritz, Franco; Kanawati, Basem; Tziotis, Dimitrios; Lehmann, Rainer; Schmitt-Kopplin, Philippe

    2013-05-31

    The present review gives an introduction to the concept of metabolomics and provides an overview of the analytical tools applied in non-targeted metabolomics with a focus on liquid chromatography (LC). LC is a powerful analytical tool in the study of complex sample matrices. A further development and configuration employing Ultra-High Pressure Liquid Chromatography (UHPLC) is optimized to provide the largest known liquid chromatographic resolution and peak capacity. Accordingly, UHPLC plays an important role in separation and consequent metabolite identification of complex molecular mixtures such as bio-fluids. The most sensitive detectors for these purposes are mass spectrometers. Almost any mass analyzer can be optimized to identify and quantify small pre-defined sets of targets; however, the number of analytes in metabolomics is far greater. Optimized protocols for quantification of large sets of targets may be rendered inapplicable. Results on small target set analyses on different sample matrices are easily comparable with each other. In non-targeted metabolomics there is almost no analytical method which is applicable to all different matrices due to limitations pertaining to mass analyzers and chromatographic tools. The specifications of the most important interfaces and mass analyzers are discussed. We additionally provide an exemplary application in order to demonstrate the level of complexity which remains intractable to date. The potential of coupling a high field Fourier Transform Ion Cyclotron Resonance Mass Spectrometer (ICR-FT/MS), the mass analyzer with the largest known mass resolving power, to UHPLC is given with an example of one human pre-treated plasma sample. This experimental example illustrates one way of overcoming the necessity of faster scanning rates in the coupling with UHPLC. The experiment enabled the extraction of thousands of features (analytical signals). 
A small subset of this compositional space could be mapped into a mass difference network whose topology shows specificity toward putative metabolite classes and retention time. Copyright © 2013 Elsevier B.V. All rights reserved.
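
    A mass difference network of the kind described can be sketched as a search for peak pairs separated by known elemental transformations. The peak list and transformation table below are illustrative, not the paper's data (the masses happen to correspond to a hexose and its CH2, O, and H2 adducts):

```python
# Exact mass differences (Da) for a few common elemental transformations.
TRANSFORMS = {"CH2": 14.01565, "O": 15.99491, "H2": 2.01565}

def mass_difference_network(masses, tol=0.0005):
    """Return edges (m1, m2, label) for every pair of masses whose
    difference matches a transformation in TRANSFORMS within tol."""
    edges = []
    ms = sorted(masses)
    for i, m1 in enumerate(ms):
        for m2 in ms[i + 1:]:
            for label, delta in TRANSFORMS.items():
                if abs((m2 - m1) - delta) <= tol:
                    edges.append((m1, m2, label))
    return edges

peaks = [180.06339, 194.07904, 196.05830, 182.07904]
for edge in mass_difference_network(peaks):
    print(edge)
```

    The topology of the resulting graph, as the abstract notes, clusters by putative metabolite class; ultra-high mass resolving power is what keeps `tol` tight enough for the matches to be meaningful.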

  3. Design, Implementation, and Operational Methodologies for Sub-arcsecond Attitude Determination, Control, and Stabilization of the Super-pressure Balloon-Borne Imaging Telescope (SuperBIT)

    NASA Astrophysics Data System (ADS)

    Javier Romualdez, Luis

    Scientific balloon-borne instrumentation offers an attractive, competitive, and effective alternative to space-borne missions when considering the overall scope, cost, and development timescale required to design and launch scientific instruments. In particular, the balloon-borne environment provides a near-space regime that is suitable for a number of modern astronomical and cosmological experiments, where the atmospheric interference suffered by ground-based instrumentation is negligible at stratospheric altitudes. This work is centered around the analytical strategies and implementation considerations for the attitude determination and control of SuperBIT, a scientific balloon-borne payload capable of meeting the strict sub-arcsecond pointing and image stability requirements demanded by modern cosmological experiments. Broadly speaking, the designed stability specifications of SuperBIT coupled with its observational efficiency, image quality, and accessibility rival state-of-the-art astronomical observatories such as the Hubble Space Telescope. To this end, this work presents an end-to-end design methodology for precision pointing balloon-borne payloads such as SuperBIT within an analytical yet implementationally grounded context. Simulation models of SuperBIT are analytically derived to aid in pre-assembly trade-off and case studies that are pertinent to the dynamic balloon-borne environment. From these results, state estimation techniques and control methodologies are extensively developed, leveraging the analytical framework of simulation models and design studies. This pre-assembly design phase is physically validated during assembly, integration, and testing through implementation in real-time hardware and software, which bridges the gap between analytical results and practical application. 
SuperBIT attitude determination and control is demonstrated throughout two engineering test flights that verify pointing and image stability requirements in flight, where the post-flight results close the overall design loop by suggesting practical improvements to pre-design methodologies. Overall, the analytical and practical results presented in this work, though centered around the SuperBIT project, provide generically useful and implementationally viable methodologies for high precision balloon-borne instrumentation, all of which are validated, justified, and improved both theoretically and practically. As such, the continuing development of SuperBIT, built from the work presented in this thesis, strives to further the potential for scientific balloon-borne astronomy in the near future.
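
    The abstract does not give SuperBIT's actual control laws, but the basic closed-loop idea behind precision pointing can be illustrated with a generic discrete PD controller driving a double-integrator pointing axis (gains, units, and the plant model are arbitrary assumptions for the sketch):

```python
def pd_step(error, prev_error, dt, kp, kd):
    """One update of a discrete proportional-derivative control law."""
    return kp * error + kd * (error - prev_error) / dt

# Toy pointing axis modelled as a double integrator (unit inertia):
# angle'' = commanded torque. Gains are illustrative, not SuperBIT's.
dt, kp, kd = 0.01, 40.0, 12.0
angle, rate = 0.0, 0.0
target = 1.0                      # desired pointing angle (arbitrary units)
prev_err = target - angle
for _ in range(2000):
    err = target - angle
    torque = pd_step(err, prev_err, dt, kp, kd)
    prev_err = err
    rate += torque * dt           # integrate acceleration to rate
    angle += rate * dt            # integrate rate to angle
print(round(angle, 4))            # settles on the target
```

    Real balloon-borne pointing layers state estimation, disturbance rejection, and nested fine/coarse stages on top of this skeleton, which is what the thesis develops.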

  4. [A predictive model for the quality of sexual life in hysterectomized women].

    PubMed

    Urrutia, María Teresa; Araya, Alejandra; Rivera, Soledad; Viviani, Paola; Villarroel, Luis

    2007-03-01

    The effects of hysterectomy on sexuality have been extensively studied. To establish a model to predict the quality of sexual life in hysterectomized women, six months after surgery. Analytical, longitudinal and prospective study of 90 hysterectomized women aged 45+/-7 years. Two structured interviews at the time of surgery and six months later were carried out to determine the characteristics of sexuality and communication within the couple. In the two interviews, communication and the quality of sexual life were described as "good" in 72 and 77% of women, respectively (NS). The variables that had a 40% influence on the quality of sexual life six months after surgery, were oophorectomy status, the presence of orgasm, the characteristics of communication and the basal sexuality with the couple. The sexuality of the hysterectomized women will depend, to a great extent, on pre-surgical variables. Therefore, it is important to consider these variables for the education of hysterectomized women.

  5. Technical pre-analytical effects on the clinical biochemistry of Atlantic salmon (Salmo salar L.).

    PubMed

    Braceland, M; Houston, K; Ashby, A; Matthews, C; Haining, H; Rodger, H; Eckersall, P D

    2017-01-01

    Clinical biochemistry has long been utilized in human and veterinary medicine as a vital diagnostic tool, but despite occasional studies showing its usefulness in monitoring health status in Atlantic salmon (Salmo salar L.), it has not yet been widely utilized within the aquaculture industry. This is due, in part, to a lack of an agreed protocol for collection and processing of blood prior to analysis. Moreover, while the analytical phase of clinical biochemistry is well controlled, there is a growing understanding that technical pre-analytical variables can influence analyte concentrations or activities. In addition, post-analytical interpretation of treatment effects is variable in the literature, thus making the true effect of sample treatment hard to evaluate. Therefore, a number of pre-analytical treatments have been investigated to examine their effect on analyte concentrations and activities. In addition, reference ranges for salmon plasma biochemical analytes have been established to inform veterinary practitioners and the aquaculture industry of the importance of clinical biochemistry in health and disease monitoring. Furthermore, a standardized protocol for blood collection has been proposed. © 2016 The Authors Journal of Fish Diseases Published by John Wiley & Sons Ltd.
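
    Reference ranges of the kind this study establishes are commonly derived non-parametrically as the central 95% of values from healthy individuals. A sketch with hypothetical analyte values (not the paper's salmon data, which would also need far more than 20 fish):

```python
def reference_interval(values, lower_pct=2.5, upper_pct=97.5):
    """Non-parametric reference interval: central 95% of observed values,
    with linear interpolation between closest ranks."""
    s = sorted(values)

    def percentile(p):
        idx = (len(s) - 1) * p / 100.0
        lo_i = int(idx)
        hi_i = min(lo_i + 1, len(s) - 1)
        return s[lo_i] + (s[hi_i] - s[lo_i]) * (idx - lo_i)

    return percentile(lower_pct), percentile(upper_pct)

# Hypothetical plasma analyte values (arbitrary units) from 20 healthy fish:
vals = [4.1, 4.3, 4.5, 4.6, 4.7, 4.8, 4.9, 5.0, 5.0, 5.1,
        5.1, 5.2, 5.3, 5.4, 5.5, 5.6, 5.7, 5.9, 6.0, 6.4]
lo, hi = reference_interval(vals)
print(round(lo, 2), round(hi, 2))
```

    Controlling the pre-analytical treatments discussed above matters precisely because uncontrolled variation inflates such intervals and blunts their diagnostic value.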

  6. RNA quality in fresh-frozen gastrointestinal tumor specimens-experiences from the tumor and healthy tissue bank TU Dresden.

    PubMed

    Zeugner, Silke; Mayr, Thomas; Zietz, Christian; Aust, Daniela E; Baretton, Gustavo B

    2015-01-01

    The term "pre-analytics" summarizes all procedures concerned with specimen collection or processing as well as logistical aspects like transport or storage of tissue specimens. All of these variables, as well as tissue-specific characteristics, affect sample quality. While certain parameters like warm ischemia or tissue-specific characteristics cannot be changed, other parameters can be assessed and optimized. The aim of this study was to determine RNA quality by assessing the RIN values of specimens from different organs and to assess the influence of vacuum preservation. Samples from the GI tract, in general, appear to have lower RNA quality when compared to samples from other organ sites. This may be due to the digestive enzymes or bacterial colonization. Processing time in pathology does not significantly influence RNA quality. Tissue preservation with a vacuum sealer leads to preserved RNA quality over an extended period of time and offers a feasible alternative to minimize the influence of transport time into pathology.

  7. Methods of analysis by the U.S. Geological Survey National Water Quality Laboratory; determination of pesticides in water by Carbopak-B solid-phase extraction and high-performance liquid chromatography

    USGS Publications Warehouse

    Werner, Stephen L.; Burkhardt, Mark R.; DeRusseau, Sabrina N.

    1996-01-01

    In accordance with the needs of the National Water-Quality Assessment Program (NAWQA), the U.S. Geological Survey has developed and implemented a graphitized carbon-based solid-phase extraction and high-performance liquid chromatographic analytical method. The method is used to determine 41 pesticides and pesticide metabolites that are not readily amenable to gas chromatography or other high-temperature analytical techniques. Pesticides are extracted from filtered environmental water samples using a 0.5-gram graphitized carbon-based solid-phase cartridge, eluted from the cartridge into two analytical fractions, and analyzed using high-performance liquid chromatography with photodiode-array detection. The upper concentration limit is 1.6 micrograms per liter (µg/L) for most compounds. Single-operator method detection limits in organic-free water samples ranged from 0.006 to 0.032 µg/L. Recoveries in organic-free water samples ranged from 37 to 88 percent. Recoveries in ground- and surface-water samples ranged from 29 to 94 percent. An optional on-site extraction procedure allows for samples to be collected and processed at remote sites where it is difficult to ship samples to the laboratory within the recommended pre-extraction holding time of 7 days.

  8. Cytological preparations for molecular analysis: A review of technical procedures, advantages and limitations for referring samples for testing.

    PubMed

    da Cunha Santos, G; Saieg, M A; Troncone, G; Zeppa, P

    2018-04-01

    Minimally invasive procedures such as endobronchial ultrasound-guided transbronchial needle aspiration (EBUS-TBNA) must yield not only good quality and quantity of material for morphological assessment, but also an adequate sample for analysis of molecular markers to guide patients to appropriate targeted therapies. In this context, cytopathologists worldwide should be familiar with minimum requirements for referring cytological samples for testing. The present manuscript is a review with comprehensive description of the content of the workshop entitled Cytological preparations for molecular analysis: pre-analytical issues for EBUS TBNA, presented at the 40th European Congress of Cytopathology in Liverpool, UK. The present review emphasises the advantages and limitations of different types of cytology substrates used for molecular analysis such as archival smears, liquid-based preparations, archival cytospin preparations and FTA (Flinders Technology Associates) cards, as well as their technical requirements/features. These various types of cytological specimens can be successfully used for an extensive array of molecular studies, but the quality and quantity of extracted nucleic acids rely directly on adequate pre-analytical assessment of those samples. In this setting, cytopathologists must not only be familiar with the different types of specimens and associated technical procedures, but also correctly handle the material provided by minimally invasive procedures, ensuring that there is a sufficient amount of material for a precise diagnosis and correct management of the patient through personalised care. © 2018 John Wiley & Sons Ltd.

  9. Two methods for proteomic analysis of formalin-fixed, paraffin embedded tissue result in differential protein identification, data quality, and cost.

    PubMed

    Luebker, Stephen A; Wojtkiewicz, Melinda; Koepsell, Scott A

    2015-11-01

    Formalin-fixed paraffin-embedded (FFPE) tissue is a rich source of clinically relevant material that can yield important translational biomarker discovery using proteomic analysis. Protocols for analyzing FFPE tissue by LC-MS/MS exist, but standardization of procedures and critical analysis of data quality is limited. This study compared and characterized data obtained from FFPE tissue using two methods: a urea in-solution digestion method (UISD) versus a commercially available Qproteome FFPE Tissue Kit method (Qkit). Each method was performed independently three times on serial sections of homogenous FFPE tissue to minimize pre-analytical variations and analyzed with three technical replicates by LC-MS/MS. Data were evaluated for reproducibility and physiochemical distribution, which highlighted differences in the ability of each method to identify proteins of different molecular weights and isoelectric points. Each method replicate resulted in a significant number of new protein identifications, and both methods identified significantly more proteins using three technical replicates as compared to only two. UISD was cheaper, required less time, and introduced significant protein modifications as compared to the Qkit method, which provided more precise and higher protein yields. These data highlight significant variability among method replicates and type of method used, despite minimizing pre-analytical variability. Utilization of only one method or too few replicates (both method and technical) may limit the subset of proteomic information obtained. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Sexual Knowledge and Attitude among Girls Who are Getting Married Based on the Information from Yas Pre-marriage Counseling Center.

    PubMed

    Baghersad, Zahra; Fahami, Fariba; Beigi, Marjan; Hasanzadeh, Akbar

    2017-01-01

    High prevalence of sexual dysfunction results from inadequate knowledge or inappropriate attitude toward the natural phenomenon of sexual desire. This study aimed to define sexual knowledge and attitude among girls who were getting married and referred to Yas pre-marriage counseling center. This research was a descriptive analytical study. The information of 165 girls who were about to get married was collected through convenience sampling using a researcher-made questionnaire. Data were analyzed using SPSS version 16 software. Inferential statistical methods and Pearson correlation were used for data analysis. Results showed that the mean scores of sexual knowledge and attitude among the participants were 57.42 and 69.02, respectively. There was a significant association between the mean scores of sexual knowledge and sexual attitude (P < 0.001, r = 0.63). Results showed that the participants had relatively appropriate knowledge and attitude toward sexual relationships.
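
    The reported association (r = 0.63) is the Pearson product-moment correlation, which can be computed directly from paired scores. A self-contained sketch; the five score pairs below are hypothetical, not the study data:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical paired knowledge/attitude scores for five respondents:
knowledge = [40, 50, 55, 60, 70]
attitude  = [55, 60, 72, 70, 85]
print(round(pearson_r(knowledge, attitude), 2))  # 0.96
```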

  11. Influence of Pre-Analytical Factors on Thymus- and Activation-Regulated Chemokine Quantitation in Plasma

    PubMed Central

    Zhao, Xuemei; Delgado, Liliana; Weiner, Russell; Laterza, Omar F.

    2015-01-01

    Thymus- and activation-regulated chemokine (TARC) in serum/plasma associates with the disease activity of atopic dermatitis (AD), and is a promising tool for assessing the response to the treatment of the disease. TARC also exists within platelets, with elevated levels detectable in AD patients. We examined the effects of pre-analytical factors on the quantitation of TARC in human EDTA plasma. TARC levels in platelet-free plasma were significantly lower than those in platelet-containing plasma. After freeze-thaw, TARC levels increased in platelet-containing plasma, but remained unchanged in platelet-free plasma, suggesting TARC was released from the platelets during the freeze-thaw process. In contrast, TARC levels were stable in serum independent of freeze-thaw. These findings underscore the importance of pre-analytical factors to TARC quantitation. Plasma TARC levels should be measured in platelet-free plasma for accurate quantitation. Pre-analytical factors influence the quantitation, interpretation, and implementation of circulating TARC as a biomarker for the development of AD therapeutics. PMID:28936246

  12. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    PubMed

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for quality by design (QbD) based analytical approach. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the analytical quality by design (AQbD) approach reduces the number of out-of-trend (OOT) results and out-of-specification (OOS) results due to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement analytical quality by design (AQbD) in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper has been communicated to discuss different views of analytical scientists about implementation of AQbD in the pharmaceutical quality system and also to correlate it with product quality by design and pharmaceutical analytical technology (PAT).
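
    Operating "within the MODR" can be pictured as checking that every method parameter stays inside its validated range; outside it, robustness claims no longer hold. A minimal sketch with hypothetical parameter ranges (not from any specific filed method):

```python
# Hypothetical method operable design region (MODR) for an assay:
# ranges within which the method was shown to remain robust.
MODR = {"flow_mL_min": (0.8, 1.2), "column_temp_C": (30, 40), "pH": (2.8, 3.4)}

def within_modr(conditions):
    """True if every operating parameter lies inside its MODR range."""
    return all(lo <= conditions[p] <= hi for p, (lo, hi) in MODR.items())

print(within_modr({"flow_mL_min": 1.0, "column_temp_C": 35, "pH": 3.0}))  # True
print(within_modr({"flow_mL_min": 1.3, "column_temp_C": 35, "pH": 3.0}))  # False
```

    The regulatory flexibility described above amounts to allowing such moves inside the region without a new submission.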

  13. Analytical Quality by Design: A Tool for Regulatory Flexibility and Robust Analytics

    PubMed Central

    Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for quality by design (QbD) based analytical approach. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the analytical quality by design (AQbD) approach reduces the number of out-of-trend (OOT) results and out-of-specification (OOS) results due to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement analytical quality by design (AQbD) in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper has been communicated to discuss different views of analytical scientists about implementation of AQbD in the pharmaceutical quality system and also to correlate it with product quality by design and pharmaceutical analytical technology (PAT). PMID:25722723

  14. Applying Data Mining Techniques to Chemical Analyses of Pre-drill Groundwater Samples within the Marcellus Formation Shale Play in Bradford County, Pennsylvania

    NASA Astrophysics Data System (ADS)

    Wen, T.; Niu, X.; Gonzales, M. S.; Li, Z.; Brantley, S.

    2017-12-01

    Groundwater samples are collected for chemical analyses by shale gas industry consultants in the vicinity of proposed gas wells in Pennsylvania. These data sets are archived so that the chemistry of water from homeowner wells can be compared to chemistry after gas-well drilling. Improved public awareness of groundwater quality issues will contribute to designing strategies for both water resource management and hydrocarbon exploration. We have received water analyses for 11,000 groundwater samples from PA Department of Environmental Protection (PA DEP) in the Marcellus Shale footprint in Bradford County, PA for the years ranging from 2010 to 2016. The PA DEP has investigated these analyses to determine whether gas well drilling or other activities affected water quality. We are currently investigating these analyses to look for patterns in chemistry throughout the study area (related or unrelated to gas drilling activities) and to look for evidence of analytes that may be present at concentrations higher than the advised standards for drinking water. Our preliminary results reveal that dissolved methane concentrations tend to be higher along fault lines in Bradford County [1]. Lead (Pb), arsenic (As), and barium (Ba) are sometimes present at levels above the EPA maximum contaminant level (MCL). Iron (Fe) and manganese (Mn) more frequently violate the EPA standard. We find that concentrations of some chemical analytes (e.g., Ba and Mn) are dependent on bedrock formations (i.e., Catskill vs. Lock Haven) while concentrations of other analytes (e.g., Pb) are not statistically significantly distinct between different bedrock formations. Our investigations are also focused on looking for correlations that might explain water quality patterns with respect to human activities such as gas drilling. However, percentages of water samples failing EPA MCL with respect to Pb, As, and Ba have decreased from previous USGS and PSU studies in the 1990s and 2000s. 
Public access to pre-drill datasets such as the one we are investigating will allow better understanding of the controls on ground water chemistry, i.e., natural and anthropogenic impacts. [1] Li et al. (2016) Journal of Contaminant Hydrology 195, 23-30.
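
    Screening analyses against a maximum contaminant level reduces to an exceedance-rate computation per analyte. The well data below are hypothetical; the thresholds are the EPA drinking-water values for the analytes named above (As and Ba MCLs, Pb action level), stated here from general knowledge rather than the study:

```python
# Concentration limits in mg/L: As and Ba MCLs, Pb action level.
LIMITS = {"As": 0.010, "Ba": 2.0, "Pb": 0.015}

# Hypothetical pre-drill concentrations (mg/L) for four wells:
samples = [
    {"As": 0.004, "Ba": 0.8, "Pb": 0.002},
    {"As": 0.012, "Ba": 2.5, "Pb": 0.001},
    {"As": 0.006, "Ba": 1.1, "Pb": 0.020},
    {"As": 0.003, "Ba": 0.6, "Pb": 0.004},
]

def exceedance_rate(samples, analyte, limit):
    """Percent of samples whose concentration exceeds the limit."""
    hits = sum(1 for s in samples if s[analyte] > limit)
    return 100.0 * hits / len(samples)

for analyte, limit in LIMITS.items():
    print(analyte, exceedance_rate(samples, analyte, limit))
```

    Stratifying the same computation by bedrock formation or proximity to fault lines is the kind of pattern-mining the study describes.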

  15. Variation of organic matter quantity and quality in streams at Critical Zone Observatory watersheds

    USGS Publications Warehouse

    Miller, Matthew P.; Boyer, Elizabeth W.; McKnight, Diane M.; Brown, Michael G.; Gabor, Rachel S.; Hunsaker, Carolyn T.; Iavorivska, Lidiia; Inamdar, Shreeram; Kaplan, Louis A.; Johnson, Dale W.; Lin, Henry; McDowell, William H.; Perdrial, Julia N.

    2016-01-01

    The quantity and chemical composition of dissolved organic matter (DOM) in surface waters influence ecosystem processes and anthropogenic use of freshwater. However, despite the importance of understanding spatial and temporal patterns in DOM, measures of DOM quality are not routinely included as part of large-scale ecosystem monitoring programs and variations in analytical procedures can introduce artifacts. In this study, we used consistent sampling and analytical methods to meet the objective of defining variability in DOM quantity and quality and other measures of water quality in streamflow issuing from small forested watersheds located within five Critical Zone Observatory sites representing contrasting environmental conditions. Results show distinct separations among sites as a function of water quality constituents. Relationships among rates of atmospheric deposition, water quality conditions, and stream DOM quantity and quality are consistent with the notion that areas with relatively high rates of atmospheric nitrogen and sulfur deposition and high concentrations of divalent cations result in selective transport of DOM derived from microbial sources, including in-stream microbial phototrophs. We suggest that the critical zone as a whole strongly influences the origin, composition, and fate of DOM in streams. This study highlights the value of consistent DOM characterization methods included as part of long-term monitoring programs for improving our understanding of interactions among ecosystem processes as controls on DOM biogeochemistry.

  16. The Quality of Rare Disease Registries: Evaluation and Characterization.

    PubMed

    Coi, Alessio; Santoro, Michele; Villaverde-Hueso, Ana; Lipucci Di Paola, Michele; Gainotti, Sabina; Taruscio, Domenica; Posada de la Paz, Manuel; Bianchi, Fabrizio

    2016-01-01

    The focus on the quality of the procedures for data collection, storage, and analysis in the definition and implementation of a rare disease registry (RDR) is the basis for developing a valid and long-term sustainable tool. The aim of this study was to provide useful information for characterizing a quality profile for RDRs, using an analytical approach applied to RDRs participating in the European Platform for Rare Disease Registries 2011-2014 (EPIRARE) survey. An indicator of quality was defined by choosing a small set of quality-related variables derived from the survey. The random forest method was used to identify the variables best defining a quality profile for RDRs. Fisher's exact test was employed to assess the association with the indicator of quality, and the Cochran-Armitage test was used to check for a linear trend across different levels of quality. The set of variables found to characterize high-quality RDRs focused on ethical and legal issues, governance, communication of activities and results, established procedures to regulate access to data and security, and established plans to ensure long-term sustainability. The quality of RDRs is usually associated with good oversight and governance mechanisms and with durable funding. The results suggest that RDRs would benefit from support in management, information technology, epidemiology, and statistics. © 2016 S. Karger AG, Basel.
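    The two association tests named in this abstract can be sketched as follows. All counts, groupings, and variable meanings below are invented for illustration; the real EPIRARE survey variables are not reproduced here.

```python
# Sketch of the two tests named in the abstract: Fisher's exact test for
# association between a registry feature and a binary quality indicator, and
# a Cochran-Armitage test for a linear trend across ordered quality levels.
from scipy.stats import fisher_exact, norm

# 2x2 table (invented): rows = feature present / absent, cols = high / low quality
table = [[18, 4], [7, 15]]
odds_ratio, p_fisher = fisher_exact(table)

def cochran_armitage(successes, totals, scores=None):
    """Two-sided Cochran-Armitage trend test across ordered groups."""
    if scores is None:
        scores = list(range(len(totals)))
    n = sum(totals)
    p_bar = sum(successes) / n
    stat = sum(w * (x - t * p_bar) for w, x, t in zip(scores, successes, totals))
    w_bar = sum(w * t for w, t in zip(scores, totals)) / n
    var = p_bar * (1 - p_bar) * sum(t * (w - w_bar) ** 2
                                    for w, t in zip(scores, totals))
    z = stat / var ** 0.5
    return z, 2 * norm.sf(abs(z))

# e.g. registries with durable funding among low / medium / high quality groups
z, p_trend = cochran_armitage(successes=[3, 9, 16], totals=[20, 20, 20])
```

    With these invented counts the association is strong (odds ratio well above 1) and the trend is positive; the actual analysis would substitute the survey-derived variables and the paper's quality indicator.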

  17. Defining Pre-Katrina New Orleans: The Structural Transformation of Public Education in New Orleans and Historical Memory

    ERIC Educational Resources Information Center

    Boselovic, Joseph L.

    2014-01-01

    Although considerable work has been done around the supposed successes and failures of education reform in post-Katrina New Orleans, concerns about the public/private qualities of new policies are often not discussed explicitly. In kind, this article serves to investigate theoretical conceptions of the public as they relate to education while…

  18. Practical guidelines for the characterization and quality control of pure drug nanoparticles and nano-cocrystals in the pharmaceutical industry.

    PubMed

    Peltonen, Leena

    2018-06-16

    The number of poorly soluble drug candidates is increasing, and this is reflected in the research interest in drug nanoparticles and (nano-)cocrystals; improved solubility is the most important application of these nanosystems. In order to confirm the functionality of these nanoparticles throughout their lifecycle (repeatability of the formulation processes, functional performance of the formed systems in a pre-determined way, and system stability), a thorough physicochemical understanding, with the aid of the necessary analytical techniques, is needed. Even very minor deviations in, for example, particle size or size distribution at the nanoscale can alter product bioavailability, and the effect is even more dramatic for the smallest particle size fractions. Small particle size also sets special requirements for the analytical techniques. In this review, the most important physicochemical properties of drug nanocrystals and nano-cocrystals are presented, and suitable analytical techniques, with their pros and cons, are described from a practical point of view. Copyright © 2018. Published by Elsevier B.V.

  19. Computer simulations for bioequivalence trials: Selection of analyte in BCS class II and IV drugs with first-pass metabolism, two metabolic pathways and intestinal efflux transporter.

    PubMed

    Mangas-Sanjuan, Victor; Navarro-Fontestad, Carmen; García-Arieta, Alfredo; Trocóniz, Iñaki F; Bermejo, Marival

    2018-05-30

    A semi-physiological two-compartment pharmacokinetic model with two active metabolites (a primary (PM) and a secondary metabolite (SM)), a saturable and non-saturable pre-systemic efflux transporter, and intestinal and hepatic metabolism has been developed. The aim of this work is to explore, in several scenarios, which analyte (parent drug or any of the metabolites) is the most sensitive to changes in drug product performance (i.e. differences in in vivo dissolution) and to make recommendations based on the simulation outcomes. A total of 128 scenarios (2 Biopharmaceutics Classification System (BCS) drug types, 2 levels of the P-gp Michaelis-Menten constant (KM), in 4 metabolic scenarios at 2 dose levels in 4 quality levels of the drug product) were simulated for BCS class II and IV drugs. Monte Carlo simulations of all bioequivalence studies were performed in NONMEM 7.3. Results showed the parent drug (PD) was the most sensitive analyte for bioequivalence trials in all the studied scenarios. PM and SM showed less than or the same sensitivity to detect differences in pharmaceutical quality as the PD. Another relevant result is that the mean point estimates of Cmax and AUC from the Monte Carlo simulations allow the most sensitive analyte to be selected more accurately than the criterion based on the percentage of failed or successful BE studies, even for metabolites, which frequently show greater variability than the PD. Copyright © 2018 Elsevier B.V. All rights reserved.
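    One simulated bioequivalence trial of the kind counted in such Monte Carlo studies can be sketched in a much-simplified form. The paper ran full semi-physiological NONMEM simulations; the log-normal Cmax model, sample size, CV, and true ratio below are invented stand-ins, with the standard average-bioequivalence decision rule (90% CI of the geometric mean ratio within 0.80-1.25).

```python
# Hypothetical sketch of a single simulated bioequivalence (BE) trial:
# simulate per-subject log(test/reference) Cmax differences, then apply the
# standard 80-125% criterion to the 90% CI of the geometric mean ratio.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 24                      # subjects per trial (invented)
cv = 0.25                   # within-subject CV of Cmax (invented)
sigma = np.sqrt(np.log(1 + cv ** 2))
true_ratio = 0.95           # true test/reference geometric mean ratio

log_diff = rng.normal(np.log(true_ratio), sigma * np.sqrt(2), n)
mean = log_diff.mean()
se = log_diff.std(ddof=1) / np.sqrt(n)
t_crit = stats.t.ppf(0.95, df=n - 1)   # two one-sided tests at 5%
lo, hi = np.exp(mean - t_crit * se), np.exp(mean + t_crit * se)
bioequivalent = bool(lo >= 0.80 and hi <= 1.25)
```

    Repeating such trials many times per scenario and analyte (PD, PM, SM) is, in spirit, what the percentage-of-failed-studies and mean-point-estimate criteria summarize.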

  20. Air quality and acute myocardial infarction in adults during the 2016 Hangzhou G20 summit.

    PubMed

    Wang, Ming-Wei; Chen, Juan; Cai, Ran

    2018-04-01

    To fulfill its commitment to a successful Hangzhou G20 summit (4 to 5 September 2016), the Chinese government implemented a series of measures to improve the air quality in Hangzhou. We report findings on air quality and acute myocardial infarction (AMI) hospital admissions in adults during the Hangzhou G20 summit. Three study periods were defined. The first period was pre-G20 (28 July to 27 August: limited restrictions on industrial emissions). The second period was G20 (28 August to 6 September), when there were further restrictions on industrial emissions and increased transportation restrictions. The third period was post-G20 (7 September to 6 October), when restrictions were relaxed again. The mean number of AMI admissions per day was 13.3 during pre-G20, 8.2 during G20, and 15.1 during post-G20. We used time-series Poisson regression models to estimate the relative risk (RR) for AMI associated with pollution levels. Our results suggest that the air quality improvement can reduce the number of hospital admissions for AMI.

  1. Framework for the quantitative weight-of-evidence analysis of 'omics data for regulatory purposes.

    PubMed

    Bridges, Jim; Sauer, Ursula G; Buesen, Roland; Deferme, Lize; Tollefsen, Knut E; Tralau, Tewes; van Ravenzwaay, Ben; Poole, Alan; Pemberton, Mark

    2017-12-01

    A framework for the quantitative weight-of-evidence (QWoE) analysis of 'omics data for regulatory purposes is presented. The QWoE framework encompasses seven steps to evaluate 'omics data (also together with non-'omics data): (1) Hypothesis formulation, identification and weighting of lines of evidence (LoEs). LoEs conjoin different (types of) studies that are used to critically test the hypothesis. As an essential component of the QWoE framework, step 1 includes the development of templates for scoring sheets that predefine scoring criteria with scores of 0-4 to enable a quantitative determination of study quality and data relevance; (2) literature searches and categorisation of studies into the pre-defined LoEs; (3) and (4) quantitative assessment of study quality and data relevance using the respective pre-defined scoring sheets for each study; (5) evaluation of LoE-specific strength of evidence based upon the study quality and study relevance scores of the studies conjoined in the respective LoE; (6) integration of the strength of evidence from the individual LoEs to determine the overall strength of evidence; (7) characterisation of uncertainties and conclusion on the QWoE. To put the QWoE framework in practice, case studies are recommended to confirm the relevance of its different steps, or to adapt them as necessary. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
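    The bookkeeping in steps 1-6 can be sketched minimally: each study receives 0-4 quality and relevance scores (steps 3-4), studies are grouped into weighted lines of evidence (LoEs), and LoE strengths are integrated into an overall strength of evidence (steps 5-6). The scale normalization, the weights, and the combination rules below are illustrative assumptions, not the authors' prescribed formulas.

```python
# Minimal, hypothetical sketch of QWoE score aggregation across LoEs.

def loe_strength(studies):
    """Mean per-study strength, taking strength = quality x relevance / 16."""
    return sum(q * r for q, r in studies) / (16 * len(studies))

def overall_strength(loes):
    """Weight-averaged strength across lines of evidence."""
    total_weight = sum(w for w, _ in loes)
    return sum(w * loe_strength(s) for w, s in loes) / total_weight

# (weight, [(quality, relevance), ...]) per LoE; all values invented
loes = [
    (2.0, [(4, 3), (3, 3)]),         # e.g. in vivo 'omics studies
    (1.0, [(2, 4), (3, 2), (2, 2)]), # e.g. supporting non-'omics studies
]
score = overall_strength(loes)  # 0 = no support for the hypothesis, 1 = maximal
```

    In practice the pre-defined scoring sheets from step 1 would supply the criteria behind each 0-4 score, and step 7 would attach an uncertainty characterization to the final value.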

  2. Core components of a comprehensive quality assurance program in anatomic pathology.

    PubMed

    Nakhleh, Raouf E

    2009-11-01

    In this article, the core components of a comprehensive quality assurance and improvement plan are outlined. Quality anatomic pathology work rests on a focus on accurate, timely, and complete reports. A commitment to continuous quality improvement and a systems approach, pursued with persistent effort, help to achieve this end. Departments should have a quality assurance and improvement plan that includes a risk assessment of real and potential problems facing the laboratory. The plan should also list the individuals responsible for carrying out the program with adequate resources, a defined timetable, and annual assessment of progress and future directions. Quality assurance monitors should address regulatory requirements and be organized by laboratory division (surgical pathology, cytology, etc.) as well as by 5 segments (preanalytic, analytic, and postanalytic phases of the test cycle, turnaround time, and customer satisfaction). Quality assurance data can also be used to evaluate individual pathologists using multiple parameters with peer group comparison.

  3. 42 CFR 493.1289 - Standard: Analytic systems quality assessment.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Standard: Analytic systems quality assessment. 493... Nonwaived Testing Analytic Systems § 493.1289 Standard: Analytic systems quality assessment. (a) The... through 493.1283. (b) The analytic systems quality assessment must include a review of the effectiveness...

  4. Transformation in the pharmaceutical industry: transformation-induced quality risks--a survey.

    PubMed

    Shafiei, Nader; Ford, James L; Morecroft, Charles W; Lisboa, Paulo J; Taylor, Mark J; Mouzughi, Yusra

    2013-01-01

    This paper is the fourth in a series that explores ongoing transformation in the pharmaceutical industry and its impact on pharmaceutical quality from the perspective of risk identification. The aim of this paper is to validate the proposed quality risks through elicitation of expert opinion and to define the resultant quality risk model. Expert opinion was obtained using a questionnaire-based survey of participants with recognized expertise in pharmaceutical regulation, product lifecycle, or technology. The results of the survey validate the theoretical and operational evidence in support of the four main pharmaceutical transformation triggers previously identified. The quality risk model resulting from the survey indicated a firm relationship between the pharmaceutical quality risks and regulatory compliance outcomes during the marketing approval and post-marketing phases of the product lifecycle, and a weaker relationship during the pre-market evaluation phase. Through this expert opinion survey, the quality risks carried forward from an earlier part of the research are validated and the resultant quality risk model is defined. The quality risk model indicates that transformation-related risks have a larger regulatory compliance impact during product approval, manufacturing, distribution, and commercial use than during the development phase.

  5. Choosing Pre-conception Planning for Women/Families: Counselling and Informed Consent (Part 2) - Pre-conception Reproductive Planning, Lifestyle, Immunization, and Psychosocial Issues.

    PubMed

    Wilson, R Douglas

    2017-12-06

    To inform reproductive and other health care providers about pre-conception evaluation, including considerations for reproductive planning, lifestyle modification, immunization status and attitudes, and psychosocial issues. This counselling information can be used for patient education and planning and possible pre-conception and/or prenatal testing. This information may allow for improved risk assessment when pre-conception counselling for individual patients and their families is used. CONSIDERATIONS FOR PRE-CONCEPTION CARE (PART 2) REGARDING PRE-CONCEPTION REPRODUCTIVE PLANNING, LIFESTYLE, IMMUNIZATIONS, AND PSYCHOSOCIAL ISSUES: CONSIDERATION FOR CARE STATEMENTS: For this review article, the Consideration for Care Statements use the Grading of Recommendations, Assessment, Development and Evaluations strength and quality principles because they are comparable for the clinician and the patient/public user. For example, "Strong" for clinicians is defined as "the recommendation would apply to most individuals. Formal discussion aids are not likely to be needed to help individuals make decisions consistent with their values and preferences." For patients/the public, "Strong" is defined as, "we believe most people in this situation would want the recommended course of actions and only a small number would not." Quality of evidence (High, Moderate, Low) is based on the confidence that the true effect lies close to that of the estimate of the effect. In addition, the Canadian Task Force on Preventive Health Care key to evidence statements and grading of recommendations are included. PubMed, Medline, and the Cochrane Database were searched until May 2017, using appropriate key words (i.e., preconception, reproductive planning, lifestyle modification, immunization risks and benefits, psychosocial pregnancy factors/issues). 
Grey (unpublished) literature was identified through searching websites of health technology assessment and health technology assessment-related agencies, clinical practice guideline collections, and national and international medical specialty societies. The benefits for the patient and her family from receiving this pre-conception counseling would include an increased understanding of the relevant issues for both pre-conception and in early pregnancy as well as better pregnancy outcomes. Harm includes potential increased anxiety or psychological stress associated with the possibility of identifying maternal pregnancy risks. Copyright © 2017 Society of Obstetricians and Gynaecologists of Canada. Published by Elsevier Inc. All rights reserved.

  6. Improving Histopathology Laboratory Productivity: Process Consultancy and A3 Problem Solving.

    PubMed

    Yörükoğlu, Kutsal; Özer, Erdener; Alptekin, Birsen; Öcal, Cem

    2017-01-01

    The ISO 17020 quality program has been run in our pathology laboratory for four years to establish an action plan for correction and prevention of identified errors. In this study, we aimed to evaluate the errors that we could not identify through ISO 17020 and/or could not solve without process consulting. Process consulting is carefully intervening in a group or team to help it accomplish its goals. The A3 problem solving process was run under the leadership of a 'workflow, IT and consultancy manager'. An action team was established consisting of technical staff. A root cause analysis was applied to the target conditions, and the 6-S method was implemented for solution proposals. Applicable proposals were activated and the results were rated by six-sigma analysis. Non-applicable proposals were reported to the laboratory administrator. Mislabelling was the most complained-about issue, triggering all pre-analytical errors. There were 21 non-value-added steps grouped into 8 main targets on the fishbone diagram (transporting, recording, moving, individual, waiting, over-processing, over-transaction and errors). Unnecessary redundant requests, missing slides, archiving issues, redundant activities, and mislabelling errors were proposed to be solved by improving visibility and fixing spaghetti problems. Spatial re-organization, organizational marking, re-defining some operations, and labeling activities raised the six-sigma score from 24% to 68% for all phases. Operational changes such as implementation of a pathology laboratory system were suggested for long-term improvement. Laboratory management is a complex process. Quality control is an effective method to improve productivity. Systematic checking in a quality program may not always find and/or solve the problems. External observation may reveal crucial indicators of system failures, providing very simple solutions.

  7. Management of thyroid cytological material, pre-analytical procedures and bio-banking.

    PubMed

    Bode-Lesniewska, Beata; Cochand-Priollet, Beatrix; Straccia, Patrizia; Fadda, Guido; Bongiovanni, Massimo

    2018-06-09

    Thyroid nodules are common and increasingly detected due to recent advances in imaging techniques. However, clinically relevant thyroid cancer is rare and the mortality from aggressive thyroid cancer remains constant. FNAC (Fine Needle Aspiration Cytology) is a standard method for diagnosing thyroid malignancy and the discrimination of malignant nodules from goiter. As the examined nodules on thyroid FNAC are often small incidental findings, it is important to maintain a low rate of undetermined diagnoses requiring further clinical work up or surgery. The most important factors determining the accuracy of the cytological diagnosis and suitability for biobanking of thyroid FNACs are the quality of the sample and availability of adequate tissue for auxiliary studies. This article analyses technical aspects (pre-analytics) of performing thyroid FNACs, including image guidance and rapid on-site evaluation (ROSE), sample collection methods (conventional slides, liquid based methods (LBC), cell blocks) and storage (bio-banking). The spectrum of the special studies (immunocytochemistry on direct slides or LBC, immunohistochemistry on cell blocks and molecular methods) required for improving the precision of the cytological diagnosis of the thyroid nodules is discussed. This article is protected by copyright. All rights reserved.

  8. Danish evidence-based clinical guideline for use of nutritional support in pulmonary rehabilitation of undernourished patients with stable COPD.

    PubMed

    Beck, Anne Marie; Iepsen, Ulrik Winning; Tobberup, Randi; Jørgensen, Karsten Juhl

    2015-02-01

    Disease-related under-nutrition is a common problem in individuals with COPD. The rationale for nutritional support in pulmonary rehabilitation therefore seems obvious. However, there is limited evidence regarding patient-relevant outcomes, i.e. activities of daily living (ADL) or quality of life. Therefore the topic was included in The Danish Health and Medicines Authority's development of an evidence-based clinical guideline for rehabilitation of patients with stable COPD. The methods were specified by The Danish Health and Medicines Authority as part of a standardized approach to evidence-based national clinical practice guidelines. They included formulation of a PICO with pre-defined criteria for the Population, Intervention, Control and Outcomes. Existing guidelines or systematic reviews were used after assessment using the AGREE II tool or AMSTAR, if possible. We identified primary studies by means of a systematic literature search (July to December 2013), and any identified studies were then quality assessed using the Cochrane risk of bias tool and the GRADE approach. The extracted data on our pre-defined outcomes were summarized in meta-analyses when possible, or meta-analyses from existing guidelines or systematic reviews were adapted. The results were used for labeling and wording of the recommendations. Data from 12 randomized controlled trials were included in a systematic review, which formed the basis for our recommendations as no new primary studies had been published. There was evidence of moderate quality that nutritional support for undernourished patients with COPD led to a weight gain of 1.7 kg (95% confidence interval: 1.3 to 2.2 kg), but the effect was quantified as a mean change from baseline, which is less reliable. There was evidence of moderate quality that nutritional therapy does not increase the 6-minute walking distance (mean difference 13 m; 95% confidence interval: -27 to 54 m) when results in the intervention and control groups were compared at 9-16 weeks of follow-up. There was evidence of very low quality for an increase in lean body mass. The studies did not demonstrate an effect on either quality of life or ADL in patients with COPD. Some pre-defined outcomes (adverse events, hospital admissions and mortality) were not quantified. The evidence base for nutritional supplementation in rehabilitation of COPD patients is weak, and any effect was limited to surrogate markers, such as increased weight and lean body mass, while an effect could not be seen on patient-relevant outcomes such as quality of life or activities of daily living. The intervention was given a weak recommendation. Copyright © 2014.

  9. [Biochemical markers of bone remodeling: pre-analytical variations and guidelines for their use. SFBC (Société Française de Biologie Clinique) Work Group. Biochemical markers of bone remodeling].

    PubMed

    Garnero, P; Bianchi, F; Carlier, M C; Genty, V; Jacob, N; Kamel, S; Kindermans, C; Plouvier, E; Pressac, M; Souberbielle, J C

    2000-01-01

    Biochemical markers of bone turnover have been developed over the past 20 years that are more specific for bone tissue than conventional ones such as total alkaline phosphatase and urinary hydroxyproline. They have been widely used in clinical research and in clinical trials of new therapies as secondary end points of treatment efficacy. Most of the interest has been devoted to their use in postmenopausal osteoporosis, a condition characterized by subtle modifications of bone metabolism that cannot be detected readily by conventional markers of bone turnover. Although several recent studies have suggested that biochemical markers may be used for the management of the individual patient in routine clinical practice, this has not been clearly defined and is a matter of debate. Because of the crucial importance of clarifying this issue, the Société Française de Biologie Clinique convened an expert committee to summarize the available data and to make recommendations. The following paper includes a review of the biochemical and analytical aspects of the markers of bone formation and resorption and of the sources of variability such as sex, age, menstrual cycle, pregnancy and lactation, physical activity, seasonal variation, and effects of diseases and treatments. We will also describe the effects of pre-analytical factors on the measurements of the different markers. Finally, based on that review, we will make practical recommendations for the use of these markers in order to minimize the variability of the measurements and improve the clinical interpretation of the data.

  10. Experimental testing and modeling analysis of solute mixing at water distribution pipe junctions.

    PubMed

    Shao, Yu; Jeffrey Yang, Y; Jiang, Lijie; Yu, Tingchao; Shen, Cheng

    2014-06-01

    Flow dynamics at a pipe junction controls particle trajectories, solute mixing and concentrations in downstream pipes. The effect can lead to different outcomes of water quality modeling and, hence, drinking water management in a distribution network. Here we have investigated solute mixing behavior in pipe junctions of five hydraulic types, for which flow distribution factors and analytical equations for network modeling are proposed. First, based on experiments, the degree of mixing at a cross is found to be a function of flow momentum ratio that defines a junction flow distribution pattern and the degree of departure from complete mixing. Corresponding analytical solutions are also validated using computational-fluid-dynamics (CFD) simulations. Second, the analytical mixing model is further extended to double-Tee junctions. Correspondingly the flow distribution factor is modified to account for hydraulic departure from a cross configuration. For a double-Tee(A) junction, CFD simulations show that the solute mixing depends on flow momentum ratio and connection pipe length, whereas the mixing at double-Tee(B) is well represented by two independent single-Tee junctions with a potential water stagnation zone in between. Notably, double-Tee junctions differ significantly from a cross in solute mixing and transport. However, it is noted that these pipe connections are widely, but incorrectly, simplified as cross junctions of assumed complete solute mixing in network skeletonization and water quality modeling. For the studied pipe junction types, analytical solutions are proposed to characterize the incomplete mixing and hence may allow better water quality simulation in a distribution network. Published by Elsevier Ltd.

  11. An interrupted time series analysis showed suboptimal improvement in reporting quality of trial abstract.

    PubMed

    Chhapola, Viswas; Tiwari, Soumya; Brar, Rekha; Kanwal, Sandeep Kumar

    2016-03-01

    To assess and compare the immediate and long-term change in reporting quality of randomized controlled trial (RCT) abstracts published in Pediatrics, The Journal of Pediatrics, and JAMA Pediatrics before and after the publication of the Consolidated Standards of Reporting Trials (CONSORT) for Abstracts statement. The study had an interrupted time-series design. Eligible RCT abstracts were retrieved by PubMed search in two study periods, from January 2003 to December 2007 (pre-CONSORT) and January 2010 to December 2014 (post-CONSORT). These abstracts were matched against the CONSORT checklist for abstracts. The primary outcome measure was the CONSORT-abstract score, defined as the number of CONSORT items correctly reported divided by 18 and expressed as a percentage. The mean percentage scores were used to compare reporting quality between pre- and post-CONSORT using segmented linear regression. A total of 424 RCT abstracts in pre-CONSORT and 467 in post-CONSORT were analyzed. A significant change in the slope of the regression line between the two time periods (0.151 [confidence interval (CI), 0.004 to 0.298], P = 0.044) was observed. Intercepts did not show a significant difference (-2.39 [CI, -4.93 to 0.157], P = 0.065). The overall reporting quality of RCT abstracts in the high-impact pediatrics journals was suboptimal (<50%); however, it improved when assessed over a 5-year period, implying slow but gradual adoption of the guideline. Copyright © 2016 Elsevier Inc. All rights reserved.
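    The segmented regression behind the reported slope and intercept comparisons can be sketched on simulated data. The monthly scores below are invented; the study fitted the model to real CONSORT-abstract scores from the three journals.

```python
# Sketch of a segmented (interrupted time-series) linear regression:
#   score_t = b0 + b1*time + b2*post + b3*(time since interruption) + error,
# where b2 captures the immediate level change and b3 the slope change.
import numpy as np

rng = np.random.default_rng(0)
n_pre, n_post = 24, 24
t = np.arange(n_pre + n_post, dtype=float)
post = (t >= n_pre).astype(float)                # level-change indicator
t_since = np.where(post == 1.0, t - n_pre, 0.0)  # slope-change regressor

# simulate: flat pre-period near 40%, then a slope of +0.5 points per period
y = 40.0 + 0.0 * t + 0.0 * post + 0.5 * t_since + rng.normal(0.0, 1.0, t.size)

X = np.column_stack([np.ones_like(t), t, post, t_since])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b_trend, b_level, b_slope_change = beta
```

    A clear `b_slope_change` with a negligible `b_level`, as in this simulation, corresponds to the paper's finding of gradual rather than immediate improvement after the guideline.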

  12. Recent advances in immunosensor for narcotic drug detection

    PubMed Central

    Gandhi, Sonu; Suman, Pankaj; Kumar, Ashok; Sharma, Prince; Capalash, Neena; Suri, C. Raman

    2015-01-01

    Introduction: Immunosensors for illicit drugs have gained immense interest and have found several applications in drug abuse monitoring. This technology offers low-cost detection of narcotics, thereby providing a confirmatory platform to complement existing analytical methods. Methods: In this minireview, we define the basic concept of the transducer for immunosensor development, utilizing antibodies and low molecular mass hapten (opiate) molecules. Results: This article emphasizes recent advances in immunoanalytical techniques for monitoring of opiate drugs. Our results demonstrate that high quality antibodies can be used for immunosensor development against a target analyte with greater sensitivity, specificity and precision than other available analytical methods. Conclusion: In this review we highlight the fundamentals of the different transducer technologies and their applications for immunosensor development currently being pursued in our laboratory: rapid screening via an immunochromatographic kit, and label-free optical detection via enzyme-, fluorescence-, gold nanoparticle- and carbon nanotube-based immunosensing for sensitive and specific monitoring of opiates. PMID:26929925

  13. Risk and utility in portfolio optimization

    NASA Astrophysics Data System (ADS)

    Cohen, Morrel H.; Natoli, Vincent D.

    2003-06-01

    Modern portfolio theory (MPT) addresses the problem of determining the optimum allocation of investment resources among a set of candidate assets. In the original mean-variance approach of Markowitz, volatility is taken as a proxy for risk, conflating uncertainty with risk. There have been many subsequent attempts to alleviate that weakness which, typically, combine utility and risk. We present here a modification of MPT based on the inclusion of separate risk and utility criteria. We define risk as the probability of failure to meet a pre-established investment goal. We define utility as the expectation of a utility function with positive and decreasing marginal value as a function of yield. The emphasis throughout is on long investment horizons for which risk-free assets do not exist. Analytic results are presented for a Gaussian probability distribution. Risk-utility relations are explored via empirical stock-price data, and an illustrative portfolio is optimized using the empirical data.
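    The paper's risk definition has a closed form in the Gaussian case it treats analytically: risk is the probability that terminal wealth misses a pre-set goal. The sketch below assumes log-wealth growth is Normal(mu*T, sigma^2*T); all parameter values are illustrative, not taken from the paper.

```python
# Shortfall probability under a Gaussian log-wealth assumption:
# risk = P(W_T < goal) when log(W_T / W_0) ~ N(mu*T, sigma^2 * T).
from math import log, sqrt
from scipy.stats import norm

def shortfall_risk(w0, goal, mu, sigma, horizon):
    """Probability of failing to reach `goal` from `w0` over `horizon` years."""
    z = (log(goal / w0) - mu * horizon) / (sigma * sqrt(horizon))
    return norm.cdf(z)

# doubling goal over a 20-year horizon, 6% drift, 18% volatility (invented)
risk = shortfall_risk(w0=100.0, goal=200.0, mu=0.06, sigma=0.18, horizon=20.0)
```

    With these parameters the risk falls as the horizon lengthens, since the drift term grows linearly while dispersion grows only as the square root of time; this is consistent with the paper's emphasis on long horizons without risk-free assets.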

  14. Modern data science for analytical chemical data - A comprehensive review.

    PubMed

    Szymańska, Ewa

    2018-10-22

    Efficient and reliable analysis of chemical analytical data is a great challenge due to the increase in data size, variety and velocity. New methodologies, approaches and methods are being proposed not only by chemometrics but also by other data science communities to extract relevant information from big datasets and provide their value to different applications. Besides the common goal of big data analysis, different perspectives on, and terms for, big data are being discussed in the scientific literature and public media. The aim of this comprehensive review is to present common trends in the analysis of chemical analytical data across different data science fields, together with their data type-specific and generic challenges. Firstly, common data science terms used in different data science fields are summarized and discussed. Secondly, systematic methodologies to plan and run big data analysis projects are presented together with their steps. Moreover, different analysis aspects like assessing data quality, selecting data pre-processing strategies, data visualization and model validation are considered in more detail. Finally, an overview of standard and new data analysis methods is provided and their suitability for big analytical chemical datasets briefly discussed. Copyright © 2018 Elsevier B.V. All rights reserved.

  15. A guide for measurement of circulating metabolic hormones in rodents: Pitfalls during the pre-analytical phase

    PubMed Central

    Bielohuby, Maximilian; Popp, Sarah; Bidlingmaier, Martin

    2012-01-01

    Researchers analyse hormones to draw conclusions from changes in hormone concentrations observed under specific physiological conditions and to elucidate mechanisms underlying their biological variability. It is, however, frequently overlooked that also circumstances occurring after collection of biological samples can significantly affect the hormone concentrations measured, owing to analytical and pre-analytical variability. Whereas the awareness for such potential confounders is increasing in human laboratory medicine, there is sometimes limited consensus about the control of these factors in rodent studies. In this guide, we demonstrate how such factors can affect reliability and consequent interpretation of the data from immunoassay measurements of circulating metabolic hormones in rodent studies. We also compare the knowledge about such factors in rodent studies to recent recommendations established for biomarker studies in humans and give specific practical recommendations for the control of pre-analytical conditions in metabolic studies in rodents. PMID:24024118

  16. Critical and systematic evaluation of data for estimating human exposures to 2,4-dichlorophenoxyacetic acid (2,4-D) - quality and generalizability.

    PubMed

    LaKind, Judy S; Burns, Carol J; Naiman, Daniel Q; O'Mahony, Cian; Vilone, Giulia; Burns, Annette J; Naiman, Joshua S

    2017-01-01

    The herbicide 2,4-dichlorophenoxyacetic acid (2,4-D) has been commercially available since the 1940s. Despite decades of data on 2,4-D in food, air, soil, and water, as well as in humans, the quality of these data has not been comprehensively evaluated. Using selected elements of the Biomonitoring, Environmental Epidemiology, and Short-lived Chemicals (BEES-C) instrument (temporal variability, avoidance of sample contamination, analyte stability, and urinary methods of matrix adjustment), the quality of 156 publications of environmental- and biomonitoring-based 2,4-D data was examined. Few publications documented that steps were taken to avoid sample contamination. Similarly, most studies did not demonstrate the stability of the analyte from sample collection to analysis. Less than half of the biomonitoring publications reported both creatinine-adjusted and unadjusted urine concentrations. The scope and detail of data needed to assess temporal variability and sources of 2,4-D varied widely across the reviewed studies. Exposures to short-lived chemicals such as 2,4-D are affected by numerous and changing external factors, including application practices and formulations. At a minimum, greater transparency in reporting of quality control measures is needed. Perhaps the greatest challenge for the exposure community is reaching consensus on how to address problems specific to short-lived chemical exposures in observational epidemiology investigations. More extensive conversations are needed to advance our understanding of human exposures and to enable interpretation of these data to catch up with analytical capabilities. The problems defined in this review remain exquisitely difficult to address for chemicals like 2,4-D, with short and variable environmental and physiological half-lives and with exposures affected by numerous and changing external factors.

  17. Use of CTX-I and PINP as bone turnover markers: National Bone Health Alliance recommendations to standardize sample handling and patient preparation to reduce pre-analytical variability.

    PubMed

    Szulc, P; Naylor, K; Hoyle, N R; Eastell, R; Leary, E T

    2017-09-01

    The National Bone Health Alliance (NBHA) recommends standardized sample handling and patient preparation for C-terminal telopeptide of type I collagen (CTX-I) and N-terminal propeptide of type I procollagen (PINP) measurements to reduce pre-analytical variability. Controllable and uncontrollable patient-related factors are reviewed to facilitate interpretation and minimize pre-analytical variability. The International Osteoporosis Foundation (IOF) and the International Federation of Clinical Chemistry (IFCC) Bone Marker Standards Working Group have identified PINP and CTX-I in blood as the reference markers of bone turnover for fracture risk prediction and for monitoring osteoporosis treatment. Although used in clinical research for many years, bone turnover markers (BTM) have not been widely adopted in clinical practice, primarily because of their poor within-subject and between-laboratory reproducibility. The NBHA Bone Turnover Marker Project team aims to reduce the pre-analytical variability of CTX-I and PINP measurements through standardized sample handling and patient preparation. Recommendations for sample handling and patient preparation were made based on a review of the available publications and on pragmatic considerations. Controllable and uncontrollable patient-related factors were reviewed to facilitate interpretation and sample collection. Samples for CTX-I must be collected consistently in the morning hours in the fasted state. EDTA plasma is preferred for CTX-I because of its greater sample stability. Sample collection conditions for PINP are less critical, as PINP shows minimal circadian variability and is not affected by food intake. Sample stability limits should be observed. Uncontrollable factors (age, sex, pregnancy, immobility, recent fracture, co-morbidities, anti-osteoporotic drugs, other medications) should be considered in BTM interpretation.
Adopting standardized sample handling and patient preparation procedures will significantly reduce controllable pre-analytical variability. The successful adoption of these recommendations requires close collaboration of the various stakeholders on the global stage, including laboratories, the medical community, reagent manufacturers and regulatory agencies.

  18. Shared decision-making – transferring research into practice: the Analytic Hierarchy Process (AHP)

    PubMed Central

    Dolan, James G.

    2008-01-01

    Objective To illustrate how the Analytic Hierarchy Process (AHP) can be used to promote shared decision-making and enhance clinician-patient communication. Methods Tutorial review. Results The AHP promotes shared decision making by creating a framework that is used to define the decision, summarize the information available, prioritize information needs, elicit preferences and values, and foster meaningful communication among decision stakeholders. Conclusions The AHP and related multi-criteria methods have the potential for improving the quality of clinical decisions and overcoming current barriers to implementing shared decision making in busy clinical settings. Further research is needed to determine the best way to implement these tools and to determine their effectiveness. Practice Implications Many clinical decisions involve preference-based trade-offs between competing risks and benefits. The AHP is a well-developed method that provides a practical approach for improving patient-provider communication, clinical decision-making, and the quality of patient care in these situations. PMID:18760559
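    The prioritization step at the heart of the AHP can be sketched in a few lines. The three-criterion pairwise comparison matrix below is an invented example, and the row geometric-mean approximation of the principal eigenvector is one common way to derive priority weights:

```python
import numpy as np

# Hypothetical reciprocal pairwise comparison matrix on Saaty's 1-9
# scale: criterion A vs B = 3, A vs C = 5, B vs C = 2 (invented values).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Approximate the principal eigenvector by row geometric means,
# then normalize so the priority weights sum to 1.
gm = A.prod(axis=1) ** (1.0 / A.shape[0])
weights = gm / gm.sum()

# Consistency check: estimate lambda_max from A @ w, form the
# consistency index, and divide by the random index for n=3 (0.58).
lam = (A @ weights / weights).mean()
ci = (lam - A.shape[0]) / (A.shape[0] - 1)
cr = ci / 0.58

print(weights.round(3))  # priorities for criteria A, B, C
print(round(cr, 3))      # CR < 0.1 is conventionally acceptable
```

In a shared decision-making setting, each patient's own comparison matrix would yield personal weights over the decision criteria, which are then combined with how well each option scores on those criteria.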

  19. Report for Batch Leach Analyses on Sediments at 100-HR-3 Operable Unit, Boreholes C7620, C7621, C7622, C7623, C7626, C7627, C7628, C7629, C7630, and C7866. Revision 1.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lindberg, Michael J.

    2012-04-25

    This is a revision to a previously released report. This revision contains additional analytical results for the sample with HEIS number B2H4X7. Between November 4, 2010 and October 26, 2011, sediment samples were received from the 100-HR-3 Operable Unit for geochemical studies. The analyses for this project were performed at the 331 Building located in the 300 Area of the Hanford Site, according to Pacific Northwest National Laboratory (PNNL) approved procedures and/or nationally recognized test procedures. The data sets include the sample identification numbers, analytical results, estimated quantification limits (EQL), and quality control data. The preparatory and analytical quality control requirements, calibration requirements, acceptance criteria, and failure actions are defined in the on-line QA plan 'Conducting Analytical Work in Support of Regulatory Programs' (CAW). This QA plan implements the Hanford Analytical Services Quality Assurance Requirements Documents (HASQARD) for PNNL. Samples were received with a chain of custody (COC) and were analyzed according to the sample identification numbers supplied by the client. All samples were refrigerated upon receipt until prepared for analysis. All samples were received with custody seals intact unless noted in the Case Narrative. Holding time is defined as the time from sample preparation to the time of analysis. The prescribed holding times were met for all analytes unless noted in the Case Narrative. All reported analytical results meet the requirements of the CAW or the client-specified SOW unless noted in the Case Narrative. Due to the requirements of the statement of work and sampling events in the field, the 28-day and 48-hour requirements could not be met: the statement of work requires samples to be selected at the completion of the borehole, and it is not always possible to complete a borehole and have the samples shipped to the laboratory within the hold-time requirements.
The duplicate RPD for Uranium-238 (38.9%) was above the acceptance limit (35) in 1E05003-DUP1 for ICPMS-Tc-U-WE; the sample result is less than 10 times the detection limit, so duplicate recoveries are not applicable to this analyte. The duplicate RPD for Silver-107 (68.2%) was above the acceptance limit (35) in 2C06004-DUP1 for ICPMS-RCRA-AE; again, the sample result is less than 10 times the detection limit, so duplicate recoveries are not applicable. The matrix spike recovery for hexavalent chromium (48.8%) was outside acceptance limits (75-125) in 1E23001-MS1 for Hexavalent Chromium/Soil, likely due to matrix interference; sample results associated with this batch are below the EQL, so there should be no impact on the data as reported. The matrix spike recovery for hexavalent chromium (50.2%) was outside acceptance limits (75-125) in 2B22010-MS1 for Hexavalent Chromium/Soil, likely due to matrix interference; sample results associated with this batch are below the EQL, so there should be no impact on the data as reported.
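    The duplicate RPD checks cited above follow the standard relative-percent-difference formula; the values below are illustrative, not taken from the report:

```python
def rpd(a, b):
    """Relative percent difference between a sample result and its
    laboratory duplicate: |a - b| over their mean, times 100."""
    return abs(a - b) / ((a + b) / 2.0) * 100.0

# Illustrative duplicate pair checked against a 35% acceptance limit
# (values and limit chosen for the example, not from the report).
result, duplicate = 1.00, 1.48
print(round(rpd(result, duplicate), 1))        # 38.7
print(rpd(result, duplicate) > 35.0)           # True -> flag the batch
```

As the report notes, when the sample result is below 10 times the detection limit, the RPD criterion is not applied, since small absolute differences inflate the percentage.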

  20. Point-of-care test (POCT) INR: hope or illusion?

    PubMed

    Dusse, Luci Maria Sant'Ana; Oliveira, Nataly Carvalho; Rios, Danyelle Romana Alves; Marcolino, Milena Soriano

    2012-01-01

    In the last decade, point-of-care tests have been developed to provide rapid test results, and their applications are increasingly broad. In the area of hemostasis, the international normalized ratio point-of-care test (POCT INR) is the main test of this new proposal. It has great potential benefit in situations where a rapid INR result influences clinical decision making, as in acute ischemic stroke, before surgical procedures and during cardiac surgery. The POCT INR also has the potential to be used for self-monitoring of oral anticoagulation in patients under anticoagulant therapy. However, the precision and accuracy of the POCT INR still need to be enhanced to increase the effectiveness and efficiency of the test. Additionally, RDC/ANVISA Number 302 makes clear that POCT testing must be supervised by the technical manager of the clinical laboratory in the pre-analytical, analytical and post-analytical phases. In practice, the clinical laboratory participates neither in the implementation of POCT testing nor in the release of results. Clinicians have high expectations for the incorporation of the POCT INR into clinical practice, despite the limitations of the method. These professionals are willing to train patients to perform the test, but they are not legally responsible for its quality and are not prepared to maintain the equipment. Responsibility for the test must therefore be clearly defined to ensure quality control.

  1. Metabolomic analysis of urine samples by UHPLC-QTOF-MS: Impact of normalization strategies.

    PubMed

    Gagnebin, Yoric; Tonoli, David; Lescuyer, Pierre; Ponte, Belen; de Seigneux, Sophie; Martin, Pierre-Yves; Schappler, Julie; Boccard, Julien; Rudaz, Serge

    2017-02-22

    Among the various biological matrices used in metabolomics, urine is a biofluid of major interest because of its non-invasive collection and its availability in large quantities. However, significant sources of variability in urine metabolomics based on UHPLC-MS are related to the analytical drift and variation of the sample concentration, thus requiring normalization. A sequential normalization strategy was developed to remove these detrimental effects, including: (i) pre-acquisition sample normalization by individual dilution factors to narrow the concentration range and to standardize the analytical conditions, (ii) post-acquisition data normalization by quality control-based robust LOESS signal correction (QC-RLSC) to correct for potential analytical drift, and (iii) post-acquisition data normalization by MS total useful signal (MSTUS) or probabilistic quotient normalization (PQN) to prevent the impact of concentration variability. This generic strategy was performed with urine samples from healthy individuals and was further implemented in the context of a clinical study to detect alterations in urine metabolomic profiles due to kidney failure. In the case of kidney failure, the relation between creatinine/osmolality and the sample concentration is modified, and relying only on these measurements for normalization could be highly detrimental. The sequential normalization strategy was demonstrated to significantly improve patient stratification by decreasing the unwanted variability and thus enhancing data quality. Copyright © 2016 Elsevier B.V. All rights reserved.
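    Step (iii), probabilistic quotient normalization, can be sketched as follows; the toy intensity matrix and the choice of the median spectrum as the reference are illustrative assumptions, not details from the study:

```python
import numpy as np

def pqn(X, reference=None):
    """Probabilistic quotient normalization of a samples-by-features
    intensity matrix: each sample is divided by the median ratio of
    its features to a reference spectrum (here, the median spectrum
    across samples)."""
    X = np.asarray(X, dtype=float)
    if reference is None:
        reference = np.median(X, axis=0)
    quotients = X / reference                 # feature-wise ratios
    dilution = np.median(quotients, axis=1)   # per-sample dilution factor
    return X / dilution[:, None]

# Toy example: sample 2 is sample 1 measured at twice the concentration;
# PQN removes the two-fold dilution difference between them.
X = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])
print(pqn(X))  # both rows become identical after normalization
```

The point made in the abstract is that this kind of quotient-based factor, unlike creatinine or osmolality, does not assume a fixed relation between a single analyte and overall sample concentration, which matters when kidney failure alters that relation.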

  2. Chemical Analysis Results for Potable Water from ISS Expeditions 21 to 25

    NASA Technical Reports Server (NTRS)

    Straub, John E., II; Plumlee, Debrah K.; Schultz, John R.; McCoy, J. Torin

    2010-01-01

    The Johnson Space Center Water and Food Analytical Laboratory (WAFAL) performed detailed ground-based analyses of archival water samples for verification of the chemical quality of the International Space Station (ISS) potable water supplies for Expeditions 21 to 25. Over a 14-month period, the Space Shuttle visited the ISS on five occasions to complete construction and deliver supplies. The onboard supplies of potable water available for consumption by the Expeditions 21 to 25 crews consisted of Russian ground-supplied potable water, Russian potable water regenerated from humidity condensate, and US potable water recovered from urine distillate and condensate. Chemical archival water samples that were collected with U.S. hardware during Expeditions 21 to 25 were returned on Shuttle flights STS-129 (ULF3), STS-130 (20A), STS-131 (19A), STS-132 (ULF4) and STS-133 (ULF5), as well as on Soyuz flights 19-22. This paper reports the analytical results for the returned archival water samples and evaluates their compliance with ISS water quality standards. The WAFAL also received and analyzed aliquots of some Russian potable water samples collected in-flight and pre-flight samples of Rodnik potable water delivered to the Station on the Russian Progress vehicle during Expeditions 21 to 25. These additional analytical results are also reported and discussed in this paper.

  3. Fermentanomics: Relating quality attributes of a monoclonal antibody to cell culture process variables and raw materials using multivariate data analysis.

    PubMed

    Rathore, Anurag S; Kumar Singh, Sumit; Pathak, Mili; Read, Erik K; Brorson, Kurt A; Agarabi, Cyrus D; Khan, Mansoor

    2015-01-01

    Fermentanomics is an emerging field of research and involves understanding the underlying controlled process variables and their effect on process yield and product quality. Although major advancements have occurred in process analytics over the past two decades, accurate real-time measurement of significant quality attributes for a biotech product during production culture is still not feasible. Researchers have used an amalgam of process models and analytical measurements for monitoring and process control during production. This article focuses on using multivariate data analysis as a tool for monitoring the internal bioreactor dynamics, the metabolic state of the cell, and interactions among them during culture. Quality attributes of the monoclonal antibody product that were monitored include glycosylation profile of the final product along with process attributes, such as viable cell density and level of antibody expression. These were related to process variables, raw materials components of the chemically defined hybridoma media, concentration of metabolites formed during the course of the culture, aeration-related parameters, and supplemented raw materials such as glucose, methionine, threonine, tryptophan, and tyrosine. This article demonstrates the utility of multivariate data analysis for correlating the product quality attributes (especially glycosylation) to process variables and raw materials (especially amino acid supplements in cell culture media). The proposed approach can be applied for process optimization to increase product expression, improve consistency of product quality, and target the desired quality attribute profile. © 2015 American Institute of Chemical Engineers.
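    As a minimal illustration of relating a quality attribute to process variables: the sketch below fits a least-squares model on invented data (the study applies richer multivariate methods such as projection-based models; variable names and numbers here are assumptions for the example):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy process data: 20 culture runs x 3 process variables (think
# glucose feed, amino acid supplement, viable cell density -- the
# names and values are invented for illustration).
X = rng.normal(size=(20, 3))
true_coef = np.array([0.8, -0.5, 0.2])
# Simulated quality attribute (e.g. a glycoform fraction) driven by
# the process variables plus measurement noise.
y = X @ true_coef + rng.normal(scale=0.05, size=20)

# Ordinary least squares with an intercept column; the fitted
# coefficients recover each variable's influence on the attribute.
design = np.column_stack([X, np.ones(20)])
coef, *_ = np.linalg.lstsq(design, y, rcond=None)
print(coef.round(2)[:3])  # close to the true influences
```

Multivariate approaches used in practice extend this idea to many correlated variables and attributes at once, which is why the article emphasizes multivariate data analysis rather than one-variable-at-a-time fits.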

  4. [Drinking water quality and safety].

    PubMed

    Gómez-Gutiérrez, Anna; Miralles, Maria Josepa; Corbella, Irene; García, Soledad; Navarro, Sonia; Llebaria, Xavier

    2016-11-01

    The purpose of drinking water legislation is to guarantee the quality and safety of water intended for human consumption. In the European Union, Directive 98/83/EC updated the essential and binding quality criteria and standards, incorporated into Spanish national legislation by Royal Decree 140/2003. This article reviews the main characteristics of the aforementioned drinking water legislation and its impact on the improvement of water quality, in light of empirical data from Catalonia. Analytical data reported in the Spanish national information system (SINAC) indicate that water quality in Catalonia has improved in recent years (from 88% of analytical reports finding drinking water suitable for human consumption in 2004 to 95% in 2014). The improvement is fundamentally attributed to parameters concerning the organoleptic characteristics of water and parameters related to the monitoring of the drinking water treatment process. Two management experiences concerning compliance with quality standards for trihalomethanes and lead in Barcelona's water supply are also discussed. Finally, this paper presents some challenges that, in the opinion of the authors, still need to be incorporated into drinking water legislation. It is necessary to update Annex I of Directive 98/83/EC to integrate current scientific knowledge, as well as to improve consumer access to water quality data. Furthermore, a need to define common criteria for some unresolved topics, such as products and materials in contact with drinking water and domestic conditioning equipment, has also been identified. Copyright © 2016 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.

  5. Pre-analytical and analytical factors influencing Alzheimer's disease cerebrospinal fluid biomarker variability.

    PubMed

    Fourier, Anthony; Portelius, Erik; Zetterberg, Henrik; Blennow, Kaj; Quadrio, Isabelle; Perret-Liaudet, Armand

    2015-09-20

    A panel of cerebrospinal fluid (CSF) biomarkers including total Tau (t-Tau), phosphorylated Tau protein at residue 181 (p-Tau) and β-amyloid peptides (Aβ42 and Aβ40), is frequently used as an aid in Alzheimer's disease (AD) diagnosis for young patients with cognitive impairment, for predicting prodromal AD in mild cognitive impairment (MCI) subjects, for AD discrimination in atypical clinical phenotypes and for inclusion/exclusion and stratification of patients in clinical trials. Due to variability in absolute levels between laboratories, there is no consensus on medical cut-off value for the CSF AD signature. Thus, for full implementation of this core AD biomarker panel in clinical routine, this issue has to be solved. Variability can be explained both by pre-analytical and analytical factors. For example, the plastic tubes used for CSF collection and storage, the lack of reference material and the variability of the analytical protocols were identified as important sources of variability. The aim of this review is to highlight these pre-analytical and analytical factors and describe efforts done to counteract them in order to establish cut-off values for core CSF AD biomarkers. This review will give the current state of recommendations. Copyright © 2015. Published by Elsevier B.V.

  6. The Relationship Between Magnet Designation, Electronic Health Record Adoption, and Medicare Meaningful Use Payments.

    PubMed

    Lippincott, Christine; Foronda, Cynthia; Zdanowicz, Martin; McCabe, Brian E; Ambrosia, Todd

    2017-08-01

    The objective of this study was to examine the relationship between nursing excellence and electronic health record adoption. Of 6582 US hospitals, 4939 were eligible for the Medicare Electronic Health Record Incentive Program, and 6419 were eligible for evaluation on the HIMSS Analytics Electronic Medical Record Adoption Model. Of 399 Magnet hospitals, 330 were eligible for the Medicare Electronic Health Record Incentive Program, and 393 were eligible for evaluation on the HIMSS Analytics Electronic Medical Record Adoption Model. Meaningful Use attestation was defined as receipt of a Medicare Electronic Health Record Incentive Program payment. Electronic health record adoption was defined as Level 6 and/or 7 on the HIMSS Analytics Electronic Medical Record Adoption Model. Logistic regression showed that Magnet-designated hospitals were more likely to attest to Meaningful Use than non-Magnet hospitals (odds ratio = 3.58, P < .001) and were more likely to adopt electronic health records than non-Magnet hospitals (Level 6 only: odds ratio = 3.68, P < .001; Level 6 or 7: odds ratio = 4.02, P < .001). This study suggests a positive relationship between Magnet status and electronic health record use, which involves earning financial incentives for successful adoption. Continued investigation is needed to examine the relationships between the quality of nursing care, electronic health record usage, financial implications, and patient outcomes.
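    The odds ratios reported above compare the odds of an outcome between two groups. As a reminder of the arithmetic, with invented 2x2 counts (not the study's data):

```python
# Hypothetical 2x2 table: attestation counts for Magnet vs other
# hospitals. All numbers are invented for illustration only.
magnet_attested, magnet_not = 280, 50
other_attested, other_not = 3000, 1600

odds_magnet = magnet_attested / magnet_not   # odds of attesting, Magnet
odds_other = other_attested / other_not      # odds of attesting, others

print(round(odds_magnet / odds_other, 2))    # odds ratio -> 2.99
```

Logistic regression, as used in the study, produces the same kind of quantity while adjusting for other hospital characteristics.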

  7. [Flavouring estimation of quality of grape wines with use of methods of mathematical statistics].

    PubMed

    Yakuba, Yu F; Khalaphyan, A A; Temerdashev, Z A; Bessonov, V V; Malinkin, A D

    2016-01-01

    Approaches to forming an integral estimate of wine flavour during tasting are discussed, along with their advantages and disadvantages. The materials investigated were natural white and red wines from Russian manufacturers, made by traditional technologies from Vitis vinifera grapes, direct hybrids, blends and experimental wines (more than 300 different samples). The aim of the research was to establish, by methods of mathematical statistics, the correlation between the content of non-volatile wine components and tasting quality ratings. The content of organic acids, amino acids and cations in the wines was considered as the main set of factors influencing flavour, since these largely define beverage quality. These components were determined in the wine samples by the electrophoretic method («CAPEL»). In parallel with the analytical quality control of the samples, a representative group of specialists carried out a tasting evaluation of the wines using a 100-point system. The possibility of statistically modelling the correlation between tasting scores and the analytical data on amino acids and cations, which reasonably describe the wine's flavour, was examined. Statistical modelling of the correlation between tasting scores and the content of major cations (ammonium, potassium, sodium, magnesium, calcium) and free amino acids (proline, threonine, arginine), taking into account their level of influence on flavour and their analytical values within fixed quality-conformity limits, was carried out with Statistica. Adequate statistical models were constructed that can predict the tasting score, and thus determine wine quality, from the content of the components forming the flavour properties.
It is emphasized that, along with aromatic (volatile) substances, non-volatile components, namely mineral substances and amino acids such as proline, threonine and arginine, influence the wine's flavour properties. It is shown that these non-volatile components contribute to the organoleptic and flavour quality evaluation of wines just as the aromatic volatile substances do, and that they play a part in forming the experts' evaluation.

  8. Impact of pre-imputation SNP-filtering on genotype imputation results

    PubMed Central

    2014-01-01

    Background Imputation of partially missing or unobserved genotypes is an indispensable tool for SNP data analyses. However, research on and understanding of the impact of initial SNP-data quality control on imputation results are still limited. In this paper, we aim to evaluate the effect of different strategies of pre-imputation quality filtering on the performance of the widely used imputation algorithms MaCH and IMPUTE. Results We considered three scenarios: imputation of partially missing genotypes with the use of an external reference panel, without the use of an external reference panel, and imputation of completely un-typed SNPs using an external reference panel. We first created various datasets by applying different SNP quality filters and masking certain percentages of randomly selected high-quality SNPs. We imputed these SNPs and compared the results between the different filtering scenarios using established and newly proposed measures of imputation quality. While the established measures assess the certainty of imputation results, our newly proposed measures focus on agreement with the true genotypes. These measures showed that pre-imputation SNP-filtering can be detrimental to imputation quality. Moreover, the strongest drivers of imputation quality were in general the burden of missingness and the number of SNPs used for imputation. We also found that using a reference panel always improves the imputation quality of partially missing genotypes. MaCH performed slightly better than IMPUTE2 in most of our scenarios. Again, these results were more pronounced when using our newly defined measures of imputation quality. Conclusion Even moderate filtering has a detrimental effect on imputation quality. Therefore little or no SNP filtering prior to imputation appears to be the best strategy for imputing small to moderately sized datasets.
Our results also showed that for these datasets, MaCH performs slightly better than IMPUTE2 in most scenarios at the cost of increased computing time. PMID:25112433
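    An agreement-based quality measure of the kind the authors propose can be sketched as simple concordance on masked genotypes; this is a hypothetical sketch of the idea, not the paper's exact measures:

```python
# Sketch of an agreement-with-truth imputation quality measure:
# known genotypes are masked, imputed, and scored against the truth.
def concordance(true_genotypes, imputed_genotypes):
    """Fraction of masked genotypes (coded 0/1/2 minor-allele counts)
    that were imputed correctly."""
    pairs = list(zip(true_genotypes, imputed_genotypes))
    return sum(t == i for t, i in pairs) / len(pairs)

truth   = [0, 1, 2, 1, 0, 2, 1, 1]  # invented masked genotypes
imputed = [0, 1, 2, 2, 0, 2, 1, 0]  # invented imputation output
print(concordance(truth, imputed))  # 6 of 8 correct -> 0.75
```

Certainty-based measures, by contrast, score the imputation algorithm's own posterior probabilities and need no masked truth, which is why the two families of measures can disagree.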

  9. [Diagnostic kits in parasitology: which controls?].

    PubMed

    Rossi, P

    2004-06-01

    The development of new diagnostic tools, particularly for some parasitic "neglected diseases", is slowed or even hindered by the limited resources assigned to basic and applied research in public institutions and the private sector. Even though the timeline and costs needed to develop a new In Vitro Diagnostic (IVD) test are generally lower than for vaccines or new drugs, industry invests few resources because of the perception of limited markets. To accelerate the development of diagnostics for the world's most deadly diseases, the World Health Organization's (WHO) Special Programme for Research and Training in Tropical Diseases (TDR), the United Nations Development Programme, the World Bank and the Gates Foundation last year launched a new initiative, FIND (Foundation for Innovative New Diagnostics, www.finddiagnostics.org). The aim is to "apply the latest biotechnology innovations to develop and validate affordable diagnostic tests for diseases of the developing world". Ideally, a new diagnostic test should be accurately evaluated prior to use in medical practice. The first step would be a pre-clinical evaluation, an analytic study to determine its laboratory performance. A crucial point in this phase is the calibration of reagents (antigens, antibodies, DNA probes, etc.) against a standard reference preparation. WHO, through the WHO International Laboratories for Biological Standards, "provides International Biological Reference Preparations which serve as reference sources of defined biological activity expressed in an internationally agreed unit" (www.who.int/biologicals/IBRP/index.htm). Standardization allows "comparison of biological measurements worldwide" and ensures the reliability of diagnostic procedures. These preparations are generally intended for use in the characterization of the activity of secondary reference preparations (regional, national or in-house working standards).
Unfortunately, international reference standards for parasitic diseases are not available at present, except for Toxoplasma antibodies. The first international standard reagent for Anti-Toxoplasma Serum was established in 1968 and at present, an international standard reference serum, Anti-toxoplasma serum, human TOXM is available at the National Institute for Biological Standards and Control (NIBSC) in UK. Several collaborative, multicenter studies were carried out to assess the performance of different methods and commercial tests for the diagnosis of toxoplasmosis, by providing to participating laboratories a panel of well-defined sera to be tested. A four-phase process following well-accepted methodological standards for the development of diagnostics, analogous to those internationally accepted for drugs and vaccines was recently proposed. The pre-clinical evaluation, the analytic study to assess sensitivity, specificity, predictive values in laboratory (phase I), should be followed by a proof of principle study to distinguish diseased from healthy persons in easily accessible populations (phase II). The evaluation of test performance in populations of intended use (phase III), and finally the delineation of cost-effectiveness and societal impact of new tests in comparison with existing tools (phase IV) should complete the validation procedure. In this context, national regulatory agencies play a major role in pre-market approval and post-market surveillance of IVDs. The European Community in 1998 approved a directive (Directive 98/79/EC) which rules the marketing of IVD medical devices, in order to harmonise the performance levels and standards in European countries. 
But, among IVDs for parasitic diseases, only those for detecting congenital toxoplasmosis are subject to defined procedures for verifying products before they are placed on the market and for surveillance after marketing by a notified body, which performs appropriate examinations, tests and inspections of production facilities to verify that the device meets the requirements of the directive. In the U.S.A., the Food and Drug Administration (FDA), through the Office of In Vitro Diagnostic Device Evaluation and Safety (OIVD), provides comprehensive regulatory activity for IVDs through pre-market evaluation and post-market surveillance. In developing countries, the scarcity of resources limits the procedures through which the national control authority can assure the safety, quality and efficacy of marketed products, both imported and locally manufactured.

  10. Strategy for design NIR calibration sets based on process spectrum and model space: An innovative approach for process analytical technology.

    PubMed

    Cárdenas, V; Cordobés, M; Blanco, M; Alcalà, M

    2015-10-10

    The pharmaceutical industry is under stringent regulations on quality control of its products because quality is critical both to the production process and to consumer safety. Within the framework of process analytical technology (PAT), a complete understanding of the process and stepwise monitoring of manufacturing are required. Near infrared spectroscopy (NIRS) combined with chemometrics has lately proved efficient, useful and robust for pharmaceutical analysis. One crucial step in developing effective NIRS-based methodologies is selecting an appropriate calibration set for constructing models that afford accurate predictions. In this work, we developed calibration models for a pharmaceutical formulation during its three manufacturing stages: blending, compaction and coating. A novel methodology is proposed for selecting the calibration set, the "process spectrum", into which physical changes in the samples at each stage are algebraically incorporated. We also established a "model space", defined by Hotelling's T(2) and Q-residual statistics, for outlier identification (inside/outside the defined space) in order to select objectively the factors used in constructing the calibration set. The results obtained confirm the efficacy of the proposed methodology for stepwise pharmaceutical quality control, and the relevance of the study as a guideline for implementing this easy and fast methodology in the pharmaceutical industry. Copyright © 2015 Elsevier B.V. All rights reserved.
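    The "model space" bounded by Hotelling's T(2) and Q-residual statistics can be sketched with a plain PCA model; the random matrix below stands in for real calibration spectra, and the component count is an arbitrary choice for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in calibration set: 50 "spectra" x 10 "wavelengths"
# (random numbers in place of real NIR data).
X = rng.normal(size=(50, 10))
X_mean = X.mean(axis=0)
Xc = X - X_mean

# PCA via SVD; the first k loadings define the model space.
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3
var = (s[:k] ** 2) / (Xc.shape[0] - 1)  # variance of each score

def t2_and_q(x):
    """Hotelling's T2 (distance within the model plane, scaled by
    score variance) and Q residual (squared distance to the plane)
    for one new spectrum x."""
    xc = x - X_mean
    t = Vt[:k] @ xc                      # project onto the loadings
    t2 = float(np.sum(t ** 2 / var))
    residual = xc - Vt[:k].T @ t         # part the model cannot explain
    return t2, float(residual @ residual)

t2, q = t2_and_q(X[0])
print(round(t2, 2), round(q, 2))  # compare against control limits
```

A candidate sample lying inside both control limits is "inside the defined space" and can be kept in the calibration set; a large T(2) or Q flags it as an outlier.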

  11. Evaluation of the quality of care of a multi-disciplinary risk factor assessment and management programme (RAMP) for diabetic patients

    PubMed Central

    2012-01-01

    Background Type 2 Diabetes Mellitus (DM) is a common chronic disease associated with multiple clinical complications. Management guidelines have been established which recommend a risk-stratified approach to managing these patients in primary care. This study aims to evaluate the quality of care (QOC) and effectiveness of a multi-disciplinary risk assessment and management programme (RAMP) for type 2 diabetic patients attending government-funded primary care clinics in Hong Kong. The evaluation will be conducted using a structured and comprehensive evidence-based evaluation framework. Method/design For evaluation of the quality of care, a longitudinal study will be conducted using the Action Learning and Audit Spiral methodologies to measure whether the pre-set target standards for criteria related to the structure and process of care are achieved. Each participating clinic will be invited to complete a Structure of Care Questionnaire evaluating pre-defined indicators which reflect the setting in which care is delivered, while process of care will be evaluated against the pre-defined indicators in the evaluation framework. Effectiveness of the programme will be evaluated in terms of clinical outcomes, service utilization outcomes, and patient-reported outcomes. A cohort study will be conducted on all eligible diabetic patients who have been enrolled in RAMP for more than one year to compare the clinical and public service utilization outcomes of RAMP participants and non-participants. Clinical outcome measures will include HbA1c, blood pressure (both systolic and diastolic), lipids (low-density lipoprotein cholesterol) and future cardiovascular disease risk prediction; public health service utilization will include general and specialist outpatient attendances, emergency department attendances, and hospital admissions annually within 5 years. 
For patient-reported outcomes, a total of 550 participants and another 550 non-participants will be followed up by telephone to monitor quality of life, patient enablement, global rating of change in health and private health service utilization at baseline, 6, 12, 36 and 60 months. Discussion The quality of care and the effectiveness of the RAMP in enhancing the health of patients with type 2 diabetes will be determined. Possible areas for quality enhancement will be identified and standards of good practice can be established. The information will be useful in guiding service planning and policy decision making. PMID:23216708

  12. Evaluation of the quality of care of a multi-disciplinary risk factor assessment and management programme (RAMP) for diabetic patients.

    PubMed

    Fung, Colman S C; Chin, Weng Yee; Dai, Daisy S K; Kwok, Ruby L P; Tsui, Eva L H; Wan, Yuk Fai; Wong, Wendy; Wong, Carlos K H; Fong, Daniel Y T; Lam, Cindy L K

    2012-12-05

    Type 2 Diabetes Mellitus (DM) is a common chronic disease associated with multiple clinical complications. Management guidelines have been established which recommend a risk-stratified approach to managing these patients in primary care. This study aims to evaluate the quality of care (QOC) and effectiveness of a multi-disciplinary risk assessment and management programme (RAMP) for type 2 diabetic patients attending government-funded primary care clinics in Hong Kong. The evaluation will be conducted using a structured and comprehensive evidence-based evaluation framework. For evaluation of the quality of care, a longitudinal study will be conducted using the Action Learning and Audit Spiral methodologies to measure whether the pre-set target standards for criteria related to the structure and process of care are achieved. Each participating clinic will be invited to complete a Structure of Care Questionnaire evaluating pre-defined indicators which reflect the setting in which care is delivered, while process of care will be evaluated against the pre-defined indicators in the evaluation framework. Effectiveness of the programme will be evaluated in terms of clinical outcomes, service utilization outcomes, and patient-reported outcomes. A cohort study will be conducted on all eligible diabetic patients who have been enrolled in RAMP for more than one year to compare the clinical and public service utilization outcomes of RAMP participants and non-participants. Clinical outcome measures will include HbA1c, blood pressure (both systolic and diastolic), lipids (low-density lipoprotein cholesterol) and future cardiovascular disease risk prediction; public health service utilization will include general and specialist outpatient attendances, emergency department attendances, and hospital admissions annually within 5 years. 
For patient-reported outcomes, a total of 550 participants and another 550 non-participants will be followed up by telephone to monitor quality of life, patient enablement, global rating of change in health and private health service utilization at baseline, 6, 12, 36 and 60 months. The quality of care and the effectiveness of the RAMP in enhancing the health of patients with type 2 diabetes will be determined. Possible areas for quality enhancement will be identified and standards of good practice can be established. The information will be useful in guiding service planning and policy decision making.

  13. The coverage and frequency of mass drug administration required to eliminate persistent transmission of soil-transmitted helminths

    PubMed Central

    Anderson, Roy; Truscott, James; Hollingsworth, T. Deirdre

    2014-01-01

    A combination of methods, including mathematical model construction, demographic plus epidemiological data analysis and parameter estimation, are used to examine whether mass drug administration (MDA) alone can eliminate the transmission of soil-transmitted helminths (STHs). Numerical analyses suggest that in all but low transmission settings (as defined by the magnitude of the basic reproductive number, R0), the treatment of pre-school-aged children (pre-SAC) and school-aged children (SAC) is unlikely to drive transmission to a level where the parasites cannot persist. High levels of coverage (defined as the fraction of an age group effectively treated) are required in pre-SAC, SAC and adults, if MDA is to drive the parasite below the breakpoint under which transmission is eliminated. Long-term solutions to controlling helminth infections lie in concomitantly improving the quality of the water supply, sanitation and hygiene (WASH). MDA, however, is a very cost-effective tool in long-term control given that most drugs are donated free by the pharmaceutical industry for poor regions of the world. WASH interventions, by lowering the basic reproductive number, can facilitate the ability of MDA to interrupt transmission. PMID:24821921
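    The coverage argument in the abstract can be illustrated with a deliberately crude calculation: if each of f annual MDA rounds reaches a fraction c of the population with a drug of efficacy h, a naive effective reproduction number is R0(1 - ch)^f, giving a critical coverage of (1 - R0^(-1/f))/h. This is only a sketch of the qualitative point that higher R0 demands higher coverage; the paper's age-structured models are far richer.

```python
def critical_coverage(r0: float, efficacy: float, rounds_per_year: int = 1) -> float:
    """Coverage fraction needed to push the crude R0 * (1 - c*h)**f below 1."""
    c = (1.0 - r0 ** (-1.0 / rounds_per_year)) / efficacy
    return min(c, 1.0)   # cap: coverage cannot exceed 100%

# Drug efficacy of ~95% is an assumed, illustrative value
for r0 in (1.5, 2.0, 3.0):
    c = critical_coverage(r0, efficacy=0.95)
    print(f"R0={r0}: ~{c:.0%} coverage needed with one annual round")
```

    Even this toy version reproduces the abstract's message: outside low-transmission settings, very high coverage (and treatment beyond school-aged children) is required to reach the breakpoint.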

  14. The coverage and frequency of mass drug administration required to eliminate persistent transmission of soil-transmitted helminths.

    PubMed

    Anderson, Roy; Truscott, James; Hollingsworth, T Deirdre

    2014-01-01

    A combination of methods, including mathematical model construction, demographic plus epidemiological data analysis and parameter estimation, are used to examine whether mass drug administration (MDA) alone can eliminate the transmission of soil-transmitted helminths (STHs). Numerical analyses suggest that in all but low transmission settings (as defined by the magnitude of the basic reproductive number, R0), the treatment of pre-school-aged children (pre-SAC) and school-aged children (SAC) is unlikely to drive transmission to a level where the parasites cannot persist. High levels of coverage (defined as the fraction of an age group effectively treated) are required in pre-SAC, SAC and adults, if MDA is to drive the parasite below the breakpoint under which transmission is eliminated. Long-term solutions to controlling helminth infections lie in concomitantly improving the quality of the water supply, sanitation and hygiene (WASH). MDA, however, is a very cost-effective tool in long-term control given that most drugs are donated free by the pharmaceutical industry for poor regions of the world. WASH interventions, by lowering the basic reproductive number, can facilitate the ability of MDA to interrupt transmission.

  15. Alternative indicators for monitoring the quality of a continuous intervention program on antibiotic prescribing during changing healthcare conditions.

    PubMed

    Bantar, C; Franco, D; Heft, C; Vesco, E; Arango, C; Izaguirre, M; Alcázar, G; Boleas, M; Oliva, M E

    2005-06-01

    We recently published on the impact of a four-phase, hospital-wide intervention program designed to optimize the quality of antibiotic use, in which a multidisciplinary team (MDT) could modify prescriptions during the last phase. Because healthcare quality was changing over the last 5 years (late 1999 to early 2004), we developed indicators to monitor the quality of our intervention over time. Different periods were defined: baseline (pre-intervention), initial intervention-active control, pre-crisis control, crisis control, post-crisis control and end-of-crisis control. Major indicators were the rate of prescription modification by the MDT, the rate of prescription for an uncertain infection, and a novel index formula (RIcarb) to estimate the rationale for carbapenem use. We assessed 2115 antimicrobial prescriptions. The prescription modification rate was 30% at the beginning and decreased thereafter to stable levels. The rate of prescriptions ordered for cases of both uncertain infection and unknown source of infection decreased significantly after intervention (i.e. from baseline to active control). In contrast, a doubling of culture-directed prescriptions was observed between these periods. RIcarb values lower and higher than 60% (modal cut-off) were taken to indicate carbapenem overuse and underuse, respectively. Overuse was observed during the pre-intervention period, while pronounced underuse was seen during the crisis (RIcarb, 45% and 87%, respectively). The present study demonstrates that certain indicators, other than the widely adopted impact outcomes, are a suitable tool for monitoring the quality of a continuous, long-term, active intervention on antimicrobial prescribing practice, especially when applied in a changing healthcare setting.

  16. Development of NIRS method for quality control of drug combination artesunate–azithromycin for the treatment of severe malaria

    PubMed Central

    Boyer, Chantal; Gaudin, Karen; Kauss, Tina; Gaubert, Alexandra; Boudis, Abdelhakim; Verschelden, Justine; Franc, Mickaël; Roussille, Julie; Boucher, Jacques; Olliaro, Piero; White, Nicholas J.; Millet, Pascal; Dubost, Jean-Pierre

    2012-01-01

    Near infrared spectroscopy (NIRS) methods were developed for the determination of the analytical content of an antimalarial-antibiotic (artesunate and azithromycin) co-formulation in hard gelatin capsules (HGC). The NIRS approach consists of pre-processing treatment of spectra (raw spectra and first derivatives of two spectral zones), a single principal component analysis model to ensure specificity, and then two partial least-squares regression models for the content determination of each active pharmaceutical ingredient. The NIRS methods were developed and validated with no reference method, since the manufacturing process of the HGC is basically the mixing of excipients with the active pharmaceutical ingredients. The accuracy profiles showed β-expectation tolerance limits within the acceptance limits (±5%). The analytical control approach by reversed-phase HPLC required two different methods involving different preparation and chromatographic conditions. NIRS offers advantages in terms of lower equipment and procedural costs, time savings and environmental friendliness. PMID:22579599

  17. The 2-D Ion Chromatography Development and Application: Determination of Sulfate in Formation Water at Pre-Salt Region

    NASA Astrophysics Data System (ADS)

    Tonietto, G. B.; Godoy, J. M.; Almeida, A. C.; Mendes, D.; Soluri, D.; Leite, R. S.; Chalom, M. Y.

    2015-12-01

    Formation water is the naturally occurring water contained within the geological formation itself. Both the quantity and the quality of formation water can be problematic. Over time, the water volume should decrease as the gas volume increases. Formation water has been found to contain high levels of Cl, As, Fe, Ba, Mn and PAHs, and may even contain naturally occurring radioactive materials. Chloride concentrations have in some cases been found to exceed four to five times those found in the ocean. In the management of well operations, sulfate is among the analytes of greatest importance because of the potential for hydrogen sulphide formation and consequent corrosion of pipelines. As the concentration of sulfate in these waters can be far lower than that of chloride, its quantitative determination by ion chromatography constitutes an analytical challenge. This work aimed to develop and validate a method for the determination of sulphate ions in the hyper-saline waters coming from the oil wells of the pre-salt region, using 2D IC. In 2D IC, the first column acts as a separating column, in which species with retention times outside a preset range are discarded, while those within the range are retained on a pre-concentrator column for injection onto a second column, the second dimension, in which the separation and quantification of the analytes of interest take place. As chloride ions have a retention time lower than that of sulfate, a method was developed for determining sulfate at very low levels (mg L-1) by 2D IC, applicable to hypersaline waters, in which the first dimension is used to eliminate the matrix, i.e. chloride ions, and the second dimension is used to determine sulfate. For sulphate at a concentration of 1.00 mg L-1, an accuracy of 1.0% was obtained. 
The accuracy of the method was tested by the standard addition method on different samples of formation water from the pre-salt region, with a relative error of less than 1.0% at a concentration of 5.0 mg L-1. This work allowed the determination of sulfate in hyper-saline samples such as those found in pre-salt exploration. Studies are under way to validate the determination of bromide in pre-salt water using 2D liquid chromatography.
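    The standard-addition validation mentioned above can be sketched as a linear extrapolation: known sulfate amounts are spiked into the sample, the response is regressed on added concentration, and the original concentration is read from the magnitude of the x-intercept. The spike levels and signals below are invented for illustration.

```python
# Least-squares fit of signal = a + b * added (stdlib only)
added = [0.0, 2.0, 4.0, 6.0]      # mg/L sulfate spiked into the sample
signal = [5.1, 7.1, 9.0, 11.1]    # illustrative detector response

n = len(added)
mx = sum(added) / n
my = sum(signal) / n
b = sum((x - mx) * (y - my) for x, y in zip(added, signal)) \
    / sum((x - mx) ** 2 for x in added)
a = my - b * mx
c0 = a / b                         # |x-intercept| = original concentration
print(f"estimated sulfate in sample: {c0:.2f} mg/L")
```

    The strength of standard addition here is that calibration happens inside the hyper-saline matrix itself, sidestepping the matrix effects that make external calibration unreliable.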

  18. Analytic versus systemic group therapy for women with a history of child sexual abuse: 1-year follow-up of a randomized controlled trial.

    PubMed

    Elkjaer, Henriette; Kristensen, Ellids; Mortensen, Erik L; Poulsen, Stig; Lau, Marianne

    2014-06-01

    This randomized prospective study examines durability of improvement in general symptomatology, psychosocial functioning and interpersonal problems, and compares the long-term efficacy of analytic and systemic group psychotherapy in women 1 year after completion of treatment for childhood sexual abuse. Women (n = 106) randomly assigned to analytic or systemic psychotherapy completed the Symptom Checklist-90-R, Global Assessment of Functioning, Global Life Quality, Registration Chart Questionnaire, and Flashback Registration at pre-treatment, post-treatment, and at a 1-year follow-up. Post-treatment gains were significant for both treatment modalities on all measures, but significantly larger after systemic therapy. Significant treatment response was maintained 1-year post-treatment, but different trajectories were observed: 1 year after treatment completion, improvements for analytic therapy were maintained, whereas they decreased after systemic therapy, resulting in no statistically significant difference in gains between the groups at the 1-year follow-up. Despite maintaining significant gains, more than half of the patients remained above cut-off for caseness concerning general symptomatology at post-treatment and at 1-year follow-up. The findings stress the importance of long-term follow-up data in effect studies. Different trajectories were associated with the two treatments, but improvement in the two treatment groups did not differ significantly at the 1-year follow-up. Implications of the difference in trajectories for treatment planning are discussed. Both analytic and systemic group therapy proved efficient in improving general symptomatology, psychosocial functioning, and interpersonal problems in women with a history of CSA and gains were maintained at a 1-year follow-up. 
Despite maintaining statistically significant gains at the 1-year follow-up, 54% of the patients remained above the cut-off for caseness with respect to general symptomatology, which may indicate a need for further treatment. Different pre-post follow-up treatment trajectories were observed between the two treatment modalities: while systemic group therapy showed a significantly better outcome immediately after termination, gains in the systemic treatment group decreased during follow-up, whereas gains were maintained during follow-up in analytic group therapy. © 2013 The British Psychological Society.

  19. Application of analytical quality by design principles for the determination of alkyl p-toluenesulfonates impurities in Aprepitant by HPLC. Validation using total-error concept.

    PubMed

    Zacharis, Constantinos K; Vastardi, Elli

    2018-02-20

    In the research presented we report the development of a simple and robust liquid chromatographic method for the quantification of two genotoxic alkyl sulphonate impurities (namely methyl p-toluenesulfonate and isopropyl p-toluenesulfonate) in Aprepitant API substances using the Analytical Quality by Design (AQbD) approach. Following the steps of the AQbD protocol, the selected critical method attributes (CMAs) were the separation criteria between the critical peak pairs, the analysis time and the peak efficiencies of the analytes. The critical method parameters (CMPs) included the flow rate, the gradient slope and the acetonitrile content at the first step of the gradient elution program. Multivariate experimental designs, namely Plackett-Burman and Box-Behnken designs, were conducted sequentially for factor screening and optimization of the method parameters. The optimal separation conditions were estimated using the desirability function. The method was fully validated in the range of 10-200% of the target concentration limit of the analytes using the "total error" approach. Accuracy profiles (a graphical decision-making tool) were constructed using the results of the validation procedures. The β-expectation tolerance intervals did not exceed the acceptance criteria of ±10%, meaning that 95% of future results will fall within the defined bias limits. The relative bias ranged between -1.3% and 3.8% for both analytes, while the RSD values for repeatability and intermediate precision were less than 1.9% in all cases. The achieved limit of detection (LOD) and limit of quantification (LOQ) were adequate for the intended purpose and were found to be 0.02% (corresponding to 48 μg g-1 in sample) for both methyl and isopropyl p-toluenesulfonate. As proof of concept, the validated method was successfully applied to the analysis of several Aprepitant batches, indicating that this methodology could be used for routine quality control analyses. Copyright © 2017 Elsevier B.V. All rights reserved.
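    The β-expectation tolerance-interval check used in such validations can be sketched as follows: at each concentration level, the interval expected to contain 95% of future results is compared with the ±10% acceptance limits. The replicate data and the simple Student-t interval below are illustrative assumptions, not the study's actual computation, which follows the full total-error methodology.

```python
from statistics import mean, stdev

T_975_DF5 = 2.571  # Student-t quantile for beta = 95%, n = 6 replicates

def accuracy_profile_point(recoveries_pct, accept=10.0):
    """Return (bias, low, high, ok) for one level of n = 6 replicates.

    recoveries_pct: measured/nominal * 100 for each replicate.
    """
    n = len(recoveries_pct)
    bias = mean(recoveries_pct) - 100.0
    s = stdev(recoveries_pct)
    # beta-expectation half-width: t * s * sqrt(1 + 1/n)
    half = T_975_DF5 * s * (1.0 + 1.0 / n) ** 0.5
    low, high = bias - half, bias + half
    return bias, low, high, (low >= -accept and high <= accept)

# Six made-up replicate recoveries (%) at one concentration level
level = [99.1, 100.4, 98.8, 101.2, 99.7, 100.1]
bias, lo, hi, ok = accuracy_profile_point(level)
print(f"bias={bias:+.1f}%  tolerance interval [{lo:.1f}, {hi:.1f}]  pass={ok}")
```

    An accuracy profile is simply this computation repeated across the validated range, with the interval bounds plotted against the acceptance limits.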

  20. Spacecraft attitude calibration/verification baseline study

    NASA Technical Reports Server (NTRS)

    Chen, L. C.

    1981-01-01

    A baseline study for a generalized spacecraft attitude calibration/verification system is presented. It can be used to define software specifications for three major functions required by a mission: the pre-launch parameter observability and data collection strategy study; the in-flight sensor calibration; and the post-calibration attitude accuracy verification. Analytical considerations are given for both single-axis and three-axis spacecrafts. The three-axis attitudes considered include the inertial-pointing attitudes, the reference-pointing attitudes, and attitudes undergoing specific maneuvers. The attitude sensors and hardware considered include the Earth horizon sensors, the plane-field Sun sensors, the coarse and fine two-axis digital Sun sensors, the three-axis magnetometers, the fixed-head star trackers, and the inertial reference gyros.

  1. A systematic review of online resources to support patient decision-making for full-thickness rectal prolapse surgery.

    PubMed

    Fowler, G E; Baker, D M; Lee, M J; Brown, S R

    2017-11-01

    The internet is becoming an increasingly popular resource to support patient decision-making outside of the clinical encounter. The quality of online health information is variable and largely unregulated. The aim of this study was to assess the quality of online resources to support patient decision-making for full-thickness rectal prolapse surgery. This systematic review was registered on the PROSPERO database (CRD42017058319). Searches were performed on Google and specialist decision aid repositories using a pre-defined search strategy. Sources were analysed according to three measures: (1) readability, using the Flesch-Kincaid Reading Ease score; (2) DISCERN score; and (3) International Patient Decision Aids Standards (IPDAS) minimum standards criteria score (IPDASi, v4.0). Overall, 95 sources were identified from Google and the specialist decision aid repositories. Fifty-three duplicates were removed, and 18 sources did not meet the pre-defined eligibility criteria, leaving 24 sources for full-text analysis. The mean Flesch-Kincaid Reading Ease score (48.8 ± 15.6, range 25.2-85.3) indicated reading material more difficult than is recommended for patient education materials. Overall quality of sources supporting patient decision-making for full-thickness rectal prolapse surgery was poor (median DISCERN score 1/5 ± 1.18, range 1-5). No sources met minimum decision-making standards (median IPDASi score 5/12 ± 2.01, range 1-8). Currently, easily accessible online health information to support patient decision-making for rectal surgery is of poor quality, difficult to read and does not support shared decision-making. It is recommended that professional bodies and medical professionals seek to develop decision aids to support decision-making for full-thickness rectal prolapse surgery.
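    The Flesch-Kincaid Reading Ease score used in the review is a simple formula: 206.835 - 1.015 x (words/sentence) - 84.6 x (syllables/word), where higher scores mean easier text. A minimal sketch, with a crude vowel-group syllable counter standing in for a proper dictionary lookup:

```python
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: each run of vowels counts as one syllable
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_reading_ease(text: str) -> float:
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    n = max(1, len(words))
    return 206.835 - 1.015 * (n / sentences) - 84.6 * (syllables / n)

simple = "The cat sat. The dog ran. We all laughed."
dense = ("Comprehensive multidisciplinary evaluation necessitates "
         "standardised documentation of perioperative complications.")
print(flesch_reading_ease(simple) > flesch_reading_ease(dense))  # simpler text scores higher
```

    Published tools use dictionary-based syllable counts, so absolute scores will differ; the ranking of easy versus dense text is the robust part.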

  2. Quality Indicators for Learning Analytics

    ERIC Educational Resources Information Center

    Scheffel, Maren; Drachsler, Hendrik; Stoyanov, Slavi; Specht, Marcus

    2014-01-01

    This article proposes a framework of quality indicators for learning analytics that aims to standardise the evaluation of learning analytics tools and to provide a mean to capture evidence for the impact of learning analytics on educational practices in a standardised manner. The criteria of the framework and its quality indicators are based on…

  3. Using constraints and their value for optimization of large ODE systems

    PubMed Central

    Domijan, Mirela; Rand, David A.

    2015-01-01

    We provide analytical tools to facilitate a rigorous assessment of the quality and value of the fit of a complex model to data. We use this to provide approaches to model fitting, parameter estimation, the design of optimization functions and experimental optimization. This is in the context where multiple constraints are used to select or optimize a large model defined by differential equations. We illustrate the approach using models of circadian clocks and the NF-κB signalling system. PMID:25673300

  4. Microbial ecology laboratory procedures manual NASA/MSFC

    NASA Technical Reports Server (NTRS)

    Huff, Timothy L.

    1990-01-01

    An essential part of the efficient operation of any microbiology laboratory involved in sample analysis is a standard procedures manual. The purpose of this manual is to provide concise and well defined instructions on routine technical procedures involving sample analysis and methods for monitoring and maintaining quality control within the laboratory. Of equal importance is the safe operation of the laboratory. This manual outlines detailed procedures to be followed in the microbial ecology laboratory to assure safety, analytical control, and validity of results.

  5. A clinical research analytics toolkit for cohort study.

    PubMed

    Yu, Yiqin; Zhu, Yu; Sun, Xingzhi; Tao, Ying; Zhang, Shuo; Xu, Linhao; Pan, Yue

    2012-01-01

    This paper presents a clinical informatics toolkit that can assist physicians to conduct cohort studies effectively and efficiently. The toolkit has three key features: 1) support of procedures defined in epidemiology, 2) recommendation of statistical methods in data analysis, and 3) automatic generation of research reports. On one hand, our system can help physicians control research quality by leveraging the integrated knowledge of epidemiology and medical statistics; on the other hand, it can improve productivity by reducing the complexities for physicians during their cohort studies.

  6. Determining your organization's 'risk capability'.

    PubMed

    Hannah, Bill; Hancock, Melinda

    2014-05-01

    An assessment of a provider's level of risk capability should focus on three key elements: Business intelligence, including sophisticated analytical models that can offer insight into the expected cost and quality of care for a given population. Clinical enterprise maturity, marked by the ability to improve health outcomes and to manage utilization and costs to drive change. Revenue transformation, emphasizing the need for a revenue cycle platform that allows for risk acceptance and management and that provides incentives for performance against defined objectives.

  7. Quality control of the tribological coating PS212

    NASA Technical Reports Server (NTRS)

    Sliney, Harold E.; Dellacorte, Christopher; Deadmore, Daniel L.

    1989-01-01

    PS212 is a self-lubricating, composite coating that is applied by the plasma spray process. It is a functional lubricating coating from 25 C (or lower) to 900 C. The coating is prepared from a blend of three different powders with very dissimilar properties. Therefore, the final chemical composition and lubricating effectiveness of the coatings are very sensitive to the process variables used in their preparation. Defined here are the relevant variables. The process and analytical procedures that will result in satisfactory tribological coatings are discussed.

  8. The Spanish external quality assessment scheme for mercury in urine.

    PubMed

    Quintana, M J; Mazarrasa, O

    1996-01-01

    In 1986 the Instituto Nacional de Seguridad e Higiene en el Trabajo (INSHT), established the "Programa interlaboratorios de control de calidad de mercurio en orina (PICC-HgU)". The operation of this scheme is explained, criteria for evaluation of laboratory performance are defined and some results obtained are reviewed. Since the scheme started, an improvement in the overall performance of laboratories has been observed. The differences in the analytical methods used by laboratories do not seem to have a clear influence on the results.

  9. Harmonization activities of Noklus - a quality improvement organization for point-of-care laboratory examinations.

    PubMed

    Stavelin, Anne; Sandberg, Sverre

    2018-05-16

    Noklus is a non-profit quality improvement organization that focuses on improving all elements in the total testing process. The aim is to ensure that all medical laboratory examinations are ordered, performed and interpreted correctly and in accordance with the patients' needs for investigation, treatment and follow-up. For 25 years, Noklus has focused on point-of-care (POC) testing in primary healthcare laboratories and has more than 3100 voluntary participants. The Noklus quality system uses different tools to obtain harmonization and improvement: (1) external quality assessment for the pre-examination, examination and post-examination phases to monitor the harmonization process and to identify areas that need improvement and harmonization, (2) manufacturer-independent evaluations of the analytical quality and user-friendliness of POC instruments and (3) close interactions with and follow-up of the participants through site visits, courses, training and guidance. Noklus also recommends which tests should be performed in different facilities such as general practitioner offices, nursing homes and home care services. About 400 courses with more than 6000 delegates are organized annually. In 2017, more than 21,000 e-learning programs were completed.

  10. Analytical performances of food microbiology laboratories - critical analysis of 7 years of proficiency testing results.

    PubMed

    Abdel Massih, M; Planchon, V; Polet, M; Dierick, K; Mahillon, J

    2016-02-01

    Based on the results of 19 food microbiology proficiency testing (PT) schemes, this study aimed to assess laboratory performance, to highlight the main sources of unsatisfactory analytical results and to suggest areas for improvement. The 2009-2015 results of the REQUASUD and IPH PT schemes, involving a total of 48 laboratories, were analysed. On average, the laboratories failed to detect or enumerate foodborne pathogens in 3.0% of the tests. Thanks to close collaboration with the PT participants, the causes of outliers could be identified in 74% of cases. The main causes of erroneous PT results were pre-analytical (handling of the samples, timing of analysis), analytical (unsuitable methods, confusion of samples, errors in colony counting or confirmation) or post-analytical (calculation and encoding of results). PT schemes are a privileged observation post for highlighting analytical problems that would otherwise remain unnoticed. In this perspective, this comprehensive study of PT results provides insight into the sources of systematic errors encountered during the analyses. The study draws the attention of laboratories to the main causes of analytical errors and, for educational purposes, suggests practical solutions to avoid them. The observations support the hypothesis that regular participation in PT, when followed by feedback and appropriate corrective actions, can play a key role in quality improvement and provide more confidence in laboratory testing results. © 2015 The Society for Applied Microbiology.

  11. Simultaneous analysis of cerebrospinal fluid biomarkers using microsphere-based xMAP multiplex technology for early detection of Alzheimer's disease.

    PubMed

    Kang, Ju-Hee; Vanderstichele, Hugo; Trojanowski, John Q; Shaw, Leslie M

    2012-04-01

    The xMAP-Luminex multiplex platform for measurement of Alzheimer's disease (AD) cerebrospinal fluid (CSF) biomarkers using Innogenetics AlzBio3 immunoassay reagents (which are for research use only) has been shown to be an effective tool for early detection of an AD-like biomarker signature based on concentrations of CSF Aβ(1-42), t-tau and p-tau(181). Among the several advantages of the xMAP-Luminex platform for AD CSF biomarkers are: a wide dynamic range of ready-to-use calibrators, time savings from the simultaneous analysis of three biomarkers in one analytical run, reduction of human error, potentially reduced reagent costs, and a modest reduction in sample volume compared with conventional enzyme-linked immunosorbent assay (ELISA) methodology. Recent clinical studies support the use of CSF Aβ(1-42), t-tau and p-tau(181) measurement using the xMAP-Luminex platform for the early detection of AD pathology in cognitively normal individuals, and for prediction of progression to AD dementia in subjects with mild cognitive impairment (MCI). Studies that have shown the prediction of risk for progression to AD dementia in MCI patients provide the basis for the use of CSF Aβ(1-42), t-tau and p-tau(181) testing to assign risk for progression in patients enrolled in therapeutic trials. Furthermore, emerging study data suggest that these pathologic changes occur in cognitively normal subjects 20 or more years before the onset of clinically detectable memory changes, thus providing an objective measurement for use in the assessment of treatment effects in primary treatment trials. However, numerous previous ELISA and Luminex-based multiplex studies reported a wide range of absolute values of CSF Aβ(1-42), t-tau and p-tau(181), indicative of substantial inter-laboratory variability as well as varying degrees of intra-laboratory imprecision. 
To address these issues, a recent inter-laboratory investigation assessed the reproducibility of the multiplex methodology and Innogenetics AlzBio3 immunoassay reagents, using a common set of CSF pool aliquots from controls and AD patients spanning normal and pathological Aβ(1-42), t-tau and p-tau(181) values, together with agreed-on standard operating procedures (SOPs). This study showed within-center precision values of 5% to a little more than 10% and good inter-laboratory %CV values (10-20%). Several factors are likely to influence the variability of CSF Aβ(1-42), t-tau and p-tau(181) measurements. In this review, we describe the pre-analytical, analytical and post-analytical sources of variability, including sources inherent to the kits, and describe procedures to decrease the variability. A CSF AD biomarker Quality Control program has been established and funded by the Alzheimer's Association, and global efforts are underway to further define optimal pre-analytical SOPs and best practices for the methodologies available or in development, including plans for production of a standard reference material that could provide a common standard against which manufacturers of immunoassay kits would assign calibration standard values. Copyright © 2012 Elsevier Inc. All rights reserved.
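The within-center and inter-laboratory precision figures quoted above are coefficients of variation (%CV = 100 × SD / mean). A minimal sketch of the calculation, using hypothetical replicate values rather than the study's actual data:

```python
from statistics import mean, stdev

def percent_cv(values):
    """Coefficient of variation as a percentage: 100 * sample SD / mean."""
    return 100.0 * stdev(values) / mean(values)

# Hypothetical Abeta(1-42) results (pg/mL) for one CSF pool aliquot:
# five replicates within one centre, and the per-centre means across five labs.
within_centre = [152.0, 148.0, 155.0, 150.0, 149.0]
centre_means = [151.0, 140.0, 162.0, 158.0, 145.0]

within_cv = percent_cv(within_centre)   # within-centre imprecision
between_cv = percent_cv(centre_means)   # inter-laboratory variability
```

With values like these, the within-centre %CV falls well below the inter-laboratory %CV, mirroring the pattern (roughly 5-10% vs. 10-20%) reported in the study.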

  12. Early Discharge Planning and Improved Care Transitions: Pre-Admission Assessment for Readmission Risk in an Elective Orthopedic and Cardiovascular Surgical Population

    PubMed Central

    Mola, Ana; Rosenfeld, Peri; Ford, Shauna

    2016-01-01

    Background/Methods: Readmission prevention is a marker of patient care quality and requires comprehensive, early discharge planning for safe hospital transitions. Effectively performed, this process supports patient satisfaction, efficient resource utilization, and care integration. This study developed and tested the utility of a predictive early discharge risk assessment with 366 elective orthopedic/cardiovascular surgery patients. Quality improvement cycles were undertaken to refine the design and to inform the analytic plan. An 8-item questionnaire, which includes patient self-reported health, was integrated into care managers’ telephonic pre-admission assessments during a 12-month period. Results: Regression models found the questionnaire to be predictive of readmission (p ≤ .005; R2 = .334) and length-of-stay (p ≤ .001; R2 = .314). Independent variables of “lives-alone” and “self-rated health” were statistically significant for increased readmission odds, as was “self-rated health” for increased length-of-stay. Quality measures, patient experience and increased rates of discharges-to-home further supported the benefit of embedding these questions into the pro-active planning process. Conclusion: The pilot discharge risk assessment was predictive of readmission risk and length-of-stay for elective orthopedic/cardiovascular patients. Given the usability of the questionnaire in advance of elective admissions, it can facilitate pro-active discharge planning essential for producing quality outcomes and addressing new reimbursement methodologies for continuum-based episodes of care. PMID:27616965
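A questionnaire-based readmission model of the kind described is typically a logistic regression over the item responses. The sketch below is purely illustrative: the intercept and coefficients are hypothetical placeholders, not the paper's fitted values, and only the two predictors reported as significant are included.

```python
import math

# Hypothetical coefficients for a logistic readmission-risk model.
# The study's actual fitted values are not reproduced here.
INTERCEPT = -3.0
COEFFS = {"lives_alone": 0.9, "self_rated_health_poor": 1.1}

def readmission_probability(lives_alone: bool, self_rated_health_poor: bool) -> float:
    """Logistic model: p = 1 / (1 + exp(-(b0 + b1*x1 + b2*x2)))."""
    logit = (INTERCEPT
             + COEFFS["lives_alone"] * int(lives_alone)
             + COEFFS["self_rated_health_poor"] * int(self_rated_health_poor))
    return 1.0 / (1.0 + math.exp(-logit))

# A patient who lives alone and rates their health as poor scores a higher
# pre-admission risk than one with neither factor.
high_risk = readmission_probability(True, True)
low_risk = readmission_probability(False, False)
```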

  13. Using baldrige performance excellence program approaches in the pursuit of radiation oncology quality care, patient satisfaction, and workforce commitment.

    PubMed

    Sternick, Edward S

    2011-01-01

    The Malcolm Baldrige National Quality Improvement Act was signed into law in 1987 to advance US business competitiveness and economic growth. Administered by the National Institute of Standards and Technology, the Act created the Baldrige National Quality Program, recently renamed the Baldrige Performance Excellence Program. The comprehensive analytical approaches, referred to as the Baldrige Healthcare Criteria, are very well suited for the evaluation and sustainable improvement of radiation oncology management and operations. A multidisciplinary self-assessment approach is used for radiotherapy program evaluation and development in order to generate a fact-based, knowledge-driven system for improving quality of care, increasing patient satisfaction, enhancing leadership effectiveness, building employee engagement, and boosting organizational innovation. This methodology also provides a valuable framework for benchmarking an individual radiation oncology practice's operations and results against guidelines defined by accreditation and professional organizations and regulatory agencies.

  14. Sponsor relationships, analyte stability in ligand-binding assays and critical reagent management: a bioanalytical CRO perspective.

    PubMed

    Lefor Bradford, Julia

    2015-01-01

    This perspective article discusses key points to address in the establishment of sound partnerships between sponsors and bioanalytical CROs to assure the timeliness, quality and consistency of bioanalysis throughout biological therapeutic development. The performance of ligand-binding assays can be greatly impacted by low-grade reagents, lot-to-lot variability and lack of stability of the analyte in matrix, impacting both timelines and cost. Thorough characterization of the biologic of interest and its assay-enabling critical reagents will lend itself well to conservation of materials and continuity of assay performance. When unplanned events occur, such as performance declines or premature depletion of material, structured procedures are paramount to supplement the current loosely defined regulatory guidance on critical reagent characterization and method bridging.

  15. Membrane inlet mass spectrometry of volatile organohalogen compounds in drinking water.

    PubMed

    Bocchini, P; Pozzi, R; Andalò, C; Galletti, G C

    1999-01-01

    The analysis of organic pollutants in drinking water is a topic of wide interest, with direct implications for public health and quality of life. Many different methodologies have been developed and are currently employed in this context, but they often require a time-consuming sample pre-treatment. This step affects the recovery of the highly volatile compounds. Trace analysis of volatile organic pollutants in water can be performed 'on-line' by membrane inlet mass spectrometry (MIMS). In MIMS, the sample is separated from the vacuum of the mass spectrometer by a thin polymeric hollow-fibre membrane. Gases and volatile organic compounds diffuse and concentrate from the sample into the hollow-fibre membrane, and from there into the mass spectrometer. The main advantages of the technique are that no pre-treatment of samples before analysis is needed and that it offers fast response times and on-line monitoring capabilities. This paper reports the set-up of the analytical conditions for the analysis of volatile organohalogen compounds (chloroform, bromoform, bromodichloromethane, chlorodibromomethane, tetrachloroethylene, trichloroethylene, 1,1,1-trichloroethane, and carbon tetrachloride). Linearity of response, repeatability, detection limits, and spectra quality are evaluated. Copyright 1999 John Wiley & Sons, Ltd.

  16. Assessment of lower urinary tract symptoms in different stages of menopause.

    PubMed

    Varella, Larissa Ramalho Dantas; Bezerra da Silva, Rossânia; Eugênia de Oliveira, Maria Clara; Melo, Priscylla Hellouyse Angelo; Maranhão, Técia Maria de Oliveira; Micussi, Maria Thereza Albuquerque Barbosa Cabral

    2016-11-01

    [Purpose] To assess lower urinary tract symptoms in different stages of menopause and the quality of life of females with incontinence. [Subjects and Methods] The sample consisted of 302 females, aged between 40 and 56 years, divided into three groups: PRE (n = 81), PERI (n = 108) and POST (n = 113). This was a cross-sectional, analytical, observational study. Data were collected using an assessment chart and the International Consultation on Incontinence Questionnaire - Short Form. [Results] Most of the women had less than 10 years of schooling and were married. In PERI and POST menopause, the most frequent lower urinary tract symptoms were urinary urgency and stress incontinence. The PRE group did not exhibit nocturia, urge incontinence or urinary urgency, and had the lowest symptom frequency. In all three stages, stress incontinence was the most prevalent symptom. Of the three menopause stages, PERI had a greater impact on urinary incontinence according to the International Consultation on Incontinence Questionnaire. [Conclusion] The presence of lower urinary tract symptoms can vary across the different stages of menopause, and urinary incontinence was the most frequent complaint. Moreover, it was observed that quality of life was more affected in the perimenopause stage.

  17. Assessment of lower urinary tract symptoms in different stages of menopause

    PubMed Central

    Varella, Larissa Ramalho Dantas; Bezerra da Silva, Rossânia; Eugênia de Oliveira, Maria Clara; Melo, Priscylla Hellouyse Angelo; Maranhão, Técia Maria de Oliveira; Micussi, Maria Thereza Albuquerque Barbosa Cabral

    2016-01-01

    [Purpose] To assess lower urinary tract symptoms in different stages of menopause and the quality of life of females with incontinence. [Subjects and Methods] The sample consisted of 302 females, aged between 40 and 56 years, divided into three groups: PRE (n = 81), PERI (n = 108) and POST (n = 113). This was a cross-sectional, analytical, observational study. Data were collected using an assessment chart and the International Consultation on Incontinence Questionnaire-Short Form. [Results] Most of the women had less than 10 years of schooling and were married. In PERI and POST menopause, the most frequent lower urinary tract symptoms were urinary urgency and stress incontinence. The PRE group did not exhibit nocturia, urge incontinence or urinary urgency, and had the lowest symptom frequency. In all three stages, stress incontinence was the most prevalent symptom. Of the three menopause stages, PERI had a greater impact on urinary incontinence according to the International Consultation on Incontinence Questionnaire. [Conclusion] The presence of lower urinary tract symptoms can vary across the different stages of menopause, and urinary incontinence was the most frequent complaint. Moreover, it was observed that quality of life was more affected in the perimenopause stage. PMID:27942131

  18. Pre-analytical Factors Influence Accuracy of Urine Spot Iodine Assessment in Epidemiological Surveys.

    PubMed

    Doggui, Radhouene; El Ati-Hellal, Myriam; Traissac, Pierre; El Ati, Jalila

    2018-03-26

    Urinary iodine concentration (UIC) is commonly used to assess iodine status of subjects in epidemiological surveys. As pre-analytical factors are an important source of measurement error and studies of this phase are scarce, our objective was to assess the influence of urine sampling conditions on UIC, i.e., whether the child ate breakfast or not, urine void rank of the day, and time span between last meal and urine collection. A nationwide, two-stage, stratified, cross-sectional study including 1560 children (6-12 years) was performed in 2012. UIC was determined by the Sandell-Kolthoff method. Pre-analytical factors were assessed from children's mothers by using a questionnaire. Associations between iodine status and pre-analytical factors were adjusted for one another and for socio-economic characteristics by multivariate linear and multinomial regression models (RPR: relative prevalence ratios). Skipping breakfast prior to morning urine sampling decreased UIC by 40 to 50 μg/L, and the proportion of UIC < 100 μg/L was higher among children who had skipped breakfast (RPR = 3.2[1.0-10.4]). In unadjusted analyses, UIC was lower among children sampled more than 5 h after their last meal. UIC decreased with rank of urine void (e.g., first vs. second, P < 0.001); also, the proportion of UIC < 100 μg/L was greater among 4th rank samples (vs. second: RPR = 2.1[1.1-4.0]). Subjects' breakfast status and urine void rank should be accounted for when assessing iodine status. Providing recommendations to standardize pre-analytical factors is a key step toward improving accuracy and comparability of survey results for assessing iodine status from spot urine samples. These recommendations have to be evaluated by future research.
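The RPR figures above come from adjusted multinomial regression; the unadjusted analogue is a simple prevalence ratio from a 2×2 table. A minimal sketch with hypothetical counts (not the survey's actual tabulation):

```python
def prevalence_ratio(cases_exposed, n_exposed, cases_unexposed, n_unexposed):
    """Unadjusted prevalence ratio: (a / n1) / (b / n0)."""
    return (cases_exposed / n_exposed) / (cases_unexposed / n_unexposed)

# Hypothetical counts: children with UIC < 100 ug/L among breakfast
# skippers vs. breakfast eaters.
pr = prevalence_ratio(12, 60, 90, 1500)
```

A ratio above 1 indicates that deficiency-range UIC values are more prevalent among the exposed group (here, breakfast skippers); the adjusted models in the paper additionally control for the other sampling conditions and socio-economic covariates.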

  19. Safety and quality of food contact materials. Part 1: evaluation of analytical strategies to introduce migration testing into good manufacturing practice.

    PubMed

    Feigenbaum, A; Scholler, D; Bouquant, J; Brigot, G; Ferrier, D; Franzl, R; Lillemarktt, L; Riquet, A M; Petersen, J H; van Lierop, B; Yagoubi, N

    2002-02-01

    The results of a research project (EU AIR Research Programme CT94-1025) aimed to introduce control of migration into good manufacturing practice and into enforcement work are reported. Representative polymer classes were defined on the basis of chemical structure, technological function, migration behaviour and market share. These classes were characterized by analytical methods. Analytical techniques were investigated for identification of potential migrants. High-temperature gas chromatography was shown to be a powerful method and 1H-magnetic resonance provided a convenient fingerprint of plastic materials. Volatile compounds were characterized by headspace techniques, where it was shown to be essential to differentiate volatile compounds desorbed from those generated during the thermal desorption itself. For metal trace analysis, microwave mineralization followed by atomic absorption was employed. These different techniques were introduced into a systematic testing scheme that is envisaged as being suitable both for industrial control and for enforcement laboratories. Guidelines will be proposed in the second part of this paper.

  20. Methods of Analysis by the U.S. Geological Survey National Water Quality Laboratory - Determination of Moderate-Use Pesticides and Selected Degradates in Water by C-18 Solid-Phase Extraction and Gas Chromatography/Mass Spectrometry

    USGS Publications Warehouse

    Sandstrom, Mark W.; Stroppel, Max E.; Foreman, William T.; Schroeder, Michael P.

    2001-01-01

    A method for the isolation and analysis of 21 parent pesticides and 20 pesticide degradates in natural-water samples is described. Water samples are filtered to remove suspended particulate matter and then are pumped through disposable solid-phase-extraction columns that contain octadecyl-bonded porous silica to extract the analytes. The columns are dried by using nitrogen gas, and adsorbed analytes are eluted with ethyl acetate. Extracted analytes are determined by capillary-column gas chromatography/mass spectrometry with selected-ion monitoring of three characteristic ions. The upper concentration limit is 2 micrograms per liter (µg/L) for most analytes. Single-operator method detection limits in reagent-water samples range from 0.001 to 0.057 µg/L. Validation data also are presented for 14 parent pesticides and 20 degradates that were determined to have greater bias or variability, or shorter holding times than the other compounds. The estimated maximum holding time for analytes in pesticide-grade water before extraction was 4 days. The estimated maximum holding time for analytes after extraction on the dry solid-phase-extraction columns was 7 days. An optional on-site extraction procedure allows for samples to be collected and processed at remote sites where it is difficult to ship samples to the laboratory within the recommended pre-extraction holding time. The method complements existing U.S. Geological Survey Method O-1126-95 (NWQL Schedules 2001 and 2010) by using identical sample preparation and comparable instrument analytical conditions so that sample extracts can be analyzed by either method to expand the range of analytes determined from one water sample.

  1. Experience of quality management system in a clinical laboratory in Nigeria

    PubMed Central

    Sylvester-Ikondu, Ugochukwu; Onwuamah, Chika K.; Salu, Olumuyiwa B.; Ige, Fehintola A.; Meshack, Emily; Aniedobe, Maureen; Amoo, Olufemi S.; Okwuraiwe, Azuka P.; Okhiku, Florence; Okoli, Chika L.; Fasela, Emmanuel O.; Odewale, Ebenezer. O.; Aleshinloye, Roseline O.; Olatunji, Micheal; Idigbe, Emmanuel O.

    2012-01-01

    Issues Quality-management systems (QMS) are uncommon in clinical laboratories in Nigeria, and until recently, none of the nation’s 5 349 clinical laboratories had attained the certifications necessary to begin the process of achieving international accreditation. Nigeria’s Human Virology Laboratory (HVL), however, began implementation of a QMS in 2006, and in 2008 it was determined that the laboratory conformed to the requirements of ISO 9001:2000 (now 2008), making it the first diagnostic laboratory to be certified in Nigeria. The HVL has now applied for the World Health Organization (WHO) accreditation preparedness scheme. The experience of the QMS implementation process and the lessons learned therein are shared here. Description In 2005, two personnel from the HVL spent time studying quality systems in a certified clinical laboratory in Dakar, Senegal. Following this peer-to-peer technical assistance, several training sessions were undertaken by HVL staff, a baseline assessment was conducted, and processes were established. The HVL has monitored its quality indicators and conducted internal and external audits; these analyses (from 2007 to 2009) are presented herein. Lessons learned Although there was improvement in the pre-analytical and analytical indicators analysed and although data-entry errors decreased in the post-analytical process, the delay in returning laboratory test results increased significantly. Several factors were identified as causes for this delay, and all of these have now been addressed except for an identified need for automation of some high-volume assays (currently being negotiated). Internal and external audits showed a trend of increasing non-conformities, which could be the result of personnel simply becoming lax over time. Application for laboratory accreditation, however, could provide the renewed vigour needed to correct these non-conformities. 
Recommendation This experience shows that sustainability of the QMS at present is a cause for concern. However, the tiered system of accreditation being developed by WHO–Afro may act as a driving force to preserve the spirit of continual improvement. PMID:29062734

  2. Estimating Cloud optical thickness from SEVIRI, for air quality research, by implementing a semi-analytical cloud retrieval algorithm

    NASA Astrophysics Data System (ADS)

    Pandey, Praveen; De Ridder, Koen; van Looy, Stijn; van Lipzig, Nicole

    2010-05-01

    Clouds play an important role in Earth's climate system. Because they affect radiation, and hence photolysis rate coefficients (ozone formation), they also affect air quality at the Earth's surface. A satellite remote sensing technique is therefore used to retrieve cloud properties for air quality research. The geostationary satellite Meteosat Second Generation (MSG) carries on board the Spinning Enhanced Visible and Infrared Imager (SEVIRI). The channels at wavelengths of 0.6 µm and 1.64 µm are used to retrieve cloud optical thickness (COT). The study domain covers Europe between 35°N-70°N and 5°W-30°E, centred over Belgium. The steps involved in pre-processing the EUMETSAT level 1.5 images are described: acquisition of the digital count number, radiometric conversion using offsets and slopes, estimation of radiance, and calculation of reflectance. The Sun-Earth-satellite geometry also plays an important role. A semi-analytical cloud retrieval algorithm (Kokhanovsky et al., 2003) is implemented for the estimation of COT. This approach does not involve the conventional look-up-table step, which makes the retrieval independent of numerical radiative transfer solutions. The semi-analytical algorithm is applied to a monthly dataset of SEVIRI level 1.5 images. The minimum reflectance in the visible channel at each pixel during the month is taken as the surface albedo of that pixel. In this way, the monthly variation of COT over the study domain is derived. The result is compared with the COT products of the Satellite Application Facility on Climate Monitoring (CM SAF). Finally, an approach to assimilate the COT for air quality research is presented. Address of corresponding author: Praveen Pandey, VITO- Flemish Institute for Technological Research, Boeretang 200, B 2400, Mol, Belgium E-mail: praveen.pandey@vito.be
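The pre-processing chain described (count → radiance → reflectance) follows standard radiometry. A minimal sketch; the slope, offset, and band solar irradiance values below are hypothetical placeholders, not SEVIRI calibration constants:

```python
import math

def radiance_from_counts(dc, slope, offset):
    """Radiometric conversion of a level 1.5 digital count: L = offset + slope * DC."""
    return offset + slope * dc

def toa_reflectance(radiance, band_solar_irradiance, sun_zenith_deg, sun_earth_dist_au=1.0):
    """Top-of-atmosphere reflectance: r = pi * L * d^2 / (E0 * cos(theta_sun))."""
    mu0 = math.cos(math.radians(sun_zenith_deg))
    return math.pi * radiance * sun_earth_dist_au ** 2 / (band_solar_irradiance * mu0)

# Hypothetical 0.6 um channel values: count 500, slope 0.02, offset -1.0,
# band solar irradiance 65.2 W m-2 sr-1 um-1 equivalent, sun zenith 30 deg.
L = radiance_from_counts(500, 0.02, -1.0)
r = toa_reflectance(L, 65.2, 30.0)
```

The resulting reflectance at each pixel is the input to the semi-analytical COT inversion; taking the monthly minimum of r per pixel gives the surface-albedo estimate used by the algorithm.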

  3. Framework for the quality assurance of 'omics technologies considering GLP requirements.

    PubMed

    Kauffmann, Hans-Martin; Kamp, Hennicke; Fuchs, Regine; Chorley, Brian N; Deferme, Lize; Ebbels, Timothy; Hackermüller, Jörg; Perdichizzi, Stefania; Poole, Alan; Sauer, Ursula G; Tollefsen, Knut E; Tralau, Tewes; Yauk, Carole; van Ravenzwaay, Ben

    2017-12-01

    'Omics technologies are gaining importance to support regulatory toxicity studies. Prerequisites for performing 'omics studies considering GLP principles were discussed at the European Centre for Ecotoxicology and Toxicology of Chemicals (ECETOC) workshop "Applying 'omics technologies in Chemical Risk Assessment". A GLP environment comprises a standard operating procedure system, proper pre-planning and documentation, and inspections by independent quality assurance staff. To prevent uncontrolled data changes, the raw data obtained in the respective 'omics data recording systems have to be specifically defined. Further requirements include transparent and reproducible data processing steps, and safe data storage and archiving procedures. The software for data recording and processing should be validated, and data changes should be traceable or disabled. GLP-compliant quality assurance of 'omics technologies appears feasible for many GLP requirements. However, challenges include (i) defining, storing, and archiving the raw data; (ii) transparent descriptions of data processing steps; (iii) software validation; and (iv) ensuring complete reproducibility of final results with respect to raw data. Nevertheless, 'omics studies can be supported by quality measures (e.g., GLP principles) to ensure quality control, reproducibility and traceability of experiments. This enables regulators to use 'omics data in a fit-for-purpose context, which enhances their applicability for risk assessment. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Enabling nutrient security and sustainability through systems research.

    PubMed

    Kaput, Jim; Kussmann, Martin; Mendoza, Yery; Le Coutre, Ronit; Cooper, Karen; Roulin, Anne

    2015-05-01

    Human and companion animal health depends upon the nutritional quality of foods. Seed varieties, seasonal and local growing conditions, transportation, food processing and storage, and local food customs can influence the nutrient content of food. A new and intensive area of investigation is emerging that recognizes many factors in these agri-food systems that influence the maintenance of nutrient quality, which is fundamental to ensuring nutrient security for world populations. Modeling how these systems function requires data from different sectors including agricultural, environmental, social, and economic, but must also incorporate basic nutrition and other biomedical sciences. Improving the agri-food system through advances in pre- and post-harvest processing methods, biofortification, or fortifying processed foods will aid in targeting nutrition for populations and individuals. The challenge to maintain and improve nutrient quality is magnified by the need to produce food locally and globally in a sustainable and consumer-acceptable manner for current and future populations. An unmet requirement for assessing how to improve nutrient quality, however, is the basic knowledge of how to define health. That is, health cannot be maintained or improved by altering nutrient quality without an adequate definition of what health means for individuals and populations. Defining and measuring health therefore becomes a critical objective for basic nutritional and other biomedical sciences.

  5. TH-D-204-00: The Pursuit of Radiation Oncology Performance Excellence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    The Malcolm Baldrige National Quality Improvement Act was signed into law in 1987 to advance U.S. business competitiveness and economic growth. Administered by the National Institute of Standards and Technology (NIST), the Act created the Baldrige National Quality Program, now renamed the Baldrige Performance Excellence Program. The comprehensive analytical approaches, referred to as the Baldrige Healthcare Criteria, are very well suited for the evaluation and sustainable improvement of radiation oncology management and operations. A multidisciplinary self-assessment approach is used for radiotherapy program evaluation and development in order to generate a fact-based, knowledge-driven system for improving quality of care, increasing patient satisfaction, building employee engagement, and boosting organizational innovation. The methodology also provides a valuable framework for benchmarking an individual radiation oncology practice against guidelines defined by accreditation and professional organizations and regulatory agencies. Learning Objectives: To gain knowledge of the Baldrige Performance Excellence Program as it relates to Radiation Oncology. To appreciate the value of a multidisciplinary self-assessment approach in the pursuit of Radiation Oncology quality care, patient satisfaction, and workforce commitment. To acquire a set of useful measurement tools with which an individual Radiation Oncology practice can benchmark its performance against guidelines defined by accreditation and professional organizations and regulatory agencies.

  6. TH-D-204-01: The Pursuit of Radiation Oncology Performance Excellence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sternick, E.

    The Malcolm Baldrige National Quality Improvement Act was signed into law in 1987 to advance U.S. business competitiveness and economic growth. Administered by the National Institute of Standards and Technology (NIST), the Act created the Baldrige National Quality Program, now renamed the Baldrige Performance Excellence Program. The comprehensive analytical approaches, referred to as the Baldrige Healthcare Criteria, are very well suited for the evaluation and sustainable improvement of radiation oncology management and operations. A multidisciplinary self-assessment approach is used for radiotherapy program evaluation and development in order to generate a fact-based, knowledge-driven system for improving quality of care, increasing patient satisfaction, building employee engagement, and boosting organizational innovation. The methodology also provides a valuable framework for benchmarking an individual radiation oncology practice against guidelines defined by accreditation and professional organizations and regulatory agencies. Learning Objectives: To gain knowledge of the Baldrige Performance Excellence Program as it relates to Radiation Oncology. To appreciate the value of a multidisciplinary self-assessment approach in the pursuit of Radiation Oncology quality care, patient satisfaction, and workforce commitment. To acquire a set of useful measurement tools with which an individual Radiation Oncology practice can benchmark its performance against guidelines defined by accreditation and professional organizations and regulatory agencies.

  7. Ion mobility spectrometry for food quality and safety.

    PubMed

    Vautz, W; Zimmermann, D; Hartmann, M; Baumbach, J I; Nolte, J; Jung, J

    2006-11-01

    Ion mobility spectrometry is known to be a fast and sensitive technique for the detection of trace substances, and it is increasingly in demand not only for protection against explosives and chemical warfare agents, but also for new applications in medical diagnosis and process control. Generally, a gas-phase sample is ionized with the help of ultraviolet light, β-radiation or partial discharges. The ions move in a weak electric field towards a detector. During their drift they collide with a drift gas flowing in the opposite direction and are therefore slowed down depending on their size, shape and charge. As a result, different ions reach the detector at different drift times, which are characteristic for the ions considered. The number of ions reaching the detector is a measure of the concentration of the analyte. The method enables the identification and quantification of analytes with high sensitivity (ng l(-1) range). The selectivity can be increased further - as necessary for the analysis of complex mixtures - using pre-separation techniques such as gas chromatography or multi-capillary columns. No pre-concentration of the sample is necessary. These characteristics of the method are preserved even in air at up to 100% relative humidity. 
The suitability of the method for applications in food quality and safety - including storage, process and quality control as well as the characterization of foodstuffs - has been investigated in recent years for a number of representative examples, which are summarized here together with new studies: (1) the detection of metabolites from bacteria for the identification and control of their growth; (2) process control in food production - beer fermentation being an example; (3) the detection of the metabolites of mould for process control during cheese production, for quality control of raw materials or for the control of storage conditions; (4) the quality control of packaging materials during the production of polymeric materials; and (5) the characterization of products - wine being an example. The challenges of such applications were operation in humid air, fast on-line analysis of complex mixtures, high sensitivity - detection limits have to be, for example, in the range of the odour limits - and, in some cases, the need for mobile instrumentation. It can be shown that ion mobility spectrometry is well capable of meeting these challenges for many applications.
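The drift-time physics sketched above reduces to two standard relations: the mobility K = L / (t_d · E) and its normalization to standard temperature and pressure, K0 = K · (273.15/T) · (p/760). A minimal sketch with hypothetical drift-tube parameters:

```python
def mobility(drift_length_cm, drift_time_ms, field_v_per_cm):
    """Ion mobility K = L / (t_d * E), in cm^2 V^-1 s^-1."""
    drift_time_s = drift_time_ms / 1000.0
    return drift_length_cm / (drift_time_s * field_v_per_cm)

def reduced_mobility(k, temperature_k, pressure_torr):
    """Normalize to standard conditions: K0 = K * (273.15/T) * (p/760)."""
    return k * (273.15 / temperature_k) * (pressure_torr / 760.0)

# Hypothetical drift tube: 12 cm length, 330 V/cm field, drift time 7.5 ms,
# measured at 298.15 K and 740 Torr.
k = mobility(12.0, 7.5, 330.0)
k0 = reduced_mobility(k, 298.15, 740.0)
```

Reporting K0 rather than raw drift times is what makes ion identities comparable across instruments and ambient conditions, which matters for the food-monitoring applications listed above.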

  8. Testing the hospital value proposition: an empirical analysis of efficiency and quality.

    PubMed

    Huerta, Timothy R; Ford, Eric W; Peterson, Lori T; Brigham, Keith H

    2008-01-01

    To assess the relationship between hospitals' X-inefficiency levels and overall care quality based on the National Quality Forum's 27 safe practices score, and to improve the analytic strategy for assessing X-inefficiency. The 2005 versions of the American Hospital Association and Leapfrog Group's annual surveys were the basis of the study. Additional case mix indices and market variables were drawn from the Centers for Medicare and Medicaid Services data sources and the Area Resource File. Data envelopment analysis was used to determine hospitals' X-inefficiency scores relative to their market-level competitors. Regression was used to assess the relationship between X-inefficiency and quality, controlling for organizational and market characteristics. Expenses (total and labor expenditures), case-mix-adjusted admissions, length of stay, and licensed beds defined the X-inefficiency function. The overall National Quality Forum safe practice score, health maintenance organization penetration, market share, and teaching status served as independent control variables in the regression. The National Quality Forum's safe practice scores are significantly and positively correlated with hospital X-inefficiency levels (β = .105, p ≤ .05). The analysis of the value proposition had very good explanatory power (adjusted R² = .414; p ≤ .001; df = 7, 265). Contrary to earlier findings, health maintenance organization penetration and being a teaching hospital were positively related to X-inefficiency. Consistent with others' findings, greater market share and for-profit ownership were negatively associated with X-inefficiency. Measurement of overall hospital quality is improving but can still be made better. Nevertheless, the National Quality Forum's measure is significantly related to efficiency and could be used to create differential pay-for-performance programs. 
A market-segmented analytic strategy for studying hospitals' efficiency yields results with a high degree of explanatory power.
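
    The direction of the reported quality-inefficiency association can be illustrated with a toy regression. A minimal sketch in Python using only the standard library, with simulated data and a single predictor; the coefficients and sample are illustrative, not the study's:

    ```python
    import random
    import statistics

    random.seed(1)

    # Simulated hospital data: X-inefficiency vs. overall safe-practices score
    # (0-27). A positive slope mirrors the direction of the reported association.
    quality = [random.uniform(0, 27) for _ in range(150)]
    inefficiency = [0.004 * q + 0.5 + random.gauss(0, 0.05) for q in quality]

    # Simple least-squares fit; the recovered slope is close to the true 0.004.
    slope, intercept = statistics.linear_regression(quality, inefficiency)
    print(f"slope={slope:.4f}, intercept={intercept:.3f}")
    ```

    The study itself regressed DEA-derived X-inefficiency scores on the quality score plus controls (HMO penetration, market share, ownership, teaching status); a multiple-regression version would add those as further columns.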

  9. Pre-trial quality assurance processes for an intensity-modulated radiation therapy (IMRT) trial: PARSPORT, a UK multicentre Phase III trial comparing conventional radiotherapy and parotid-sparing IMRT for locally advanced head and neck cancer.

    PubMed

    Clark, C H; Miles, E A; Urbano, M T Guerrero; Bhide, S A; Bidmead, A M; Harrington, K J; Nutting, C M

    2009-07-01

    The purpose of this study was to compare conventional radiotherapy with parotid gland-sparing intensity-modulated radiation therapy (IMRT) using the PARSPORT trial. The validity of such a trial depends on the radiotherapy planning and delivery meeting a defined standard across all centres. At the outset, many of the centres had little or no experience of delivering IMRT; therefore, quality assurance processes were devised to ensure consistency and standardisation of all processes for comparison within the trial. The pre-trial quality assurance (QA) programme and results are described. Each centre undertook exercises in target volume definition and treatment planning, completed a resource questionnaire and produced a process document. Additionally, the QA team visited each participating centre. Each exercise had to be accepted before patients could be recruited into the trial. 10 centres successfully completed the quality assurance exercises. A range of treatment planning systems, linear accelerators and delivery methods were used for the planning exercises, and all the plans created reached the standard required for participation in this multicentre trial. All 10 participating centres achieved implementation of a comprehensive and robust IMRT programme for treatment of head and neck cancer.

  10. Effects of Functional Analytic Psychotherapy Therapist Training on Therapist Factors Among Therapist Trainees in Singapore: A Randomized Controlled Trial.

    PubMed

    Keng, Shian-Ling; Waddington, Emma; Lin, Xiangting Bernice; Tan, Michelle Su Qing; Henn-Haase, Clare; Kanter, Jonathan W

    2017-07-01

    Functional Analytic Psychotherapy (FAP) is a behavioral psychotherapy intervention that emphasizes the development of an intimate and intense therapeutic relationship as the vehicle of therapeutic change. Recently, research has provided preliminary support for a FAP therapist training (FAPTT) protocol in enhancing FAP competency. The present study aimed to expand on this research by examining the effects of FAPTT on FAP-specific skills and competencies and a set of broadly desirable therapist qualities (labelled awareness, courage and love in FAPTT) in a sample of therapist trainees in Singapore. The study also evaluated the feasibility and acceptability of FAP in the Singaporean context. Twenty-five students enrolled in a master's in clinical psychology program were recruited and randomly assigned to receive either eight weekly sessions of a FAPTT course or to a waitlist condition. All participants completed measures assessing empathy, compassionate love, trait mindfulness, authenticity and FAP-specific skills and competencies pre- and post-training, and at 2-month follow-up. A post-course evaluation was administered to obtain participants' qualitative feedback. Results indicated that compared with the waitlisted group, FAPTT participants reported significant increases in overall empathy, FAP skill and treatment acceptability from pre- to post-training. Improvements were observed on several outcome variables at 2-month follow-up. Participants reported finding the training to be both feasible and acceptable, although several raised issues related to the compatibility of the treatment with the local cultural context. Overall, the findings suggest that FAPTT is effective for improving specific FAP competencies and selected broadly desirable therapist qualities among therapist trainees. Copyright © 2016 John Wiley & Sons, Ltd. 
Functional Analytic Psychotherapy (FAP) therapist training was effective in improving empathy and FAP skills among Singaporean therapist trainees. These improvements were maintained at 2-month follow-up. The training was found to be acceptable in the Singaporean context, although several adaptations were suggested to increase the compatibility between FAP principles and local cultural norms.

  11. A Harmonized Data Quality Assessment Terminology and Framework for the Secondary Use of Electronic Health Record Data

    PubMed Central

    Kahn, Michael G.; Callahan, Tiffany J.; Barnard, Juliana; Bauck, Alan E.; Brown, Jeff; Davidson, Bruce N.; Estiri, Hossein; Goerg, Carsten; Holve, Erin; Johnson, Steven G.; Liaw, Siaw-Teng; Hamilton-Lopez, Marianne; Meeker, Daniella; Ong, Toan C.; Ryan, Patrick; Shang, Ning; Weiskopf, Nicole G.; Weng, Chunhua; Zozus, Meredith N.; Schilling, Lisa

    2016-01-01

    Objective: Harmonized data quality (DQ) assessment terms, methods, and reporting practices can establish a common understanding of the strengths and limitations of electronic health record (EHR) data for operational analytics, quality improvement, and research. Existing published DQ terms were harmonized to a comprehensive unified terminology with definitions and examples and organized into a conceptual framework to support a common approach to defining whether EHR data is ‘fit’ for specific uses. Materials and Methods: DQ publications, informatics and analytics experts, managers of established DQ programs, and operational manuals from several mature EHR-based research networks were reviewed to identify potential DQ terms and categories. Two face-to-face stakeholder meetings were used to vet an initial set of DQ terms and definitions that were grouped into an overall conceptual framework. Feedback received from data producers and users was used to construct a draft set of harmonized DQ terms and categories. Multiple rounds of iterative refinement resulted in a set of terms and organizing framework consisting of DQ categories, subcategories, terms, definitions, and examples. The harmonized terminology and logical framework’s inclusiveness was evaluated against ten published DQ terminologies. Results: Existing DQ terms were harmonized and organized into a framework by defining three DQ categories: (1) Conformance (2) Completeness and (3) Plausibility and two DQ assessment contexts: (1) Verification and (2) Validation. Conformance and Plausibility categories were further divided into subcategories. Each category and subcategory was defined with respect to whether the data may be verified with organizational data, or validated against an accepted gold standard, depending on proposed context and uses. The coverage of the harmonized DQ terminology was validated by successfully aligning to multiple published DQ terminologies. 
Discussion: Existing DQ concepts, community input, and expert review informed the development of a distinct set of terms, organized into categories and subcategories. The resulting DQ terms successfully encompassed a wide range of disparate DQ terminologies. Operational definitions were developed to provide guidance for implementing DQ assessment procedures. The resulting structure is an inclusive DQ framework for standardizing DQ assessment and reporting. While our analysis focused on the DQ issues often found in EHR data, the new terminology may be applicable to a wide range of electronic health data such as administrative, research, and patient-reported data. Conclusion: A consistent, common DQ terminology, organized into a logical framework, is an initial step in enabling data owners and users, patients, and policy makers to evaluate and communicate data quality findings in a well-defined manner with a shared vocabulary. Future work will leverage the framework and terminology to develop reusable data quality assessment and reporting methods. PMID:27713905

  12. A Harmonized Data Quality Assessment Terminology and Framework for the Secondary Use of Electronic Health Record Data.

    PubMed

    Kahn, Michael G; Callahan, Tiffany J; Barnard, Juliana; Bauck, Alan E; Brown, Jeff; Davidson, Bruce N; Estiri, Hossein; Goerg, Carsten; Holve, Erin; Johnson, Steven G; Liaw, Siaw-Teng; Hamilton-Lopez, Marianne; Meeker, Daniella; Ong, Toan C; Ryan, Patrick; Shang, Ning; Weiskopf, Nicole G; Weng, Chunhua; Zozus, Meredith N; Schilling, Lisa

    2016-01-01

    Harmonized data quality (DQ) assessment terms, methods, and reporting practices can establish a common understanding of the strengths and limitations of electronic health record (EHR) data for operational analytics, quality improvement, and research. Existing published DQ terms were harmonized to a comprehensive unified terminology with definitions and examples and organized into a conceptual framework to support a common approach to defining whether EHR data is 'fit' for specific uses. DQ publications, informatics and analytics experts, managers of established DQ programs, and operational manuals from several mature EHR-based research networks were reviewed to identify potential DQ terms and categories. Two face-to-face stakeholder meetings were used to vet an initial set of DQ terms and definitions that were grouped into an overall conceptual framework. Feedback received from data producers and users was used to construct a draft set of harmonized DQ terms and categories. Multiple rounds of iterative refinement resulted in a set of terms and organizing framework consisting of DQ categories, subcategories, terms, definitions, and examples. The harmonized terminology and logical framework's inclusiveness was evaluated against ten published DQ terminologies. Existing DQ terms were harmonized and organized into a framework by defining three DQ categories: (1) Conformance (2) Completeness and (3) Plausibility and two DQ assessment contexts: (1) Verification and (2) Validation. Conformance and Plausibility categories were further divided into subcategories. Each category and subcategory was defined with respect to whether the data may be verified with organizational data, or validated against an accepted gold standard, depending on proposed context and uses. The coverage of the harmonized DQ terminology was validated by successfully aligning to multiple published DQ terminologies. 
Existing DQ concepts, community input, and expert review informed the development of a distinct set of terms, organized into categories and subcategories. The resulting DQ terms successfully encompassed a wide range of disparate DQ terminologies. Operational definitions were developed to provide guidance for implementing DQ assessment procedures. The resulting structure is an inclusive DQ framework for standardizing DQ assessment and reporting. While our analysis focused on the DQ issues often found in EHR data, the new terminology may be applicable to a wide range of electronic health data such as administrative, research, and patient-reported data. A consistent, common DQ terminology, organized into a logical framework, is an initial step in enabling data owners and users, patients, and policy makers to evaluate and communicate data quality findings in a well-defined manner with a shared vocabulary. Future work will leverage the framework and terminology to develop reusable data quality assessment and reporting methods.
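
    The three DQ categories lend themselves to simple automated checks. A minimal sketch in Python; the field names, value sets, and plausibility ranges are hypothetical illustrations, not part of the published framework:

    ```python
    from datetime import date

    # Hypothetical EHR records; fields and valid ranges are illustrative only.
    records = [
        {"patient_id": "P001", "sex": "F", "birth_date": date(1980, 5, 1), "heart_rate": 72},
        {"patient_id": "P002", "sex": "X", "birth_date": date(1975, 3, 9), "heart_rate": None},
        {"patient_id": "P003", "sex": "M", "birth_date": date(2050, 1, 1), "heart_rate": 400},
    ]

    def conformance(rec):
        """Value conforms to the expected format/value set."""
        return rec["sex"] in {"M", "F"}

    def completeness(rec):
        """Required fields are present and non-null."""
        return all(rec.get(k) is not None for k in ("patient_id", "birth_date", "heart_rate"))

    def plausibility(rec):
        """Values are believable given real-world constraints."""
        ok_date = rec["birth_date"] <= date.today()
        ok_hr = rec["heart_rate"] is None or 20 <= rec["heart_rate"] <= 250
        return ok_date and ok_hr

    # Count records passing each category (verification against internal rules).
    report = {
        "conformance": sum(conformance(r) for r in records),
        "completeness": sum(completeness(r) for r in records),
        "plausibility": sum(plausibility(r) for r in records),
    }
    print(report)  # → {'conformance': 2, 'completeness': 2, 'plausibility': 2}
    ```

    In the framework's terms, checks against internal rules like these are Verification; comparing the same fields against an external gold standard would be Validation.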

  13. Heparin removal by ecteola-cellulose pre-treatment enables the use of plasma samples for accurate measurement of anti-Yellow fever virus neutralizing antibodies.

    PubMed

    Campi-Azevedo, Ana Carolina; Peruhype-Magalhães, Vanessa; Coelho-Dos-Reis, Jordana Grazziela; Costa-Pereira, Christiane; Yamamura, Anna Yoshida; Lima, Sheila Maria Barbosa de; Simões, Marisol; Campos, Fernanda Magalhães Freire; de Castro Zacche Tonini, Aline; Lemos, Elenice Moreira; Brum, Ricardo Cristiano; de Noronha, Tatiana Guimarães; Freire, Marcos Silva; Maia, Maria de Lourdes Sousa; Camacho, Luiz Antônio Bastos; Rios, Maria; Chancey, Caren; Romano, Alessandro; Domingues, Carla Magda; Teixeira-Carvalho, Andréa; Martins-Filho, Olindo Assis

    2017-09-01

    Technological innovations in vaccinology have recently contributed to bring about novel insights for the vaccine-induced immune response. While the current protocols that use peripheral blood samples may provide abundant data, a range of distinct components of whole blood samples are required and the different anticoagulant systems employed may impair some properties of the biological sample and interfere with functional assays. Although the interference of heparin in functional assays for viral neutralizing antibodies such as the functional plaque-reduction neutralization test (PRNT), considered the gold-standard method to assess and monitor the protective immunity induced by the Yellow fever virus (YFV) vaccine, has been well characterized, the development of pre-analytical treatments is still required for the establishment of optimized protocols. The present study intended to optimize and evaluate the performance of pre-analytical treatment of heparin-collected blood samples with ecteola-cellulose (ECT) to provide accurate measurement of anti-YFV neutralizing antibodies, by PRNT. The study was designed in three steps, including: I. Problem statement; II. Pre-analytical steps; III. Analytical steps. Data confirmed the interference of heparin on PRNT reactivity in a dose-responsive fashion. Distinct sets of conditions for ECT pre-treatment were tested to optimize the heparin removal. The optimized protocol was pre-validated to determine the effectiveness of heparin plasma:ECT treatment to restore the PRNT titers as compared to serum samples. The validation and comparative performance was carried out by using a large range of serum vs heparin plasma:ECT 1:2 paired samples obtained from unvaccinated and 17DD-YFV primary vaccinated subjects. Altogether, the findings support the use of heparin plasma:ECT samples for accurate measurement of anti-YFV neutralizing antibodies. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Signal Enhancement in HPLC/Micro-Coil NMR Using Automated Column Trapping

    PubMed Central

    Djukovic, Danijel; Liu, Shuhui; Henry, Ian; Tobias, Brian; Raftery, Daniel

    2008-01-01

A new HPLC-NMR system is described that performs analytical separation, pre-concentration, and NMR spectroscopy in rapid succession. The central component of our method is the online pre-concentration sequence that improves the match between post-column analyte peak volume and the micro-coil NMR detection volume. Separated samples are collected onto a C18 guard column with a mobile phase composed of 90% D2O/10% acetonitrile-D3, and back-flushed to the NMR micro-coil probe with 90% acetonitrile-D3/10% D2O. In order to assess the performance of our unit, we separated a standard mixture of 1 mM ibuprofen, naproxen, and phenylbutazone using a commercially available C18 analytical column. The S/N measurements from the NMR acquisitions indicated that we achieved signal enhancement factors up to 10.4 (±1.2)-fold. Furthermore, we observed that pre-concentration factors increased as the injected amount of analyte decreased. The highest concentration enrichment of 14.7 (±2.2)-fold was attained by injecting a 100 μL solution of 0.2 mM (~4 μg) ibuprofen. PMID:17037915

  15. Irrigation water quality in southern Mexico City based on bacterial and heavy metal analyses

    NASA Astrophysics Data System (ADS)

    Solís, C.; Sandoval, J.; Pérez-Vega, H.; Mazari-Hiriart, M.

    2006-08-01

Xochimilco is located in southern Mexico City and preserves a remnant of the pre-Columbian farming system, the "chinampa" agriculture. "Chinampas" are island plots surrounded by a canal network. At present the area is densely urbanized and populated, with various contaminant sources contributing to the degradation of water quality. The canal system is recharged by a combination of treated and untreated wastewater, and by precipitation during the rainy season. Over 40 agricultural species, including vegetables, cereals and flowers, are produced in the "chinampas". In order to characterize the quality of Xochimilco's water used for irrigation, spatial and temporal contaminant indicators such as microorganisms and heavy metals were investigated. Bacterial indicators (fecal coliforms, fecal enterococci) were analyzed by standard analytical procedures, and heavy metals (such as Fe, Cu, Zn and Pb) were analyzed by particle induced X-ray emission (PIXE). The most contaminated sites coincide with the heavily populated areas. Seasonal variation of contaminants was observed, with higher bacterial counts and heavy metal concentrations reported during the rainy season.

  16. Some Comments on Mapping from Disease-Specific to Generic Health-Related Quality-of-Life Scales

    PubMed Central

    Palta, Mari

    2013-01-01

    An article by Lu et al. in this issue of Value in Health addresses the mapping of treatment or group differences in disease-specific measures (DSMs) of health-related quality of life onto differences in generic health-related quality-of-life scores, with special emphasis on how the mapping is affected by the reliability of the DSM. In the proposed mapping, a factor analytic model defines a conversion factor between the scores as the ratio of factor loadings. Hence, the mapping applies to convert true underlying scales and has desirable properties facilitating the alignment of instruments and understanding their relationship in a coherent manner. It is important to note, however, that when DSM means or differences in mean DSMs are estimated, their mapping is still of a measurement error–prone predictor, and the correct conversion coefficient is the true mapping multiplied by the reliability of the DSM in the relevant sample. In addition, the proposed strategy for estimating the factor analytic mapping in practice requires assumptions that may not hold. We discuss these assumptions and how they may be the reason we obtain disparate estimates of the mapping factor in an application of the proposed methods to groups of patients. PMID:23337233

  17. A Lean Six Sigma approach to the improvement of the selenium analysis method.

    PubMed

    Cloete, Bronwyn C; Bester, André

    2012-11-02

Reliable results are the ultimate measure of an analytical laboratory's quality, and variability is therefore considered a critical quality problem associated with the selenium analysis method executed at the Western Cape Provincial Veterinary Laboratory (WCPVL). The elimination and control of variability is of significant importance because of the narrow margin of safety between toxic and deficient doses of the trace element for good animal health. A quality methodology known as Lean Six Sigma was believed to present the most feasible solution for overcoming the adverse effect of variation, through steps towards analytical process improvement. Lean Six Sigma is a systematic, data-driven and fact-based methodology that combines empirical, inductive and deductive reasoning. It comprises five macro-phases, namely Define, Measure, Analyse, Improve and Control (DMAIC). Both qualitative and quantitative laboratory data were collected in terms of these phases. Qualitative data were collected by using quality tools, namely an Ishikawa diagram, a Pareto chart, Kaizen analysis and a Failure Mode and Effect Analysis tool. Quantitative laboratory data, based on the analytical chemistry test method, were collected through a controlled experiment. The controlled experiment entailed 13 replicated runs of the selenium test method, whereby 11 samples were repetitively analysed, whilst Certified Reference Material (CRM) was also included in 6 of the runs. Laboratory results obtained from the controlled experiment were analysed using statistical methods commonly associated with quality validation of chemistry procedures. Analysis of both sets of data yielded an improved selenium analysis method, believed to provide greater reliability of results, in addition to a greatly reduced cycle time and superior control features.
Lean Six Sigma may therefore be regarded as a valuable tool in any laboratory, and represents both a management discipline, and a standardised approach to problem solving and process optimisation.
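
    The Pareto analysis used in the Measure/Analyse phases reduces to a frequency tally with cumulative percentages, from which the "vital few" causes are picked. A minimal sketch in Python; the non-conformity causes below are hypothetical, not taken from the WCPVL study:

    ```python
    from collections import Counter

    # Hypothetical tally of non-conformity causes logged during replicate runs.
    causes = ["dilution error", "reagent lot", "dilution error", "instrument drift",
              "dilution error", "reagent lot", "pipette calibration", "dilution error"]

    def pareto(observations):
        """Rank causes by frequency with cumulative percentage of all events."""
        counts = Counter(observations).most_common()
        total = sum(n for _, n in counts)
        cumulative, rows = 0, []
        for cause, n in counts:
            cumulative += n
            rows.append((cause, n, round(100 * cumulative / total, 1)))
        return rows

    for cause, n, cum in pareto(causes):
        print(f"{cause:20s} {n:2d}  {cum:5.1f}%")
    # "dilution error" alone accounts for 50% of events here.
    ```

    A Pareto chart is simply this table plotted as bars (counts) with a cumulative-percentage line overlaid.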

  18. Pre-concentration technique for reduction in "Analytical instrument requirement and analysis"

    NASA Astrophysics Data System (ADS)

    Pal, Sangita; Singha, Mousumi; Meena, Sher Singh

    2018-04-01

Limited availability of analytical instruments for the methodical detection of known and unknown effluents is a serious hindrance to qualification and quantification. Instruments such as elemental analyzers, ICP-MS, ICP-AES, EDXRF, ion chromatography and electro-analytical systems are not only expensive and time consuming but also require maintenance and replacement of damaged essential parts, which is a serious concern. Moreover, for field studies and on-the-spot detection, installation of these instruments is not convenient at every site. A pre-concentration technique for metal ions, especially for lean streams, is therefore elaborated and justified. Chelation/sequestration is the key to this immobilization technique, which is demonstrated to be simple, user friendly, highly effective, inexpensive, time efficient, and easy to carry (a 10 g - 20 g vial) to the experimental field or site.

  19. A disease management programme for patients with diabetes mellitus is associated with improved quality of care within existing budgets.

    PubMed

    Steuten, L M G; Vrijhoef, H J M; Landewé-Cleuren, S; Schaper, N; Van Merode, G G; Spreeuwenberg, C

    2007-10-01

To assess the impact of a disease management programme for patients with diabetes mellitus (Type 1 and Type 2) on cost-effectiveness, quality of life and patient self-management. By organizing care in accordance with the principles of disease management, the aim is to increase quality of care within existing budgets. Single-group, pre-post design with 2-year follow-up in 473 patients. Substantial, significant improvements in glycaemic control, health-related quality of life (HRQL) and patient self-management were found. No significant changes were detected in total costs of care. The probability that the disease management programme is cost-effective compared with usual care amounts to 74%, expressed as an average saving of 117 per additional life year at 5% improved HRQL. Introduction of a disease management programme for patients with diabetes is associated with improved intermediate outcomes within existing budgets. Further research should focus on long-term cost-effectiveness, including diabetic complications and mortality, in a controlled setting or by using decision-analytic modelling techniques.

  20. Does Elite Sport Degrade Sleep Quality? A Systematic Review.

    PubMed

    Gupta, Luke; Morgan, Kevin; Gilchrist, Sarah

    2017-07-01

    Information on sleep quality and insomnia symptomatology among elite athletes remains poorly systematised in the sports science and medicine literature. The extent to which performance in elite sport represents a risk for chronic insomnia is unknown. The purpose of this systematic review was to profile the objective and experienced characteristics of sleep among elite athletes, and to consider relationships between elite sport and insomnia symptomatology. Studies relating to sleep involving participants described on a pre-defined continuum of 'eliteness' were located through a systematic search of four research databases: SPORTDiscus, PubMed, Science Direct and Google Scholar, up to April 2016. Once extracted, studies were categorised as (1) those mainly describing sleep structure/patterns, (2) those mainly describing sleep quality and insomnia symptomatology and (3) those exploring associations between aspects of elite sport and sleep outcomes. The search returned 1676 records. Following screening against set criteria, a total of 37 studies were identified. The quality of evidence reviewed was generally low. Pooled sleep quality data revealed high levels of sleep complaints in elite athletes. Three risk factors for sleep disturbance were broadly identified: (1) training, (2) travel and (3) competition. While acknowledging the limited number of high-quality evidence reviewed, athletes show a high overall prevalence of insomnia symptoms characterised by longer sleep latencies, greater sleep fragmentation, non-restorative sleep, and excessive daytime fatigue. These symptoms show marked inter-sport differences. Two underlying mechanisms are implicated in the mediation of sport-related insomnia symptoms: pre-sleep cognitive arousal and sleep restriction.

  1. Adaptation of commercial biomarker kits and proposal for 'drug development kits' to support bioanalysis: call for action.

    PubMed

    Islam, Rafiqul; Kar, Sumit; Islam, Clarinda; Farmen, Raymond

    2018-06-01

There has been an increased use of commercial kits for biomarker measurement, commensurate with the increased demand for biomarkers in drug development. However, in most cases these kits do not meet the quality attributes required for use in a regulated environment. The process of adapting these kits can be frustrating, time consuming and resource intensive. In addition, a lack of harmonized guidance for the validation of biomarkers poses a significant challenge in the adaptation of kits in a regulated environment. The purpose of this perspective is to propose a tiered approach to commercial drug development kits with clearly defined quality attributes and to demonstrate how these kits can be adapted to perform analytical validation in a regulated environment.

  2. Big data analytics in immunology: a knowledge-based approach.

    PubMed

    Zhang, Guang Lan; Sun, Jing; Chitkushev, Lou; Brusic, Vladimir

    2014-01-01

    With the vast amount of immunological data available, immunology research is entering the big data era. These data vary in granularity, quality, and complexity and are stored in various formats, including publications, technical reports, and databases. The challenge is to make the transition from data to actionable knowledge and wisdom and bridge the knowledge gap and application gap. We report a knowledge-based approach based on a framework called KB-builder that facilitates data mining by enabling fast development and deployment of web-accessible immunological data knowledge warehouses. Immunological knowledge discovery relies heavily on both the availability of accurate, up-to-date, and well-organized data and the proper analytics tools. We propose the use of knowledge-based approaches by developing knowledgebases combining well-annotated data with specialized analytical tools and integrating them into analytical workflow. A set of well-defined workflow types with rich summarization and visualization capacity facilitates the transformation from data to critical information and knowledge. By using KB-builder, we enabled streamlining of normally time-consuming processes of database development. The knowledgebases built using KB-builder will speed up rational vaccine design by providing accurate and well-annotated data coupled with tailored computational analysis tools and workflow.

  3. Quality-assurance results for routine water analysis in US Geological Survey laboratories, water year 1991

    USGS Publications Warehouse

    Maloney, T.J.; Ludtke, A.S.; Krizman, T.L.

    1994-01-01

The U.S. Geological Survey operates a quality-assurance program based on the analyses of reference samples for the National Water Quality Laboratory in Arvada, Colorado, and the Quality of Water Service Unit in Ocala, Florida. Reference samples containing selected inorganic, nutrient, and low ionic-strength constituents are prepared and disguised as routine samples. The program goal is to determine precision and bias for as many analytical methods offered by the participating laboratories as possible. The samples typically are submitted at a rate of approximately 5 percent of the annual environmental sample load for each constituent. The samples are distributed to the laboratories throughout the year. Analytical data for these reference samples reflect the quality of environmental sample data produced by the laboratories because the samples are processed in the same manner for all steps from sample login through data release. The results are stored permanently in the National Water Data Storage and Retrieval System. During water year 1991, 86 analytical procedures were evaluated at the National Water Quality Laboratory and 37 analytical procedures were evaluated at the Quality of Water Service Unit. An overall evaluation of the inorganic (major ion and trace metal) constituent data for water year 1991 indicated analytical imprecision in the National Water Quality Laboratory for 5 of 67 analytical procedures: aluminum (whole-water recoverable, atomic emission spectrometric, direct-current plasma); calcium (atomic emission spectrometric, direct); fluoride (ion-exchange chromatographic); iron (whole-water recoverable, atomic absorption spectrometric, direct); and sulfate (ion-exchange chromatographic). The results for 11 of 67 analytical procedures had positive or negative bias during water year 1991.
Analytical imprecision was indicated in the determination of two of the five National Water Quality Laboratory nutrient constituents: orthophosphate as phosphorus and phosphorus. A negative or positive bias condition was indicated in three of five nutrient constituents. There was acceptable precision and no indication of bias for the 14 low ionic-strength analytical procedures tested in the National Water Quality Laboratory program and for the 32 inorganic and 5 nutrient analytical procedures tested in the Quality of Water Service Unit during water year 1991.
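
    Judging a procedure by the precision and bias of disguised reference-sample results reduces to two summary statistics per constituent. A minimal sketch in Python; the measured values, reference value, and acceptance limits are hypothetical, not USGS figures:

    ```python
    import statistics

    def evaluate_procedure(measured, reference_value, rsd_limit=10.0, bias_limit=5.0):
        """Summarise precision (relative standard deviation, %) and bias
        (mean deviation from the reference value, %) for one constituent."""
        mu = statistics.mean(measured)
        rsd = 100 * statistics.stdev(measured) / mu
        bias = 100 * (mu - reference_value) / reference_value
        return {
            "rsd_percent": round(rsd, 2),
            "bias_percent": round(bias, 2),
            "imprecise": rsd > rsd_limit,
            "biased": abs(bias) > bias_limit,
        }

    # Hypothetical sulfate results (mg/L) for a reference sample whose
    # known value is 50.0 mg/L, spread across the water year.
    result = evaluate_procedure([48.9, 51.2, 49.5, 50.8, 49.1, 50.4],
                                reference_value=50.0)
    print(result)  # this procedure passes both the precision and bias checks
    ```

    The actual program's acceptance criteria vary by constituent and concentration; the fixed 10% RSD and 5% bias thresholds above are purely illustrative.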

  4. Development of an ultra high performance liquid chromatography method for determining triamcinolone acetonide in hydrogels using the design of experiments/design space strategy in combination with process capability index.

    PubMed

    Oliva, Alexis; Monzón, Cecilia; Santoveña, Ana; Fariña, José B; Llabrés, Matías

    2016-07-01

An ultra high performance liquid chromatography method was developed and validated for the quantitation of triamcinolone acetonide in an injectable ophthalmic hydrogel, in order to determine the contribution of analytical method error to the content uniformity measurement. During the development phase, the design of experiments/design space strategy was used. For this, the free R program was used as an alternative to commercial software, providing a fast, efficient tool for data analysis. The process capability index was used to find the permitted level of variation for each factor and to define the design space. All these aspects were analyzed and discussed under different experimental conditions by the Monte Carlo simulation method. Second, a pre-study validation procedure was performed in accordance with the International Conference on Harmonization guidelines. The validated method was applied to the determination of uniformity of dosage units, and the reasons for variability (inhomogeneity and analytical method error) were analyzed based on the overall uncertainty. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
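
    The process capability index referred to above has a standard closed form: Cpk = min(USL − μ, μ − LSL) / 3σ, where LSL/USL are the specification limits. A minimal sketch in Python; the assay values and the 95-105% of label claim limits are hypothetical, not the paper's data:

    ```python
    import statistics

    def cpk(values, lsl, usl):
        """Process capability index: distance from the mean to the nearer
        specification limit, in units of three standard deviations."""
        mu = statistics.mean(values)
        sigma = statistics.stdev(values)  # sample standard deviation
        return min(usl - mu, mu - lsl) / (3 * sigma)

    # Hypothetical assay results, % of label claim; spec limits 95-105%.
    assays = [99.2, 100.1, 100.8, 99.6, 100.3, 99.9, 100.5, 99.4]
    print(round(cpk(assays, lsl=95.0, usl=105.0), 2))
    ```

    A Cpk well above 1.33 is conventionally read as a capable process; values below 1 indicate that normal variation alone will produce out-of-specification results.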

  5. Engineering fluidic delays in paper-based devices using laser direct-writing.

    PubMed

    He, P J W; Katis, I N; Eason, R W; Sones, C L

    2015-10-21

    We report the use of a new laser-based direct-write technique that allows programmable and timed fluid delivery in channels within a paper substrate which enables implementation of multi-step analytical assays. The technique is based on laser-induced photo-polymerisation, and through adjustment of the laser writing parameters such as the laser power and scan speed we can control the depth and/or the porosity of hydrophobic barriers which, when fabricated in the fluid path, produce controllable fluid delay. We have patterned these flow delaying barriers at pre-defined locations in the fluidic channels using either a continuous wave laser at 405 nm, or a pulsed laser operating at 266 nm. Using this delay patterning protocol we generated flow delays spanning from a few minutes to over half an hour. Since the channels and flow delay barriers can be written via a common laser-writing process, this is a distinct improvement over other methods that require specialist operating environments, or custom-designed equipment. This technique can therefore be used for rapid fabrication of paper-based microfluidic devices that can perform single or multistep analytical assays.

  6. National survey on the pre-analytical variability in a representative cohort of Italian laboratories.

    PubMed

    Lippi, Giuseppe; Montagnana, Martina; Giavarina, Davide

    2006-01-01

    Owing to remarkable advances in automation, laboratory technology and informatics, the pre-analytical phase has become the major source of variability in laboratory testing. The present survey investigated the development of several pre-analytical processes within a representative cohort of Italian clinical laboratories. A seven-point questionnaire was designed to investigate the following issues: 1a) the mean outpatient waiting time before check-in and 1b) the mean time from check-in to sample collection; 2) the mean time from sample collection to analysis; 3) the type of specimen collected for clinical chemistry testing; 4) the degree of pre-analytical automation; 5a) the number of samples shipped to other laboratories and 5b) the availability of standardised protocols for transportation; 6) the conditions for specimen storage; and 7) the availability and type of guidelines for management of unsuitable specimens. The questionnaire was administered to 150 laboratory specialists attending the SIMEL (Italian Society of Laboratory Medicine) National Meeting in June 2006. A total of 107 questionnaires (71.3%) were returned. Data analysis revealed a high degree of variability among laboratories in the time required for check-in, outpatient sampling, sample transportation to the referral laboratory and analysis upon arrival. Only 31% of laboratories have automated some pre-analytical steps. Of the 87% of laboratories that ship specimens to other facilities without sample preparation, 19% have no standardised protocol for transportation. For conventional clinical chemistry testing, 74% of the laboratories use serum evacuated tubes (59% with and 15% without serum separator), whereas the remaining 26% use lithium-heparin evacuated tubes (11% with and 15% without plasma separator). The storage period and conditions for rerun/retest vary widely. 
Only 63% of laboratories have a codified procedure for the management of unsuitable specimens, which are recognised by visual inspection (69%) or automatic detection (29%), and only 56% have standardised procedures for handling such specimens, which vary widely on a local basis. The survey highlights broad heterogeneity in several pre-analytical processes among Italian laboratories. The lack of reliable guidelines encompassing evidence-based practice is a major problem for the standardisation of this crucial part of the testing process and represents a major challenge for laboratory medicine in the 2000s.

  7. Quality assurance in the pre-analytical phase of human urine samples by (1)H NMR spectroscopy.

    PubMed

    Budde, Kathrin; Gök, Ömer-Necmi; Pietzner, Maik; Meisinger, Christine; Leitzmann, Michael; Nauck, Matthias; Köttgen, Anna; Friedrich, Nele

    2016-01-01

    Metabolomic approaches investigate changes in metabolite profiles, which may reflect changes in metabolic pathways and provide information correlated with a specific biological process or pathophysiology. High-resolution (1)H NMR spectroscopy is used to identify metabolites in biofluids and tissue samples qualitatively and quantitatively. This pre-analytical study evaluated the effects of storage time and temperature on (1)H NMR spectra from human urine in two settings: first, to evaluate short-term effects likely due to acute delays in sample handling, and second, to evaluate the effect of prolonged storage of up to one month in order to find markers of sample mishandling. A number of statistical procedures were used to assess the differences between samples stored under different conditions, including Projection to Latent Structures Discriminant Analysis (PLS-DA), non-parametric testing, and mixed-effect linear regression analysis. The results indicate that human urine samples can be stored at 10 °C for 24 h or at -80 °C for 1 month, as no relevant changes in (1)H NMR fingerprints were observed under these time periods and temperature conditions. However, some metabolites, most likely of microbial origin, showed alterations during prolonged storage, but without facilitating classification. In conclusion, the presented protocol for urine sample handling and semi-automatic metabolite quantification is suitable for large-scale epidemiological studies. Copyright © 2015 Elsevier Inc. All rights reserved.
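
    As a rough illustration of the PLS-DA step mentioned above, the following sketch implements a minimal one-component PLS-DA (the NIPALS first component) in plain Python; the toy data, class sizes, and storage-related metabolite shift are invented for illustration and are not the study's measurements:

```python
import math
import random

def plsda_scores(X, y):
    """One-component PLS-DA (NIPALS first component): the weight vector is
    the covariance direction between the centred data X and class labels y;
    the returned latent scores t = Xc @ w separate the classes."""
    n, p = len(X), len(X[0])
    xm = [sum(row[j] for row in X) / n for j in range(p)]
    ym = sum(y) / n
    Xc = [[row[j] - xm[j] for j in range(p)] for row in X]
    yc = [v - ym for v in y]
    w = [sum(Xc[i][j] * yc[i] for i in range(n)) for j in range(p)]
    norm = math.sqrt(sum(v * v for v in w))
    w = [v / norm for v in w]
    return [sum(Xc[i][j] * w[j] for j in range(p)) for i in range(n)]

# Toy data (assumed): 20 baseline vs 20 stored urine profiles across
# 5 metabolites, with metabolite 0 shifted by storage.
random.seed(1)
X = [[random.gauss(0, 1) for _ in range(5)] for _ in range(40)]
y = [0.0] * 20 + [1.0] * 20
for i in range(20, 40):
    X[i][0] += 2.0

t = plsda_scores(X, y)
sep = sum(t[20:]) / 20 - sum(t[:20]) / 20  # class separation on the score
print(round(sep, 2))
```

    A clear gap between the two groups' latent scores is the kind of storage-condition separation the study tested for; in their data no relevant separation appeared within the recommended storage limits.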

  8. Need for gender-specific pre-analytical testing: the dark side of the moon in laboratory testing.

    PubMed

    Franconi, Flavia; Rosano, Giuseppe; Campesi, Ilaria

    2015-01-20

    Many international organisations encourage studies from a sex-gender perspective. However, research with a gender perspective presents a high degree of complexity, and the inclusion of the sex-gender variable in experiments raises many methodological questions, the majority of which are still neglected. Overcoming these issues is fundamental to avoiding erroneous results. Here, pre-analytical aspects of research, such as study design, choice of specimens, sample collection and processing, animal models of disease, and the observer's role, are discussed. Artefacts in this stage of research could affect the predictive value of all analyses. Furthermore, the standardisation of research subjects according to their lifestyles and, if female, to their life phase and menstrual or oestrous cycle, is urgently needed to harmonise research worldwide. Sex-gender-specific attention to pre-analytical aspects could reduce the time for translation from bench to bedside. Furthermore, sex-gender-specific pre-clinical pharmacological testing will enable adequate assessment of the pharmacokinetic and pharmacodynamic actions of drugs and will enable, where appropriate, an adequate gender-specific clinical development plan. Therefore, sex-gender-specific pre-clinical research will increase the gender equity of care and will produce more evidence-based medicine. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  9. A zone-specific fish-based biotic index as a management tool for the Zeeschelde estuary (Belgium).

    PubMed

    Breine, Jan; Quataert, Paul; Stevens, Maarten; Ollevier, Frans; Volckaert, Filip A M; Van den Bergh, Ericia; Maes, Joachim

    2010-07-01

    Fish-based indices monitor changes in surface waters and are a valuable aid in communication by summarising complex information about the environment (Harrison and Whitfield, 2004). A zone-specific fish-based multimetric estuarine index of biotic integrity (Z-EBI) was developed based on a 13-year time series of fish surveys from the Zeeschelde estuary (Belgium). Sites were pre-classified using indicators of anthropogenic impact. Metrics showing a monotone response with pressure classes were selected for further analysis. Thresholds for good ecological potential (GEP) were defined from references. A modified trisection was applied for the other thresholds. The Z-EBI is defined as the average of the metric scores calculated over a one-year period, translated into an ecological quality ratio (EQR). The indices integrate structural and functional qualities of the estuarine fish communities. The performance of the Z-EBI was successfully validated against habitat degradation in the various habitat zones. Copyright 2010 Elsevier Ltd. All rights reserved.
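
    The aggregation step described above (average the metric scores for a year, then translate to an EQR) can be sketched as follows; the metric scores, the 0-5 scale, and the simple normalisation are illustrative assumptions, since the paper's actual metric definitions and trisection thresholds are not reproduced here:

```python
def ecological_quality_ratio(metric_scores, max_score=5):
    """Z-EBI-style aggregation (illustrative): average the metric scores
    for one site-year, then normalise to a 0-1 ecological quality ratio."""
    zebi = sum(metric_scores) / len(metric_scores)
    return zebi / max_score

# Hypothetical metric scores for one site-year (not the paper's data).
eqr = ecological_quality_ratio([3, 4, 2, 5, 3])
print(round(eqr, 2))  # → 0.68
```

    In the actual index, the resulting EQR is compared against zone-specific class boundaries to assign an ecological status class.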

  10. Droplet digital PCR-based EGFR mutation detection with an internal quality control index to determine the quality of DNA.

    PubMed

    Kim, Sung-Su; Choi, Hyun-Jeung; Kim, Jin Ju; Kim, M Sun; Lee, In-Seon; Byun, Bohyun; Jia, Lina; Oh, Myung Ryurl; Moon, Youngho; Park, Sarah; Choi, Joon-Seok; Chae, Seoung Wan; Nam, Byung-Ho; Kim, Jin-Soo; Kim, Jihun; Min, Byung Soh; Lee, Jae Seok; Won, Jae-Kyung; Cho, Soo Youn; Choi, Yoon-La; Shin, Young Kee

    2018-01-11

    In clinical translational research and molecular in vitro diagnostics, a major challenge in the detection of genetic mutations is overcoming artefactual results caused by the low-quality of formalin-fixed paraffin-embedded tissue (FFPET)-derived DNA (FFPET-DNA). Here, we propose the use of an 'internal quality control (iQC) index' as a criterion for judging the minimum quality of DNA for PCR-based analyses. In a pre-clinical study comparing the results from droplet digital PCR-based EGFR mutation test (ddEGFR test) and qPCR-based EGFR mutation test (cobas EGFR test), iQC index ≥ 0.5 (iQC copies ≥ 500, using 3.3 ng of FFPET-DNA [1,000 genome equivalents]) was established, indicating that more than half of the input DNA was amplifiable. Using this criterion, we conducted a retrospective comparative clinical study of the ddEGFR and cobas EGFR tests for the detection of EGFR mutations in non-small cell lung cancer (NSCLC) FFPET-DNA samples. Compared with the cobas EGFR test, the ddEGFR test exhibited superior analytical performance and equivalent or higher clinical performance. Furthermore, iQC index is a reliable indicator of the quality of FFPET-DNA and could be used to prevent incorrect diagnoses arising from low-quality samples.
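
    The iQC criterion described above reduces to a simple ratio. A minimal sketch (the function names are ours; the 0.5 threshold and the 3.3 ng ≈ 1,000 genome equivalents figure are taken from the abstract):

```python
def iqc_index(iqc_copies, genome_equivalents=1000):
    """iQC index as described: amplifiable internal-control copies divided
    by the input genome equivalents (3.3 ng FFPET-DNA ≈ 1,000 copies)."""
    return iqc_copies / genome_equivalents

def dna_quality_ok(iqc_copies):
    # Acceptance criterion from the study: index >= 0.5, i.e. at least
    # half of the input FFPET-DNA is amplifiable.
    return iqc_index(iqc_copies) >= 0.5

print(dna_quality_ok(620), dna_quality_ok(310))  # → True False
```

    Samples failing the criterion would be flagged as too degraded for reliable PCR-based mutation calling rather than reported as mutation-negative.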

  11. 6S Return Samples: Assessment of Air Quality in the International Space Station (ISS) Based on Solid Sorbent Air Sampler (SSAS) and Formaldehyde Monitoring Kit (FMK) Analyses

    NASA Technical Reports Server (NTRS)

    James, John T.

    2004-01-01

    The toxicological assessments of SSAS and FMK analytical results are reported. Analytical methods have not changed from earlier reports. Surrogate standard recoveries from the SSAS tubes were 66-76% for 13C-acetone, 85-96% for fluorobenzene, and 73-89% for chlorobenzene. Post-flight flows were far below pre-flight flows, and an investigation of the problem revealed that the reduced flow was caused by a leak at the interface of the pump inlet tube and the pump head. This resulted in degradation of pump efficiency. Further investigation showed that the problem occurred before the SSAS was operated on orbit and that use of the post-flight flows yielded consistent and useful results. Recoveries from formaldehyde control badges were 86 to 104%. The two general criteria used to assess air quality are the total non-methane volatile organic hydrocarbons (NMVOCs) and the total T-value (minus the CO2 and formaldehyde contributions). The T-values will not be reported for these data due to the flow anomaly. Control of atmospheric alcohols is important to the water recovery system engineers, hence total alcohols (including acetone) are also shown for each sample. Octafluoropropane (OFP) is not efficiently trapped by the sorbents used in the SSAS. Because formaldehyde is quantified from sorbent badges, its concentration is also listed separately. These five indices of air quality are summarized.

  12. Analysis of environmental contamination resulting from catastrophic incidents: part 2. Building laboratory capability by selecting and developing analytical methodologies.

    PubMed

    Magnuson, Matthew; Campisano, Romy; Griggs, John; Fitz-James, Schatzi; Hall, Kathy; Mapp, Latisha; Mullins, Marissa; Nichols, Tonya; Shah, Sanjiv; Silvestri, Erin; Smith, Terry; Willison, Stuart; Ernst, Hiba

    2014-11-01

    Catastrophic incidents can generate a large number of samples of analytically diverse types, including forensic, clinical, environmental, food, and others. Environmental samples include water, wastewater, soil, air, urban building and infrastructure materials, and surface residue. Such samples may arise not only from contamination from the incident but also from the multitude of activities surrounding the response to the incident, including decontamination. This document summarizes a range of activities to help build laboratory capability in preparation for sample analysis following a catastrophic incident, including selection and development of fit-for-purpose analytical methods for chemical, biological, and radiological contaminants. Fit-for-purpose methods are those which have been selected to meet project specific data quality objectives. For example, methods could be fit for screening contamination in the early phases of investigation of contamination incidents because they are rapid and easily implemented, but those same methods may not be fit for the purpose of remediating the environment to acceptable levels when a more sensitive method is required. While the exact data quality objectives defining fitness-for-purpose can vary with each incident, a governing principle of the method selection and development process for environmental remediation and recovery is based on achieving high throughput while maintaining high quality analytical results. This paper illustrates the result of applying this principle, in the form of a compendium of analytical methods for contaminants of interest. The compendium is based on experience with actual incidents, where appropriate and available. This paper also discusses efforts aimed at adaptation of existing methods to increase fitness-for-purpose and development of innovative methods when necessary. 
The contaminants of interest are primarily those potentially released through catastrophes resulting from malicious activity. However, the same techniques discussed could also have application to catastrophes resulting from other incidents, such as natural disasters or industrial accidents. Further, the high sample throughput enabled by the techniques discussed could be employed for conventional environmental studies and compliance monitoring, potentially decreasing costs and/or increasing the quantity of data available to decision-makers. Published by Elsevier Ltd.

  13. Use of the Threshold of Toxicological Concern (TTC) approach for deriving target values for drinking water contaminants.

    PubMed

    Mons, M N; Heringa, M B; van Genderen, J; Puijker, L M; Brand, W; van Leeuwen, C J; Stoks, P; van der Hoek, J P; van der Kooij, D

    2013-03-15

    Ongoing pollution and improving analytical techniques reveal more and more anthropogenic substances in drinking water sources, and incidentally in treated water as well. In fact, complete absence of any trace pollutant in treated drinking water is an illusion, as current analytical techniques are capable of detecting very low concentrations. Most of the substances detected lack toxicity data to derive safe levels and have not yet been regulated. Although the concentrations in treated water usually do not have adverse health effects, their presence is still undesired because of customer perception. This raises the questions of how sensitive analytical methods need to become for water quality screening, at what levels water suppliers need to take action, and how effective treatment methods need to be designed to remove contaminants sufficiently. Therefore, in the Netherlands a clear and consistent approach called 'Drinking Water Quality for the 21st century (Q21)' has been developed within the joint research program of the drinking water companies. Target values for anthropogenic drinking water contaminants were derived by using the recently introduced Threshold of Toxicological Concern (TTC) approach. The target values for individual genotoxic and steroid endocrine chemicals were set at 0.01 μg/L. For all other organic chemicals the target values were set at 0.1 μg/L. The target values for the total sum of genotoxic chemicals, the total sum of steroid hormones and the total sum of all other organic compounds were set at 0.01, 0.01 and 1.0 μg/L, respectively. The Dutch Q21 approach is further supplemented by the standstill principle and effect-directed testing. The approach is helpful in defining the goals and limits of future treatment process designs and of analytical methods to further improve and ensure the quality of drinking water, without going to unnecessary lengths. Copyright © 2013 Elsevier Ltd. All rights reserved.
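
    The target values quoted above lend themselves to a simple lookup. A sketch (the class labels and function are illustrative conventions of ours; the thresholds for individual substances are those stated in the abstract):

```python
# Q21/TTC target values for individual substances, in ug/L, as stated above.
Q21_TARGETS = {
    "genotoxic": 0.01,
    "steroid_endocrine": 0.01,
    "other_organic": 0.1,
}

def exceeds_target(chemical_class, concentration_ug_per_l):
    """Flag a measured concentration against its Q21 target value."""
    return concentration_ug_per_l > Q21_TARGETS[chemical_class]

print(exceeds_target("other_organic", 0.05))  # below the 0.1 ug/L target → False
print(exceeds_target("genotoxic", 0.05))      # above the 0.01 ug/L target → True
```

    The abstract also sets sum targets (0.01 μg/L for total genotoxic chemicals and total steroid hormones, 1.0 μg/L for all other organics combined), which a fuller implementation would check across all detected substances per class.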

  14. Contamination of dried blood spots - an underestimated risk in newborn screening.

    PubMed

    Winter, Theresa; Lange, Anja; Hannemann, Anke; Nauck, Matthias; Müller, Cornelia

    2018-01-26

    Newborn screening (NBS) is an established screening procedure in many countries worldwide, aiming at the early detection of inborn errors of metabolism. For decades, dried blood spots have been the standard specimen for NBS. The procedure of blood collection is well described and standardized and includes many critical pre-analytical steps. We examined the impact of contamination with some anticipated common substances on NBS results obtained from dried spot samples. This possible pre-analytical source of uncertainty has been poorly examined in the past. Capillary blood was obtained from 15 adult volunteers and applied to 10 screening filter papers per volunteer. Nine filter papers were contaminated without visible trace. The contaminants were baby diaper rash cream, baby wet wipes, disinfectant, liquid infant formula, liquid infant formula hypoallergenic (HA), ultrasonic gel, breast milk, feces, and urine. The differences between control and contaminated samples were evaluated for 45 NBS quantities. We assessed whether the contamination might lead to false-positive NBS results. Eight of nine investigated contaminants significantly altered NBS analyte concentrations and potentially caused false-positive screening outcomes. Contamination with feces was most influential, affecting 24 of 45 tested analytes, followed by liquid infant formula (HA) and urine, affecting 19 and 13 of 45 analytes, respectively. Contamination of filter paper samples can have a substantial effect on the NBS results. Our results underline the importance of good pre-analytical training to make the staff aware of the threat and ensure reliable screening results.

  15. Sigma metrics used to assess analytical quality of clinical chemistry assays: importance of the allowable total error (TEa) target.

    PubMed

    Hens, Koen; Berth, Mario; Armbruster, Dave; Westgard, Sten

    2014-07-01

    Six Sigma metrics were used to assess the analytical quality of automated clinical chemistry and immunoassay tests in a large Belgian clinical laboratory and to explore the importance of the source used for estimation of the allowable total error. Clinical laboratories are continually challenged to maintain analytical quality. However, it is difficult to measure assay quality objectively and quantitatively. The Sigma metric is a single number that estimates quality based on the traditional parameters used in the clinical laboratory: allowable total error (TEa), precision and bias. In this study, Sigma metrics were calculated for 41 clinical chemistry assays for serum and urine on five ARCHITECT c16000 chemistry analyzers. Controls at two analyte concentrations were tested and Sigma metrics were calculated using three different TEa targets (Ricos biological variability, CLIA, and RiliBÄK). Sigma metrics varied with analyte concentration, the TEa target, and among analyzers. Sigma values identified those assays that are analytically robust and require minimal quality control rules and those that exhibit more variability and require more complex rules. Analyzer-to-analyzer variability was assessed on the basis of Sigma metrics. Six Sigma is a more efficient way to control quality, but the lack of TEa targets for many analytes and the sometimes inconsistent TEa targets from different sources are important variables for the interpretation and the application of Sigma metrics in a routine clinical laboratory. Sigma metrics are a valuable means of comparing the analytical quality of two or more analyzers to ensure the comparability of patient test results.
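
    The Sigma metric combines the three quantities named above in one formula, Sigma = (TEa − |bias|) / CV, with all terms in percent. A minimal sketch (the example TEa, bias, and CV values are illustrative, not the study's measurements; it shows how the choice of TEa target alone can move an assay across a Sigma decision boundary):

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma metric: (allowable total error - |bias|) / CV, in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Illustrative assay with 1% bias and 2% CV, evaluated against two
# hypothetical TEa targets: a 10% target vs a tighter 6.96% target.
print(round(sigma_metric(10.0, 1.0, 2.0), 2))   # → 4.5
print(round(sigma_metric(6.96, 1.0, 2.0), 2))   # → 2.98
```

    The same precision and bias yield 4.5 sigma under the looser target but under 3 sigma with the tighter one, which is exactly the TEa-source dependence the study highlights.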

  16. Median of patient results as a tool for assessment of analytical stability.

    PubMed

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György

    2015-06-15

    In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor the long-term analytical stability as a supplement to the internal control procedures is often needed. Patient data from daily internal control schemes were used for monthly appraisal of the analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes measured on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable sources of material as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple and reliable tool to monitor the steadiness of the analytical practice. Copyright © 2015 Elsevier B.V. All rights reserved.
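
    The median-monitoring approach can be sketched as follows, using the common biological-variation specification for desirable bias, 0.25·√(CVw² + CVb²). The analyte values and the within- and between-subject CVs below are illustrative assumptions, not the study's data:

```python
import statistics

def desirable_bias_pct(cv_within, cv_between):
    """Desirable bias specification from biological variation:
    0.25 * sqrt(CVw^2 + CVb^2), all in percent."""
    return 0.25 * (cv_within**2 + cv_between**2) ** 0.5

def monthly_median_stable(month_results, baseline_median, cv_within, cv_between):
    """Flag whether a month's patient-result median stays within the
    desirable bias specification relative to the long-term median."""
    median = statistics.median(month_results)
    drift_pct = abs(median - baseline_median) / baseline_median * 100
    return drift_pct <= desirable_bias_pct(cv_within, cv_between)

# Illustrative sodium-like example: CVw 0.6%, CVb 0.7% → tight bias limit.
print(round(desirable_bias_pct(0.6, 0.7), 2))  # → 0.23
print(monthly_median_stable([139, 140, 140, 141, 140], 140.0, 0.6, 0.7))
```

    Because patient medians are drawn from real samples, they sidestep the non-commutability problems of control materials that the abstract points out.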

  17. Progress and development of analytical methods for gibberellins.

    PubMed

    Pan, Chaozhi; Tan, Swee Ngin; Yong, Jean Wan Hong; Ge, Liya

    2017-01-01

    Gibberellins, as a group of phytohormones, exhibit a wide variety of bio-functions within plant growth and development, and have been used to increase crop yields. Many analytical procedures, therefore, have been developed for the determination of the types and levels of endogenous and exogenous gibberellins. As plant tissues contain gibberellins in trace amounts (usually at the level of nanograms per gram fresh weight or even lower), the sample pre-treatment steps (extraction, pre-concentration, and purification) for gibberellins are reviewed in detail. The primary focus of this comprehensive review is on the various analytical methods designed to meet the requirements for gibberellin analysis in complex matrices, with particular emphasis on high-throughput analytical methods, such as gas chromatography, liquid chromatography, and capillary electrophoresis, mostly combined with mass spectrometry. The advantages and drawbacks of each described analytical method are discussed. The overall aim of this review is to provide a comprehensive and critical view of the different analytical methods nowadays employed to analyze gibberellins in complex sample matrices and their foreseeable trends. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Budget Impact of a Comprehensive Nutrition-Focused Quality Improvement Program for Malnourished Hospitalized Patients.

    PubMed

    Sulo, Suela; Feldstein, Josh; Partridge, Jamie; Schwander, Bjoern; Sriram, Krishnan; Summerfelt, Wm Thomas

    2017-07-01

    Nutrition interventions can alleviate the burden of malnutrition by improving patient outcomes; however, evidence on the economic impact of medical nutrition intervention remains limited. A previously published nutrition-focused quality improvement program targeting malnourished hospitalized patients showed that screening patients with a validated screening tool at admission, rapidly administering oral nutritional supplements, and educating patients on supplement adherence result in significant reductions in 30-day unplanned readmissions and hospital length of stay. To assess the potential cost-savings associated with decreased 30-day readmissions and hospital length of stay in malnourished inpatients through a nutrition-focused quality improvement program using a web-based budget impact model, and to demonstrate the clinical and fiscal value of the intervention. The reduction in readmission rate and length of stay for 1269 patients enrolled in the quality improvement program (between October 13, 2014, and April 2, 2015) were compared with the pre-quality improvement program baseline and validation cohorts (4611 patients vs 1319 patients, respectively) to calculate potential cost-savings as well as to inform the design of the budget impact model. Readmission rate and length-of-stay reductions were calculated by determining the change from baseline to post-quality improvement program as well as the difference between the validation cohort and the post-quality improvement program, respectively. As a result of improved health outcomes for the treated patients, the nutrition-focused quality improvement program led to a reduction in 30-day hospital readmissions and length of stay. 
The avoided hospital readmissions and reduced number of days in the hospital for the patients in the quality improvement program resulted in cost-savings of $1,902,933 versus the pre-quality improvement program baseline cohort, and $4,896,758 versus the pre-quality improvement program in the validation cohort. When these costs were assessed across the entire patient population enrolled in the quality improvement program, per-patient net savings of $1499 when using the baseline cohort as the comparator and savings per patient treated of $3858 when using the validated cohort as the comparator were achieved. The nutrition-focused quality improvement program reduced the per-patient healthcare costs by avoiding 30-day readmissions and through reduced length of hospital stay. These clinical and economic outcomes provide a rationale for merging patient care and financial modeling to advance the delivery of value-based medicine in a malnourished hospitalized population. The use of a novel web-based budget impact model supports the integration of comparative effectiveness analytics and healthcare resource management in the hospital setting to provide optimal quality of care at a reduced overall cost.
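
    As a back-of-envelope check, the per-patient figures quoted above are consistent with dividing the cohort-level savings by the 1,269 patients enrolled in the quality improvement program (small rounding differences from the published $1,499 and $3,858 are expected):

```python
def per_patient_savings(total_savings, n_patients):
    """Spread cohort-level cost savings over the enrolled patients."""
    return total_savings / n_patients

# Figures reported in the abstract; 1,269 patients were enrolled.
baseline = per_patient_savings(1_902_933, 1269)
validation = per_patient_savings(4_896_758, 1269)
print(round(baseline), round(validation))
```

    This is the arithmetic core of the budget impact model; the web-based tool additionally parameterises readmission rates, length-of-stay reductions, and local cost inputs.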

  19. Influence of alkaline hydrogen peroxide pre-hydrolysis on the isolation of microcrystalline cellulose from oil palm fronds.

    PubMed

    Owolabi, Abdulwahab F; Haafiz, M K Mohamad; Hossain, Md Sohrab; Hussin, M Hazwan; Fazita, M R Nurul

    2017-02-01

    In the present study, microcrystalline cellulose (MCC) was isolated from oil palm fronds (OPF) using a chemo-mechanical process in which alkaline hydrogen peroxide (AHP) was used to extract OPF fibre at different AHP concentrations. The OPF pulp fibre was then bleached with acidified sodium chlorite solution, followed by acid hydrolysis using hydrochloric acid. Several analytical methods were used to determine the influence of AHP concentration on the thermal, morphological, microscopic and crystalline properties of the isolated MCC. Results showed that the MCC extracted from OPF fibres had fibre diameters of 7.55-9.11 nm. X-ray diffraction (XRD) analyses revealed that the obtained microcrystalline fibre had both cellulose I and cellulose II polymorph structures, depending on the AHP concentration. Fourier transform infrared (FTIR) analyses showed that the AHP pre-hydrolysis successfully removed hemicelluloses and lignin from the OPF fibre. The crystallinity of the MCC increased with AHP concentration. The degradation temperature of the MCC was about 300°C. The findings of the present study showed that the pre-treatment process potentially influenced the quality of the MCC isolated from oil palm fronds. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. High Throughput Detection of Tetracycline Residues in Milk Using Graphene or Graphene Oxide as MALDI-TOF MS Matrix

    NASA Astrophysics Data System (ADS)

    Liu, Junyan; Liu, Yang; Gao, Mingxia; Zhang, Xiangmin

    2012-08-01

    In this work, a new pre-analysis method for the detection of tetracyclines (TCs) in milk samples was established. Complementing existing strategies for the accurate quantification of TCs, the new pre-analysis method was demonstrated to be simple, sensitive, fast, cost-effective, and high-throughput, making it well suited to routine quality pre-analysis of TCs in milk samples. Graphene or graphene oxide was utilized, for the first time, as a dual platform to enrich and detect the TCs by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). Four TCs were chosen as models: tetracycline, oxytetracycline, demeclocycline, and chlortetracycline. Owing to their excellent electronic, thermal, and mechanical properties, graphene and graphene oxide were successfully applied as matrices for MALDI-TOF MS, free of background interference in the low mass range. Meanwhile, graphene or graphene oxide has a large surface area and a strong interaction force with the analytes. By taking advantage of these features, TCs were effectively enriched, with a limit of detection (LOD) as low as 2 nM.

  1. Assessment of analytical quality in Nordic clinical chemistry laboratories using data from contemporary national programs.

    PubMed

    Aronsson, T; Bjørnstad, P; Leskinen, E; Uldall, A; de Verdier, C H

    1984-01-01

    The aim of this investigation was primarily to assess analytical quality expressed as between-laboratory, within-laboratory, and total imprecision, not in order to detect laboratories with poor performance, but in the positive sense to provide data for improving critical steps in analytical methodology. The aim was also to establish the present state of the art in comparison with earlier investigations to see if improvement in analytical quality could be observed.

  2. Analytic Methods Used in Quality Control in a Compounding Pharmacy.

    PubMed

    Allen, Loyd V

    2017-01-01

    Analytical testing will no doubt become a more important part of pharmaceutical compounding as the public and regulatory agencies demand increasing documentation of the quality of compounded preparations. Compounding pharmacists must decide what types of testing and what amount of testing to include in their quality-control programs, and whether testing should be done in-house or outsourced. Like pharmaceutical compounding, analytical testing should be performed only by those who are appropriately trained and qualified. This article discusses the analytical methods that are used in quality control in a compounding pharmacy. Copyright© by International Journal of Pharmaceutical Compounding, Inc.

  3. MALDI-TOF MS identification of anaerobic bacteria: assessment of pre-analytical variables and specimen preparation techniques.

    PubMed

    Hsu, Yen-Michael S; Burnham, Carey-Ann D

    2014-06-01

    Matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) has emerged as a tool for identifying clinically relevant anaerobes. We evaluated the analytical performance characteristics of the Bruker Microflex with Biotyper 3.0 software system for identification of anaerobes and examined the impact of direct formic acid (FA) treatment and other pre-analytical factors on MALDI-TOF MS performance. A collection of 101 anaerobic bacteria was evaluated, including Clostridium spp., Propionibacterium spp., Fusobacterium spp., Bacteroides spp., and other anaerobic bacteria of clinical relevance. The results of our study indicate that an on-target extraction with 100% FA improves the rate of accurate identification without introducing misidentification (P<0.05). In addition, we modified the reporting cutoffs for the Biotyper "score" yielding an acceptable identification and found that a score of ≥1.700 can maximize the rate of identification. Of interest, MALDI-TOF MS can correctly identify anaerobes grown in suboptimal conditions, such as on selective culture media and following oxygen exposure. In conclusion, we report on a number of simple and cost-effective pre- and post-analytical modifications that could enhance MALDI-TOF MS identification of anaerobic bacteria. Copyright © 2014 Elsevier Inc. All rights reserved.

  4. Isotope Inversion Experiment evaluating the suitability of calibration in surrogate matrix for quantification via LC-MS/MS-Exemplary application for a steroid multi-method.

    PubMed

    Suhr, Anna Catharina; Vogeser, Michael; Grimm, Stefanie H

    2016-05-30

    For reliable quantitative analysis of endogenous analytes in complex biological samples by isotope dilution LC-MS/MS, the creation of appropriate calibrators is a challenge, since analyte-free authentic material is generally not available. Thus, surrogate matrices are often used to prepare calibrators and controls. However, currently employed validation protocols do not include specific experiments to verify the suitability of a surrogate matrix calibration for quantification of authentic matrix samples. The aim of the study was the development of a novel validation experiment to test whether surrogate matrix based calibrators enable correct quantification of authentic matrix samples. The key element of the novel validation experiment is the inversion of nonlabelled analytes and their stable isotope labelled (SIL) counterparts with respect to their functions, i.e. the SIL compound is the analyte and the nonlabelled substance is employed as internal standard. As a consequence, both surrogate and authentic matrix are analyte-free with regard to the SIL analytes, which allows a comparison of both matrices. We called this approach the Isotope Inversion Experiment. As a figure of merit we defined the accuracy of inverse quality controls in authentic matrix quantified by means of a surrogate matrix calibration curve. As a proof-of-concept application, an LC-MS/MS assay addressing six corticosteroids (cortisol, cortisone, corticosterone, 11-deoxycortisol, 11-deoxycorticosterone, and 17-OH-progesterone) was chosen. The integration of the Isotope Inversion Experiment into the validation protocol for the steroid assay was successfully realized. The accuracy results of the inverse quality controls were consistently satisfactory. As a consequence, the suitability of a surrogate matrix calibration for quantification of the targeted steroids in human serum as authentic matrix could be successfully demonstrated. 
The Isotope Inversion Experiment fills a gap in the validation process for LC-MS/MS assays quantifying endogenous analytes. We consider it a valuable and convenient tool to evaluate the correct quantification of authentic matrix samples based on a calibration curve in surrogate matrix. Copyright © 2016 Elsevier B.V. All rights reserved.
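    The core of the experiment can be sketched numerically: fit a calibration line from surrogate-matrix calibrators in which the SIL compound plays the analyte role, then score the accuracy of inverse quality controls prepared in authentic matrix against that line. All concentrations and response ratios below are invented for illustration.

```python
# Sketch of the Isotope Inversion idea: the SIL compound is the analyte and
# the nonlabelled compound the internal standard, so both surrogate- and
# authentic-matrix samples are analyte-free at baseline. Numbers are invented.

def fit_line(x, y):
    # ordinary least squares for y = a*x + b
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

# calibrators in surrogate matrix: (SIL concentration, response ratio)
cal_conc  = [1.0, 5.0, 10.0, 50.0, 100.0]
cal_ratio = [0.021, 0.100, 0.199, 1.010, 1.990]
slope, intercept = fit_line(cal_conc, cal_ratio)

def quantify(ratio):
    return (ratio - intercept) / slope

# inverse QC in authentic matrix: nominal 20 ng/mL, observed ratio 0.40
nominal, measured = 20.0, quantify(0.40)
accuracy_pct = 100.0 * measured / nominal
```

    An accuracy near 100% for the authentic-matrix QCs is the evidence that the surrogate-matrix calibration transfers correctly.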

  5. Forensic entomology: implementing quality assurance for expertise work.

    PubMed

    Gaudry, Emmanuel; Dourel, Laurent

    2013-09-01

    The Department of Forensic Entomology (Institut de Recherche Criminelle de la Gendarmerie Nationale, France) was accredited by the French Committee of Accreditation (Cofrac's Healthcare section) in October 2007 on the basis of the NF EN ISO/CEI 17025 standard. It was the first accreditation in this specific field of forensic sciences in France and in Europe. The present paper introduces the accreditation process in forensic entomology (FE) through the experience of the Department of Forensic Entomology. Based upon the identification of necrophagous insects and the study of their biology, FE must, like any other expert work in the forensic sciences, demonstrate integrity and good working practice to satisfy both the courts and the scientific community. FE does not, strictly speaking, follow an analytical method. This could explain why, to make up for a lack of appropriate quality references, specific documentation was drafted and written by the staff of the Department of Forensic Entomology in order to define working methods complying with quality standards (testing methods). A quality assurance system is laborious to set up and maintain and can be perceived as complex, time-consuming and never-ending. However, a survey performed in 2011 revealed that the accreditation process in the framework of expert work has led to new well-defined working habits, based on an effort at transparency. It also requires constant questioning and a proactive approach, both profitable for customers (magistrates, investigators) and analysts (forensic entomologists).

  6. Stability of Routine Biochemical Analytes in Whole Blood and Plasma From Lithium Heparin Gel Tubes During 6-hr Storage.

    PubMed

    Monneret, Denis; Godmer, Alexandre; Le Guen, Ronan; Bravetti, Clotilde; Emeraud, Cecile; Marteau, Anthony; Alkouri, Rana; Mestari, Fouzi; Dever, Sylvie; Imbert-Bismut, Françoise; Bonnefont-Rousselot, Dominique

    2016-09-01

    The stability of biochemical analytes has already been investigated, but results strongly differ depending on parameters, methodologies, and sample storage times. We investigated the stability for many biochemical parameters after different storage times of both whole blood and plasma, in order to define acceptable pre- and postcentrifugation delays in hospital laboratories. Twenty-four analytes were measured (Modular® Roche analyzer) in plasma obtained from blood collected into lithium heparin gel tubes, after 2-6 hr of storage at room temperature either before (n = 28: stability in whole blood) or after (n = 21: stability in plasma) centrifugation. Variations in concentrations were expressed as mean bias from baseline, using the analytical change limit (ACL%) or the reference change value (RCV%) as acceptance limit. In tubes stored before centrifugation, mean plasma concentrations significantly decreased after 3 hr for phosphorus (-6.1% [95% CI: -7.4 to -4.7%]; ACL 4.62%) and lactate dehydrogenase (LDH; -5.7% [95% CI: -7.4 to -4.1%]; ACL 5.17%), and slightly decreased after 6 hr for potassium (-2.9% [95% CI: -5.3 to -0.5%]; ACL 4.13%). In plasma stored after centrifugation, mean concentrations decreased after 6 hr for bicarbonates (-19.7% [95% CI: -22.9 to -16.5%]; ACL 15.4%), and moderately increased after 4 hr for LDH (+6.0% [95% CI: +4.3 to +7.6%]; ACL 5.17%). Based on RCV, all the analytes can be considered stable up to 6 hr, whether before or after centrifugation. This study proposes acceptable delays for most biochemical tests on lithium heparin gel tubes arriving at the laboratory or needing to be reanalyzed. © 2016 Wiley Periodicals, Inc.
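    The stability verdicts above rest on comparing the mean percent bias from baseline with an acceptance limit such as the ACL%. A minimal sketch of that check, with hypothetical helper names; the LDH and potassium figures referenced in the comments come from the abstract.

```python
# Compare the mean percent bias from baseline against an acceptance limit
# such as the ACL% (or, more permissively, the RCV%). Helper names are
# illustrative, not from the study.

def mean_bias_pct(baseline, stored):
    """Mean percent change of paired measurements from their baseline."""
    biases = [100.0 * (s - b) / b for b, s in zip(baseline, stored)]
    return sum(biases) / len(biases)

def is_stable(bias_pct, acceptance_limit_pct):
    """Stable if the mean bias magnitude stays within the limit."""
    return abs(bias_pct) <= acceptance_limit_pct

# from the abstract: whole-blood LDH at 3 hr, -5.7% vs ACL 5.17% -> exceeds
# the limit; whole-blood potassium at 6 hr, -2.9% vs ACL 4.13% -> within it
```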

  7. The Relationship between Internal Teacher Profiles and the Quality of Teacher-Child Interactions in Prekindergarten

    ERIC Educational Resources Information Center

    Decker-Woodrow, Lauren

    2018-01-01

    This study investigates the relationship between internal teacher profiles and pre-K teacher-child interaction quality in the pre-K classroom. Two questions were addressed: (1) What internal profiles exist for pre-kindergarten (pre-K) teachers? and (2) Do internal profiles relate to observed structural and process quality in the pre-K classroom?…

  8. The effect of pre-transplant pain and chronic disease self-efficacy on quality of life domains in the year following hematopoietic stem cell transplantation.

    PubMed

    O'Sullivan, Madeline L; Shelby, Rebecca A; Dorfman, Caroline S; Kelleher, Sarah A; Fisher, Hannah M; Rowe Nichols, Krista A; Keefe, Francis J; Sung, Anthony D; Somers, Tamara J

    2018-04-01

    Pain is common for hematopoietic stem cell transplant (HSCT) patients and may be experienced pre-transplant, acutely post-transplant, and for months or years following transplant. HSCT patients with persistent pain may be at risk for poor quality of life following transplant; however, the impact of pre-transplant pain on quality of life post-transplant is not well understood. Self-efficacy for chronic disease management is associated with quality of life among cancer patients and may impact quality of life for HSCT patients. The primary aim was to examine the effect of pre-transplant pain and self-efficacy on quality of life domains in the year following transplant. One hundred sixty-six HSCT patients completed questionnaires providing information on pain, self-efficacy, and quality of life prior to transplant, at discharge, and 3-, 6-, and 12-months post-transplant as part of a longitudinal, observational study. Linear mixed modeling examined the trajectories of these variables and the effect of pre-transplant pain and self-efficacy on post-transplant quality of life. Pain and social and emotional quality of life remained stable in the year following transplant while self-efficacy and physical and functional quality of life improved. Pre-transplant pain was significantly related to lower physical well-being post-transplant. Lower pre-transplant self-efficacy was related to lower quality of life across all domains post-transplant. Above and beyond the effect of pre-transplant pain, self-efficacy for managing chronic disease is important in understanding quality of life following transplant. Identifying patients with pain and/or low self-efficacy pre-transplant may allow for early intervention with self-management strategies.

  9. PFLOTRAN Verification: Development of a Testing Suite to Ensure Software Quality

    NASA Astrophysics Data System (ADS)

    Hammond, G. E.; Frederick, J. M.

    2016-12-01

    In scientific computing, code verification ensures the reliability and numerical accuracy of a model simulation by comparing the simulation results to experimental data or known analytical solutions. The model is typically defined by a set of partial differential equations with initial and boundary conditions, and verification confirms that the mathematical model is solved correctly by the software. Code verification is especially important if the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment [Oberkampf and Trucano (2007)]. Justified confidence in a particular computational tool requires clarity in the exercised physics and transparency in its verification process with proper documentation. We present a quality assurance (QA) testing suite developed by Sandia National Laboratories that performs code verification for PFLOTRAN, an open source, massively-parallel subsurface simulator. PFLOTRAN solves systems of generally nonlinear partial differential equations describing multiphase, multicomponent and multiscale reactive flow and transport processes in porous media. PFLOTRAN's QA test suite compares the numerical solutions of benchmark problems in heat and mass transport against known, closed-form, analytical solutions, including documentation of the exercised physical process models implemented in each PFLOTRAN benchmark simulation. The QA test suite development strives to follow the recommendations of Oberkampf and Trucano (2007), which describe four essential elements of high-quality verification benchmark construction: (1) conceptual description, (2) mathematical description, (3) accuracy assessment, and (4) additional documentation and user information. 
Several QA tests within the suite will be presented, including details of the benchmark problems and their closed-form analytical solutions, implementation of benchmark problems in PFLOTRAN simulations, and the criteria used to assess PFLOTRAN's performance in the code verification procedure. References Oberkampf, W. L., and T. G. Trucano (2007), Verification and Validation Benchmarks, SAND2007-0853, 67 pgs., Sandia National Laboratories, Albuquerque, NM.
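    The benchmark-versus-analytical-solution pattern is independent of PFLOTRAN itself. As a toy stand-in (not a PFLOTRAN benchmark), the sketch below verifies an explicit finite-difference solver for the 1-D heat equation against its closed-form solution and checks the maximum error, the same pattern the QA suite applies to heat and mass transport problems.

```python
import math

# Toy code-verification benchmark: solve u_t = alpha * u_xx on [0,1] with
# u(0)=u(1)=0 and u(x,0)=sin(pi*x), then compare against the exact solution
# u(x,t) = exp(-alpha*pi^2*t) * sin(pi*x).

def ftcs_heat(nx=51, alpha=1.0, t_end=0.05):
    dx = 1.0 / (nx - 1)
    dt = 0.4 * dx * dx / alpha          # stable: r = alpha*dt/dx^2 <= 0.5
    steps = int(t_end / dt)
    u = [math.sin(math.pi * i * dx) for i in range(nx)]
    for _ in range(steps):
        un = u[:]
        for i in range(1, nx - 1):
            u[i] = un[i] + alpha * dt / dx**2 * (un[i+1] - 2*un[i] + un[i-1])
    t = steps * dt
    exact = [math.exp(-alpha * math.pi**2 * t) * math.sin(math.pi * i * dx)
             for i in range(nx)]
    return max(abs(a - b) for a, b in zip(u, exact))

max_error = ftcs_heat()
```

    A passing check (maximum error well under a preset tolerance, here 1e-3 for this grid) is what "accuracy assessment" means for such a benchmark.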

  10. Effect-Based Screening Methods for Water Quality Characterization Will Augment Conventional Analyte-by-Analyte Chemical Methods in Research As Well As Regulatory Monitoring

    EPA Science Inventory

    Conventional approaches to water quality characterization can provide data on individual chemical components of each water sample. This analyte-by-analyte approach currently serves many useful research and compliance monitoring needs. However these approaches, which require a ...

  11. Methodology to define biological reference values in the environmental and occupational fields: the contribution of the Italian Society for Reference Values (SIVR).

    PubMed

    Aprea, Maria Cristina; Scapellato, Maria Luisa; Valsania, Maria Carmen; Perico, Andrea; Perbellini, Luigi; Ricossa, Maria Cristina; Pradella, Marco; Negri, Sara; Iavicoli, Ivo; Lovreglio, Piero; Salamon, Fabiola; Bettinelli, Maurizio; Apostoli, Pietro

    2017-04-21

    Biological reference values (RVs) explore the relationships between humans and their environment and habits. RVs are fundamental in the environmental field for assessing illnesses possibly associated with environmental pollution, and also in the occupational field, especially in the absence of established biological or environmental limits. The Italian Society for Reference Values (SIVR) set out to test criteria and procedures for the definition of RVs to be used in the environmental and occupational fields. The paper describes the SIVR methodology for defining RVs of xenobiotics and their metabolites. Aspects regarding the choice of population sample, the quality of analytical data, statistical analysis and control of variability factors are considered. The interlaboratory circuits involved can be expected to progressively improve the quality of the analytical data. Examples of RVs produced by SIVR are presented. In particular, levels of chromium, mercury, ethylenethiourea, 3,5,6-trichloro-2-pyridinol, 2,5-hexanedione, 1-hydroxypyrene and t,t-muconic acid measured in urine and expressed in micrograms/g creatinine (μg/g creat) or micrograms/L (μg/L) are reported. With the proposed procedure, SIVR intends to make its activities known to the scientific community in order to increase the number of laboratories involved in the definition of RVs for the Italian population. More research is needed to obtain further RVs in different biological matrices, such as hair, nails and exhaled breath. It is also necessary to update and improve the present reference values and broaden the portfolio of chemicals for which RVs are available. In the near future, SIVR intends to expand its scientific activity by using a multivariate approach for xenobiotics that may have a common origin, and to define RVs separately for children, who may be more exposed and more vulnerable than adults.
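    As an illustration of turning a population sample into an RV interval, the sketch below uses the common central-95% convention (2.5th to 97.5th percentile). This is a generic textbook procedure, not necessarily the exact SIVR protocol, and the urinary levels are invented.

```python
# Generic sketch of deriving a reference interval from a population sample.
# The central-95% convention is a common default; values are invented.

def percentile(sorted_vals, q):
    # simple linear-interpolation percentile (q in [0, 100])
    idx = q / 100 * (len(sorted_vals) - 1)
    lo = int(idx)
    hi = min(lo + 1, len(sorted_vals) - 1)
    frac = idx - lo
    return sorted_vals[lo] * (1 - frac) + sorted_vals[hi] * frac

def reference_interval(values):
    s = sorted(values)
    return percentile(s, 2.5), percentile(s, 97.5)

# hypothetical urinary levels, μg/g creatinine
urinary_levels = [0.2, 0.3, 0.3, 0.4, 0.5, 0.5, 0.6, 0.7, 0.8, 1.2]
low, high = reference_interval(urinary_levels)
```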

  12. Modular workcells: modern methods for laboratory automation.

    PubMed

    Felder, R A

    1998-12-01

    Laboratory automation is beginning to become an indispensable survival tool for laboratories facing difficult market competition. However, estimates suggest that only 8% of laboratories will be able to afford total laboratory automation systems. Therefore, automation vendors have developed alternative hardware configurations called 'modular automation', to fit the smaller laboratory. Modular automation consists of consolidated analyzers, integrated analyzers, modular workcells, and pre- and post-analytical automation. These terms will be defined in this paper. Using a modular automation model, the automated core laboratory will become a site where laboratory data is evaluated by trained professionals to provide diagnostic information to practising physicians. Modern software information management and process control tools will complement modular hardware. Proper standardization that will allow vendor-independent modular configurations will assure success of this revolutionary new technology.

  13. Analyte separation utilizing temperature programmed desorption of a preconcentrator mesh

    DOEpatents

    Linker, Kevin L.; Bouchier, Frank A.; Theisen, Lisa; Arakaki, Lester H.

    2007-11-27

    A method and system for controllably releasing contaminants from a contaminated porous metallic mesh by thermally desorbing and releasing a selected subset of contaminants from a contaminated mesh by rapidly raising the mesh to a pre-determined temperature step or plateau that has been chosen beforehand to preferentially desorb a particular chemical species of interest, but not others. By providing a sufficiently long delay or dwell period between heating pulses, and by selecting the optimum plateau temperatures, different contaminant species can be controllably released in well-defined batches at different times to a chemical detector in gaseous communication with the mesh. For some detectors, such as an Ion Mobility Spectrometer (IMS), separating different species in time before they enter the IMS allows the detector to have an enhanced selectivity.
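    The stepped-plateau release schedule can be pictured as data: each plateau is held long enough to desorb one target species, and a dwell separates the pulses so species reach the detector in distinct batches. The species names, temperatures and times below are invented placeholders, not values from the patent.

```python
# Illustrative encoding of a stepped temperature-programmed desorption
# schedule. All plateau temperatures, hold times and species are invented.

def temperature_program(plateaus, dwell_s):
    """Return (species, temperature_C, (start_s, end_s)) events in order."""
    t = 0.0
    events = []
    for species, temp_c, hold_s in plateaus:
        events.append((species, temp_c, (t, t + hold_s)))
        t += hold_s + dwell_s           # dwell before the next heating pulse
    return events

program = temperature_program(
    [("TNT", 180.0, 2.0), ("RDX", 230.0, 2.0), ("PETN", 280.0, 2.0)],
    dwell_s=5.0)
# release windows: (0, 2), (7, 9), (14, 16) seconds
```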

  14. Design, fabrication and test of graphite/epoxy metering truss structure components, phase 3

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The design, materials, tooling, manufacturing processes, quality control, test procedures, and results associated with the fabrication and test of graphite/epoxy metering truss structure components exhibiting a near zero coefficient of thermal expansion are described. Analytical methods were utilized, with the aid of a computer program, to define the most efficient laminate configurations in terms of thermal behavior and structural requirements. This was followed by an extensive material characterization and selection program, conducted for several graphite/hybrid laminate systems to obtain experimental data in support of the analytical predictions. Mechanical property tests as well as coefficient of thermal expansion tests were run on each laminate under study, the results of which were used as the selection criteria for the single most promising laminate. Further coefficient of thermal expansion measurement was successfully performed on three subcomponent tubes utilizing the selected laminate.

  15. The "hospital central laboratory": automation, integration and clinical usefulness.

    PubMed

    Zaninotto, Martina; Plebani, Mario

    2010-07-01

    Recent technological developments in laboratory medicine have led to a major challenge: maintaining a close connection between the search for efficiency through automation and consolidation and the assurance of effectiveness. The adoption of systems that automate most of the manual tasks characterizing routine activities has significantly improved the quality of laboratory performance; total laboratory automation is the paradigm of the idea that "human-less" robotic laboratories may allow better operation while ensuring fewer human errors. Furthermore, even if ongoing technological developments have considerably improved the productivity of clinical laboratories as well as reducing the turnaround time of the entire process, the value of qualified personnel remains a significant issue. Recent evidence confirms that automation allows clinical laboratories to improve analytical performance only if trained staff operate in accordance with well-defined standard operating procedures, thus assuring continuous monitoring of analytical quality. In addition, laboratory automation may improve the appropriateness of test requests through the use of algorithms and reflex testing. This should allow the adoption of clinical and biochemical guidelines. In conclusion, in laboratory medicine, technology represents a tool for improving clinical effectiveness and patient outcomes, but it has to be managed by qualified laboratory professionals.

  16. Accelerated development and flight evaluation of active controls concepts for subsonic transport aircraft. Volume 2: AFT C.G. simulation and analysis

    NASA Technical Reports Server (NTRS)

    Urie, D. M.

    1979-01-01

    Relaxed static stability and stability augmentation with active controls were investigated for subsonic transport aircraft. Analytical and simulator evaluations were done using a contemporary wide body transport as a baseline. Criteria for augmentation system performance and unaugmented flying qualities were evaluated. Augmentation control laws were defined based on selected frequency response and time history criteria. Flying qualities evaluations were conducted by pilots using a moving base simulator with a transport cab. Static margin and air turbulence intensity were varied in test with and without augmentation. Suitability of a simple pitch control law was verified at neutral static margin in cruise and landing flight tasks. Neutral stability was found to be marginally acceptable in heavy turbulence in both cruise and landing conditions.

  17. Variability in, variability out: best practice recommendations to standardize pre-analytical variables in the detection of circulating and tissue microRNAs.

    PubMed

    Khan, Jenna; Lieberman, Joshua A; Lockwood, Christina M

    2017-05-01

    microRNAs (miRNAs) hold promise as biomarkers for a variety of disease processes and for determining cell differentiation. These short RNA species are robust, survive harsh treatment and storage conditions and may be extracted from blood and tissue. Pre-analytical variables are critical confounders in the analysis of miRNAs: we elucidate these and identify best practices for minimizing sample variation in blood and tissue specimens. Pre-analytical variables addressed include patient-intrinsic variation, time and temperature from sample collection to storage or processing, processing methods, contamination by cells and blood components, RNA extraction method, normalization, and storage time/conditions. For circulating miRNAs, hemolysis and blood cell contamination significantly affect profiles; samples should be processed within 2 h of collection; ethylene diamine tetraacetic acid (EDTA) is preferred while heparin should be avoided; samples should be "double spun" or filtered; room temperature or 4 °C storage for up to 24 h is preferred; miRNAs are stable for at least 1 year at -20 °C or -80 °C. For tissue-based analysis, warm ischemic time should be <1 h; cold ischemic time (4 °C) <24 h; common fixative used for all specimens; formalin fix up to 72 h prior to processing; enrich for cells of interest; validate candidate biomarkers with in situ visualization. Most importantly, all specimen types should have standard and common workflows with careful documentation of relevant pre-analytical variables.
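    The best-practice limits for circulating miRNA specimens lend themselves to an automated pre-analytical check. A sketch, with an invented record layout and only three of the abstract's rules encoded:

```python
# Hedged sketch of encoding the abstract's best-practice limits for
# circulating miRNA specimens as an automated pre-analytical check.
# Field names and the record layout are illustrative, not a standard.

RULES = {
    "anticoagulant": lambda v: v == "EDTA",      # EDTA preferred; avoid heparin
    "hours_to_processing": lambda v: v <= 2,     # process within 2 h
    "hemolyzed": lambda v: v is False,           # hemolysis confounds profiles
}

def preanalytical_flags(sample):
    """Return the names of the rules a sample record violates."""
    return [name for name, ok in RULES.items() if not ok(sample[name])]

sample = {"anticoagulant": "heparin", "hours_to_processing": 1.5,
          "hemolyzed": False}
flags = preanalytical_flags(sample)   # -> ["anticoagulant"]
```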

  18. [Technical recommendations and best practice guidelines for May-Grünwald-Giemsa staining: literature review and insights from the quality assurance].

    PubMed

    Piaton, Eric; Fabre, Monique; Goubin-Versini, Isabelle; Bretz-Grenier, Marie-Françoise; Courtade-Saïdi, Monique; Vincent, Serge; Belleannée, Geneviève; Thivolet, Françoise; Boutonnat, Jean; Debaque, Hervé; Fleury-Feith, Jocelyne; Vielh, Philippe; Cochand-Priollet, Béatrix; Egelé, Caroline; Bellocq, Jean-Pierre; Michiels, Jean-François

    2015-08-01

    May-Grünwald-Giemsa (MGG) stain is a Romanowsky-type, polychromatic stain like those of Giemsa, Leishman and Wright. Apart from being the reference method in haematology, it has become a routine stain of diagnostic cytopathology for the study of air-dried preparations (lymph node imprints, centrifuged body fluids and fine needle aspirations). In the context of their actions promoting the principles of quality assurance in cytopathology, the French Association for Quality Assurance in Anatomic and Cytologic Pathology (AFAQAP) and the French Society of Clinical Cytology (SFCC) conducted a proficiency test on MGG stain in 2013. Results from the test, together with a review of the literature, allow the pre-analytical and analytical steps of MGG staining to be updated. Recommendations include rapid air-drying of cell preparations/imprints, fixation using either methanol or May-Grünwald alone for 3-10 minutes, and two-step staining: 50% May-Grünwald in buffer pH 6.8 v/v for 3-5 minutes, followed by 10% buffered Giemsa solution for 10-30 minutes, and running water for 1-3 minutes. Quality evaluation must be performed on red blood cells (RBCs) and leukocytes, not on tumour cells. Under correct pH conditions, RBCs must appear pink-orange (acidophilic) or buff-coloured, neither green nor blue. Leukocyte cytoplasm must be almost transparent, with clearly delineated granules. However, staining may vary somewhat, and testing is recommended for automated methods (slide stainers), which remain the standard for reproducibility. Though MGG stain remains the reference stain, Diff-Quik(®) stain can be used for the rapid evaluation of cell samples. Copyright © 2015 Elsevier Masson SAS. All rights reserved.
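    The recommended workflow can be captured as structured data, for example to configure a slide stainer. The steps and duration ranges below are those given in the recommendations; the encoding itself is illustrative.

```python
# The recommended MGG workflow from the abstract, encoded as an ordered list
# of (step, reagent, (min, max) duration in minutes). The encoding is a
# hypothetical configuration, not a vendor format.

MGG_PROTOCOL = [
    ("fixation", "methanol or May-Grunwald alone", (3, 10)),
    ("stain 1", "50% May-Grunwald in buffer pH 6.8 (v/v)", (3, 5)),
    ("stain 2", "10% buffered Giemsa solution", (10, 30)),
    ("rinse", "running water", (1, 3)),
]

def total_time_range(protocol):
    """Sum the minimum and maximum durations over all steps."""
    lo = sum(a for _, _, (a, _) in protocol)
    hi = sum(b for _, _, (_, b) in protocol)
    return lo, hi

lo, hi = total_time_range(MGG_PROTOCOL)   # -> (17, 48) minutes end to end
```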

  19. A review of blood sample handling and pre-processing for metabolomics studies.

    PubMed

    Hernandes, Vinicius Veri; Barbas, Coral; Dudzik, Danuta

    2017-09-01

    Metabolomics has been found to be applicable to a wide range of clinical studies, bringing a new era for improving clinical diagnostics, early disease detection, therapy prediction and treatment efficiency monitoring. A major challenge in metabolomics, particularly untargeted studies, is the extremely diverse and complex nature of biological specimens. Despite great advances in the field, there still exists a fundamental need to consider pre-analytical variability, which can introduce bias into the subsequent analytical process, decrease the reliability of the results and, moreover, confound final research outcomes. Many researchers are mainly focused on the instrumental aspects of the biomarker discovery process, and sample-related variables sometimes seem to be overlooked. To bridge the gap, critical information and standardized protocols regarding experimental design and sample handling and pre-processing are highly desired. Characterization of the range of variation among sample collection methods is necessary to prevent misinterpretation of results and to ensure that observed differences are not due to an experimental bias caused by inconsistencies in sample processing. Herein, a systematic discussion of pre-analytical variables affecting metabolomics studies based on blood derived samples is performed. Furthermore, we provide a set of recommendations concerning experimental design, collection, pre-processing procedures and storage conditions as a practical review that can guide and serve for the standardization of protocols and reduction of undesirable variation. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Digital forensics: an analytical crime scene procedure model (ACSPM).

    PubMed

    Bulbul, Halil Ibrahim; Yavuzcan, H Guclu; Ozel, Mesut

    2013-12-10

    In order to ensure that digital evidence is collected, preserved, examined, or transferred in a manner safeguarding the accuracy and reliability of the evidence, law enforcement and digital forensic units must establish and maintain an effective quality assurance system. The very first part of this system is standard operating procedures (SOPs) and/or models conforming to chain-of-custody requirements, which rely on the digital forensics "process-phase-procedure-task-subtask" sequence. An acceptable and thorough Digital Forensics (DF) process depends on sequential DF phases; each phase depends on sequential DF procedures, and each procedure in turn depends on tasks and subtasks. Numerous DF process models that define DF phases exist in the literature, but no DF model defining the phase-based sequential procedures for the crime scene has been identified. The analytical crime scene procedure model (ACSPM) that we suggest in this paper is intended to fill this gap. The proposed analytical procedure model for digital investigations at a crime scene is developed and defined for crime scene practitioners, with the main focus on crime scene digital forensic procedures rather than the whole digital investigation process and phases that end up in court. When reviewing the relevant literature and consulting with law enforcement agencies, we found only device-based charts specific to a particular device and/or more general approaches to digital evidence management models from crime scene to court. After analyzing the needs of law enforcement organizations and recognizing the absence of a crime scene digital investigation procedure model, we decided to inspect the relevant literature in an analytical way. The outcome of this inspection is the model explained here, which is intended to provide guidance for thorough and secure implementation of digital forensic procedures at a crime scene. 
    In digital forensic investigations each case is unique and needs special examination; it is not possible to cover every aspect of crime scene digital forensics, but the proposed procedure model is intended as a general guideline for practitioners. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  1. An Analytical Model for Assessing Stability of Pre-Existing Faults in Caprock Caused by Fluid Injection and Extraction in a Reservoir

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Bai, Bing; Li, Xiaochun; Liu, Mingze; Wu, Haiqing; Hu, Shaobin

    2016-07-01

    Induced seismicity and fault reactivation associated with fluid injection and depletion have been reported in hydrocarbon, geothermal, and waste fluid injection fields worldwide. Here, we establish an analytical model to assess fault reactivation surrounding a reservoir during fluid injection and extraction that considers the stress concentrations at the fault tips and the effects of fault length. In this model, induced stress analysis in a full-space under the plane strain condition is implemented based on Eshelby's theory of inclusions in terms of a homogeneous, isotropic, and poroelastic medium. The stress intensity factor concept in linear elastic fracture mechanics is adopted as an instability criterion for pre-existing faults in surrounding rocks. To characterize the fault reactivation caused by fluid injection and extraction, we define a new index, the "fault reactivation factor" η, which can be interpreted as an index of fault stability in response to fluid pressure changes per unit within a reservoir resulting from injection or extraction. The critical fluid pressure change within a reservoir is also determined by the superposition principle using the in situ stress surrounding a fault. Our parameter sensitivity analyses show that the fault reactivation tendency is strongly sensitive to fault location, fault length, fault dip angle, and Poisson's ratio of the surrounding rock. Our case study demonstrates that the proposed model focuses on the mechanical behavior of the whole fault, unlike conventional methodologies. The proposed method can be applied to engineering cases involving injection and depletion within a reservoir owing to its efficient computational implementation.
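    The paper's fault reactivation factor and stress-intensity criterion are not reproduced here; as a simplified stand-in, the sketch below applies the classical Coulomb criterion with effective stress to show how a pore-pressure change drives a fault toward slip, and how a critical pressure falls out. All stresses and the friction coefficient are illustrative.

```python
# Simplified stand-in for fault-stability screening: Coulomb criterion with
# effective normal stress (sigma_n - p). Not the paper's model; all values
# are illustrative.

def slip_margin(sigma_n, tau, p, mu=0.6, cohesion=0.0):
    """Coulomb margin in MPa; zero or negative means the criterion is met."""
    return mu * (sigma_n - p) + cohesion - tau

def critical_pressure(sigma_n, tau, mu=0.6, cohesion=0.0):
    """Pore pressure at which the slip margin reaches zero."""
    return sigma_n - (tau - cohesion) / mu

# hypothetical fault plane: normal stress 60 MPa, shear stress 30 MPa
p_crit = critical_pressure(60.0, 30.0)   # -> 10.0 MPa
```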

  2. Some comments on mapping from disease-specific to generic health-related quality-of-life scales.

    PubMed

    Palta, Mari

    2013-01-01

    An article by Lu et al. in this issue of Value in Health addresses the mapping of treatment or group differences in disease-specific measures (DSMs) of health-related quality of life onto differences in generic health-related quality-of-life scores, with special emphasis on how the mapping is affected by the reliability of the DSM. In the proposed mapping, a factor analytic model defines a conversion factor between the scores as the ratio of factor loadings. Hence, the mapping converts the true underlying scales and has desirable properties facilitating the alignment of instruments and understanding their relationship in a coherent manner. It is important to note, however, that when DSM means or differences in mean DSMs are estimated, their mapping is still that of a measurement-error-prone predictor, and the correct conversion coefficient is the true mapping multiplied by the reliability of the DSM in the relevant sample. In addition, the proposed strategy for estimating the factor analytic mapping in practice requires assumptions that may not hold. We discuss these assumptions and how they may be the reason we obtain disparate estimates of the mapping factor in an application of the proposed methods to groups of patients. Copyright © 2013 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
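    The attenuation point can be made concrete: if the factor-analytic conversion factor is the ratio of loadings, then the coefficient appropriate for observed (error-prone) DSM differences is that ratio multiplied by the DSM's reliability in the sample. A worked sketch with invented numbers:

```python
# Worked sketch of attenuation in DSM-to-generic mapping. Loadings and the
# reliability are invented for illustration.

def observed_mapping(loading_generic, loading_dsm, reliability):
    """Conversion coefficient usable with observed DSM differences."""
    true_ratio = loading_generic / loading_dsm   # mapping of true scales
    return true_ratio * reliability              # attenuated by reliability

# true conversion 0.40/0.80 = 0.50; DSM reliability 0.80 -> usable 0.40
coef = observed_mapping(0.40, 0.80, 0.80)
```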

  3. Algebraic approach to small-world network models

    NASA Astrophysics Data System (ADS)

    Rudolph-Lilith, Michelle; Muller, Lyle E.

    2014-01-01

    We introduce an analytic model for directed Watts-Strogatz small-world graphs and deduce an algebraic expression of its defining adjacency matrix. The latter is then used to calculate the small-world digraph's asymmetry index and clustering coefficient in an analytically exact fashion, valid nonasymptotically for all graph sizes. The proposed approach is general and can be applied to all algebraically well-defined graph-theoretical measures, thus allowing for an analytical investigation of finite-size small-world graphs.
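    The paper works with the adjacency matrix in analytic form; as a numeric cross-check of the undirected p = 0 limit of the Watts-Strogatz model, the underlying ring lattice has the known closed-form clustering coefficient C = 3(K − 2)/(4(K − 1)) for even degree K, which a brute-force computation over the adjacency matrix reproduces:

```python
def ring_lattice(n, k):
    """Adjacency matrix of an undirected ring lattice: each of the n nodes is
    linked to its k/2 nearest neighbours on each side (k must be even)."""
    adj = [[0] * n for _ in range(n)]
    for i in range(n):
        for d in range(1, k // 2 + 1):
            adj[i][(i + d) % n] = adj[(i + d) % n][i] = 1
    return adj

def clustering(adj):
    """Average local clustering coefficient: the fraction of each node's
    neighbour pairs that are themselves connected, averaged over all nodes."""
    n = len(adj)
    total = 0.0
    for i in range(n):
        nbrs = [j for j in range(n) if adj[i][j]]
        k = len(nbrs)
        links = sum(adj[u][v] for ui, u in enumerate(nbrs) for v in nbrs[ui + 1:])
        total += 2.0 * links / (k * (k - 1))
    return total / n
```

    For n = 20 and K = 4 this returns 3·2/(4·3) = 0.5 exactly, valid nonasymptotically in the same sense the abstract emphasizes; the directed, rewired case treated in the paper requires its algebraic machinery.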

  4. Mind the gaps - the epidemiology of poor-quality anti-malarials in the malarious world - analysis of the WorldWide Antimalarial Resistance Network database

    PubMed Central

    2014-01-01

    Background Poor quality medicines threaten the lives of millions of patients and are alarmingly common in many parts of the world. Nevertheless, the global extent of the problem remains unknown. Accurate estimates of the epidemiology of poor quality medicines are sparse and are influenced by sampling methodology and diverse chemical analysis techniques. In order to understand the existing data, the Antimalarial Quality Scientific Group at WWARN built a comprehensive, open-access, global database and linked Antimalarial Quality Surveyor, an online visualization tool. Analysis of the database is described here, and the limitations of the studies and data reported and their public health implications are discussed. Methods The database collates customized summaries of 251 published anti-malarial quality reports in English, French and Spanish by time and location since 1946. It also includes information on assays to determine quality, sampling and medicine regulation. Results No publicly available reports were found for 60.6% (63) of the 104 malaria-endemic countries. Out of 9,348 anti-malarials sampled, 30.1% (2,813) failed chemical/packaging quality tests, with 39.3% classified as falsified, 2.3% as substandard and 58.3% as poor quality without evidence available to categorize them as either substandard or falsified. Only 32.3% of the reports explicitly described their definitions of medicine quality, and just 9.1% (855) of the samples came from the 4.6% (six) of surveys that used random sampling techniques. Packaging analysis was described in only 21.5% of publications, and up to twenty wrong active ingredients were found in falsified anti-malarials. Conclusions There are severe, neglected problems with anti-malarial quality, but important caveats limit accurate estimation of the prevalence and distribution of poor quality anti-malarials. 
The lack of reports in many malaria-endemic areas, inadequate sampling techniques and inadequate chemical analytical methods and instrumental procedures emphasize the need to interpret medicine quality results with caution. The available evidence demonstrates the need for more investment to improve both sampling and analytical methodology and to achieve consensus in defining different types of poor quality medicines. PMID:24712972

  5. [Bone Cell Biology Assessed by Microscopic Approach. Assessment of bone quality using Raman and infrared spectroscopy].

    PubMed

    Suda, Hiromi Kimura

    2015-10-01

    Bone quality, which was defined as "the sum total of characteristics of the bone that influence the bone's resistance to fracture" at the National Institutes of Health (NIH) conference in 2001, contributes to bone strength in combination with bone mass. Bone mass is often measured as bone mineral density (BMD) and, consequently, can be quantified easily. On the other hand, bone quality is composed of several factors such as bone structure, bone matrix, calcification degree, microdamage, and bone turnover, and it is not easy to obtain data for the various factors. Therefore, it is difficult to quantify bone quality. We aim to develop new measurement methods for bone quality that make it possible to determine several factors associated with bone quality at the same time. Analytic methods based on Raman and FTIR spectroscopy have attracted considerable attention as they can provide a wealth of chemical information about hydroxyapatite and collagen, the main components of bone. Many studies on bone quality using Raman and FTIR imaging have been reported following the development of the two imaging systems. Thus, both Raman and FTIR imaging appear to be promising new bone morphometric techniques.

  6. Dementia training programmes for staff working in general hospital settings - a systematic review of the literature.

    PubMed

    Scerri, Anthony; Innes, Anthea; Scerri, Charles

    2017-08-01

    Although literature describing and evaluating training programmes in hospital settings has increased in recent years, there are no reviews that summarise these programmes. This review sought to address this by collecting the current evidence on dementia training programmes directed at staff working in general hospitals. Literature from five databases was searched, based on a number of inclusion criteria. The selected studies were summarised and data were extracted and compared using narrative synthesis based on a set of pre-defined categories. Methodological quality was assessed. Fourteen peer-reviewed studies were identified, with the majority being pre-test post-test investigations. No randomised controlled trials were found. Methodological quality was variable, with selection bias being the major limitation. There was great variability in the development and mode of delivery, although interdisciplinary, ward-based, tailor-made short sessions using experiential and active learning were the most utilised. The majority of the studies mainly evaluated learning, with few studies evaluating changes in staff behaviour/practices and patients' outcomes. This review indicates that high quality studies are needed that especially evaluate staff behaviours and patient outcomes and their sustainability over time. It also highlights measures that could be used to develop and deliver training programmes in hospital settings.

  7. Realizing the promise of reverse phase protein arrays for clinical, translational, and basic research: a workshop report: the RPPA (Reverse Phase Protein Array) society.

    PubMed

    Akbani, Rehan; Becker, Karl-Friedrich; Carragher, Neil; Goldstein, Ted; de Koning, Leanne; Korf, Ulrike; Liotta, Lance; Mills, Gordon B; Nishizuka, Satoshi S; Pawlak, Michael; Petricoin, Emanuel F; Pollard, Harvey B; Serrels, Bryan; Zhu, Jingchun

    2014-07-01

    Reverse phase protein array (RPPA) technology introduced a miniaturized "antigen-down" or "dot-blot" immunoassay suitable for quantifying the relative, semi-quantitative or quantitative (if a well-accepted reference standard exists) abundance of total protein levels and post-translational modifications across a variety of biological samples including cultured cells, tissues, and body fluids. The recent evolution of RPPA combined with more sophisticated sample handling, optical detection, quality control, and better quality affinity reagents provides exquisite sensitivity and high sample throughput at a reasonable cost per sample. This facilitates large-scale multiplex analysis of multiple post-translational markers across samples from in vitro, preclinical, or clinical samples. The technical power of RPPA is stimulating the application and widespread adoption of RPPA methods within academic, clinical, and industrial research laboratories. Advances in RPPA technology now offer scientists the opportunity to quantify protein analytes with high precision, sensitivity, throughput, and robustness. As a result, adopters of RPPA technology have recognized critical success factors for useful and maximum exploitation of RPPA technologies, including the following: preservation and optimization of pre-analytical sample quality, application of validated high-affinity and specific antibody (or other protein affinity) detection reagents, dedicated informatics solutions to ensure accurate and robust quantification of protein analytes, and quality-assured procedures and data analysis workflows compatible with application within regulated clinical environments. In 2011, 2012, and 2013, the first three Global RPPA workshops were held in the United States, Europe, and Japan, respectively. 
These workshops provided an opportunity for RPPA laboratories, vendors, and users to share and discuss results, the latest technology platforms, best practices, and future challenges and opportunities. The outcomes of the workshops included a number of key opportunities to advance the RPPA field and provide added benefit to existing and future participants in the RPPA research community. The purpose of this report is to share and disseminate, as a community, current knowledge and future directions of the RPPA technology. © 2014 by The American Society for Biochemistry and Molecular Biology, Inc.

  8. [Effect strength variation in the single group pre-post study design: a critical review].

    PubMed

    Maier-Riehle, B; Zwingmann, C

    2000-08-01

    In Germany, studies in rehabilitation research--in particular evaluation studies and examinations of quality of outcome--have so far mostly been executed according to the uncontrolled one-group pre-post design. Assessment of outcome is usually made by comparing the pre- and post-treatment means of the outcome variables. The pre-post differences are checked and, in case of significance, the results are increasingly presented in the form of effect sizes. For this reason, this contribution presents different effect size indices used for the one-group pre-post design--in spite of fundamental doubts about that design due to its limited internal validity. The numerator of all effect size indices for the one-group pre-post design is defined as the difference between the pre- and post-treatment means, whereas there are different possibilities and recommendations with regard to the denominator, i.e. the standard deviation that serves as the basis for standardizing the difference of the means. The most commonly used are standardization by the standard deviation of the pre-treatment scores, standardization by the pooled standard deviation of the pre- and post-treatment scores, and standardization by the standard deviation of the pre-post differences. Two examples demonstrate that the different modes of calculating effect size indices in the one-group pre-post design may lead to very different outcome patterns. Additionally, it is pointed out that effect sizes from the uncontrolled one-group pre-post design generally tend to be higher than effect sizes from studies conducted with control groups. Finally, the pros and cons of the different effect size indices are discussed and recommendations are given.
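    The three denominators discussed above can be put side by side in a short sketch (the scores are hypothetical); the same mean change yields markedly different effect sizes depending on which standard deviation is chosen:

```python
import statistics

def pre_post_effect_sizes(pre, post):
    """Three common standardizations for the one-group pre-post design.

    Returns (d_pre, d_pooled, d_change): the same mean change divided by the
    pre-treatment SD, by the pooled pre/post SD, and by the SD of the paired
    differences, respectively.
    """
    mean_change = statistics.fmean(post) - statistics.fmean(pre)
    sd_pre = statistics.stdev(pre)
    sd_pooled = ((statistics.variance(pre) + statistics.variance(post)) / 2) ** 0.5
    sd_diff = statistics.stdev([b - a for a, b in zip(pre, post)])
    return (mean_change / sd_pre, mean_change / sd_pooled, mean_change / sd_diff)
```

    For example, paired scores pre = [10, 12, 14, 16] and post = [15, 18, 21, 24] give roughly d_pre = 2.52, d_pooled = 1.97 and d_change = 5.03 for one and the same mean change of 6.5, illustrating the very different outcome patterns noted in the abstract.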

  9. Evaluation of Cobas Integra 800 under simulated routine conditions in six laboratories.

    PubMed

    Redondo, Francisco L; Bermudez, Pilar; Cocco, Claudio; Colella, Francesca; Graziani, Maria Stella; Fiehn, Walter; Hierla, Thomas; Lemoël, Gisèle; Belliard, AnneMarie; Manene, Dieudonne; Meziani, Mourad; Liebel, Maryann; McQueen, Matthew J; Stockmann, Wolfgang

    2003-03-01

    The new selective access analyser Cobas Integra 800 from Roche Diagnostics was evaluated in an international multicentre study at six sites. Routine simulation experiments showed good performance and full functionality of the instrument, and provocation of anomalous situations generated no problems. The new features of the Cobas Integra 800, namely clot detection and dispensing control, worked according to specifications. The imprecision of the Cobas Integra 800 fulfilled the proposed quality specifications regarding imprecision of analytical systems for clinical chemistry, with few exceptions. Claims for linearity, drift, and carry-over were all within the defined specifications, except urea linearity. Interference exists in some cases, as could be expected given the chemistries applied. Accuracy met the proposed quality specifications, except in some special cases. Method comparisons with the Cobas Integra 700 showed good agreement; comparisons with other analysis systems yielded explicable deviations in several cases. Practicability of the Cobas Integra 800 met or exceeded the requirements for more than 95% of all attributes rated. The strong points of the new analysis system were reagent handling, long stability of calibration curves, high number of tests on board, compatibility of the sample carrier with other Roche systems, and the sample integrity check for more reliable analytical results. The improved workflow offered by the 5-position rack and STAT handling on the Cobas Integra 800 makes the instrument attractive for further consolidation in the medium-sized laboratory, for dedicated use for special analytes, and/or as back-up in the large routine laboratory.

  10. A Galerkin Approach to Define Measured Terrain Surfaces with Analytic Basis Vectors to Produce a Compact Representation

    DTIC Science & Technology

    2010-11-01

    defined herein as terrain whose surface deformation due to a single vehicle traversing the surface is negligible, such as paved roads (both asphalt ...ground vehicle reliability predictions. Current application of this work is limited to the analysis of U.S. Highways, comprised of both asphalt and...Highways that are consistent between asphalt and concrete roads b. The principal terrain characteristics are defined with analytic basis vectors

  11. Patient quality of life following induction of oral immunotherapy for food allergy.

    PubMed

    Epstein Rigbi, Na'ama; Katz, Yitzhak; Goldberg, Michael R; Levy, Michael B; Nachshon, Liat; Elizur, Arnon

    2016-05-01

    Patient quality of life improves following successful completion of oral immunotherapy (OIT), but the process itself might have undesirable effects. We aimed to evaluate patient quality of life following OIT initial induction. The Hebrew version of the Food Allergy Quality of Life Questionnaire-Parental Form (FAQLQ-PF) was validated and administered to the parents of children following the first week of OIT for food allergy (n = 119). Patient demographics and clinical history as well as the course of initial induction week were reviewed. Pre-OIT severity of food allergy, defined as severity of reactions due to accidental exposure to the allergenic food (anaphylactic reactions, p = 0.017; epinephrine use, p = 0.049; emergency room referrals p = 0.003; and hospital admissions, p = 0.015) and a lower number of tolerated doses during initial induction, reflective of a lower maximal tolerated dose for the different allergens (p = 0.011) were associated with worse total FAQLQ-PF scores. The number of tolerated doses during induction and pre-OIT emergency room referrals remained significantly associated with worse total score of the FAQLQ-PF on multivariate analysis (p = 0.016 and p = 0.005, respectively). The correlation between the number of tolerated doses and quality of life scores was moderate-strong primarily in children aged 6-12 years (Total score, r = -0.41, p = 0.001; Emotional Impact r = -0.42, p = 0.001; Food Anxiety, r = -0.38, p = 0.002; Social and Dietary Limitations, r = -0.33, p = 0.009). Pre-OIT reaction severity affects quality of life in both preschool and school-aged food-allergic children. In contrast, a lower maximal tolerated dose during OIT induction is associated with worse indices of quality of life primarily in children aged 6-12 years. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  12. Changes in Lithuanian Pre-School and Pre-Primary Education Quality over the Last Decade

    ERIC Educational Resources Information Center

    Monkeviciene, Ona; Stankeviciene, Kristina

    2011-01-01

    Over the last decade, the changes in Lithuanian pre-school and pre-primary education have been predetermined by changes in paradigms of children's education and strategic education documents that provided guidelines for high quality children's (self-)education, an increasing attention of society to the quality of children's education, training…

  13. Development and implementation of a quality improvement curriculum for child neurology residents: lessons learned.

    PubMed

    Maski, Kiran P; Loddenkemper, Tobias; An, Sookee; Allred, Elizabeth N; Urion, David K; Leviton, Alan

    2014-05-01

    Quality improvement is a major component of the Accreditation Council for Graduate Medical Education core competencies required of all medical trainees. Currently, neither the Neurology Residency Review Committee nor the Accreditation Council for Graduate Medical Education defines the process by which this competency should be taught and assessed. We developed a quality improvement curriculum that provides mentorship for resident quality improvement projects and is clinically relevant to pediatric neurologists. Before and after implementation of the quality improvement curriculum, a 14-item survey assessed resident comfort with quality improvement project skills and attitudes about implementation of quality improvement in clinical practice using a 5-point Likert scale. We used the Kruskal-Wallis and Fisher exact tests to evaluate pre-to-post changes. Residents gained confidence in their abilities to identify measures (P = 0.02) and perform root cause analysis (P = 0.02). Overall, 73% of residents were satisfied or very satisfied with the quality improvement curriculum. Our child neurology quality improvement curriculum was well accepted by trainees. We report the details of this curriculum and its impact on residents and discuss its potential to meet the Accreditation Council for Graduate Medical Education's Next Accreditation System requirements. Published by Elsevier Inc.

  14. Meeting report: applied biopharmaceutics and quality by design for dissolution/release specification setting: product quality for patient benefit.

    PubMed

    Selen, Arzu; Cruañes, Maria T; Müllertz, Anette; Dickinson, Paul A; Cook, Jack A; Polli, James E; Kesisoglou, Filippos; Crison, John; Johnson, Kevin C; Muirhead, Gordon T; Schofield, Timothy; Tsong, Yi

    2010-09-01

    A biopharmaceutics and Quality by Design (QbD) conference was held on June 10-12, 2009 in Rockville, Maryland, USA to provide a forum and identify approaches for enhancing product quality for patient benefit. Presentations concerned the current biopharmaceutical toolbox (i.e., in vitro, in silico, pre-clinical, in vivo, and statistical approaches), as well as case studies, and reflections on new paradigms. Plenary and breakout session discussions evaluated the current state and envisioned a future state that more effectively integrates QbD and biopharmaceutics. Breakout groups discussed the following four topics: Integrating Biopharmaceutical Assessment into the QbD Paradigm, Predictive Statistical Tools, Predictive Mechanistic Tools, and Predictive Analytical Tools. Nine priority areas, further described in this report, were identified for advancing the integration of biopharmaceutics and supporting a more fundamentally based, integrated approach to setting product dissolution/release acceptance criteria. Collaboration among a broad range of disciplines and the fostering of a knowledge-sharing environment that places the patient's needs at the focus of drug development, consistent with the science- and risk-based spirit of QbD, were identified as key components of the path forward.

  15. The Efficacy of Problem-Based Learning in an Analytical Laboratory Course for Pre-Service Chemistry Teachers

    ERIC Educational Resources Information Center

    Yoon, Heojeong; Woo, Ae Ja; Treagust, David; Chandrasegaran, A. L.

    2014-01-01

    The efficacy of problem-based learning (PBL) in an analytical chemistry laboratory course was studied using a programme that was designed and implemented with 20 students in a treatment group over 10 weeks. Data from 26 students in a traditional analytical chemistry laboratory course were used for comparison. Differences in the creative thinking…

  16. Guidelines for May-Grünwald-Giemsa staining in haematology and non-gynaecological cytopathology: recommendations of the French Society of Clinical Cytology (SFCC) and of the French Association for Quality Assurance in Anatomic and Cytologic Pathology (AFAQAP).

    PubMed

    Piaton, E; Fabre, M; Goubin-Versini, I; Bretz-Grenier, M-F; Courtade-Saïdi, M; Vincent, S; Belleannée, G; Thivolet, F; Boutonnat, J; Debaque, H; Fleury-Feith, J; Vielh, P; Egelé, C; Bellocq, J-P; Michiels, J-F; Cochand-Priollet, B

    2016-10-01

    Since the guidelines of the International Committee for Standardisation in Haematology (ICSH) in 1984 and those of the European Committee for External Quality Assessment Programmes in Laboratory Medicine (EQALM) in 2004, no leading organisation has published technical recommendations for the preparation of air-dried cytological specimens using May-Grünwald-Giemsa (MGG) staining. Literature data were retrieved using reference books, baseline-published studies, articles extracted from PubMed/Medline and Google Scholar, and online-available industry datasheets. The present review addresses all pre-analytical issues concerning the use of Romanowsky's stains (including MGG) in haematology and non-gynaecological cytopathology. It aims at serving as actualised, best practice recommendations for the proper handling of air-dried cytological specimens. It, therefore, appears complementary to the staining criteria of the non-gynaecological diagnostic cytology handbook edited by the United Kingdom National External Quality Assessment Service (UK-NEQAS) in February 2015. © 2016 John Wiley & Sons Ltd.

  17. Heterogeneous postsurgical data analytics for predictive modeling of mortality risks in intensive care units.

    PubMed

    Yun Chen; Hui Yang

    2014-01-01

    The rapid advancements of biomedical instrumentation and healthcare technology have resulted in data-rich environments in hospitals. However, the meaningful information extracted from rich datasets is limited. There is a dire need to go beyond current medical practices and develop data-driven methods and tools that will enable and help (i) the handling of big data, (ii) the extraction of data-driven knowledge, and (iii) the exploitation of acquired knowledge for optimizing clinical decisions. This study focuses on the prediction of mortality rates in Intensive Care Units (ICU) using patient-specific healthcare recordings. It is worth mentioning that postsurgical monitoring in the ICU leads to massive datasets with unique properties, e.g., variable heterogeneity, patient heterogeneity, and time asynchronization. To cope with the challenges in ICU datasets, we developed a postsurgical decision support system with a series of analytical tools, including data categorization, data pre-processing, feature extraction, feature selection, and predictive modeling. Experimental results show that the proposed data-driven methodology outperforms traditional approaches and yields better results based on the evaluation of real-world ICU data from 4000 subjects in the database. This research shows great potential for the use of data-driven analytics to improve the quality of healthcare services.

  18. Quality specifications for articles of botanical origin from the United States Pharmacopeia.

    PubMed

    Ma, Cuiying; Oketch-Rabah, Hellen; Kim, Nam-Cheol; Monagas, Maria; Bzhelyansky, Anton; Sarma, Nandakumara; Giancaspro, Gabriel

    2018-06-01

    In order to define appropriate quality of botanical dietary supplements, botanical drugs, and herbal medicines, the United States Pharmacopeia (USP) and the Herbal Medicines Compendium (HMC) contain science-based quality standards that include multiple interrelated tests to provide a full quality characterization for each article in terms of its identity, purity, and content. To provide a comprehensive description of the pharmacopeial tests and requirements for articles of botanical origin in the aforementioned compendia. Selective chromatographic procedures, such as high-performance liquid chromatography (HPLC) and high-performance thin-layer chromatography (HPTLC), are used as Identification tests in pharmacopeial monographs to detect species substitution or other confounders. HPLC quantitative tests are typically used to determine the content of key constituents, i.e., the total or individual amount of plant secondary metabolites that are considered bioactive constituents or analytical marker compounds. Purity specifications are typically set to limit the content of contaminants such as toxic elements, pesticides, and fungal toxins. Additional requirements highlight the importance of naming, definition, use of reference materials, and packaging/storage conditions. Technical requirements for each section of the monographs were illustrated with specific examples. Tests were performed on authentic samples using pharmacopeial reference standards. The chromatographic analytical procedures were validated to provide characteristic profiles for the identity and/or accurate determination of the content of quality markers. The multiple tests included in each monograph complement each other to provide an appropriate pharmacopeial quality characterization for the botanicals used as herbal medicines and dietary supplements. 
The monographs provide detailed specifications for identity, content of bioactive constituents or quality markers, and limits of contaminants, adulterants, and potentially toxic substances. Additional requirements such as labeling and packaging further contribute to preserve the quality of these products. Compliance with pharmacopeial specifications should be required to ensure the reliability of botanical articles used for health care purposes. Copyright © 2018. Published by Elsevier GmbH.

  19. An introduction to statistical process control in research proteomics.

    PubMed

    Bramwell, David

    2013-12-16

    Statistical process control is a well-established and respected method that provides a general-purpose, consistent framework for monitoring and improving the quality of a process. It is routinely used in many industries where the quality of final products is critical and is often required in clinical diagnostic laboratories [1,2]. To date, the methodology has been little utilised in research proteomics. It has been shown to be capable of delivering quantitative QC procedures for qualitative clinical assays [3], making it an ideal methodology to apply to this area of biological research. We introduce statistical process control as an objective strategy for quality control and show how it could be used to benefit proteomics researchers and enhance the quality of the results they generate. We demonstrate that rules which provide basic quality control are easy to derive and implement and could have a major impact on data quality for many studies. Statistical process control is a powerful tool for investigating and improving proteomics research work-flows. The process of characterising measurement systems and defining control rules forces the exploration of key questions that can lead to significant improvements in performance. This work asserts that QC is essential to proteomics discovery experiments. Every experimenter must know the current capabilities of their measurement system and have an objective means for tracking and ensuring that performance. Proteomic analysis work-flows are complicated and multi-variate. QC is critical for clinical chemistry measurements, and huge strides have been made in ensuring the quality and validity of results in clinical biochemistry labs. This work introduces some of these QC concepts and works to bridge their use from single-analyte QC to applications in multi-analyte systems. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. Copyright © 2013 The Author. 
Published by Elsevier B.V. All rights reserved.
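    As a minimal sketch of the control-rule idea (not the article's own procedure), a Shewhart individuals chart derives limits from an in-control reference set, e.g. a QC metric tracked once per instrument run, and flags runs that fall outside them; all values here are hypothetical:

```python
import statistics

def shewhart_limits(reference_runs, sigma_mult=3.0):
    """Control limits for a Shewhart individuals chart, derived from a set of
    in-control reference observations.  Returns (lower, centre, upper) =
    mean -/+ sigma_mult standard deviations."""
    centre = statistics.fmean(reference_runs)
    sd = statistics.stdev(reference_runs)
    return centre - sigma_mult * sd, centre, centre + sigma_mult * sd

def out_of_control(values, lower, upper):
    """Indices of new observations falling outside the control limits,
    i.e. runs that should trigger investigation before results are trusted."""
    return [i for i, v in enumerate(values) if not lower <= v <= upper]
```

    Characterising the measurement system first (the reference set) and only then judging new runs against it is exactly the discipline the article argues proteomics work-flows need; multi-analyte extensions track one such chart per monitored metric.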

  20. Flow cytometry for feline lymphoma: a retrospective study regarding pre-analytical factors possibly affecting the quality of samples.

    PubMed

    Martini, Valeria; Bernardi, Serena; Marelli, Priscilla; Cozzi, Marzia; Comazzi, Stefano

    2018-06-01

    Objectives Flow cytometry (FC) is becoming increasingly popular among veterinary oncologists for the diagnosis of lymphoma or leukaemia. It is accurate, fast and minimally invasive. Several studies of FC have been carried out in canine oncology and applied with great results, whereas there is limited knowledge and use of this technique in feline patients. This is mainly owing to the high prevalence of intra-abdominal lymphomas in this species and the difficulty associated with the diagnostic procedures needed to collect the sample. The purpose of the present study is to investigate whether any pre-analytical factor might affect the quality of suspected feline lymphoma samples for FC analysis. Methods Ninety-seven consecutive samples of suspected feline lymphoma were retrospectively selected from the authors' institution's FC database. The referring veterinarians were contacted and interviewed about several different variables, including signalment, appearance of the lesion, features of the sampling procedure and the experience of veterinarians performing the sampling. Statistical analyses were performed to assess the possible influence of these variables on the cellularity of the samples and the likelihood of it being finally processed for FC. Results Sample cellularity is a major factor in the likelihood of the sample being processed. Moreover, sample cellularity was significantly influenced by the needle size, with 21 G needles providing the highest cellularity. Notably, the sample cellularity and the likelihood of being processed did not vary between peripheral and intra-abdominal lesions. Approximately half of the cats required pharmacological restraint. Side effects were reported in one case only (transient swelling after peripheral lymph node sampling). Conclusions and relevance FC can be safely applied to cases of suspected feline lymphomas, including intra-abdominal lesions. A 21 G needle should be preferred for sampling. 
This study provides the basis for the increased use of this minimally invasive, fast and cost-effective technique in feline medicine.

  1. Techniques used for the screening of hemoglobin levels in blood donors: current insights and future directions.

    PubMed

    Chaudhary, Rajendra; Dubey, Anju; Sonker, Atul

    2017-01-01

    Blood donor hemoglobin (Hb) estimation is an important donation test that is performed prior to blood donation. It serves the dual purpose of protecting donors' health against anemia and ensuring good quality of blood components, which has an implication for recipients' health. Diverse cutoff criteria have been defined the world over depending on population characteristics; however, no testing methodology and sample requirement have been specified for Hb screening. Besides the technique, there are several physiological and methodological factors that affect the accuracy and reliability of Hb estimation. These include the anatomical source of the blood sample, the posture of the donor, the timing of the sample and several other biological factors. The qualitative copper sulfate gravimetric method has been the archaic time-tested method that is still used in resource-constrained settings. Portable hemoglobinometers are modern quantitative devices that have been further modified to reagent-free cuvettes. Furthermore, noninvasive spectrophotometry was introduced, mitigating pain to the blood donor and eliminating the risk of infection. Notwithstanding a tremendous evolution in terms of ease of operation, accuracy, mobility, rapidity and cost, a component of inherent variability persists, which may partly be attributed to pre-analytical variables. Hence, blood centers should pay due attention to validation of test methodology, competency of operating staff and regular proficiency testing of the outputs. In this article, we have reviewed various regulatory guidelines, described the variables that affect the measurements and compared the validated technologies for Hb screening of blood donors, along with enumeration of their merits and limitations.

  2. The cost-effectiveness of telestroke in the Pacific Northwest region of the USA.

    PubMed

    Nelson, Richard E; Okon, Nicholas; Lesko, Alexandra C; Majersik, Jennifer J; Bhatt, Archit; Baraban, Elizabeth

    2016-10-01

Using real-world data from the Providence Oregon Telestroke Network, we examined the cost-effectiveness of telestroke from both the spoke and hub perspectives, by level of financial responsibility for these costs and by patient stroke severity. We constructed a decision analytic model using patient-level clinical and financial data from before and after telestroke implementation. Effectiveness was measured in quality-adjusted life years (QALYs) and was combined with cost-per-patient outcomes to calculate incremental cost-effectiveness ratios (ICERs). Outcomes were generated (a) overall; (b) by stroke severity, via the National Institutes of Health Stroke Scale (NIHSS) at time of arrival, defined as low (<5), medium (5-14) and high (≥15); and (c) by percentage of implementation costs paid by spokes (0%, 50%, 100%). Data for 864 patients, 98 pre- and 766 post-implementation, were used to parameterize our model. From the spoke perspective, telestroke had ICERs of US$1322/QALY, US$25,991/QALY and US$50,687/QALY when responsible for 0%, 50% and 100% of these costs, respectively. Overall, the ICER ranged from US$22,363/QALY to US$71,703/QALY from the hub perspective. Our results support previous models showing good value overall. However, costs and ICERs varied by stroke severity, with telestroke being most cost-effective for severe strokes. Telestroke was least cost-effective for the spokes if spokes paid for more than half of implementation costs. © The Author(s) 2015.
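The ICER calculation that drives this abstract's comparisons can be sketched in a few lines. The function below is a minimal illustration; the per-patient numbers in the example are hypothetical, not figures from the study.

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per
    additional quality-adjusted life year (QALY) gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical per-patient figures: telestroke adds US$2,000 in cost
# and 0.05 QALYs relative to usual care -> roughly US$40,000/QALY.
ratio = icer(12_000, 10_000, 1.55, 1.50)
```

An intervention is then judged against a willingness-to-pay threshold (often cited around US$50,000-100,000 per QALY in the US literature).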

  3. Defining and Measuring Engagement and Learning in Science: Conceptual, Theoretical, Methodological, and Analytical Issues

    ERIC Educational Resources Information Center

    Azevedo, Roger

    2015-01-01

    Engagement is one of the most widely misused and overgeneralized constructs found in the educational, learning, instructional, and psychological sciences. The articles in this special issue represent a wide range of traditions and highlight several key conceptual, theoretical, methodological, and analytical issues related to defining and measuring…

  4. Reproducibility studies for experimental epitope detection in macrophages (EDIM).

    PubMed

    Japink, Dennis; Nap, Marius; Sosef, Meindert N; Nelemans, Patty J; Coy, Johannes F; Beets, Geerard; von Meyenfeldt, Maarten F; Leers, Math P G

    2014-05-01

We have recently described epitope detection in macrophages (EDIM) by flow cytometry. This is a promising tool for the diagnosis and follow-up of malignancies. However, biological and technical validation is warranted before clinical applicability can be explored. The pre-analytic and analytic phases were investigated. Five different aspects were assessed: blood sample stability, intra-individual variability in healthy persons, intra-assay variation, inter-assay variation and assay transferability. The post-analytic phase was already partly standardized and described in an earlier study. The outcomes in the pre-analytic phase showed that samples are stable for 24 h after venipuncture. Biological variation over time was similar to that of serum tumor marker assays; each patient has a baseline value. Intra-assay variation showed good reproducibility, while inter-assay variation showed reproducibility similar to that of established serum tumor marker assays. Furthermore, the assay showed excellent transferability between analyzers. Under optimal analytic conditions the EDIM method is technically stable, reproducible and transferable. Biological variation over time needs further assessment in future work. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. Pre-admission factors and utilization of tutoring services in health professions educational programs.

    PubMed

    Olivares-Urueta, Mayra; Williamson, Jon W

    2013-01-01

Pre-admission factors tend to serve as indicators of student success in health professions educational programs, but less is known about the effects that academic assistance programs have on student success. This study sought to determine whether specific pre-admission factors could help to identify students who may require academic support during their health professions education. This retrospective analysis aimed to identify differences in pre-admission variables between those students requiring tutoring and a matched sample of students who did not require tutoring. One-way ANOVA was used to assess differences in the dependent variables (age, cumulative GPA (cGPA), science GPA (sGPA), verbal graduate record examination (GRE) score, quantitative GRE score, analytical GRE score, combined GRE score, community college hours, average credit hours per semester, and highest semester credit hour load) across three groups of students who received no tutoring (NT, 0 hrs), some tutoring (ST, <8 hrs), and more tutoring (MT, >8 hrs). Total GRE and average semester hours differentiated NT from ST from MT (p<0.05). A linear regression model with these pre-admission factors found only four of the independent variables to be significant (r2=0.41; p<0.05) in predicting hours of tutoring: quantitative GRE, sGPA, cGPA and average semester hours taken. The combination of lower GRE scores and a lighter average semester course load was most predictive of the need for academic assistance as defined by hours of tutoring. While the value of the GRE in admissions processes is generally accepted, the average semester hour load in college can also provide important information regarding academic preparation and the need for tutoring services.

  6. Improved quality control of [18F]fluoromethylcholine.

    PubMed

    Nader, Michael; Reindl, Dietmar; Eichinger, Reinhard; Beheshti, Mohsen; Langsteger, Werner

    2011-11-01

With respect to the broad application of [(18)F-methyl]fluorocholine (FCH), there is a need for a safe, but also efficient and convenient, way to perform routine quality control of FCH. Therefore, a GC method was to be developed and validated that allows the simultaneous quantitation of all chemical impurities and residual solvents, such as acetonitrile, ethanol, dibromomethane and N,N-dimethylaminoethanol. Analytical GC was performed with a GC capillary column Optima 1701 (50 m×0.32 mm) and a pre-column deactivated capillary column phenyl-Sil (10 m×0.32 mm) in line with a flame ionization detector (FID). The validation included the following tests: specificity, range, accuracy, linearity, precision, limit of detection (LOD) and limit of quantitation (LOQ) for all listed substances. The described GC method has been successfully used for the quantitation of the listed chemical impurities. The specificity of the GC separation was proven by demonstrating that the appearing peaks are completely separated from each other and that a resolution R≥1.5 could be achieved for the separation of the peaks. The specified range confirmed that the analytical procedure provides an acceptable degree of linearity, accuracy and precision. For each substance, a range from 2% to 120% of the specification limit could be demonstrated. The corresponding LOD values were determined and were much lower than the specification limits. An efficient and convenient GC method for the quality control of FCH has been developed and validated which meets all acceptance criteria in terms of linearity, specificity, precision, accuracy, LOD and LOQ. Copyright © 2011 Elsevier Inc. All rights reserved.

  7. [Maturity Levels of Quality and Risk Management at the University Hospital Schleswig-Holstein].

    PubMed

    Jussli-Melchers, Jill; Hilbert, Carsten; Jahnke, Iris; Wehkamp, Kai; Rogge, Annette; Freitag-Wolf, Sandra; Kahla-Witzsch, Heike A; Scholz, Jens; Petzina, Rainer

    2018-05-16

Quality and risk management in hospitals are required not only by law but also for optimal patient-centered and process-optimized patient care. To evaluate the maturity levels of quality and risk management at the University Hospital Schleswig-Holstein (UKSH), a structured analytical tool was developed for easy and efficient application. Four quality management criteria - quality assurance (QS), critical incident reporting system (CIRS), complaint management (BM) and process management (PM) - were evaluated with a structured questionnaire. Self-assessment and external assessment were performed to classify the maturity levels at the UKSH (Kiel and Lübeck locations). Every quality item was graded into four categories from "A" (fully implemented) to "D" (not implemented at all). First, an external assessment was initiated by the head of the department of quality and risk management. Thereafter, a self-assessment was performed by 46 clinical units of the UKSH. Discrepancies were resolved in a collegial dialogue. Based on these data, overall maturity levels were obtained for every clinical unit. The overall maturity level "A" was reached by three out of 46 (6.5%) clinical units. No unit was graded with maturity level "D". 50% of all units reached level "B" and 43.5% level "C". The distribution of the four quality criteria revealed good implementation of complaint management (maturity levels "A" and "B" in 78.3%), whereas the levels for CIRS were "C" and "D" in 73.9%. Quality assurance and process management showed quite similar distributions for maturity levels "B" and "C" (87% QS; 91% PM). The structured analytical tool revealed the maturity levels of 46 clinical units of the UKSH and defined the maturity levels of four relevant quality criteria (QS, CIRS, BM, PM). As a consequence, extensive procedures were implemented to raise the standard of quality and risk management. 
In the future, maturity levels will be re-evaluated every two years. This qualitative maturity level model enables precise statements about the presence, manifestation and development of quality and risk management in a simple and efficient way. © Georg Thieme Verlag KG Stuttgart · New York.

  8. Analytical evaluation of BEA zeolite for the pre-concentration of polycyclic aromatic hydrocarbons and their subsequent chromatographic analysis in water samples.

    PubMed

    Wilson, Walter B; Costa, Andréia A; Wang, Huiyong; Dias, José A; Dias, Sílvia C L; Campiglia, Andres D

    2012-07-06

The analytical performance of BEA - a commercial zeolite - is evaluated for the pre-concentration of fifteen Environmental Protection Agency polycyclic aromatic hydrocarbons and their subsequent HPLC analysis in tap and lake water samples. The pre-concentration factors obtained with BEA have led to a method with excellent analytical figures of merit. One-milliliter aliquots were sufficient to obtain excellent precision of measurements at the parts-per-trillion concentration level, with relative standard deviations varying from 4.1% (dibenzo[a,h]anthracene) to 13.4% (pyrene). The limits of detection were excellent as well and varied between 1.1 ng L(-1) (anthracene) and 49.9 ng L(-1) (indeno[1,2,3-cd]pyrene). The recovery values of all the studied compounds meet the criterion for regulated polycyclic aromatic hydrocarbons, which mandates relative standard deviations equal to or lower than 25%. The small volume of organic solvents (100 μL per sample) and the small amount of BEA (2 mg per sample) make sample pre-concentration environmentally friendly and cost-effective. The extraction procedure is well suited to numerous samples, as the small working volume (1 mL) facilitates the implementation of simultaneous sample extraction. These are attractive features when routine monitoring of numerous samples is contemplated. Copyright © 2012 Elsevier B.V. All rights reserved.
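The pre-concentration factor central to this kind of SPE method is, in its simplest form, the sample-to-eluate volume ratio scaled by the extraction recovery. A minimal sketch with hypothetical numbers (not values from the paper):

```python
def preconcentration_factor(sample_volume_ml, eluate_volume_ml, recovery):
    """Effective enrichment of an extraction step: the
    sample/eluate volume ratio scaled by fractional recovery."""
    return (sample_volume_ml / eluate_volume_ml) * recovery

# Hypothetical: 1 mL water sample eluted into 100 uL at 90% recovery
# gives roughly 9-fold enrichment of the analyte.
factor = preconcentration_factor(1.0, 0.1, 0.90)
```

This is why even a 1 mL working volume can push detection down toward the ng L(-1) range when the eluate volume is kept small.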

  9. Multivariate Protein Signatures of Pre-Clinical Alzheimer's Disease in the Alzheimer's Disease Neuroimaging Initiative (ADNI) Plasma Proteome Dataset

    PubMed Central

    Johnstone, Daniel; Milward, Elizabeth A.; Berretta, Regina; Moscato, Pablo

    2012-01-01

Background Recent Alzheimer's disease (AD) research has focused on finding biomarkers to identify disease at the pre-clinical stage of mild cognitive impairment (MCI), allowing treatment to be initiated before irreversible damage occurs. Many studies have examined brain imaging or cerebrospinal fluid but there is also growing interest in blood biomarkers. The Alzheimer's Disease Neuroimaging Initiative (ADNI) has generated data on 190 plasma analytes in 566 individuals with MCI, AD or normal cognition. We conducted independent analyses of this dataset to identify plasma protein signatures predicting pre-clinical AD. Methods and Findings We focused on identifying signatures that discriminate cognitively normal controls (n = 54) from individuals with MCI who subsequently progress to AD (n = 163). Based on p value, apolipoprotein E (APOE) showed the strongest difference between these groups (p = 2.3×10(-13)). We applied a multivariate approach based on combinatorial optimization ((α,β)-k Feature Set Selection), which retains information about individual participants and maintains the context of interrelationships between different analytes, to identify the optimal set of analytes (signature) to discriminate these two groups. We identified 11-analyte signatures achieving values of sensitivity and specificity between 65% and 86% for both MCI and AD groups, depending on whether APOE was included and other factors. Classification accuracy was improved by considering “meta-features,” representing the difference in relative abundance of two analytes, with an 8-meta-feature signature consistently achieving sensitivity and specificity both over 85%. Generating signatures based on longitudinal rather than cross-sectional data further improved classification accuracy, returning sensitivities and specificities of approximately 90%. 
Conclusions Applying these novel analysis approaches to the powerful and well-characterized ADNI dataset has identified sets of plasma biomarkers for pre-clinical AD. While studies of independent test sets are required to validate the signatures, these analyses provide a starting point for developing a cost-effective and minimally invasive test capable of diagnosing AD in its pre-clinical stages. PMID:22485168
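The "meta-feature" construction described above (the difference in relative abundance of two analytes) can be sketched as follows. The analyte names and values here are hypothetical, and the study's combinatorial feature-selection step is not reproduced; this only shows how the candidate meta-features are generated.

```python
from itertools import combinations

def meta_features(sample):
    """Given one participant's analyte abundances (dict: name -> value),
    return every pairwise difference as a named meta-feature."""
    return {f"{a}-{b}": sample[a] - sample[b]
            for a, b in combinations(sorted(sample), 2)}

# Hypothetical analytes and relative abundances for one participant:
mf = meta_features({"APOE": 2.0, "CRP": 0.5, "IL6": 1.0})
```

With 190 analytes this yields 190×189/2 = 17,955 candidate meta-features per participant, which is why a combinatorial selection method is needed to pick a small signature.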

  10. The microINR portable coagulometer: analytical quality and user-friendliness of a PT (INR) point-of-care instrument.

    PubMed

    Larsen, Pia Bükmann; Storjord, Elin; Bakke, Åsne; Bukve, Tone; Christensen, Mikael; Eikeland, Joakim; Haugen, Vegar Engeland; Husby, Kristin; McGrail, Rie; Mikaelsen, Solveig Meier; Monsen, Grete; Møller, Mette Fogh; Nybo, Jan; Revsholm, Jesper; Risøy, Aslaug Johanne; Skålsvik, Unni Marie; Strand, Heidi; Teruel, Reyes Serrano; Theodorsson, Elvar

    2017-04-01

Regular measurement of prothrombin time as an international normalized ratio, PT (INR), is mandatory for optimal and safe use of warfarin. The Scandinavian evaluation of laboratory equipment for primary health care (SKUP) evaluated the microINR portable coagulometer (microINR®) (iLine Microsystems S.L., Spain) for measurement of PT (INR). Analytical quality and user-friendliness were evaluated under optimal conditions at an accredited hospital laboratory and at two primary health care centres (PHCCs). Patients were recruited at the outpatient clinic of the Laboratory of Medical Biochemistry, St Olav's University Hospital, Trondheim, Norway (n = 98) and from two PHCCs (n = 88). Venous blood samples were analyzed under optimal conditions on the STA-R® Evolution with STA-SPA+ reagent (Stago, France) (Owren method), and the results were compared to capillary measurements on the microINR®. The imprecision of the microINR® was 6.0% (90% CI: 5.3-7.0%) and 6.3% (90% CI: 5.1-8.3%) in the outpatient clinic and PHCC2, respectively, for INR ≥2.5. The microINR® did not meet the SKUP quality requirement for imprecision of ≤5.0%. For INR <2.5 at PHCC2, and at both levels in PHCC1, the CV% was ≤5.0. The accuracy fulfilled the SKUP quality goal in both the outpatient clinic and the PHCCs. The user-friendliness of the operation manual was rated as intermediate, defined by SKUP as neutral ratings assessed as neither good nor bad. Operation facilities were rated unsatisfactory, and time factors satisfactory. In conclusion, the quality requirements for imprecision were not met. The SKUP criterion for accuracy was fulfilled both at the hospital and at the PHCCs. The user-friendliness was rated intermediate.
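The imprecision figures quoted here are coefficients of variation (CV%). A minimal sketch of that calculation, using hypothetical repeated INR measurements rather than the study's data:

```python
from statistics import mean, stdev

def cv_percent(values):
    """Coefficient of variation (%): sample standard deviation
    divided by the mean of repeated measurements."""
    return stdev(values) / mean(values) * 100

# Hypothetical repeated INR measurements on one sample -> CV of about 20%
cv = cv_percent([2.0, 2.5, 3.0])
```

A SKUP-style requirement such as "imprecision ≤5.0%" is then simply a ceiling on this CV at a given INR level.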

  11. The discriminant pixel approach: a new tool for the rational interpretation of GCxGC-MS chromatograms.

    PubMed

    Vial, Jérôme; Pezous, Benoît; Thiébaut, Didier; Sassiat, Patrick; Teillet, Béatrice; Cahours, Xavier; Rivals, Isabelle

    2011-01-30

GCxGC is now recognized as the analytical technique best suited to the characterization of complex mixtures of volatile compounds; it is implemented worldwide in academic and industrial laboratories. However, in the context of comprehensive analysis of non-target analytes, going beyond visual examination of the color plots remains challenging for most users. We propose a strategy that aims at classifying chromatograms according to the chemical composition of the samples while determining the origin of the discrimination between different classes of samples: the discriminant pixel approach. After data pre-processing and time-alignment, the discriminatory power of each chromatogram pixel for a given class was defined as its correlation with membership to this class. Using a peak-finding algorithm, the most discriminant pixels were then linked to chromatographic peaks. Finally, crosschecking with mass spectrometry data made it possible to establish relationships with compounds that could consequently be considered candidate class markers. This strategy was applied to a large experimental data set of 145 GCxGC-MS chromatograms of tobacco extracts corresponding to three distinct classes of tobacco. Copyright © 2010 Elsevier B.V. All rights reserved.
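The discriminatory power defined above, a pixel's correlation with binary class membership, reduces to a point-biserial (Pearson) correlation computed per pixel across chromatograms. A minimal sketch with hypothetical intensities, not the paper's data:

```python
from math import sqrt

def pixel_discriminancy(intensities, labels):
    """Pearson correlation between one pixel's intensity across
    chromatograms and a 0/1 class-membership vector."""
    n = len(intensities)
    mx, my = sum(intensities) / n, sum(labels) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(intensities, labels))
    vx = sum((x - mx) ** 2 for x in intensities)
    vy = sum((y - my) ** 2 for y in labels)
    return cov / sqrt(vx * vy)

# Hypothetical pixel that is brighter in class-1 chromatograms:
r = pixel_discriminancy([1.0, 1.0, 2.0, 2.0], [0, 0, 1, 1])  # r = 1.0
```

Repeating this over every aligned pixel gives a discriminancy map; the highest-|r| pixels are the candidates to be linked back to peaks and mass spectra.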

  12. Enhancing Treatment Outcome of Patients at Risk of Treatment Failure: Meta-Analytic and Mega-Analytic Review of a Psychotherapy Quality Assurance System

    ERIC Educational Resources Information Center

    Shimokawa, Kenichi; Lambert, Michael J.; Smart, David W.

    2010-01-01

    Objective: Outcome research has documented worsening among a minority of the patient population (5% to 10%). In this study, we conducted a meta-analytic and mega-analytic review of a psychotherapy quality assurance system intended to enhance outcomes in patients at risk of treatment failure. Method: Original data from six major studies conducted…

  13. Use of a novel cation-exchange restricted-access material for automated sample clean-up prior to the determination of basic drugs in plasma by liquid chromatography.

    PubMed

    Chiap, P; Rbeida, O; Christiaens, B; Hubert, Ph; Lubda, D; Boos, K S; Crommen, J

    2002-10-25

A new kind of silica-based restricted-access material (RAM) has been tested in pre-columns for the on-line solid-phase extraction (SPE) of basic drugs from directly injected plasma samples before their quantitative analysis by reversed-phase liquid chromatography (LC), using the column-switching technique. The outer surface of the porous RAM particles contains hydrophilic diol groups, while sulphonic acid groups are bound to the internal surface, which gives the sorbent the properties of a strong cation exchanger towards low-molecular-mass compounds. Macromolecules such as proteins have no access to the internal surface of the pre-column, due to their exclusion from the pores, and are flushed directly out. The retention capability of this novel packing material has been tested for some hydrophilic basic drugs, such as atropine, fenoterol, ipratropium, procaine, sotalol and terbutaline, used as model compounds. The influence of the composition of the washing liquid on the retention of the analytes in the pre-column has been investigated. The elution profiles of the different compounds and the plasma matrix, as well as the time needed for the transfer of the analytes from the pre-column to the analytical column, were determined in order to deduce the most suitable conditions for the clean-up step and to develop on-line methods for the LC determination of these compounds in plasma. The cation-exchange sorbent was also compared to another RAM, namely the RP-18 ADS (alkyl diol silica) sorbent, with respect to retention capability towards basic analytes.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holladay, S.K.; Anderson, H.M.; Benson, S.B.

Quality assurance (QA) objectives for Phase 2 were that (1) scientific data generated would withstand scientific and legal scrutiny; (2) data would be gathered using appropriate procedures for sample collection, sample handling and security, chain of custody, laboratory analyses, and data reporting; (3) data would be of known precision and accuracy; and (4) data would meet data quality objectives defined in the Phase 2 Sampling and Analysis Plan. A review of the QA systems and quality control (QC) data associated with the Phase 2 investigation is presented to evaluate whether the data were of sufficient quality to satisfy Phase 2 objectives. The data quality indicators of precision, accuracy, representativeness, comparability, completeness, and sensitivity were evaluated to determine any limitations associated with the data. Data were flagged with qualifiers that were associated with appropriate reason codes and documentation relating the qualifiers to the reviewer of the data. These qualifiers were then consolidated into an overall final qualifier to represent the quality of the data to the end user. In summary, reproducible, precise, and accurate measurements consistent with CRRI objectives and the limitations of the sampling and analytical procedures used were obtained for the data collected in support of the Phase 2 Remedial Investigation.

  15. Geomorphic analysis of large alluvial rivers

    NASA Astrophysics Data System (ADS)

    Thorne, Colin R.

    2002-05-01

    Geomorphic analysis of a large river presents particular challenges and requires a systematic and organised approach because of the spatial scale and system complexity involved. This paper presents a framework and blueprint for geomorphic studies of large rivers developed in the course of basic, strategic and project-related investigations of a number of large rivers. The framework demonstrates the need to begin geomorphic studies early in the pre-feasibility stage of a river project and carry them through to implementation and post-project appraisal. The blueprint breaks down the multi-layered and multi-scaled complexity of a comprehensive geomorphic study into a number of well-defined and semi-independent topics, each of which can be performed separately to produce a clearly defined, deliverable product. Geomorphology increasingly plays a central role in multi-disciplinary river research and the importance of effective quality assurance makes it essential that audit trails and quality checks are hard-wired into study design. The structured approach presented here provides output products and production trails that can be rigorously audited, ensuring that the results of a geomorphic study can stand up to the closest scrutiny.

  16. Self-Regulatory Fatigue, Quality of Life, Health Behaviors, and Coping in Patients with Hematologic Malignancies

    PubMed Central

    Ehlers, Shawna L.; Patten, Christi A.; Gastineau, Dennis A.

    2015-01-01

    Background Self-regulatory fatigue may play an important role in a complex medical illness. Purpose Examine associations between self-regulatory fatigue, quality of life, and health behaviors in patients pre- (N=213) and 1-year post-hematopoietic stem cell transplantation (HSCT; N=140). Associations between self-regulatory fatigue and coping strategies pre-HSCT were also examined. Method Pre- and 1-year post-HSCT data collection. Hierarchical linear regression modeling. Results Higher self-regulatory fatigue pre-HSCT associated with lower overall, physical, social, emotional, and functional quality of life pre- (p’s<.001) and 1-year post-HSCT (p’s<.01); lower physical activity pre-HSCT (p<.02) and post-HSCT (p<.03) and less healthy nutritional intake post-HSCT (p<.01); changes (i.e., decrease) in quality of life and healthy nutrition over the follow-up year; and use of avoidance coping strategies pre-HSCT (p’s<.001). Conclusion This is the first study to show self-regulatory fatigue pre-HSCT relating to decreased quality of life and health behaviors, and predicting changes in these variables 1-year post-HSCT. PMID:24802991

  17. Sigma Metrics Across the Total Testing Process.

    PubMed

    Charuruks, Navapun

    2017-03-01

Laboratory quality control has been developed over several decades to ensure patient safety, expanding from a statistical quality control focus on the analytical phase to the total laboratory process. The sigma concept provides a convenient way to quantify the number of errors in the extra-analytical and analytical phases through defects per million and the sigma metric equation. Participation in a sigma verification program can be a convenient way to monitor analytical performance for continuous quality improvement. Improvement of sigma-scale performance has been shown in our data. New tools and techniques for integration are needed. Copyright © 2016 Elsevier Inc. All rights reserved.
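The sigma metric equation referred to above is conventionally sigma = (TEa - |bias|) / CV, where TEa is the allowable total error and CV the analytical imprecision, all in percent. A minimal sketch with hypothetical assay figures:

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma = (allowable total error - |bias|) / imprecision,
    with all terms expressed in percent. Six sigma corresponds
    to roughly 3.4 defects per million opportunities."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Hypothetical assay: TEa 10%, bias 2%, CV 2% -> 4 sigma ("good" performance)
sigma = sigma_metric(10.0, 2.0, 2.0)
```

Performance below about 3 sigma is usually taken to mean the method needs redesign or tighter QC rules.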

  18. Harmonization in laboratory medicine: Requests, samples, measurements and reports.

    PubMed

    Plebani, Mario

    2016-01-01

    In laboratory medicine, the terms "standardization" and "harmonization" are frequently used interchangeably as the final goal is the same: the equivalence of measurement results among different routine measurement procedures over time and space according to defined analytical and clinical quality specifications. However, the terms define two distinct, albeit closely linked, concepts based on traceability principles. The word "standardization" is used when results for a measurement are equivalent and traceable to the International System of Units (SI) through a high-order primary reference material and/or a reference measurement procedure (RMP). "Harmonization" is generally used when results are equivalent, but neither a high-order primary reference material nor a reference measurement procedure is available. Harmonization is a fundamental aspect of quality in laboratory medicine as its ultimate goal is to improve patient outcomes through the provision of accurate and actionable laboratory information. Patients, clinicians and other healthcare professionals assume that clinical laboratory tests performed by different laboratories at different times on the same sample and specimen can be compared, and that results can be reliably and consistently interpreted. Unfortunately, this is not necessarily the case, because many laboratory test results are still highly variable and poorly standardized and harmonized. Although the initial focus was mainly on harmonizing and standardizing analytical processes and methods, the scope of harmonization now also includes all other aspects of the total testing process (TTP), such as terminology and units, report formats, reference intervals and decision limits as well as tests and test profiles, requests and criteria for interpretation. Several projects and initiatives aiming to improve standardization and harmonization in the testing process are now underway. 
Laboratory professionals should therefore step up their efforts to provide interchangeable and comparable laboratory information in order to ultimately assure better diagnosis and treatment in patient care.

  19. Experimental design for TBT quantification by isotope dilution SPE-GC-ICP-MS under the European water framework directive.

    PubMed

    Alasonati, Enrica; Fabbri, Barbara; Fettig, Ina; Yardin, Catherine; Del Castillo Busto, Maria Estela; Richter, Janine; Philipp, Rosemarie; Fisicaro, Paola

    2015-03-01

In Europe, the maximum allowable concentration of tributyltin (TBT) compounds in surface water has been regulated by the water framework directive (WFD) and its daughter directive, which impose a limit of 0.2 ng L(-1) in whole water (as the tributyltin cation). Despite the large number of methodologies for the quantification of organotin species developed over the last two decades, standardised analytical methods at the required concentration level do not exist. TBT quantification at the picogram level requires efficient and accurate sample preparation and pre-concentration, and maximum care to avoid blank contamination. To meet the WFD requirement, a method for the quantification of TBT in mineral water at the environmental quality standard (EQS) level, based on solid phase extraction (SPE), was developed and optimised. The quantification was done using species-specific isotope dilution (SSID) followed by gas chromatography (GC) coupled to inductively coupled plasma mass spectrometry (ICP-MS). The analytical process was optimised using a design of experiment (DOE) based on a fractional factorial plan. The DOE made it possible to evaluate 3 qualitative factors (type of stationary phase and eluent, phase mass and eluent volume, pH and analyte ethylation procedure), for a total of 13 levels studied, and a sample volume in the range of 250-1000 mL. Four different models fitting the results were defined and evaluated with statistical tools; one of them was selected and optimised to find the best procedural conditions. The C18 phase was found to be the best stationary phase for the SPE experiments. The 4 solvents tested with C18, the pH and ethylation conditions, the mass of the phases, the volume of the eluents and the sample volume can all be optimal, depending on their respective combination. 
For that reason, the equation of the model conceived in this work is a useful decision tool for the planning of experiments, because it can be applied to predict the TBT mass fraction recovery once the experimental conditions are chosen. This work shows that SPE is a convenient technique for TBT pre-concentration at pico-trace levels and a robust approach: (i) a number of different experimental conditions led to satisfactory results, and (ii) the participation of two institutes in the experimental work did not affect the developed model. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. An Analytical Hierarchy Process Model for the Evaluation of College Experimental Teaching Quality

    ERIC Educational Resources Information Center

    Yin, Qingli

    2013-01-01

Taking into account the characteristics of college experimental teaching, through investigation and analysis, evaluation indices and an Analytical Hierarchy Process (AHP) model of experimental teaching quality have been established following the analytical hierarchy process method, and the evaluation indices have been given reasonable weights. An…

  1. 21 CFR 165.110 - Bottled water.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    .... (3) Physical quality. Bottled water shall, when a composite of analytical units of equal volume from.... 1 (4) Chemical quality. (i)(A) Bottled water shall, when a composite of analytical units of equal... bottled water, when a composite of analytical units of equal volume from a sample is examined by the...

  2. Q selection for an electro-optical earth imaging system: theoretical and experimental results.

    PubMed

    Cochrane, Andy; Schulz, Kevin; Kendrick, Rick; Bell, Ray

    2013-09-23

    This paper explores practical design considerations for selecting Q for an electro-optical earth imaging system, where Q is defined as (λ FN) / pixel pitch. Analytical methods are used to show that, under imaging conditions with high SNR, increasing Q with fixed aperture cannot lead to degradation of image quality regardless of the angular smear rate of the system. The potential for degradation of image quality under low SNR is bounded by an increase of the detector noise scaling as Q. An imaging test bed is used to collect representative imagery for various Q configurations. The test bed includes real world errors such as image smear and haze. The value of Q is varied by changing the focal length of the imaging system. Imagery is presented over a broad range of parameters.
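Using the paper's definition Q = (λ FN) / pixel pitch, the computation is a one-liner; the wavelength, f-number and pixel pitch below are hypothetical examples, not the paper's test-bed parameters.

```python
def system_q(wavelength_um, f_number, pixel_pitch_um):
    """Q = (lambda * f-number) / pixel pitch, lengths in microns.
    Q = 2 means the detector Nyquist-samples the optical cutoff;
    Q < 2 systems are undersampled (aliasing possible)."""
    return wavelength_um * f_number / pixel_pitch_um

# Hypothetical visible-band system: 0.5 um light, f/10 optics, 5 um pixels
q = system_q(0.5, 10.0, 5.0)  # Q = 1.0, i.e. undersampled
```

Changing the focal length at fixed aperture changes the f-number and hence Q, which matches how the test bed varies Q in this study.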

  3. Dual nozzle aerodynamic and cooling analysis study

    NASA Technical Reports Server (NTRS)

    Meagher, G. M.

    1981-01-01

    Analytical models to predict performance and operating characteristics of dual nozzle concepts were developed and improved. Aerodynamic models are available to define flow characteristics and bleed requirements for both the dual throat and dual expander concepts. Advanced analytical techniques were utilized to provide quantitative estimates of the bleed flow, boundary layer, and shock effects within dual nozzle engines. Thermal analyses were performed to define cooling requirements for baseline configurations, and special studies of unique dual nozzle cooling problems defined feasible means of achieving adequate cooling.

  4. The University of Arizona program in solid propellants

    NASA Technical Reports Server (NTRS)

    Ramohalli, Kumar

    1989-01-01

    The University of Arizona program is aimed at introducing scientific rigor to the predictability and quality assurance of composite solid propellants. Two separate approaches are followed: to use modern analytical techniques to experimentally study carefully controlled propellant batches to discern trends in mixing, casting, and cure; and to examine a vast bank of data that has fairly detailed information on the ingredients, processing, and rocket firing results. The experimental and analytical work is described briefly. The principal findings were that: (1) pre- (dry) blending of the coarse and fine ammonium perchlorate can significantly improve the uniformity of mixing; (2) the Fourier-transformed IR spectra of the uncured and cured polymer have valuable data on the state of the fuel; (3) there are considerable non-uniformities in the propellant slurry composition near the solid surfaces (blades, walls) compared to the bulk slurry; and (4) in situ measurements of slurry viscosity continuously during mixing can give a good indication of the state of the slurry. Several important observations in the study of the data bank are discussed.

  5. MicroRNA Based Liquid Biopsy: The Experience of the Plasma miRNA Signature Classifier (MSC) for Lung Cancer Screening.

    PubMed

    Mensah, Mavis; Borzi, Cristina; Verri, Carla; Suatoni, Paola; Conte, Davide; Pastorino, Ugo; Orazio, Fortunato; Sozzi, Gabriella; Boeri, Mattia

    2017-10-26

    The development of a minimally invasive test, such as liquid biopsy, for early lung cancer detection in its preclinical phase is crucial to improve the outcome of this deadly disease. MicroRNAs (miRNAs) are tissue specific, small, non-coding RNAs regulating gene expression, which may act as extracellular messengers of biological signals derived from the cross-talk between the tumor and its surrounding microenvironment. They could thus represent ideal candidates for early detection of lung cancer. In this work, a methodological workflow for the prospective validation of a circulating miRNA test using custom made microfluidic cards and quantitative Real-Time PCR in plasma samples of volunteers enrolled in a lung cancer screening trial is proposed. In addition, since the release of hemolysis-related miRNAs and more general technical issues may affect the analysis, the quality control steps included in the standard operating procedures are also presented. The protocol is reproducible and gives reliable quantitative results; however, when using large clinical series, both pre-analytical and analytical features should be cautiously evaluated.

  6. On-line sample cleanup and enrichment chromatographic technique for the determination of ambroxol in human serum.

    PubMed

    Emara, Samy; Kamal, Maha; Abdel Kawi, Mohamed

    2012-02-01

    A sensitive and efficient on-line cleanup and pre-concentration method has been developed using a column-switching technique and a protein-coated µ-Bondapak CN silica pre-column for quantification of ambroxol (AM) in human serum. The method is performed by direct injection of the serum sample onto a protein-coated µ-Bondapak CN silica pre-column, where AM is pre-concentrated and retained, while proteins and very polar constituents are washed to waste using phosphate-buffered saline (pH 7.4). The retained analyte on the pre-column is directed onto a C(18) analytical column for separation, with a mobile phase consisting of a mixture of methanol and distilled deionized water (containing 1% triethylamine adjusted to pH 3.5 with ortho-phosphoric acid) in the ratio of 50:50 (v/v). Detection is performed at 254 nm. The calibration curve is linear over the concentration range of 12-120 ng/mL (r(2) = 0.9995). The recovery, selectivity, linearity, precision, and accuracy of the method make it convenient for pharmacokinetic studies or routine assays.
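
A linear calibration of this kind can be sketched as follows; only the 12-120 ng/mL range comes from the abstract, and the peak-area readings are invented for illustration:

```python
import numpy as np

# Hypothetical calibration data (concentration in ng/mL vs. detector peak area);
# the 12-120 ng/mL range is from the abstract, the readings are invented.
conc = np.array([12.0, 30.0, 60.0, 90.0, 120.0])
area = np.array([1500.0, 3720.0, 7450.0, 11200.0, 14900.0])

slope, intercept = np.polyfit(conc, area, 1)   # least-squares line
r = np.corrcoef(conc, area)[0, 1]              # correlation coefficient

def quantify(peak_area: float) -> float:
    """Back-calculate serum concentration from a measured peak area."""
    return (peak_area - intercept) / slope

print(f"r^2 = {r**2:.4f}")
print(f"{quantify(7450.0):.1f} ng/mL")
```

An unknown sample's peak area is inverted through the fitted line; in practice the fit would be validated per run against acceptance criteria.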

  7. Medical immunology: two-way bridge connecting bench and bedside.

    PubMed

    Rijkers, Ger T; Damoiseaux, Jan G M C; Hooijkaas, Herbert

    2014-12-01

    Medical immunology in The Netherlands is a laboratory specialism dealing with immunological analyses as well as pre- and post-analytical consultation to clinicians (clinical immunologists and other specialists) involved in the care of patients with immune-mediated diseases. The scope of medical immunology includes immunodeficiencies, autoimmune diseases, allergy, transfusion and transplantation immunology, and lymphoproliferative disorders, plus the monitoring of these patients. The training, professional criteria, and quality control of procedures and laboratories are well organized. As examples of the bridge function of medical immunology between laboratory (bench) and patient (bedside), the contributions of medical immunologists to the diagnosis and treatment of primary immunodeficiency diseases (in particular, humoral immunodeficiencies) and to autoantibody testing (anti-citrullinated proteins in rheumatoid arthritis) are given. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Sexuality and psychoanalytic aggrandisement: Freud's 1908 theory of cultural history.

    PubMed

    Cotti, Patricia

    2011-03-01

    In 1908, in his article "'Civilized' sexual morality and modern nervous illness", Freud presented neuroses as the consequence of a restrictive state of cultural development and its 'civilized morality'. He found the inspiration for this idea by expanding upon previous formulations in this area by his predecessors (notably Christian von Ehrenfels) that focused on a cultural process earlier introduced by Kant, while also integrating in his analysis the principles of Haeckel's evolutionism (history of development, recapitulation) which eventually re-defined the psychoanalytic theory of neuroses. These new theoretical elements became the basis of psychoanalytic theory and thereby influenced subsequent thinking in the cultural process itself and in human sciences. This transformation of underlying theory provided a unique historical and analytical framework for psychoanalysis which allowed Freud to claim for it a pre-eminent position among the human sciences.

  9. PET Image Reconstruction Incorporating 3D Mean-Median Sinogram Filtering

    NASA Astrophysics Data System (ADS)

    Mokri, S. S.; Saripan, M. I.; Rahni, A. A. Abd; Nordin, A. J.; Hashim, S.; Marhaban, M. H.

    2016-02-01

    Positron Emission Tomography (PET) projection data, or sinograms, contain poor statistics and randomness that produce noisy PET images. In order to improve the PET image, we propose an implementation of pre-reconstruction sinogram filtering based on a 3D mean-median filter. The proposed filter is designed with three aims: to minimise angular blurring artifacts, to smooth flat regions, and to preserve edges in the reconstructed PET image. The performance of the pre-reconstruction sinogram filter prior to three established reconstruction methods, namely filtered backprojection (FBP), ordered-subset maximum likelihood expectation maximization (OSEM) and OSEM with median root prior (OSEM-MRP), is investigated using a simulated NCAT phantom PET sinogram generated by the PET Analytical Simulator (ASIM). The improvement in the quality of the reconstructed images with and without sinogram filtering is assessed by visual as well as quantitative evaluation based on global signal to noise ratio (SNR), local SNR, contrast to noise ratio (CNR) and edge preservation capability. Further analysis of the achieved improvement is also carried out for the iterative OSEM and OSEM-MRP reconstruction methods with and without pre-reconstruction filtering, in terms of contrast recovery curve (CRC) versus noise trade-off, normalised mean square error versus iteration, local CNR versus iteration and lesion detectability. Overall, satisfactory results are obtained from both visual and quantitative evaluations.
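
The paper's exact 3D kernel design is not given in the abstract; the sketch below assumes a simple composition of a 3D median filter (edge-preserving noise rejection) followed by a 3D mean filter (smoothing of flat regions) over a sinogram stack:

```python
import numpy as np
from scipy.ndimage import median_filter, uniform_filter

def mean_median_filter(sinogram: np.ndarray, size: int = 3) -> np.ndarray:
    """Apply a 3D median filter, then a 3D mean filter, to a sinogram stack.

    The median pass rejects impulsive noise while preserving edges; the
    mean pass smooths flat regions. This ordering is an assumption for
    illustration; the paper's actual kernel design may differ."""
    despiked = median_filter(sinogram, size=size)
    return uniform_filter(despiked, size=size)

# Toy sinogram stack (slices, angles, radial bins) with Poisson counting noise
rng = np.random.default_rng(0)
clean = np.full((8, 16, 16), 100.0)
noisy = rng.poisson(clean).astype(float)
filtered = mean_median_filter(noisy)
print(filtered.shape)
```

In a real pipeline the filtered sinogram would then be passed to FBP or OSEM reconstruction.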

  10. How do student nurses learn to care? An analysis of pre-registration adult nursing practice assessment documents.

    PubMed

    Young, Kate; Godbold, Rosemary; Wood, Pat

    2018-01-01

    There is international concern about the quality of nursing in resource-constrained, high-technology health care settings. This paper reports findings from a research study which explored the experiences and views of those involved in the education and learning of 'caring' with adult pre-registration students. A novel dataset of 39 practice assessment documents (PADs) was randomly sampled and analysed across both bachelors and masters programmes from September 2014-July 2015. Using an appreciative enquiry approach, the Caring Behaviours Inventory aided analysis of qualitative text from both mentors and students within the PADs to identify how student nurses learn to care and to establish whether there were any differences between Masters and Bachelors students. In contrast with existing research, we found a holistic, melded approach to caring. This combined softer skills with highly technologized care, and flexible, tailored approaches to optimise individualised care delivery. Both were highly valued by students and mentors. Pre-registration MSc students tended to have higher perceptual skills and to be more analytical than their BSc counterparts. We found no evidence to suggest that caring behaviour or attitudes diminish over the course of either programme. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Statistical Approaches to Assess Biosimilarity from Analytical Data.

    PubMed

    Burdick, Richard; Coffey, Todd; Gutka, Hiten; Gratzl, Gyöngyi; Conlon, Hugh D; Huang, Chi-Ting; Boyne, Michael; Kuehne, Henriette

    2017-01-01

    Protein therapeutics have unique critical quality attributes (CQAs) that define their purity, potency, and safety. The analytical methods used to assess CQAs must be able to distinguish clinically meaningful differences in comparator products, and the most important CQAs should be evaluated with the most statistical rigor. High-risk CQA measurements assess the most important attributes that directly impact the clinical mechanism of action or have known implications for safety, while the moderate- to low-risk characteristics may have a lower direct impact and thereby may have a broader range to establish similarity. Statistical equivalence testing is applied for high-risk CQA measurements to establish the degree of similarity (e.g., highly similar fingerprint, highly similar, or similar) of selected attributes. Notably, some high-risk CQAs (e.g., primary sequence or disulfide bonding) are qualitative (e.g., the same as the originator or not the same) and therefore not amenable to equivalence testing. For biosimilars, an important step is the acquisition of a sufficient number of unique originator drug product lots to measure the variability in the originator drug manufacturing process and provide sufficient statistical power for the analytical data comparisons. Together, these analytical evaluations, along with PK/PD and safety data (immunogenicity), provide the data necessary to determine if the totality of the evidence warrants a designation of biosimilarity and subsequent licensure for marketing in the USA. In this paper, a case study approach is used to provide examples of analytical similarity exercises and the appropriateness of statistical approaches for the example data.
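
Statistical equivalence testing of a quantitative CQA is commonly carried out with a two one-sided tests (TOST) procedure. The sketch below uses invented potency measurements and an illustrative pre-specified margin, not data or limits from the paper:

```python
import numpy as np
from scipy import stats

def tost_equivalence(ref, test, margin):
    """Two one-sided t-tests (TOST) for equivalence of means.

    Returns the larger of the two one-sided p-values; equivalence is
    concluded when it falls below the chosen alpha. The margin is a
    pre-specified similarity limit (an illustrative assumption here)."""
    n1, n2 = len(ref), len(test)
    diff = np.mean(test) - np.mean(ref)
    se = np.sqrt(np.var(ref, ddof=1) / n1 + np.var(test, ddof=1) / n2)
    df = n1 + n2 - 2
    p_lower = 1 - stats.t.cdf((diff + margin) / se, df)  # H0: diff <= -margin
    p_upper = stats.t.cdf((diff - margin) / se, df)      # H0: diff >= +margin
    return max(p_lower, p_upper)

# Invented potency values (% of reference) for originator vs. biosimilar lots
originator = [98.9, 101.2, 100.4, 99.7, 100.8, 99.5, 101.0, 100.1, 99.3, 100.6]
biosimilar = [100.2, 99.8, 101.5, 100.9, 99.6, 100.3, 101.1, 99.9, 100.7, 100.4]
p = tost_equivalence(originator, biosimilar, margin=5.0)
print(f"TOST p = {p:.4g}")
```

This also illustrates why a sufficient number of originator lots matters: the margin and the test's power both depend on the estimated originator variability.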

  12. Automatic Dynamic Aircraft Modeler (ADAM) for the Computer Program NASTRAN

    NASA Technical Reports Server (NTRS)

    Griffis, H.

    1985-01-01

    Large general purpose finite element programs require users to develop large quantities of input data. General purpose pre-processors are used to decrease the effort required to develop structural models. Further reduction of effort can be achieved by specific application pre-processors. Automatic Dynamic Aircraft Modeler (ADAM) is one such application specific pre-processor. General purpose pre-processors use points, lines and surfaces to describe geometric shapes. Specifying that ADAM is used only for aircraft structures allows generic structural sections, wing boxes and bodies, to be pre-defined. Hence with only gross dimensions, thicknesses, material properties and pre-defined boundary conditions a complete model of an aircraft can be created.

  13. Eco-innovative design approach: Integrating quality and environmental aspects in prioritizing and solving engineering problems

    NASA Astrophysics Data System (ADS)

    Chakroun, Mahmoud; Gogu, Grigore; Pacaud, Thomas; Thirion, François

    2014-09-01

    This study proposes an eco-innovative design process taking into consideration quality and environmental aspects in prioritizing and solving technical engineering problems. This approach provides a synergy between the Life Cycle Assessment (LCA), the non-quality matrix, the Theory of Inventive Problem Solving (TRIZ), morphological analysis and the Analytical Hierarchy Process (AHP). In the sequence of these tools, LCA assesses the environmental impacts generated by the system. Then, for a better consideration of environmental aspects, a new tool is developed, the non-quality matrix, which defines the problem to be solved first from an environmental point of view. The TRIZ method allows the generation of new concepts and contradiction resolution. Then, the morphological analysis offers the possibility of extending the search space of solutions in a design problem in a systematic way. Finally, the AHP identifies the promising solution(s) by providing a clear logic for the choice made. Their usefulness has been demonstrated through their application to a case study involving a centrifugal spreader with spinning discs.

  14. Couple relationship quality and offspring attachment security: a systematic review with meta-analysis.

    PubMed

    Tan, Evelyn S; McIntosh, Jennifer E; Kothe, Emily J; Opie, Jessica E; Olsson, Craig A

    2018-08-01

    This paper provides a meta-analytic examination of strength and direction of association between parents' couple relationship quality and early childhood attachment security (5 years and under). A comprehensive search of four EBSCOhost databases, Informit, Web of Science, and grey literature yielded 24 studies meeting eligibility criteria. Heterogeneity of the couple quality construct and measurement was marked. To disaggregate potentially differentially acting factors, we grouped homogeneous studies, creating two predictor variables defined as "positive dyadic adjustment" and "inter-parental conflict". Associations of each construct with offspring attachment security were examined in two separate meta-analyses. Inter-parental conflict was inversely associated (8 studies, k = 17, r = -0.28, CI = [-0.39 to -0.18]), and dyadic adjustment was not associated with offspring attachment security (5 studies, k = 12, r = 0.14, CI = [-0.03 to 0.32]). The study supports finer distinctions of couple relationship constructs and measurement in developmental research, assessment, and intervention.
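
Pooling correlations of this kind is often done via Fisher's z transform. A minimal fixed-effect sketch with invented study values follows (the review itself reports meta-analyses over its own study set, which will have used more elaborate weighting):

```python
import math

def pool_correlations(rs, ns):
    """Fixed-effect pooling of Pearson correlations via Fisher's z.

    Each study's r is transformed to z = atanh(r), weighted by n - 3
    (the inverse of z's sampling variance), and the weighted mean is
    transformed back with tanh, along with a 95% CI."""
    zs = [math.atanh(r) for r in rs]
    ws = [n - 3 for n in ns]
    z_bar = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
    se = 1 / math.sqrt(sum(ws))
    lo, hi = z_bar - 1.96 * se, z_bar + 1.96 * se
    return math.tanh(z_bar), (math.tanh(lo), math.tanh(hi))

# Invented study correlations and sample sizes (illustrative only)
r_pooled, ci = pool_correlations([-0.35, -0.22, -0.30], [80, 120, 60])
print(f"pooled r = {r_pooled:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```

A random-effects model would additionally estimate between-study variance before weighting, which matters when heterogeneity is marked, as the abstract notes.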

  15. System engineering toolbox for design-oriented engineers

    NASA Technical Reports Server (NTRS)

    Goldberg, B. E.; Everhart, K.; Stevens, R.; Babbitt, N., III; Clemens, P.; Stout, L.

    1994-01-01

    This system engineering toolbox is designed to provide tools and methodologies to the design-oriented systems engineer. A tool is defined as a set of procedures to accomplish a specific function. A methodology is defined as a collection of tools, rules, and postulates to accomplish a purpose. For each concept addressed in the toolbox, the following information is provided: (1) description, (2) application, (3) procedures, (4) examples, if practical, (5) advantages, (6) limitations, and (7) bibliography and/or references. The scope of the document includes concept development tools, system safety and reliability tools, design-related analytical tools, graphical data interpretation tools, a brief description of common statistical tools and methodologies, so-called total quality management tools, and trend analysis tools. Both relationship to project phase and primary functional usage of the tools are also delineated. The toolbox also includes a case study for illustrative purposes. Fifty-five tools are delineated in the text.

  16. Quality of Big Data in health care.

    PubMed

    Sukumar, Sreenivas R; Natarajan, Ramachandran; Ferrell, Regina K

    2015-01-01

    The current trend in Big Data analytics and in particular health information technology is toward building sophisticated models, methods and tools for business, operational and clinical intelligence. However, the critical issue of data quality required for these models is not getting the attention it deserves. The purpose of this paper is to highlight the issues of data quality in the context of Big Data health care analytics. The insights presented in this paper are the results of analytics work that was done in different organizations on a variety of health data sets. The data sets include Medicare and Medicaid claims, provider enrollment data sets from both public and private sources, electronic health records from regional health centers accessed through partnerships with health care claims processing entities under health privacy protected guidelines. Assessment of data quality in health care has to consider: first, the entire lifecycle of health data; second, problems arising from errors and inaccuracies in the data itself; third, the source(s) and the pedigree of the data; and fourth, how the underlying purpose of data collection impact the analytic processing and knowledge expected to be derived. Automation in the form of data handling, storage, entry and processing technologies is to be viewed as a double-edged sword. At one level, automation can be a good solution, while at another level it can create a different set of data quality issues. Implementation of health care analytics with Big Data is enabled by a road map that addresses the organizational and technological aspects of data quality assurance. The value derived from the use of analytics should be the primary determinant of data quality. Based on this premise, health care enterprises embracing Big Data should have a road map for a systematic approach to data quality. 
Health care data quality problems can be so specific that organizations might have to build their own custom software or data quality rule engines. Today, data quality issues are diagnosed and addressed in a piecemeal fashion. The authors recommend a data lifecycle approach and provide a road map that is better matched to the dimensions of Big Data and fits the different stages of the analytical workflow.

  17. Science meets regulation.

    PubMed

    Bilia, Anna Rita

    2014-12-02

    The European Pharmacopoeia (Ph. Eur.) is a standard reference for both European and non-European countries and defines requirements for the qualitative and quantitative composition of medicines. Herbal drug (HD) monographs state which aspects have to be considered for quality assurance through the relevant chapters "Definition", "Characters", "Identification", "Tests", and "Assay". Identification of botanical material is achieved by macroscopic and microscopic morphology, generally examined by a trained expert. Content, or assay, is the most difficult area of quality control to perform, since in most herbal drugs the active constituents are unknown and markers must be used instead, even though they cannot really be related to quality. The other critical points are the purity tests; in particular, tests such as those for heavy metals, aflatoxins and pesticides are laborious and time intensive, requiring a significant investment in equipment, materials, and maintenance. A literature survey concerning alternative and/or complementary tools for quality control of botanicals has been performed by searching the scientific databases Pubmed, SciFinder, Scopus and Web of Science. Diverse analytical methods including DNA fingerprinting, Nuclear Magnetic Resonance (NMR), Near Infra Red (NIR) spectroscopy and (bio)sensors have been reported in the literature to evaluate the quality of botanical products. Identification of plants at the species level can be successfully based on genome-based methods using DNA barcodes, the nucleotide sequence of a short DNA fragment. NMR can provide direct NMR fingerprint determination (complete assignment of the signals by 1D and 2D experiments), quantitative NMR and chemometric analysis (the metabolite fingerprint is based on the distribution of intensity in the NMR spectrum to provide sample classification). NIR spectroscopy is a fast qualitative and quantitative analytical method that provides information about a plant species and/or its geographic origin.
Finally, the development of chemical and biological sensors is currently one of the most active areas of analytical research. Immobilization of specific enzymes allows recognition of definite classes of compounds such as cysteine sulfoxides, glucosinolates, cyanogenic glycosides, and polyphenols. Other recognition elements are nucleic acids, used to evaluate the ability of different molecules to bind DNA. Sensors have also been developed for the detection of heavy metals in botanicals. Moreover, the analysis of mycotoxins and pesticides could represent another field of possible application. These alternative/complementary analytical methods appear to be an analyst's dream: they are able to give rapid analysis responses; to operate directly on complex matrices, in many cases; to be selective and sensitive enough for the required application; to be portable and sometimes also disposable; and to have fast analysis times. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  18. Verification of Decision-Analytic Models for Health Economic Evaluations: An Overview.

    PubMed

    Dasbach, Erik J; Elbasha, Elamin H

    2017-07-01

    Decision-analytic models for cost-effectiveness analysis are developed in a variety of software packages where the accuracy of the computer code is seldom verified. Although modeling guidelines recommend using state-of-the-art quality assurance and control methods for software engineering to verify models, the fields of pharmacoeconomics and health technology assessment (HTA) have yet to establish and adopt guidance on how to verify health and economic models. The objective of this paper is to introduce to our field the variety of methods the software engineering field uses to verify that software performs as expected. We identify how many of these methods can be incorporated in the development process of decision-analytic models in order to reduce errors and increase transparency. Given the breadth of methods used in software engineering, we recommend a more in-depth initiative to be undertaken (e.g., by an ISPOR-SMDM Task Force) to define the best practices for model verification in our field and to accelerate adoption. Establishing a general guidance for verifying models will benefit the pharmacoeconomics and HTA communities by increasing accuracy of computer programming, transparency, accessibility, sharing, understandability, and trust of models.

  19. Aerial photography flight quality assessment with GPS/INS and DEM data

    NASA Astrophysics Data System (ADS)

    Zhao, Haitao; Zhang, Bing; Shang, Jiali; Liu, Jiangui; Li, Dong; Chen, Yanyan; Zuo, Zhengli; Chen, Zhengchao

    2018-01-01

    The flight altitude, ground coverage, photo overlap, and other acquisition specifications of an aerial photography flight mission directly affect the quality and accuracy of the subsequent mapping tasks. To ensure smooth post-flight data processing and fulfill the pre-defined mapping accuracy, flight quality assessments should be carried out in time. This paper presents a novel and rigorous approach for flight quality evaluation of frame cameras with GPS/INS data and DEM, using geometric calculation rather than image analysis as in the conventional methods. This new approach is based mainly on the collinearity equations, in which the accuracy of a set of flight quality indicators is derived through a rigorous error propagation model and validated with scenario data. Theoretical analysis and practical flight test of an aerial photography mission using an UltraCamXp camera showed that the calculated photo overlap is accurate enough for flight quality assessment of 5 cm ground sample distance image, using the SRTMGL3 DEM and the POSAV510 GPS/INS data. An even better overlap accuracy could be achieved for coarser-resolution aerial photography. With this new approach, the flight quality evaluation can be conducted on site right after landing, providing accurate and timely information for decision making.
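
The abstract does not give the overlap computation itself; the sketch below uses the textbook flat-terrain forward-overlap relation as an assumption, not the paper's rigorous DEM-based error-propagation model:

```python
def forward_overlap(altitude_m: float, focal_len_mm: float,
                    sensor_len_mm: float, base_m: float) -> float:
    """Forward overlap between successive frames of a frame camera.

    Ground coverage G = altitude * sensor_len / focal_len (flat terrain);
    overlap = 1 - base / G, where base is the along-track distance between
    exposure stations (e.g. from GPS/INS positions). Flat-terrain textbook
    formula only; terrain relief from a DEM would modify G locally."""
    ground_cov = altitude_m * (sensor_len_mm / focal_len_mm)
    return 1.0 - base_m / ground_cov

# Illustrative values (assumed): 500 m altitude, 100 mm lens,
# 68 mm along-track sensor length, 102 m between exposures
ov = forward_overlap(500.0, 100.0, 68.0, 102.0)
print(f"{ov:.0%}")
```

The geometric route is what lets this check run on site right after landing, since it needs only the GPS/INS trajectory and a DEM rather than the imagery itself.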

  20. Automatic Assessment of Acquisition and Transmission Losses in Indian Remote Sensing Satellite Data

    NASA Astrophysics Data System (ADS)

    Roy, D.; Purna Kumari, B.; Manju Sarma, M.; Aparna, N.; Gopal Krishna, B.

    2016-06-01

    The quality of remote sensing data is an important parameter that defines the extent of its usability in various applications. The data from remote sensing satellites are received as raw data frames at the ground station. These data may be corrupted by losses due to interference during data transmission or acquisition, or due to sensor anomalies. It is therefore important to assess the quality of the raw data before product generation, for early anomaly detection, faster corrective actions and minimization of product rejection. Manual screening of raw images is a time-consuming process and not very accurate. In this paper, an automated process for the identification and quantification of losses in raw data, such as pixel dropout, line loss and data loss due to sensor anomalies, is discussed. Quality assessment of raw scenes based on these losses is also explained. This process is introduced in the data pre-processing stage and gives crucial data quality information to users at the time of browsing data for product ordering. It has also improved the product generation workflow by enabling faster and more accurate quality estimation.
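
Detection of the losses described above can be sketched with NumPy; the dropout value and the toy frame are illustrative assumptions, not the paper's actual detection rules:

```python
import numpy as np

def assess_raw_frame(frame: np.ndarray, dropout_value: int = 0) -> dict:
    """Flag fully lost lines and count isolated pixel dropouts in a raw frame.

    A line is 'lost' when every pixel equals the dropout value; dropout
    pixels outside lost lines are counted individually. The dropout value
    is an illustrative assumption."""
    line_lost = np.all(frame == dropout_value, axis=1)
    pixel_drops = np.sum((frame == dropout_value) & ~line_lost[:, None])
    return {
        "lost_lines": int(line_lost.sum()),
        "pixel_dropouts": int(pixel_drops),
        "loss_fraction": float((line_lost.sum() * frame.shape[1] + pixel_drops)
                               / frame.size),
    }

# Toy 4x6 frame: line 2 fully lost, one stray dropout at (0, 3)
frame = np.full((4, 6), 120, dtype=np.uint16)
frame[2, :] = 0
frame[0, 3] = 0
report = assess_raw_frame(frame)
print(report)
```

A scene-level quality grade could then be assigned by thresholding the loss fraction, mirroring the pre-processing-stage assessment the abstract describes.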

  1. A new dataset validation system for the Planetary Science Archive

    NASA Astrophysics Data System (ADS)

    Manaud, N.; Zender, J.; Heather, D.; Martinez, S.

    2007-08-01

    The Planetary Science Archive is the official archive for the Mars Express mission. It received its first data at the end of 2004. These data are delivered by the PI teams to the PSA team as datasets formatted in conformance with the Planetary Data System (PDS). The PI teams are responsible for analyzing and calibrating the instrument data as well as for the production of reduced and calibrated data. They are also responsible for the scientific validation of these data. ESA is responsible for long-term data archiving and distribution to the scientific community and must ensure, in this regard, that all archived products meet quality standards. To do so, an archive peer review is used to control the quality of the Mars Express science data archiving process. However, a full validation of its content is missing. An independent review board recently recommended that the completeness of the archive as well as the consistency of the delivered data should be validated following well-defined procedures. A new validation software tool is being developed to complete the overall data quality control system functionality. This new tool aims to improve the quality of data and services provided to the scientific community through the PSA, and shall make it possible to track anomalies in datasets and to control their completeness. It shall ensure that PSA end-users: (1) can rely on the results of their queries, (2) will get data products that are suitable for scientific analysis, and (3) can find all science data acquired during a mission. We defined dataset validation as the verification and assessment process that checks the dataset content against pre-defined top-level criteria, which represent the general characteristics of good-quality datasets. The dataset content that is checked includes the data and all types of information that are essential in the process of deriving scientific results and those interfacing with the PSA database.
The validation software tool is a multi-mission tool that has been designed to provide the user with the flexibility of defining and implementing various types of validation criteria, to iteratively and incrementally validate datasets, and to generate validation reports.

  2. Flashback resistant pre-mixer assembly

    DOEpatents

    Laster, Walter R [Oviedo, FL; Gambacorta, Domenico [Oviedo, FL

    2012-02-14

    A pre-mixer assembly associated with a fuel supply system for mixing of air and fuel upstream from a main combustion zone in a gas turbine engine. The pre-mixer assembly includes a swirler assembly disposed about a fuel injector of the fuel supply system and a pre-mixer transition member. The swirler assembly includes a forward end defining an air inlet and an opposed aft end. The pre-mixer transition member has a forward end affixed to the aft end of the swirler assembly and an opposed aft end defining an outlet of the pre-mixer assembly. The aft end of the pre-mixer transition member is spaced from a base plate such that a gap is formed between the aft end of the pre-mixer transition member and the base plate for permitting a flow of purge air therethrough to increase a velocity of the air/fuel mixture exiting the pre-mixer assembly.

  3. A Systematic Review and Meta-Analysis of Bone Marrow Derived Mononuclear Cells in Animal Models of Ischemic Stroke

    PubMed Central

    Vahidy, Farhaan S.; Rahbar, Mohammad H.; Zhu, Hongjian; Rowan, Paul J.; Bambhroliya, Arvind B.; Savitz, Sean I.

    2016-01-01

    Background and Purpose Bone marrow derived mononuclear cells (BMMNCs) offer the promise of augmenting post-stroke recovery. There is mounting evidence of the safety and efficacy of BMMNCs from pre-clinical studies of ischemic stroke (IS); however, their pooled effects have not been described. Methods Using PRISMA guidelines, we conducted a systematic review of the pre-clinical literature on intravenous use of BMMNCs, followed by meta-analyses of histological and behavioral outcomes. Studies were selected based on pre-defined criteria. Data were abstracted by two independent investigators. Following quality assessment, the pooled effects were generated using mixed effect models. The impact of possible biases on estimated effect size was evaluated. Results The standardized mean difference (SMD) and 95% confidence interval (CI) for reduction in lesion volume significantly favored BMMNC treatment (SMD −3.3, 95% CI: −4.3, −2.3), n = 113 each for BMMNC and controls. BMMNC-treated animals (n = 161) also had improved function measured by the cylinder test (SMD −2.4, 95% CI: −3.1, −1.6), as compared to controls (n = 205). A trend for benefit was observed for the adhesive removal test and neurological deficit score. Study quality score (median: 6, Q1-Q3: 5-7) was correlated with year of publication. There was funnel plot asymmetry; however, the pooled effects were robust to the correction of this bias and remained significant in favor of BMMNC treatment. Conclusions BMMNCs demonstrate beneficial effects across histological and behavioral outcomes in animal IS models. Though study quality has improved over time, a considerable degree of heterogeneity calls for standardization in the conduct and reporting of experimentation. PMID:27165959
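
A single-study standardized mean difference (Cohen's d with a pooled SD), the per-study input to the meta-analyses above, can be sketched as follows; the input values are invented, not data from the review:

```python
import math

def cohens_d(mean_t: float, sd_t: float, n_t: int,
             mean_c: float, sd_c: float, n_c: int) -> float:
    """Standardized mean difference (Cohen's d) with a pooled SD.

    Negative d favors treatment when lower scores (e.g. lesion volume)
    are better. Inputs below are invented for illustration."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                          / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

# Hypothetical lesion volumes (mm^3): BMMNC-treated vs. control animals
d = cohens_d(mean_t=120.0, sd_t=20.0, n_t=10, mean_c=180.0, sd_c=25.0, n_c=10)
print(round(d, 2))  # -2.65
```

Per-study d values of this kind are then combined under a fixed- or mixed-effects model, with a small-sample (Hedges' g) correction commonly applied first.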

  4. DDT-based indoor residual spraying suboptimal for visceral leishmaniasis elimination in India

    PubMed Central

    Coleman, Michael; Foster, Geraldine M.; Deb, Rinki; Pratap Singh, Rudra; Ismail, Hanafy M.; Shivam, Pushkar; Ghosh, Ayan Kumar; Dunkley, Sophie; Kumar, Vijay; Coleman, Marlize; Hemingway, Janet; Paine, Mark J. I.; Das, Pradeep

    2015-01-01

    Indoor residual spraying (IRS) is used to control visceral leishmaniasis (VL) in India, but it is poorly quality assured. Quality assurance was performed in eight VL endemic districts in Bihar State, India, in 2014. Residual dichlorodiphenyltrichloroethane (DDT) was sampled from walls using Bostik tape discs, and DDT concentrations [grams of active ingredient per square meter (g ai/m2)] were determined using HPLC. Pre-IRS surveys were performed in three districts, and post-IRS surveys were performed in eight districts. A 20% threshold above and below the target spray of 1.0 g ai/m2 was defined as “in range.” The entomological assessments were made in four districts in IRS and non-IRS villages. Vector densities were measured: pre-IRS and 1 and 3 mo post-IRS. Insecticide susceptibility to 4% DDT and 0.05% deltamethrin WHO-impregnated papers was determined with wild-caught sand flies. The majority (329 of 360, 91.3%) of pre-IRS samples had residual DDT concentrations of <0.1 g ai/m2. The mean residual concentration of DDT post-IRS was 0.37 g ai/m2; 84.9% of walls were undersprayed, 7.4% were sprayed in range, and 7.6% were oversprayed. The abundance of sand flies in IRS and non-IRS villages was significantly different at 1 mo post-IRS only. Sand flies were highly resistant to DDT but susceptible to deltamethrin. The Stockholm Convention, ratified by India in 2006, calls for the complete phasing out of DDT as soon as practical, with limited use in the interim where no viable IRS alternatives exist. Given the poor quality of the DDT-based IRS, ready availability of pyrethroids, and susceptibility profile of Indian sand flies, the continued use of DDT in this IRS program is questionable. PMID:26124110

  5. DDT-based indoor residual spraying suboptimal for visceral leishmaniasis elimination in India.

    PubMed

    Coleman, Michael; Foster, Geraldine M; Deb, Rinki; Pratap Singh, Rudra; Ismail, Hanafy M; Shivam, Pushkar; Ghosh, Ayan Kumar; Dunkley, Sophie; Kumar, Vijay; Coleman, Marlize; Hemingway, Janet; Paine, Mark J I; Das, Pradeep

    2015-07-14

Indoor residual spraying (IRS) is used to control visceral leishmaniasis (VL) in India, but it is poorly quality assured. Quality assurance was performed in eight VL endemic districts in Bihar State, India, in 2014. Residual dichlorodiphenyltrichloroethane (DDT) was sampled from walls using Bostik tape discs, and DDT concentrations [grams of active ingredient per square meter (g ai/m2)] were determined using HPLC. Pre-IRS surveys were performed in three districts, and post-IRS surveys were performed in eight districts. A 20% threshold above and below the target spray of 1.0 g ai/m2 was defined as "in range." The entomological assessments were made in four districts in IRS and non-IRS villages. Vector densities were measured: pre-IRS and 1 and 3 mo post-IRS. Insecticide susceptibility to 4% DDT and 0.05% deltamethrin WHO-impregnated papers was determined with wild-caught sand flies. The majority (329 of 360, 91.3%) of pre-IRS samples had residual DDT concentrations of <0.1 g ai/m2. The mean residual concentration of DDT post-IRS was 0.37 g ai/m2; 84.9% of walls were undersprayed, 7.4% were sprayed in range, and 7.6% were oversprayed. The abundance of sand flies in IRS and non-IRS villages was significantly different at 1 mo post-IRS only. Sand flies were highly resistant to DDT but susceptible to deltamethrin. The Stockholm Convention, ratified by India in 2006, calls for the complete phasing out of DDT as soon as practical, with limited use in the interim where no viable IRS alternatives exist. Given the poor quality of the DDT-based IRS, ready availability of pyrethroids, and susceptibility profile of Indian sand flies, the continued use of DDT in this IRS program is questionable.
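
The "in range" rule used in records 4 and 5 (a 20% band around the 1.0 g ai/m2 target) is simple threshold arithmetic. A minimal sketch, with hypothetical function and argument names:

```python
def classify_spray(dose_g_ai_m2, target=1.0, tolerance=0.20):
    """Classify a wall's residual DDT dose against the target spray rate.

    "In range" means within +/-20% of the 1.0 g ai/m2 target, as defined
    in the study; anything below the band is undersprayed, above it
    oversprayed.
    """
    low, high = target * (1 - tolerance), target * (1 + tolerance)
    if dose_g_ai_m2 < low:
        return "undersprayed"
    if dose_g_ai_m2 > high:
        return "oversprayed"
    return "in range"
```

By this rule the reported post-IRS mean of 0.37 g ai/m2 falls well below the 0.8 g ai/m2 lower bound, consistent with the finding that 84.9% of walls were undersprayed.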

  6. ICU Telemedicine Program Financial Outcomes.

    PubMed

    Lilly, Craig M; Motzkus, Christine; Rincon, Teresa; Cody, Shawn E; Landry, Karen; Irwin, Richard S

    2017-02-01

    ICU telemedicine improves access to high-quality critical care, has substantial costs, and can change financial outcomes. Detailed information about financial outcomes and their trends over time following ICU telemedicine implementation and after the addition of logistic center function has not been published to our knowledge. Primary data were collected for consecutive adult patients of a single academic medical center. We compared clinical and financial outcomes across three groups that differed regarding telemedicine support: a group without ICU telemedicine support (pre-ICU intervention group), a group with ICU telemedicine support (ICU telemedicine group), and an ICU telemedicine group with added logistic center functions and support for quality-care standardization (logistic center group). The primary outcome was annual direct contribution margin defined as aggregated annual case revenue minus annual case direct costs (including operating costs of ICU telemedicine and its related programs). All monetary values were adjusted to 2015 US dollars using Producer Price Index for Health-Care Facilities. Annual case volume increased from 4,752 (pre-ICU telemedicine) to 5,735 (ICU telemedicine) and 6,581 (logistic center). The annual direct contribution margin improved from $7,921,584 (pre-ICU telemedicine) to $37,668,512 (ICU telemedicine) to $60,586,397 (logistic center) due to increased case volume, higher case revenue relative to direct costs, and shorter length of stay. The ability of properly modified ICU telemedicine programs to increase case volume and access to high-quality critical care with improved annual direct contribution margins suggests that there is a financial argument to encourage the wider adoption of ICU telemedicine. Copyright © 2016 American College of Chest Physicians. Published by Elsevier Inc. All rights reserved.
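
The primary financial outcome in record 6 is defined arithmetically: aggregated annual case revenue minus annual case direct costs, with the telemedicine program's operating costs counted on the cost side. A toy sketch with made-up numbers and hypothetical names, not the study's actual accounting model:

```python
def direct_contribution_margin(case_revenue, case_direct_costs, program_costs=0.0):
    """Annual direct contribution margin: summed case revenue minus
    summed case direct costs minus program operating costs."""
    return sum(case_revenue) - sum(case_direct_costs) - program_costs
```

The study's reported improvement comes from the same identity: more cases (higher summed revenue) and shorter stays (lower per-case direct costs) both widen the margin.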

  7. Putting the ‘patient’ in patient safety: a qualitative study of consumer experiences

    PubMed Central

    Rathert, Cheryl; Brandt, Julie; Williams, Eric S.

    2011-01-01

    Abstract Background  Although patient safety has been studied extensively, little research has directly examined patient and family (consumer) perceptions. Evidence suggests that clinicians define safety differently from consumers, e.g. clinicians focus more on outcomes, whereas consumers may focus more on processes. Consumer perceptions of patient safety are important for several reasons. First, health‐care policy leaders have been encouraging patients and families to take a proactive role in ensuring patient safety; therefore, an understanding of how patients define safety is needed. Second, consumer perceptions of safety could influence outcomes such as trust and satisfaction or compliance with treatment protocols. Finally, consumer perspectives could be an additional lens for viewing complex systems and processes for quality improvement efforts. Objectives  To qualitatively explore acute care consumer perceptions of patient safety. Design and methods  Thirty‐nine individuals with a recent overnight hospital visit participated in one of four group interviews. Analysis followed an interpretive analytical approach. Results  Three basic themes were identified: Communication, staffing issues and medication administration. Consumers associated care process problems, such as delays or lack of information, with safety rather than as service quality problems. Participants agreed that patients need family caregivers as advocates. Conclusions  Consumers seem acutely aware of care processes they believe pose risks to safety. Perceptual measures of patient safety and quality may help to identify areas where there are higher risks of preventable adverse events. PMID:21624026

  8. Automating data analysis for two-dimensional gas chromatography/time-of-flight mass spectrometry non-targeted analysis of comparative samples.

    PubMed

    Titaley, Ivan A; Ogba, O Maduka; Chibwe, Leah; Hoh, Eunha; Cheong, Paul H-Y; Simonich, Staci L Massey

    2018-03-16

Non-targeted analysis of environmental samples, using comprehensive two-dimensional gas chromatography coupled with time-of-flight mass spectrometry (GC × GC/ToF-MS), poses significant data analysis challenges due to the large number of possible analytes. Non-targeted data analysis of complex mixtures is prone to human bias and is laborious, particularly for comparative environmental samples such as contaminated soil pre- and post-bioremediation. To address this research bottleneck, we developed OCTpy, a Python™ script that acts as a data reduction filter to automate GC × GC/ToF-MS data analysis from LECO® ChromaTOF® software and facilitates selection of analytes of interest based on peak area comparison between comparative samples. We used data from polycyclic aromatic hydrocarbon (PAH) contaminated soil, pre- and post-bioremediation, to assess the effectiveness of OCTpy in facilitating the selection of analytes that have formed or degraded following treatment. Using datasets from the soil extracts pre- and post-bioremediation, OCTpy selected, on average, 18% of the initial suggested analytes generated by the LECO® ChromaTOF® software Statistical Compare feature. Based on this list, 63-100% of the candidate analytes identified by a highly trained individual were also selected by OCTpy. This process was accomplished in several minutes per sample, whereas manual data analysis took several hours per sample. OCTpy automates the analysis of complex mixtures of comparative samples, reduces the potential for human error during heavy data handling and decreases data analysis time by at least tenfold. Copyright © 2018 Elsevier B.V. All rights reserved.
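
The abstract describes OCTpy's core idea (flag analytes whose peak areas differ between comparative samples) without showing its interface, so the following is a hypothetical re-implementation of that idea only; the names, the fold-change threshold, and the treatment of missing analytes are all assumptions, not OCTpy's actual API.

```python
def select_changed_analytes(pre_areas, post_areas, fold_change=2.0):
    """Keep analytes whose peak area changed by at least `fold_change`
    between pre- and post-treatment samples.

    pre_areas / post_areas: dicts mapping analyte name -> peak area.
    An analyte absent from one sample is treated as area 0, i.e. newly
    formed or fully degraded.
    """
    selected = {}
    for name in set(pre_areas) | set(post_areas):
        pre = pre_areas.get(name, 0.0)
        post = post_areas.get(name, 0.0)
        if pre == 0.0 or post == 0.0:
            selected[name] = "formed" if pre == 0.0 else "degraded"
        elif post / pre >= fold_change:
            selected[name] = "increased"
        elif pre / post >= fold_change:
            selected[name] = "decreased"
    return selected
```

Such a filter is what turns thousands of candidate peaks into the short list a trained analyst then reviews, which is where the reported tenfold time saving comes from.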

  9. Will Public Pre-K Really Close Achievement Gaps? Gaps in Prekindergarten Quality between Students and across States

    ERIC Educational Resources Information Center

    Valentino, Rachel

    2018-01-01

    Publicly funded pre-K is often touted as a means to narrow achievement gaps, but this goal is less likely to be achieved if poor and/or minority children do not, at a minimum, attend equal quality pre-K as their non-poor, non-minority peers. In this paper, I find large "quality gaps" in public pre-K between poor, minority students and…

  10. Implementation of standardization in clinical practice: not always an easy task.

    PubMed

    Panteghini, Mauro

    2012-02-29

    As soon as a new reference measurement system is adopted, clinical validation of correctly calibrated commercial methods should take place. Tracing back the calibration of routine assays to a reference system can actually modify the relation of analyte results to existing reference intervals and decision limits and this may invalidate some of the clinical decision-making criteria currently used. To maintain the accumulated clinical experience, the quantitative relationship to the previous calibration system should be established and, if necessary, the clinical decision-making criteria should be adjusted accordingly. The implementation of standardization should take place in a concerted action of laboratorians, manufacturers, external quality assessment scheme organizers and clinicians. Dedicated meetings with manufacturers should be organized to discuss the process of assay recalibration and studies should be performed to obtain convincing evidence that the standardization works, improving result comparability. Another important issue relates to the surveillance of the performance of standardized assays through the organization of appropriate analytical internal and external quality controls. Last but not least, uncertainty of measurement that fits for this purpose must be defined across the entire traceability chain, starting with the available reference materials, extending through the manufacturers and their processes for assignment of calibrator values and ultimately to the final result reported to clinicians by laboratories.

  11. Quality-assurance results for routine water analyses in U.S. Geological Survey laboratories, water year 1998

    USGS Publications Warehouse

    Ludtke, Amy S.; Woodworth, Mark T.; Marsh, Philip S.

    2000-01-01

    The U.S. Geological Survey operates a quality-assurance program based on the analyses of reference samples for two laboratories: the National Water Quality Laboratory and the Quality of Water Service Unit. Reference samples that contain selected inorganic, nutrient, and low-level constituents are prepared and submitted to the laboratory as disguised routine samples. The program goal is to estimate precision and bias for as many analytical methods offered by the participating laboratories as possible. Blind reference samples typically are submitted at a rate of 2 to 5 percent of the annual environmental-sample load for each constituent. The samples are distributed to the laboratories throughout the year. The reference samples are subject to the identical laboratory handling, processing, and analytical procedures as those applied to environmental samples and, therefore, have been used as an independent source to verify bias and precision of laboratory analytical methods and ambient water-quality measurements. The results are stored permanently in the National Water Information System and the Blind Sample Project's data base. During water year 1998, 95 analytical procedures were evaluated at the National Water Quality Laboratory and 63 analytical procedures were evaluated at the Quality of Water Service Unit. An overall evaluation of the inorganic and low-level constituent data for water year 1998 indicated 77 of 78 analytical procedures at the National Water Quality Laboratory met the criteria for precision. Silver (dissolved, inductively coupled plasma-mass spectrometry) was determined to be imprecise. 
Five of 78 analytical procedures showed bias throughout the range of reference samples: chromium (dissolved, inductively coupled plasma-atomic emission spectrometry), dissolved solids (dissolved, gravimetric), lithium (dissolved, inductively coupled plasma-atomic emission spectrometry), silver (dissolved, inductively coupled plasma-mass spectrometry), and zinc (dissolved, inductively coupled plasma-mass spectrometry). At the National Water Quality Laboratory during water year 1998, lack of precision was indicated for 2 of 17 nutrient procedures: ammonia as nitrogen (dissolved, colorimetric) and orthophosphate as phosphorus (dissolved, colorimetric). Bias was indicated throughout the reference sample range for ammonia as nitrogen (dissolved, colorimetric, low level) and nitrate plus nitrite as nitrogen (dissolved, colorimetric, low level). All analytical procedures tested at the Quality of Water Service Unit during water year 1998 met the criteria for precision. One of the 63 analytical procedures indicated a bias throughout the range of reference samples: aluminum (whole-water recoverable, inductively coupled plasma-atomic emission spectrometry, trace).

  12. Perceptions of Pre-Service Teachers on Student Burnout, Occupational Anxiety and Faculty Life Quality

    ERIC Educational Resources Information Center

    Türkoglu, Muhammet Emin; Cansoy, Ramazan

    2017-01-01

    Perceptions of pre-service teachers on burnout, occupational anxiety and faculty life quality were investigated in this research. The research group consisted of 461 pre-service teachers in total studying at Afyon Kocatepe University faculty of education. "Maslach Burnout Inventory-Student Form," "Faculty Life Quality Scale"…

  13. Post-standardization of routine creatinine assays: are they suitable for clinical applications.

    PubMed

    Jassam, Nuthar; Weykamp, Cas; Thomas, Annette; Secchiero, Sandra; Sciacovelli, Laura; Plebani, Mario; Thelen, Marc; Cobbaert, Christa; Perich, Carmen; Ricós, Carmen; Paula, Faria A; Barth, Julian H

    2017-05-01

    Introduction Reliable serum creatinine measurements are of vital importance for the correct classification of chronic kidney disease and early identification of kidney injury. The National Kidney Disease Education Programme working group and other groups have defined clinically acceptable analytical limits for creatinine methods. The aim of this study was to re-evaluate the performance of routine creatinine methods in the light of these defined limits so as to assess their suitability for clinical practice. Method In collaboration with the Dutch External Quality Assurance scheme, six frozen commutable samples, with a creatinine concentration ranging from 80 to 239  μmol/L and traceable to isotope dilution mass spectrometry, were circulated to 91 laboratories in four European countries for creatinine measurement and estimated glomerular filtration rate calculation. Two out of the six samples were spiked with glucose to give high and low final concentrations of glucose. Results Results from 89 laboratories were analysed for bias, imprecision (%CV) for each creatinine assay and total error for estimated glomerular filtration rate. The participating laboratories used analytical instruments from four manufacturers; Abbott, Beckman, Roche and Siemens. All enzymatic methods in this study complied with the National Kidney Disease Education Programme working group recommended limits of bias of 5% above a creatinine concentration of 100  μmol/L. They also did not show any evidence of interference from glucose. In addition, they also showed compliance with the clinically recommended %CV of ≤4% across the analytical range. In contrast, the Jaffe methods showed variable performance with regard to the interference of glucose and unsatisfactory bias and precision. Conclusion Jaffe-based creatinine methods still exhibit considerable analytical variability in terms of bias, imprecision and lack of specificity, and this variability brings into question their clinical utility. 
We believe that clinical laboratories and manufacturers should work together to phase out the use of relatively non-specific Jaffe methods and replace them with more specific methods that are enzyme based.
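
The compliance checks cited in record 13 (bias within 5% above a creatinine concentration of 100 μmol/L, %CV ≤ 4%) reduce to basic statistics over replicate results against an IDMS-traceable target. A minimal illustrative sketch with hypothetical names, not the study's actual evaluation protocol:

```python
import statistics

def evaluate_assay(measured_umol_l, target_umol_l):
    """Bias (%) and imprecision (%CV) for replicate creatinine results
    against a traceable target value, checked against the limits cited
    in the text: |bias| <= 5% and CV <= 4%."""
    mean = statistics.mean(measured_umol_l)
    bias_pct = 100.0 * (mean - target_umol_l) / target_umol_l
    cv_pct = 100.0 * statistics.stdev(measured_umol_l) / mean
    compliant = abs(bias_pct) <= 5.0 and cv_pct <= 4.0
    return bias_pct, cv_pct, compliant
```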

  14. Clinical Laboratory Practice Recommendations for the Use of Cardiac Troponin in Acute Coronary Syndrome: Expert Opinion from the Academy of the American Association for Clinical Chemistry and the Task Force on Clinical Applications of Cardiac Bio-Markers of the International Federation of Clinical Chemistry and Laboratory Medicine.

    PubMed

    Wu, Alan H B; Christenson, Robert H; Greene, Dina N; Jaffe, Allan S; Kavsak, Peter A; Ordonez-Llanos, Jordi; Apple, Fred S

    2018-04-01

This document is an essential companion to the third iteration of the National Academy of Clinical Biochemistry [NACB, now the American Association for Clinical Chemistry (AACC) Academy] Laboratory Medicine Practice Guidelines (LMPG) on cardiac markers. The expert consensus recommendations were drafted in collaboration with the International Federation of Clinical Chemistry and Laboratory Medicine Task Force on Clinical Applications of Bio-Markers (IFCC TF-CB). We determined that there is sufficient clinical guidance on the use of cardiac troponin (cTn) testing from clinical practice groups. Thus, in this expert consensus document, we focused on clinical laboratory practice recommendations for high-sensitivity (hs)-cTn assays. This document utilized the expert opinion class of evidence to focus on the following 10 topics: (a) quality control (QC) utilization, (b) validation of the lower reportable analytical limits, (c) units to be used in reporting measurable concentrations for patients and QC materials, (d) 99th percentile sex-specific upper reference limits to define the reference interval, (e) criteria required to define hs-cTn assays, (f) communication with clinicians and the laboratory's role in educating clinicians regarding the influence of preanalytic and analytic problems that can confound assay results, (g) studies on hs-cTn assays and how authors need to document preanalytical and analytical variables, (h) harmonizing and standardizing assay results and the role of commutable materials, (i) time to reporting of results from sample receipt and sample collection, and (j) changes in hs-cTn concentrations over time and the role of both analytical and biological variabilities in interpreting results of serial blood collections. © 2017 American Association for Clinical Chemistry.

  15. Compositional Analysis of Martian Soil: Synergism of APEX and MECA Experiments on MPS 2001

    NASA Technical Reports Server (NTRS)

    Arvidson, R.; Marshall, J.

    1999-01-01

    The APEX (ATHENA Precursor Experiment) payload for the Mars 2001 mission will analyze soil and dust with a multispectral panoramic imager and an emission spectrometer on a mast on the lander, a Moessbauer spectrometer on the lander robotic arm (RA), and APXS measurements on the Marie Curie rover. These analytical methods will provide data on elemental abundances and mineralogy. The MECA payload on the lander will apply microscopy, AFM, wet chemistry, adhesive substrates, and electrometry to determine the shape and size of particles in the soil and dust, the presence of toxic substances, and electrostatic, magnetic, and hardness qualities of particles. The two experiments will complement one another through several interactions: (1) The panoramic imager provides the geological setting in which both APEX and MECA samples are acquired, (2) The RA provides samples to MECA from the surface and subsurface and will permit APEX analytical tools access to materials below the immediate surface, (3) Comparisons can be made between elemental analyses of the Moessbauer, IR, APXS on APEX and the wet chemistry of MECA which will define trace elements (ionic species in solution) and soil redox potential and conductivity. (4) APEX bulk compositional measurements will place MECA trace measurements in context, and similarly, MECA microscopy will provide particle size data that may correlate with compositional differences determined by the APEX instruments. Additionally, lithic fragments viewed by the MECA microscope station should correlate with mineral/rock species inferred by APEX data, (5) If APEX instruments detect quartz for example, the scratch plates of the MECA microscope stage will define if a mineral of this hardness is registered during abrasion tests. 
This is by no means an exhaustive list of potential interactions, but it is clear that both the sheer number of analytical techniques and their complementarity should provide an analytically powerful capability for both planetary and HEDS communities.

  17. Proteomic analysis of serum and sputum analytes distinguishes controlled and poorly controlled asthmatics.

    PubMed

    Kasaian, M T; Lee, J; Brennan, A; Danto, S I; Black, K E; Fitz, L; Dixon, A E

    2018-04-17

    A major goal of asthma therapy is to achieve disease control, with maintenance of lung function, reduced need for rescue medication, and prevention of exacerbation. Despite current standard of care, up to 70% of patients with asthma remain poorly controlled. Analysis of serum and sputum biomarkers could offer insights into parameters associated with poor asthma control. To identify signatures as determinants of asthma disease control, we performed proteomics using Olink proximity extension analysis. Up to 3 longitudinal serum samples were collected from 23 controlled and 25 poorly controlled asthmatics. Nine of the controlled and 8 of the poorly controlled subjects also provided 2 longitudinal sputum samples. The study included an additional cohort of 9 subjects whose serum was collected within 48 hours of asthma exacerbation. Two separate pre-defined Proseek Multiplex panels (INF and CVDIII) were run to quantify 181 separate protein analytes in serum and sputum. Panels consisting of 9 markers in serum (CCL19, CCL25, CDCP1, CCL11, FGF21, FGF23, Flt3L, IL-10Rβ, IL-6) and 16 markers in sputum (tPA, KLK6, RETN, ADA, MMP9, Chit1, GRN, PGLYRP1, MPO, HGF, PRTN3, DNER, PI3, Chi3L1, AZU1, and OPG) distinguished controlled and poorly controlled asthmatics. The sputum analytes were consistent with a pattern of neutrophil activation associated with poor asthma control. The serum analyte profile of the exacerbation cohort resembled that of the controlled group rather than that of the poorly controlled asthmatics, possibly reflecting a therapeutic response to systemic corticosteroids. Proteomic profiles in serum and sputum distinguished controlled and poorly controlled asthmatics, and were maintained over time. Findings support a link between sputum neutrophil markers and loss of asthma control. © 2018 John Wiley & Sons Ltd.

  18. Reference materials for cellular therapeutics.

    PubMed

    Bravery, Christopher A; French, Anna

    2014-09-01

    The development of cellular therapeutics (CTP) takes place over many years, and, where successful, the developer will anticipate the product to be in clinical use for decades. Successful demonstration of manufacturing and quality consistency is dependent on the use of complex analytical methods; thus, the risk of process and method drift over time is high. The use of reference materials (RM) is an established scientific principle and as such also a regulatory requirement. The various uses of RM in the context of CTP manufacturing and quality are discussed, along with why they are needed for living cell products and the analytical methods applied to them. Relatively few consensus RM exist that are suitable for even common methods used by CTP developers, such as flow cytometry. Others have also identified this need and made proposals; however, great care will be needed to ensure any consensus RM that result are fit for purpose. Such consensus RM probably will need to be applied to specific standardized methods, and the idea that a single RM can have wide applicability is challenged. Written standards, including standardized methods, together with appropriate measurement RM are probably the most appropriate way to define specific starting cell types. The characteristics of a specific CTP will to some degree deviate from those of the starting cells; consequently, a product RM remains the best solution where feasible. Each CTP developer must consider how and what types of RM should be used to ensure the reliability of their own analytical measurements. Copyright © 2014 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.

  19. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--QA ANALYTICAL RESULTS FOR PARTICULATE MATTER IN BLANK SAMPLES

    EPA Science Inventory

    The Particulate Matter in Blank Samples data set contains the analytical results for measurements of two particle sizes in 12 samples. Filters were pre-weighed, loaded into impactors, kept unexposed in the laboratory, unloaded and post-weighed. Positive weight gains for laborat...

  20. The development and validation of a health-related quality of life questionnaire for pre-school children with a chronic heart disease.

    PubMed

    Niemitz, M; Seitz, D C M; Oebels, M; Schranz, D; Hövels-Gürich, H; Hofbeck, M; Kaulitz, R; Galm, C; Berger, F; Nagdymann, N; Stiller, B; Borth-Bruhns, T; Konzag, I; Balmer, C; Goldbeck, L

    2013-12-01

Heart diseases are often associated with residual injuries, persisting functional restrictions, and long-term sequelae for psychosocial development. Currently, there are no disease-specific instruments to assess the health-related quality of life (HrQoL) of pre-school children. The aims of this study were to develop a parent proxy instrument to measure the HrQoL of children aged 3-7 years with a heart disease and to confirm its validity and reliability. Items from the Preschool Pediatric Cardiac Quality of Life Inventory (P-PCQLI) were generated through focus groups of caregivers. In a pilot study, comprehensibility and feasibility were tested. Five subdimensions were defined theoretically. Psychometric properties were analysed within a multicentre study with 167 parental caregivers. The final 52-item instrument contains a total score covering five moderately inter-correlated dimensions. The total score of the questionnaire showed a very high internal consistency (Cronbach's α = 0.95). The test-retest correlation was r(tt) = 0.96. External validity was indicated by higher correlations (r = 0.24-0.68) with a generic paediatric quality of life questionnaire (KINDL) compared to the Strengths and Difficulties Questionnaire (r = 0.17-0.59). Low P-PCQLI total scores were significantly associated with inpatient as opposed to outpatient treatment (t = 6.04, p < .001), with at least moderate disease severity on the NYHA classification (t = 5.05, p < .001) and with poorer prognosis (t = 5.53, p < .001) as estimated by the physician. The P-PCQLI is reliable and valid for pre-school children with a heart disease. It could be used as a screening instrument in routine care, and for evaluation of HrQoL outcomes in clinical trials and intervention research.
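
The internal-consistency figure reported in record 20 is Cronbach's α, which is a short computation over per-item variances and the variance of the total score. A minimal sketch (hypothetical names and toy data, no relation to the P-PCQLI dataset):

```python
import statistics

def cronbach_alpha(item_scores):
    """Cronbach's alpha from per-item score columns.

    item_scores: list of columns, each column holding one item's scores
    across all respondents (sample variance is used throughout).
    """
    k = len(item_scores)
    item_vars = [statistics.variance(col) for col in item_scores]
    totals = [sum(vals) for vals in zip(*item_scores)]
    total_var = statistics.variance(totals)
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)
```

When items move together, the total-score variance dominates the summed item variances and α approaches 1, which is what a value of 0.95 indicates.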

  1. Including robustness in multi-criteria optimization for intensity-modulated proton therapy

    NASA Astrophysics Data System (ADS)

    Chen, Wei; Unkelbach, Jan; Trofimov, Alexei; Madden, Thomas; Kooy, Hanne; Bortfeld, Thomas; Craft, David

    2012-02-01

We present a method to include robustness in a multi-criteria optimization (MCO) framework for intensity-modulated proton therapy (IMPT). The approach allows one to simultaneously explore the trade-off between different objectives as well as the trade-off between robustness and nominal plan quality. In MCO, a database of plans, each emphasizing different treatment planning objectives, is pre-computed to approximate the Pareto surface. An IMPT treatment plan that strikes the best balance between the different objectives can be selected by navigating on the Pareto surface. In our approach, robustness is integrated into MCO by adding robustified objectives and constraints to the MCO problem. Uncertainties (or errors) of the robust problem are modeled by pre-calculated dose-influence matrices for a nominal scenario and a number of pre-defined error scenarios (shifted patient positions, proton beam undershoot and overshoot). Objectives and constraints can be defined for the nominal scenario, thus characterizing nominal plan quality. A robustified objective represents the worst objective function value that can be realized for any of the error scenarios and thus provides a measure of plan robustness. The optimization method is based on a linear projection solver and is capable of handling large problem sizes resulting from a fine dose grid resolution, many scenarios, and a large number of proton pencil beams. A base-of-skull case is used to demonstrate the robust optimization method. It is demonstrated that the robust optimization method reduces the sensitivity of the treatment plan to setup and range errors to a degree that is not achieved by a safety margin approach. A chordoma case is analyzed in more detail to demonstrate the involved trade-offs between target underdose and brainstem sparing as well as robustness and nominal plan quality. The latter illustrates the advantage of MCO in the context of robust planning. 
For all cases examined, the robust optimization for each Pareto optimal plan takes less than 5 min on a standard computer, making a computationally friendly interface for the planner possible. In conclusion, the uncertainty pertinent to the IMPT procedure can be reduced during treatment planning by optimizing plans that emphasize different treatment objectives, including robustness, and then interactively searching the solution Pareto surface for the most-preferred one.
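The robustified objective described in the abstract (the worst objective value over the pre-defined error scenarios, each represented by its own dose-influence matrix) can be sketched as follows. The matrices, prescription dose, and beam weights below are invented toy data, not values from the study:

```python
import numpy as np

# Hypothetical toy problem: one dose-influence matrix per scenario
# (nominal plus shifted-position and over/undershoot scenarios).
rng = np.random.default_rng(0)
n_voxels, n_beams = 50, 10
scenarios = [rng.uniform(0.5, 1.5, (n_voxels, n_beams)) for _ in range(4)]
prescription = np.full(n_voxels, 60.0)      # target dose (Gy)
weights = rng.uniform(0.0, 2.0, n_beams)    # candidate pencil-beam weights

def underdose_objective(dose_matrix, w, presc):
    """Mean squared underdose of the target, a common planning objective."""
    dose = dose_matrix @ w
    return float(np.mean(np.maximum(presc - dose, 0.0) ** 2))

# Nominal plan quality: the objective evaluated in the nominal scenario.
nominal = underdose_objective(scenarios[0], weights, prescription)
# Robustified objective: worst value over all pre-defined error scenarios.
robust = max(underdose_objective(D, weights, prescription) for D in scenarios)
```

Because the worst case is taken over a set that includes the nominal scenario, the robustified objective can never be better than the nominal one, which is exactly the robustness/nominal-quality trade-off the planner navigates.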

  2. Student Writing Accepted as High-Quality Responses to Analytic Text-Based Writing Tasks

    ERIC Educational Resources Information Center

    Wang, Elaine; Matsumura, Lindsay Clare; Correnti, Richard

    2018-01-01

    Literacy standards increasingly emphasize the importance of analytic text-based writing. Little consensus exists, however, around what high-quality student responses should look like in this genre. In this study, we investigated fifth-grade students' writing in response to analytic text-based writing tasks (15 teachers, 44 writing tasks, 88 pieces…

  3. EFFECT OF STRUCTURED PHYSICAL ACTIVITY ON SLEEP-WAKE BEHAVIORS IN SEDENTARY ELDERS WITH MOBILITY LIMITATIONS

    PubMed Central

    Vaz Fragoso, Carlos A.; Miller, Michael E.; King, Abby C.; Kritchevsky, Stephen B.; Liu, Christine K.; Myers, Valerie H.; Nadkarni, Neelesh K.; Pahor, Marco; Spring, Bonnie J.; Gill, Thomas M.

    2016-01-01

    OBJECTIVE To evaluate the effect of structured physical activity on sleep-wake behaviors in sedentary community-dwelling elders with mobility limitations. DESIGN Multicenter, randomized trial of moderate-intensity physical activity versus health education, with sleep-wake behaviors pre-specified as a tertiary outcome over a planned intervention period ranging between 24 and 30 months. SETTING Lifestyle Interventions and Independence in Elder (LIFE) Study. PARTICIPANTS 1635 community-dwelling persons, aged 70–89 years, who were initially sedentary with a Short Physical Performance Battery score <10. MEASUREMENTS Sleep-wake behaviors were evaluated by the Insomnia Severity Index (ISI) (≥8 defined insomnia), Epworth Sleepiness Scale (ESS) (≥10 defined daytime drowsiness), and Pittsburgh Sleep Quality Index (PSQI) (> 5 defined poor sleep quality) — administered at baseline and subsequently at 6, 18, and 30 months. RESULTS The randomized groups were similar on baseline demographic variables, including mean age (79 years) and sex (67% female). Relative to health education, structured physical activity significantly reduced the likelihood of having poor sleep quality (adjusted odds ratios [adjOR] for PSQI >5 of 0.80 [0.68, 0.94]), including a reduction in new cases (adjOR for PSQI >5 of 0.70 [0.54, 0.89]) but not in resolution of prevalent cases (adjOR for PSQI ≤5 of 1.13 [0.90, 1.43]). No significant intervention effects were observed for ISI or ESS. CONCLUSION Structured physical activity reduced the likelihood of developing poor sleep quality (PSQI >5) over the intervention period, when compared with health education, but had no effect on prevalent cases of poor sleep quality, or on sleep-wake behaviors evaluated by the ISI or ESS. These results suggest that the benefit of physical activity in this sample was preventive and limited to sleep-wake behaviors evaluated by the PSQI. PMID:26115386

  4. Total Quality Management (TQM), an Overview

    DTIC Science & Technology

    1991-09-01

Quality Management (TQM). It discusses the reasons TQM is a current growth industry, what it is, and how one implements it. It describes the basic analytical tools, statistical process control, some advanced analytical tools, tools used by process improvement teams to enhance their own operations, and action plans for making improvements. The final sections discuss assessing quality efforts and measuring quality.

  5. Using Learning Analytics to Enhance Student Learning in Online Courses Based on Quality Matters Standards

    ERIC Educational Resources Information Center

    Martin, Florence; Ndoye, Abdou; Wilkins, Patricia

    2016-01-01

    Quality Matters is recognized as a rigorous set of standards that guide the designer or instructor to design quality online courses. We explore how Quality Matters standards guide the identification and analysis of learning analytics data to monitor and improve online learning. Descriptive data were collected for frequency of use, time spent, and…

  6. A note on φ-analytic conformal vector fields

    NASA Astrophysics Data System (ADS)

    Deshmukh, Sharief; Bin Turki, Nasser

    2017-09-01

Taking a cue from the analytic vector fields on a complex manifold, φ-analytic conformal vector fields are defined on a Riemannian manifold (Deshmukh and Al-Solamy in Colloq. Math. 112(1):157-161, 2008). In this paper, we use φ-analytic conformal vector fields to find new characterizations of the n-sphere Sn(c) and the Euclidean space (Rn,<,> ).
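For context, the conformal condition underlying this definition is standard: a vector field ξ on a Riemannian manifold (M, g) is conformal when its flow rescales the metric, i.e.

```latex
\mathcal{L}_{\xi}\, g = 2f\, g ,
```

where \(\mathcal{L}_{\xi}\) is the Lie derivative along \(\xi\) and \(f\) is a smooth potential function (not specified in the abstract). The additional φ-analyticity requirement is particular to the cited reference and is not reproduced here.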

  7. Measuring cross-cultural patient safety: identifying barriers and developing performance indicators.

    PubMed

    Walker, Roger; St Pierre-Hansen, Natalie; Cromarty, Helen; Kelly, Len; Minty, Bryanne

    2010-01-01

    Medical errors and cultural errors threaten patient safety. We know that access to care, quality of care and clinical safety are all impacted by cultural issues. Numerous approaches to describing cultural barriers to patient safety have been developed, but these taxonomies do not provide a useful set of tools for defining the nature of the problem and consequently do not establish a sound base for problem solving. The Sioux Lookout Meno Ya Win Health Centre has implemented a cross-cultural patient safety (CCPS) model (Walker 2009). We developed an analytical CCPS framework within the organization, and in this article, we detail the validation process for our framework by way of a literature review and surveys of local and international healthcare professionals. We reinforce the position that while cultural competency may be defined by the service provider, cultural safety is defined by the client. In addition, we document the difficulties surrounding the measurement of cultural competence in terms of patient outcomes, which is an underdeveloped dimension of the field of patient safety. We continue to explore the correlation between organizational performance and measurable patient outcomes.

  8. A Log Logistic Survival Model Applied to Hypobaric Decompression Sickness

    NASA Technical Reports Server (NTRS)

    Conkin, Johnny

    2001-01-01

    Decompression sickness (DCS) is a complex, multivariable problem. A mathematical description or model of the likelihood of DCS requires a large amount of quality research data, ideas on how to define a decompression dose using physical and physiological variables, and an appropriate analytical approach. It also requires a high-performance computer with specialized software. I have used published DCS data to develop my decompression doses, which are variants of equilibrium expressions for evolved gas plus other explanatory variables. My analytical approach is survival analysis, where the time of DCS occurrence is modeled. My conclusions can be applied to simple hypobaric decompressions - ascents lasting from 5 to 30 minutes - and, after minutes to hours, to denitrogenation (prebreathing). They are also applicable to long or short exposures, and can be used whether the sufferer of DCS is at rest or exercising at altitude. Ultimately I would like my models to be applied to astronauts to reduce the risk of DCS during spacewalks, as well as to future spaceflight crews on the Moon and Mars.
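A minimal sketch of the log-logistic survival function that gives this model its name; the parameter values below are illustrative only, and the study's actual models additionally incorporate decompression dose and other explanatory variables:

```python
def loglogistic_survival(t, alpha, beta):
    """Log-logistic survival: probability DCS has NOT occurred by time t.

    S(t) = 1 / (1 + (t / alpha)**beta), where alpha is the median time
    to DCS and beta controls the spread. Illustrative form only.
    """
    return 1.0 / (1.0 + (t / alpha) ** beta)

# Hypothetical example: median time to DCS of 10 units, beta = 2.
half = loglogistic_survival(10.0, 10.0, 2.0)   # survival at the median
```

By construction S(α) = 0.5, and S(t) decreases monotonically in t, which is why the median time to DCS is a natural summary of such a fit.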

  9. The net fractional depth dose: a basis for a unified analytical description of FDD, TAR, TMR, and TPR.

    PubMed

    van de Geijn, J; Fraass, B A

    1984-01-01

    The net fractional depth dose (NFD) is defined as the fractional depth dose (FDD) corrected for inverse square law. Analysis of its behavior as a function of depth, field size, and source-surface distance has led to an analytical description with only seven model parameters related to straightforward physical properties. The determination of the characteristic parameter values requires only seven experimentally determined FDDs. The validity of the description has been tested for beam qualities ranging from 60Co gamma rays to 18-MV x rays, using published data from several different sources as well as locally measured data sets. The small number of model parameters is attractive for computer or hand-held calculator applications. The small amount of required measured data is important in view of practical data acquisition for implementation of a computer-based dose calculation system. The generating function allows easy and accurate generation of FDD, tissue-air ratio, tissue-maximum ratio, and tissue-phantom ratio tables.
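One plausible reading of "FDD corrected for inverse square law" is dividing out the inverse-square falloff between the reference depth and depth d; the exact convention is the paper's, so treat the form below as an assumption:

```python
def net_fractional_depth_dose(fdd, depth_cm, ref_depth_cm, ssd_cm):
    # Assumed correction: NFD(d) = FDD(d) * ((SSD + d) / (SSD + d_ref))**2,
    # which removes the inverse-square component from the fractional
    # depth dose, leaving only attenuation and scatter effects.
    return fdd * ((ssd_cm + depth_cm) / (ssd_cm + ref_depth_cm)) ** 2

# At the reference depth the correction factor is 1, so NFD equals FDD.
nfd_ref = net_fractional_depth_dose(1.0, 1.5, 1.5, 100.0)
```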

  10. Net fractional depth dose: a basis for a unified analytical description of FDD, TAR, TMR, and TPR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    van de Geijn, J.; Fraass, B.A.

The net fractional depth dose (NFD) is defined as the fractional depth dose (FDD) corrected for inverse square law. Analysis of its behavior as a function of depth, field size, and source-surface distance has led to an analytical description with only seven model parameters related to straightforward physical properties. The determination of the characteristic parameter values requires only seven experimentally determined FDDs. The validity of the description has been tested for beam qualities ranging from 60Co gamma rays to 18-MV x rays, using published data from several different sources as well as locally measured data sets. The small number of model parameters is attractive for computer or hand-held calculator applications. The small amount of required measured data is important in view of practical data acquisition for implementation of a computer-based dose calculation system. The generating function allows easy and accurate generation of FDD, tissue-air ratio, tissue-maximum ratio, and tissue-phantom ratio tables.

  11. Concentrations of tritium and strontium-90 in water from selected wells at the Idaho National Engineering Laboratory after purging one, two, and three borehole volumes

    USGS Publications Warehouse

    Bartholomay, R.C.

    1993-01-01

    Water from 11 wells completed in the Snake River Plain aquifer at the Idaho National Engineering Laboratory was sampled as part of the U.S. Geological Survey's quality assurance program to determine the effect of purging different borehole volumes on tritium and strontium-90 concentrations. Wells were selected for sampling on the basis of the length of time it took to purge a borehole volume of water. Samples were collected after purging one, two, and three borehole volumes. The U.S. Department of Energy's Radiological and Environmental Sciences Laboratory provided analytical services. Statistics were used to determine the reproducibility of analytical results. The comparison between tritium and strontium-90 concentrations after purging one and three borehole volumes and two and three borehole volumes showed that all but two sample pairs with defined numbers were in statistical agreement. Results indicate that concentrations of tritium and strontium-90 are not affected measurably by the number of borehole volumes purged.

  12. Fast and global authenticity screening of honey using ¹H-NMR profiling.

    PubMed

    Spiteri, Marc; Jamin, Eric; Thomas, Freddy; Rebours, Agathe; Lees, Michèle; Rogers, Karyne M; Rutledge, Douglas N

    2015-12-15

An innovative analytical approach was developed to tackle the most common adulterations and quality deviations in honey. Using proton-NMR profiling coupled with suitable quantification procedures and statistical models, analytical criteria were defined to check the authenticity of both mono- and multi-floral honey. The reference data set used was a worldwide collection of more than 800 honeys, covering most of the economically significant botanical and geographical origins. Typical plant nectar markers can be used to check monofloral honey labeling. Spectral patterns and natural variability were established for multifloral honeys, and marker signals for sugar syrups were identified by statistical comparison with a commercial dataset of ca. 200 honeys. Although the results are qualitative, spiking experiments have confirmed the ability of the method to detect sugar addition down to 10% levels in favorable cases. Within the same NMR experiments, quantification of glucose, fructose, sucrose and 5-HMF (regulated parameters) was performed. Finally, markers showing the onset of fermentation are described. Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. Identifying and Coordinating Care for Complex Patients

    PubMed Central

    Rudin, Robert S.; Gidengil, Courtney A.; Predmore, Zachary; Schneider, Eric C.; Sorace, James; Hornstein, Rachel

    2017-01-01

Abstract In the United States, a relatively small proportion of complex patients (defined as having multiple comorbidities, high risk for poor outcomes, and high cost) incur most of the nation's health care costs. Improved care coordination and management of complex patients could reduce costs while increasing quality of care. However, care coordination efforts face multiple challenges, such as segmenting populations of complex patients to better match their needs with the design of specific interventions, understanding how to reduce spending, and integrating care coordination programs into providers' care delivery processes. Innovative uses of analytics and health information technology (HIT) may address these challenges. Rudin and colleagues at RAND completed a literature review and held discussions with subject matter experts, concluding that analytics and HIT are being used in innovative ways to coordinate care for complex patients but that the capabilities are limited, evidence of their effectiveness is lacking, challenges are substantial, and important foundational work is still needed. PMID:28845354

  14. Tandem mass spectrometry data quality assessment by self-convolution.

    PubMed

    Choo, Keng Wah; Tham, Wai Mun

    2007-09-20

Many algorithms have been developed for deciphering tandem mass spectrometry (MS) data sets. They fall essentially into two classes: the first searches a database of theoretical mass spectra, while the second performs de novo sequencing from the raw mass spectrometry data. It has been noted that the quality of the mass spectra significantly affects the protein identification process in both instances. This prompted the authors to explore ways to measure the quality of MS data sets before subjecting them to the protein identification algorithms, thus allowing for more meaningful searches and an increased confidence level in the proteins identified. The proposed method measures the quality of MS data sets based on the symmetry of the b- and y-ion peaks present in an MS spectrum. Self-convolution of the MS data with its time-reversed copy was employed. Because of the symmetric nature of b-ion and y-ion peaks, the self-convolution result of a good spectrum produces its highest intensity peak at the mid-point. To reduce processing time, self-convolution was computed using the Fast Fourier Transform and its inverse, followed by the removal of the "DC" (Direct Current) component and normalisation of the data set. The quality score was defined as the ratio of the intensity at the mid-point to that of the remaining peaks of the convolution result. The method was validated using both theoretical mass spectra, with various permutations, and several real MS data sets. The results were encouraging, revealing a high rate of positive predictions for spectra with good quality scores. We have demonstrated in this work a method for determining the quality of a tandem MS data set. By pre-determining the quality of tandem MS data before subjecting them to protein identification algorithms, spurious protein predictions due to poor tandem MS data are avoided, giving scientists greater confidence in the predicted results. We conclude that the algorithm performs well and could potentially be used as a pre-processing step for all mass spectrometry based protein identification tools.
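The scoring procedure can be sketched as follows; this is a minimal reconstruction from the abstract, not the authors' implementation, and the normalisation details are assumptions:

```python
import numpy as np

def spectrum_quality(intensities):
    """Score a spectrum by self-convolution with its time-reversed copy.

    A spectrum with b/y-ion peaks mirrored about its centre concentrates
    the convolution energy at the mid-point; the score is the mid-point
    magnitude divided by the summed magnitude of the remaining points.
    """
    x = np.asarray(intensities, dtype=float)
    n = len(x)
    m = 2 * n - 1                      # full linear convolution length
    # FFT-based convolution of the spectrum with its time-reversed copy
    conv = np.fft.irfft(np.fft.rfft(x, m) * np.fft.rfft(x[::-1], m), m)
    conv -= conv.mean()                # remove the "DC" component
    conv /= np.abs(conv).max()         # normalise
    mid = abs(conv[n - 1])             # mid-point of the convolution
    rest = np.sum(np.abs(conv)) - mid
    return mid / rest if rest > 0 else float("inf")
```

As a sanity check, a spectrum consisting of a single central peak scores higher than a flat, uninformative one, matching the intended behaviour of rewarding energy concentrated at the mid-point.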

  15. Pre-Osteoarthritis

    PubMed Central

    Brittberg, Mats; Eriksson, Karl; Jurvelin, Jukka S.; Lindahl, Anders; Marlovits, Stefan; Möller, Per; Richardson, James B.; Steinwachs, Matthias; Zenobi-Wong, Marcy

    2015-01-01

Objective An attempt to define pre-osteoarthritis (OA) versus early OA and definitive osteoarthritis. Methods A group of specialists in the field of cartilage science and treatment was formed to consider the nature of OA onset and its possible diagnosis. Results Late-stage OA, necessitating total joint replacement, is the end stage of a biological process, with many previous earlier stages. Early-stage OA has been defined and involves structural changes identified by arthroscopy or radiography. The group argued that before the “early-stage OA” there must exist a stage where cellular processes, due to the presence of risk factors, have kicked into action but have not yet resulted in structural changes. The group suggested that this stage could be called “pre-osteoarthritis” (pre-OA). Conclusions The group suggests that points of initiation for OA in the knee could be defined, for example, by traumatic episodes or surgical meniscectomy. Such events may set in motion metabolic processes that could be diagnosed by modern MRI protocols or arthroscopy including probing techniques before structural changes of early OA have developed. Preventive measures should preferably be applied at this pre-OA stage in order to stop the projected OA “epidemic.” PMID:26175861

  16. The Cost of High-Quality Pre-School Education in New Jersey

    ERIC Educational Resources Information Center

    Belfield, Clive; Schwartz, Heather

    2007-01-01

    This report calculates the full cost of providing well-planned, high quality pre-school for children in New Jersey, as required under "Abbott vs. Burke" (153 NJ 480 1998). The evidence on how high-quality pre-school improves the academic performance of children is compelling. After a rapid expansion over the last decade, many children in…

  17. Accuracy of Blood Pressure-to-Height Ratio to Define Elevated Blood Pressure in Children and Adolescents: The CASPIAN-IV Study.

    PubMed

    Kelishadi, Roya; Bahreynian, Maryam; Heshmat, Ramin; Motlagh, Mohammad Esmail; Djalalinia, Shirin; Naji, Fatemeh; Ardalan, Gelayol; Asayesh, Hamid; Qorbani, Mostafa

    2016-02-01

The aim of this study was to propose a simple practical diagnostic criterion for pre-hypertension (pre-HTN) and hypertension (HTN) in the pediatric age group. This study was conducted on a nationally representative sample of 14,880 students, aged 6-18 years. HTN and pre-HTN were defined as systolic blood pressure (SBP) and/or diastolic blood pressure (DBP) ≥ 95th and 90th-95th percentile for age, gender, and height, respectively. Using the area under the curve (AUC) of the receiver operating characteristic curves, we estimated the diagnostic accuracy of two indexes, SBP-to-height ratio (SBPHR) and DBP-to-height ratio (DBPHR), in defining pre-HTN and HTN. Overall, SBPHR performed relatively well in classifying subjects as HTN (AUC 0.80-0.85) and pre-HTN (AUC 0.84-0.90). Likewise, DBPHR performed relatively well in classifying subjects as HTN (AUC 0.90-0.97) and pre-HTN (AUC 0.70-0.83). The two indexes, SBPHR and DBPHR, are valid, simple, inexpensive, and accurate tools for diagnosing pre-HTN and HTN in the pediatric age group.
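The AUC statistic used to judge SBPHR and DBPHR can be computed from its Mann-Whitney formulation; the ratio scores and hypertension labels below are hypothetical, not CASPIAN-IV data:

```python
import numpy as np

def roc_auc(scores, labels):
    """ROC AUC via the Mann-Whitney statistic: the fraction of
    (positive, negative) pairs ranked correctly, ties counting half."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels)
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Hypothetical BP-to-height ratios with binary hypertension labels:
auc = roc_auc([0.92, 0.85, 0.64, 0.55], [1, 1, 0, 0])
```

An AUC of 1.0 means the index separates hypertensive from normotensive subjects perfectly; 0.5 means it carries no information, which is the baseline the reported 0.80-0.97 values should be read against.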

  18. Simultaneous determination of 1- and 2-naphthol in human urine using on-line clean-up column-switching liquid chromatography-fluorescence detection.

    PubMed

    Preuss, Ralf; Angerer, Jürgen

    2004-03-05

We developed a new 3-D HPLC method for on-line clean-up and simultaneous quantification of two important naphthalene metabolites, 1-naphthol and 2-naphthol, in human urine. Apart from an enzymatic hydrolysis, no further sample pre-treatment is necessary. The metabolites are stripped from the urinary matrix by on-line extraction on a restricted access material pre-column (RAM RP-8), then transferred in backflush mode onto a silica-based CN-(cyano)phase column for further purification from interfering substances. By another successive column switching step both analytes are transferred with a minimum of overlapping interferences onto a C12 bonded reversed phase column with trimethylsilyl endcapping where the final separation is carried out. The entire arrangement is software controlled. Eluting analytes are quantified by fluorescence detection (227/430 nm) after an external calibration. Within a total run time of 40 min we can selectively quantify both naphthols with detection limits in the lower ppb range (1.5 and 0.5 microg/l for 1- and 2-naphthol, respectively) with excellent reliability (ensured by precision, accuracy, matrix-independency and FIOH quality assurance program participation). First results from a collective of 53 occupationally non-exposed subjects showed mean levels of 11.0 microg/l (1-naphthol) and 12.9 microg/l (2-naphthol). Among smokers (n=21) a significantly elevated mean level of urinary naphthols was determined (1-naphthol: 19.2 microg/l and 2-naphthol: 23.7 microg/l) in comparison to non-smokers (n=32; 1-naphthol: 5.6 microg/l, 2-naphthol: 5.6 microg/l).

  19. Implementation and implication of total quality management on client- contractor relationship in residential projects

    NASA Astrophysics Data System (ADS)

    Murali, Swetha; Ponmalar, V.

    2017-07-01

To make innovation and continuous improvement the norm, some traditional practices must be unlearnt. Change for growth and competitiveness is required for the sustainability of any profitable business, including the construction industry. Leading companies are willing to implement Total Quality Management (TQM) principles to realise potential advantages and improve growth and efficiency. Research has repeatedly identified quality as the most significant contributor to competitive advantage and industrial leadership. The two objectives of this paper are to 1) identify TQM effectiveness in residential projects and 2) identify client satisfaction and dissatisfaction areas using the Analytical Hierarchy Process (AHP) and suggest effective mitigation measures. Using statistical survey techniques, such as a questionnaire survey, it is observed that total quality management is applied to some extent in leading successful organizations. The main attributes for quality achievement can be defined as teamwork and better communication, with a single agreed goal between client and contractor. On-site safety is a paramount attribute in identifying quality within residential projects. Process-based quality methods, such as safe on-site working conditions, safety management systems, and modern engineering process safety controls, act as interlinked functions. Training and effective communication with all stakeholders on quality management principles are essential for effective quality work. Only through effective TQM principles can companies avoid contract litigation and increase the client satisfaction index.
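The AHP step mentioned above derives criterion weights from pairwise comparisons; a minimal sketch, where the comparison matrix and the three criteria names are hypothetical, not the paper's survey data:

```python
import numpy as np

# Hypothetical AHP pairwise comparison matrix (Saaty 1-9 scale) for three
# client-satisfaction criteria: communication, on-site safety, timeliness.
A = np.array([
    [1.0,       3.0,       5.0],
    [1.0 / 3.0, 1.0,       3.0],
    [1.0 / 5.0, 1.0 / 3.0, 1.0],
])

# Priority weights: principal eigenvector of A, normalised to sum to one.
eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio: CR below 0.1 is conventionally acceptable.
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
cr = ci / 0.58                      # random index RI = 0.58 for n = 3
```

The resulting weights rank the criteria (here communication > safety > timeliness, by construction of the example matrix), and the consistency ratio guards against contradictory pairwise judgments.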

  20. Sarcopenia during neoadjuvant therapy for oesophageal cancer: characterising the impact on muscle strength and physical performance.

    PubMed

    Guinan, Emer M; Doyle, S L; Bennett, A E; O'Neill, L; Gannon, J; Elliott, J A; O'Sullivan, J; Reynolds, J V; Hussey, J

    2018-05-01

    Preoperative chemo(radio)therapy for oesophageal cancer (OC) may have an attritional impact on body composition and functional status, impacting postoperative outcome. Physical decline with skeletal muscle loss has not been previously characterised in OC and may be amenable to physical rehabilitation. This study characterises skeletal muscle mass and physical performance from diagnosis to post-neoadjuvant therapy in patients undergoing preoperative chemo(radio)therapy for OC. Measures of body composition (axial computerised tomography), muscle strength (handgrip), functional capacity (walking distance), anthropometry (weight, height and waist circumference), physical activity, quality-of-life and nutritional status were captured prospectively. Sarcopenia status was defined as pre-sarcopenic (low muscle mass only), sarcopenic (low muscle mass and low muscle strength or function) or severely sarcopenic (low muscle mass and low muscle strength and low muscle function). Twenty-eight participants were studied at both time points (mean age 62.86 ± 8.18 years, n = 23 male). Lean body mass reduced by 4.9 (95% confidence interval 3.2 to 6.7) kg and mean grip strength reduced by 4.3 (2.5 to 6.1) kg from pre- to post-neoadjuvant therapy. Quality-of-life scores capturing gastrointestinal symptoms improved. Measures of anthropometry, walking distance, physical activity and nutritional status did not change. There was an increase in sarcopenic status from diagnosis (pre-sarcopenic n = 2) to post-treatment (pre-sarcopenic n = 5, severely sarcopenic n = 1). Despite maintenance of body weight, functional capacity and activity habits, participants experience declines in muscle mass and strength. Interventions involving exercise and/or nutritional support to build muscle mass and strength during preoperative therapy, even in patients who are functioning normally, are warranted.

  1. The quality of veterinary in-clinic and reference laboratory biochemical testing.

    PubMed

    Rishniw, Mark; Pion, Paul D; Maher, Tammy

    2012-03-01

    Although evaluation of biochemical analytes in blood is common in veterinary practice, studies assessing the global quality of veterinary in-clinic and reference laboratory testing have not been reported. The aim of this study was to assess the quality of biochemical testing in veterinary laboratories using results obtained from analyses of 3 levels of assayed quality control materials over 5 days. Quality was assessed by comparison of calculated total error with quality requirements, determination of sigma metrics, use of a quality goal index to determine factors contributing to poor performance, and agreement between in-clinic and reference laboratory mean results. The suitability of in-clinic and reference laboratory instruments for statistical quality control was determined using adaptations from the computerized program, EZRules3. Reference laboratories were able to achieve desirable quality requirements more frequently than in-clinic laboratories. Across all 3 materials, > 50% of in-clinic analyzers achieved a sigma metric ≥ 6.0 for measurement of 2 analytes, whereas > 50% of reference laboratory analyzers achieved a sigma metric ≥ 6.0 for measurement of 6 analytes. Expanded uncertainty of measurement and ± total allowable error resulted in the highest mean percentages of analytes demonstrating agreement between in-clinic and reference laboratories. Owing to marked variation in bias and coefficient of variation between analyzers of the same and different types, the percentages of analytes suitable for statistical quality control varied widely. These findings reflect the current state-of-the-art with regard to in-clinic and reference laboratory analyzer performance and provide a baseline for future evaluations of the quality of veterinary laboratory testing. © 2012 American Society for Veterinary Clinical Pathology.
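The sigma metric used to grade analyzer performance has a standard form; the allowable total error, bias, and CV figures in the example are illustrative, not the study's results:

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma metric for an analyte: (TEa - |bias|) / CV, all in percent.

    TEa is the allowable total error; a value of 6 or more is the
    "world class" threshold used in the study (sigma >= 6.0).
    """
    return (tea_pct - abs(bias_pct)) / cv_pct

# Illustrative example: TEa 12%, bias 2%, CV 1.6%  ->  sigma 6.25
sigma = sigma_metric(12.0, 2.0, 1.6)
```

The metric falls as either bias or imprecision grows, which is why the marked analyzer-to-analyzer variation in bias and CV reported above translates directly into widely varying sigma values.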

  2. Pre-adoptive Factors Predicting Lesbian, Gay, and Heterosexual Couples’ Relationship Quality Across the Transition to Adoptive Parenthood

    PubMed Central

    Goldberg, Abbie E.; Smith, JuliAnna Z.; Kashy, Deborah A.

    2010-01-01

    The current study examined pre-adoptive factors as predictors of relationship quality (love, ambivalence, and conflict) among 125 couples (44 lesbian couples, 30 gay couples, and 51 heterosexual couples) across the first year of adoptive parenthood. On average, all new parents experienced declines in their relationship quality across the first year of parenthood, regardless of sexual orientation, with women experiencing steeper declines in love. Parents who, pre-adoption, reported higher levels of depression, greater use of avoidant coping, lower levels of relationship maintenance behaviors, and less satisfaction with their adoption agencies reported lower relationship quality at the time of the adoption. The effect of avoidant coping on relationship quality varied by gender. Parents who, pre-adoption, reported higher levels of depression, greater use of confrontative coping, and higher levels of relationship maintenance behaviors reported greater declines in relationship quality. These findings have implications for professionals who work with adoptive parents both pre- and post-adoption. PMID:20545395

  3. The value of FDG positron emission tomography/computerised tomography (PET/CT) in pre-operative staging of colorectal cancer: a systematic review and economic evaluation.

    PubMed

    Brush, J; Boyd, K; Chappell, F; Crawford, F; Dozier, M; Fenwick, E; Glanville, J; McIntosh, H; Renehan, A; Weller, D; Dunlop, M

    2011-09-01

In the UK, colorectal cancer (CRC) is the third most common malignancy (behind lung and breast cancer) with 37,514 cases registered in 2006: around two-thirds (23,384) in the colon and one-third (14,130) in the rectum. Treatment of cancers of the colon can vary considerably, but surgical resection is the mainstay of treatment for curative intent. Following surgical resection, there is a comprehensive assessment of the tumour, its invasion characteristics and spread (tumour staging). A number of imaging modalities are used in the pre-operative staging of CRCs, including computerised tomography (CT), magnetic resonance imaging, ultrasound imaging and positron emission tomography (PET). This report examines the role of CT in combination with PET scanning (PET/CT 'hybrid' scan). The research objectives are: to evaluate the diagnostic accuracy and therapeutic impact of fluorine-18-deoxyglucose (FDG) PET/CT for the pre-operative staging of primary, recurrent and metastatic cancer using systematic review methods; to undertake probabilistic decision-analytic modelling (using Monte Carlo simulation); and to conduct a value of information analysis to help inform whether or not there is potential worth in undertaking further research. For each aspect of the research - the systematic review, the handsearch study and the economic evaluation - a database was assembled from a comprehensive search for published and unpublished studies, which included database searches, reference list searches and contact with experts. In the systematic review, prospective and retrospective patient series (diagnostic cohort) and randomised controlled trials (RCTs) were eligible for inclusion. Both consecutive series and series that are not explicitly reported as consecutive were included. Two reviewers extracted all data and applied the criteria independently and resolved disagreements by discussion. 
Data to populate 2 × 2 contingency tables consisting of the number of true positives, true negatives, false positives and false negatives using the studies' own definitions were extracted, as were data relating to changes in management. Fourteen items from the Quality Assessment of Diagnostic Accuracy Studies checklist were used to assess the methodological quality of the included studies. Patient-level data were used to calculate sensitivity and specificity with confidence intervals (CIs). Data were plotted graphically in forest plots. For the economic evaluation, economic models were designed for each of the disease states: primary, recurrent and metastatic. These were developed and populated based on a variety of information sources (in particular from published data sources) and literature, and in consultation with clinical experts. The review found 30 studies that met the eligibility criteria. Only two small studies evaluated the use of FDG PET/CT in primary CRC, and there is insufficient evidence to support its routine use at this time. The use of FDG PET/CT for the detection of recurrent disease identified data from five retrospective studies, from which a pooled sensitivity of 91% (95% CI 87% to 95%) and specificity of 91% (95% CI 85% to 95%) were observed. Pooled accuracy data from patients undergoing staging for suspected metastatic disease showed FDG PET/CT to have a pooled sensitivity of 91% (95% CI 87% to 94%) and a specificity of 76% (95% CI 58% to 88%), but the poor quality of the studies means the validity of the data may be compromised by several biases. The separate handsearch study did not yield any additional unique studies relevant to FDG PET/CT. Models for recurrent disease demonstrated an incremental cost-effectiveness ratio of £21,409 per quality-adjusted life-year (QALY) for rectal cancer, £6189 per QALY for colon cancer and £21,434 per QALY for metastatic disease. 
The value of handsearching to identify studies of less clearly defined or reported diagnostic tests is still to be investigated. The systematic review found insufficient evidence to support the routine use of FDG PET/CT in primary CRC and only a small amount of evidence supporting its use in the pre-operative staging of recurrent and metastatic CRC; although FDG PET/CT was shown to change patient management, the data are divergent and the quality of research is generally poor. The handsearch to identify studies of less clearly defined or reported diagnostic tests did not find additional studies. The primary limitations in the economic evaluations were due to uncertainty and the lack of available evidence from the systematic reviews for key parameters in each of the five models. To address this, a conservative approach was adopted in choosing diagnostic test accuracy (DTA) estimates for the model parameters. Probabilistic analyses were undertaken for each of the models, incorporating wide levels of uncertainty, particularly for the DTA estimates. None of the economic models reported cost savings, but the approach adopted was conservative in order to produce more reliable results given the lack of current information. The economic evaluations conclude that FDG PET/CT as an add-on imaging device is cost-effective in the pre-operative staging of recurrent colon, recurrent rectal and metastatic disease but not in primary colon or rectal cancers. There would be value in undertaking an RCT with a concurrent economic evaluation to evaluate the therapeutic impact and cost-effectiveness of FDG PET/CT compared with conventional imaging (without PET) for the pre-operative staging of recurrent and metastatic CRC.
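The patient-level accuracy calculations the review describes (sensitivity and specificity from 2 × 2 contingency tables, with confidence intervals) can be sketched as follows. The counts are hypothetical (chosen only to echo the 91%/91% pooled figures), and the Wilson score interval is one common choice of CI method, not necessarily the one the review used:

```python
import math

def sens_spec(tp, fp, fn, tn):
    """Sensitivity and specificity from a 2x2 diagnostic contingency table."""
    return tp / (tp + fn), tn / (tn + fp)

def wilson_ci(successes, n, z=1.96):
    """Approximate 95% Wilson score interval for a proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# Hypothetical pooled counts, not data from the included studies:
sens, spec = sens_spec(tp=91, fp=9, fn=9, tn=91)  # 0.91, 0.91
ci_lo, ci_hi = wilson_ci(91, 100)
```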

  4. Improving energy audit process and report outcomes through planning initiatives

    NASA Astrophysics Data System (ADS)

    Sprau Coulter, Tabitha L.

    Energy audits and energy models are an important aspect of the retrofit design process, as they provide project teams with an opportunity to evaluate a facility's current building systems and energy performance. The information collected during an energy audit is typically used to develop an energy model and an energy audit report, both of which assist in making decisions about the design and implementation of energy conservation measures in a facility. The current lack of energy auditing standards results in a high degree of variability in energy audit outcomes depending on the individual performing the audit. The research presented is based on the conviction that performing an energy audit and producing a value-adding energy model for retrofit buildings can benefit from a revised approach. The research was divided into four phases, with the initial three phases consisting of: 1) a process mapping activity, aimed at reducing variability in the energy auditing and energy modeling process; 2) a survey analysis, to examine the misalignment between how industry members use the top energy modeling tools and their intended use as defined by software representatives; and 3) a sensitivity analysis, examining the effect key energy modeling inputs have on energy modeling analysis results. The initial three phases helped define the need for an improved energy audit approach that better aligns data collection with facility owners' needs and priorities. They also assisted in the development of a multi-criteria decision support tool that incorporates a House of Quality approach to guide a pre-audit planning activity. The fourth and final research phase explored the impacts and evaluation methods of a pre-audit planning activity using two comparative energy audits as case studies.
In each case, an energy audit professional was asked to complete an audit using their traditional methods, along with an audit in which they first participated in a pre-audit planning activity that aligned the owner's priorities with the data collection. A comparative analysis was then used to evaluate the effects of the pre-audit planning activity in developing a more strategic method for collecting data and representing findings in an energy audit report to a facility owner. The case studies demonstrated that pre-audit planning has the potential to improve the efficiency of an energy audit process through reductions in transition time waste. The cases also demonstrated the value of audit report designs that are perceived by owners to be project-specific rather than generic. The research demonstrated the ability to influence and alter an auditor's behavior through participation in a pre-audit planning activity. It also showed the potential benefits of using the House of Quality as a method of aligning data collection with an owner's goals and priorities to develop reports that have increased value.
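The House of Quality step described above (weighting audit data-collection items by owner priorities) can be illustrated with a minimal weighted-scoring sketch. The priorities, candidate items, and 0/1/3/9 relationship scores below are hypothetical, chosen only to show the mechanics, not the dissertation's actual tool:

```python
# Owner priorities (weights) and QFD-style 0/1/3/9 relationship scores
# between each priority and each candidate data-collection item.
priorities = {"energy cost reduction": 5, "occupant comfort": 3, "maintenance": 2}

relationships = {
    "HVAC runtime logs":       {"energy cost reduction": 9, "occupant comfort": 3, "maintenance": 3},
    "lighting survey":         {"energy cost reduction": 3, "occupant comfort": 9, "maintenance": 1},
    "equipment age inventory": {"energy cost reduction": 1, "occupant comfort": 0, "maintenance": 9},
}

def item_importance(relationships, priorities):
    """Weighted importance of each item: sum of weight x relationship score."""
    return {item: sum(priorities[p] * score for p, score in rel.items())
            for item, rel in relationships.items()}

scores = item_importance(relationships, priorities)
# e.g. "HVAC runtime logs": 9*5 + 3*3 + 3*2 = 60, the top-ranked item here
```

Ranking items this way lets the auditor spend field time on the data that matters most to this owner, which is the alignment effect the case studies evaluated.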

  5. Assessment of maternal health care quality: conceptual and methodologic issues.

    PubMed

    Lane, D S; Kelman, H R

    1975-10-01

    Past efforts in assessment of the quality of maternity care have been analyzed in order to develop an evaluation framework that will have utility and applicability beyond a specific program, population, or health discipline. Presently available evaluation approaches have focused attention on either "high risk" populations or upon women experiencing a complicated pregnancy or delivery. Quality has been defined as the extent to which normative or empirically derived standards of obstetrical care have been applied. An alternative approach is suggested which conceives of the pregnancy as a normal physiological event but with the potentiality of either causing or exacerbating social or health problems. Maternity care quality is viewed as the application of those necessary health and health-related services that are required to safeguard the health of the mother and offspring, minimize the noxious consequences of pre-existing or concurrent health hazards or conditions, and upgrade the health and social functioning of those women who require it. Additionally, the system of services should be functionally organized to optimize care. Indicators of quality are suggested which incorporate structural, process, and outcome variables, and which link medical and consumer criteria in a comprehensive community level approach to quality assessment.

  6. Culture-Sensitive Functional Analytic Psychotherapy

    ERIC Educational Resources Information Center

    Vandenberghe, L.

    2008-01-01

    Functional analytic psychotherapy (FAP) is defined as behavior-analytically conceptualized talk therapy. In contrast to the technique-oriented educational format of cognitive behavior therapy and the use of structural mediational models, FAP depends on the functional analysis of the moment-to-moment stream of interactions between client and…

  7. Modulation of Respiratory Frequency by Peptidergic Input to Rhythmogenic Neurons in the PreBötzinger Complex

    PubMed Central

    Gray, Paul A.; Rekling, Jens C.; Bocchiaro, Christopher M.; Feldman, Jack L.

    2010-01-01

    Neurokinin-1 receptor (NK1R) and μ-opioid receptor (μOR) agonists affected respiratory rhythm when injected directly into the preBötzinger Complex (preBötC), the hypothesized site for respiratory rhythmogenesis in mammals. These effects were mediated by actions on preBötC rhythmogenic neurons. The distribution of NK1R+ neurons anatomically defined the preBötC. Type 1 neurons in the preBötC, which have rhythmogenic properties, expressed both NK1Rs and μORs, whereas type 2 neurons expressed only NK1Rs. These findings suggest that the preBötC is a definable anatomic structure with unique physiological function and that a subpopulation of neurons expressing both NK1Rs and μORs generate respiratory rhythm and modulate respiratory frequency. PMID:10567264

  8. Web Analytics

    EPA Pesticide Factsheets

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  9. THE COST-EFFECTIVENESS OF ALTERNATIVE INSTRUMENTS FOR ENVIRONMENTAL PROTECTION IN A SECOND-BEST SETTING. (R825313)

    EPA Science Inventory

    Abstract

    This paper employs analytical and numerical general equilibrium models to examine the significance of pre-existing factor taxes for the costs of pollution reduction under a wide range of environmental policy instruments. Pre-existing taxes imply significantly ...

  10. Drone inflight mixing of biochemical samples.

    PubMed

    Katariya, Mayur; Chung, Dwayne Chung Kim; Minife, Tristan; Gupta, Harshit; Zahidi, Alifa Afiah Ahmad; Liew, Oi Wah; Ng, Tuck Wah

    2018-03-15

    Autonomous systems for sample transport to the laboratory for analysis can improve timeliness, cost and error mitigation in the pre-analytical testing phase. Drones have been reported for outdoor sample transport, but incorporating devices on them to attain homogeneous mixing of reagents during flight, which would enhance sample processing timeliness, is limited by payload constraints. It is shown here that flipping maneuvers conducted with quadcopters are able to facilitate complete and gentle mixing. This capability, incorporated during automated sample transport, addresses an important factor contributing to pre-analytical variability, which ultimately impacts test result reliability. Copyright © 2018 Elsevier Inc. All rights reserved.

  11. Developing patient-centred care: an ethnographic study of patient perceptions and influence on quality improvement.

    PubMed

    Renedo, Alicia; Marston, Cicely

    2015-04-23

    Understanding quality improvement from a patient perspective is important for delivering patient-centred care. Yet the ways patients define quality improvement remain unexplored, with patients often excluded from improvement work. We examine how patients construct ideas of 'quality improvement' when collaborating with healthcare professionals in improvement work, and how they use these understandings when attempting to improve the quality of their local services. We used in-depth interviews with 23 'patient participants' (patients involved in quality improvement work) and observations in several sites in London as part of a four-year ethnographic study of patient and public involvement (PPI) activities run by Collaborations for Leadership in Applied Health Research and Care for Northwest London. We took an iterative, thematic and discursive analytical approach. When patient participants tried to influence quality improvement or discussed different dimensions of quality improvement, their accounts and actions frequently started with talk about improvement as dependent on collective action (e.g. multidisciplinary healthcare professionals and the public), but usually quickly shifted away from that towards a neoliberal discourse emphasising the role of individual patients. Neoliberal ideals about individual responsibility were taken up in their accounts, moving them away from the idea of the state and healthcare providers being held accountable for upholding patients' rights to quality care, and towards the idea of citizens needing to work on self-improvement. Participants portrayed themselves as governed by self-discipline and personal effort in their PPI work, and in doing so provided examples of how neoliberal appeals for self-regulation and self-determination also permeated their own identity positions. 
When including patient voices in measuring and defining 'quality', governments and public health practitioners should be aware of how neoliberal rationalities at the heart of policy and services may discourage consumers from claiming rights to quality care by contributing to public unwillingness to challenge the status quo in service provision. If the democratic potential of patient and public involvement initiatives is to be realised, it will be crucial to help citizens to engage critically with how neoliberal rationalities can undermine their abilities to demand quality care.

  12. Sigma metrics as a tool for evaluating the performance of internal quality control in a clinical chemistry laboratory.

    PubMed

    Kumar, B Vinodh; Mohan, Thuthi

    2018-01-01

    Six Sigma is one of the most popular quality management system tools employed for process improvement. Six Sigma methods are usually applied when the outcome of a process can be measured. This study was done to assess the performance of individual biochemical parameters on a sigma scale by calculating the sigma metrics for individual parameters, and to follow the Westgard guidelines for the appropriate Westgard rules and levels of internal quality control (IQC) that need to be processed to improve target analyte performance based on the sigma metrics. This is a retrospective study, and the data required for the study were extracted between July 2015 and June 2016 from a Secondary Care Government Hospital, Chennai. The data obtained for the study were the IQC coefficient of variation percentage and the External Quality Assurance Scheme (EQAS) bias percentage for 16 biochemical parameters. For the level 1 IQC, four analytes (alkaline phosphatase, magnesium, triglyceride, and high-density lipoprotein-cholesterol) showed an ideal performance of ≥6 sigma, and five analytes (urea, total bilirubin, albumin, cholesterol, and potassium) showed an average performance of <3 sigma; for the level 2 IQC, the same four analytes of level 1 showed a performance of ≥6 sigma, and four analytes (urea, albumin, cholesterol, and potassium) showed an average performance of <3 sigma. For all analytes below 6 sigma, the quality goal index (QGI) was <0.8, indicating imprecision as the area requiring improvement, except for cholesterol, whose QGI of >1.2 indicated inaccuracy. This study shows that the sigma metric is a good quality tool to assess the analytical performance of a clinical chemistry laboratory. Thus, sigma metric analysis provides a benchmark for the laboratory to design a protocol for IQC, address poor assay performance, and assess the efficiency of existing laboratory processes.
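The sigma metric and quality goal index used in studies like this follow standard formulas: sigma = (TEa − |bias|) / CV and QGI = |bias| / (1.5 × CV), with all terms in percent. A minimal sketch (the TEa, bias, and CV figures are hypothetical, not this study's data):

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma = (total allowable error - |bias|) / CV, all in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

def quality_goal_index(bias_pct, cv_pct):
    """QGI = |bias| / (1.5 * CV); <0.8 points to imprecision, >1.2 to inaccuracy."""
    return abs(bias_pct) / (1.5 * cv_pct)

# Hypothetical figures for one analyte:
sigma = sigma_metric(tea_pct=10.0, bias_pct=2.0, cv_pct=2.0)  # 4.0
qgi = quality_goal_index(bias_pct=2.0, cv_pct=2.0)            # ~0.67 -> imprecision
```

A sigma of 4.0 falls in the "good but improvable" range, and a QGI below 0.8 would direct effort toward tightening precision rather than recalibrating for bias.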

  13. Pre-Operative Diet Impacts the Adipose Tissue Response to Surgical Trauma

    PubMed Central

    Nguyen, Binh; Tao, Ming; Yu, Peng; Mauro, Christine; Seidman, Michael A.; Wang, Yaoyu E.; Mitchell, James; Ozaki, C. Keith

    2012-01-01

    Background Short-term changes in pre-operative nutrition can have profound effects on surgery related outcomes such as ischemia reperfusions injury in pre-clinical models. Dietary interventions that lend protection against stress in animal models (e.g. fasting, dietary restriction [DR]) impact adipose tissue quality/quantity. Adipose tissue holds high surgical relevance due to its anatomic location and high tissue volume, and it is ubiquitously traumatized during surgery. Yet the response of adipose tissue to trauma under clinically relevant circumstances including dietary status remains poorly defined. We hypothesized that pre-operative diet alters the adipose tissue response to surgical trauma. Methods A novel mouse model of adipose tissue surgical trauma was employed. Dietary conditions (diet induced obesity [DIO], pre-operative DR) were modulated prior to application of surgical adipose tissue trauma in the context of clinically common scenarios (different ages, simulated bacterial wound contamination). Local/distant adipose tissue phenotypic responses were measured as represented by gene expression of inflammatory, tissue remodeling/growth, and metabolic markers. Results Surgical trauma had a profound effect on adipose tissue phenotype at the site of trauma. Milder but significant distal effects on non-traumatized adipose tissue were also observed. DIO exacerbated the inflammatory aspects of this response, and pre-operative DR tended to reverse these changes. Age and LPS-simulated bacterial contamination also impacted the adipose tissue response to trauma, with young adult animals and LPS treatment exacerbating the proinflammatory response. Conclusions Surgical trauma dramatically impacts both local and distal adipose tissue biology. Short-term pre-operative DR may offer a strategy to attenuate this response. PMID:23274098

  14. Big data analytics to improve cardiovascular care: promise and challenges.

    PubMed

    Rumsfeld, John S; Joynt, Karen E; Maddox, Thomas M

    2016-06-01

    The potential for big data analytics to improve cardiovascular quality of care and patient outcomes is tremendous. However, the application of big data in health care is at a nascent stage, and the evidence to date demonstrating that big data analytics will improve care and outcomes is scant. This Review provides an overview of the data sources and methods that comprise big data analytics, and describes eight areas of application of big data analytics to improve cardiovascular care, including predictive modelling for risk and resource use, population management, drug and medical device safety surveillance, disease and treatment heterogeneity, precision medicine and clinical decision support, quality of care and performance measurement, and public health and research applications. We also delineate the important challenges for big data applications in cardiovascular care, including the need for evidence of effectiveness and safety, the methodological issues such as data quality and validation, and the critical importance of clinical integration and proof of clinical utility. If big data analytics are shown to improve quality of care and patient outcomes, and can be successfully implemented in cardiovascular practice, big data will fulfil its potential as an important component of a learning health-care system.

  15. Analytical tools for the analysis of β-carotene and its degradation products

    PubMed Central

    Stutz, H.; Bresgen, N.; Eckl, P. M.

    2015-01-01

    Abstract β-Carotene, the precursor of vitamin A, possesses pronounced radical scavenging properties. This has centered the attention on β-carotene dietary supplementation in healthcare as well as in the therapy of degenerative disorders and several cancer types. However, two intervention trials with β-carotene have revealed adverse effects on two proband groups, that is, cigarette smokers and asbestos-exposed workers. Beside other causative reasons, the detrimental effects observed have been related to the oxidation products of β-carotene. Their generation originates in the polyene structure of β-carotene that is beneficial for radical scavenging, but is also prone to oxidation. Depending on the dominant degradation mechanism, bond cleavage might occur either randomly or at defined positions of the conjugated electron system, resulting in a diversity of cleavage products (CPs). Due to their instability and hydrophobicity, the handling of standards and real samples containing β-carotene and related CPs requires preventive measures during specimen preparation, analyte extraction, and final analysis, to avoid artificial degradation and to preserve the initial analyte portfolio. This review critically discusses different preparation strategies of standards and treatment solutions, and also addresses their protection from oxidation. Additionally, in vitro oxidation strategies for the generation of oxidative model compounds are surveyed. Extraction methods are discussed for volatile and non-volatile CPs individually. Gas chromatography (GC), (ultra)high performance liquid chromatography (U)HPLC, and capillary electrochromatography (CEC) are reviewed as analytical tools for final analyte analysis. For identity confirmation of analytes, mass spectrometry (MS) is indispensable, and the appropriate ionization principles are comprehensively discussed. 
The final sections cover analysis of real samples and aspects of quality assurance, namely matrix effects and method validation. PMID:25867077

  16. Improving laboratory results turnaround time by reducing pre analytical phase.

    PubMed

    Khalifa, Mohamed; Khalid, Parwaiz

    2014-01-01

    Laboratory turnaround time is considered one of the most important indicators of work efficiency in hospitals. Physicians always need timely results to take effective clinical decisions, especially in the emergency department, where these results can guide whether to admit patients to the hospital, discharge them home or do further investigations. A retrospective data analysis study was performed to identify the effects of training ER and lab staff on new routines for sample collection and transportation on the pre-analytical phase of turnaround time. Renal profile tests requested by the ER and performed in 2013 were selected as a sample, and data about 7,519 tests were retrieved and analyzed to compare turnaround time intervals before and after implementing the new routines. Results showed significant time reduction in the "Request to Sample Collection" and "Collection to In-Lab Delivery" intervals, with less significant improvement in the analytical phase of the turnaround time.
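The intervals compared in the study are simple differences between event timestamps along the test's path. A minimal sketch with hypothetical timestamps for one ER renal-profile request:

```python
from datetime import datetime

def interval_minutes(t0, t1):
    """Elapsed minutes between two event timestamps."""
    return (t1 - t0).total_seconds() / 60

# Hypothetical event times for one test (not data from the study):
requested = datetime(2013, 5, 1, 10, 0)
collected = datetime(2013, 5, 1, 10, 25)
in_lab    = datetime(2013, 5, 1, 10, 40)

req_to_collect = interval_minutes(requested, collected)  # 25.0 min
collect_to_lab = interval_minutes(collected, in_lab)     # 15.0 min
```

Averaging these per-test intervals before and after the staff training is what lets the study localize the improvement to the pre-analytical phase rather than the analyzer itself.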

  17. Hydrogel nanoparticle based immunoassay

    DOEpatents

    Liotta, Lance A; Luchini, Alessandra; Petricoin, Emanuel F; Espina, Virginia

    2015-04-21

    An immunoassay device incorporating porous polymeric capture nanoparticles, within either the sample collection vessel or pre-impregnated into a porous substratum within the fluid flow path of the analytical device, is presented. This incorporation of capture particles within the immunoassay device improves sensitivity while removing the requirement for pre-processing of samples prior to loading the immunoassay device. A preferred embodiment is core-shell, bait-containing capture nanoparticles which perform three functions in one step, in solution: a) molecular size sieving, b) target analyte sequestration and concentration, and c) protection from degradation. The polymeric matrix of the capture particles may be made of co-polymeric materials having a structural monomer and an affinity monomer, the affinity monomer having properties that attract the analyte to the capture particle. This device is useful for point-of-care diagnostic assays for biomedical applications and as a field-deployable assay for environmental, pathogen, and chemical or biological threat identification.

  18. Learning About Love: A Meta-Analytic Study of Individually-Oriented Relationship Education Programs for Adolescents and Emerging Adults.

    PubMed

    Simpson, David M; Leonhardt, Nathan D; Hawkins, Alan J

    2018-03-01

    Despite recent policy initiatives and substantial federal funding of individually oriented relationship education programs for youth, there have been no meta-analytic reviews of this growing field. This meta-analytic study draws on 17 control-group studies and 13 one-group/pre-post studies to evaluate the effectiveness of relationship education programs on adolescents' and emerging adults' relationship knowledge, attitudes, and skills. Overall, control-group studies produced a medium effect (d = .36); one-group/pre-post studies also produced a medium effect (d = .47). However, the lack of studies with long-term follow-ups of relationship behaviors in the young adult years is a serious weakness in the field, limiting what we can say about the value of these programs for helping youth achieve their aspirations for healthy romantic relationships and stable marriages.

  19. Research Challenges in Managing and Using Service Level Agreements

    NASA Astrophysics Data System (ADS)

    Rana, Omer; Ziegler, Wolfgang

    A Service Level Agreement (SLA) represents an agreement between a service user and a provider in the context of a particular service provision. SLAs contain Quality of Service properties that must be maintained by a provider, as agreed between the provider and a user/client. These are generally defined as a set of Service Level Objectives (SLOs). These properties need to be measurable and must be monitored during the provision of the service agreed in the SLA. The SLA must also contain a set of penalty clauses specifying what happens when service providers fail to deliver the pre-agreed quality. Hence, an SLA may be used by both a user and a provider: from a user's perspective, an SLA defines what is required, often in terms of non-functional attributes of service provision; from a provider's perspective, an SLA may be used to support capacity planning, especially if a provider is making its capability available to multiple users. An SLA may also be used by a client and provider to manage their behaviour over time, for instance to optimise their long-running revenue (cost) or QoS attributes (such as execution time). The lifecycle of an SLA is outlined, along with various uses of SLAs to support infrastructure management. A discussion of WS-Agreement, the emerging standard for specifying SLAs, is also provided.
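The SLA structure described above (measurable SLOs plus penalty clauses) can be sketched as a simple data model. The class names, metrics, thresholds, and penalty values are illustrative assumptions for this sketch and are not drawn from the WS-Agreement standard:

```python
from dataclasses import dataclass, field

@dataclass
class SLO:
    metric: str       # a measurable QoS property, e.g. "execution_time_s"
    threshold: float  # agreed upper bound for the metric
    penalty: float    # charge applied when the bound is violated

@dataclass
class SLA:
    provider: str
    client: str
    slos: list = field(default_factory=list)

    def penalties(self, measured: dict) -> float:
        """Sum the penalties for every SLO whose measured value exceeds its bound."""
        return sum(s.penalty for s in self.slos
                   if measured.get(s.metric, 0.0) > s.threshold)

sla = SLA("provider-A", "client-B",
          [SLO("execution_time_s", 120.0, 50.0),
           SLO("error_rate", 0.01, 25.0)])
total = sla.penalties({"execution_time_s": 150.0, "error_rate": 0.005})  # 50.0
```

Monitoring then reduces to periodically feeding measured values into `penalties`, which is the kind of machine-checkable enforcement the SLO formulation makes possible.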

  20. Blanching, salting and sun drying of different pumpkin fruit slices.

    PubMed

    Workneh, T S; Zinash, A; Woldetsadik, K

    2014-11-01

    The study was aimed at assessing the quality of pumpkin (Cucurbita spp.) slices from different fruit accessions that were subjected to pre-drying treatments and dried using two methods (uncontrolled sun and oven drying). Pre-drying treatment had a significant (P ≤ 0.05) effect on the quality of dried pumpkin slices. Pumpkin fruit slices dipped in 10% salt solution had good chemical quality. The two-way interaction between drying methods and pre-drying treatments had a significant (P ≤ 0.05) effect on chemical qualities. Pumpkin subjected to salt solution dipping and oven drying had higher chemical concentrations. Among the pumpkin fruit accessions, accession 8007 had the superior TSS, total sugar and sugar-to-acid ratio after drying. Among the three pre-drying treatments, salt solution dipping had a significant (P ≤ 0.05) effect and was the most efficient pre-drying treatment for retaining the quality of dried pumpkin fruit without significant chemical quality deterioration. Salt dipping combined with low-temperature (60 °C) oven air circulation drying is recommended to maintain the quality of dried pumpkin slices. However, since direct sun drying needs extended drying time due to fluctuations in temperature, it is recommended to develop or select the most suitable solar dryer for use in combination with pre-drying salt dipping or blanching treatments.

  1. A new analytical framework of 'continuum of prevention and care' to maximize HIV case detection and retention in care in Vietnam

    PubMed Central

    2012-01-01

    Background The global initiative ‘Treatment 2.0’ calls for expanding the evidence base of optimal HIV service delivery models to maximize HIV case detection and retention in care. However, limited systematic assessment has been conducted in countries with concentrated HIV epidemics. We aimed to assess HIV service availability and service connectedness in Vietnam. Methods We developed a new analytical framework of the continuum of prevention and care (COPC). Using the framework, we examined HIV service delivery in Vietnam. Specifically, we analyzed HIV service availability, including geographical distribution and decentralization, and service connectedness across multiple services and dimensions. We then identified system-related strengths and constraints in improving HIV case detection and retention in care. This was accomplished by reviewing related published and unpublished documents, including existing service delivery data. Results Identified strengths included: decentralized HIV outpatient clinics that offer comprehensive care at the district level, particularly in high HIV burden provinces; functional chronic care management for antiretroviral treatment (ART) with the involvement of people living with HIV and links to community- and home-based care; HIV testing and counseling integrated into tuberculosis and antenatal care services in districts supported by donor-funded projects; and extensive peer outreach networks that reduce barriers for the most-at-risk populations to access services. 
Constraints included: fragmented local coordination mechanisms for HIV-related health services; lack of systems to monitor the expansion of HIV outpatient clinics that offer comprehensive care; underdevelopment of pre-ART care; insufficient linkage from HIV testing and counseling to pre-ART care; inadequate access to HIV-related services in districts not supported by donor-funded projects particularly in middle and low burden provinces and in mountainous remote areas; and no systematic monitoring of referral services. Conclusions Our COPC analytical framework was instrumental in identifying system-related strengths and constraints that contribute to HIV case detection and retention in care. The national HIV program plans to strengthen provincial programming by re-defining various service linkages and accelerate the transition from project-based approach to integrated service delivery in line with the ‘Treatment 2.0’ initiative. PMID:23272730

  2. A new analytical framework of 'continuum of prevention and care' to maximize HIV case detection and retention in care in Vietnam.

    PubMed

    Fujita, Masami; Poudel, Krishna C; Do, Thi Nhan; Bui, Duc Duong; Nguyen, Van Kinh; Green, Kimberly; Nguyen, Thi Minh Thu; Kato, Masaya; Jacka, David; Cao, Thi Thanh Thuy; Nguyen, Thanh Long; Jimba, Masamine

    2012-12-29

    The global initiative 'Treatment 2.0' calls for expanding the evidence base of optimal HIV service delivery models to maximize HIV case detection and retention in care. However, limited systematic assessment has been conducted in countries with concentrated HIV epidemics. We aimed to assess HIV service availability and service connectedness in Vietnam. We developed a new analytical framework of the continuum of prevention and care (COPC). Using the framework, we examined HIV service delivery in Vietnam. Specifically, we analyzed HIV service availability, including geographical distribution and decentralization, and service connectedness across multiple services and dimensions. We then identified system-related strengths and constraints in improving HIV case detection and retention in care. This was accomplished by reviewing related published and unpublished documents, including existing service delivery data. Identified strengths included: decentralized HIV outpatient clinics that offer comprehensive care at the district level, particularly in high HIV burden provinces; functional chronic care management for antiretroviral treatment (ART) with the involvement of people living with HIV and links to community- and home-based care; HIV testing and counseling integrated into tuberculosis and antenatal care services in districts supported by donor-funded projects; and extensive peer outreach networks that reduce barriers for the most-at-risk populations to access services. 
Constraints included: fragmented local coordination mechanisms for HIV-related health services; lack of systems to monitor the expansion of HIV outpatient clinics that offer comprehensive care; underdevelopment of pre-ART care; insufficient linkage from HIV testing and counseling to pre-ART care; inadequate access to HIV-related services in districts not supported by donor-funded projects particularly in middle and low burden provinces and in mountainous remote areas; and no systematic monitoring of referral services. Our COPC analytical framework was instrumental in identifying system-related strengths and constraints that contribute to HIV case detection and retention in care. The national HIV program plans to strengthen provincial programming by re-defining various service linkages and accelerate the transition from project-based approach to integrated service delivery in line with the 'Treatment 2.0' initiative.

  3. Review on water quality sensors

    NASA Astrophysics Data System (ADS)

    Kruse, Peter

    2018-05-01

    Terrestrial life may be carbon-based, but most of its mass is made up of water. Access to clean water is essential to all aspects of maintaining life. Mainly due to human activity, the strain on the water resources of our planet has increased substantially, requiring action in water management and purification. Water quality sensors are needed in order to quantify the problem and verify the success of remedial actions. This review summarizes the most common chemical water quality parameters and current developments in the sensor technology available to monitor them. Particular emphasis is on technologies that lend themselves to reagent-free, low-maintenance, autonomous and continuous monitoring. Chemiresistors and other electrical sensors are discussed in particular detail, while mechanical, optical and electrochemical sensors are also mentioned. The focus here is on the physics of chemical signal transduction in sensor elements that are in direct contact with the analyte. All other sensing methods, and all other elements of sampling, sample pre-treatment as well as the collection, transmission and analysis of the data, are not discussed here. Instead, the goal is to highlight the progress and remaining challenges in the development of sensor materials and designs for an audience of physicists and materials scientists.

  4. Expanding Access to Quality Pre-K Is Sound Public Policy

    ERIC Educational Resources Information Center

    Barnett, W. Steven

    2013-01-01

    In 2013, preschool education received more attention in the media and public policy circles than it has for some time, in part because of a series of high-profile proposals to expand access to quality pre-K. The scientific basis for these proposed expansions of quality pre-K is impressive. This paper brings to bear the full weight of the evidence…

  5. The Ratios of Pre-emulsified Duck Skin for Optimized Processing of Restructured Ham.

    PubMed

    Shim, Jae-Yun; Kim, Tae-Kyung; Kim, Young-Boong; Jeon, Ki-Hong; Ahn, Kwang-Il; Paik, Hyun-Dong; Choi, Yun-Sang

    2018-02-01

    The purpose of this study was to investigate the quality of duck ham formulated with duck skin through a pre-emulsification process. The experiments to investigate the quality characteristics of duck ham measured proximate composition, cooking loss, emulsion stability, pH, color, texture profile analysis, apparent viscosity, and sensory characteristics. Duck ham was prepared with various ratios of duck skin in pre-emulsion as follows: Control (duck skin 30%), T1 (duck skin 20% + pre-emulsified duck skin 10%), T2 (duck skin 15% + pre-emulsified duck skin 15%), T3 (duck skin 10% + pre-emulsified duck skin 20%), and T4 (pre-emulsified duck skin 30%). As the ratio of duck skin to pre-emulsified skin changed, the quality of duck ham changed in terms of moisture content, fat content, cooking loss, emulsion stability, lightness, textural properties, apparent viscosity, and overall acceptability. The moisture content of T2 was the highest (p < 0.05) and that of the control and T4 was the lowest (p < 0.05). The fat content of the control was higher than that of all treatments (p < 0.05). T2 had the lowest values for cooking loss, total expressible fluid, fat separation, hardness, springiness, and gumminess (p < 0.05). The overall acceptability scores of all treatments with pre-emulsified skin were higher than that of the control (p < 0.05). Therefore, the pre-emulsification process can improve the quality characteristics of duck ham, and a 1:1 ratio of duck skin to pre-emulsified skin was the most suitable ratio for doing so.

  6. Optimizing direct amplification of forensic commercial kits for STR determination.

    PubMed

    Caputo, M; Bobillo, M C; Sala, A; Corach, D

    2017-04-01

    Direct DNA amplification in forensic genotyping reduces analytical time when large sample sets are being analyzed. The amplification success depends mainly upon two factors: on one hand, the PCR chemistry and, on the other, the type of solid substrate where the samples are deposited. We developed a workflow strategy aiming to optimize time and cost when starting from blood samples spotted onto diverse absorbent substrates. A set of 770 blood samples spotted onto blood cards, Whatman® 3MM paper, FTA™ Classic cards, and Whatman® Grade 1 paper was analyzed by a unified working strategy including a low-cost pre-treatment, a PCR amplification volume scale-down, and the use of the 3500 Genetic Analyzer as the analytical platform. Samples were analyzed using three different commercial multiplex STR direct amplification kits. The efficiency of the strategy was evidenced by a higher percentage of high-quality profiles obtained (over 94%), a reduced number of re-injections (average 3.2%), and a reduced amplification failure rate (lower than 5%). The average peak height ratio among different commercial kits was 0.91, and the intra-locus balance showed values ranging from 0.92 to 0.94. A comparison with previously reported results was performed, demonstrating the efficiency of the proposed modifications. The protocol described herein showed high performance, producing optimal-quality profiles while being both time and cost effective. Copyright © 2017 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
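
    For illustration, the peak height ratio metric reported above can be computed as in the following minimal sketch. The locus names are real STR loci, but the peak heights (in RFU) are invented, and the exact balance formulas used by the authors are not given in the abstract; the smaller/larger ratio below is the conventional definition.

```python
# Sketch of the peak height ratio (PHR) quality metric for an STR profile.
# PHR at a heterozygous locus = smaller allele peak height / larger one.
# Peak heights below are hypothetical illustration values (RFU).

def peak_height_ratio(h1, h2):
    """Conventional PHR: min over max of the two allele peak heights."""
    return min(h1, h2) / max(h1, h2)

loci = {
    "D8S1179": (1500, 1380),
    "TH01": (920, 1010),
    "FGA": (700, 760),
}

phrs = {locus: peak_height_ratio(*peaks) for locus, peaks in loci.items()}
mean_phr = sum(phrs.values()) / len(phrs)  # profile-level average balance
```

A mean PHR near 1 indicates well-balanced heterozygous peaks; values in the 0.91-0.94 range, as reported in the study, are generally considered high quality.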

  7. Unintended Revelations in History Textbooks: The Precarious Authenticity and Historical Continuity of the Slovak Nation

    ERIC Educational Resources Information Center

    Šulíková, Jana

    2016-01-01

    Purpose: This article proposes an analytical framework that helps to identify and challenge misconceptions of ethnocentrism found in pre-tertiary teaching resources for history and the social sciences in numerous countries. Design: Drawing on nationalism studies, the analytical framework employs ideas known under the umbrella terms of…

  8. Data Filtering in Instrumental Analyses with Applications to Optical Spectroscopy and Chemical Imaging

    ERIC Educational Resources Information Center

    Vogt, Frank

    2011-01-01

    Most measurement techniques have some limitations imposed by a sensor's signal-to-noise ratio (SNR). Thus, in analytical chemistry, methods for enhancing the SNR are of crucial importance and can be ensured experimentally or established via pre-treatment of digitized data. In many analytical curricula, instrumental techniques are given preference…

  9. Analytical procedure validation and the quality by design paradigm.

    PubMed

    Rozet, Eric; Lebrun, Pierre; Michiels, Jean-François; Sondag, Perceval; Scherder, Tara; Boulanger, Bruno

    2015-01-01

    Since the adoption of the ICH Q8 document concerning the development of pharmaceutical processes following a quality by design (QbD) approach, there have been many discussions on the opportunity for analytical procedure development to follow a similar approach. While the development and optimization of analytical procedures following QbD principles have been largely discussed and described, the place of analytical procedure validation in this framework has not been clarified. This article aims at showing that analytical procedure validation is fully integrated into the QbD paradigm and is an essential step in developing analytical procedures that are effectively fit for purpose. Adequate statistical methodologies also have their role to play, such as design of experiments, statistical modeling, and probabilistic statements. The outcome of analytical procedure validation is also an analytical procedure design space, and from it, a control strategy can be set.

  10. Experimental and analytical studies on the vibration serviceability of pre-stressed cable RC truss floor systems

    NASA Astrophysics Data System (ADS)

    Zhou, Xuhong; Cao, Liang; Chen, Y. Frank; Liu, Jiepeng; Li, Jiang

    2016-01-01

    The developed pre-stressed cable reinforced concrete truss (PCT) floor system is a relatively new floor structure, which can be applied to various long-span structures such as buildings, stadiums, and bridges. Due to its lighter mass and longer span, floor vibration is a serviceability concern for such systems. In this paper, field testing and theoretical analysis of the PCT floor system were conducted. Specifically, heel-drop impact and walking tests were performed on the PCT floor system to capture its dynamic properties, including natural frequencies, mode shapes, damping ratios, and acceleration response. The PCT floor system was found to be a low-frequency (<10 Hz) and low-damping (damping ratio < 2%) structural system. The comparison of the experimental results with the AISC's limiting values nevertheless indicates that the investigated PCT system exhibits satisfactory vibration perceptibility. The analytical solution obtained from the weighted residual method agrees well with the experimental results and thus validates the proposed analytical expression. Sensitivity studies using the analytical solution were also conducted to investigate the vibration performance of the PCT floor system.

  11. Interferences from blood collection tube components on clinical chemistry assays

    PubMed Central

    Bowen, Raffick A.R.; Remaley, Alan T.

    2014-01-01

    Improper design or use of blood collection devices can adversely affect the accuracy of laboratory test results. Vascular access devices, such as catheters and needles, exert shear forces during blood flow, which creates a predisposition to cell lysis. Components from blood collection tubes, such as stoppers, lubricants, surfactants, and separator gels, can leach into specimens and/or adsorb analytes from a specimen; special tube additives may also alter analyte stability. Because of these interactions with blood specimens, blood collection devices are a potential source of pre-analytical error in laboratory testing. Accurate laboratory testing requires an understanding of the complex interactions between collection devices and blood specimens. Manufacturers, vendors, and clinical laboratorians must consider the pre-analytical challenges in laboratory testing. Although other authors have described the effects of endogenous substances on clinical assay results, the effects of blood collection tube additives and components have not been systematically described or explained. This review aims to identify and describe blood collection tube additives and their components and the strategies used to minimize their effects on clinical chemistry assays. PMID:24627713

  12. Statistical analysis of modeling error in structural dynamic systems

    NASA Technical Reports Server (NTRS)

    Hasselman, T. K.; Chrostowski, J. D.

    1990-01-01

    The paper presents a generic statistical model of the (total) modeling error for conventional space structures in their launch configuration. Modeling error is defined as the difference between analytical prediction and experimental measurement. It is represented by the differences between predicted and measured real eigenvalues and eigenvectors. Comparisons are made between pre-test and post-test models. Total modeling error is then subdivided into measurement error, experimental error and 'pure' modeling error, and comparisons are made between measurement error and total modeling error. The generic statistical model presented in this paper is based on the first four global (primary structure) modes of four different structures belonging to the generic category of Conventional Space Structures (specifically excluding large truss-type space structures). As such, it may be used to evaluate the uncertainty of predicted mode shapes and frequencies, sinusoidal response, or the transient response of other structures belonging to the same generic category.

  13. Inventory and recently increasing GLOF susceptibility of glacial lakes in Sikkim, Eastern Himalaya

    NASA Astrophysics Data System (ADS)

    Aggarwal, Suruchi; Rai, S. C.; Thakur, P. K.; Emmer, Adam

    2017-10-01

    Climatic changes alter the climate system, leading to a decrease in glacier mass and the swelling of glacial lakes. This study provides a new inventory of glacial and high-altitude lakes for Sikkim, Eastern Himalaya, and evaluates the susceptibility of lakes to Glacial Lake Outburst Floods (GLOFs). Using satellite data of high spatial resolution (5 m), we identified 1104 glacial and high-altitude lakes with a total area of 30.498 km2, of which 472 have an area > 0.01 km2. Applying pre-defined GLOF susceptibility criteria to these 472 lakes yields 21 lakes susceptible to GLOF, all of which increased in area between 1972 and 2015. Using the Analytic Hierarchy Process (AHP), the pairwise comparison matrix further reveals that 5 of these glacial lakes have low, 14 have medium and 2 have high GLOF susceptibility. The 16 glacial lakes with medium and high GLOF susceptibility in particular may threaten downstream communities and infrastructure and need further attention.
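
    The AHP step described above can be sketched as follows. This is a minimal illustration under assumed inputs: the 3x3 pairwise comparison matrix is hypothetical, since the study's actual susceptibility criteria and judgment values are not given in the abstract. Weights are taken as the principal eigenvector of the matrix, and Saaty's consistency ratio checks whether the pairwise judgments are acceptably coherent.

```python
import numpy as np

# Hypothetical 3-criterion pairwise comparison matrix (reciprocal by
# construction): entry A[i][j] says how much criterion i outweighs j.
A = np.array([
    [1.0,       3.0, 5.0],
    [1.0 / 3.0, 1.0, 2.0],
    [1.0 / 5.0, 0.5, 1.0],
])

# Criterion weights = normalized principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)          # index of the principal eigenvalue
w = eigvecs[:, k].real
w = w / w.sum()                      # weights sum to 1

# Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI.
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
ri = 0.58                            # Saaty's random index for n = 3
cr = ci / ri                         # judgments usually accepted if CR < 0.1
```

The resulting weights would then score each lake's criteria values, and score thresholds partition the lakes into low, medium and high susceptibility classes.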

  14. LISA Pathfinder Instrument Data Analysis

    NASA Technical Reports Server (NTRS)

    Guzman, Felipe

    2010-01-01

    LISA Pathfinder (LPF) is an ESA-launched demonstration mission of key technologies required for the joint NASA-ESA gravitational wave observatory in space, LISA. As part of the LPF interferometry investigations, analytic models of noise sources and corresponding noise subtraction techniques have been developed to correct for effects like the coupling of test mass jitter into displacement readout, and fluctuations of the laser frequency or optical pathlength difference. Ground testing of pre-flight hardware of the Optical Metrology subsystem is currently ongoing at the Albert Einstein Institute Hannover. In collaboration with NASA Goddard Space Flight Center, the LPF mission data analysis tool LTPDA is being used to analyze the data product of these tests. Furthermore, the noise subtraction techniques and in-flight experiment runs for noise characterization are being defined as part of the mission experiment master plan. We will present the data analysis outcome of preflight hardware ground tests and possible noise subtraction strategies for in-flight instrument operations.

  15. Thinking through postoperative cognitive dysfunction: How to bridge the gap between clinical and pre-clinical perspectives.

    PubMed

    Hovens, Iris B; Schoemaker, Regien G; van der Zee, Eddy A; Heineman, Erik; Izaks, Gerbrand J; van Leeuwen, Barbara L

    2012-10-01

    Following surgery, patients may experience cognitive decline, which can seriously reduce quality of life. This postoperative cognitive dysfunction (POCD) is mainly seen in the elderly and is thought to be mediated by surgery-induced inflammatory reactions. Clinical studies tend to define POCD as a persisting, generalised decline in cognition, without specifying which cognitive functions are impaired. Pre-clinical research mainly describes early hippocampal dysfunction as a consequence of surgery-induced neuroinflammation. These different approaches to study POCD impede translation between clinical and pre-clinical research outcomes and may hamper the development of appropriate interventions. This article analyses which cognitive domains deteriorate after surgery and which brain areas might be involved. The most important outcomes are: (1) POCD encompasses a wide range of cognitive impairments; (2) POCD affects larger areas of the brain; and (3) individual variation in the vulnerability of neuronal networks to neuroinflammatory mechanisms may determine if and how POCD manifests itself. We argue that, for pre-clinical and clinical research of POCD to advance, the effects of surgery on various cognitive functions and brain areas should be studied. Moreover, in addition to general characteristics, research should take inter-relationships between cognitive complaints and physical and mental characteristics into account. Copyright © 2012 Elsevier Inc. All rights reserved.

  16. [Reliability theory based on quality risk network analysis for Chinese medicine injection].

    PubMed

    Li, Zheng; Kang, Li-Yuan; Fan, Xiao-Hui

    2014-08-01

    A new risk analysis method based upon reliability theory was introduced in this paper for the quality risk management of Chinese medicine injection manufacturing plants. Risk events, including both causes and effects, were represented in the framework as nodes using a Bayesian network analysis approach. The method thus transfers the risk analysis results from failure mode and effect analysis (FMEA) onto a Bayesian network platform. With its structure and parameters determined, the network can be used to evaluate the system reliability quantitatively with probabilistic analytical approaches. Using network analysis tools such as GeNie and AgenaRisk, we are able to find the nodes that are most critical to the system reliability. The importance of each node to the system can be quantitatively evaluated by calculating the effect of the node on the overall risk, and a minimization plan can be determined accordingly to reduce these influences and improve the system reliability. Using the Shengmai injection manufacturing plant of SZYY Ltd as a user case, we analyzed the quality risk with both static FMEA analysis and dynamic Bayesian network analysis. The potential risk factors for the quality of Shengmai injection manufacturing were identified with the network analysis platform. Quality assurance actions were further defined to reduce the risk and improve the product quality.
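
    A minimal sketch of the FMEA-to-network idea follows, under stated assumptions: the cause nodes, probabilities and link strengths are hypothetical (not the Shengmai plant's actual model), and a simple noisy-OR gate stands in for the full Bayesian network. Node importance is computed as the drop in overall failure risk when a cause is eliminated, which mirrors the criticality ranking described above.

```python
# Two hypothetical independent cause nodes feeding a quality-failure node.
# p_cause[c] = probability the cause occurs; link_strength[c] = probability
# the cause produces a failure given that it occurs (noisy-OR parameters).
p_cause = {"contamination": 0.02, "mixing_error": 0.05}
link_strength = {"contamination": 0.9, "mixing_error": 0.6}

def p_failure(causes):
    """Noisy-OR: failure occurs unless every cause fails to fire."""
    p_ok = 1.0
    for c, p in causes.items():
        p_ok *= 1.0 - p * link_strength[c]
    return 1.0 - p_ok

baseline = p_failure(p_cause)

# Importance of each node = reduction in system risk if that cause
# were eliminated; the largest value points at the best QA target.
importance = {}
for c in p_cause:
    reduced = dict(p_cause, **{c: 0.0})
    importance[c] = baseline - p_failure(reduced)
```

In a full Bayesian network the same query is run by setting evidence on a node and re-propagating, which is what tools like GeNie and AgenaRisk automate.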

  17. An empirical review of antimalarial quality field surveys: the importance of characterising outcomes.

    PubMed

    Grech, James; Robertson, James; Thomas, Jackson; Cooper, Gabrielle; Naunton, Mark; Kelly, Tamsin

    2018-01-05

    For decades, thousands of people have been dying from malaria infections because of poor-quality medicines (PQMs). While numerous efforts have been initiated to reduce their presence, PQMs are still risking the lives of those seeking treatment. This review addresses the importance of characterising the results of antimalarial medicine field surveys based upon agreed, clearly defined definitions. Medicines found to be of poor quality can be falsified or counterfeit, substandard, or degraded. The distinction between these categories is important as each category requires a different countermeasure. To observe the current trends in the reporting of field surveys, a systematic literature search of six academic databases resulted in the quantitative analysis of 61 full-text journal articles. Information including sample size, sampling method, geographical regions, analytical techniques, and characterisation conclusions was recorded for each. The lack of an accepted uniform reporting system has resulted in varying, incomplete reports, which may not include important information that helps form effective countermeasures. The programmes influencing medicine quality, such as prequalification, procurement services, awareness and education, can be supported with the information derived from characterised results. The implementation of checklists such as the Medicine Quality Assessment Reporting Guidelines will further strengthen the battle against poor-quality antimalarials. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Quality in laboratory medicine: 50 years on.

    PubMed

    Plebani, Mario

    2017-02-01

    The last 50 years have seen substantial changes in the landscape of laboratory medicine: its role in modern medicine is in evolution and the quality of laboratory services is changing. The need to control and improve quality in clinical laboratories has grown hand in hand with technological developments, leading to an impressive reduction of analytical errors over time. An essential cause of this impressive improvement has been the introduction and monitoring of quality indicators (QIs) such as the analytical performance specifications (in particular bias and imprecision) based on well-established goals. The evolving landscape of quality and errors in clinical laboratories moved first from analytical errors to all errors performed within the laboratory walls, subsequently to errors in laboratory medicine (including errors in test requesting and result interpretation), and finally, to a focus on errors more frequently associated with adverse events (laboratory-associated errors). After decades in which clinical laboratories have focused on monitoring and improving internal indicators of analytical quality, efficiency and productivity, it is time to shift toward indicators of total quality, clinical effectiveness and patient outcomes. Copyright © 2016 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  19. The effect of pre-evaporation on ion distributions in inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Liu, Shulan; Beauchemin, Diane

    2006-02-01

    The connecting tube (2- or 5-mm i.d., 11-cm long) between the spray chamber and the torch was heated (to 400 °C) to investigate the effect of pre-evaporation on the distribution of ions in inductively coupled plasma mass spectrometry (ICP-MS). Axial and radial profiles of analyte ions (Al+, V+, Cr+, Ni+, Zn+, Mn+, Zn+, As+, Se+, Mo+, Cd+, Sb+, La+, Pb+) in 1% HNO3 as well as some polyatomic ions (LaO+, ArO+, ArN+, CO2+) were simultaneously obtained on a time-of-flight ICP-MS instrument. Upon heating the connecting tube, the optimal axial position of all elements shifted closer to the load coil. Without the heated tube, 3.5 mm was the compromise axial position for multielemental analysis, which was optimal for 6 analytes. With the heated tube, this position became 1.5 mm, which was then optimal for 9 of the 14 analytes. Furthermore, the radial profiles, which were wide with a plateau in their middle without heating, became significantly narrower and Gaussian-like with a heated tube. This narrowing, which was most pronounced for the 5-mm tube, slightly (by a factor of two at most) yet significantly (at the 95% confidence level) improved the sensitivity of all elements but Mn upon optimisation of the axial position for compromise multi-element analysis. Furthermore, a concurrent decrease in the standard deviation of the blank was significant at the 95% confidence level for 9 of the 14 analytes. For most of the analytes, this translated into a two-fold to order-of-magnitude improvement in detection limit, which is commensurate with a reduction of noise resulting from the smaller droplets entering the plasma after traversing the pre-evaporation tube.

  20. White tea intake prevents prediabetes-induced metabolic dysfunctions in testis and epididymis preserving sperm quality.

    PubMed

    Dias, Tânia R; Alves, Marco G; Rato, Luís; Casal, Susana; Silva, Branca M; Oliveira, Pedro F

    2016-11-01

    Prediabetes has been associated with alterations in the male reproductive tract, especially in the testis and epididymis. Moreover, in vitro studies have described a promising action of tea (Camellia sinensis L.) against metabolic dysfunctions. Herein, we hypothesized that white tea (WTEA) ingestion by prediabetic animals could ameliorate the metabolic alterations induced by the disease in testicular and epididymal tissues, preserving sperm quality. WTEA infusion was prepared and its phytochemical profile was evaluated by 1H-NMR. A streptozotocin-induced prediabetic rat model was developed and three experimental groups were defined: control, prediabetic (PreDM) and prediabetic drinking WTEA (PreDM+WTEA). Metabolic profiles of testis and epididymis were evaluated by determining metabolite content (1H-NMR), protein levels (western blot) and enzymatic activities of key metabolic intervenients. The quality of spermatozoa from the cauda epididymis was also assessed. Prediabetes increased glucose transporter 3 protein levels and decreased lactate dehydrogenase activity in the testis, resulting in a lower lactate content. WTEA ingestion led to a metabolic adaptation that restored testicular lactate content. Concerning the epididymis, prediabetes decreased the protein levels of several metabolic intervenients, resulting in decreased lactate and alanine content. WTEA consumption restored most of the observed alterations, though not lactate content. WTEA also improved epididymal sperm motility and restored sperm viability. Prediabetes strongly affected testicular and epididymal metabolic status and most of these alterations were restored by WTEA consumption, resulting in the improvement of sperm quality. Our results suggest that WTEA consumption can be a cost-effective strategy to improve prediabetes-induced reproductive dysfunction. Copyright © 2016 Elsevier Inc. All rights reserved.

  1. 1990 National Water Quality Laboratory Services Catalog

    USGS Publications Warehouse

    Pritt, Jeffrey; Jones, Berwyn E.

    1989-01-01

    PREFACE This catalog provides information about analytical services available from the National Water Quality Laboratory (NWQL) to support programs of the Water Resources Division of the U.S. Geological Survey. To assist personnel in the selection of analytical services, the catalog lists cost, sample volume, applicable concentration range, detection level, precision of analysis, and preservation techniques for samples to be submitted for analysis. Prices for services reflect operational costs, the complexity of each analytical procedure, and the costs to ensure analytical quality control. The catalog consists of five parts. Part 1 is a glossary of terminology; Part 2 lists the bottles, containers, solutions, and other materials that are available through the NWQL; Part 3 describes the field processing of samples to be submitted for analysis; Part 4 describes analytical services that are available; and Part 5 contains indices of analytical methodology and Chemical Abstract Services (CAS) numbers. Nomenclature used in the catalog is consistent with WATSTORE and STORET. The user is provided with laboratory codes and schedules that consist of groupings of parameters which are measured together in the NWQL. In cases where more than one analytical range is offered for a single element or compound, different laboratory codes are given. Book 5 of the series 'Techniques of Water Resources Investigations of the U.S. Geological Survey' should be consulted for more information about the analytical procedures included in the tabulations. This catalog supersedes U.S. Geological Survey Open-File Report 86-232, '1986-87-88 National Water Quality Laboratory Services Catalog', October 1985.

  2. 100-OL-1 Operable Unit Pilot Study: XRF Evaluation of Select Pre-Hanford Orchards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bunn, Amoret L.; Fritz, Brad G.; Pulsipher, Brent A.

    Prior to the acquisition of land by the U.S. Department of War in February 1943 and the creation of the Hanford Site, the land along the Columbia River was home to over 1000 people. Farming and orchard operations by both homesteaders and commercial organizations were prevalent. Orchard activities and the associated application of lead arsenate pesticide ceased in 1943, when residents were moved from the Hanford Site at the beginning of the Manhattan Project. Today, the residues from historical application of lead arsenate pesticide persist in some locations on the Hanford Site. In 2012, the U.S. Department of Energy, U.S. Environmental Protection Agency, and Washington State Department of Ecology established the 100-OL-1 Operable Unit (OU) through the Hanford Federal Facility Agreement and Consent Order, known as the Tri-Party Agreement. The pre-Hanford orchard lands identified as the 100-OL-1 OU are located south of the Columbia River and east of the present-day Vernita Bridge, and extend southeast to the former Hanford townsite. The discontinuous orchard lands within the 100-OL-1 OU are approximately 20 km2 (5000 ac). A pilot study was conducted to support the approval of the remedial investigation/feasibility study work plan to evaluate the 100-OL-1 OU. This pilot study evaluated the use of a field portable X-ray fluorescence (XRF) analyzer for evaluating lead and arsenic concentrations on the soil surface as an indicator of lead arsenate pesticide residues in the OU. The objectives of the pilot study included evaluating a field portable XRF analyzer as the analytical method for decision making, estimating the nature and extent of lead and arsenic in surface soils in four decision units, evaluating the results for the purpose of optimizing the sampling approach implemented in the remedial investigation, and collecting information to improve the cost estimate and planning the cultural resources review for sampling activities in the remedial investigation.
Based on the results of the pilot study, the recommendations for revising the work plan are as follows:
• characterize the surface soil using field portable XRF measurements, with confirmatory inductively coupled plasma mass spectrometry sampling, for the remedial investigation
• establish decision units of similar defined areas
• establish a process for field investigation of soil concentrations exceeding the screening criteria at the border of the 100-OL-1 OU
• define data quality objectives for the work plan using the results of the pilot study and refine the sampling approach for the remedial investigation

  3. Systems 1 and 2 thinking processes and cognitive reflection testing in medical students

    PubMed Central

    Tay, Shu Wen; Ryan, Paul; Ryan, C Anthony

    2016-01-01

    Background Diagnostic decision-making is made through a combination of Systems 1 (intuition or pattern-recognition) and Systems 2 (analytic) thinking. The purpose of this study was to use the Cognitive Reflection Test (CRT) to evaluate and compare the level of Systems 1 and 2 thinking among medical students in pre-clinical and clinical programs. Methods The CRT is a three-question test designed to measure the ability of respondents to activate metacognitive processes and switch to System 2 (analytic) thinking where System 1 (intuitive) thinking would lead them astray. Each CRT question has a correct analytical (System 2) answer and an incorrect intuitive (System 1) answer. A group of medical students in Years 2 & 3 (pre-clinical) and Year 4 (in clinical practice) of a 5-year medical degree were studied. Results Ten percent (13/128) of students gave the intuitive answers to all three questions (suggesting they generally relied on System 1 thinking) while almost half (44%) answered all three correctly (indicating full analytical, System 2 thinking). Only 3–13% had incorrect answers (i.e. that were neither the analytical nor the intuitive responses). Non-native English speaking students (n = 11) had a lower mean number of correct answers than native English speakers (n = 117): 1.0 vs. 2.12, respectively; p < 0.01. As students progressed through questions 1 to 3, the percentage of correct System 2 answers increased and the percentage of intuitive answers decreased in both the pre-clinical and clinical students. Conclusions Up to half of the medical students demonstrated full or partial reliance on System 1 (intuitive) thinking in response to these analytical questions. While their CRT performance makes no claims as to their future expertise as clinicians, the test may be used to help students understand the importance of awareness and regulation of their thinking processes in clinical practice. PMID:28344696

  4. Systems 1 and 2 thinking processes and cognitive reflection testing in medical students.

    PubMed

    Tay, Shu Wen; Ryan, Paul; Ryan, C Anthony

    2016-10-01

    Diagnostic decision-making is made through a combination of Systems 1 (intuition or pattern-recognition) and Systems 2 (analytic) thinking. The purpose of this study was to use the Cognitive Reflection Test (CRT) to evaluate and compare the level of Systems 1 and 2 thinking among medical students in pre-clinical and clinical programs. The CRT is a three-question test designed to measure the ability of respondents to activate metacognitive processes and switch to System 2 (analytic) thinking where System 1 (intuitive) thinking would lead them astray. Each CRT question has a correct analytical (System 2) answer and an incorrect intuitive (System 1) answer. A group of medical students in Years 2 & 3 (pre-clinical) and Year 4 (in clinical practice) of a 5-year medical degree were studied. Ten percent (13/128) of students gave the intuitive answers to all three questions (suggesting they generally relied on System 1 thinking) while almost half (44%) answered all three correctly (indicating full analytical, System 2 thinking). Only 3-13% had incorrect answers (i.e. answers that were neither the analytical nor the intuitive responses). Non-native English speaking students (n = 11) had a lower mean number of correct answers compared to native English speakers (n = 117: 1.0 vs 2.12 respectively; p < 0.01). As students progressed through questions 1 to 3, the percentage of correct System 2 answers increased and the percentage of intuitive answers decreased in both the pre-clinical and clinical students. Up to half of the medical students demonstrated full or partial reliance on System 1 (intuitive) thinking in response to these analytical questions. While no claims can be made from their CRT performance about their future expertise as clinicians, the test may be used to help students understand the importance of awareness and regulation of their thinking processes in clinical practice.
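
    The CRT scoring scheme described above, where each question has one correct analytic answer and one intuitive lure, can be sketched as follows; the answer keys are placeholders for illustration, not necessarily the study's actual items:

```python
# Hypothetical sketch of CRT response classification. The keys below are
# placeholders standing in for the three test items; the study's actual
# questions and answers are not reproduced here.
ANALYTIC = {"q1": "5", "q2": "5", "q3": "47"}      # correct System 2 answers (placeholder)
INTUITIVE = {"q1": "10", "q2": "100", "q3": "24"}  # System 1 lures (placeholder)

def classify(question: str, response: str) -> str:
    """Label one CRT response as analytic, intuitive, or other."""
    if response == ANALYTIC[question]:
        return "analytic"
    if response == INTUITIVE[question]:
        return "intuitive"
    return "other"

def score(responses: dict) -> dict:
    """Count response types across the three questions for one respondent."""
    counts = {"analytic": 0, "intuitive": 0, "other": 0}
    for q, r in responses.items():
        counts[classify(q, r)] += 1
    return counts
```

    A respondent answering "10", "5", "47" would be counted as two analytic and one intuitive response, the kind of partial System 1 reliance the study describes.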

  5. Insight and Action Analytics: Three Case Studies to Consider

    ERIC Educational Resources Information Center

    Milliron, Mark David; Malcolm, Laura; Kil, David

    2014-01-01

    Civitas Learning was conceived as a community of practice, bringing together forward-thinking leaders from diverse higher education institutions to leverage insight and action analytics in their ongoing efforts to help students learn well and finish strong. We define insight and action analytics as drawing, federating, and analyzing data from…

  6. Learning Analytics: Potential for Enhancing School Library Programs

    ERIC Educational Resources Information Center

    Boulden, Danielle Cadieux

    2015-01-01

    Learning analytics has been defined as the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs. The potential use of data and learning analytics in educational contexts has caught the attention of educators and…

  7. Improvement of crystalline quality of N-polar AlN layers on c-plane sapphire by low-pressure flow-modulated MOCVD

    NASA Astrophysics Data System (ADS)

    Takeuchi, M.; Shimizu, H.; Kajitani, R.; Kawasaki, K.; Kumagai, Y.; Koukitu, A.; Aoyagi, Y.

    2007-01-01

    The growth of N-polar AlN layers on c-plane sapphire is reported. Low-temperature AlN (LT-AlN) layers were used as seeding buffer layers with pre-nitridation of the sapphire. To avoid the strong vapor-phase reaction between trimethylaluminum (TMA) and ammonia (NH3) and to improve the crystalline quality, a low-pressure flow-modulated (FM) metal-organic chemical vapor deposition (MOCVD) technique was introduced with careful optimization of the FM sequence. The surface morphologies and the crystalline quality defined by the X-ray diffraction (XRD) (0 0 2) and (1 0 0) rocking curve measurements strongly depended on the LT-AlN thickness and on the TMA coverage per cycle of the FM growth. The sample showing the best XRD data with a good morphology was almost completely etched in aqueous KOH solution owing to its N-polarity. From the plan-view transmission electron microscopy (TEM) observation, the dislocation density was counted to be about 3×10^10 cm^-2.

  8. Quality-Assurance Data for Routine Water Analyses by the U.S. Geological Survey Laboratory in Troy, New York-July 1997 through June 1999

    USGS Publications Warehouse

    Lincoln, Tricia A.; Horan-Ross, Debra A.; McHale, Michael R.; Lawrence, Gregory B.

    2006-01-01

    The laboratory for analysis of low-ionic-strength water at the U.S. Geological Survey (USGS) Water Science Center in Troy, N.Y., analyzes samples collected by USGS projects throughout the Northeast. The laboratory's quality-assurance program is based on internal and interlaboratory quality-assurance samples and quality-control procedures that were developed to ensure proper sample collection, processing, and analysis. The quality-assurance/quality-control data for the time period addressed in this report were stored in the laboratory's SAS data-management system, which provides efficient review, compilation, and plotting of data. This report presents and discusses results of quality-assurance and quality-control samples analyzed from July 1997 through June 1999. Results for the quality-control samples for 18 analytical procedures were evaluated for bias and precision. Control charts indicate that data for eight of the analytical procedures were occasionally biased for high-concentration and (or) low-concentration samples but were within control limits; these procedures were: acid-neutralizing capacity, total monomeric aluminum, total aluminum, ammonium, calcium, chloride, specific conductance, and sulfate. The data from the potassium and sodium analytical procedures are insufficient for evaluation. Results from the filter-blank and analytical-blank analyses indicate that the procedures for 11 of 13 analytes were within control limits, although the concentrations for blanks were occasionally outside the control limits. Blank analysis results for chloride showed that 22 percent of blanks did not meet data-quality objectives, and results for dissolved organic carbon showed that 31 percent of the blanks did not meet data-quality objectives. Sampling and analysis precision are evaluated herein in terms of the coefficient of variation obtained for triplicate samples in the procedures for 14 of the 18 analytes. 
At least 90 percent of the samples met data-quality objectives for all analytes except total aluminum (70 percent of samples met objectives) and potassium (83 percent of samples met objectives). Results of the USGS interlaboratory Standard Reference Sample (SRS) Project indicated good data quality for most constituents over the time period. The P-sample (low-ionic-strength constituents) analysis had good ratings in two of these studies and a satisfactory rating in the third. The results of the T-sample (trace constituents) analysis indicated high data quality with good ratings in all three studies. The N-sample (nutrient constituents) studies had one each of excellent, good, and satisfactory ratings. Results of Environment Canada's National Water Research Institute (NWRI) program indicated that at least 80 percent of the samples met data-quality objectives for 9 of the 13 analytes; the exceptions were dissolved organic carbon, ammonium, chloride, and specific conductance. Data-quality objectives were not met for dissolved organic carbon in two NWRI studies, but all of the samples were within control limits for the last study. Data-quality objectives were not met in 41 percent of samples analyzed for ammonium, 25 percent of samples analyzed for chloride, and 30 percent of samples analyzed for specific conductance. Results from blind reference-sample analyses indicated that data-quality objectives were met by at least 84 percent of the samples analyzed for calcium, chloride, magnesium, pH, and potassium. Data-quality objectives were met by 73 percent of those analyzed for sulfate. The data-quality objective was not met for sodium. The data are insufficient for evaluation of the specific conductance results.
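
    The precision measure used in the report, the coefficient of variation for triplicate samples, is straightforward to compute; a minimal sketch with an illustrative data-quality threshold (the report's actual per-analyte objectives are not reproduced here):

```python
import statistics

def coefficient_of_variation(triplicate):
    """CV (%) = sample standard deviation / mean * 100."""
    mean = statistics.mean(triplicate)
    return statistics.stdev(triplicate) / mean * 100.0

def meets_objective(triplicate, max_cv_percent):
    """Flag whether a triplicate meets a precision data-quality objective."""
    return coefficient_of_variation(triplicate) <= max_cv_percent
```

    For example, a triplicate of 10.0, 10.2, and 9.8 mg/L has a CV of about 2 percent and would pass a 5 percent objective.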

  9. New Analytical Monographs on TCM Herbal Drugs for Quality Proof.

    PubMed

    Wagner, Hildebert; Bauer, Rudolf; Melchart, Dieter

    2016-01-01

    Regardless of specific national drug regulations there is an international consensus that all TCM drugs must meet stipulated high quality standards focusing on authentication, identification and chemical composition. In addition, safety of all TCM drugs prescribed by physicians has to be guaranteed. During the 25 years history of the TCM hospital Bad Kötzting, 171 TCM drugs underwent an analytical quality proof including thin layer as well as high pressure liquid chromatography. As from now mass spectroscopy will also be available as analytical tool. The findings are compiled and already published in three volumes of analytical monographs. One more volume will be published shortly, and a fifth volume is in preparation. The main issues of the analytical procedure in TCM drugs like authenticity, botanical nomenclature, variability of plant species and parts as well as processing are pointed out and possible ways to overcome them are sketched. © 2016 S. Karger GmbH, Freiburg.

  10. Assessment of HIV/AIDS comprehensive correct knowledge among Sudanese university: a cross-sectional analytic study 2014.

    PubMed

    Elbadawi, Abdulateef; Mirghani, Hyder

    2016-01-01

    Comprehensive correct HIV/AIDS knowledge (CCAK) is defined as correctly identifying the two major ways of preventing the sexual transmission of HIV and rejecting the most common misconceptions about HIV transmission. There are limited studies on this topic in Sudan. In this study we investigated comprehensive correct HIV/AIDS knowledge among university students. A cross-sectional analytic study was conducted among 556 students from two universities in 2014. Data were collected using a self-administered, pre-tested, structured questionnaire. Chi-square was used for testing significance, and a P value of ≤ 0.05 was considered statistically significant. The majority (97.1%) of study subjects had heard of a disease called HIV/AIDS, while only 28.6% of them knew anyone in the local community infected with AIDS. Only a minority (13.8%) of students had CCAK; however, males showed a better level of CCAK than females (OR = 2.77), a highly significant difference (P value = 0.001). A poor rate of CCAK among university students was noted, especially among females. Almost half of the students did not know preventive measures for HIV, nearly two thirds held misconceptions, and about one third did not know the modes of transmission of HIV.
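
    The odds ratio and chi-square statistic reported above are computed from a 2×2 contingency table (sex by CCAK status); a minimal sketch with hypothetical counts, not the study's data:

```python
def odds_ratio(a, b, c, d):
    """OR from a 2x2 table laid out as [a b; c d]:
    a = exposed with outcome, b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    return (a * d) / (b * c)

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table (no continuity correction)."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
```

    With illustrative counts of 20/10 males and 10/20 females with/without CCAK, the odds ratio is 4.0; a table with identical proportions in both groups gives a chi-square of zero.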

  11. Collapse of chain anadiplosis-structured DNA nanowires for highly sensitive colorimetric assay of nucleic acids.

    PubMed

    Xu, Jianguo; Wu, Zai-Sheng; Chen, Yanru; Zheng, Tingting; Le, Jingqing; Jia, Lee

    2017-02-14

    In this work, we have proposed a chain anadiplosis-structured DNA nanowire built from two well-defined assembly strands (AS1 and AS2). The presence of a target analyte drives the single-stranded AS1 to dissociate from the pre-formed nanowire, converting it into a fully double-stranded form responsible for extensive accumulation of G-rich cleavage fragment 1 (GCF1) through an autonomously performed polymerization/nicking/displacement process. In turn, the produced GCF1 is able to hybridize with the un-peeled AS2, allowing replication over AS2 to occur and generate large amounts of G-rich cleavage fragment 2 (GCF2) with the ability to hybridize with the un-peeled AS1, thereafter initiating new enzymatic reactions for further collection of GCF1. Because these reactions occur repeatedly, the assembled nanowires gradually dissociate and ultimately collapse completely, achieving substantial signal amplification for the colorimetric readout of the target analytes. The sensing feasibility is first verified with one trigger primer (TP), and then exemplified with the detection of the target, the kras oncogene, with high sensitivity and specificity. As a proof-of-concept strategy, the intelligent signal readout pathway and desired assay ability provide unique insights into materials research and biological studies.

  12. Quality by design in the chiral separation strategy for the determination of enantiomeric impurities: development of a capillary electrophoresis method based on dual cyclodextrin systems for the analysis of levosulpiride.

    PubMed

    Orlandini, S; Pasquini, B; Del Bubba, M; Pinzauti, S; Furlanetto, S

    2015-02-06

    Quality by design (QbD) concepts, in accordance with International Conference on Harmonisation Pharmaceutical Development guideline Q8(R2), represent an innovative strategy for the development of analytical methods. In this paper QbD principles have been comprehensively applied in the set-up of a capillary electrophoresis method aimed at quantifying enantiomeric impurities. The test compound was the chiral drug substance levosulpiride (S-SUL) and the developed method was intended to be used for routine analysis of the pharmaceutical product. The target of the analytical QbD approach is to establish a design space (DS) of critical process parameters (CPPs) within which the critical quality attributes (CQAs) of the method are assured to fulfil the desired requirements with a selected probability. QbD can improve the understanding of the enantioseparation process, including both the electrophoretic behavior of the enantiomers and their separation, thereby enabling its control. The CQAs were represented by enantioresolution and analysis time. The scouting phase made it possible to select a separation system composed of sulfated-β-cyclodextrin and a neutral cyclodextrin, operating in reverse polarity mode. The type of neutral cyclodextrin was included among the other CPPs, both instrumental and related to background electrolyte composition, which were evaluated in a screening phase by an asymmetric screening matrix. Response surface methodology was carried out by a Doehlert design and allowed the contour plots to be drawn, highlighting significant interactions between some of the CPPs. The DS was defined by applying Monte-Carlo simulations and corresponded to the following intervals: sulfated-β-cyclodextrin concentration, 9-12 mM; methyl-β-cyclodextrin concentration, 29-38 mM; Britton-Robinson buffer pH, 3.24-3.50; voltage, 12-14 kV. Robustness of the method was examined by a Plackett-Burman matrix and the obtained results, together with system repeatability data, led to the definition of a method control strategy. The method was validated and was finally applied to determine the enantiomeric purity of S-SUL in pharmaceutical dosage forms. Copyright © 2015 Elsevier B.V. All rights reserved.
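
    The Monte-Carlo definition of the design space can be sketched as follows: sample candidate CPP settings, predict each CQA with the fitted response-surface model, and estimate the probability of meeting the requirements. Only the CPP intervals below come from the reported DS; the linear models and CQA limits are illustrative placeholders, not the paper's fitted Doehlert models:

```python
import random

# Placeholder response-surface models for the two CQAs (enantioresolution Rs
# and analysis time). Real coefficients would come from the Doehlert design;
# these are illustrative assumptions only.
def predict_rs(s_cd, m_cd, ph, kv):
    return 0.8 + 0.08 * s_cd + 0.02 * m_cd + 0.5 * (ph - 3.0) - 0.02 * kv

def predict_time(s_cd, m_cd, ph, kv):
    return 25.0 - 0.9 * kv + 0.1 * m_cd

def prob_meeting_specs(n=10_000, min_rs=1.5, max_time=20.0, seed=1):
    """Monte-Carlo estimate of the probability that a random point in the
    candidate CPP region satisfies both CQA requirements."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(n):
        s_cd = rng.uniform(9, 12)      # sulfated-beta-CD concentration, mM
        m_cd = rng.uniform(29, 38)     # methyl-beta-CD concentration, mM
        ph = rng.uniform(3.24, 3.50)   # Britton-Robinson buffer pH
        kv = rng.uniform(12, 14)       # voltage, kV
        if predict_rs(s_cd, m_cd, ph, kv) >= min_rs and \
           predict_time(s_cd, m_cd, ph, kv) <= max_time:
            ok += 1
    return ok / n
```

    A region is accepted as the DS when this probability meets the selected target (e.g. 90 percent); with the placeholder models above the whole sampled region passes.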

  13. Defining the challenges of the Modern Analytical Laboratory (CPSA USA 2014): the risks and reality of personalized healthcare.

    PubMed

    Weng, Naidong; Needham, Shane; Lee, Mike

    2015-01-01

    The 17th Annual Symposium on Clinical and Pharmaceutical Solutions through Analysis (CPSA) was held 29 September-2 October 2014 at the Sheraton Bucks County Hotel, Langhorne, PA, USA. CPSA USA 2014 brought together the various analytical fields that define the challenges of the modern analytical laboratory. Ongoing discussions focused on the future application of bioanalysis and other disciplines to support investigational new drug (IND) and new drug application (NDA) submissions, on clinical diagnostics and the pathology laboratory personnel who support patient sample analysis, and on the clinical researchers who provide insights into new biomarkers within the context of the modern laboratory and personalized medicine.

  14. Self-Identified "Linguistic Microaggressions" among Monolingual Pre-Service Teachers: Why They Matter for English Language Learners

    ERIC Educational Resources Information Center

    Shim, Jenna

    2017-01-01

    Using the concept of "racial microaggressions" as an analytical tool, this study reports on white monolingual pre-service teachers' self-identified linguistic microaggressions by exploring their attitudinal and affective responses to those who speak languages other than English. The assumption is that teachers' pedagogical practices and…

  15. Fostering Awareness through Transmediation: Preparing Pre-Service Teachers for Critical Engagement with Multicultural Literature

    ERIC Educational Resources Information Center

    Hadjioannou, Xenia; Hutchinson, Mary

    2014-01-01

    Research has extolled the potential of transmediation in expanding learners' analytical and critical insight. However, this approach requires teachers prepared to employ this multimodal way of knowing. This study examines the impact of transmediation course experiences on pre-service teachers' comprehension of and critical engagement with…

  16. A Case Study on Pre-Service Teachers Students' Interaction with Graphical Artefacts

    ERIC Educational Resources Information Center

    Olande, Oduor

    2014-01-01

    This study reports on pre-service teachers' online learning and assessment activity on determining the variability of two graphical artefacts. Using a critical-analytical perspective on the data, the present study indicates that the prospective teachers surveyed showed awareness of relevant subject-specific operators and methods; however, these seem…

  17. Predicting Pre-Service Teachers' Opposition to Inclusion of Students with Disabilities: A Path Analytic Study

    ERIC Educational Resources Information Center

    Crowson, H. Michael; Brandes, Joyce A.

    2014-01-01

    This study addressed predictors of pre-service teachers' opposition toward the practice of educating students with disabilities in mainstream classroom settings--a practice known as inclusion. We tested a hypothesized path model that incorporated social dominance orientation (SDO) and contact as distal predictors, and intergroup anxiety,…

  18. Establishment of Protocols for Global Metabolomics by LC-MS for Biomarker Discovery.

    PubMed

    Saigusa, Daisuke; Okamura, Yasunobu; Motoike, Ikuko N; Katoh, Yasutake; Kurosawa, Yasuhiro; Saijyo, Reina; Koshiba, Seizo; Yasuda, Jun; Motohashi, Hozumi; Sugawara, Junichi; Tanabe, Osamu; Kinoshita, Kengo; Yamamoto, Masayuki

    2016-01-01

    Metabolomics is a promising avenue for biomarker discovery. Although the quality of metabolomic analyses, especially global metabolomics (G-Met) using mass spectrometry (MS), largely depends on the instrumentation, potential bottlenecks still exist at several basic levels in the metabolomics workflow. Therefore, we established a precise protocol, initially for the G-Met analysis of human blood plasma, to overcome some of these difficulties. In our protocol, samples are deproteinized in a 96-well plate using an automated liquid-handling system, and analyses are conducted using either a UHPLC-QTOF/MS system equipped with a reverse-phase column or an LC-FTMS system equipped with a normal-phase column. A normalization protocol for G-Met data was also developed to compensate for intra- and inter-batch differences; variation was significantly reduced after normalization, especially for UHPLC-QTOF/MS data acquired with a C18 reverse-phase column in positive-ion mode. Secondly, we examined the changes in metabolomic profiles caused by storage of EDTA-blood specimens to identify quality markers for evaluating the specimens' pre-analytical conditions. Forty quality markers, including lysophospholipids, dipeptides, fatty acids, succinic acid, amino acids, glucose, and uric acid, were identified by G-Met for the evaluation of plasma sample quality, and an equation for calculating a quality score was established. We applied these quality markers in a small-scale study to evaluate the quality of clinical samples. The G-Met protocols and quality markers established here should prove useful for the discovery and development of biomarkers for a wider range of diseases.
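
    The paper's quality-score equation is not reproduced in the abstract; purely as an illustration, one plausible form is a composite of marker deviations from reference values measured in well-handled specimens (the study's actual equation may differ):

```python
import statistics

def quality_score(sample, reference_means, reference_sds):
    """Illustrative composite score: mean absolute z-score of the quality
    markers relative to reference values from well-handled specimens.
    Lower scores indicate marker levels closer to the reference (better
    pre-analytical quality). This is a sketch, not the published equation."""
    zs = []
    for marker, value in sample.items():
        z = (value - reference_means[marker]) / reference_sds[marker]
        zs.append(abs(z))
    return statistics.mean(zs)
```

    A sample whose markers sit exactly at the reference means would score 0; storage-induced drift in markers such as lysophospholipids would raise the score.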

  19. Stakeholder involvement in establishing a milk quality sub-index in dairy cow breeding goals: a Delphi approach.

    PubMed

    Henchion, M; McCarthy, M; Resconi, V C; Berry, D P; McParland, S

    2016-05-01

    The relative weightings on traits within breeding goals are generally determined by bio-economic models or profit functions. While such methods have generally delivered profitability gains to producers, and are being expanded to consider non-market values, current approaches generally do not consider the numerous and diverse stakeholders that affect, or are affected by, such tools. Based on the principles of respondent anonymity, iteration, controlled feedback and statistical aggregation of feedback, a Delphi study was undertaken to gauge stakeholder opinion on the importance of detailed milk quality traits within an overall dairy breeding goal for profit, with the aim of assessing its suitability as a complementary, participatory approach to defining breeding goals. The questionnaires used over two survey rounds asked stakeholders: (a) their opinion on incorporating an explicit sub-index for milk quality into a national breeding goal; (b) the importance they would assign to a pre-determined list of milk quality traits and (c) the (relative) weighting they would give such a milk quality sub-index. Results from the survey highlighted a good degree of consensus among stakeholders on the issues raised. Similarly, revelation of the underlying assumptions and knowledge used by stakeholders to make their judgements illustrated their ability to consider a range of perspectives when evaluating traits, and to reconsider their answers based on the responses and rationales given by others, which demonstrated social learning. Finally, while the relative importance assigned by stakeholders in the Delphi survey (4% to 10%) and the results of calculations based on selection index theory of the relative emphasis that should be placed on milk quality to halt any deterioration (16%) are broadly in line, the difference indicates the benefit of considering more than one approach to determining breeding goals. This study thus illustrates the role of the Delphi technique as a complement to traditional approaches in defining breeding goals. This has implications for how breeding goals will be defined and for determining who should be involved in the decision-making process.
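
    The Delphi principles named above, iteration with controlled feedback and statistical aggregation, are commonly implemented by feeding back a central tendency and spread for each item between rounds; a minimal sketch, assuming a median/IQR summary (the study's actual aggregation statistics are not specified in the abstract):

```python
import statistics

def aggregate_round(ratings):
    """Controlled feedback for one Delphi item: report the group median and
    interquartile range so panellists can reconsider their answers in the
    next round. Narrow IQRs indicate emerging consensus."""
    q1, _, q3 = statistics.quantiles(ratings, n=4)
    return {"median": statistics.median(ratings), "iqr": q3 - q1}
```

    For example, trait-importance ratings of 4, 5, 6, 7, 8 on a 9-point scale would be fed back as a median of 6 with an IQR of 3.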

  20. Classroom Quality at Pre-kindergarten and Kindergarten and Children’s Social Skills and Behavior Problems

    PubMed Central

    Broekhuizen, Martine L.; Mokrova, Irina L.; Burchinal, Margaret R.; Garrett-Peters, Patricia T.

    2016-01-01

    Focusing on the continuity in the quality of classroom environments as children transition from preschool into elementary school, this study examined the associations between classroom quality in pre-kindergarten and kindergarten and children’s social skills and behavior problems in kindergarten and first grade. Participants included 1175 ethnically-diverse children (43% African American) living in low-wealth rural communities of the US. Results indicated that children who experienced higher levels of emotional and organizational classroom quality in both pre-kindergarten and kindergarten demonstrated better social skills and fewer behavior problems in both kindergarten and first grade compared with children who did not experience higher classroom quality. The examination of the first grade results indicated that the emotional and organizational quality of pre-kindergarten classrooms was the strongest predictor of children’s first grade social skills and behavior problems. The study results are discussed from theoretical, practical, and policy perspectives. PMID:26949286

  1. Microtube strip heat exchanger

    NASA Astrophysics Data System (ADS)

    Doty, F. D.

    1990-12-01

    Doty Scientific (DSI) believes its microtube-strip heat exchanger will contribute significantly to the following: (1) the closed Brayton cycles being pursued at MIT, NASA, and elsewhere; (2) reverse Brayton cycle cryocoolers, currently being investigated by NASA for space missions, being applied to MRI superconducting magnets; and (3) high-efficiency cryogenic gas separation schemes for CO2 removal from exhaust stacks. The goal of the current study is to show the potential for substantial progress in high-effectiveness, low-cost, gas-to-gas heat exchangers for diverse applications at temperatures from below 100 K to above 1000 K. To date, the highest effectiveness measured is about 98 percent, and relative pressure drops below 0.1 percent with a specific conductance of about 45 W/kgK are reported. During the pre-award period DSI built and tested a 3-module heat exchanger bank using 103-tube microtube-strip (MTS) modules. To add to its analytical capabilities, DSI has acquired computational fluid dynamics (CFD) software. This report describes the pre-award work and the status of the ten tasks of the current project, which are: analyze flow distribution and thermal stresses within individual modules; design a heat exchanger bank of ten modules with 400 microtubes per module; obtain production-quality tubestrip die and AISI 304 tubestrips; obtain production-quality microtubing; construct the revised MTS heat exchanger; construct dies and fixtures for the prototype heat exchanger; construct 100 MTS modules; assemble 8 to 10 prototype MTS heat exchangers; test the prototype MTS heat exchanger; and verify tests through independent means.
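
    The ~98 percent effectiveness quoted above refers to the standard definition of heat-exchanger effectiveness: actual heat transfer divided by the thermodynamic maximum. A minimal sketch of that calculation; the temperatures and capacity rates below are illustrative, not DSI test data:

```python
def effectiveness(t_hot_in, t_hot_out, t_cold_in, c_hot, c_cold):
    """Heat-exchanger effectiveness:
    eps = C_hot * (Th_in - Th_out) / (C_min * (Th_in - Tc_in)),
    i.e. actual heat transfer over the maximum possible for the
    given inlet temperatures and capacity rates (C = m_dot * cp)."""
    c_min = min(c_hot, c_cold)
    return c_hot * (t_hot_in - t_hot_out) / (c_min * (t_hot_in - t_cold_in))
```

    With balanced capacity rates, a hot stream cooled from 300 K to 104 K against a cold inlet at 100 K gives eps = 196/200 = 0.98, the order of performance the report cites.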

  2. Quality assurance for health and environmental chemistry: 1990

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gautier, M.A.; Gladney, E.S.; Koski, N.L.

    1991-10-01

    This report documents the continuing quality assurance efforts of the Health and Environmental Chemistry Group (HSE-9) at the Los Alamos National Laboratory. The philosophy, methodology, computing resources, and laboratory information management system used by the quality assurance program to encompass the diversity of analytical chemistry practiced in the group are described. Included in the report are all quality assurance reference materials used, along with their certified or consensus concentrations, and all analytical chemistry quality assurance measurements made by HSE-9 during 1990.

  3. 'Whose failure counts?' A critical reflection on definitions of failure for community health volunteers providing HIV self-testing in a community-based HIV/TB intervention study in urban Malawi.

    PubMed

    Sambakunsi, Rodrick; Kumwenda, Moses; Choko, Augustine; Corbett, Elizabeth L; Desmond, Nicola Ann

    2015-12-01

    The category of community health worker applied within the context of health intervention trials has been promoted as a cost-effective approach to meeting study objectives across large populations, relying on the promotion of the concept of 'community belonging' to encourage altruistic volunteerism from community members to promote health. This community-based category of individuals is recruited to facilitate externally driven priorities defined by large research teams, outside of the target research environment. An externally defined intervention is then 'brought to' the community through locally recruited community volunteers who form a bridge between the researchers and participants. The specific role of these workers is context-driven and responsive to the needs of the intervention. This paper is based on the findings from an annual evaluation of community health worker performance employed as community counsellors to deliver semi-supervised HIV self-testing (HIVST) at community level of a large HIV/TB intervention trial conducted in urban Blantyre, Malawi. A performance evaluation was conducted to appraise individual service delivery and assess achievements in meeting pre-defined targets for uptake of HIVST with the aim of improving overall uptake of HIVST. Through an empirical 'evaluation of the evaluation' this paper critically reflects on the position of the community volunteer through the analytical lens of 'failure', exploring the tensions in communication and interpretation of intervention delivery between researchers and community volunteers and the differing perspectives on defining failure. It is concluded that community interventions should be developed in collaboration with the population and that information guiding success should be clearly defined.

  4. ‘Whose failure counts?’ A critical reflection on definitions of failure for community health volunteers providing HIV self-testing in a community-based HIV/TB intervention study in urban Malawi

    PubMed Central

    Sambakunsi, Rodrick; Kumwenda, Moses; Choko, Augustine; Corbett, Elizabeth L.; Desmond, Nicola Ann

    2015-01-01

    The category of community health worker applied within the context of health intervention trials has been promoted as a cost-effective approach to meeting study objectives across large populations, relying on the promotion of the concept of ‘community belonging’ to encourage altruistic volunteerism from community members to promote health. This community-based category of individuals is recruited to facilitate externally driven priorities defined by large research teams, outside of the target research environment. An externally defined intervention is then ‘brought to’ the community through locally recruited community volunteers who form a bridge between the researchers and participants. The specific role of these workers is context-driven and responsive to the needs of the intervention. This paper is based on the findings from an annual evaluation of community health worker performance employed as community counsellors to deliver semi-supervised HIV self-testing (HIVST) at community level of a large HIV/TB intervention trial conducted in urban Blantyre, Malawi. A performance evaluation was conducted to appraise individual service delivery and assess achievements in meeting pre-defined targets for uptake of HIVST with the aim of improving overall uptake of HIVST. Through an empirical ‘evaluation of the evaluation’ this paper critically reflects on the position of the community volunteer through the analytical lens of ‘failure’, exploring the tensions in communication and interpretation of intervention delivery between researchers and community volunteers and the differing perspectives on defining failure. It is concluded that community interventions should be developed in collaboration with the population and that information guiding success should be clearly defined. PMID:26762610

  5. Meaningful Investments in Pre-K: Estimating the Per-Child Costs of Quality Programs. Pre-K Now Research Series

    ERIC Educational Resources Information Center

    Gault, Barbara; Mitchell, Anne W.; Williams, Erica

    2008-01-01

    Pre-kindergarten programs are expanding in states around the nation. Decades of research on the impact of these programs show that high quality standards produce substantial benefits for children, working families, and communities. As state leaders seek to increase investments in pre-k, they face a number of choices and potential tradeoffs that…

  6. Hyperspectral imaging and its applications

    NASA Astrophysics Data System (ADS)

    Serranti, S.; Bonifazi, G.

    2016-04-01

    Hyperspectral imaging (HSI) is an emerging technique that combines the imaging properties of a digital camera with the spectroscopic properties of a spectrometer, detecting the spectral attributes of each pixel in an image. Owing to these characteristics, HSI makes it possible to evaluate, qualitatively and quantitatively, the effects of the interactions of light with organic and/or inorganic materials. The results of this interaction are usually displayed as a spectral signature: a sequence of energy values, in a pre-defined wavelength interval, for each of the investigated/collected wavelengths. Following this approach, it is thus possible to collect, in a fast and reliable way, spectral information that is strictly linked to the chemical-physical characteristics of the investigated materials and/or products. Considering that in a hyperspectral image the spectrum of each pixel can be analyzed, HSI can be considered one of the best nondestructive technologies for accurate and detailed information extraction. HSI can be applied in different wavelength fields, the most common being the visible (VIS: 400-700 nm), the near infrared (NIR: 1000-1700 nm) and the short-wave infrared (SWIR: 1000-2500 nm). It can be applied for inspections from micro- to macro-scale, up to remote sensing. HSI produces a large amount of information due to the great number of continuous spectral bands collected. When successful, such an approach is usually reliable and robust, and is characterized by lower costs than commonly applied off-line and/or on-line analytical approaches.
More and more applications have thus been developed and tested in recent years, especially in food inspection, covering a large range of products such as fruits and vegetables, meat, fish, eggs and cereals, but also in the medical and pharmaceutical sectors, in cultural heritage, in material characterization and in waste recycling. Examples of some applications based on HSI, originally developed by the authors, are presented, critically analyzed and discussed, with reference to the different hardware configurations and logics utilized to perform the analysis, according to the characterization, inspection and quality control actions to apply.
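The per-pixel spectral signature described above can be sketched in a few lines; the toy cube, band values and pixel coordinates below are purely illustrative, not data from the paper:

```python
# Minimal sketch of reading per-pixel spectral signatures from a
# hyperspectral cube, here a nested list indexed as [row][col][band].

def pixel_signature(cube, row, col):
    """Return the spectral signature (one value per band) of one pixel."""
    return list(cube[row][col])

def band_slice(cube, band):
    """Return a 2-D image of a single spectral band."""
    return [[pixel[band] for pixel in r] for r in cube]

# A toy 2x2 cube with 3 bands (e.g. sampled in the VIS range 400-700 nm).
cube = [
    [[0.10, 0.40, 0.80], [0.12, 0.42, 0.78]],
    [[0.50, 0.55, 0.60], [0.90, 0.20, 0.10]],
]

sig = pixel_signature(cube, 1, 1)   # spectrum of the lower-right pixel
img = band_slice(cube, 0)           # first-band image of all pixels
```

In a real HSI pipeline the cube would come from a camera file and each signature would be compared against reference spectra for classification.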

  7. A multiple objective optimization approach to quality control

    NASA Technical Reports Server (NTRS)

    Seaman, Christopher Michael

    1991-01-01

    The use of product quality as the performance criterion for manufacturing system control is explored. The goal in manufacturing, for economic reasons, is to optimize product quality. The problem is that since quality is a rather nebulous product characteristic, there is seldom an analytic function that can be used as a measure. Therefore standard control approaches, such as optimal control, cannot readily be applied. A second problem with optimizing product quality is that it is typically measured along many dimensions: there are many aspects of quality which must be optimized simultaneously. Very often these different aspects are incommensurate and competing. The concept of optimality must now include accepting tradeoffs among the different quality characteristics. These problems are addressed using multiple objective optimization. It is shown that the quality control problem can be defined as a multiple objective optimization problem, and a controller structure is defined on this basis. Then, an algorithm is presented which an operator can use to interactively find the best operating point. Essentially, the algorithm uses process data to provide the operator with two pieces of information: (1) if it is possible to simultaneously improve all quality criteria, determine what changes to the process inputs or controller parameters should be made to do this; and (2) if it is not possible to improve all criteria, and the current operating point is not a desirable one, select a criterion in which a tradeoff should be made, and make input changes to improve all other criteria. If all criteria can still be improved simultaneously, with no tradeoff required to reach a new operating point, then the process is not yet operating at an optimal point in any sense. This algorithm ensures that operating points are optimal in some sense and provides the operator with information about tradeoffs when seeking the best operating point.
The multiobjective algorithm was implemented in two different injection molding scenarios: tuning of process controllers to meet specified performance objectives and tuning of process inputs to meet specified quality objectives. Five case studies are presented.
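The tradeoff structure among competing quality criteria described above is the notion of Pareto dominance: an operating point is optimal "in some sense" exactly when no other point improves every criterion at once. A minimal sketch, with made-up operating points rather than the paper's injection-molding data:

```python
# Pareto dominance among operating points, each scored on several
# quality criteria (higher is better). Illustrative values only.

def dominates(a, b):
    """True if point a is at least as good as b on every criterion
    and strictly better on at least one."""
    return (all(x >= y for x, y in zip(a, b)) and
            any(x > y for x, y in zip(a, b)))

def pareto_front(points):
    """Operating points not dominated by any other point: moving away
    from these requires trading one criterion against another."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

points = [(0.9, 0.2), (0.6, 0.6), (0.3, 0.9), (0.5, 0.5)]
front = pareto_front(points)   # (0.5, 0.5) is dominated by (0.6, 0.6)
```

An interactive procedure like the one in the abstract would move the process toward this front, then present the operator with tradeoffs along it.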

  8. Pre-Calculus Instructional Guide for Elementary Functions, Analytic Geometry.

    ERIC Educational Resources Information Center

    Montgomery County Public Schools, Rockville, MD.

    This is a guide for use in semester-long courses in Elementary Functions and Analytic Geometry. A list of entry-level skills and a list of approved textbooks are provided. Each of the 18 units consists of: (1) overview, suggestions for teachers, and suggested time; (2) list of objectives; (3) cross-references guide to approved textbooks; (4) sample…

  9. Teachers as Producers of Data Analytics: A Case Study of a Teacher-Focused Educational Data Science Program

    ERIC Educational Resources Information Center

    McCoy, Chase; Shih, Patrick C.

    2016-01-01

    Educational data science (EDS) is an emerging, interdisciplinary research domain that seeks to improve educational assessment, teaching, and student learning through data analytics. Teachers have been portrayed in the EDS literature as users of pre-constructed data dashboards in educational technologies, with little consideration given to them as…

  10. What do we know about the non-work determinants of workers' mental health? A systematic review of longitudinal studies

    PubMed Central

    2011-01-01

    Background In the past years, cumulative evidence has convincingly demonstrated that the work environment is a critical determinant of workers' mental health. Nevertheless, much less attention has been dedicated towards understanding the pathways through which other pivotal life environments might also concomitantly intervene, along with the work environment, to bring about mental health outcomes in the workforce. The aim of this study consisted in conducting a systematic review examining the relative contribution of non-work determinants to the prediction of workers' mental health in order to bridge that gap in knowledge. Methods We searched electronic databases and bibliographies up to 2008 for observational longitudinal studies jointly investigating work and non-work determinants of workers' mental health. A narrative synthesis (MOOSE) was performed to synthesize data and provide an assessment of study conceptual and methodological quality. Results Thirteen studies were selected for evaluation. Seven of these were of relatively high methodological quality. Assessment of study conceptual quality yielded modest analytical breadth and depth in the ways studies conceptualized the non-work domain as defined by family, network and community/society-level indicators. We found evidence of moderate strength supporting a causal association between social support from the networks and workers' mental health, but insufficient evidence of specific indicator involvement for other analytical levels considered (i.e., family, community/society). Conclusions Largely underinvestigated, non-work determinants are important to the prediction of workers' mental health. More longitudinal studies concomitantly investigating work and non-work determinants of workers' mental health are warranted to better inform healthy workplace research, intervention, and policy. PMID:21645393

  11. Evidence for consciousness-related anomalies in random physical systems

    NASA Astrophysics Data System (ADS)

    Radin, Dean I.; Nelson, Roger D.

    1989-12-01

    Speculations about the role of consciousness in physical systems are frequently observed in the literature concerned with the interpretation of quantum mechanics. While only three experimental investigations can be found on this topic in physics journals, more than 800 relevant experiments have been reported in the literature of parapsychology. A well-defined body of empirical evidence from this domain was reviewed using meta-analytic techniques to assess methodological quality and overall effect size. Results showed effects conforming to chance expectation in control conditions and unequivocal non-chance effects in experimental conditions. This quantitative literature review agrees with the findings of two earlier reviews, suggesting the existence of some form of consciousness-related anomaly in random physical systems.

  12. The importance of quality control in validating concentrations ...

    EPA Pesticide Factsheets

    A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds, and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs which allowed us to assess and compare performances of the methods used. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method comparison analytes that were determined in two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds measured did not consistently meet predetermined quality standards. Methodologies that did not seem suitable for these analytes are overviewed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols. This paper compares the method performance of six analytical methods used to measure 174 emer
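The spiked-recovery comparison underlying the QA/QC assessment above can be sketched simply; the concentrations and the 70-130% acceptance limits below are illustrative assumptions, not values from the study:

```python
# Spike-recovery check, the basic statistic used to compare method
# performance in reagent, source, and treated water. Numbers invented.

def percent_recovery(measured, native, spiked):
    """Recovery = (measured - native concentration) / spike level * 100."""
    return 100.0 * (measured - native) / spiked

def within_limits(recovery, low=70.0, high=130.0):
    """Screen a recovery against acceptance limits (illustrative range);
    analytes that repeatedly fall outside would be flagged or excluded."""
    return low <= recovery <= high

r = percent_recovery(measured=95.0, native=10.0, spiked=100.0)  # 85.0
ok = within_limits(r)
```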

  13. Rapid Method Development in Hydrophilic Interaction Liquid Chromatography for Pharmaceutical Analysis Using a Combination of Quantitative Structure-Retention Relationships and Design of Experiments.

    PubMed

    Taraji, Maryam; Haddad, Paul R; Amos, Ruth I J; Talebi, Mohammad; Szucs, Roman; Dolan, John W; Pohl, Chris A

    2017-02-07

    A design-of-experiment (DoE) model was developed, able to describe the retention times of a mixture of pharmaceutical compounds in hydrophilic interaction liquid chromatography (HILIC) under all possible combinations of acetonitrile content, salt concentration, and mobile-phase pH with R² > 0.95. Further, a quantitative structure-retention relationship (QSRR) model was developed to predict retention times for new analytes, based only on their chemical structures, with a root-mean-square error of prediction (RMSEP) as low as 0.81%. A compound classification based on the concept of similarity was applied prior to QSRR modeling. Finally, we utilized a combined QSRR-DoE approach to propose an optimal design space in a quality-by-design (QbD) workflow to facilitate the HILIC method development. The mathematical QSRR-DoE model was shown to be highly predictive when applied to an independent test set of unseen compounds under unseen conditions, with an RMSEP value of 5.83%. The QSRR-DoE computed retention times of pharmaceutical test analytes, and the separation selectivity subsequently calculated from them, were used to optimize the chromatographic conditions for efficient separation of targets. A Monte Carlo simulation was performed to evaluate the risk of uncertainty in the model's prediction, and to define the design space where the desired quality criterion was met. Experimental realization of peak selectivity between targets under the selected optimal working conditions confirmed the theoretical predictions. These results demonstrate how the discovery of optimal conditions for the separation of new analytes can be accelerated by the use of appropriate theoretical tools.
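The RMSEP figure of merit quoted above has a standard definition; a minimal sketch with invented retention times, not the paper's data:

```python
import math

def rmsep(observed, predicted):
    """Root-mean-square error of prediction over an independent test set:
    sqrt(mean of squared residuals)."""
    n = len(observed)
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n)

# Illustrative retention times (minutes) for four unseen test analytes.
obs  = [5.0, 7.5, 10.0, 12.5]
pred = [5.2, 7.4, 9.7, 12.9]
err = rmsep(obs, pred)
```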

  14. Raman spectroscopy for the analytical quality control of low-dose break-scored tablets.

    PubMed

    Gómez, Diego A; Coello, Jordi; Maspoch, Santiago

    2016-05-30

    Quality control of solid dosage forms involves the analysis of end products according to well-defined criteria, including the assessment of the uniformity of dosage units (UDU). However, in the case of break-scored tablets, given that tablet splitting is widespread as a means to adjust doses, the uniform distribution of the active pharmaceutical ingredient (API) in all the possible fractions of the tablet must be assessed. A general procedure to address both issues, using Raman spectroscopy, is presented. It is based on the acquisition of a collection of spectra in different regions of the tablet, which can later be selected to determine the amount of API in the potential fractions that can result after splitting. The procedure has been applied to two commercial products, Sintrom 1 and Sintrom 4, with API (acenocoumarol) mass proportions of 2% and 0.7%, respectively. Partial Least Squares (PLS) calibration models were constructed for the quantification of acenocoumarol in whole tablets, using HPLC as a reference analytical method. Once validated, the calibration models were used to determine the API content in the different potential fragments of the scored Sintrom 4 tablets. Fragment mass measurements were also performed to estimate the range of masses of the halves and quarters that could result after tablet splitting. The results show that Raman spectroscopy can be an alternative analytical procedure to assess the uniformity of content, both in whole tablets and in their potential fragments, and that Sintrom 4 tablets can be split into halves without difficulty, although some caution has to be taken when considering fragmentation into quarters. A practical alternative to the use of the UDU test for the assessment of tablet fragments is proposed. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. The impact of JP-4/JP-8 conversion on aircraft engine exhaust emissions. Interim technical report Jul 75--Feb 76

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blazowski, W.S.

    1976-05-01

    The proposed conversion of predominant Air Force fuel usage from JP-4 to JP-8 has created the need to examine the dependence of engine pollutant emission on fuel type. Available data concerning the effect of fuel type on emissions has been reviewed. T56 single combustor testing has been undertaken to determine JP-4/JP-8 emission variations over a wide range of simulated engine cycle operating conditions at idle. In addition, a J85-5 engine was tested using JP-4 and JP-8. Results of the previous and new data collectively led to the following conclusions regarding conversion to JP-8: (a) HC and CO emission changes will depend upon individual combustor design features, (b) no change to NOx emission will occur, and (c) an increase in smoke/particulate emissions will result. It is recommended that these findings be incorporated into air quality analytical models to define the overall impact of the proposed conversion. Further, it is recommended that combustor analytical models be employed to attempt prediction of the results described herein. Should these models be successful, analytical prediction of JP-8 emissions from other Air Force engine models may be substituted for more combustor rig or engine testing. (auth)

  16. The evaluation and enhancement of quality, environmental protection and seaport safety by using FAHP

    NASA Astrophysics Data System (ADS)

    Tadic, Danijela; Aleksic, Aleksandar; Popovic, Pavle; Arsovski, Slavko; Castelli, Ana; Joksimovic, Danijela; Stefanovic, Miladin

    2017-02-01

    The evaluation and enhancement of business processes in any organization in an uncertain environment presents one of the main requirements of ISO 9000:2008 and has a key effect on competitive advantage and long-term sustainability. The aim of this paper can be defined as the identification and discussion of some of the most important business processes of seaports and the performances of business processes and their key performance indicators (KPIs). The complexity and importance of the treated problem call for analytic methods rather than intuitive decisions. The existing decision variables of the considered problem are described by linguistic expressions which are modelled by triangular fuzzy numbers (TFNs). In this paper, the modified fuzzy extended analytic hierarchy process (FAHP) is proposed. The assessment of the relative importance of each pair of performances and their key performance indicators are stated as a fuzzy group decision-making problem. By using the modified fuzzy extended analytic hierarchy process, the fuzzy rank of business processes of a seaport is obtained. The model is tested through an illustrative example with real-life data, where the obtained data suggest measures which should enhance business strategy and improve key performance indicators. The future improvement is based on benchmark and knowledge sharing.
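Triangular fuzzy numbers and the degree-of-possibility comparison are the core operations behind extent-analysis FAHP; a minimal sketch, assuming Chang's classical formulation (the paper's modified FAHP may differ in detail), with invented TFN values:

```python
# Triangular fuzzy number (TFN) arithmetic as used in fuzzy AHP.
# A TFN (l, m, u) models a linguistic judgement like "moderately
# more important". All values below are illustrative.

def tfn_add(a, b):
    """Componentwise sum of two TFNs (l, m, u)."""
    return (a[0] + b[0], a[1] + b[1], a[2] + b[2])

def possibility(a, b):
    """Degree of possibility V(a >= b), per Chang's extent analysis:
    1 if a's modal value dominates, 0 if the supports do not overlap,
    otherwise the ordinate of the intersection point."""
    if a[1] >= b[1]:
        return 1.0
    if b[0] >= a[2]:
        return 0.0
    return (b[0] - a[2]) / ((a[1] - a[2]) - (b[1] - b[0]))

a = (0.2, 0.4, 0.6)   # e.g. importance of one KPI
b = (0.3, 0.5, 0.7)   # e.g. importance of another
v = possibility(a, b)  # partial dominance, strictly between 0 and 1
```

The minimum of such degrees over all pairwise comparisons yields the crisp weight of each performance indicator, from which the fuzzy rank of business processes is derived.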

  17. Trauma system in Greece: Quo Vadis?

    PubMed

    Anagnostou, Evangelos; Larentzakis, Andreas; Vassiliu, Pantelis

    2018-05-23

    Implementation of trauma systems has markedly assisted in improving outcomes of the injured patient. However, differences exist internationally, as diverse social factors, economic conditions and national particularities place obstacles. The purpose of this paper is to critically evaluate the current Greek trauma system, provide a comprehensive review and suggest key actions. An exhaustive search of the English and Greek literature, scarce on this subject, was carried out to analyze all the main components of the Greek trauma system, according to the American College of Surgeons' criteria as well as the WHO Trauma Systems Maturity Index. Regarding prevention, efforts are moving in the right direction, lowering the rate of deaths related to road traffic incidents; however, rural and insular regions remain behind. The Hellenic Emergency Medical Service (EKAB) has well-defined communications and an emergency phone line, but faces problems with educating people on how to use it properly. In addition, equal and systematic training of ambulance personnel is a challenge, and the lack of a pre-hospital registry and of EMS quality assessment raises the question of where the related services currently stand. Redistribution of facilities' roles, with the establishment of the first formal trauma centre within the existing infrastructure, would facilitate the development of a national registry and the introduction of the trauma surgeon subspecialty with proper training potential. Definite rehabilitation institutional protocols that include both inpatient and outpatient care are needed. Disaster preparedness entails an extensive national plan and regular drills, mainly at the pre-hospital level. The lack, however, of any accompanying quality assurance programs hampers the effort to yield the desirable results. Despite the recent economic crisis in Greece, actions solving logistics and organisational issues may offer a well-defined, integrated trauma system without uncontrollably raising costs.
Political will is needed for reforms that use pre-existing infrastructure and working power in a more efficient way, with a first line priority being the establishment of the first major trauma centre that could function as the cornerstone for the building of the Greek trauma system. Copyright © 2018 Elsevier Ltd. All rights reserved.

  18. Computing health quality measures using Informatics for Integrating Biology and the Bedside.

    PubMed

    Klann, Jeffrey G; Murphy, Shawn N

    2013-04-19

    The Health Quality Measures Format (HQMF) is a Health Level 7 (HL7) standard for expressing computable Clinical Quality Measures (CQMs). Creating tools to process HQMF queries in clinical databases will become increasingly important as the United States moves forward with its Health Information Technology Strategic Plan to Stages 2 and 3 of the Meaningful Use incentive program (MU2 and MU3). Informatics for Integrating Biology and the Bedside (i2b2) is one of the analytical databases used as part of the Office of the National Coordinator (ONC)'s Query Health platform to move toward this goal. Our goal is to integrate i2b2 with the Query Health HQMF architecture, to prepare for other HQMF use-cases (such as MU2 and MU3), and to articulate the functional overlap between i2b2 and HQMF. Therefore, we analyze the structure of HQMF, and then we apply this understanding to HQMF computation on the i2b2 clinical analytical database platform. Specifically, we develop a translator between two query languages, HQMF and i2b2, so that the i2b2 platform can compute HQMF queries. We use the HQMF structure of queries for aggregate reporting, which define clinical data elements and the temporal and logical relationships between them. We use the i2b2 XML format, which allows flexible querying of a complex clinical data repository in an easy-to-understand domain-specific language. The translator can represent nearly any i2b2-XML query as HQMF and execute in i2b2 nearly any HQMF query expressible in i2b2-XML. This translator is part of the freely available reference implementation of the QueryHealth initiative. We analyze limitations of the conversion and find it covers many, but not all, of the complex temporal and logical operators required by quality measures. HQMF is an expressive language for defining quality measures, and it will be important to understand and implement for CQM computation, in both meaningful use and population health. 
However, its current form might allow complexity that is intractable for current database systems (both in terms of implementation and computation). Our translator, which supports the subset of HQMF currently expressible in i2b2-XML, may represent the beginnings of a practical compromise. It is being pilot-tested in two Query Health demonstration projects, and it can be further expanded to balance computational tractability with the advanced features needed by measure developers.
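The AND-of-ORs shape of aggregate-report queries can be illustrated with a toy translator. Real HQMF and i2b2 queries are XML documents; the dict structures and key names below ("panel", "items", "code") are hypothetical stand-ins invented for illustration, not either format's actual schema:

```python
# Hypothetical miniature of the translation idea: map a query tree of
# logical groupings of clinical data elements onto i2b2-style panels
# (each top-level AND child becomes a panel; OR leaves become items).

def to_panels(node):
    """Translate an AND-of-ORs query tree into a flat panel list."""
    assert node["op"] == "AND", "sketch handles a top-level AND only"
    panels = []
    for child in node["children"]:
        leaves = child["children"] if child.get("op") == "OR" else [child]
        panels.append({"items": [leaf["code"] for leaf in leaves]})
    return {"panels": panels}

# "Patients with a diabetes diagnosis AND (an HbA1c OR a glucose lab)".
query = {"op": "AND", "children": [
    {"code": "diabetes"},
    {"op": "OR", "children": [{"code": "HbA1c"}, {"code": "glucose"}]},
]}
result = to_panels(query)
```

The hard part the abstract alludes to, temporal operators and measure-specific logic, is exactly what such a flat panel structure cannot express, hence the "many, but not all" coverage of the real translator.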

  19. Computing Health Quality Measures Using Informatics for Integrating Biology and the Bedside

    PubMed Central

    Murphy, Shawn N

    2013-01-01

    Background The Health Quality Measures Format (HQMF) is a Health Level 7 (HL7) standard for expressing computable Clinical Quality Measures (CQMs). Creating tools to process HQMF queries in clinical databases will become increasingly important as the United States moves forward with its Health Information Technology Strategic Plan to Stages 2 and 3 of the Meaningful Use incentive program (MU2 and MU3). Informatics for Integrating Biology and the Bedside (i2b2) is one of the analytical databases used as part of the Office of the National Coordinator (ONC)’s Query Health platform to move toward this goal. Objective Our goal is to integrate i2b2 with the Query Health HQMF architecture, to prepare for other HQMF use-cases (such as MU2 and MU3), and to articulate the functional overlap between i2b2 and HQMF. Therefore, we analyze the structure of HQMF, and then we apply this understanding to HQMF computation on the i2b2 clinical analytical database platform. Specifically, we develop a translator between two query languages, HQMF and i2b2, so that the i2b2 platform can compute HQMF queries. Methods We use the HQMF structure of queries for aggregate reporting, which define clinical data elements and the temporal and logical relationships between them. We use the i2b2 XML format, which allows flexible querying of a complex clinical data repository in an easy-to-understand domain-specific language. Results The translator can represent nearly any i2b2-XML query as HQMF and execute in i2b2 nearly any HQMF query expressible in i2b2-XML. This translator is part of the freely available reference implementation of the QueryHealth initiative. We analyze limitations of the conversion and find it covers many, but not all, of the complex temporal and logical operators required by quality measures. 
Conclusions HQMF is an expressive language for defining quality measures, and it will be important to understand and implement for CQM computation, in both meaningful use and population health. However, its current form might allow complexity that is intractable for current database systems (both in terms of implementation and computation). Our translator, which supports the subset of HQMF currently expressible in i2b2-XML, may represent the beginnings of a practical compromise. It is being pilot-tested in two Query Health demonstration projects, and it can be further expanded to balance computational tractability with the advanced features needed by measure developers. PMID:23603227

  20. Opportunities and challenges of real-time release testing in biopharmaceutical manufacturing.

    PubMed

    Jiang, Mo; Severson, Kristen A; Love, John Christopher; Madden, Helena; Swann, Patrick; Zang, Li; Braatz, Richard D

    2017-11-01

    Real-time release testing (RTRT) is defined as "the ability to evaluate and ensure the quality of in-process and/or final drug product based on process data, which typically includes a valid combination of measured material attributes and process controls" (ICH Q8[R2]). This article discusses sensors (process analytical technology, PAT) and control strategies that enable RTRT for the spectrum of critical quality attributes (CQAs) in biopharmaceutical manufacturing. Case studies from the small-molecule and biologic pharmaceutical industry are described to demonstrate how RTRT can be facilitated by integrated manufacturing and multivariable control strategies to ensure the quality of products. RTRT can enable increased assurance of product safety, efficacy, and quality, with improved productivity, including faster release and potentially decreased costs, all of which improve the value to patients. To implement a complete RTRT solution, biologic drug manufacturers need to consider the special attributes of their industry, particularly sterility and the measurement of viral and microbial contamination. Continued advances in on-line and in-line sensor technologies are key for the biopharmaceutical manufacturing industry to achieve the potential of RTRT. Related article: http://onlinelibrary.wiley.com/doi/10.1002/bit.26378/full. © 2017 Wiley Periodicals, Inc.

  1. Identification of the iron oxidation state and coordination geometry in iron oxide- and zeolite-based catalysts using pre-edge XAS analysis.

    PubMed

    Boubnov, Alexey; Lichtenberg, Henning; Mangold, Stefan; Grunwaldt, Jan Dierk

    2015-03-01

    Analysis of the oxidation state and coordination geometry using pre-edge analysis is attractive for heterogeneous catalysis and materials science, especially for in situ and time-resolved studies or highly diluted systems. In the present study, focus is laid on iron-based catalysts. First a systematic investigation of the pre-edge region of the Fe K-edge using staurolite, FePO4, FeO and α-Fe2O3 as reference compounds for tetrahedral Fe(2+), tetrahedral Fe(3+), octahedral Fe(2+) and octahedral Fe(3+), respectively, is reported. In particular, high-resolution and conventional X-ray absorption spectra are compared, considering that in heterogeneous catalysis and material science a compromise between high-quality spectroscopic data acquisition and simultaneous analysis of functional properties is required. Results, which were obtained from reference spectra acquired with different resolution and quality, demonstrate that this analysis is also applicable to conventionally recorded pre-edge data. For this purpose, subtraction of the edge onset is preferentially carried out using an arctangent and a first-degree polynomial, independent of the resolution and quality of the data. For both standard and high-resolution data, multiplet analysis of pre-edge features has limitations due to weak transitions that cannot be identified. On the other hand, an arbitrary empirical peak fitting assists the analysis in that non-local transitions can be isolated. The analysis of the oxidation state and coordination geometry of the Fe sites using a variogram-based method is shown to be effective for standard-resolution data and leads to the same results as for high-resolution spectra. This method, validated by analysing spectra of reference compounds and their well defined mixtures, is finally applied to track structural changes in a 1% Fe/Al2O3 and a 0.5% Fe/BEA zeolite catalyst during reduction in 5% H2/He. 
The results, hardly accessible by other techniques, show that Fe(3+) is transformed into Fe(2+), while the local Fe-O coordination number of 4-5 is maintained, suggesting that the reduction involves a rearrangement of the oxygen neighbours rather than their removal. In conclusion, the variogram-based analysis of Fe K-edge spectra proves to be very useful in catalysis research.
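The recommended background model, an arctangent step plus a first-degree polynomial, can be sketched directly; all parameter values below are illustrative placeholders, not fitted to any spectrum:

```python
import math

def edge_background(e, e0, height, width, slope, intercept):
    """Edge-onset background at energy e: an arctangent step centred at
    e0 plus a first-degree polynomial, the subtraction model the study
    recommends for isolating Fe K pre-edge features."""
    return (height * (0.5 + math.atan((e - e0) / width) / math.pi)
            + slope * e + intercept)

def subtract_background(energies, mu, **params):
    """Residual pre-edge intensity after background removal."""
    return [m - edge_background(e, **params) for e, m in zip(energies, mu)]

# Toy absorption values around a nominal Fe K-edge position (~7112 eV).
energies = [7110.0, 7112.0, 7114.0]
mu = [0.1, 0.6, 1.1]
peaks = subtract_background(energies, mu, e0=7112.0, height=1.0,
                            width=1.0, slope=0.0, intercept=0.0)
```

In practice the step and polynomial parameters are fitted to regions away from the pre-edge peaks, and the residual is then decomposed into peak components.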

  2. A European Sustainable Tourism Labels proposal using a composite indicator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blancas, Francisco Javier, E-mail: fjblaper@upo.es; Lozano-Oyola, Macarena, E-mail: mlozoyo@upo.es; González, Mercedes, E-mail: m_gonzalez@uma.es

    The tourism sector in Europe faces important challenges which it must deal with to promote its future development. In this context, the European Commission considers that two key issues must be addressed. On the one hand, a better base of socio-economic knowledge about tourism and its relationship with the environment is needed, and, on the other hand, it is necessary to improve the image of European areas as quality sustainable tourism destinations. In this paper we present analytical tools that cover these needs. Specifically, we define a system of sustainable tourism indicators and we obtain a composite indicator incorporating weights quantified using a panel of experts. Employing the values of this global indicator as a basis, we define a Sustainable Tourism Country-Brand Ranking which assesses the perception of each country-brand depending on its degree of sustainability, and a system of sustainable tourism labels which rewards the management carried out. - Highlights: • We define a system of indicators to improve the knowledge about sustainable tourism. • We obtain composite indicators based on expert knowledge. • The Sustainable Tourism Country-Brand Ranking would improve the image of destinations. • We define a Sustainable Tourism Labels System to assess country-brands. • The conclusions of the empirical analysis can be extrapolated to other tourist areas.
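A weighted composite indicator and the resulting country-brand ranking can be sketched as follows; the weights and sub-indicator values are invented for illustration, not the expert panel's actual figures:

```python
# Composite-indicator aggregation: normalized sub-indicators are
# combined with expert-elicited weights into one sustainability score,
# and country-brands are ranked by that score. Values illustrative.

def composite_indicator(values, weights):
    """Weighted sum of normalized sub-indicators; weights sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(v * w for v, w in zip(values, weights))

def country_brand_ranking(scores):
    """Rank country-brands, highest composite score first."""
    return sorted(scores, key=scores.get, reverse=True)

weights = [0.5, 0.3, 0.2]   # hypothetical expert-panel weights
scores = {
    "A": composite_indicator([0.9, 0.4, 0.7], weights),
    "B": composite_indicator([0.6, 0.8, 0.5], weights),
}
order = country_brand_ranking(scores)
```

Label thresholds on the composite score (e.g. bands of the ranking) would then define the sustainable tourism labels the paper proposes.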

  3. Towards an Analytical Framework for Understanding the Development of a Quality Assurance System in an International Joint Programme

    ERIC Educational Resources Information Center

    Zheng, Gaoming; Cai, Yuzhuo; Ma, Shaozhuang

    2017-01-01

    This paper intends to construct an analytical framework for understanding quality assurance in international joint programmes and to test it in a case analysis of a European--Chinese joint doctoral degree programme. The development of a quality assurance system for an international joint programme is understood as an institutionalization process…

  4. Quantitative evaluation of the CEEM soil sampling intercomparison.

    PubMed

    Wagner, G; Lischer, P; Theocharopoulos, S; Muntau, H; Desaules, A; Quevauviller, P

    2001-01-08

    The aim of the CEEM soil project was to compare and to test the soil sampling and sample preparation guidelines used in the member states of the European Union and Switzerland for investigations of background and large-scale contamination of soils, soil monitoring and environmental risk assessments. The results of the comparative evaluation of the sampling guidelines demonstrated that, in soil contamination studies carried out with different sampling strategies and methods, comparable results can hardly be expected. Therefore, a reference database (RDB) was established by the organisers, which acted as a basis for the quantitative comparison of the participants' results. The detected deviations were related to the methodological details of the individual strategies. The comparative evaluation concept consisted of three steps: The first step was a comparison of the participants' samples (which were both centrally and individually analysed) between each other, as well as with the reference data base (RDB) and some given soil quality standards on the level of concentrations present. The comparison was made using the example of the metals cadmium, copper, lead and zinc. As a second step, the absolute and relative deviations between the reference database and the participants' results (both centrally analysed under repeatability conditions) were calculated. The comparability of the samples with the RDB was categorised on four levels. Methods of exploratory statistical analysis were applied to estimate the differential method bias among the participants. The levels of error caused by sampling and sample preparation were compared with those caused by the analytical procedures. As a third step, the methodological profiles of the participants were compiled to concisely describe the different procedures used. They were related to the results to find out the main factors leading to their incomparability. 
The outcome of this evaluation process was a list of strategies and methods, which are problematic with respect to comparability, and should be standardised and/or specified in order to arrive at representative and comparable results in soil contamination studies throughout Europe. Pre-normative recommendations for harmonising European soil sampling guidelines and standard operating procedures have been outlined in Wagner G, Desaules A, Muntau H, Theocharopoulos S. Comparative evaluation of European methods for sampling and sample preparation of soils for inorganic analysis (CEEM Soil). Final report of contract SMT4-CT96-2085. Sci Total Environ 2001;264:181-186; and Wagner G, Desaules A, Muntau H, Theocharopoulos S, Quevauviller P. Suggestions for harmonising sampling and sample pre-treatment procedures and improving quality assurance in pre-analytical steps of soil contamination studies. Sci Total Environ 2001;264:103-118.

  5. Authentic Education, the Deeper and Multidisciplinary Perspective of Education, from the Viewpoint of Analytical Psychology

    ERIC Educational Resources Information Center

    Watagodakumbura, Chandana

    2014-01-01

In this paper, the authentic education system defined with multidisciplinary perspectives (Watagodakumbura, 2013a, 2013b) is viewed from the additional perspective of analytical psychology. Analytical psychology provides insights into human development and has become increasingly popular among practicing psychologists in recent years. In…

  6. Clinical and electrophysiological characteristics of symmetric polyneuropathy in a cohort of systemic lupus erythematosus patients.

    PubMed

    Jasmin, R; Sockalingam, S; Ramanaidu, L P; Goh, K J

    2015-03-01

Peripheral neuropathy in systemic lupus erythematosus (SLE) is heterogeneous, and its commonest pattern is symmetrical polyneuropathy. The aim of this study was to describe the prevalence, clinical and electrophysiological features, disease associations and effects on function and quality of life of polyneuropathy in SLE patients, defined using combined clinical and electrophysiological diagnostic criteria. Consecutive SLE patients seen at the University of Malaya Medical Centre were included. Patients on medications, or with other disorders, known to cause neuropathy were excluded. Demographic, clinical and laboratory data were obtained using a pre-defined questionnaire. Function and health-related quality of life were assessed using the modified Rankin scale and SF-36 scores. Nerve conduction studies (NCS) were carried out in both upper and lower limbs. Polyneuropathy was defined as the presence of bilateral clinical symptoms and/or signs and bilateral abnormal NCS parameters. Of 150 patients, 23 (15.3%) had polyneuropathy. SLE-related polyneuropathy was mainly characterized by sensory symptoms of numbness/tingling and pain, with mild signs of absent ankle reflexes and reduced pain sensation. Function was minimally affected and there were no differences in quality of life scores. NCS abnormalities suggested a mild length-dependent axonal neuropathy, primarily in the distal lower limbs. Compared to those without polyneuropathy, SLE-related polyneuropathy patients were significantly older but had no other significant demographic or disease associations. SLE-related polyneuropathy is a chronic, axonal and predominantly sensory neuropathy, associated with older age. Its underlying pathogenetic mechanisms are unknown, although a possibility could be an increased susceptibility of peripheral nerves in SLE patients to the effects of aging.

  7. Quality assessment of internet pharmaceutical products using traditional and non-traditional analytical techniques.

    PubMed

    Westenberger, Benjamin J; Ellison, Christopher D; Fussner, Andrew S; Jenney, Susan; Kolinski, Richard E; Lipe, Terra G; Lyon, Robbe C; Moore, Terry W; Revelle, Larry K; Smith, Anjanette P; Spencer, John A; Story, Kimberly D; Toler, Duckhee Y; Wokovich, Anna M; Buhse, Lucinda F

    2005-12-08

    This work investigated the use of non-traditional analytical methods to evaluate the quality of a variety of pharmaceutical products purchased via internet sites from foreign sources and compared the results with those obtained from conventional quality assurance methods. Traditional analytical techniques employing HPLC for potency, content uniformity, chromatographic purity and drug release profiles were used to evaluate the quality of five selected drug products (fluoxetine hydrochloride, levothyroxine sodium, metformin hydrochloride, phenytoin sodium, and warfarin sodium). Non-traditional techniques, such as near infrared spectroscopy (NIR), NIR imaging and thermogravimetric analysis (TGA), were employed to verify the results and investigate their potential as alternative testing methods. Two of 20 samples failed USP monographs for quality attributes. The additional analytical methods found 11 of 20 samples had different formulations when compared to the U.S. product. Seven of the 20 samples arrived in questionable containers, and 19 of 20 had incomplete labeling. Only 1 of the 20 samples had final packaging similar to the U.S. products. The non-traditional techniques complemented the traditional techniques used and highlighted additional quality issues for the products tested. For example, these methods detected suspect manufacturing issues (such as blending), which were not evident from traditional testing alone.

  8. Measurement of antinuclear antibodies and their fine specificities: time for a change in strategy?

    PubMed

    Otten, Henny G; Brummelhuis, Walter J; Fritsch-Stork, Ruth; Leavis, Helen L; Wisse, Bram W; van Laar, Jacob M; Derksen, Ronald H W M

    2017-01-01

The current strategy for antinuclear antibody (ANA) analysis involves screening for their presence, followed by a detailed analysis of their specificity. The aim of this study was to compare the clinical and financial efficacy of this strategy between different commercial tests in a large cohort of unselected patients. In 1030 consecutive patients, associations were determined between the results of different ANA test systems and the pre-test probability of connective tissue disease (CTD). Test systems were used for screening (ANA-IIF vs. CTD screen) and for defining fine specificity (profile 3 line blot vs. CTD single analytes). Positive ANA-IIF and/or CTD screen results were found in 304 sera. Further analysis of ANA specificity by profile 3 line blot and CTD single analytes showed 86 discrepant results, of which more than a third were clinically relevant, with the CTD single-analyte assay performing better than the line blot in supporting or confirming the presence of a CTD. Autoantigens present in one test but absent in the other were of minor practical use. The ANA screening and identification strategies currently employed are not cost-effective, as 83% of tests were performed to find specific autoantibodies in patients without fitting clinical signs or symptoms. This causes many unexpected positive results and subsequent confusion in interpretation. We advocate that some autoantigens be excluded from the line blot and CTD assays and propose cost-effective, selective ANA specificity testing based purely on clinical guidance.

  9. European specialist porphyria laboratories: diagnostic strategies, analytical quality, clinical interpretation, and reporting as assessed by an external quality assurance program.

    PubMed

    Aarsand, Aasne K; Villanger, Jørild H; Støle, Egil; Deybach, Jean-Charles; Marsden, Joanne; To-Figueras, Jordi; Badminton, Mike; Elder, George H; Sandberg, Sverre

    2011-11-01

    The porphyrias are a group of rare metabolic disorders whose diagnosis depends on identification of specific patterns of porphyrin precursor and porphyrin accumulation in urine, blood, and feces. Diagnostic tests for porphyria are performed by specialized laboratories in many countries. Data regarding the analytical and diagnostic performance of these laboratories are scarce. We distributed 5 sets of multispecimen samples from different porphyria patients accompanied by clinical case histories to 18-21 European specialist porphyria laboratories/centers as part of a European Porphyria Network organized external analytical and postanalytical quality assessment (EQA) program. The laboratories stated which analyses they would normally have performed given the case histories and reported results of all porphyria-related analyses available, interpretative comments, and diagnoses. Reported diagnostic strategies initially showed considerable diversity, but the number of laboratories applying adequate diagnostic strategies increased during the study period. We found an average interlaboratory CV of 50% (range 12%-152%) for analytes in absolute concentrations. Result normalization by forming ratios to the upper reference limits did not reduce this variation. Sixty-five percent of reported results were within biological variation-based analytical quality specifications. Clinical interpretation of the obtained analytical results was accurate, and most laboratories established the correct diagnosis in all distributions. Based on a case-based EQA scheme, variations were apparent in analytical and diagnostic performance between European specialist porphyria laboratories. Our findings reinforce the use of EQA schemes as an essential tool to assess both analytical and diagnostic processes and thereby to improve patient care in rare diseases.
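The normalization step described in the abstract above (expressing each laboratory's result as a ratio to the upper reference limit) and the interlaboratory CV can be sketched as follows. The result values and reference limit here are invented for illustration and are not taken from the EQA data; a single common limit is assumed for all laboratories:

```python
# Interlaboratory agreement for one analyte across laboratories, and
# normalization of each result to the upper reference limit (URL).
# All values are hypothetical, not figures from the study.
from statistics import mean, stdev

results = [3.1, 4.0, 2.6, 5.2, 3.5]   # same sample, five labs (arbitrary units)
url = 1.5                              # hypothetical common upper reference limit

inter_lab_cv = 100 * stdev(results) / mean(results)   # interlaboratory CV, %
ratios = [r / url for r in results]                   # URL-normalized results
ratio_cv = 100 * stdev(ratios) / mean(ratios)

# Dividing every result by the same URL rescales the values but leaves the
# CV unchanged, illustrating why normalization alone cannot reduce
# interlaboratory variation.
print(round(inter_lab_cv, 1), round(ratio_cv, 1))  # 27.0 27.0
```

In the study each laboratory would use its own reference limit; the sketch only shows why a simple rescaling does not by itself remove method-dependent scatter.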

  10. Application of Sigma Metrics Analysis for the Assessment and Modification of Quality Control Program in the Clinical Chemistry Laboratory of a Tertiary Care Hospital.

    PubMed

    Iqbal, Sahar; Mustansar, Tazeen

    2017-03-01

Sigma is a metric that quantifies the performance of a process as a rate of defects per million opportunities. In clinical laboratories, sigma metric analysis is used to assess the performance of the laboratory's processes. The sigma metric is also used as a quality management strategy for a laboratory process, improving quality by addressing errors once they are identified. The aim of this study was to evaluate the errors in quality control of the analytical phase of the laboratory system by sigma metrics. For this purpose, sigma metric analysis was performed for analytes using internal and external quality control as quality indicators. The results of the sigma metric analysis were used to identify gaps and the need for modification in the laboratory's quality control strategy. The sigma metric was calculated for the quality control program of ten clinical chemistry analytes, including glucose, chloride, cholesterol, triglyceride, HDL, albumin, direct bilirubin, total bilirubin, protein and creatinine, at two control levels. To calculate the sigma metric, imprecision and bias were calculated from internal and external quality control data, respectively. The minimum acceptable performance was considered to be 3 sigma. Westgard sigma rules were applied to customize the quality control procedure. The sigma level was found to be acceptable (≥3) for glucose (L2), cholesterol, triglyceride, HDL, direct bilirubin and creatinine at both control levels. For the rest of the analytes the sigma metric was <3. The lowest sigma value was found for chloride (1.1) at L2. The highest sigma value was found for creatinine (10.1) at L3. HDL had the highest sigma values at both control levels (8.8 and 8.0 at L2 and L3, respectively). We conclude that analytes with a sigma value <3 require strict monitoring and modification of the quality control procedure. In this study, the application of sigma rules provided a practical solution for an improved and more focused design of the QC procedure.
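The sigma metric described above combines total allowable error (TEa), bias (estimated from external QC) and imprecision (CV, from internal QC) in the standard formula sigma = (TEa − |bias|) / CV. A minimal sketch, using hypothetical input values rather than figures from the study:

```python
# Sigma metric for a clinical chemistry analyte, using the standard
# laboratory QC formula. The TEa, bias, and CV values below are
# illustrative assumptions, not data from the study.

def sigma_metric(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
    """Return the sigma metric given total allowable error (TEa),
    bias (from external QC), and imprecision (CV, from internal QC),
    all expressed in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Hypothetical glucose example: TEa 10%, bias 1.5%, CV 2.0%
glucose_sigma = sigma_metric(10.0, 1.5, 2.0)
print(round(glucose_sigma, 2))  # 4.25 -> acceptable (>= 3 sigma)
```

An analyte scoring below 3 sigma, as several did in the study, would then be flagged for stricter Westgard rules and closer QC monitoring.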

  11. 42 CFR 493.803 - Condition: Successful participation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ..., subspecialty, and analyte or test in which the laboratory is certified under CLIA. (b) Except as specified in... a given specialty, subspecialty, analyte or test, as defined in this section, or fails to take...

  12. 42 CFR 493.803 - Condition: Successful participation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ..., subspecialty, and analyte or test in which the laboratory is certified under CLIA. (b) Except as specified in... a given specialty, subspecialty, analyte or test, as defined in this section, or fails to take...

  13. 42 CFR 493.803 - Condition: Successful participation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ..., subspecialty, and analyte or test in which the laboratory is certified under CLIA. (b) Except as specified in... a given specialty, subspecialty, analyte or test, as defined in this section, or fails to take...

  14. 42 CFR 493.803 - Condition: Successful participation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., subspecialty, and analyte or test in which the laboratory is certified under CLIA. (b) Except as specified in... a given specialty, subspecialty, analyte or test, as defined in this section, or fails to take...

  15. Interest and limits of the six sigma methodology in medical laboratory.

    PubMed

    Scherrer, Florian; Bouilloux, Jean-Pierre; Calendini, Ors'Anton; Chamard, Didier; Cornu, François

    2017-02-01

The mandatory accreditation of clinical laboratories in France provides an incentive to develop practical tools for measuring the performance of management methods and to optimize the management of internal quality controls. The six sigma methodology is an approach commonly applied to software quality management and discussed in numerous publications. This paper discusses the primary factors that influence the sigma index (the choice of the total allowable error, the approach used to address bias) and compares the performance of different analyzers on the basis of the sigma index. The six sigma strategy can be applied to the management of internal quality control policy in a laboratory; a comparison of four analyzers demonstrates that no single analyzer is superior in clinical chemistry. Similar sigma results are obtained whether bias is estimated from EQAS or from IQC data. The main difficulty in using the six sigma methodology lies in the absence of official guidelines for defining the total allowable error. Despite this drawback, our comparison study suggests that difficulties with particular analytes do not vary with the analyzer used.

  16. Selection and application of microbial source tracking tools for water-quality investigations

    USGS Publications Warehouse

    Stoeckel, Donald M.

    2005-01-01

    Microbial source tracking (MST) is a complex process that includes many decision-making steps. Once a contamination problem has been defined, the potential user of MST tools must thoroughly consider study objectives before deciding upon a source identifier, a detection method, and an analytical approach to apply to the problem. Regardless of which MST protocol is chosen, underlying assumptions can affect the results and interpretation. It is crucial to incorporate tests of those assumptions in the study quality-control plan to help validate results and facilitate interpretation. Detailed descriptions of MST objectives, protocols, and assumptions are provided in this report to assist in selection and application of MST tools for water-quality investigations. Several case studies illustrate real-world applications of MST protocols over a range of settings, spatial scales, and types of contamination. Technical details of many available source identifiers and detection methods are included as appendixes. By use of this information, researchers should be able to formulate realistic expectations for the information that MST tools can provide and, where possible, successfully execute investigations to characterize sources of fecal contamination to resource waters.

  17. Pre-Kindergarten: Research-Based Recommendations for Developing Standards and Factors Contributing to School Readiness Gaps. Information Capsule. Volume 1201

    ERIC Educational Resources Information Center

    Blazer, Christie

    2012-01-01

    States across the country are developing pre-kindergarten standards that articulate expectations for preschoolers' learning and development and define the manner in which services will be provided. There are two different types of standards: student outcome standards and program standards. Student outcome standards define the knowledge and skills…

  18. Preentry communications study. Outer planets atmospheric entry probe

    NASA Technical Reports Server (NTRS)

    Hinrichs, C. A.

    1976-01-01

    A pre-entry communications study is presented for a relay link between a Jupiter entry probe and a spacecraft in hyperbolic orbit. Two generic communications links of interest are described: a pre-entry link to a spun spacecraft antenna, and a pre-entry link to a despun spacecraft antenna. The propagation environment of Jupiter is defined. Although this is one of the least well known features of Jupiter, enough information exists to reasonably establish bounds on the performance of a communications link. Within these bounds, optimal carrier frequencies are defined. The next step is to identify optimal relative geometries between the probe and the spacecraft. Optimal trajectories are established for both spun and despun spacecraft antennas. Given the optimal carrier frequencies, and the optimal trajectories, the data carrying capacities of the pre-entry links are defined. The impact of incorporating pre-entry communications into a basic post entry probe is then assessed. This assessment covers the disciplines of thermal control, power source, mass properties and design layout. A conceptual design is developed of an electronically despun antenna for use on a Pioneer class of spacecraft.

  19. Quality and Characteristics of the North Carolina Pre-Kindergarten Program: 2011-2012 Statewide Evaluation

    ERIC Educational Resources Information Center

    Peisner-Feinberg, Ellen; Schaaf, Jennifer; Hildebrandt, Lisa; LaForett, Dore

    2013-01-01

    The North Carolina Pre-Kindergarten Program (NC Pre-K) is a state-funded initiative for at-risk 4-year-olds, designed to provide a high quality, classroom-based educational program during the year prior to kindergarten entry. Children are eligible for NC Pre-K based on age, family income (at or below 75% of state median income), and other risk…

  20. Defining dignity in terminally ill cancer patients: a factor-analytic approach.

    PubMed

    Hack, Thomas F; Chochinov, Harvey Max; Hassard, Thomas; Kristjanson, Linda J; McClement, Susan; Harlos, Mike

    2004-10-01

The construct of 'dignity' is frequently raised in discussions about quality end-of-life care for terminal cancer patients, and is invoked by parties on both sides of the euthanasia debate. Lacking in this general debate has been an empirical explication of 'dignity' from the viewpoint of cancer patients themselves. The purpose of the present study was to use factor-analytic and regression methods to analyze dignity data gathered from 213 cancer patients having less than 6 months to live. Patients rated their sense of dignity, and completed measures of symptom distress and psychological well-being. The results showed that although the majority of patients had an intact sense of dignity, 99 (46%) patients reported at least some, or occasional, loss of dignity, and 16 (7.5%) patients indicated that loss of dignity was a significant problem. The exploratory factor analysis yielded six primary factors: (1) Pain; (2) Intimate Dependency; (3) Hopelessness/Depression; (4) Informal Support Network; (5) Formal Support Network; and (6) Quality of Life. Subsequent regression analyses of modifiable factors produced a final two-factor (Hopelessness/Depression and Intimate Dependency) model of statistical significance. These results provide empirical support for the dignity model, and suggest that the provision of end-of-life care should include methods for treating depression, fostering hope, and facilitating functional independence.
