Sample records for "validate analytical procedures"

  1. Analytical procedure validation and the quality by design paradigm.

    PubMed

    Rozet, Eric; Lebrun, Pierre; Michiels, Jean-François; Sondag, Perceval; Scherder, Tara; Boulanger, Bruno

    2015-01-01

    Since the adoption of the ICH Q8 document concerning the development of pharmaceutical processes following a quality by design (QbD) approach, there have been many discussions on the opportunity for analytical procedure development to follow a similar approach. While the development and optimization of analytical procedures following QbD principles have been widely discussed and described, the place of analytical procedure validation in this framework has not been clarified. This article aims to show that analytical procedure validation is fully integrated into the QbD paradigm and is an essential step in developing analytical procedures that are effectively fit for purpose. Adequate statistical methodologies also have their role to play, such as design of experiments, statistical modeling, and probabilistic statements. The outcome of analytical procedure validation is also an analytical procedure design space, from which the control strategy can be set.

  2. Harmonization of strategies for the validation of quantitative analytical procedures. A SFSTP proposal--Part I.

    PubMed

    Hubert, Ph; Nguyen-Huu, J-J; Boulanger, B; Chapuzet, E; Chiap, P; Cohen, N; Compagnon, P-A; Dewé, W; Feinberg, M; Lallier, M; Laurentie, M; Mercier, N; Muzard, G; Nivet, C; Valat, L

    2004-11-15

    This paper is the first part of a summary report of a new commission of the Société Française des Sciences et Techniques Pharmaceutiques (SFSTP). The main objective of this commission was the harmonization of approaches for the validation of quantitative analytical procedures. Indeed, the principle of validating these procedures is now widespread in all domains of activity where measurements are made. Nevertheless, the simple question of whether an analytical procedure is acceptable for a given application remains incompletely resolved in several cases, despite the various regulations relating to good practices (GLP, GMP, ...) and other documents of a normative character (ISO, ICH, FDA, ...). There are many official documents describing the validation criteria to be tested, but they do not propose any experimental protocol and most often limit themselves to general concepts. For those reasons, two previous SFSTP commissions elaborated validation guides to concretely help the industrial scientists in charge of drug development to apply those regulatory recommendations. While these two first guides contributed widely to the use and progress of analytical validations, they nevertheless present weaknesses regarding the conclusions of the statistical tests performed and the decisions to be made with respect to the acceptance limits defined by the use of an analytical procedure. The present paper proposes to revisit the very bases of analytical validation in order to develop a harmonized approach, notably by distinguishing diagnosis rules from decision rules. The decision rule is based on the use of the accuracy profile and the notion of total error, and it simplifies the validation of an analytical procedure while controlling the risk associated with its use. Thanks to this novel validation approach, it is possible to unambiguously demonstrate the fitness for purpose of a new method as stated in all regulatory documents.
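
    As a concrete illustration of the accuracy profile and total error concepts behind this decision rule, the following sketch computes, for hypothetical replicate data at each concentration level, the relative bias and a simple β-expectation tolerance interval, and checks whether the interval stays within ±λ acceptance limits. The one-sample interval form used here (mean ± t·sqrt(1 + 1/n)·s) is a simplification of the full SFSTP calculation, which separates repeatability from intermediate precision; all numbers are invented.

      # Minimal accuracy-profile sketch: per-level relative bias and a
      # beta-expectation tolerance interval compared against acceptance limits.
      import numpy as np
      from scipy import stats

      beta = 0.95    # proportion of future results the interval should contain
      lam = 15.0     # acceptance limits, percent (hypothetical +/-15%)

      # hypothetical validation data: nominal level -> replicate measurements
      levels = {
          10.0: [9.6, 10.3, 9.9, 10.4, 10.1],
          25.0: [24.1, 25.6, 24.8, 25.9, 24.5],
          50.0: [48.2, 51.0, 49.5, 50.8, 49.9],
      }

      for nominal, reps in levels.items():
          x = np.asarray(reps)
          n = x.size
          rel = 100.0 * (x - nominal) / nominal          # relative errors, %
          bias, s = rel.mean(), rel.std(ddof=1)
          k = stats.t.ppf(1 - (1 - beta) / 2, n - 1) * np.sqrt(1 + 1 / n)
          lo, hi = bias - k * s, bias + k * s
          ok = (lo > -lam) and (hi < lam)
          print(f"{nominal:5.1f}: bias {bias:+5.1f}%  interval [{lo:+6.1f}, {hi:+6.1f}]%  "
                f"{'inside' if ok else 'outside'} +/-{lam:.0f}%")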

  3. Validation of an advanced analytical procedure applied to the measurement of environmental radioactivity.

    PubMed

    Thanh, Tran Thien; Vuong, Le Quang; Ho, Phan Long; Chuong, Huynh Dinh; Nguyen, Vo Hoang; Tao, Chau Van

    2018-04-01

    In this work, an advanced analytical procedure was applied to calculate radioactivity in spiked water samples in a close-geometry gamma spectroscopy setup. It included the MCNP-CP code to calculate the coincidence summing correction factor (CSF). The CSF results were validated against a deterministic method using the ETNA code for both p-type HPGe detectors, showing good agreement between the two codes. Finally, the validity of the developed procedure was confirmed by a proficiency test in which the activities of various radionuclides were calculated. The radioactivity measurements made with both detectors using the advanced analytical procedure received "Accepted" status in the proficiency test. Copyright © 2018 Elsevier Ltd. All rights reserved.

  4. Validating Analytical Methods

    ERIC Educational Resources Information Center

    Ember, Lois R.

    1977-01-01

    The procedures utilized by the Association of Official Analytical Chemists (AOAC) to develop, evaluate, and validate analytical methods for the analysis of chemical pollutants are detailed. Methods validated by AOAC are used by the EPA and FDA in their enforcement programs and are granted preferential treatment by the courts. (BT)

  5. Validation protocol of analytical procedures for quantification of drugs in polymeric systems for parenteral administration: dexamethasone phosphate disodium microparticles.

    PubMed

    Martín-Sabroso, Cristina; Tavares-Fernandes, Daniel Filipe; Espada-García, Juan Ignacio; Torres-Suárez, Ana Isabel

    2013-12-15

    In this work, a protocol is developed to validate analytical procedures for the quantification of drug substances formulated in polymeric systems, covering both drug entrapped in the polymeric matrix (assay:content test) and drug released from the systems (assay:dissolution test). This protocol is applied to the validation of two isocratic HPLC analytical procedures for the analysis of dexamethasone phosphate disodium microparticles for parenteral administration. Preparation of authentic samples and artificially "spiked" and "unspiked" samples is described. Specificity (the ability to quantify dexamethasone phosphate disodium in the presence of constituents of the dissolution medium and other microparticle constituents), linearity, accuracy and precision are evaluated, in the range from 10 to 50 μg mL(-1) for the assay:content test procedure and from 0.25 to 10 μg mL(-1) for the assay:dissolution test procedure. The robustness of the analytical method used to extract drug from the microparticles is also assessed. The validation protocol developed allows us to conclude that both analytical methods are suitable for their intended purpose, although the lack of proportionality of the assay:dissolution analytical method should be taken into account. The validation protocol designed in this work could be applied to the validation of any analytical procedure for the quantification of drugs formulated in controlled-release polymeric microparticles. Copyright © 2013 Elsevier B.V. All rights reserved.
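
    A minimal sketch of the linearity and recovery checks described above, over the 10-50 μg/mL assay:content range; the calibration data are hypothetical and the evaluation is reduced to a determination coefficient and back-calculated recoveries.

      # Linearity check over the assay:content range (hypothetical data).
      import numpy as np

      conc = np.array([10, 20, 30, 40, 50], dtype=float)       # ug/mL
      area = np.array([152, 305, 449, 601, 748], dtype=float)  # peak areas

      slope, intercept = np.polyfit(conc, area, 1)
      pred = slope * conc + intercept
      r2 = 1 - ((area - pred) ** 2).sum() / ((area - area.mean()) ** 2).sum()
      recovery = 100 * ((area - intercept) / slope) / conc     # back-calculated, %
      print(f"slope {slope:.2f}, intercept {intercept:.1f}, R^2 = {r2:.4f}")
      print("recoveries (%):", np.round(recovery, 1))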

  6. Consistency of FMEA used in the validation of analytical procedures.

    PubMed

    Oldenhof, M T; van Leeuwen, J F; Nauta, M J; de Kaste, D; Odekerken-Rombouts, Y M C F; Vredenbregt, M J; Weda, M; Barends, D M

    2011-02-20

    In order to explore the consistency of the outcome of a Failure Mode and Effects Analysis (FMEA) in the validation of analytical procedures, an FMEA was carried out by two different teams. The two teams applied two separate FMEAs to a High Performance Liquid Chromatography-Diode Array Detection-Mass Spectrometry (HPLC-DAD-MS) analytical procedure used in the quality control of medicines. Each team was free to define its own ranking scales for the severity (S), occurrence (O), and detection (D) of failure modes. We calculated Risk Priority Numbers (RPNs) and identified failure modes above the 90th percentile of RPN values as needing urgent corrective action, and failure modes falling between the 75th and 90th percentiles as needing necessary corrective action. Team 1 and Team 2 identified five and six failure modes needing urgent corrective action, respectively, with two identified in common. Of the failure modes needing necessary corrective action, about a third were identified by both teams. These results show inconsistency in the outcome of the FMEA. To improve consistency, we recommend that FMEA always be carried out under the supervision of an experienced FMEA facilitator and that the FMEA team have at least two members with competence in the analytical method to be validated. However, the FMEAs of both teams contained valuable information that was not identified by the other team, indicating that this inconsistency is not always a drawback. Copyright © 2010 Elsevier B.V. All rights reserved.
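
    The percentile-based triage described above reduces to a few lines of code. In the sketch below, the failure modes and their S, O, and D scores are invented; RPN = S × O × D, with modes above the 90th percentile flagged for urgent action and those between the 75th and 90th percentiles for necessary action.

      # RPN triage sketch with hypothetical failure modes and scores.
      import numpy as np

      # (failure mode, severity, occurrence, detection) -- illustrative values
      modes = [
          ("wrong mobile phase",  7, 3, 4),
          ("column degradation",  5, 5, 3),
          ("mislabelled vial",    9, 2, 6),
          ("MS tune drift",       6, 4, 5),
          ("integration error",   4, 6, 2),
      ]

      rpns = np.array([s * o * d for _, s, o, d in modes])
      p90, p75 = np.percentile(rpns, [90, 75])

      for (name, *_), rpn in zip(modes, rpns):
          if rpn > p90:
              action = "urgent corrective action"
          elif rpn > p75:
              action = "necessary corrective action"
          else:
              action = "no action"
          print(f"{name:20s} RPN={rpn:3d} -> {action}")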

  7. Laboratory Analytical Procedures | Bioenergy | NREL

    Science.gov Websites

    Laboratory analytical procedures (LAPs) provide validated methods for biofuels and pyrolysis bio-oils research. Biomass Compositional Analysis: these lab procedures provide tested and accepted methods for performing

  8. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    PubMed

    Płotka-Wasylka, J

    2018-05-01

    A new means of assessing analytical protocols with respect to green analytical chemistry attributes has been developed. The new tool, called the Green Analytical Procedure Index (GAPI), evaluates the green character of an entire analytical methodology, from sample collection to final determination. It was created using tools such as the National Environmental Methods Index (NEMI) and the Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams is used to evaluate and quantify the environmental impact of each step of an analytical methodology, coloured from green through yellow to red to depict low, medium, and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples and in polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible overview to the user/reader but also offers exhaustive information on the evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.
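
    A toy rendition of the GAPI idea, assuming a three-colour low/medium/high impact score per stage of the methodology; the stage names and colours below are hypothetical, and the real GAPI pictogram has a defined set of fields rather than this flat list.

      # Hedged sketch: colour-coded per-stage environmental impact scoring.
      impact = {"green": 1, "yellow": 2, "red": 3}

      procedure = {
          "sample collection":   "yellow",
          "preservation":        "green",
          "transport/storage":   "green",
          "sample preparation":  "red",      # e.g. solvent-heavy extraction
          "final determination": "yellow",
      }

      total = sum(impact[c] for c in procedure.values())
      print(f"aggregate impact score: {total} "
            f"(min {len(procedure)}, max {3 * len(procedure)})")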

  9. Validating Analytical Protocols to Determine Selected Pesticides and PCBs Using Routine Samples.

    PubMed

    Pindado Jiménez, Oscar; García Alonso, Susana; Pérez Pastor, Rosa María

    2017-01-01

    This study aims to provide recommendations concerning the validation of analytical protocols using routine samples, and is intended as a case study of how to validate analytical methods in different environmental matrices. To analyze the selected compounds (pesticides and polychlorinated biphenyls) in two different environmental matrices, this work developed and validated two analytical procedures by GC-MS. A description is given of the validation of the two protocols through the analysis of more than 30 samples of water and sediments collected over nine months. The work also addresses the uncertainty associated with both analytical protocols. The uncertainty for the water samples was estimated through a conventional approach, whereas for the sediment matrices the estimation of proportional/constant bias was also included because of their inhomogeneity. Results for the sediment matrix are reliable, showing a 25-35% range of analytical variability associated with intermediate conditions. The analytical methodology for the water matrix determines the selected compounds with acceptable recoveries, and the combined uncertainty ranges between 20 and 30%. Analyzing routine samples is rarely used to assess the trueness of novel analytical methods, and until now this approach had not been applied to organochlorine compounds in environmental matrices.
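
    In the spirit of the combined-uncertainty figures quoted above, the sketch below combines an intermediate-precision component with a bias-related component by root-sum-of-squares and expands with k = 2; the input values are illustrative, not the paper's.

      # Relative combined uncertainty from precision and bias components.
      import math

      u_precision = 12.0  # relative standard uncertainty, intermediate precision, %
      u_bias      = 8.0   # relative standard uncertainty of the bias/recovery estimate, %

      u_combined = math.hypot(u_precision, u_bias)  # root-sum-of-squares
      U_expanded = 2.0 * u_combined                 # coverage factor k = 2 (~95 %)
      print(f"u_c = {u_combined:.1f} %, expanded U (k=2) = {U_expanded:.1f} %")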

  10. The transfer of analytical procedures.

    PubMed

    Ermer, J; Limberger, M; Lis, K; Wätzig, H

    2013-11-01

    Analytical method transfers are certainly among the most discussed topics in the GMP-regulated sector. However, they are regulated in surprisingly little detail. General information is provided in particular by USP, WHO, and ISPE. Most recently, the EU emphasized the importance of analytical transfer by including it in their draft of the revised GMP guideline. In this article, an overview and comparison of these guidelines is provided. The key to success for method transfers is excellent communication between the sending and receiving units. To facilitate this communication, procedures, flow charts, and checklists for responsibilities, success factors, transfer categories, the transfer plan and report, and strategies in case of failed transfers are provided here, together with tables of acceptance limits and a comprehensive glossary. Potential pitfalls are described so that they can be avoided. To assure an efficient and sustainable transfer of analytical procedures, a practically relevant and scientifically sound evaluation with corresponding acceptance criteria is crucial. Various strategies and statistical tools such as significance tests, absolute acceptance criteria, and equivalence tests are thoroughly described and compared in detail, with examples. Significance tests should be avoided: the success criterion is not statistical significance, but analytical relevance. Depending on a risk assessment of the analytical procedure in question, statistical equivalence tests are recommended, because they include both a practically relevant acceptance limit and direct control of the statistical risks. However, for lower-risk procedures, a simple comparison of the transfer performance parameters to absolute limits is also regarded as sufficient. Copyright © 2013 Elsevier B.V. All rights reserved.
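
    The equivalence-testing strategy recommended here can be illustrated with two one-sided t-tests (TOST) on the mean difference between sending and receiving units against a practically relevant acceptance limit. The data and the ±2% limit below are hypothetical; Welch's approximation is used for the degrees of freedom.

      # TOST equivalence sketch for a method transfer (hypothetical data).
      import numpy as np
      from scipy import stats

      sending   = np.array([99.8, 100.2, 99.5, 100.4, 99.9, 100.1])
      receiving = np.array([100.6, 100.9, 100.2, 101.0, 100.5, 100.8])
      theta = 2.0   # acceptance limit for the mean difference, % of label claim

      d = receiving.mean() - sending.mean()
      v1 = sending.var(ddof=1) / sending.size
      v2 = receiving.var(ddof=1) / receiving.size
      se = np.sqrt(v1 + v2)
      # Welch-Satterthwaite degrees of freedom
      df = (v1 + v2) ** 2 / (v1 ** 2 / (sending.size - 1) + v2 ** 2 / (receiving.size - 1))

      p_lower = stats.t.sf((d + theta) / se, df)   # H0: d <= -theta
      p_upper = stats.t.cdf((d - theta) / se, df)  # H0: d >= +theta
      verdict = "equivalent" if max(p_lower, p_upper) < 0.05 else "not shown equivalent"
      print(f"difference {d:+.2f}%, TOST p-values {p_lower:.3g} / {p_upper:.3g} -> {verdict}")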

  11. Risk analysis of analytical validations by probabilistic modification of FMEA.

    PubMed

    Barends, D M; Oldenhof, M T; Vredenbregt, M J; Nauta, M J

    2012-05-01

    Risk analysis is a valuable addition to the validation of an analytical chemistry process, enabling the detection not only of technical risks but also of risks related to human failure. Failure Mode and Effect Analysis (FMEA) can be applied, using a categorical risk scoring of the occurrence, detection, and severity of failure modes and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequencies while maintaining the categorical scoring of severity. In an example, the results of a traditional FMEA of a near-infrared (NIR) analytical procedure used for the screening of suspected counterfeit tablets are re-interpreted using this probabilistic modification. With this modification, the frequency of occurrence of undetected failure modes can be estimated quantitatively for each individual failure mode, for a set of failure modes, and for the full analytical procedure. Copyright © 2012 Elsevier B.V. All rights reserved.
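
    A minimal sketch of the proposed modification: occurrence and non-detection are expressed as estimated relative frequencies, so the frequency of an undetected failure is their product (independence between failure modes is assumed in this sketch), while severity stays categorical. All numbers are invented.

      # Probabilistic FMEA sketch with hypothetical frequencies.
      failure_modes = [
          # (name, p_occurrence per analysis, p_not_detected, severity 1-10)
          ("sample mix-up",       0.002, 0.10, 9),
          ("baseline drift",      0.050, 0.02, 4),
          ("wrong reference lib", 0.001, 0.50, 8),
      ]

      p_all_clear = 1.0
      for name, p_occ, p_miss, sev in failure_modes:
          p_undetected = p_occ * p_miss   # frequency of undetected occurrence
          p_all_clear *= (1 - p_undetected)
          print(f"{name:20s} P(undetected) = {p_undetected:.2e}  severity {sev}")

      print(f"P(>=1 undetected failure per analysis) = {1 - p_all_clear:.2e}")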

  12. The forensic validity of visual analytics

    NASA Astrophysics Data System (ADS)

    Erbacher, Robert F.

    2008-01-01

    The wider use of visualization and visual analytics in wide-ranging fields has led to the need for visual analytics capabilities to be legally admissible, especially when applied to digital forensics. This brings the need to consider legal implications when performing visual analytics, an issue not traditionally examined in visualization and visual analytics techniques and research. While digital data is generally admissible under the Federal Rules of Evidence [10][21], a comprehensive validation of the digital evidence is considered prudent. A comprehensive validation requires validation of the digital data under the rules for authentication, hearsay, the best evidence rule, and privilege. Additional issues with digital data arise when exploring admissibility and the validity of what information was examined, to what extent, and whether the analysis process was sufficiently covered by a search warrant. For instance, a search warrant generally covers very narrow requirements as to what law enforcement is allowed to examine and acquire during an investigation. When searching a hard drive for child pornography, how admissible is evidence of an unrelated crime, e.g., drug dealing? This is further complicated by the concept of "in plain view": when performing an analysis of a hard drive, what would be considered "in plain view"? The purpose of this paper is to discuss the issues of digital forensics as they apply to visual analytics and to identify how visual analytics techniques fit into the digital forensics analysis process, how they can improve the legal admissibility of digital data, and what research is needed to further improve this process. The goal of this paper is to open up consideration of legal ramifications among the visualization community; the author is not a lawyer and the discussions are not meant to be inclusive of all differences in laws between states and

  13. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ekechukwu, A.

    This document provides a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers, and books reviewed is given in Appendix 1. Available validation documents and guides are listed in the appendix; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of validation at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all documents were published in English.

  14. 40 CFR 140.5 - Analytical procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 140.5 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS (CONTINUED) MARINE SANITATION DEVICE STANDARD § 140.5 Analytical procedures. In determining the composition and quality of effluent discharge from marine sanitation devices, the procedures contained in 40 CFR part 136...

  15. Multi-analyte validation in heterogeneous solution by ELISA.

    PubMed

    Lakshmipriya, Thangavel; Gopinath, Subash C B; Hashim, Uda; Murugaiyah, Vikneswaran

    2017-12-01

    Enzyme-linked immunosorbent assay (ELISA) is a standard assay widely used to validate the presence of an analyte in solution. With the advancement of ELISA, different strategies have been demonstrated, making it a suitable immunoassay for a wide range of analytes. Herein, we provide additional evidence of the suitability of ELISA for multi-analyte detection. Three clinically relevant targets were chosen: the 16 kDa protein from Mycobacterium tuberculosis, human blood clotting Factor IXa, and the tumour marker squamous cell carcinoma (SCC) antigen. We adapted the routine steps of the conventional ELISA to validate the occurrence of the analytes in both homogeneous and heterogeneous solutions, attaining sensitivities of 2, 8, and 1 nM for the 16 kDa protein, FIXa, and SCC antigen, respectively. Furthermore, specific multi-analyte validation was evidenced with similar sensitivities in the presence of human serum. The ELISA in this study has proven its applicability to genuine multi-target validation in heterogeneous solution and can be followed for other target validations. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ekechukwu, A

    Method validation is the process of evaluating whether an analytical method is acceptable for its intended purpose. For pharmaceutical methods, guidelines from the United States Pharmacopeia (USP), the International Conference on Harmonisation (ICH), and the United States Food and Drug Administration (USFDA) provide a framework for performing such validations. In general, methods for regulatory compliance must include studies on specificity, linearity, accuracy, precision, range, detection limit, quantitation limit, and robustness. Elements of these guidelines are readily adapted to the issue of validation for beryllium sampling and analysis. This document provides a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers and books reviewed is given in the Appendix. Available validation documents and guides are listed therein; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of the validation process at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all referenced documents were published in English.

  17. Risk analysis by FMEA as an element of analytical validation.

    PubMed

    van Leeuwen, J F; Nauta, M J; de Kaste, D; Odekerken-Rombouts, Y M C F; Oldenhof, M T; Vredenbregt, M J; Barends, D M

    2009-12-05

    We subjected a Near-Infrared (NIR) analytical procedure used for screening drugs on authenticity to a Failure Mode and Effects Analysis (FMEA), including technical risks as well as risks related to human failure. An FMEA team broke down the NIR analytical method into process steps and identified possible failure modes for each step. Each failure mode was ranked on estimated frequency of occurrence (O), probability that the failure would remain undetected later in the process (D) and severity (S), each on a scale of 1-10. Human errors turned out to be the most common cause of failure modes. Failure risks were calculated by Risk Priority Numbers (RPNs)=O x D x S. Failure modes with the highest RPN scores were subjected to corrective actions and the FMEA was repeated, showing reductions in RPN scores and resulting in improvement indices up to 5.0. We recommend risk analysis as an addition to the usual analytical validation, as the FMEA enabled us to detect previously unidentified risks.

  18. The Validity of Selection and Classification Procedures for Predicting Job Performance.

    DTIC Science & Technology

    1987-04-01

    [OCR-garbled report front matter; recoverable contents include sections on alternative selection procedures, meta-analyses of validities, and meta-analytic comparisons of General Aptitude Test Battery (GATB) composites such as general maintenance, general science, cognitive ability, and motor coordination.]

  19. Standardization, evaluation and early-phase method validation of an analytical scheme for batch-consistency N-glycosylation analysis of recombinant produced glycoproteins.

    PubMed

    Zietze, Stefan; Müller, Rainer H; Brecht, René

    2008-03-01

    In order to set up a batch-to-batch consistency analytical scheme for N-glycosylation analysis, several sample preparation steps, including enzyme digestions and fluorophore labelling, and two HPLC methods were established. The whole method scheme was standardized, evaluated, and validated according to the requirements for analytical testing in early clinical drug development, using a recombinantly produced reference glycoprotein (RGP). The standardization of the methods was achieved through clearly defined standard operating procedures. During evaluation of the methods, the major interest was the determination of oligosaccharide losses within the analytical scheme. Validation of the methods was performed with respect to specificity, linearity, repeatability, LOD, and LOQ. Because reference N-glycan standards were not available, a statistical approach was chosen to derive accuracy from the linearity data. After finishing the validation procedure, defined limits for method variability could be calculated, and differences observed in consistency analysis could be separated into significant and incidental ones.

  20. Assessing Procedural Competence: Validity Considerations.

    PubMed

    Pugh, Debra M; Wood, Timothy J; Boulet, John R

    2015-10-01

    Simulation-based medical education (SBME) offers opportunities for trainees to learn how to perform procedures and to be assessed in a safe environment. However, SBME research studies often lack robust evidence to support the validity of the interpretation of results obtained from tools used to assess trainees' skills. The purpose of this paper is to describe how a validity framework can be applied when reporting and interpreting the results of a simulation-based assessment of skills related to performing procedures. The authors discuss various sources of validity evidence as they relate to SBME. A case study is presented.

  1. Development of an Analytical Procedure for the Determination of Multiclass Compounds for Forensic Veterinary Toxicology.

    PubMed

    Sell, Bartosz; Sniegocki, Tomasz; Zmudzki, Jan; Posyniak, Andrzej

    2018-04-01

    Reported here is a new analytical multiclass method based on the QuEChERS technique, which has proven effective in diagnosing fatal poisoning cases in animals. The method was developed for the determination of analytes in liver samples, comprising rodenticides, carbamate and organophosphorus pesticides, coccidiostats, and mycotoxins. The procedure entails the addition of acetonitrile and sodium acetate to 2 g of homogenized liver sample. The mixture was shaken intensively and centrifuged for phase separation, after which the organic phase was transferred into a tube containing sorbents (PSA and C18) and magnesium sulfate, centrifuged again, and the supernatant was filtered and analyzed by liquid chromatography tandem mass spectrometry. A validation of the procedure was performed. Repeatability variation coefficients <15% were achieved for most of the analyzed substances. The analytical conditions allowed successful separation of a variety of poisons, with typical screening detection limits at ≤10 μg/kg. The method was used to investigate more than 100 animal poisoning incidents and proved useful in animal forensic toxicology casework.

  2. Optimization and validation of moving average quality control procedures using bias detection curves and moving average validation charts.

    PubMed

    van Rossum, Huub H; Kemperman, Hans

    2017-02-01

    To date, no practical tools have been available for obtaining optimal settings for the moving average (MA) as a continuous analytical quality control instrument, and the true bias detection properties of applied MA procedures are unknown. We describe the use of bias detection curves for MA optimization and of MA validation charts for the validation of MA. MA optimization was performed on a data set of previously obtained consecutive assay results. Bias introduction and MA bias detection were simulated for multiple MA procedures (combinations of truncation limits, calculation algorithms, and control limits) and for various biases. Bias detection curves were generated by plotting the median number of test results needed for bias detection against the simulated introduced bias. MA validation charts show the minimum, median, and maximum numbers of assay results required for MA bias detection for various biases. Their use was demonstrated for sodium, potassium, and albumin. Bias detection curves allowed optimization of MA settings by graphical comparison of the bias detection properties of multiple MA procedures; the optimal MA was selected based on the bias detection characteristics obtained. MA validation charts were generated for the selected optimal MA procedures and provided insight into the range of results required for MA bias detection. Bias detection curves and MA validation charts are useful tools for the optimization and validation of MA procedures.
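
    A simulation sketch of one point on a bias detection curve, under assumptions not taken from the paper: normally distributed results, a constant injected bias, a simple truncated moving average, and fixed control limits. The median of the per-run counts corresponds to the quantity plotted in a bias detection curve; the distribution, truncation, window, and limit settings are all illustrative.

      # Median number of results needed for MA bias detection, by bias size.
      import numpy as np

      rng = np.random.default_rng(1)
      mean, sd = 140.0, 3.0          # hypothetical sodium-like results, mmol/L
      trunc = (130.0, 150.0)         # truncation limits
      window = 20                    # moving-average window
      limits = (138.5, 141.5)        # MA control limits

      def results_to_detection(bias, n_max=5000):
          buf = []
          for i in range(1, n_max + 1):
              x = rng.normal(mean + bias, sd)
              if trunc[0] <= x <= trunc[1]:      # truncate extreme results
                  buf.append(x)
              if len(buf) >= window:
                  ma = np.mean(buf[-window:])
                  if not (limits[0] <= ma <= limits[1]):
                      return i                   # results seen until alarm
          return n_max

      for bias in (1.0, 2.0, 4.0):
          runs = [results_to_detection(bias) for _ in range(200)]
          print(f"bias {bias:+.1f}: median {int(np.median(runs))}, "
                f"min {min(runs)}, max {max(runs)} results to detection")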

  3. Validation of biomarkers to predict response to immunotherapy in cancer: Volume I - pre-analytical and analytical validation.

    PubMed

    Masucci, Giuseppe V; Cesano, Alessandra; Hawtin, Rachael; Janetzki, Sylvia; Zhang, Jenny; Kirsch, Ilan; Dobbin, Kevin K; Alvarez, John; Robbins, Paul B; Selvan, Senthamil R; Streicher, Howard Z; Butterfield, Lisa H; Thurin, Magdalena

    2016-01-01

    Immunotherapies have emerged as one of the most promising approaches to treat patients with cancer. Recently, there have been many clinical successes using checkpoint receptor blockade, including T cell inhibitory receptors such as cytotoxic T-lymphocyte-associated antigen 4 (CTLA-4) and programmed cell death-1 (PD-1). Despite demonstrated successes in a variety of malignancies, responses typically occur in only a minority of patients in any given histology. Additionally, treatment is associated with inflammatory toxicity and high cost. Therefore, determining which patients would derive clinical benefit from immunotherapy is a compelling clinical question. Although numerous candidate biomarkers have been described, there are currently three FDA-approved assays based on PD-1 ligand (PD-L1) expression that have been clinically validated to identify patients who are more likely to benefit from single-agent anti-PD-1/PD-L1 therapy. Because of the complexity of the immune response and tumor biology, it is unlikely that a single biomarker will be sufficient to predict clinical outcomes in response to immune-targeted therapy. Rather, the integration of multiple tumor and immune response parameters, such as protein expression, genomics, and transcriptomics, may be necessary for accurate prediction of clinical benefit. Before a candidate biomarker and/or new technology can be used in a clinical setting, several steps are necessary to demonstrate its clinical validity. Although regulatory guidelines provide general roadmaps for the validation process, their applicability to biomarkers in the cancer immunotherapy field is somewhat limited. Thus, Working Group 1 (WG1) of the Society for Immunotherapy of Cancer (SITC) Immune Biomarkers Task Force convened to address this need. In this two-volume series, we discuss pre-analytical and analytical (Volume I) as well as clinical and regulatory (Volume II) aspects of the validation process as applied to predictive biomarkers.

  4. MICROORGANISMS IN BIOSOLIDS: ANALYTICAL METHODS DEVELOPMENT, STANDARDIZATION, AND VALIDATION

    EPA Science Inventory

    The objective of this presentation is to discuss pathogens of concern in biosolids, the analytical techniques used to evaluate microorganisms in biosolids, and to discuss standardization and validation of analytical protocols for microbes within such a complex matrix. Implicatio...

  5. Digital forensics: an analytical crime scene procedure model (ACSPM).

    PubMed

    Bulbul, Halil Ibrahim; Yavuzcan, H Guclu; Ozel, Mesut

    2013-12-10

    In order to ensure that digital evidence is collected, preserved, examined, or transferred in a manner that safeguards its accuracy and reliability, law enforcement and digital forensic units must establish and maintain an effective quality assurance system. The very first part of this system is standard operating procedures (SOPs) and/or models conforming to chain-of-custody requirements, which rely on the digital forensics "process-phase-procedure-task-subtask" sequence. An acceptable and thorough digital forensics (DF) process depends on sequential DF phases, each phase depends on sequential DF procedures, and each procedure depends on tasks and subtasks. There are numerous DF process models that define DF phases in the literature, but no DF model defining phase-based sequential procedures for the crime scene has been identified. The analytical crime scene procedure model (ACSPM) that we suggest in this paper is intended to fill this gap. The proposed analytical procedure model for digital investigations at a crime scene is developed and defined for crime scene practitioners, with the main focus on crime scene digital forensic procedures rather than on the whole digital investigation process and phases that end up in court. When reviewing the relevant literature and consulting with law enforcement agencies, only device-based charts specific to a particular device and/or more general approaches to digital evidence management models from crime scene to court were found. After analyzing the needs of law enforcement organizations and realizing the absence of a crime scene digital investigation procedure model, we inspected the relevant literature in an analytical way. The outcome of this inspection is the model suggested here, which is intended to provide guidance for the thorough and secure implementation of digital forensic procedures at a crime scene. In digital forensic

  6. Assessment of passive drag in swimming by numerical simulation and analytical procedure.

    PubMed

    Barbosa, Tiago M; Ramos, Rui; Silva, António J; Marinho, Daniel A

    2018-03-01

    The aim was to compare passive drag during underwater gliding as estimated by numerical simulation and by an analytical procedure. An Olympic swimmer was scanned by computed tomography and modelled gliding at a 0.75-m depth in the streamlined position. Steady-state computational fluid dynamics (CFD) analyses were performed in Fluent, and a set of analytical procedures was selected concurrently. Friction drag (Df), pressure drag (Dpr), total passive drag force (Df+pr) and the drag coefficient (CD) were computed between 1.3 and 2.5 m/s by both techniques. Df+pr ranged from 45.44 to 144.06 N with CFD and from 46.03 to 167.06 N with the analytical procedure (differences: 1.28% to 13.77%). CD ranged between 0.698 and 0.622 by CFD and between 0.657 and 0.644 by the analytical procedures (differences: 0.40-6.30%). Linear regression models showed a very high association for Df+pr plotted in absolute values (R² = 0.98) and after log-log transformation (R² = 0.99). CD also obtained a very high adjustment for both absolute (R² = 0.97) and log-log plots (R² = 0.97). The bias for Df+pr was 8.37 N, and 0.076 N after logarithmic transformation. Df represented between 15.97% and 18.82% of Df+pr by CFD, and between 14.66% and 16.21% by the analytical procedures. Therefore, despite the bias, analytical procedures offer a feasible way of gaining insight into a swimmer's hydrodynamic characteristics.
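
    The quoted CFD figures can be related through the standard drag equation D = ½ρACDv². The sketch below backs out the implied reference area at both ends of the tested velocity range, taking ρ = 1000 kg/m³; the paper's exact reference area is not given here, so this is a consistency check on the quoted numbers, not a reproduction of the study.

      # Implied reference area from the quoted CFD drag and drag coefficients.
      rho = 1000.0   # assumed water density, kg/m^3

      # (velocity m/s, total drag N, drag coefficient) from the abstract's CFD values
      for v, D, Cd in [(1.3, 45.44, 0.698), (2.5, 144.06, 0.622)]:
          A = 2.0 * D / (rho * Cd * v ** 2)   # from D = 0.5*rho*A*Cd*v^2
          print(f"v = {v} m/s: implied reference area A = {A:.3f} m^2")
      # A near-constant A across speeds indicates the quoted D and Cd are consistent.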

  7. The analytical validation of the Oncotype DX Recurrence Score assay

    PubMed Central

    Baehner, Frederick L

    2016-01-01

    In vitro diagnostic multivariate index assays are highly complex molecular assays that can provide clinically actionable information regarding the underlying tumour biology and facilitate personalised treatment. These assays are only useful in clinical practice if all of the following are established: analytical validation (i.e., how accurately/reliably the assay measures the molecular characteristics), clinical validation (i.e., how consistently/accurately the test detects/predicts the outcomes of interest), and clinical utility (i.e., how likely the test is to significantly improve patient outcomes). In considering the use of these assays, clinicians often focus primarily on the clinical validity/utility; however, the analytical validity of an assay (e.g., its accuracy, reproducibility, and standardisation) should also be evaluated and carefully considered. This review focuses on the rigorous analytical validation and performance of the Oncotype DX® Breast Cancer Assay, which is performed at the Central Clinical Reference Laboratory of Genomic Health, Inc. The assay process includes tumour tissue enrichment (if needed), RNA extraction, gene expression quantitation (using a gene panel consisting of 16 cancer genes plus 5 reference genes and quantitative real-time RT-PCR), and an automated computer algorithm to produce a Recurrence Score® result (scale: 0–100). This review presents evidence showing that the Recurrence Score result reported for each patient falls within a tight clinically relevant confidence interval. Specifically, the review discusses how the development of the assay was designed to optimise assay performance, presents data supporting its analytical validity, and describes the quality control and assurance programmes that ensure optimal test performance over time. PMID:27729940

  9. In-Trail Procedure Air Traffic Control Procedures Validation Simulation Study

    NASA Technical Reports Server (NTRS)

    Chartrand, Ryan C.; Hewitt, Katrin P.; Sweeney, Peter B.; Graff, Thomas J.; Jones, Kenneth M.

    2012-01-01

    In August 2007, Airservices Australia (Airservices) and the United States National Aeronautics and Space Administration (NASA) conducted a validation experiment of the air traffic control (ATC) procedures associated with the Automatic Dependent Surveillance-Broadcast (ADS-B) In-Trail Procedure (ITP). ITP is an Airborne Traffic Situation Awareness (ATSA) application designed for near-term use in procedural airspace, in which ADS-B data are used to facilitate climb and descent maneuvers. NASA and Airservices conducted the experiment in Airservices' simulator in Melbourne, Australia. Twelve current operational air traffic controllers participated in the experiment, which identified aspects of the ITP that could be improved (mainly in the communication and controller approval process). Results showed that controllers viewed the ITP as valid and acceptable. This paper describes the experiment design and results.

  10. Multicriteria decision analysis in ranking of analytical procedures for aldrin determination in water.

    PubMed

    Tobiszewski, Marek; Orłowski, Aleksander

    2015-03-27

    The study presents the possibility of applying multi-criteria decision analysis (MCDA) when choosing analytical procedures with low environmental impact. A type of MCDA, the Preference Ranking Organization Method for Enrichment Evaluations (PROMETHEE), was chosen as a versatile tool that meets all the requirements of analytical chemists as decision makers. Twenty-five analytical procedures for aldrin determination in water samples (as an example) were selected as input alternatives for the MCDA analysis. Nine different criteria describing the alternatives were chosen from different groups: metrological, economic and, most importantly, environmental impact. The weights for each criterion were obtained from questionnaires sent to experts, giving three different scenarios for the MCDA results. The results of the analysis show that PROMETHEE is a very promising tool for choosing an analytical procedure with respect to its greenness. The rankings for all three scenarios placed solid-phase microextraction and liquid-phase microextraction based procedures high, while liquid-liquid extraction, solid-phase extraction and stir bar sorptive extraction based procedures were placed low in the ranking. The results show that although some of the experts do not intentionally choose green analytical chemistry procedures, their MCDA choices are in accordance with green chemistry principles. The PROMETHEE ranking results were compared with the more widely accepted green analytical chemistry tools NEMI and Eco-Scale. As PROMETHEE involves more factors than NEMI, the assessment results were only weakly correlated. In contrast, the results of the Eco-Scale assessment were well correlated, as both methodologies involve similar assessment criteria. Copyright © 2015 Elsevier B.V. All rights reserved.
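
    A minimal PROMETHEE II sketch with the "usual" (strict) preference function: a handful of procedures scored on three lower-is-better criteria, criterion weights standing in for the expert questionnaires, and net outranking flows φ = φ⁺ − φ⁻. The alternatives, scores, and weights are invented, so the resulting order is illustrative only; the study itself used 25 procedures and nine criteria.

      # PROMETHEE II sketch with hypothetical alternatives and weights.
      import numpy as np

      alts = ["SPME-GC", "LLE-GC", "SPE-GC", "SBSE-GC"]
      # criteria (all lower-is-better here): LOD, cost, solvent use
      X = np.array([
          [0.05, 10.0,  1.0],   # SPME
          [0.02, 15.0, 60.0],   # LLE
          [0.03, 20.0, 15.0],   # SPE
          [0.04, 35.0,  5.0],   # SBSE
      ])
      w = np.array([0.4, 0.2, 0.4])   # hypothetical expert weights, sum to 1

      n = len(alts)
      phi = np.zeros(n)
      for i in range(n):
          for j in range(n):
              if i == j:
                  continue
              # "usual" preference function: full weight where i strictly beats j
              pref_ij = w[X[i] < X[j]].sum()
              pref_ji = w[X[j] < X[i]].sum()
              phi[i] += (pref_ij - pref_ji) / (n - 1)

      for name, f in sorted(zip(alts, phi), key=lambda t: -t[1]):
          print(f"{name:8s} net flow {f:+.2f}")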

  11. Procedures For Microbial-Ecology Laboratory

    NASA Technical Reports Server (NTRS)

    Huff, Timothy L.

    1993-01-01

    The Microbial Ecology Laboratory Procedures Manual provides concise and well-defined instructions on routine technical procedures to be followed in the microbiological laboratory to ensure safety, analytical control, and validity of results.

  12. An analytical procedure to assist decision-making in a government research organization

    Treesearch

    H. Dean Claxton; Giuseppe Rensi

    1972-01-01

    An analytical procedure to help management decision-making in planning government research is described. The objectives, activities, and restrictions of a government research organization are modeled in a consistent analytical framework. Theory and methodology are drawn from economics and mathematical programming. The major analytical aspects distinguishing research...

  13. An analytic survey of signing inventory procedures in Virginia.

    DOT National Transportation Integrated Search

    1972-01-01

    An analytic survey was made of the highway signing and sign-maintenance inventory systems in each of the districts of the Virginia Department of Highways. Of particular concern in reviewing the procedures was the format of the inventory forms, the ap...

  14. Analytical difficulties facing today's regulatory laboratories: issues in method validation.

    PubMed

    MacNeil, James D

    2012-08-01

    The challenges facing analytical laboratories today are not unlike those faced in the past, although both the degree of complexity and the rate of change have increased. Challenges such as the development and maintenance of expertise, the maintenance and updating of equipment, and the introduction of new test methods have always been familiar themes for analytical laboratories, but international guidelines for laboratories involved in import and export testing of food require that such changes be managed in a context that includes quality assurance, accreditation, and method validation considerations. Decisions as to when a change in a method requires re-validation, or on the design of a validation scheme for a complex multi-residue method, require a well-considered strategy based on current knowledge of international guidance documents and regulatory requirements, as well as the laboratory's quality system requirements. Validation demonstrates that a method is 'fit for purpose', so the requirement for validation should be assessed in terms of the intended use of a method and, in the case of a change or modification, whether that change may affect a previously validated performance characteristic. In general, method validation involves method scope, calibration-related parameters, method precision, and recovery. Any method change which may affect method scope or any performance parameter will require re-validation. Some typical situations involving method changes are discussed, and a decision process is proposed for selecting appropriate validation measures. © 2012 John Wiley & Sons, Ltd.

  15. Analytic Validation of Immunohistochemical Assays: A Comparison of Laboratory Practices Before and After Introduction of an Evidence-Based Guideline.

    PubMed

    Fitzgibbons, Patrick L; Goldsmith, Jeffrey D; Souers, Rhona J; Fatheree, Lisa A; Volmar, Keith E; Stuart, Lauren N; Nowak, Jan A; Astles, J Rex; Nakhleh, Raouf E

    2017-09-01

    Laboratories must demonstrate analytic validity before any test can be used clinically, but studies have shown inconsistent practices in immunohistochemical assay validation. To assess changes in immunohistochemistry analytic validation practices after publication of an evidence-based laboratory practice guideline, a survey on current immunohistochemistry assay validation practices and on the awareness and adoption of the recently published guideline was sent to subscribers enrolled in one of 3 relevant College of American Pathologists proficiency testing programs and to additional nonsubscribing laboratories that perform immunohistochemical testing, and the results were compared with an earlier survey of validation practices. Analysis was based on responses from 1085 laboratories that perform immunohistochemical staining. Of 1057 responses, 65.4% (691) were aware of the guideline recommendations before this survey was sent, and 79.9% (550 of 688) of those had already adopted some or all of the recommendations. Compared with the 2010 survey, a significant number of laboratories now have written validation procedures for both predictive and nonpredictive marker assays and specifications for the minimum numbers of cases needed for validation. There was also significant improvement in compliance with validation requirements, with 99% (100 of 102) having validated their most recently introduced predictive marker assay, compared with 74.9% (326 of 435) in 2010. The difficulty in finding validation cases for rare antigens and resource limitations were cited as the biggest challenges in implementing the guideline. Dissemination of the 2014 evidence-based guideline had a positive impact on laboratory validation practices; some or all of the recommendations have been adopted by nearly 80% of respondents.

  16. 40 CFR 87.82 - Sampling and analytical procedures for measuring smoke exhaust emissions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) Definitions. Test Procedures for Engine Smoke Emissions (Aircraft Gas Turbine Engines) § 87.82 Sampling and analytical procedures for measuring smoke exhaust...

  17. A Generalized Pivotal Quantity Approach to Analytical Method Validation Based on Total Error.

    PubMed

    Yang, Harry; Zhang, Jianchun

    2015-01-01

    The primary purpose of method validation is to demonstrate that the method is fit for its intended use. Traditionally, an analytical method is deemed valid if its performance characteristics, such as accuracy and precision, are shown to meet prespecified acceptance criteria. However, these acceptance criteria are not directly related to the method's intended purpose, which is usually a guarantee that a high percentage of the test results of future samples will be close to their true values. Alternative "fit for purpose" acceptance criteria based on the concept of total error have been increasingly used. Such criteria allow for assessing method validity taking into account the relationship between accuracy and precision. Although several statistical test methods have been proposed in the literature to test the "fit for purpose" hypothesis, the majority of these methods are not designed to protect against the risk of accepting unsuitable methods, and thus have the potential to cause uncontrolled consumer's risk. In this paper, we propose a test method based on generalized pivotal quantity inference. Through simulation studies, the performance of the method is compared to five existing approaches. The results show that both the new method and the method based on a β-content tolerance interval with a confidence level of 90%, hereafter referred to as the β-content (0.9) method, control Type I error and thus consumer's risk, while the other existing methods do not. It is further demonstrated that the generalized pivotal quantity method is less conservative than the β-content (0.9) method when the analytical methods are biased, whereas it is more conservative when the analytical methods are unbiased. Therefore, the choice between the generalized pivotal quantity and β-content (0.9) methods for an analytical method validation depends on the accuracy of the analytical method. It is also shown that the generalized pivotal quantity method has better asymptotic properties than all of the current
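
    One standard generalized-pivotal-quantity construction for normal data is sketched below: Monte Carlo copies of the pivotal quantities for μ and σ yield a generalized lower confidence bound on the proportion of future results falling within total-error acceptance limits. This is not necessarily the authors' exact formulation, and the data, limits, and thresholds are hypothetical.

      # GPQ sketch: lower generalized bound on P(result within acceptance limits).
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      x = np.array([98.2, 101.5, 99.8, 100.9, 97.9, 100.4, 99.1, 101.0])  # recoveries, %
      lam_lo, lam_hi = 90.0, 110.0   # acceptance limits on individual results
      pi0, alpha = 0.90, 0.05        # required proportion and risk level

      n, xbar, s = x.size, x.mean(), x.std(ddof=1)
      B = 100_000
      Z = rng.standard_normal(B)
      V = rng.chisquare(n - 1, B)

      R_sigma = s * np.sqrt((n - 1) / V)        # GPQ for sigma
      R_mu = xbar - Z * R_sigma / np.sqrt(n)    # GPQ for mu
      R_p = (stats.norm.cdf((lam_hi - R_mu) / R_sigma)
             - stats.norm.cdf((lam_lo - R_mu) / R_sigma))

      lower_bound = np.quantile(R_p, alpha)     # 95% lower generalized bound
      print(f"lower bound on P(in limits): {lower_bound:.3f} -> "
            f"{'accept' if lower_bound >= pi0 else 'reject'} method")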

  18. An analytical SMASH procedure (ASP) for sensitivity-encoded MRI.

    PubMed

    Lee, R F; Westgate, C R; Weiss, R G; Bottomley, P A

    2000-05-01

    The simultaneous acquisition of spatial harmonics (SMASH) method of imaging with detector arrays can reduce the number of phase-encoding steps, and thus MRI scan time, several-fold. The original approach utilized numerical gradient-descent fitting with the coil sensitivity profiles to create a set of composite spatial harmonics to replace the phase-encoding steps. Here, an analytical approach for generating the harmonics is presented. A transform is derived to project the harmonics onto a set of sensitivity profiles. A sequence of Fourier, Hilbert, and inverse Fourier transforms is then applied to analytically eliminate spatially dependent phase errors from the different coils while fully preserving the spatial encoding. By combining the transform and phase correction, the original numerical image reconstruction method can be replaced by an analytical SMASH procedure (ASP). The approach also allows simulation of SMASH imaging, revealing a criterion for the ratio of the detector sensitivity profile width to the detector spacing that produces optimal harmonic generation. When the detector geometry is suboptimal, a group of quasi-harmonics arises, which can be corrected and restored to pure harmonics. The simulation also reveals high-order harmonic modulation effects, and a demodulation procedure is presented that enables application of ASP to large numbers of detectors. The method is demonstrated on a phantom and humans using a standard 4-channel phased-array MRI system. Copyright 2000 Wiley-Liss, Inc.
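
    For intuition, the baseline fitting step that ASP replaces can be sketched as a projection: fit complex weights so that a combination of coil sensitivity profiles approximates each target harmonic exp(i·mΔk·y). Gaussian profiles, the coil geometry, and the use of linear least squares instead of gradient descent are all assumptions of this sketch.

      # Composite spatial-harmonic fit sketch (hypothetical coil geometry).
      import numpy as np

      ny, ncoils = 256, 4
      y = np.linspace(0, 1, ny)                       # normalized FOV coordinate
      centers = np.linspace(0.125, 0.875, ncoils)
      width = 0.18                                    # profile width vs spacing matters
      C = np.exp(-((y[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))

      dk = 2 * np.pi                                  # one harmonic step across the FOV
      for m in range(ncoils):                         # harmonics replacing phase encodes
          target = np.exp(1j * m * dk * y)
          w, *_ = np.linalg.lstsq(C.astype(complex), target, rcond=None)
          resid = np.linalg.norm(C @ w - target) / np.linalg.norm(target)
          print(f"harmonic m={m}: relative fit residual {resid:.3f}")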

  19. Introduction to Validation of Analytical Methods: Potentiometric Determination of CO[subscript 2]

    ERIC Educational Resources Information Center

    Hipólito-Nájera, A. Ricardo; Moya-Hernandez, M. Rosario; Gomez-Balderas, Rodolfo; Rojas-Hernandez, Alberto; Romero-Romo, Mario

    2017-01-01

    Validation of analytical methods is a fundamental subject for chemical analysts working in chemical industries. These methods are also relevant for pharmaceutical enterprises, biotechnology firms, analytical service laboratories, government departments, and regulatory agencies. Therefore, for undergraduate students enrolled in majors in the field…

  20. Validation of Analytical Damping Ratio by Fatigue Stress Limit

    NASA Astrophysics Data System (ADS)

    Foong, Faruq Muhammad; Chung Ket, Thein; Beng Lee, Ooi; Aziz, Abdul Rashid Abdul

    2018-03-01

    The optimisation process of a vibration energy harvester is usually restricted to experimental approaches due to the lack of an analytical equation describing the damping of a system. This study derives an analytical equation describing the first-mode damping ratio of a clamped-free cantilever beam under harmonic base excitation by combining the transverse equation of motion of the beam with the damping-stress equation. This equation, as opposed to other common damping determination methods, is independent of experimental inputs or finite element simulations and can be solved using a simple iterative convergence method. The derived equation was found to be correct for cases where the maximum bending stress in the beam is below the fatigue limit stress of the beam. However, an increasing trend in the error between the experimental and the analytical results was observed at high stress levels. Hence, the fatigue limit stress was used as a parameter to define the validity of the analytical equation.
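
    The "simple iterative convergence method" can be illustrated as a fixed-point iteration ζ = g(ζ): resonance amplitude falls with damping, peak bending stress follows from amplitude, and material damping rises with stress. Every relation and constant in the sketch below is hypothetical; only the iteration structure reflects the abstract.

      # Fixed-point iteration sketch for a stress-dependent damping ratio.
      def g(zeta, base=1.2e-3, accel=5.0e-2, stress_exponent=0.3):
          amplitude = accel / (2.0 * zeta)         # resonant magnification ~ 1/(2*zeta)
          stress = 1.0 * amplitude                 # hypothetical stress-amplitude link
          return base * stress ** stress_exponent  # hypothetical damping-stress law

      zeta = 1e-2                                  # initial guess
      for i in range(100):
          new = g(zeta)
          if abs(new - zeta) < 1e-9:               # converged
              break
          zeta = new
      print(f"converged damping ratio: {zeta:.5f} after {i + 1} iterations")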

  1. Population Spotting Using Big Data: Validating the Human Performance Concept of Operations Analytic Vision

    DTIC Science & Technology

    2017-01-01

    [Report documentation front matter; recoverable information: report number AFRL-SA-WP-SR-2017-0001 and the report title, Population Spotting Using "Big Data": Validating the Human Performance Concept of Operations Analytic Vision.]

  2. Validation of selected analytical methods using accuracy profiles to assess the impact of a Tobacco Heating System on indoor air quality.

    PubMed

    Mottier, Nicolas; Tharin, Manuel; Cluse, Camille; Crudo, Jean-René; Lueso, María Gómez; Goujon-Ginglinger, Catherine G; Jaquier, Anne; Mitova, Maya I; Rouget, Emmanuel G R; Schaller, Mathieu; Solioz, Jennifer

    2016-09-01

    Studies in environmentally controlled rooms have been used over the years to assess the impact of environmental tobacco smoke on indoor air quality. As new tobacco products are developed, it is important to determine their impact on air quality when used indoors. Before such an assessment can take place, it is essential that the analytical methods used to assess indoor air quality are validated and shown to be fit for their intended purpose. Consequently, for this assessment, an environmentally controlled room was built and seven analytical methods, representing eighteen analytes, were validated. The validations were carried out with smoking machines using a matrix-based approach applying the accuracy profile procedure. The performances of the methods were compared for all three matrices under investigation: background air samples, the environmental aerosol of the Tobacco Heating System THS 2.2, a heat-not-burn tobacco product developed by Philip Morris International, and the environmental tobacco smoke of a cigarette. The environmental aerosol generated by the THS 2.2 device did not have any appreciable impact on the performance of the methods. The comparison between the background and THS 2.2 environmental aerosol samples generated by smoking machines showed that only five compounds were present at higher levels when THS 2.2 was used in the environmentally controlled room. Regarding environmental tobacco smoke from cigarettes, the yields of all analytes were clearly above those obtained with the other two air sample types. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  3. Validation of the procedures. [integrated multidisciplinary optimization of rotorcraft

    NASA Technical Reports Server (NTRS)

    Mantay, Wayne R.

    1989-01-01

    Validation strategies are described for procedures aimed at improving the rotor blade design process through a multidisciplinary optimization approach. Validation of the basic rotor environment prediction tools and the overall rotor design are discussed.

  4. 14 CFR 34.64 - Sampling and analytical procedures for measuring gaseous exhaust emissions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION AIRCRAFT FUEL VENTING AND EXHAUST EMISSION REQUIREMENTS FOR TURBINE ENGINE POWERED AIRPLANES Test Procedures for Engine Exhaust Gaseous Emissions (Aircraft and Aircraft Gas Turbine Engines) § 34.64 Sampling and analytical procedures for measuring gaseous exhaust emissions. The...

  5. 14 CFR 34.64 - Sampling and analytical procedures for measuring gaseous exhaust emissions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION AIRCRAFT FUEL VENTING AND EXHAUST EMISSION REQUIREMENTS FOR TURBINE ENGINE POWERED AIRPLANES Test Procedures for Engine Exhaust Gaseous Emissions (Aircraft and Aircraft Gas Turbine Engines) § 34.64 Sampling and analytical procedures for measuring gaseous exhaust emissions. The...

  6. Construct Validation of Analytic Rating Scales in a Speaking Assessment: Reporting a Score Profile and a Composite

    ERIC Educational Resources Information Center

    Sawaki, Yasuyo

    2007-01-01

    This is a construct validation study of a second language speaking assessment that reported a language profile based on analytic rating scales and a composite score. The study addressed three key issues: score dependability, convergent/discriminant validity of analytic rating scales and the weighting of analytic ratings in the composite score.…

  7. Analytical procedures for the determination of fuel combustion products, anti-corrosive compounds, and de-icing compounds in airport runoff water samples.

    PubMed

    Sulej, Anna Maria; Polkowska, Żaneta; Astel, Aleksander; Namieśnik, Jacek

    2013-12-15

    The purpose of this study is to propose and evaluate new procedures for the determination of fuel combustion products, anti-corrosive compounds, and de-icing compounds in runoff water samples collected from airports located in different regions and characterized by different levels of activity, expressed by the number of flights and the number of passengers per year. The most difficult step in the analytical procedure used for the determination of PAHs, benzotriazoles and glycols is the sample preparation stage, owing to the diverse matrix composition and the possibility of interference from components with similar physicochemical properties. In this study, five different versions of sample preparation using extraction techniques such as LLE and SPE were tested. The presence of PAH compounds and glycols was observed in all examined runoff water samples collected from the airports. BT compounds were determined in the majority of the samples. Runoff water samples collected from the areas of Polish and British international airports as well as local airports had similar qualitative composition, but the quantitative composition of the analytes was very diverse. The new and validated analytical methodologies ensure that the necessary information for assessing the negative impact of airport activities on the environment can be obtained. © 2013 Elsevier B.V. All rights reserved.

  8. Life cycle management of analytical methods.

    PubMed

    Parr, Maria Kristina; Schmidt, Alexander H

    2018-01-05

    In modern process management, the life cycle concept is gaining more and more importance. It focusses on the total costs of the process, from investment through operation to final retirement. In recent years, an increasing interest in this concept has also developed for analytical procedures. The life cycle of an analytical method consists of design, development, validation (including instrumental qualification, continuous method performance verification and method transfer) and, finally, retirement of the method. Regulatory bodies have also increased their awareness of life cycle management for analytical methods. Thus, the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH), as well as the United States Pharmacopeial Forum, are discussing new guidelines that include life cycle management of analytical methods. The US Pharmacopeia (USP) Validation and Verification expert panel has already proposed a new General Chapter 〈1220〉 "The Analytical Procedure Lifecycle" for integration into the USP. Furthermore, a growing interest in life cycle management is also seen in the non-regulated environment. Quality-by-design based method development results in increased method robustness; this decreases the effort needed for method performance verification and post-approval changes, and minimizes the risk of method-related out-of-specification results. This strongly contributes to reduced costs of the method during its life cycle. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Empirically Optimized Flow Cytometric Immunoassay Validates Ambient Analyte Theory

    PubMed Central

    Parpia, Zaheer A.; Kelso, David M.

    2010-01-01

    Ekins’ ambient analyte theory predicts, counterintuitively, that an immunoassay’s limit of detection can be improved by reducing the amount of capture antibody. It also anticipates that results should be insensitive to the volume of sample as well as to the amount of capture antibody added. The objective of this study is to empirically validate all of the performance characteristics predicted by Ekins’ theory. Flow cytometric analysis was used to detect binding between a fluorescent ligand and capture microparticles, since it can directly measure fractional occupancy, the primary response variable in ambient analyte theory. After experimentally determining ambient analyte conditions, comparisons were carried out between ambient and non-ambient assays in terms of their signal strengths, limits of detection, and their sensitivity to variations in reaction volume and number of particles. The critical number of binding sites required for an assay to be in the ambient analyte region was estimated to be 0.1·V·Kd. As predicted, such assays exhibited superior signal/noise levels and limits of detection, and were not affected by variations in sample volume and number of binding sites. When the signal detected measures fractional occupancy, ambient analyte theory is an excellent guide to developing assays with superior performance characteristics. PMID:20152793
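
    The critical-site estimate above translates directly into a feasibility check when developing such an assay. A minimal sketch, assuming V is the reaction volume in litres and Kd the dissociation constant in mol/L (so 0.1·V·Kd is in moles):

    ```python
    # Check whether an assay design sits in Ekins' ambient analyte region,
    # using the 0.1*V*Kd criterion reported above (unit choices are assumptions).

    AVOGADRO = 6.022e23

    def critical_binding_sites(volume_l, kd_molar):
        """Moles (and molecules) of capture sites below which the assay
        operates in the ambient analyte region."""
        moles = 0.1 * volume_l * kd_molar
        return moles, moles * AVOGADRO

    def is_ambient(site_moles, volume_l, kd_molar):
        return site_moles < 0.1 * volume_l * kd_molar

    # e.g. 100 uL reaction, Kd = 1 nM
    moles, molecules = critical_binding_sites(volume_l=100e-6, kd_molar=1e-9)
    print(f"critical sites: {moles:.1e} mol (~{molecules:.1e} molecules)")
    print(is_ambient(1e-15, 100e-6, 1e-9))  # True: well inside the region
    ```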

  10. Teaching Analytical Method Transfer through Developing and Validating Then Transferring Dissolution Testing Methods for Pharmaceuticals

    ERIC Educational Resources Information Center

    Kimaru, Irene; Koether, Marina; Chichester, Kimberly; Eaton, Lafayette

    2017-01-01

    Analytical method transfer (AMT) and dissolution testing are important topics required in industry that should be taught in analytical chemistry courses. Undergraduate students in senior level analytical chemistry laboratory courses at Kennesaw State University (KSU) and St. John Fisher College (SJFC) participated in development, validation, and…

  11. 14 CFR 34.64 - Sampling and analytical procedures for measuring gaseous exhaust emissions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION AIRCRAFT FUEL VENTING AND EXHAUST EMISSION REQUIREMENTS FOR TURBINE... Turbine Engines) § 34.64 Sampling and analytical procedures for measuring gaseous exhaust emissions. The...

  12. Development and validation of procedures for assessment of competency of non-pharmacists in extemporaneous dispensing.

    PubMed

    Donnelly, Ryan F; McNally, Martin J; Barry, Johanne G

    2009-02-01

    To develop and validate procedures that may be suitable for assessment of competency of two groups of non-pharmacist staff (pharmacy students and trainee support staff) in extemporaneous dispensing. This is important given the prospect of remote supervision of community pharmacies in the UK. Analytical methods were validated according to International Conference on Harmonisation specifications and procedures were optimized to allow efficient drug extraction. This permitted straightforward determination of drug content in extemporaneously prepared lidocaine hydrochloride mouthwashes and norfloxacin creams and suspensions prepared by 10 participants recruited to represent the two groups of non-pharmacist staff. All 10 participants had completed the extemporaneous dispensing of all three products within 90 min. Extraction and analysis took approximately 15 min for each lidocaine hydrochloride mouthwash and 30 min for each diluted norfloxacin cream and norfloxacin suspension. The mean drug concentrations in lidocaine hydrochloride mouthwashes and diluted norfloxacin creams were within what are generally accepted as being pharmaceutically acceptable limits for drug content (100 +/- 5%) for both groups of participants. There was no significant difference in the mean drug concentration of norfloxacin suspensions prepared by the participant groups. However, it was notable that only one participant prepared a suspension containing a norfloxacin concentration that was within pharmaceutically acceptable limits (101.51%). A laboratory possessing suitable equipment and appropriately trained staff could cope readily with the large number of products prepared, for example, by a cohort of pre-registration students. Consequently, the validated procedures developed here could usefully be incorporated into the pre-registration examination for pharmacy students and a final qualifying examination for dispensers and pharmacy technicians. We believe that this is essential if the public

  13. Analytical and experimental validation of the Oblique Detonation Wave Engine concept

    NASA Technical Reports Server (NTRS)

    Adelman, Henry G.; Cambier, Jean-Luc; Menees, Gene P.; Balboni, John A.

    1988-01-01

    The Oblique Detonation Wave Engine (ODWE) for hypersonic flight has been analytically studied by NASA using CFD codes that fully couple finite-rate chemistry with fluid dynamics. Fuel injector designs investigated included wall and strut injectors; the in-stream strut injectors were chosen to provide good mixing with minimal stagnation pressure losses. Plans for experimentally validating the ODWE concept in an arc-jet hypersonic wind tunnel are discussed. Measurements of the flow field properties behind the oblique wave will be compared to analytical predictions.

  14. Rheoencephalographic and electroencephalographic measures of cognitive workload: analytical procedures.

    PubMed

    Montgomery, L D; Montgomery, R W; Guisado, R

    1995-05-01

    This investigation demonstrates the feasibility of mental workload assessment by rheoencephalographic (REG) and multichannel electroencephalographic (EEG) monitoring. During the performance of this research, unique testing, analytical and display procedures were developed for REG and EEG monitoring that extend the current state of the art and provide valuable tools for the study of cerebral circulatory and neural activity during cognition. REG records are analyzed to provide indices of the right and left hemisphere hemodynamic changes that take place during each test sequence. The EEG data are modeled using regression techniques and mathematically transformed to provide energy-density distributions of the scalp electrostatic field. These procedures permit concurrent REG/EEG cognitive testing not possible with current techniques. The introduction of a system for recording and analysis of cognitive REG/EEG test sequences facilitates the study of learning and memory disorders, dementia and other encephalopathies.

  15. Rheoencephalographic and electroencephalographic measures of cognitive workload: analytical procedures

    NASA Technical Reports Server (NTRS)

    Montgomery, L. D.; Montgomery, R. W.; Guisado, R.

    1995-01-01

    This investigation demonstrates the feasibility of mental workload assessment by rheoencephalographic (REG) and multichannel electroencephalographic (EEG) monitoring. During the performance of this research, unique testing, analytical and display procedures were developed for REG and EEG monitoring that extend the current state of the art and provide valuable tools for the study of cerebral circulatory and neural activity during cognition. REG records are analyzed to provide indices of the right and left hemisphere hemodynamic changes that take place during each test sequence. The EEG data are modeled using regression techniques and mathematically transformed to provide energy-density distributions of the scalp electrostatic field. These procedures permit concurrent REG/EEG cognitive testing not possible with current techniques. The introduction of a system for recording and analysis of cognitive REG/EEG test sequences facilitates the study of learning and memory disorders, dementia and other encephalopathies.

  16. Validation of urban freeway models.

    DOT National Transportation Integrated Search

    2015-01-01

    This report describes the methodology, data, conclusions, and enhanced models regarding the validation of two sets of models developed in the Strategic Highway Research Program 2 (SHRP 2) Reliability Project L03, Analytical Procedures for Determining...

  17. Applying Behavior Analytic Procedures to Effectively Teach Literacy Skills in the Classroom

    ERIC Educational Resources Information Center

    Joseph, Laurice M.; Alber-Morgan, Sheila; Neef, Nancy

    2016-01-01

    The purpose of this article is to discuss the application of behavior analytic procedures for advancing and evaluating methods for teaching literacy skills in the classroom. Particularly, applied behavior analysis has contributed substantially to examining the relationship between teacher behavior and student literacy performance. Teacher…

  18. Validity of diagnoses, procedures, and laboratory data in Japanese administrative data.

    PubMed

    Yamana, Hayato; Moriwaki, Mutsuko; Horiguchi, Hiromasa; Kodan, Mariko; Fushimi, Kiyohide; Yasunaga, Hideo

    2017-10-01

    Validation of recorded data is a prerequisite for studies that utilize administrative databases. The present study evaluated the validity of diagnosis and procedure records in the Japanese Diagnosis Procedure Combination (DPC) data, along with laboratory test results in the newly introduced Standardized Structured Medical Record Information Exchange (SS-MIX) data. Between November 2015 and February 2016, we conducted chart reviews of 315 patients hospitalized between April 2014 and March 2015 in four middle-sized acute-care hospitals in Shizuoka, Kochi, Fukuoka, and Saga Prefectures and used them as reference standards. The sensitivity and specificity of the DPC data in identifying 16 diseases and 10 common procedures were calculated. The accuracy of the SS-MIX data for 13 laboratory test results was also examined. The specificity of diagnoses in the DPC data exceeded 96%, while the sensitivity was below 50% for seven diseases and variable across diseases. When limited to primary diagnoses, the sensitivity and specificity were 78.9% and 93.2%, respectively. The sensitivity of procedure records exceeded 90% for six procedures, and the specificity exceeded 90% for nine procedures. Agreement between the SS-MIX data and the chart reviews was above 95% for all 13 items. The validity of diagnosis and procedure records in the DPC data and laboratory results in the SS-MIX data was high in general, supporting their use in future studies. Copyright © 2017 The Authors. Production and hosting by Elsevier B.V. All rights reserved.
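
    The headline figures above are straightforward to reproduce once each record pair (database code vs. chart review) is tabulated. A minimal sketch with randomly generated stand-in data, not the study data:

    ```python
    # Sensitivity/specificity of database coding against a chart-review
    # reference standard (toy data; the record layout is an assumption).
    import random

    def sensitivity_specificity(records):
        """records: iterable of (coded_in_database, present_in_chart) booleans."""
        records = list(records)
        tp = sum(1 for d, c in records if d and c)
        fn = sum(1 for d, c in records if not d and c)
        tn = sum(1 for d, c in records if not d and not c)
        fp = sum(1 for d, c in records if d and not c)
        sens = tp / (tp + fn) if tp + fn else float("nan")
        spec = tn / (tn + fp) if tn + fp else float("nan")
        return sens, spec

    random.seed(0)
    toy = [(random.random() < 0.4, random.random() < 0.5) for _ in range(315)]
    print(sensitivity_specificity(toy))
    ```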

  19. IFCC approved HPLC reference measurement procedure for the alcohol consumption biomarker carbohydrate-deficient transferrin (CDT): Its validation and use.

    PubMed

    Schellenberg, François; Wielders, Jos; Anton, Raymond; Bianchi, Vincenza; Deenmamode, Jean; Weykamp, Cas; Whitfield, John; Jeppsson, Jan-Olof; Helander, Anders

    2017-02-01

    Carbohydrate-deficient transferrin (CDT) is used as a biomarker of sustained high alcohol consumption. The currently available measurement procedures for CDT are based on various analytical techniques (HPLC, capillary electrophoresis, nephelometry), some differing in the definition of the analyte and using different reference intervals and cut-off values. The Working Group on Standardization of CDT (WG-CDT), initiated by the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC), has validated an HPLC candidate reference measurement procedure (cRMP) for CDT (% disialotransferrin to total transferrin, based on peak areas), demonstrating that it is suitable as a reference measurement procedure (RMP) for CDT. A detailed description of the cRMP and its calibration is presented. Practical aspects of how to treat genetic variant and so-called di-tri bridge samples are described. Results for the method performance characteristics, as demanded by ISO 15189 and ISO 15193, are given, as well as the reference interval, the measurement uncertainty and how to deal with it in routine use. The correlation of the cRMP with commercial CDT procedures and the performance of the cRMP in a network of laboratories are also presented. The performance of the CDT cRMP in combination with previously developed commutable calibrators allows for standardization of the currently available commercial measurement procedures for CDT. The cRMP has recently been approved by the IFCC and will from now on be known as the IFCC-RMP for CDT, while CDT results standardized according to this RMP should be indicated as CDT(IFCC). Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Multielemental analysis of 18 essential and toxic elements in amniotic fluid samples by ICP-MS: Full procedure validation and estimation of measurement uncertainty.

    PubMed

    Markiewicz, B; Sajnóg, A; Lorenc, W; Hanć, A; Komorowicz, I; Suliburska, J; Kocyłowski, R; Barałkiewicz, D

    2017-11-01

    Amniotic fluid is a substantial factor in the development of the embryo and fetus, because water and solutes contained in it cross the fetal membranes hydrostatically and osmotically and are also swallowed by the fetus. The elemental composition of amniotic fluid influences the growth and health of the fetus; an analysis of amniotic fluid is therefore important because the results can indicate abnormal levels of minerals or toxic elements. Inductively coupled plasma mass spectrometry (ICP-MS) is often used for determination of trace and ultra-trace level elements in a wide range of matrices, including biological samples, because of its unique analytical capabilities. In the case of trace and ultra-trace level analysis, detailed characteristics of the analytical procedure as well as properties of the analytical result are particularly important. The purpose of this study was to develop a new analytical procedure for multielemental analysis of 18 elements (Al, As, Ba, Ca, Cd, Co, Cr, Cu, Mg, Mn, Ni, Pb, Sb, Se, Sr, U, V and Zn) in amniotic fluid samples using ICP-MS. A dynamic reaction cell (DRC) with two reaction gases, ammonia and oxygen, was used to eliminate spectral interferences. Detailed validation was conducted using 3 certified reference materials (CRMs) and real amniotic fluid samples collected from patients. Repeatability for all analyzed analytes was found to range from 0.70% to 8.0%, and intermediate precision results varied from 1.3% to 15%. Trueness expressed as recovery ranged from 80% to 125%. Traceability was assured through the analyses of CRMs. Uncertainty of the results was also evaluated using a single-laboratory validation approach. The obtained expanded uncertainty (U) results for CRMs, expressed as a percentage of the concentration of an analyte, were found to be between 8.3% for V and 45% for Cd. Standard uncertainty of the precision was found to have a greater influence on the combined standard uncertainty
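
    The precision and uncertainty figures quoted above follow standard definitions: relative standard deviation for repeatability, and an expanded uncertainty U = k·u_c with coverage factor k = 2. A minimal sketch with hypothetical replicate data and a deliberately simplified uncertainty budget (the study's full budget combines more components):

    ```python
    # RSD and expanded uncertainty, as used in the validation above.
    import math
    import statistics

    def rsd_percent(replicates):
        """Relative standard deviation of replicate measurements, in %."""
        return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

    def expanded_uncertainty(u_components, k=2.0):
        """Combine independent standard uncertainties in quadrature, times k."""
        return k * math.sqrt(sum(u * u for u in u_components))

    cd_replicates = [0.101, 0.098, 0.104, 0.100, 0.097]  # hypothetical, ug/L
    print(f"repeatability RSD: {rsd_percent(cd_replicates):.1f}%")
    print(f"U (k=2): {expanded_uncertainty([0.002, 0.0015, 0.0008]):.4f} ug/L")
    ```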

  1. Scenes for Social Information Processing in Adolescence: Item and factor analytic procedures for psychometric appraisal.

    PubMed

    Vagos, Paula; Rijo, Daniel; Santos, Isabel M

    2016-04-01

    Relatively little is known about measures used to investigate the validity and applications of social information processing theory. The Scenes for Social Information Processing in Adolescence includes items built using a participatory approach to evaluate the attribution of intent, emotion intensity, response evaluation, and response decision steps of social information processing. We evaluated a sample of 802 Portuguese adolescents (61.5% female; mean age = 16.44 years old) using this instrument. Item analysis and exploratory and confirmatory factor analytic procedures were used for psychometric examination. Two measures for attribution of intent were produced (hostile and neutral), along with 3 emotion measures focused on negative emotional states, 8 response evaluation measures, and 4 response decision measures, including prosocial and impaired social behavior. All of these measures achieved good internal consistency values and fit indicators. Boys seemed to favor and choose overt and relational aggression behaviors more often; girls conveyed higher levels of neutral attribution, sadness, assertiveness, and passiveness. The Scenes for Social Information Processing in Adolescence achieved adequate psychometric results and seems a valuable alternative for evaluating social information processing, even if it is essential to continue investigating its internal and external validity. (c) 2016 APA, all rights reserved.

  2. Ethical leadership: meta-analytic evidence of criterion-related and incremental validity.

    PubMed

    Ng, Thomas W H; Feldman, Daniel C

    2015-05-01

    This study examines the criterion-related and incremental validity of ethical leadership (EL) with meta-analytic data. Across 101 samples published over the last 15 years (N = 29,620), we observed that EL demonstrated acceptable criterion-related validity with variables that tap followers' job attitudes, job performance, and evaluations of their leaders. Further, followers' trust in the leader mediated the relationships of EL with job attitudes and performance. In terms of incremental validity, we found that EL significantly, albeit weakly in some cases, predicted task performance, citizenship behavior, and counterproductive work behavior, even after controlling for the effects of such variables as transformational leadership, use of contingent rewards, management by exception, interactional fairness, and destructive leadership. The article concludes with a discussion of ways to strengthen the incremental validity of EL. (PsycINFO Database Record (c) 2015 APA, all rights reserved).

  3. Measuring Students' Writing Ability on a Computer-Analytic Developmental Scale: An Exploratory Validity Study

    ERIC Educational Resources Information Center

    Burdick, Hal; Swartz, Carl W.; Stenner, A. Jackson; Fitzgerald, Jill; Burdick, Don; Hanlon, Sean T.

    2013-01-01

    The purpose of the study was to explore the validity of a novel computer-analytic developmental scale, the Writing Ability Developmental Scale. On the whole, collective results supported the validity of the scale. It was sensitive to writing ability differences across grades and sensitive to within-grade variability as compared to human-rated…

  4. LATUX: An Iterative Workflow for Designing, Validating, and Deploying Learning Analytics Visualizations

    ERIC Educational Resources Information Center

    Martinez-Maldonado, Roberto; Pardo, Abelardo; Mirriahi, Negin; Yacef, Kalina; Kay, Judy; Clayphan, Andrew

    2015-01-01

    Designing, validating, and deploying learning analytics tools for instructors or students is a challenge that requires techniques and methods from different disciplines, such as software engineering, human-computer interaction, computer graphics, educational design, and psychology. Whilst each has established its own design methodologies, we now…

  5. Medication information leaflets for patients: the further validation of an analytic linguistic framework.

    PubMed

    Clerehan, Rosemary; Hirsh, Di; Buchbinder, Rachelle

    2009-01-01

    While clinicians may routinely use patient information leaflets about drug therapy, a poorly conceived leaflet has the potential to do harm. We previously developed a novel approach to analysing leaflets about a rheumatoid arthritis drug, using an analytic approach based on systemic functional linguistics. The aim of the present study was to verify the validity of the linguistic framework by applying it to two further arthritis drug leaflets. The findings confirmed the applicability of the framework and were used to refine it. A new stage or 'move' in the genre was identified. While the function of many of the moves appeared to be 'to instruct' the patient, the instruction was often unclear. The role relationships expressed in the text were critical to the meaning. As with our previous study, judged on their lexical density, the leaflets resembled academic text. The framework can provide specific tools to assess and produce medication information leaflets to support readers in taking medication. Future work could utilize the framework to evaluate information on other treatments and procedures or on healthcare information more widely.

  6. 40 CFR 87.82 - Sampling and analytical procedures for measuring smoke exhaust emissions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Sampling and analytical procedures for measuring smoke exhaust emissions. 87.82 Section 87.82 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF AIR POLLUTION FROM AIRCRAFT AND AIRCRAFT ENGINES...

  7. 40 CFR 87.64 - Sampling and analytical procedures for measuring gaseous exhaust emissions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Sampling and analytical procedures for measuring gaseous exhaust emissions. 87.64 Section 87.64 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF AIR POLLUTION FROM AIRCRAFT AND AIRCRAFT ENGINES...

  8. 40 CFR 87.82 - Sampling and analytical procedures for measuring smoke exhaust emissions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Sampling and analytical procedures for measuring smoke exhaust emissions. 87.82 Section 87.82 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF AIR POLLUTION FROM AIRCRAFT AND AIRCRAFT ENGINES...

  9. 40 CFR 87.64 - Sampling and analytical procedures for measuring gaseous exhaust emissions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Sampling and analytical procedures for measuring gaseous exhaust emissions. 87.64 Section 87.64 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF AIR POLLUTION FROM AIRCRAFT AND AIRCRAFT ENGINES...

  10. Analytic Validation of Immunohistochemistry Assays: New Benchmark Data From a Survey of 1085 Laboratories.

    PubMed

    Stuart, Lauren N; Volmar, Keith E; Nowak, Jan A; Fatheree, Lisa A; Souers, Rhona J; Fitzgibbons, Patrick L; Goldsmith, Jeffrey D; Astles, J Rex; Nakhleh, Raouf E

    2017-09-01

    A cooperative agreement between the College of American Pathologists (CAP) and the United States Centers for Disease Control and Prevention was undertaken to measure laboratories' awareness and implementation of an evidence-based laboratory practice guideline (LPG) on immunohistochemical (IHC) validation practices published in 2014. The objective was to establish new benchmark data on IHC laboratory practices. A 2015 survey on IHC assay validation practices was sent to laboratories subscribed to specific CAP proficiency testing programs and to additional nonsubscribing laboratories that perform IHC testing. Specific questions were designed to capture laboratory practices not addressed in a 2010 survey. The analysis was based on responses from 1085 laboratories that perform IHC staining. Ninety-six percent (809 of 844) always documented validation of IHC assays. Sixty percent (648 of 1078) had separate procedures for predictive and nonpredictive markers, 42.7% (220 of 515) had procedures for laboratory-developed tests, 50% (349 of 697) had procedures for testing cytologic specimens, and 46.2% (363 of 785) had procedures for testing decalcified specimens. Minimum case numbers were specified by 85.9% (720 of 838) of laboratories for nonpredictive markers and 76% (584 of 768) for predictive markers. Median concordance requirements were 95% for both types. For initial validation, 75.4% (538 of 714) of laboratories adopted the 20-case minimum for nonpredictive markers and 45.9% (266 of 579) adopted the 40-case minimum for predictive markers as outlined in the 2014 LPG. The most common method for validation was correlation with morphology and expected results. Laboratories also reported which assay changes necessitated revalidation and their minimum case requirements. Benchmark data on current IHC validation practices and procedures may help laboratories understand the issues and influence further refinement of LPG recommendations.

  11. 21 CFR 530.24 - Procedure for announcing analytical methods for drug residue quantification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) ANIMAL DRUGS, FEEDS, AND RELATED PRODUCTS EXTRALABEL DRUG USE IN ANIMALS Specific Provisions Relating to Extralabel Use of Animal and Human Drugs in Food-Producing Animals § 530.24 Procedure for announcing analytical methods for drug residue quantification. (a...

  12. Analytical modeling and experimental validation of a magnetorheological mount

    NASA Astrophysics Data System (ADS)

    Nguyen, The; Ciocanel, Constantin; Elahinia, Mohammad

    2009-03-01

    Magnetorheological (MR) fluid has been increasingly researched and applied in vibration isolation devices. To date, the suspension systems of several high-performance vehicles have been equipped with MR fluid based dampers, and research is ongoing to develop MR fluid based mounts for engine and powertrain isolation. MR fluid based devices have received attention due to the MR fluid's capability to change its properties in the presence of a magnetic field. This characteristic places MR mounts in the class of semiactive isolators, making them a desirable substitute for passive hydraulic mounts. In this research, an analytical model of a mixed-mode MR mount was constructed. The magnetorheological mount employs flow (valve) mode and squeeze mode. Each mode is powered by an independent electromagnet, so one mode does not affect the operation of the other. The analytical model was used to predict the performance of the MR mount with different sets of parameters. Furthermore, in order to produce the actual prototype, the analytical model was used to identify the optimal geometry of the mount. The experimental phase of this research was carried out by fabricating and testing the actual MR mount. The manufactured mount was tested to evaluate the effectiveness of each mode individually and in combination. The experimental results were also used to validate the ability of the analytical model to predict the response of the MR mount. Based on the observed response of the mount, a suitable controller can be designed for it. However, the control scheme is not addressed in this study.

  13. Microbial ecology laboratory procedures manual NASA/MSFC

    NASA Technical Reports Server (NTRS)

    Huff, Timothy L.

    1990-01-01

    An essential part of the efficient operation of any microbiology laboratory involved in sample analysis is a standard procedures manual. The purpose of this manual is to provide concise and well defined instructions on routine technical procedures involving sample analysis and methods for monitoring and maintaining quality control within the laboratory. Of equal importance is the safe operation of the laboratory. This manual outlines detailed procedures to be followed in the microbial ecology laboratory to assure safety, analytical control, and validity of results.

  14. Validation of biological activity testing procedure of recombinant human interleukin-7.

    PubMed

    Lutsenko, T N; Kovalenko, M V; Galkin, O Yu

    2017-01-01

    A validation procedure for the method of monitoring the biological activity of recombinant human interleukin-7 has been developed and conducted according to the requirements of national and international recommendations. The method is based on the ability of recombinant human interleukin-7 to induce proliferation of T lymphocytes. It has been shown that peripheral blood mononuclear cells (PBMCs) derived from blood, or cell lines, can be used to control the biological activity of recombinant human interleukin-7. The validation characteristics that should be determined depend on the method, the type of product or test/measurement object, and the biological test systems used in the research. The validation procedure for the method of controlling the biological activity of recombinant human interleukin-7 in peripheral blood mononuclear cells showed satisfactory results for all parameters tested, such as specificity, accuracy, precision and linearity.

  15. Analytical validation of a psychiatric pharmacogenomic test.

    PubMed

    Jablonski, Michael R; King, Nina; Wang, Yongbao; Winner, Joel G; Watterson, Lucas R; Gunselman, Sandra; Dechairo, Bryan M

    2018-05-01

    The aim of this study was to validate the analytical performance of a combinatorial pharmacogenomics test designed to aid in the appropriate medication selection for neuropsychiatric conditions. Genomic DNA was isolated from buccal swabs. Twelve genes (65 variants/alleles) associated with psychotropic medication metabolism, side effects, and mechanisms of actions were evaluated by bead array, MALDI-TOF mass spectrometry, and/or capillary electrophoresis methods (GeneSight Psychotropic, Assurex Health, Inc.). The combinatorial pharmacogenomics test has a dynamic range of 2.5-20 ng/μl of input genomic DNA, with comparable performance for all assays included in the test. Both the precision and accuracy of the test were >99.9%, with individual gene components between 99.4 and 100%. This study demonstrates that the combinatorial pharmacogenomics test is robust and reproducible, making it suitable for clinical use.

  16. Validating abortion procedure coding in Canadian administrative databases.

    PubMed

    Samiedaluie, Saied; Peterson, Sandra; Brant, Rollin; Kaczorowski, Janusz; Norman, Wendy V

    2016-07-12

    The British Columbia (BC) Ministry of Health collects abortion procedure data in the Medical Services Plan (MSP) physician billings database and in the hospital information Discharge Abstracts Database (DAD). Our study seeks to validate abortion procedure coding in these databases. Two randomized controlled trials enrolled a cohort of 1031 women undergoing abortion. The researcher-collected database includes both enrollment and follow-up chart review data. The study cohort was linked to MSP and DAD data to identify all abortion events captured in the administrative databases. We compared clinical chart data on abortion procedures with health administrative data. We considered a match to occur if an abortion-related code was found in administrative data within 30 days of the date of the same event documented in a clinical chart. Among 1158 abortion events performed during the enrollment and follow-up periods, 99.1 % were found in at least one of the administrative data sources. The sensitivities for the two databases, evaluated using a gold standard, were 97.7 % (95 % confidence interval (CI): 96.6-98.5) for the MSP database and 91.9 % (95 % CI: 90.0-93.4) for the DAD. Abortion events coded in the BC health administrative databases are highly accurate. Single-payer health administrative databases at the provincial level in Canada have the potential to offer valid data reflecting abortion events. ClinicalTrials.gov Identifier NCT01174225, Current Controlled Trials ISRCTN19506752.
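
    The matching rule above is simple to state operationally: a charted event counts as captured if any abortion-related code appears in the administrative data within a 30-day window of the charted date. A toy sketch (dates and structure are hypothetical, not study data):

    ```python
    # 30-day window matching of chart events to administrative records,
    # as described above; the resulting fraction is the sensitivity.
    from datetime import date, timedelta

    def match_fraction(chart_dates, admin_dates, window_days=30):
        window = timedelta(days=window_days)
        matched = sum(
            1 for c in chart_dates
            if any(abs(c - a) <= window for a in admin_dates)
        )
        return matched / len(chart_dates)

    chart = [date(2014, 5, 2), date(2014, 8, 19)]
    admin = [date(2014, 5, 10), date(2015, 1, 3)]
    print(f"sensitivity (toy data): {match_fraction(chart, admin):.2f}")  # 0.50
    ```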

  17. 41 CFR 60-3.6 - Use of selection procedures which have not been validated.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) General Principles § 60-3.6 Use of selection procedures which have not been validated. A. Use of alternate... 41 Public Contracts and Property Management 1 2010-07-01 2010-07-01 true Use of selection procedures which have not been validated. 60-3.6 Section 60-3.6 Public Contracts and Property Management...

  18. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics

    PubMed Central

    2016-01-01

    Background We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. Objective To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. Methods The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Results Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence

  19. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics.

    PubMed

    Hoyt, Robert Eugene; Snider, Dallas; Thompson, Carla; Mantravadi, Sarita

    2016-10-11

    We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix

  20. Development and validation of a fast and simple multi-analyte procedure for quantification of 40 drugs relevant to emergency toxicology using GC-MS and one-point calibration.

    PubMed

    Meyer, Golo M J; Weber, Armin A; Maurer, Hans H

    2014-05-01

    Diagnosis and prognosis of poisonings should be confirmed by comprehensive screening and reliable quantification of xenobiotics, for example by gas chromatography-mass spectrometry (GC-MS) or liquid chromatography-mass spectrometry (LC-MS). The turnaround time should be short enough to have an impact on clinical decisions. In emergency toxicology, quantification using full-scan acquisition is preferable because it allows screening and quantification of expected and unexpected drugs in one run. Therefore, a multi-analyte full-scan GC-MS approach with liquid-liquid extraction and one-point calibration was developed and validated for quantification of 40 drugs relevant to emergency toxicology. Validation showed that 36 drugs could be determined quickly, accurately, and reliably in the range of upper therapeutic to toxic concentrations. Daily one-point calibration with calibrators stored for up to four weeks reduced workload and turnaround time to less than 1 h. In summary, the multi-analyte approach with simple liquid-liquid extraction, GC-MS identification, and quantification via fast one-point calibration could successfully be applied to proficiency tests and real case samples. Copyright © 2013 John Wiley & Sons, Ltd.
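
    One-point calibration reduces each day's quantification to a single ratio against a freshly measured calibrator. The record does not spell out the calculation, so the sketch below assumes the common GC-MS practice of using analyte/internal-standard response ratios and a linear-through-zero model:

    ```python
    # One-point calibration sketch (values hypothetical): the sample's
    # response ratio is scaled by that of a single calibrator of known
    # concentration, assuming linearity through the origin.

    def one_point_quant(sample_ratio, calibrator_ratio, calibrator_conc):
        return sample_ratio / calibrator_ratio * calibrator_conc

    conc = one_point_quant(sample_ratio=0.84,
                           calibrator_ratio=0.42,
                           calibrator_conc=0.50)  # mg/L
    print(f"estimated concentration: {conc:.2f} mg/L")  # 1.00 mg/L
    ```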

  1. Application of analytical quality by design principles for the determination of alkyl p-toluenesulfonates impurities in Aprepitant by HPLC. Validation using total-error concept.

    PubMed

    Zacharis, Constantinos K; Vastardi, Elli

    2018-02-20

    In the research presented we report the development of a simple and robust liquid chromatographic method for the quantification of two genotoxic alkyl sulphonate impurities (namely methyl p-toluenesulfonate and isopropyl p-toluenesulfonate) in Aprepitant API substances using the Analytical Quality by Design (AQbD) approach. Following the steps of the AQbD protocol, the selected critical method attributes (CMAs) were the separation criteria between the critical peak pairs, the analysis time, and the peak efficiencies of the analytes. The critical method parameters (CMPs) included the flow rate, the gradient slope, and the acetonitrile content at the first step of the gradient elution program. Multivariate experimental designs, namely Plackett-Burman and Box-Behnken designs, were conducted sequentially for factor screening and optimization of the method parameters. The optimal separation conditions were estimated using the desirability function. The method was fully validated in the range of 10-200% of the target concentration limit of the analytes using the "total error" approach. Accuracy profiles - a graphical decision-making tool - were constructed using the results of the validation procedures. The β-expectation tolerance intervals did not exceed the acceptance criteria of ±10%, meaning that 95% of future results will be included in the defined bias limits. The relative bias ranged between −1.3% and 3.8% for both analytes, while the RSD values for repeatability and intermediate precision were less than 1.9% in all cases. The achieved limit of detection (LOD) and limit of quantification (LOQ) were adequate for the specific purpose and found to be 0.02% (corresponding to 48 μg g⁻¹ in sample) for both methyl and isopropyl p-toluenesulfonate. As proof-of-concept, the validated method was successfully applied in the analysis of several Aprepitant batches, indicating that this methodology could be used for routine quality control analyses. Copyright © 2017 Elsevier B
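
    Each point of an accuracy profile is a β-expectation tolerance interval on the bias at one concentration level, compared against the ±10% acceptance limits. A rough sketch, with hypothetical bias data and the simple one-variance-component (prediction-interval) form of the interval rather than the full ANOVA-based estimator typically used:

    ```python
    # One accuracy-profile point: 95% beta-expectation tolerance interval
    # on relative bias vs the +/-10% acceptance limits (toy data).
    import statistics
    from scipy.stats import t

    def beta_expectation_interval(biases_pct, beta=0.95):
        n = len(biases_pct)
        mean = statistics.mean(biases_pct)
        s = statistics.stdev(biases_pct)
        k = t.ppf((1 + beta) / 2, n - 1) * (1 + 1 / n) ** 0.5
        return mean - k * s, mean + k * s

    biases = [1.2, -0.4, 2.1, 0.8, -1.3, 1.9]   # relative bias, % of target
    low, high = beta_expectation_interval(biases)
    ok = -10 <= low and high <= 10
    print(f"tolerance interval: [{low:.1f}%, {high:.1f}%], accepted: {ok}")
    ```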

  2. The Analytical Pragmatic Structure of Procedural Due Process: A Framework for Inquiry in Administrative Decision Making.

    ERIC Educational Resources Information Center

    Fisher, James E.; Sealey, Ronald W.

    The study describes the analytical pragmatic structure of concepts and applies this structure to the legal concept of procedural due process. This structure consists of form, purpose, content, and function. The study conclusions indicate that the structure of the concept of procedural due process, or any legal concept, is not the same as the…

  3. Validation of the SINDA/FLUINT code using several analytical solutions

    NASA Technical Reports Server (NTRS)

    Keller, John R.

    1995-01-01

    The Systems Improved Numerical Differencing Analyzer and Fluid Integrator (SINDA/FLUINT) code has often been used to determine the transient and steady-state response of various thermal and fluid flow networks. While this code is an often-used design and analysis tool, validation of the program has been limited to a few simple studies. For the current study, the SINDA/FLUINT code was compared to four different analytical solutions. The thermal analyzer portion of the code (conduction and radiative heat transfer, the SINDA portion) was first compared to two separate solutions. The first comparison examined a semi-infinite slab with a periodic surface temperature boundary condition. Next, a small, uniform-temperature object (lumped capacitance) was allowed to radiate to a fixed-temperature sink. The fluid portion of the code (FLUINT) was also compared to two different analytical solutions. The first study examined a tank-filling process with an ideal gas, in which there is both control-volume work and heat transfer. The final comparison considered the flow in a pipe joining two infinite pressure reservoirs. The results of all these studies showed that, for the situations examined here, the SINDA/FLUINT code was able to match the results of the analytical solutions.
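
    The first thermal benchmark has a classic closed-form solution: for a semi-infinite solid whose surface temperature oscillates as Tm + A·cos(ωt), the quasi-steady response decays and lags with depth as T(x,t) = Tm + A·e^(−βx)·cos(ωt − βx), with β = √(ω/2α). A sketch evaluating that solution (illustrative property values, not those used in the study):

    ```python
    # Analytical benchmark: periodic surface temperature on a semi-infinite
    # solid, the kind of closed form a SINDA-style network result can be
    # checked against. Property values are illustrative only.
    import math

    def slab_temperature(x, t, t_mean, amp, omega, alpha):
        """T(x,t) for surface temperature t_mean + amp*cos(omega*t);
        alpha is the thermal diffusivity (m^2/s)."""
        beta = math.sqrt(omega / (2.0 * alpha))
        return t_mean + amp * math.exp(-beta * x) * math.cos(omega * t - beta * x)

    # e.g. aluminium-like diffusivity, 1 cm depth, 60 s period
    print(slab_temperature(x=0.01, t=30.0, t_mean=300.0, amp=10.0,
                           omega=2 * math.pi / 60, alpha=9.7e-5))
    ```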

  4. Validation of urban freeway models. [supporting datasets

    DOT National Transportation Integrated Search

    2015-01-01

    The goal of the SHRP 2 Project L33 Validation of Urban Freeway Models was to assess and enhance the predictive travel time reliability models developed in the SHRP 2 Project L03, Analytic Procedures for Determining the Impacts of Reliability Mitigati...

  5. Thermal conductivity of microporous layers: Analytical modeling and experimental validation

    NASA Astrophysics Data System (ADS)

    Andisheh-Tadbir, Mehdi; Kjeang, Erik; Bahrami, Majid

    2015-11-01

    A new compact relationship is developed for the thermal conductivity of the microporous layer (MPL) used in polymer electrolyte fuel cells as a function of pore size distribution, porosity, and compression pressure. The proposed model is successfully validated against experimental data obtained from a transient plane source thermal constants analyzer. The thermal conductivities of carbon paper samples with and without an MPL were measured as a function of load (1-6 bar), and the MPL thermal conductivity was found to be between 0.13 and 0.17 W m⁻¹ K⁻¹. The proposed analytical model predicts the experimental thermal conductivities within 5%. A correlation generated from the analytical model was used in a multi-objective genetic algorithm to predict the pore size distribution and porosity of an MPL with optimized thermal conductivity and mass diffusivity. The results suggest that an optimized MPL, in terms of heat and mass transfer coefficients, has an average pore size of 122 nm and 63% porosity.

  6. A sensitive analytical procedure for monitoring acrylamide in environmental water samples by offline SPE-UPLC/MS/MS.

    PubMed

    Togola, Anne; Coureau, Charlotte; Guezennec, Anne-Gwenaëlle; Touzé, Solène

    2015-05-01

    The presence of acrylamide in natural systems is of concern from both environmental and health points of view. We developed an accurate and robust analytical procedure (offline solid phase extraction combined with UPLC/MS/MS) with a limit of quantification (20 ng L⁻¹) compatible with toxicity threshold values. The solid phase extraction (SPE) step, optimized with respect to the nature of the extraction phases, the sampling volumes, and the elution solvent, was validated according to ISO Standard ISO/IEC 17025 on groundwater, surface water, and industrial process water samples. Acrylamide is highly polar, which induces a high variability during the SPE step, therefore requiring the use of ¹³C-labeled acrylamide as an internal standard to guarantee the accuracy and robustness of the method (uncertainty of about 25% (k = 2) at the limit of quantification). The specificity of the method and the stability of acrylamide were studied for these environmental media, and it was shown that the method is suitable for measuring acrylamide in environmental studies.

  7. Development and Validation of a Novel Robotic Procedure Specific Simulation Platform: Partial Nephrectomy.

    PubMed

    Hung, Andrew J; Shah, Swar H; Dalag, Leonard; Shin, Daniel; Gill, Inderbir S

    2015-08-01

    We developed a novel procedure-specific simulation platform for robotic partial nephrectomy. In this study we prospectively evaluate its face, content, construct and concurrent validity. This hybrid platform features augmented reality and virtual reality. Augmented reality involves 3-dimensional robotic partial nephrectomy surgical videos overlaid with virtual instruments to teach surgical anatomy, technical skills and operative steps. Advanced technical skills are assessed with an embedded full virtual reality renorrhaphy task. Participants were classified as novice (no surgical training, 15), intermediate (less than 100 robotic cases, 13) or expert (100 or more robotic cases, 14) and prospectively assessed. Cohort performance was compared with the Kruskal-Wallis test (construct validity). A post-study questionnaire was used to assess the realism of the simulation (face validity) and its usefulness for training (content validity). Concurrent validity was evaluated as the correlation between performance on the virtual reality renorrhaphy task and on a live porcine robotic partial nephrectomy (Spearman's analysis). Experts rated the augmented reality content as realistic (median 8/10) and helpful for resident/fellow training (8.0-8.2/10). Experts rated the platform highly for teaching anatomy (9/10) and operative steps (8.5/10) but moderately for technical skills (7.5/10). Experts and intermediates outperformed novices (construct validity) in efficiency (p=0.0002) and accuracy (p=0.002). For virtual reality renorrhaphy, experts outperformed intermediates on GEARS metrics (p=0.002). Virtual reality renorrhaphy and in vivo porcine robotic partial nephrectomy performance correlated significantly (r=0.8, p <0.0001) (concurrent validity). This augmented reality simulation platform displayed face, content and construct validity. Performance on the procedure-specific virtual reality task correlated highly with a porcine model (concurrent validity). Future efforts will integrate procedure specific

  8. Validation of Short-Term Noise Assessment Procedures: FY16 Summary of Procedures, Progress, and Preliminary Results

    DTIC Science & Technology

    Validation project. This report describes the procedure used to generate the noise models' output dataset, and then it compares that dataset to the ... benchmark, the Engineer Research and Development Center's Long-Range Sound Propagation dataset. It was found that the models consistently underpredict the

  9. Validity, reliability and support for implementation of independence-scaled procedural assessment in laparoscopic surgery.

    PubMed

    Kramp, Kelvin H; van Det, Marc J; Veeger, Nic J G M; Pierie, Jean-Pierre E N

    2016-06-01

    There is no widely used method to evaluate procedure-specific laparoscopic skills. The first aim of this study was to develop a procedure-based assessment method. The second aim was to compare its validity, reliability and feasibility with currently available global rating scales (GRSs). An independence-scaled procedural assessment was created by linking the procedural key steps of the laparoscopic cholecystectomy to an independence scale. Subtitled and blinded videos of a novice, an intermediate, and an almost-competent trainee were evaluated with GRSs (OSATS and GOALS) and the independence-scaled procedural assessment by seven surgeons, three senior trainees and six scrub nurses. Participants received a short introduction to the GRSs and the independence-scaled procedural assessment before assessment. Validity was estimated with the Friedman and Wilcoxon tests and reliability with the intra-class correlation coefficient (ICC). A questionnaire was used to evaluate user opinion. Independence-scaled procedural assessment and GRS scores improved significantly with surgical experience (OSATS p = 0.001, GOALS p < 0.001, independence-scaled procedural assessment p < 0.001). The ICCs of the OSATS, GOALS and independence-scaled procedural assessment were 0.78, 0.74 and 0.84, respectively, among surgeons. The ICCs increased when the ratings of scrub nurses were added to those of the surgeons. The independence-scaled procedural assessment was not considered more of an administrative burden than the GRSs (p = 0.692). A procedural assessment created by combining procedural key steps with an independence scale is a valid, reliable and acceptable assessment instrument in surgery. In contrast to the GRSs, the reliability of the independence-scaled procedural assessment exceeded the threshold of 0.8, indicating that it can also be used for summative assessment. It furthermore seems that scrub nurses can assess the operative competence of surgical trainees.
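
    The record does not state which ICC form was used; the sketch below implements one common choice, the Shrout-Fleiss two-way random-effects, absolute-agreement, single-rater ICC(2,1), on a toy score matrix rather than the study's ratings:

    ```python
    # ICC(2,1) from a subjects-by-raters score matrix (Shrout-Fleiss form;
    # which ICC variant the study used is an assumption here).

    def icc_2_1(scores):
        """scores: n rows (videos/trainees) x k columns (raters)."""
        n, k = len(scores), len(scores[0])
        grand = sum(map(sum, scores)) / (n * k)
        row_means = [sum(r) / k for r in scores]
        col_means = [sum(scores[i][j] for i in range(n)) / n for j in range(k)]
        ss_rows = k * sum((m - grand) ** 2 for m in row_means)
        ss_cols = n * sum((m - grand) ** 2 for m in col_means)
        ss_total = sum((x - grand) ** 2 for r in scores for x in r)
        ms_r = ss_rows / (n - 1)
        ms_c = ss_cols / (k - 1)
        ms_e = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
        return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

    print(icc_2_1([[7, 8, 7], [4, 5, 5], [2, 3, 2]]))  # ~0.95 on this toy data
    ```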

  10. 41 CFR 60-3.6 - Use of selection procedures which have not been validated.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... validation techniques contemplated by these guidelines. In such circumstances, the user should utilize... a formal and scored selection procedure is used which has an adverse impact, the validation... user cannot or need not follow the validation techniques anticipated by these guidelines, the user...

  11. 41 CFR 60-3.6 - Use of selection procedures which have not been validated.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... validation techniques contemplated by these guidelines. In such circumstances, the user should utilize... a formal and scored selection procedure is used which has an adverse impact, the validation... user cannot or need not follow the validation techniques anticipated by these guidelines, the user...

  12. 41 CFR 60-3.6 - Use of selection procedures which have not been validated.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... validation techniques contemplated by these guidelines. In such circumstances, the user should utilize... a formal and scored selection procedure is used which has an adverse impact, the validation... user cannot or need not follow the validation techniques anticipated by these guidelines, the user...

  13. 41 CFR 60-3.6 - Use of selection procedures which have not been validated.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... validation techniques contemplated by these guidelines. In such circumstances, the user should utilize... a formal and scored selection procedure is used which has an adverse impact, the validation... user cannot or need not follow the validation techniques anticipated by these guidelines, the user...

  14. Standard operating procedures for serum and plasma collection: early detection research network consensus statement standard operating procedure integration working group.

    PubMed

    Tuck, Melissa K; Chan, Daniel W; Chia, David; Godwin, Andrew K; Grizzle, William E; Krueger, Karl E; Rom, William; Sanda, Martin; Sorbara, Lynn; Stass, Sanford; Wang, Wendy; Brenner, Dean E

    2009-01-01

    Specimen collection is an integral component of clinical research. Specimens from subjects with various stages of cancers or other conditions, as well as those without disease, are critical tools in the hunt for biomarkers, predictors, or tests that will detect serious diseases earlier or more readily than currently possible. Analytic methodologies evolve quickly. Access to high-quality specimens, collected and handled in standardized ways that minimize potential bias or confounding factors, is key to the "bench to bedside" aim of translational research. It is essential that standard operating procedures, "the how" of creating the repositories, be defined prospectively when designing clinical trials. Small differences in the processing or handling of a specimen can have dramatic effects on analytical reliability and reproducibility, especially when multiplex methods are used. A representative working group, the Standard Operating Procedures Internal Working Group (SOPIWG), composed of members from across the Early Detection Research Network (EDRN), was formed to develop standard operating procedures (SOPs) for the various types of specimens collected and managed for our biomarker discovery and validation work. This report presents our consensus on SOPs for the collection, processing, handling, and storage of serum and plasma for biomarker discovery and validation.

  15. Validation of the replica trick for simple models

    NASA Astrophysics Data System (ADS)

    Shinzato, Takashi

    2018-04-01

    We discuss the replica analytic continuation using several simple models in order to prove mathematically the validity of the replica analysis, which is used in a wide range of fields related to large-scale complex systems. While replica analysis consists of two analytical techniques—the replica trick (or replica analytic continuation) and the thermodynamical limit (and/or order parameter expansion)—we focus our study on replica analytic continuation, which is the mathematical basis of the replica trick. We apply replica analysis to solve a variety of analytical models, and examine the properties of replica analytic continuation. Based on the positive results for these models we propose that replica analytic continuation is a robust procedure in replica analysis.

  16. Analysis of plant gums and saccharide materials in paint samples: comparison of GC-MS analytical procedures and databases

    PubMed Central

    2012-01-01

    Background: Saccharide materials have been used for centuries as binding media, to paint, write and illuminate manuscripts and to apply metallic leaf decorations. Although the technical literature often reports on the use of plant gums as binders, several other saccharide materials can actually be encountered in paint samples, not only as major binders, but also as additives. In the literature, there are a variety of analytical procedures that utilize GC-MS to characterize saccharide materials in paint samples; however, the chromatographic profiles are often extremely different, and it is impossible to compare them and reliably identify the paint binder. Results: This paper presents a comparison between two different analytical procedures based on GC-MS for the analysis of saccharide materials in works of art. The research presented here evaluates the influence of the analytical procedure used, and how it impacts the sugar profiles obtained from the analysis of paint samples that contain saccharide materials. The procedures have been developed, optimised and systematically used to characterise plant gums at the Getty Conservation Institute in Los Angeles, USA (GCI) and the Department of Chemistry and Industrial Chemistry of the University of Pisa, Italy (DCCI). The main steps of the analytical procedures and their optimisation are discussed. Conclusions: The results presented highlight that the two methods give comparable sugar profiles, whether the samples analysed are simple raw materials, pigmented and unpigmented paint replicas, or paint samples collected from centuries-old polychrome art objects. A common database of sugar profiles of reference materials commonly found in paint samples was thus compiled. The database also presents data from those materials that only contain a minor saccharide fraction. This database highlights how many sources of saccharides can be found in a paint sample, representing an important step forward in the problem of

  17. Analysis of plant gums and saccharide materials in paint samples: comparison of GC-MS analytical procedures and databases.

    PubMed

    Lluveras-Tenorio, Anna; Mazurek, Joy; Restivo, Annalaura; Colombini, Maria Perla; Bonaduce, Ilaria

    2012-10-10

    Saccharide materials have been used for centuries as binding media, to paint, write and illuminate manuscripts and to apply metallic leaf decorations. Although the technical literature often reports on the use of plant gums as binders, several other saccharide materials can actually be encountered in paint samples, not only as major binders, but also as additives. In the literature, there are a variety of analytical procedures that utilize GC-MS to characterize saccharide materials in paint samples; however, the chromatographic profiles are often extremely different, and it is impossible to compare them and reliably identify the paint binder. This paper presents a comparison between two different analytical procedures based on GC-MS for the analysis of saccharide materials in works of art. The research presented here evaluates the influence of the analytical procedure used, and how it impacts the sugar profiles obtained from the analysis of paint samples that contain saccharide materials. The procedures have been developed, optimised and systematically used to characterise plant gums at the Getty Conservation Institute in Los Angeles, USA (GCI) and the Department of Chemistry and Industrial Chemistry of the University of Pisa, Italy (DCCI). The main steps of the analytical procedures and their optimisation are discussed. The results presented highlight that the two methods give comparable sugar profiles, whether the samples analysed are simple raw materials, pigmented and unpigmented paint replicas, or paint samples collected from centuries-old polychrome art objects. A common database of sugar profiles of reference materials commonly found in paint samples was thus compiled. The database also presents data from those materials that only contain a minor saccharide fraction. This database highlights how many sources of saccharides can be found in a paint sample, representing an important step forward in the problem of identifying polysaccharide binders in

  18. Validation of a multi-analyte panel with cell-bound complement activation products for systemic lupus erythematosus.

    PubMed

    Dervieux, Thierry; Conklin, John; Ligayon, Jo-Anne; Wolover, Leilani; O'Malley, Tyler; Alexander, Roberta Vezza; Weinstein, Arthur; Ibarra, Claudia A

    2017-07-01

    We describe the analytical validation of an assay panel intended to assist clinicians with the diagnosis of systemic lupus erythematosus (SLE). The multi-analyte panel includes quantitative assessment of complement activation and measurement of autoantibodies. The levels of the complement split product C4d bound to erythrocytes (EC4d) and B-lymphocytes (BC4d) (expressed as mean fluorescence intensity [MFI]) are measured by quantitative flow cytometry, while autoantibodies (inclusive of antinuclear and anti-double stranded DNA antibodies) are determined by immunoassays. Results of the multi-analyte panel are reported as positive or negative based on a 2-tiered index score. Post-phlebotomy stability of EC4d and BC4d in EDTA-anticoagulated blood is determined using specimens collected from patients with SLE and normal donors. Three-level C4 coated positive beads are run daily as controls. Analytical validity is reported using intra-day and inter-day coefficient of variation (CV). EC4d and BC4d are stable for 2 days at ambient temperature and for 4 days at 4°C post-phlebotomy. Median intra-day and inter-day CV range from 2.9% to 7.8% (n=30) and 7.3% to 12.4% (n=66), respectively. The 2-tiered index score is reproducible over 4 consecutive days upon storage of blood at 4°C. A total of 2,888 three-level quality control data were collected from 6 flow cytometers with an overall failure rate below 3%. Median EC4d level is 6 net MFI (Interquartile [IQ] range 4-9 net MFI) and median BC4d is 18 net MFI (IQ range 13-27 net MFI) among 86,852 specimens submitted for testing. The incidence of 2-tiered positive test results is 13.4%. We have established the analytical validity of a multi-analyte assay panel for SLE. Copyright © 2017 Elsevier B.V. All rights reserved.
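
    For readers who want to reproduce the precision figures reported above, the following is a minimal sketch of intra-day and inter-day CV computation; the control-run values are invented for illustration.

```python
import numpy as np

def cv_percent(values) -> float:
    """Coefficient of variation as a percentage (sample SD / mean)."""
    v = np.asarray(values, dtype=float)
    return 100.0 * v.std(ddof=1) / v.mean()

# illustrative EC4d control runs (net MFI), grouped by day
runs_by_day = [
    [115, 118, 112],  # day 1 replicates
    [121, 117, 119],  # day 2 replicates
    [113, 116, 120],  # day 3 replicates
]
intra_day = [cv_percent(day) for day in runs_by_day]
inter_day = cv_percent([np.mean(day) for day in runs_by_day])
print(f"intra-day CVs: {[round(c, 1) for c in intra_day]} %")
print(f"inter-day CV: {inter_day:.1f} %")
```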

  19. Analysis of Ethanolamines: Validation of Semi-Volatile Analysis by HPLC-MS/MS by EPA Method MS888

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owens, J; Vu, A; Koester, C

    The Environmental Protection Agency's (EPA) Region 5 Chicago Regional Laboratory (CRL) developed a method titled 'Analysis of Diethanolamine, Triethanolamine, n-Methyldiethanolamine, and n-Ethyldiethanolamine in Water by Single Reaction Monitoring Liquid Chromatography/Tandem Mass Spectrometry (LC/MS/MS): EPA Method MS888'. This draft standard operating procedure (SOP) was distributed to multiple EPA laboratories and to Lawrence Livermore National Laboratory, which was tasked to serve as a reference laboratory for EPA's Environmental Reference Laboratory Network (ERLN) and to develop and validate analytical procedures. The primary objective of this study was to validate and verify the analytical procedures described in 'EPA Method MS888' for analysis of the listed ethanolamines in aqueous samples. The gathered data from this validation study will be used to: (1) demonstrate analytical method performance; (2) generate quality control acceptance criteria; and (3) revise the SOP to provide a validated method that would be available for use during a homeland security event. The data contained in this report will be compiled by EPA CRL with data generated by other EPA Regional laboratories so that performance metrics of 'EPA Method MS888' can be determined.
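
    One common way to turn validation data like these into quality control acceptance criteria is a Shewhart-style mean ± 3 SD window over replicate spike recoveries. A minimal sketch with invented recovery values; the report itself does not specify this exact procedure.

```python
import numpy as np

# illustrative spike-recovery results (%) from replicate validation batches
recoveries = np.array([98.2, 101.5, 95.7, 103.1, 99.4, 97.8, 100.6, 96.9])

mean, sd = recoveries.mean(), recoveries.std(ddof=1)
lower, upper = mean - 3 * sd, mean + 3 * sd  # Shewhart-style control limits
print(f"mean recovery {mean:.1f}%, acceptance window {lower:.1f}-{upper:.1f}%")
```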

  20. Structurally compliant rocket engine combustion chamber: Experimental and analytical validation

    NASA Technical Reports Server (NTRS)

    Jankovsky, Robert S.; Arya, Vinod K.; Kazaroff, John M.; Halford, Gary R.

    1994-01-01

    A new, structurally compliant rocket engine combustion chamber design has been validated through analysis and experiment. Subscale, tubular channel chambers have been cyclically tested and analytically evaluated. Cyclic lives were determined to have a potential for 1000 percent increase over those of rectangular channel designs, the current state of the art. Greater structural compliance in the circumferential direction gave rise to lower thermal strains during hot firing, resulting in lower thermal strain ratcheting and longer predicted fatigue lives. Thermal, structural, and durability analyses of the combustion chamber design, involving cyclic temperatures, strains, and low-cycle fatigue lives, have corroborated the experimental observations.

  1. Development of analytical procedures for the determination of hexavalent chromium in corrosion prevention coatings used in the automotive industry.

    PubMed

    Séby, F; Castetbon, A; Ortega, R; Guimon, C; Niveau, F; Barrois-Oudin, N; Garraud, H; Donard, O F X

    2008-05-01

    The European directive 2000/53/EC limits the use of Cr(VI) in vehicle manufacturing. Although a maximum of 2 g of Cr(VI) was authorised per vehicle for corrosion prevention coatings of key components, since July 2007 its use has been prohibited except for some particular applications. Therefore, the objective of this work was to develop direct analytical procedures for Cr(VI) determination in the different steel coatings used for screws. Instead of working directly with screws, the optimisation of the procedures was carried out with homogeneously coated metallic plates to improve data comparability. Extraction of Cr(VI) from the metallic parts was performed by sonication. Two extraction solutions were tested: a direct water extraction solution used in standard protocols and an ammonium/ammonia buffer solution at pH 8.9. The extracts were further analysed for Cr speciation by high-performance liquid chromatography (HPLC) inductively coupled plasma (ICP) atomic emission spectrometry or HPLC ICP mass spectrometry, depending on the concentration level. When possible, the coatings were also directly analysed by solid speciation techniques (X-ray photoelectron spectroscopy, XPS, and X-ray absorption near-edge structure, XANES) for validation of the results. Very good agreement between the different analytical approaches was obtained for the coating sample made up of a heated paint containing Zn, Al and Cr when the pH 8.9 extraction buffer was used. After a repeated four-step extraction procedure on the same test portion, taking into account the depth of the surface layer reached, good agreement with XPS and XANES results was obtained. In contrast, for the coatings composed of an alkaline Zn layer where Cr(VI) and Cr(III) are deposited, only the extraction procedure using water allowed the detection of Cr(VI). To elucidate the Cr(VI) reduction during extraction at pH 8.9, the reactivity of Cr(VI) towards different species of Zn generally present in the

  2. Optimisation and validation of the analytical procedure for the determination of acrylamide in coffee by LC-MS/MS with SPE clean up.

    PubMed

    Gielecińska, Iwona; Mojska, Hanna

    2013-01-01

    Numerous studies have demonstrated acrylamide to be both neurotoxic and carcinogenic. At present it is widely recognised that acrylamide is mainly formed through the Maillard reaction from free asparagine and reducing sugars. The major sources of dietary acrylamide are potato products, processed cereals and coffee. The aim was to optimise and validate an analytical method for determining acrylamide in coffee by liquid chromatography and tandem mass spectrometry (LC-MS/MS) using SPE clean-up. Analytical separation of acrylamide from roasted coffee was performed by liquid chromatography using a Hypercarb column followed by LC-MS/MS analysis, with 2,3,3-d3 acrylamide as an internal standard. The method was based on two purification steps: the first with hexane and Carrez solutions in order to remove fat and to precipitate proteins, respectively; and the second with a solid-phase extraction (SPE) column, which proved to be efficient in the elimination of the main chromatographic interferences. The limit of quantification (LOQ) for measuring acrylamide in coffee was 50 μg/kg. The described method demonstrates satisfactory precision (RSD = 2.5%), repeatability (RSD = 9.2%) and accuracy (mean recovery: 97.4%). Our results confirm that LC-MS/MS with SPE clean-up is selective and suitable for determination of acrylamide in coffee. Indeed, this method meets the criteria of EU Commission Recommendations (No. 2007/331/EC and No. 2010/307/EU) on the monitoring of acrylamide levels in food.

  3. Analysis of Carbamate Pesticides: Validation of Semi-Volatile Analysis by HPLC-MS/MS by EPA Method MS666

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owens, J; Koester, C

    The Environmental Protection Agency's (EPA) Region 5 Chicago Regional Laboratory (CRL) developed a method for analysis of aldicarb, bromadiolone, carbofuran, oxamyl, and methomyl in water by high performance liquid chromatography tandem mass spectrometry (HPLC-MS/MS), titled Method EPA MS666. This draft standard operating procedure (SOP) was distributed to multiple EPA laboratories and to Lawrence Livermore National Laboratory, which was tasked to serve as a reference laboratory for EPA's Environmental Reference Laboratory Network (ERLN) and to develop and validate analytical procedures. The primary objective of this study was to validate and verify the analytical procedures described in MS666 for analysis of carbamate pesticides in aqueous samples. The gathered data from this validation study will be used to: (1) demonstrate analytical method performance; (2) generate quality control acceptance criteria; and (3) revise the SOP to provide a validated method that would be available for use during a homeland security event. The data contained in this report will be compiled by EPA CRL with data generated by other EPA Regional laboratories so that performance metrics of Method EPA MS666 can be determined.

  4. Analysis of Phosphonic Acids: Validation of Semi-Volatile Analysis by HPLC-MS/MS by EPA Method MS999

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owens, J; Vu, A; Koester, C

    The Environmental Protection Agency's (EPA) Region 5 Chicago Regional Laboratory (CRL) developed a method titled Analysis of Diisopropyl Methylphosphonate, Ethyl Hydrogen Dimethylamidophosphate, Isopropyl Methylphosphonic Acid, Methylphosphonic Acid, and Pinacolyl Methylphosphonic Acid in Water by Multiple Reaction Monitoring Liquid Chromatography/Tandem Mass Spectrometry: EPA Version MS999. This draft standard operating procedure (SOP) was distributed to multiple EPA laboratories and to Lawrence Livermore National Laboratory, which was tasked to serve as a reference laboratory for EPA's Environmental Reference Laboratory Network (ERLN) and to develop and validate analytical procedures. The primary objective of this study was to validate and verify the analytical procedures described in EPA Method MS999 for analysis of the listed phosphonic acids and surrogates in aqueous samples. The gathered data from this validation study will be used to: (1) demonstrate analytical method performance; (2) generate quality control acceptance criteria; and (3) revise the SOP to provide a validated method that would be available for use during a homeland security event. The data contained in this report will be compiled by EPA CRL with data generated by other EPA Regional laboratories so that performance metrics of EPA Method MS999 can be determined.

  5. Analytical procedures for water-soluble vitamins in foods and dietary supplements: a review.

    PubMed

    Blake, Christopher J

    2007-09-01

    Water-soluble vitamins include the B-group vitamins and vitamin C. In order to correctly monitor the water-soluble vitamin content of fortified foods for compliance purposes, as well as to establish accurate data banks, an accurate and precise analytical method is a prerequisite. For many years microbiological assays have been used for the analysis of B vitamins. However, they are no longer considered the gold standard in vitamin analysis, as many studies have exposed their deficiencies. This review describes the current status of analytical methods, including microbiological assays and spectrophotometric, biosensor and chromatographic techniques. In particular, it describes the current status of the official methods and highlights some new developments in chromatographic procedures and detection methods. An overview of multivitamin extraction and analysis for foods and supplements is also given.

  6. Analytical validation of a new point-of-care assay for serum amyloid A in horses.

    PubMed

    Schwartz, D; Pusterla, N; Jacobsen, S; Christopher, M M

    2018-01-17

    Serum amyloid A (SAA) is a major acute phase protein in horses. A new point-of-care (POC) test for SAA (Stablelab) is available, but studies evaluating its analytical accuracy are lacking. To evaluate the analytical performance of the SAA POC test by 1) determining linearity and precision, 2) comparing results in whole blood with those in serum or plasma, and 3) comparing POC results with those obtained using a previously validated turbidimetric immunoassay (TIA). Assay validation. Analytical validation of the POC test was done in accordance with American Society of Veterinary Clinical Pathology guidelines using residual equine serum/plasma and whole blood samples from the Clinical Pathology Laboratory at the University of California-Davis. A TIA was used as the reference method. We also evaluated the effect of haematocrit (HCT). The POC test was linear for SAA concentrations of up to at least 1000 μg/mL (r = 0.991). Intra-assay CVs were 13, 18 and 15% at high (782 μg/mL), intermediate (116 μg/mL) and low (64 μg/mL) concentrations. Inter-assay (inter-batch) CVs were 45, 14 and 15% at high (1372 μg/mL), intermediate (140 μg/mL) and low (56 μg/mL) concentrations. SAA results in whole blood were significantly lower than those in serum/plasma (P = 0.0002), but were positively correlated (r = 0.908) and not affected by HCT (P = 0.261); proportional negative bias was observed in samples with SAA > 500 μg/mL. The difference between methods exceeded the 95% confidence interval of the combined imprecision of both methods (15%). Analytical validation could not be performed in whole blood, the sample most likely to be used stall-side. The POC test has acceptable accuracy and precision in equine serum/plasma with SAA concentrations of up to at least 1000 μg/mL. Low inter-batch precision at high concentrations may affect serial measurements, and the use of the same test batch and sample type (serum/plasma or whole blood) is recommended. Comparison of results between the
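
    A minimal sketch of the kind of method-comparison screen described above: linearity against the reference method, plus a check of relative differences against the combined imprecision of the two methods. The paired results and the assumed CVs are illustrative only, not the study's data.

```python
import numpy as np
from scipy.stats import linregress

# illustrative paired SAA results (ug/mL): POC device vs. reference TIA
poc = np.array([62, 118, 240, 505, 770, 990], dtype=float)
tia = np.array([58, 125, 250, 560, 950, 1300], dtype=float)

fit = linregress(tia, poc)
print(f"r = {fit.rvalue:.3f}, slope = {fit.slope:.2f}")

# agreement screen: relative difference vs. combined imprecision of methods
limit = np.hypot(0.10, 0.11)      # assumed CVs of POC and TIA, combined (~15%)
rel_diff = (poc - tia) / tia
print("beyond agreement limit:", np.abs(rel_diff) > limit)
```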

  7. Validation of Procedures for Monitoring Crewmember Immune Function - Short Duration Biological Investigation

    NASA Technical Reports Server (NTRS)

    Sams, Clarence; Crucian, Brian; Stowe, Raymond; Pierson, Duane; Mehta, Satish; Morukov, Boris; Uchakin, Peter; Nehlsen-Cannarella, Sandra

    2008-01-01

    Validation of Procedures for Monitoring Crew Member Immune Function - Short Duration Biological Investigation (Integrated Immune-SDBI) will assess the clinical risks resulting from the adverse effects of space flight on the human immune system and will validate a flight-compatible immune monitoring strategy. Immune system changes will be monitored by collecting and analyzing blood, urine and saliva samples from crewmembers before, during and after space flight.

  8. Validation of the enthalpy method by means of analytical solution

    NASA Astrophysics Data System (ADS)

    Kleiner, Thomas; Rückamp, Martin; Bondzio, Johannes; Humbert, Angelika

    2014-05-01

    Numerical simulations have moved in recent years from describing the cold-temperate transition surface (CTS) explicitly towards an enthalpy description, which avoids incorporating a singular surface inside the model (Aschwanden et al., 2012). In enthalpy methods the CTS is represented as a level set of the enthalpy state variable. This method has several numerical and practical advantages (e.g. representation of the full energy by one scalar field, no restriction on the topology and shape of the CTS). The method is rather new in glaciology and, to our knowledge, has not been verified and validated against analytical solutions. Unfortunately, analytical solutions for sufficiently complex thermo-mechanically coupled polythermal ice flow are still lacking. However, we present two experiments to test the implementation of the enthalpy equation and the corresponding boundary conditions. The first experiment tests in particular the functionality of the boundary condition scheme and the corresponding basal melt rate calculation. Depending on the thermal situation at the base, the numerical code may have to switch to another boundary type (from Neumann to Dirichlet or vice versa). The main idea of this set-up is to test reversibility during transients: a formerly cold ice body that runs through a warmer period, with an associated build-up of a liquid water layer at the base, must be able to return to its initial steady state. Since we impose several assumptions on the experiment design, analytical solutions can be formulated for different quantities during distinct stages of the simulation. The second experiment tests the positioning of the internal CTS in a parallel-sided polythermal slab. We compare our simulation results to the analytical solution proposed by Greve and Blatter (2009). Results from three different ice flow models (COMIce, ISSM, TIMFD3) are presented.

  9. Differences in metabolite profiles caused by pre-analytical blood processing procedures.

    PubMed

    Nishiumi, Shin; Suzuki, Makoto; Kobayashi, Takashi; Yoshida, Masaru

    2018-05-01

    Recently, the use of metabolomic analysis of human serum and plasma for biomarker discovery and disease diagnosis in clinical studies has been increasing. The feasibility of using a metabolite biomarker for disease diagnosis is strongly dependent on the metabolite's stability during pre-analytical blood processing procedures, such as serum or plasma sampling and sample storage prior to centrifugation. However, the influence of blood processing procedures on the stability of metabolites has not been fully characterized. In the present study, we compared the levels of metabolites in matched human serum and plasma samples using gas chromatography coupled with mass spectrometry and liquid chromatography coupled with mass spectrometry. In addition, we evaluated the changes in plasma metabolite levels induced by storage at room temperature or at a cold temperature prior to centrifugation. As a result, it was found that 76 metabolites exhibited significant differences between their serum and plasma levels. Furthermore, the pre-centrifugation storage conditions significantly affected the plasma levels of 45 metabolites. These results highlight the importance of blood processing procedures during metabolome analysis, which should be considered during biomarker discovery and the subsequent use of biomarkers for disease diagnosis. Copyright © 2017 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
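
    The serum-versus-plasma comparison above is essentially many paired tests with a multiplicity correction. A minimal sketch using paired Wilcoxon tests and Benjamini-Hochberg FDR control on simulated data; the study's actual statistical workflow may differ.

```python
import numpy as np
from scipy.stats import wilcoxon
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(0)
# simulated matched serum/plasma levels: 20 subjects x 50 metabolites
serum = rng.lognormal(mean=1.0, sigma=0.3, size=(20, 50))
plasma = serum * rng.normal(loc=1.05, scale=0.1, size=serum.shape)

# one paired test per metabolite, then control the false discovery rate
pvals = [wilcoxon(serum[:, j], plasma[:, j]).pvalue for j in range(serum.shape[1])]
reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print(f"metabolites differing after BH correction: {int(reject.sum())}")
```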

  10. Comparative Validity of the Shedler and Westen Assessment Procedure-200

    ERIC Educational Resources Information Center

    Mullins-Sweatt, Stephanie N.; Widiger, Thomas A.

    2008-01-01

    A predominant dimensional model of general personality structure is the five-factor model (FFM). Quite a number of alternative instruments have been developed to assess the domains of the FFM. The current study compares the validity of 2 alternative versions of the Shedler and Westen Assessment Procedure (SWAP-200) FFM scales, 1 that was developed…

  11. Laboratory, Field, and Analytical Procedures for Using ...

    EPA Pesticide Factsheets

    Regardless of the remedial technology invoked to address contaminated sediments in the environment, there is a critical need to have tools for assessing the effectiveness of the remedy. In the past, these tools have included chemical and biomonitoring of the water column and sediments, toxicity testing and bioaccumulation studies performed on site sediments, and application of partitioning, transport and fate modeling. All of these tools served as lines of evidence for making informed environmental management decisions at contaminated sediment sites. In the last ten years, a new tool for assessing remedial effectiveness has gained a great deal of attention. Passive sampling offers a tool capable of measuring the freely dissolved concentration (Cfree) of legacy contaminants in water and sediments. In addition to assessing the effectiveness of the remedy, passive sampling can be applied for a variety of other purposes at contaminated sediment sites, including performing the preliminary assessment and site inspection, conducting the remedial investigation and feasibility study, preparing the remedial design, and assessing the potential for contaminant bioaccumulation. While there is a distinct need for passive sampling at contaminated sediment sites, and several previous documents and research articles have discussed various aspects of passive sampling, there has not been definitive guidance on the laboratory, field and analytical procedures for using passive sampling.
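
    The core quantity mentioned above, the freely dissolved concentration (Cfree), is commonly estimated from the analyte accumulated in the sampler polymer via an equilibrium partitioning relation. A minimal sketch under assumed, illustrative values; none of the numbers come from the factsheet.

```python
# Equilibrium-partitioning estimate of the freely dissolved concentration:
# Cfree = C_polymer / Kpw, once the sampler has reached equilibrium.
c_polymer = 850.0    # ng analyte per g polymer, measured after deployment (assumed)
log_kpw = 4.2        # assumed polymer-water partition coefficient, log10(L/kg)

c_free = c_polymer * 1000.0 / 10**log_kpw   # (ng/kg polymer) / (L/kg) -> ng/L
print(f"Cfree = {c_free:.1f} ng/L")
```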

  12. Characterization and validation of sampling and analytical methods for mycotoxins in workplace air.

    PubMed

    Jargot, Danièle; Melin, Sandrine

    2013-03-01

    Mycotoxins are produced by certain plant or foodstuff moulds under growing, transport or storage conditions. They are toxic to humans and animals, and some are carcinogenic. Methods to monitor occupational exposure to seven of the most frequently occurring airborne mycotoxins have been characterized and validated. Experimental aerosols were generated from naturally contaminated particles for sampler evaluation. Air samples were collected on foam pads, using the CIP 10 personal aerosol sampler with its inhalable health-related aerosol fraction selector. The samples were subsequently solvent extracted from the sampling media, cleaned using immunoaffinity (IA) columns and analyzed by liquid chromatography with fluorescence detection. Ochratoxin A (OTA) or fumonisin and aflatoxin derivatives were detected and quantified. The quantification limits were 0.015 ng m⁻³ for OTA, 1 ng m⁻³ for fumonisins and 0.5 pg m⁻³ for aflatoxins, with a minimum dust concentration level of 1 mg m⁻³ and a 4800 L sampled air volume. The methods were successfully applied to field measurements, which confirmed that workers could be exposed when handling contaminated materials. It was observed that airborne particles may be more contaminated than the bulk material itself. The validated methods have measuring ranges fully adapted to the concentrations found in the workplace. Their performance meets the general requirements laid down for chemical agent measurement procedures, with an expanded uncertainty of less than 50% for most mycotoxins. The analytical uncertainty, between 14 and 24%, was quite satisfactory given the low mycotoxin amounts, when compared to the food benchmarks. The methods are now user-friendly enough to be adopted for personal workplace sampling. They will later allow for mycotoxin occupational risk assessment, as only very few quantitative data have been available until now.
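
    A minimal sketch of how an expanded measurement uncertainty is checked against the <50% requirement cited above: combine relative uncertainty components in quadrature and apply a coverage factor k = 2. The component values and their labels are invented for illustration.

```python
import numpy as np

# illustrative relative standard uncertainties (%) for one mycotoxin method
components = {
    "analytical (extraction + IA clean-up + HPLC-FLD)": 18.0,
    "sampling / aerosol collection": 15.0,
    "recovery correction": 8.0,
}
u_combined = np.sqrt(sum(u**2 for u in components.values()))
u_expanded = 2.0 * u_combined   # coverage factor k = 2 (~95% confidence)
print(f"combined u = {u_combined:.1f}%, expanded U = {u_expanded:.1f}%")
print("meets <50% requirement:", u_expanded < 50.0)
```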

  13. Fluid dynamics of coarctation of the aorta: analytical solution, in vitro validation and in vivo evaluation

    NASA Astrophysics Data System (ADS)

    Keshavarz-Motamed, Zahra

    2015-11-01

    Coarctation of the aorta (COA) is a congenital heart disease corresponding to a narrowing in the aorta. Cardiac catheterization is considered the reference standard for definitive evaluation of COA severity, based on the peak-to-peak trans-coarctation pressure gradient (PtoP TCPG) and the instantaneous systolic value of the trans-COA pressure gradient (TCPG). However, invasive cardiac catheterization may carry high risks, given that undergoing multiple follow-up cardiac catheterizations is common in patients with COA. The objective of this study is to present an analytical description of the COA that estimates PtoP TCPG and TCPG without the need for high-risk invasive data collection. Coupled Navier-Stokes and elastic deformation equations were solved analytically to estimate TCPG and PtoP TCPG. The results were validated against data measured in vitro (e.g., 90% COA: TCPG: root mean squared error (RMSE) = 3.93 mmHg; PtoP TCPG: RMSE = 7.9 mmHg). Moreover, the estimated PtoP TCPG resulting from the suggested analytical description was validated using clinical data in twenty patients with COA (maximum RMSE: 8.3 mmHg). Very good correlation and concordance were found between TCPG and PtoP TCPG obtained from the analytical formulation and the in vitro and in vivo data. The suggested methodology can be considered an alternative to cardiac catheterization and can help prevent its risks.
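
    The validation metric used above is the root mean squared error between model estimates and measured gradients. A minimal sketch with invented paired values, not the study's data.

```python
import numpy as np

def rmse(predicted, observed) -> float:
    """Root mean squared error between two paired sequences."""
    p, o = np.asarray(predicted, float), np.asarray(observed, float)
    return float(np.sqrt(np.mean((p - o) ** 2)))

# illustrative peak-to-peak gradients (mmHg): analytical model vs. catheter
model = [12.0, 25.5, 31.0, 44.2, 58.9]
catheter = [10.5, 27.0, 36.5, 40.0, 62.0]
print(f"RMSE = {rmse(model, catheter):.1f} mmHg")
```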

  14. Fast analytical model of MZI micro-opto-mechanical pressure sensor

    NASA Astrophysics Data System (ADS)

    Rochus, V.; Jansen, R.; Goyvaerts, J.; Neutens, P.; O’Callaghan, J.; Rottenberg, X.

    2018-06-01

    This paper presents a fast analytical procedure for designing a micro-opto-mechanical pressure sensor (MOMPS), taking into account the mechanical nonlinearity and the optical losses. A realistic model of the photonic MZI is proposed, strongly coupled to a nonlinear mechanical model of the membrane. Based on the membrane dimensions, the residual stress, the position of the waveguide, the optical wavelength and the phase variation due to the opto-mechanical coupling, we derive an analytical model which allows us to predict the response of the total system. The effects of the nonlinearity and the losses on the total performance are carefully studied, and measurements on fabricated devices are used to validate the model. Finally, a design procedure is proposed to enable fast design of this new type of pressure sensor.
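
    The photonic half of such a model reduces, in the simplest lossless two-arm case, to the familiar MZI interference transfer function. A minimal sketch with an assumed linear pressure-to-phase coupling coefficient; both numbers are illustrative, not values from the paper.

```python
import numpy as np

def mzi_intensity(dphi, visibility=0.9):
    """Normalized two-beam MZI output with finite fringe visibility."""
    return 0.5 * (1.0 + visibility * np.cos(dphi))

pressure = np.linspace(0.0, 100e3, 5)   # applied pressure, Pa
coupling = 2e-5                          # assumed phase sensitivity, rad/Pa
print(np.round(mzi_intensity(coupling * pressure), 3))
```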

  15. A unified procedure for meta-analytic evaluation of surrogate end points in randomized clinical trials

    PubMed Central

    Dai, James Y.; Hughes, James P.

    2012-01-01

    The meta-analytic approach to evaluating surrogate end points assesses the predictiveness of treatment effect on the surrogate toward treatment effect on the clinical end point based on multiple clinical trials. Definition and estimation of the correlation of treatment effects were developed in linear mixed models and later extended to binary or failure time outcomes on a case-by-case basis. In a general regression setting that covers nonnormal outcomes, we discuss in this paper several metrics that are useful in the meta-analytic evaluation of surrogacy. We propose a unified 3-step procedure to assess these metrics in settings with binary end points, time-to-event outcomes, or repeated measures. First, the joint distribution of estimated treatment effects is ascertained by an estimating equation approach; second, the restricted maximum likelihood method is used to estimate the means and the variance components of the random treatment effects; finally, confidence intervals are constructed by a parametric bootstrap procedure. The proposed method is evaluated by simulations and applications to 2 clinical trials. PMID:22394448
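
    A heavily simplified sketch of the trial-level surrogacy idea and the parametric bootstrap step described above: correlate per-trial treatment effects on the surrogate and on the clinical endpoint, then bootstrap from a fitted bivariate normal. This ignores the within-trial estimation step (step 1 of the paper's procedure), and all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# illustrative per-trial estimated treatment effects
alpha = np.array([0.30, 0.45, 0.10, 0.60, 0.25, 0.50])  # effect on surrogate
beta = np.array([0.20, 0.40, 0.05, 0.55, 0.15, 0.45])   # effect on clinical endpoint

effects = np.column_stack([alpha, beta])
mean, cov = effects.mean(axis=0), np.cov(effects, rowvar=False)

def trial_level_r(sample):
    """Correlation of treatment effects across trials."""
    return np.corrcoef(sample[:, 0], sample[:, 1])[0, 1]

# parametric bootstrap: resample trials from the fitted bivariate normal
boot = [trial_level_r(rng.multivariate_normal(mean, cov, size=len(effects)))
        for _ in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"R = {trial_level_r(effects):.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```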

  16. Analytical validation of a next generation sequencing liquid biopsy assay for high sensitivity broad molecular profiling.

    PubMed

    Plagnol, Vincent; Woodhouse, Samuel; Howarth, Karen; Lensing, Stefanie; Smith, Matt; Epstein, Michael; Madi, Mikidache; Smalley, Sarah; Leroy, Catherine; Hinton, Jonathan; de Kievit, Frank; Musgrave-Brown, Esther; Herd, Colin; Baker-Neblett, Katherine; Brennan, Will; Dimitrov, Peter; Campbell, Nathan; Morris, Clive; Rosenfeld, Nitzan; Clark, James; Gale, Davina; Platt, Jamie; Calaway, John; Jones, Greg; Forshew, Tim

    2018-01-01

    Circulating tumor DNA (ctDNA) analysis is being incorporated into cancer care, notably in profiling patients to guide treatment decisions. Responses to targeted therapies have been observed in patients with actionable mutations detected in plasma DNA at variant allele fractions (VAFs) below 0.5%. Highly sensitive methods are therefore required for optimal clinical use. To enable objective assessment of assay performance, detailed analytical validation is required. We developed the InVisionFirst™ assay, an assay based on enhanced tagged amplicon sequencing (eTAm-Seq™) technology to profile 36 genes commonly mutated in non-small cell lung cancer (NSCLC) and other cancer types for actionable genomic alterations in cell-free DNA. The assay has been developed to detect point mutations, indels, amplifications and gene fusions that commonly occur in NSCLC. For analytical validation, two 10 mL blood tubes were collected from NSCLC patients and healthy volunteer donors. In addition, contrived samples were used to represent a wide spectrum of genetic aberrations and VAFs. Samples were analyzed by multiple operators, at different times and using different reagent lots. Results were compared with digital PCR (dPCR). The InVisionFirst assay demonstrated an excellent limit of detection, with 99.48% sensitivity for SNVs present at VAF range 0.25%-0.33%, 92.46% sensitivity for indels at 0.25% VAF and a high rate of detection at lower frequencies while retaining high specificity (99.9997% per base). The assay also detected ALK and ROS1 gene fusions, and DNA amplifications in ERBB2, FGFR1, MET and EGFR with high sensitivity and specificity. Comparison between the InVisionFirst assay and dPCR in a series of cancer patients showed high concordance. This analytical validation demonstrated that the InVisionFirst assay is highly sensitive, specific and robust, and meets analytical requirements for clinical applications.
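
    Sensitivity claims like those above come with binomial uncertainty. A minimal sketch of an exact (Clopper-Pearson) confidence interval for a detection rate; the counts are invented, not the assay's validation data.

```python
from statsmodels.stats.proportion import proportion_confint

# illustrative: 192 of 193 spiked SNVs detected at 0.25-0.33% VAF
detected, total = 192, 193
sens = detected / total
lo, hi = proportion_confint(detected, total, alpha=0.05, method="beta")
print(f"sensitivity = {sens:.2%}, 95% CI ({lo:.2%}, {hi:.2%})")  # Clopper-Pearson
```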

  17. Analytical validation of a next generation sequencing liquid biopsy assay for high sensitivity broad molecular profiling

    PubMed Central

    Howarth, Karen; Lensing, Stefanie; Smith, Matt; Epstein, Michael; Madi, Mikidache; Smalley, Sarah; Leroy, Catherine; Hinton, Jonathan; de Kievit, Frank; Musgrave-Brown, Esther; Herd, Colin; Baker-Neblett, Katherine; Brennan, Will; Dimitrov, Peter; Campbell, Nathan; Morris, Clive; Rosenfeld, Nitzan; Clark, James; Gale, Davina; Platt, Jamie; Calaway, John; Jones, Greg

    2018-01-01

    Circulating tumor DNA (ctDNA) analysis is being incorporated into cancer care, notably in profiling patients to guide treatment decisions. Responses to targeted therapies have been observed in patients with actionable mutations detected in plasma DNA at variant allele fractions (VAFs) below 0.5%. Highly sensitive methods are therefore required for optimal clinical use. To enable objective assessment of assay performance, detailed analytical validation is required. We developed the InVisionFirst™ assay, an assay based on enhanced tagged amplicon sequencing (eTAm-Seq™) technology to profile 36 genes commonly mutated in non-small cell lung cancer (NSCLC) and other cancer types for actionable genomic alterations in cell-free DNA. The assay has been developed to detect point mutations, indels, amplifications and gene fusions that commonly occur in NSCLC. For analytical validation, two 10 mL blood tubes were collected from NSCLC patients and healthy volunteer donors. In addition, contrived samples were used to represent a wide spectrum of genetic aberrations and VAFs. Samples were analyzed by multiple operators, at different times and using different reagent lots. Results were compared with digital PCR (dPCR). The InVisionFirst assay demonstrated an excellent limit of detection, with 99.48% sensitivity for SNVs present at VAF range 0.25%-0.33%, 92.46% sensitivity for indels at 0.25% VAF and a high rate of detection at lower frequencies while retaining high specificity (99.9997% per base). The assay also detected ALK and ROS1 gene fusions, and DNA amplifications in ERBB2, FGFR1, MET and EGFR with high sensitivity and specificity. Comparison between the InVisionFirst assay and dPCR in a series of cancer patients showed high concordance. This analytical validation demonstrated that the InVisionFirst assay is highly sensitive, specific and robust, and meets analytical requirements for clinical applications. PMID:29543828

  18. An improved procedure for the validation of satellite-based precipitation estimates

    NASA Astrophysics Data System (ADS)

    Tang, Ling; Tian, Yudong; Yan, Fang; Habib, Emad

    2015-09-01

    The objective of this study is to propose and test a new procedure to improve the validation of remote-sensing, high-resolution precipitation estimates. Our recent studies show that many conventional validation measures do not accurately capture the unique error characteristics of precipitation estimates, and therefore fail to fully inform both data producers and users. The proposed new validation procedure has two steps: 1) an error decomposition approach to separate the total retrieval error into three independent components: hit error, false precipitation and missed precipitation; and 2) further analysis of the hit error based on a multiplicative error model, in which the error features are captured by three model parameters. In this way, the multiplicative error model separates systematic and random errors, leading to more accurate quantification of the uncertainties. The proposed procedure is used to quantitatively evaluate the two recent versions (Version 6 and 7) of TRMM's Multi-satellite Precipitation Analysis (TMPA) real-time and research product suite (3B42 and 3B42RT) for seven years (2005-2011) over the continental United States (CONUS). The gauge-based National Centers for Environmental Prediction (NCEP) Climate Prediction Center (CPC) near-real-time daily precipitation analysis is used as the reference. In addition, the radar-based NCEP Stage IV precipitation data are also model-fitted to verify the effectiveness of the multiplicative error model. The results show that the winter total bias is dominated by missed precipitation over the west coastal areas and the Rocky Mountains, and by false precipitation over large areas of the Midwest. The summer total bias comes largely from the hit bias in the central US. Meanwhile, the new version (V7) tends to produce more rainfall at the higher rain rates, which moderates the significant underestimation exhibited in the previous V6 products. Moreover, the error analysis from the multiplicative error model
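
    A minimal sketch of the two-step idea described above: decompose total error into hit, false and missed components using a wet/dry threshold, then fit a multiplicative (log-log linear) error model to the hits. Data are simulated, and the threshold and model form follow the general approach rather than the paper's exact configuration.

```python
import numpy as np
from scipy.stats import linregress

def decompose(est, ref, wet=0.1):
    """Split total error (est - ref) into hit, false and missed components."""
    est, ref = np.asarray(est, float), np.asarray(ref, float)
    hits = (est > wet) & (ref > wet)
    false_alarm = (est > wet) & (ref <= wet)
    missed = (est <= wet) & (ref > wet)
    return (np.sum(est[hits] - ref[hits]),  # hit error
            np.sum(est[false_alarm]),       # false precipitation
            -np.sum(ref[missed]),           # missed precipitation (negative)
            hits)

rng = np.random.default_rng(2)
ref = rng.gamma(shape=0.5, scale=4.0, size=1000)     # synthetic "truth", mm/day
est = ref * np.exp(rng.normal(0.1, 0.4, ref.size))   # multiplicative retrieval noise
est[rng.random(ref.size) < 0.05] = 0.0               # inject some misses

hit_err, false_p, missed_p, hits = decompose(est, ref)
print(f"hit {hit_err:.0f} mm, false {false_p:.0f} mm, missed {missed_p:.0f} mm")

# multiplicative error model on the hits: log(est) = log(alpha) + beta*log(ref)
fit = linregress(np.log(ref[hits]), np.log(est[hits]))
resid = np.log(est[hits]) - (fit.intercept + fit.slope * np.log(ref[hits]))
print(f"alpha = {np.exp(fit.intercept):.2f}, beta = {fit.slope:.2f}, "
      f"sigma = {resid.std(ddof=2):.2f}")
```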

  19. Towards a full integration of optimization and validation phases: An analytical-quality-by-design approach.

    PubMed

    Hubert, C; Houari, S; Rozet, E; Lebrun, P; Hubert, Ph

    2015-05-22

    When using an analytical method, defining an analytical target profile (ATP) focused on quantitative performance represents a key input, and this will drive the method development process. In this context, two case studies were selected in order to demonstrate the potential of a quality-by-design (QbD) strategy when applied to two specific phases of the method lifecycle: the pre-validation study and the validation step. The first case study focused on the improvement of a liquid chromatography (LC) coupled to mass spectrometry (MS) stability-indicating method by means of the QbD concept. The design of experiments (DoE) conducted during the optimization step (i.e. determination of the qualitative design space (DS)) was performed a posteriori. Additional experiments were performed in order to simultaneously conduct the pre-validation study to assist in defining the DoE to be conducted during the formal validation step. This predicted protocol was compared to the one used during the formal validation. A second case study based on the LC/MS-MS determination of glucosamine and galactosamine in human plasma was considered in order to illustrate an innovative strategy allowing the QbD methodology to be incorporated during the validation phase. An operational space, defined by the qualitative DS, was considered during the validation process rather than a specific set of working conditions as conventionally performed. Results of all the validation parameters conventionally studied were compared to those obtained with this innovative approach for glucosamine and galactosamine. Using this strategy, qualitative and quantitative information were obtained. Consequently, an analyst using this approach would be able to select with great confidence several working conditions within the operational space rather than a given condition for the routine use of the method. This innovative strategy combines both a learning process and a thorough assessment of the risk involved

  20. Development of Advanced Verification and Validation Procedures and Tools for the Certification of Learning Systems in Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen; Schumann, Johann; Gupta, Pramod; Richard, Michael; Guenther, Kurt; Soares, Fola

    2005-01-01

    Adaptive control technologies that incorporate learning algorithms have been proposed to enable automatic flight control and vehicle recovery, autonomous flight, and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments. In order for adaptive control systems to be used in safety-critical aerospace applications, they must be proven to be highly safe and reliable. Rigorous methods for adaptive software verification and validation must be developed to ensure that control system software failures will not occur. Of central importance in this regard is the need to establish reliable methods that guarantee convergent learning, rapid convergence (learning) rate, and algorithm stability. This paper presents the major problems of adaptive control systems that use learning to improve performance. The paper then presents the major procedures and tools developed to date, or currently in development, to enable the verification, validation, and ultimate certification of these adaptive control systems. These technologies include the application of automated program analysis methods, techniques to improve the learning process, analytical methods to verify stability, methods to automatically synthesize code, simulation and test methods, and tools to provide on-line software assurance.

  1. Validation of an Analytical Method for the Determination of Pesticide Residues in Vine Leaves by GC-MS/MS.

    PubMed

    Maestroni, Britt; Abu Alnaser, Amer; Ghanem, Iyad; Islam, Marivil; Cesio, Veronica; Heinzen, Horacio; Kelly, Simon; Cannavan, Andrew

    2018-06-27

    A modified and miniaturized SweEt/QuEChERS method for pesticide residue analysis applied to vine leaves is presented. The deep-frozen plant material was cryogenically processed. A 2 g analytical portion was hydrated for 30 min and extracted with acidified ethyl acetate after buffering with NaHCO3 and adding Na2SO4. A dispersive solid-phase extraction (d-SPE) cleanup step with primary-secondary amine (PSA) was performed. The pesticide residues were determined using GC-MS/MS. The whole procedure was validated for 54-59 pesticides at 0.01, 0.02, 0.1, and 0.2 mg/kg in fresh vine leaves (Vitis vinifera). The key method performance parameters investigated were specificity, linearity, trueness, within-laboratory repeatability and reproducibility, limit of detection, limit of quantitation, and matrix effects. Recoveries for the 59 pesticides tested ranged from 60 to 110%, and the RSDs were lower than 20% for the majority of the pesticides studied.
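
    A minimal sketch of the per-level acceptance check implied above: mean recovery within the 60-110% window and RSD below 20%, computed from replicate spiked samples. The replicate values are invented for illustration.

```python
import numpy as np

def validate_level(measured, spiked):
    """Mean recovery (%) and RSD (%) for one spiking level."""
    m = np.asarray(measured, dtype=float)
    recovery = 100.0 * m.mean() / spiked
    rsd = 100.0 * m.std(ddof=1) / m.mean()
    return recovery, rsd

# illustrative replicate results (mg/kg) at the 0.01 mg/kg spiking level
replicates = [0.0092, 0.0105, 0.0098, 0.0089, 0.0101]
rec, rsd = validate_level(replicates, spiked=0.01)
print(f"recovery {rec:.0f}%, RSD {rsd:.1f}%")
print("acceptable:", 60.0 <= rec <= 110.0 and rsd < 20.0)
```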

  2. Behavior analytic approaches to problem behavior in intellectual disabilities.

    PubMed

    Hagopian, Louis P; Gregory, Meagan K

    2016-03-01

    The purpose of the current review is to summarize recent behavior analytic research on problem behavior in individuals with intellectual disabilities. We have focused our review on studies published from 2013 to 2015, but also included earlier studies that were relevant. Behavior analytic research on problem behavior continues to focus on the use and refinement of functional behavioral assessment procedures and function-based interventions. During the review period, a number of studies reported on procedures aimed at making functional analysis procedures more time efficient. Behavioral interventions continue to evolve, and there were several larger scale clinical studies reporting on multiple individuals. There was increased attention on the part of behavioral researchers to develop statistical methods for analysis of within subject data and continued efforts to aggregate findings across studies through evaluative reviews and meta-analyses. Findings support continued utility of functional analysis for guiding individualized interventions and for classifying problem behavior. Modifications designed to make functional analysis more efficient relative to the standard method of functional analysis were reported; however, these require further validation. Larger scale studies on behavioral assessment and treatment procedures provided additional empirical support for effectiveness of these approaches and their sustainability outside controlled clinical settings.

  3. Analytical Method Development and Validation for the Quantification of Acetone and Isopropyl Alcohol in the Tartaric Acid Base Pellets of Dipyridamole Modified Release Capsules by Using Headspace Gas Chromatographic Technique

    PubMed Central

    2018-01-01

    A simple, sensitive, accurate, robust headspace gas chromatographic method was developed for the quantitative determination of acetone and isopropyl alcohol in tartaric acid-based pellets of dipyridamole modified release capsules. The residual solvents acetone and isopropyl alcohol were used in the manufacturing process of the tartaric acid-based pellets of dipyridamole modified release capsules, in consideration of the solubility of the dipyridamole and excipients at the different manufacturing stages. The method was developed and optimized using a fused silica DB-624 (30 m × 0.32 mm × 1.8 µm) column with flame ionization detection. The method validation was carried out in accordance with the guideline on validation of analytical procedures Q2 issued by the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH). All the validation characteristics met the acceptance criteria. Hence, the developed and validated method can be applied for the intended routine analysis. PMID:29686931

  4. Analytical Method Development and Validation for the Quantification of Acetone and Isopropyl Alcohol in the Tartaric Acid Base Pellets of Dipyridamole Modified Release Capsules by Using Headspace Gas Chromatographic Technique.

    PubMed

    Valavala, Sriram; Seelam, Nareshvarma; Tondepu, Subbaiah; Jagarlapudi, V Shanmukha Kumar; Sundarmurthy, Vivekanandan

    2018-01-01

    A simple, sensitive, accurate, robust headspace gas chromatographic method was developed for the quantitative determination of acetone and isopropyl alcohol in tartaric acid-based pellets of dipyridamole modified release capsules. The residual solvents acetone and isopropyl alcohol were used in the manufacturing process of the tartaric acid-based pellets of dipyridamole modified release capsules, in consideration of the solubility of the dipyridamole and excipients at the different manufacturing stages. The method was developed and optimized using a fused silica DB-624 (30 m × 0.32 mm × 1.8 µm) column with flame ionization detection. The method validation was carried out in accordance with the guideline on validation of analytical procedures Q2 issued by the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH). All the validation characteristics met the acceptance criteria. Hence, the developed and validated method can be applied for the intended routine analysis.

  5. Development and validation of a screening procedure to identify speech-language delay in toddlers with cleft palate.

    PubMed

    Jørgensen, Line Dahl; Willadsen, Elisabeth

    2017-01-01

    The purpose of this study was to develop and validate a clinically useful speech-language screening procedure for young children with cleft palate ± cleft lip (CP) to identify those in need of speech-language intervention. Twenty-two children with CP were assigned to need/no-need for intervention conditions based on assessment of consonant inventory using a real-time listening procedure in combination with parent-reported expressive vocabulary. These measures allowed evaluation of early speech-language skills found to correlate significantly with later speech-language performance in longitudinal studies of children with CP. The external validity of this screening procedure was evaluated by comparing the need/no-need for intervention assignment determined by the screening procedure with experienced speech-language pathologists' (SLPs') clinical judgement of whether or not a child needed early intervention. The results of the real-time listening assessment showed good to excellent inter-rater agreement on the different consonant inventory measures. Furthermore, the almost perfect agreement between the children selected for intervention by the screening procedure and the clinical judgement of experienced SLPs indicates that the screening procedure is a valid way of identifying children with CP who need early intervention.
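
    Inter-rater agreement on binary need/no-need calls is often summarized with Cohen's kappa. A minimal sketch with invented classifications for 22 children; the study's own agreement statistic may differ.

```python
from sklearn.metrics import cohen_kappa_score

# illustrative need(1)/no-need(0) calls for 22 children
screen = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 1, 0]
slp =    [1, 0, 1, 1, 0, 0, 1, 0, 1, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 1, 0]
print(f"Cohen's kappa = {cohen_kappa_score(screen, slp):.2f}")
```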

  6. Validation of an analytical method for simultaneous high-precision measurements of greenhouse gas emissions from wastewater treatment plants using a gas chromatography-barrier discharge detector system.

    PubMed

    Pascale, Raffaella; Caivano, Marianna; Buchicchio, Alessandro; Mancini, Ignazio M; Bianco, Giuliana; Caniani, Donatella

    2017-01-13

    Wastewater treatment plants (WWTPs) emit CO2 and N2O, which may contribute to climate change and global warming. Over the last few years, awareness of greenhouse gas (GHG) emissions from WWTPs has increased. Moreover, the development of valid, reliable, and high-throughput analytical methods for simultaneous gas analysis is an essential requirement for environmental applications. In the present study, an analytical method based on a gas chromatograph (GC) equipped with a barrier ionization discharge (BID) detector was developed for the first time. This new method simultaneously analyses CO2 and N2O with a precision, measured as relative standard deviation (RSD%), equal to or less than 6.6% and 5.1%, respectively. The method's detection limits are 5.3 ppmv for CO2 and 62.0 ppbv for N2O. The method's selectivity, linearity, accuracy, repeatability, intermediate precision, limit of detection and limit of quantification were good at trace concentration levels. After validation, the method was applied to a real case of N2O and CO2 emissions from a WWTP, confirming its suitability as a standard procedure for simultaneous GHG analysis in environmental samples containing CO2 levels below 12,000 mg/L. Copyright © 2016 Elsevier B.V. All rights reserved.
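
    A minimal sketch of the ICH-style estimation of detection and quantification limits from a low-level calibration line (LOD = 3.3·σ/S and LOQ = 10·σ/S, with σ the residual SD and S the slope); the calibration points are invented, not the paper's data.

```python
import numpy as np
from scipy.stats import linregress

# illustrative low-level N2O calibration: concentration (ppbv) vs. peak area
conc = np.array([0, 50, 100, 200, 400, 800], dtype=float)
area = np.array([3, 210, 405, 820, 1630, 3240], dtype=float)

fit = linregress(conc, area)
resid = area - (fit.intercept + fit.slope * conc)
sigma = resid.std(ddof=2)        # residual SD of the regression

lod = 3.3 * sigma / fit.slope    # detection limit
loq = 10.0 * sigma / fit.slope   # quantification limit
print(f"LOD = {lod:.1f} ppbv, LOQ = {loq:.1f} ppbv")
```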

  7. Validity evidence for procedural competency in virtual reality robotic simulation, establishing a credible pass/fail standard for the vaginal cuff closure procedure.

    PubMed

    Hovgaard, Lisette Hvid; Andersen, Steven Arild Wuyts; Konge, Lars; Dalsgaard, Torur; Larsen, Christian Rifbjerg

    2018-03-30

    The use of robotic surgery for minimally invasive procedures has increased considerably over the last decade. Robotic surgery has potential advantages compared to laparoscopic surgery but also requires new skills. Using virtual reality (VR) simulation to facilitate the acquisition of these new skills could potentially benefit training of robotic surgical skills and also be a crucial step in developing a robotic surgical training curriculum. The study's objective was to establish validity evidence for a simulation-based test of procedural competency for the vaginal cuff closure procedure that can be used in a future simulation-based, mastery learning training curriculum. Eleven novice gynaecological surgeons without prior robotic experience and 11 experienced gynaecological robotic surgeons (> 30 robotic procedures) were recruited. After familiarization with the VR simulator, participants completed the module 'Guided Vaginal Cuff Closure' six times. Validity evidence was investigated for 18 preselected simulator metrics. The internal consistency was assessed using Cronbach's alpha, and a composite score was calculated based on the metrics with significant discriminative ability between the two groups. Finally, a pass/fail standard was established using the contrasting groups method. The experienced surgeons significantly outperformed the novice surgeons on 6 of the 18 metrics. The internal consistency was 0.58 (Cronbach's alpha). The experienced surgeons' mean composite score across all six repetitions was significantly better than the novice surgeons' (76.1 vs. 63.0, respectively, p < 0.001). A pass/fail standard of 75/100 was established. Four novice surgeons passed this standard (false positives) and three experienced surgeons failed (false negatives). Our study has gathered validity evidence for a simulation-based test of procedural robotic surgical competency in the vaginal cuff closure procedure and established a credible pass/fail standard for future
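
    A minimal sketch of the contrasting groups method named above: assume each group's composite scores are roughly normal and take the score where the two densities intersect as the pass/fail standard. The scores below are invented, not the study's data.

```python
import numpy as np

def contrasting_groups_cutoff(novice, expert):
    """Pass/fail score where the two groups' normal densities intersect."""
    m1, s1 = np.mean(novice), np.std(novice, ddof=1)
    m2, s2 = np.mean(expert), np.std(expert, ddof=1)
    # equate the two normal pdfs and solve the resulting quadratic in x
    a = 1.0 / (2 * s1**2) - 1.0 / (2 * s2**2)
    b = m2 / s2**2 - m1 / s1**2
    c = m1**2 / (2 * s1**2) - m2**2 / (2 * s2**2) + np.log(s2 / s1)
    roots = np.roots([a, b, c])
    lo, hi = sorted((m1, m2))
    return float(roots[(roots > lo) & (roots < hi)][0])  # root between the means

# illustrative composite scores (0-100)
novices = [55, 60, 58, 67, 70, 63, 61, 72, 66, 59, 62]
experts = [74, 80, 77, 73, 82, 79, 76, 71, 84, 78, 75]
print(f"pass/fail standard: {contrasting_groups_cutoff(novices, experts):.0f}")
```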

  8. 77 FR 39895 - New Analytic Methods and Sampling Procedures for the United States National Residue Program for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-06

    ... Analytic Methods and Sampling Procedures for the United States National Residue Program for Meat, Poultry... implementing several multi-residue methods for analyzing samples of meat, poultry, and egg products for animal.... These modern, high-efficiency methods will conserve resources and provide useful and reliable results...

  9. Analytical method for the identification and assay of 12 phthalates in cosmetic products: application of the ISO 12787 international standard "Cosmetics-Analytical methods-Validation criteria for analytical results using chromatographic techniques".

    PubMed

    Gimeno, Pascal; Maggio, Annie-Françoise; Bousquet, Claudine; Quoirez, Audrey; Civade, Corinne; Bonnet, Pierre-Antoine

    2012-08-31

    Esters of phthalic acid, more commonly named phthalates, may be present in cosmetic products as ingredients or contaminants. Their presence as contaminants can be due to the manufacturing process, to the raw materials used, or to the migration of phthalates from packaging when plastic (polyvinyl chloride, PVC) is used. Eight phthalates (DBP, DEHP, BBP, DMEP, DnPP, DiPP, DPP, and DiBP), classified H360 or H361, are forbidden in cosmetics according to the European regulation on cosmetics 1223/2009. A GC/MS method was developed for the assay of 12 phthalates in cosmetics, including the eight regulated ones. Analyses are carried out on a GC/MS system in electron impact ionization mode (EI). The separation of phthalates is obtained on a cross-linked 5%-phenyl/95%-dimethylpolysiloxane capillary column, 30 m × 0.25 mm (i.d.) × 0.25 μm film thickness, using a temperature gradient. Phthalate quantification is performed by external calibration using an internal standard. Validation elements obtained on standard solutions highlight satisfactory system conformity (resolution > 1.5), a common quantification limit of 0.25 ng injected, acceptable linearity between 0.5 μg mL⁻¹ and 5.0 μg mL⁻¹, and precision and accuracy in agreement with in-house specifications. Cosmetic samples ready for analytical injection are analyzed after dilution in ethanol, whereas more complex cosmetic matrices, like milks and creams, are assayed after a liquid/liquid extraction using tert-butyl methyl ether (TBME). Depending on the type of cosmetic analyzed, the common limits of quantification for the 12 phthalates were set at 0.5 or 2.5 μg g⁻¹. All samples were assayed using the analytical approach described in the ISO 12787 international standard "Cosmetics-Analytical methods-Validation criteria for analytical results using chromatographic techniques". This analytical protocol is particularly well suited when reconstituted sample matrices cannot be prepared. Copyright © 2012
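
    External calibration with an internal standard (IS), as used here, reduces to regressing analyte/IS peak-area ratios on concentration and back-calculating the sample; a minimal sketch in which every value, the dilution volume, and the sample mass are assumptions:

      import numpy as np

      # Hypothetical calibration: analyte/IS area ratio vs. concentration (ug/mL)
      conc = np.array([0.5, 1.0, 2.0, 3.5, 5.0])
      ratio = np.array([0.24, 0.49, 1.01, 1.73, 2.48])
      slope, intercept = np.polyfit(conc, ratio, 1)

      # Quantify a sample extract, then convert to ug/g in the product
      sample_ratio = 1.32
      c_extract = (sample_ratio - intercept) / slope  # ug/mL in the vial
      dilution_mL, sample_mass_g = 10.0, 1.0          # assumed preparation
      print(f"{c_extract * dilution_mL / sample_mass_g:.2f} ug/g in product")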

  10. Irregular analytical errors in diagnostic testing - a novel concept.

    PubMed

    Vogeser, Michael; Seger, Christoph

    2018-02-23

    In laboratory medicine, routine periodic analyses for internal and external quality control, interpreted by statistical methods, are mandatory for batch clearance. Data analysis of these process-oriented measurements allows insight into random analytical variation and systematic calibration bias over time. However, in such a setting, no individual sample is under individual quality control: the quality control measurements act only at the batch level. Many effects and interferences associated with an individual diagnostic sample can compromise any analyte, and it is obvious that a quality-control-sample-based approach to quality assurance is not sensitive to such errors. To address the potential causes and nature of such analytical interference in individual samples more systematically, we suggest the introduction of a new term, the irregular (individual) analytical error. Practically, this term can be applied in any analytical assay that is traceable to a reference measurement system. For an individual sample, an irregular analytical error is defined as an inaccuracy (the deviation from a reference measurement procedure result) of a test result that is so large that it cannot be explained by the measurement uncertainty of the routine assay operating within the accepted limitations of the associated process quality control measurements. The deviation can be defined as the linear combination of the process measurement uncertainty and the method bias for the reference measurement system. Such errors should be coined irregular analytical errors of the individual sample. The measurement result is compromised either by an irregular effect associated with the individual composition (matrix) of the sample or by an individual, single-sample-associated processing error in the analytical process. Currently, the availability of reference measurement procedures is still highly limited, but LC
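
    In code, the proposed criterion reduces to a single comparison; the coverage factor k = 2 below is an assumption for illustration, not part of the authors' definition:

      def is_irregular(result, reference, u_process, bias, k=2.0):
          """Flag an irregular (individual) analytical error: the deviation
          from the reference measurement procedure result exceeds the linear
          combination of process measurement uncertainty and method bias
          (k is an assumed coverage factor)."""
          allowance = k * u_process + abs(bias)
          return abs(result - reference) > allowance

      # Hypothetical single-sample check
      print(is_irregular(result=112.0, reference=100.0, u_process=3.0, bias=2.0))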

  11. Rational quality assessment procedure for less-investigated herbal medicines: Case of a Congolese antimalarial drug with an analytical report.

    PubMed

    Tshitenge, Dieudonné Tshitenge; Ioset, Karine Ndjoko; Lami, José Nzunzu; Ndelo-di-Phanzu, Josaphat; Mufusama, Jean-Pierre Koy Sita; Bringmann, Gerhard

    2016-04-01

    Herbal medicines are the most globally used type of medical drugs. Their high cultural acceptability is due to their experienced safety and efficacy over centuries of use. Many of them are still phytochemically under-investigated and are used without standardization or quality control. Choosing SIROP KILMA, an authorized Congolese antimalarial phytomedicine, as a model case, our study describes an interdisciplinary approach for the rational quality assessment of herbal drugs in general. It combines an authentication step of the herbal remedy prior to any fingerprinting, the isolation of the major constituents, the development and validation of an HPLC-DAD analytical method with internal markers, and the application of the method to several batches of the herbal medicine (here KILMA), thus permitting the establishment of a quantitative fingerprint. From the constitutive plants of KILMA, acteoside, isoacteoside, stachannin A, and pectolinarigenin-7-O-glucoside were isolated, and acteoside was used as the prime marker for the validation of the analytical method. This study contributes to the efforts of the WHO toward the establishment of standards enabling the analytical evaluation of herbal materials. Moreover, the paper describes the first phytochemical and analytical report on a marketed Congolese phytomedicine. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. Development of an analytical procedure to study linear alkylbenzenesulphonate (LAS) degradation in sewage sludge-amended soils.

    PubMed

    Comellas, L; Portillo, J L; Vaquero, M T

    1993-12-24

    A procedure for determining linear alkylbenzenesulphonates (LASs) in sewage sludge and sludge-amended soils has been developed. Extraction by sample treatment with 0.5 M potassium hydroxide in methanol under reflux was compared with a previously described Soxhlet extraction with methanol and solid sodium hydroxide added to the sample. Repeatability results were similar, with savings in extraction time, solvents and evaporation time. A clean-up method involving a C18 cartridge has been developed. Analytes were quantified by a reversed-phase HPLC method with UV and fluorescence detectors. Recoveries obtained were higher than 84%. The resulting procedure was applied to soils amended with a high dose of sewage sludge (15%) and increasing quantities of added LASs. Degradation data for a 116-day period are presented.

  13. New test techniques and analytical procedures for understanding the behavior of advanced propellers

    NASA Technical Reports Server (NTRS)

    Stefko, G. L.; Bober, L. J.; Neumann, H. E.

    1983-01-01

    Analytical procedures and experimental techniques were developed to improve the capability to design advanced high speed propellers. Some results from the propeller lifting line and lifting surface aerodynamic analysis codes are compared with propeller force data, probe data and laser velocimeter data. In general, the code comparisons with data indicate good qualitative agreement. A rotating propeller force balance demonstrated good accuracy and reduced test time by 50 percent. Results from three propeller flow visualization techniques are shown which illustrate some of the physical phenomena occurring on these propellers.

  14. Validation of a liquid chromatography-tandem mass spectrometry method for the detection of nicotine biomarkers in hair and an evaluation of wash procedures for removal of environmental nicotine.

    PubMed

    Miller, Eleanor I; Murray, Gordon J; Rollins, Douglas E; Tiffany, Stephen T; Wilkins, Diana G

    2011-07-01

    The aim of this exploratory study was to develop and validate a liquid chromatography-tandem mass spectrometry (LC-MS-MS) method for the quantification of nicotine, eight nicotine metabolites, and two minor tobacco alkaloids in fortified analyte-free hair, and subsequently to apply this method to hair samples collected from active smokers. An additional aim of the study was to evaluate different wash procedures for the effective removal of environmentally deposited nicotine from tobacco smoke. An apparatus was designed for exposing analyte-free hair to environmental tobacco smoke in order to deposit nicotine onto the hair surface. A shampoo/water wash procedure was identified as the most effective means of removing nicotine, and was used for a comparison of washed and unwashed heavy-smoker hair samples. Analytes and corresponding deuterated internal standards were extracted using a cation-exchange solid-phase cartridge. LC-MS-MS was carried out using an Acquity™ UPLC® system (Waters) and a Quattro Premier XE™ triple quadrupole MS (Waters) operated in electrospray positive ionization mode with multiple reaction monitoring data acquisition. The developed method was applied to hair samples collected from heavy smokers (n = 3) and low-level smokers (n = 3) through IRB-approved protocols. Nicotine, cotinine, and nornicotine were quantified in both the washed and unwashed hair samples from the three heavy smokers, whereas 3-hydroxycotinine was quantified in only one unwashed sample and nicotine-1'-oxide in the washed and unwashed hair samples from two heavy smokers. In contrast, nicotine-1'-oxide was quantified in one of the three low-level smoker samples; nicotine was quantified in the other two low-level smoker samples. No other analytes were detected in the hair of the three low-level smokers.

  15. DEVELOPMENT AND VALIDATION OF ANALYTICAL METHODS FOR ENUMERATION OF FECAL INDICATORS AND EMERGING CHEMICAL CONTAMINANTS IN BIOSOLIDS

    EPA Science Inventory

    In 2002 the National Research Council (NRC) issued a report which identified a number of issues regarding biosolids land application practices and pointed out the need for improved and validated analytical techniques for regulated indicator organisms and pathogens. They also call...

  16. Differences in Coping Styles among Persons with Spinal Cord Injury: A Cluster-Analytic Approach.

    ERIC Educational Resources Information Center

    Frank, Robert G.; And Others

    1987-01-01

    Identified and validated two subgroups in group of 53 persons with spinal cord injury by applying cluster-analytic procedures to subjects' self-reported coping and health locus of control belief scores. Cluster 1 coped less effectively and tended to be psychologically distressed; Cluster 2 subjects emphasized internal health attributions and…

  17. Analytical techniques and method validation for the measurement of selected semivolatile and nonvolatile organofluorochemicals in air.

    PubMed

    Reagen, William K; Lindstrom, Kent R; Thompson, Kathy L; Flaherty, John M

    2004-09-01

    The widespread use of semi- and nonvolatile organofluorochemicals in industrial facilities, concern about their persistence, and relatively recent advancements in liquid chromatography/mass spectrometry (LC/MS) technology have led to the development of new analytical methods to assess potential worker exposure to airborne organofluorochemicals. Techniques were evaluated for the determination of 19 organofluorochemicals and for total fluorine in ambient air samples. Due to the potential biphasic nature of most of these fluorochemicals when airborne, Occupational Safety and Health Administration (OSHA) versatile sampler (OVS) tubes were used to simultaneously trap fluorochemical particulates and vapors from workplace air. Analytical methods were developed for OVS air samples to quantitatively analyze for total fluorine using oxygen bomb combustion/ion selective electrode and for 17 organofluorochemicals using LC/MS and gas chromatography/mass spectrometry (GC/MS). The experimental design for this validation was based on the National Institute of Occupational Safety and Health (NIOSH) Guidelines for Air Sampling and Analytical Method Development and Evaluation, with some revisions of the experimental design. The study design incorporated experiments to determine analytical recovery and stability, sampler capacity, the effect of some environmental parameters on recoveries, storage stability, limits of detection, precision, and accuracy. Fluorochemical mixtures were spiked onto each OVS tube over a range of 0.06-6 microg for each of 12 compounds analyzed by LC/MS and 0.3-30 microg for 5 compounds analyzed by GC/MS. These ranges allowed reliable quantitation at 0.001-0.1 mg/m3 in general for LC/MS analytes and 0.005-0.5 mg/m3 for GC/MS analytes when 60 L of air are sampled. The organofluorochemical exposure guideline (EG) is currently 0.1 mg/m3 for many analytes, with one exception being ammonium perfluorooctanoate (EG is 0.01 mg/m3). Total fluorine results may be used
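
    The quoted air concentrations follow directly from the spiked masses and the 60 L sample volume, since 1 µg/L equals 1 mg/m³; illustrative arithmetic only:

      # mg/m3 = (mass on sampler in ug) / (air volume in L), because ug/L == mg/m3
      air_volume_L = 60.0
      for spike_ug in (0.06, 6.0, 0.3, 30.0):
          print(f"{spike_ug:5.2f} ug on OVS tube -> {spike_ug / air_volume_L:.4f} mg/m3")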

  18. Multiple piezo-patch energy harvesters integrated to a thin plate with AC-DC conversion: analytical modeling and numerical validation

    NASA Astrophysics Data System (ADS)

    Aghakhani, Amirreza; Basdogan, Ipek; Erturk, Alper

    2016-04-01

    Plate-like components are widely used in numerous automotive, marine, and aerospace applications where they can be employed as host structures for vibration-based energy harvesting. Piezoelectric patch harvesters can easily be attached to these structures to convert vibrational energy into electrical energy. Power output investigations of these harvesters require accurate models for energy harvesting performance evaluation and optimization. Equivalent circuit modeling of cantilever-based vibration energy harvesters for estimation of the electrical response has been proposed in recent years. However, equivalent circuit formulation and analytical modeling of multiple piezo-patch energy harvesters integrated to thin plates, including nonlinear circuits, has not been studied. In this study, an equivalent circuit model of multiple parallel piezoelectric patch harvesters together with a resistive load is built in the electronic circuit simulation software SPICE, and voltage frequency response functions (FRFs) are validated using the analytical distributed-parameter model. An analytical formulation of the piezoelectric patches in parallel configuration for the DC voltage output is derived while the patches are connected to a standard AC-DC circuit. The analytical model is based on the equivalent load impedance approach for the piezoelectric capacitance and AC-DC circuit elements. The analytical results are validated numerically via SPICE simulations. Finally, DC power outputs of the harvesters are computed and compared with the peak power amplitudes in the AC output case.
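
    For intuition only, a single patch with a purely resistive load can be sketched as a charge source behind its clamped capacitance, which reproduces the shape of the voltage FRF that such equivalent-circuit models predict; this is a simplification of the paper's distributed model, and every parameter value below is invented:

      import numpy as np

      # Single-patch sketch: charge source theta*x behind clamped capacitance Cp,
      # loaded by resistance R -> V(w) = j*w*theta*X*R / (1 + j*w*R*Cp)
      theta, Cp, X = 1e-3, 50e-9, 1e-6    # coupling [C/m], capacitance [F], motion [m]
      freq = np.linspace(10, 2000, 500)   # Hz
      w = 2 * np.pi * freq
      for R in (1e3, 1e4, 1e5):           # resistive loads [ohm]
          V = 1j * w * theta * X * R / (1 + 1j * w * R * Cp)
          print(f"R = {R:8.0f} ohm, |V| at {freq[-1]:.0f} Hz = {abs(V[-1])*1e3:.2f} mV")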

  19. Development of analytical methodologies to assess recalcitrant pesticide bioremediation in biobeds at laboratory scale.

    PubMed

    Rivero, Anisleidy; Niell, Silvina; Cerdeiras, M Pía; Heinzen, Horacio; Cesio, María Verónica

    2016-06-01

    To assess recalcitrant pesticide bioremediation it is necessary to gradually increase the complexity of the biological system used in order to design an effective biobed assembly. Each step towards this effective biobed design needs a suitable, validated analytical methodology that allows a correct evaluation of dissipation and bioconversion. Low-recovery methods could give a false impression of a successful biodegradation process. To address this situation, different methods were developed and validated for the simultaneous determination of endosulfan, its three main metabolites, and chlorpyrifos in increasingly complex matrices where the bioconvertor basidiomycete Abortiporus biennis could grow. The matrices were culture media, bran, and finally a laboratory biomix composed of bran, peat and soil. The methodology for the analysis of the first evaluated matrix has already been reported; the methodologies developed for the other two systems are presented in this work. The target analytes were extracted from fungi growing over bran in semisolid YNB (Yeast Nitrogen Base) media with acetonitrile using shaker-assisted extraction. The salting-out step was performed with MgSO4 and NaCl, and the extracts were analyzed by GC-ECD. The best methodology was fully validated for all the evaluated analytes at 1 and 25 mg kg⁻¹, yielding recoveries between 72% and 109% and RSDs <11% in all cases. The application of this methodology proved that A. biennis is able to dissipate 94% of endosulfan and 87% of chlorpyrifos after 90 days. Having assessed that A. biennis growing over bran can metabolize the studied pesticides, the next step was the development and validation of an analytical procedure to evaluate the analytes in a laboratory-scale biobed composed of 50% bran, 25% peat and 25% soil together with fungal mycelium. Of the different procedures assayed, only ultrasound-assisted extraction with ethyl acetate allowed recoveries between 80% and 110% with RSDs

  20. Quantifying the measurement uncertainty of results from environmental analytical methods.

    PubMed

    Moser, J; Wegscheider, W; Sperka-Gottlieb, C

    2001-07-01

    The Eurachem-CITAC Guide Quantifying Uncertainty in Analytical Measurement was put into practice in a public laboratory devoted to environmental analytical measurements. In doing so due regard was given to the provisions of ISO 17025 and an attempt was made to base the entire estimation of measurement uncertainty on available data from the literature or from previously performed validation studies. Most environmental analytical procedures laid down in national or international standards are the result of cooperative efforts and put into effect as part of a compromise between all parties involved, public and private, that also encompasses environmental standards and statutory limits. Central to many procedures is the focus on the measurement of environmental effects rather than on individual chemical species. In this situation it is particularly important to understand the measurement process well enough to produce a realistic uncertainty statement. Environmental analytical methods will be examined as far as necessary, but reference will also be made to analytical methods in general and to physical measurement methods where appropriate. This paper describes ways and means of quantifying uncertainty for frequently practised methods of environmental analysis. It will be shown that operationally defined measurands are no obstacle to the estimation process as described in the Eurachem/CITAC Guide if it is accepted that the dominating component of uncertainty comes from the actual practice of the method as a reproducibility standard deviation.
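
    When the reproducibility standard deviation dominates, the Eurachem/CITAC "top-down" estimate reduces to a simple combination of components; a minimal sketch with assumed values and an assumed coverage factor of 2:

      import math

      # Top-down combined uncertainty: reproducibility SD as the dominating
      # component, plus a smaller contribution (e.g. calibration); units assumed
      s_R = 4.2                          # reproducibility standard deviation
      u_cal = 1.0                        # assumed calibration contribution
      u_c = math.sqrt(s_R**2 + u_cal**2) # combined standard uncertainty
      U = 2.0 * u_c                      # expanded uncertainty, ~95 % coverage
      print(f"u_c = {u_c:.2f}, U (k=2) = {U:.2f}")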

  1. Pulsed plane wave analytic solutions for generic shapes and the validation of Maxwell's equations solvers

    NASA Technical Reports Server (NTRS)

    Yarrow, Maurice; Vastano, John A.; Lomax, Harvard

    1992-01-01

    Generic shapes are subjected to pulsed plane waves of arbitrary shape. The resulting scattered electromagnetic fields are determined analytically. These fields are then computed efficiently at field locations for which numerically determined EM fields are required. Of particular interest are the pulsed waveform shapes typically utilized by radar systems. The results can be used to validate the accuracy of finite difference time domain Maxwell's equations solvers. A two-dimensional solver which is second- and fourth-order accurate in space and fourth-order accurate in time is examined. Dielectric media properties are modeled by a ramping technique which simplifies the associated gridding of body shapes. The attributes of the ramping technique are evaluated by comparison with the analytic solutions.

  2. Measuring myokines with cardiovascular functions: pre-analytical variables affecting the analytical output.

    PubMed

    Lombardi, Giovanni; Sansoni, Veronica; Banfi, Giuseppe

    2017-08-01

    In the last few years, a growing number of molecules have been associated with an endocrine function of skeletal muscle. Circulating myokine levels, in turn, have been associated with several pathophysiological conditions, including cardiovascular ones. However, data from different studies are often not completely comparable or even discordant. This is due, at least in part, to the whole set of circumstances related to the preparation of the patient prior to blood sampling, the blood sampling procedure itself, and sample processing and/or storage. This entire process constitutes the pre-analytical phase. The importance of the pre-analytical phase is often not considered; however, in routine diagnostics, about 70% of errors arise in this phase. Moreover, errors made during the pre-analytical phase are carried over into the analytical phase and affect the final output. In research, for example, when samples are collected over a long time and by different laboratories, a standardized procedure for sample collection and a correct procedure for sample storage are essential. In this review, we discuss the pre-analytical variables potentially affecting the measurement of myokines with cardiovascular functions.

  3. Fluorescence In Situ Hybridization Probe Validation for Clinical Use.

    PubMed

    Gu, Jun; Smith, Janice L; Dowling, Patricia K

    2017-01-01

    In this chapter, we provide a systematic overview of the published guidelines and validation procedures for fluorescence in situ hybridization (FISH) probes for clinical diagnostic use. FISH probes-which are classified as molecular probes or analyte-specific reagents (ASRs)-have been extensively used in vitro for both clinical diagnosis and research. Most commercially available FISH probes in the United States are strictly regulated by the U.S. Food and Drug Administration (FDA), the Centers for Disease Control and Prevention (CDC), the Centers for Medicare & Medicaid Services (CMS) through the Clinical Laboratory Improvement Amendments (CLIA), and the College of American Pathologists (CAP). Although home-brewed FISH probes-defined as probes made in-house or acquired from a source that does not supply them to other laboratories-are not regulated by these agencies, they too must undergo the same individual validation process prior to clinical use as their commercial counterparts. Validation of a FISH probe involves initial validation and ongoing verification of the test system. Initial validation includes assessment of a probe's technical specifications, establishment of its standard operating procedure (SOP), determination of its clinical sensitivity and specificity, development of its cutoff, baseline, and normal reference ranges, gathering of analytics, confirmation of its applicability to a specific research or clinical setting, testing of samples with and without the abnormalities that the probe is meant to detect, staff training, and report building. Ongoing verification of the test system involves testing additional normal and abnormal samples using the same method employed during the initial validation of the probe.

  4. Validation of a New Procedure for Impedance Eduction in Flow

    NASA Technical Reports Server (NTRS)

    Watson, W. R.; Jones, M. G.

    2010-01-01

    A new impedance eduction procedure is validated by comparing the educed impedance spectrum to that of an older but well-tested eduction procedure. The older procedure requires the installation of a microphone array in the liner test section; the new procedure removes this requirement. A 12.7-mm stainless steel plate and a conventional liner consisting of a perforated plate bonded to a honeycomb core are tested. Test data are acquired from a grazing-flow impedance tube facility for a range of source frequencies and mean flow Mach numbers for which only plane waves are cut on. For the stainless steel plate, the educed admittance spectrum using the new procedure shows an improvement over that of the old procedure, primarily in the educed conductance spectrum. Both eduction procedures show discrepancies in educed admittance in the mid-frequency range. Indications are that this discrepancy is triggered by an inconsistency between the measured eduction data (which contain boundary layer effects) and the two eduction models (for which the boundary layer is neglected). For the conventional liner, both eduction procedures are in very good agreement with each other. Small discrepancies occur for one or two frequencies in the mid-frequency range and for frequencies beyond the cut-on frequency of higher-order duct modes. The discrepancy in the mid-frequency range occurs because an automated optimizer is used to educe the impedance and the objective function used by the optimizer is extremely flat and therefore sensitive to initial starting values. The discrepancies at frequencies beyond the cut-on frequency of higher-order duct modes are due to the assumption of only plane waves in the impedance eduction model, although higher-order modes are propagating in the impedance tube facility.

  5. Assessing Attachment Security With the Attachment Q Sort: Meta-Analytic Evidence for the Validity of the Observer AQS

    ERIC Educational Resources Information Center

    van IJzendoorn, Marinus H.; Vereijken, Carolus M. J. L.; Bakermans-Kranenburg, Marian J.; Riksen-Walraven, Marianne J.

    2004-01-01

    The reliability and validity of the Attachment Q Sort (AQS; Waters & Deane, 1985) was tested in a series of meta-analyses on 139 studies with 13,835 children. The observer AQS security score showed convergent validity with Strange Situation procedure (SSP) security (r = .31) and excellent predictive validity with sensitivity measures (r = .39). Its…

  6. Validation of Student and Parent Reported Data on the Basic Grant Application Form, 1978-79 Comprehensive Validation Guide. Procedural Manual for: Validation of Cases Referred by Institutions; Validation of Cases Referred by the Office of Education; Recovery of Overpayments.

    ERIC Educational Resources Information Center

    Smith, Karen; And Others

    Procedures for validating data reported by students and parents on an application for Basic Educational Opportunity Grants were developed in 1978 for the U.S. Office of Education (OE). Validation activities include: validation of flagged Student Eligibility Reports (SERs) for students whose schools are part of the Alternate Disbursement System;…

  7. Standard operating procedures for pre-analytical handling of blood and urine for metabolomic studies and biobanks.

    PubMed

    Bernini, Patrizia; Bertini, Ivano; Luchinat, Claudio; Nincheri, Paola; Staderini, Samuele; Turano, Paola

    2011-04-01

    ¹H NMR metabolic profiling of urine, serum and plasma has been used to monitor the impact of pre-analytical steps on sample quality and stability, in order to propose standard operating procedures (SOPs) for deposition in biobanks. We analyzed the quality of serum and plasma samples as a function of the time elapsed (t = 0-4 h) between blood collection and processing and of the time from processing to freezing (up to 24 h). The stability of the urine metabolic profile over time (up to 24 h) at various storage temperatures was monitored as a function of different pre-analytical treatments, such as pre-storage centrifugation, filtration, and addition of the bacteriostatic preservative sodium azide. Appreciable changes in the profiles, reflecting changes in the concentration of a number of metabolites, were detected and are discussed in terms of chemical and enzymatic reactions for both blood and urine samples. Appropriate procedures for blood derivative collection and urine preservation/storage that largely preserve the original metabolic profile of the fresh samples emerge, and are proposed as SOPs for biobanking.

  8. Semi-physiologic model validation and bioequivalence trials simulation to select the best analyte for acetylsalicylic acid.

    PubMed

    Cuesta-Gragera, Ana; Navarro-Fontestad, Carmen; Mangas-Sanjuan, Victor; González-Álvarez, Isabel; García-Arieta, Alfredo; Trocóniz, Iñaki F; Casabó, Vicente G; Bermejo, Marival

    2015-07-10

    The objective of this paper is to apply a previously developed semi-physiologic pharmacokinetic model implemented in NONMEM to simulate bioequivalence (BE) trials of acetylsalicylic acid (ASA) in order to validate the model performance against ASA human experimental data. ASA is a drug with first-pass hepatic and intestinal metabolism following Michaelis-Menten kinetics that leads to the formation of two main metabolites in two generations (first- and second-generation metabolites). The first aim was to adapt the semi-physiological model for ASA in NONMEM, using ASA pharmacokinetic parameters from the literature and reflecting its sequential metabolism. The second aim was to validate this model by comparing the results obtained in NONMEM simulations with published experimental data at a dose of 1000 mg. The validated model was used to simulate bioequivalence trials at three dose schemes (100, 1000 and 3000 mg) and with six test formulations with decreasing in vivo dissolution rate constants versus the reference formulation (kD 8-0.25 h⁻¹). Finally, the third aim was to determine which analyte (parent drug, first-generation or second-generation metabolite) was more sensitive to changes in formulation performance. The validation results showed that the concentration-time curves obtained with the simulations reproduced the published experimental data closely, confirming model performance. The parent drug (ASA) was the analyte most sensitive to the decrease in pharmaceutical quality, showing the largest decrease in Cmax and AUC ratios between test and reference formulations. Copyright © 2015 Elsevier B.V. All rights reserved.
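
    The bioequivalence metrics compared across formulations reduce to Cmax and the trapezoidal AUC; a minimal sketch with invented concentration-time data, not the simulated ASA profiles:

      import numpy as np

      # Hypothetical concentration-time profiles for test and reference products
      t = np.array([0, 0.5, 1, 2, 4, 6, 8.0])             # h
      ref = np.array([0, 8.2, 12.5, 9.1, 4.0, 1.5, 0.5])  # ug/mL
      test = np.array([0, 6.0, 10.1, 8.5, 3.8, 1.4, 0.5])

      def metrics(c):
          return c.max(), np.trapz(c, t)  # Cmax and AUC (linear trapezoidal)

      cmax_r, auc_r = metrics(ref)
      cmax_t, auc_t = metrics(test)
      print(f"Cmax ratio = {cmax_t/cmax_r:.2f}, AUC ratio = {auc_t/auc_r:.2f}")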

  9. Analytical Procedures for Testability.

    DTIC Science & Technology

    1983-01-01

    Beat Internal Classifications", AD: A018516. "A System of Computer Aided Diagnosis with Blood Serum Chemistry Tests and Bayesian Statistics", AD: 786284...6 LIST OF TALS .. 1. Truth Table ......................................... 49 2. Covering Problem .............................. 93 3. Primary and...quential classification procedure in a coronary care ward is evaluated. In the toxicology field "A System of Computer Aided Diagnosis with Blood Serum

  10. Technical Note: Procedure for the calibration and validation of kilo-voltage cone-beam CT models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vilches-Freixas, Gloria; Létang, Jean Michel; Rit,

    2016-09-15

    Purpose: The aim of this work is to propose a general and simple procedure for the calibration and validation of kilo-voltage cone-beam CT (kV CBCT) models against experimental data. Methods: The calibration and validation of the CT model is a two-step procedure: the source model then the detector model. The source is described by the direction dependent photon energy spectrum at each voltage while the detector is described by the pixel intensity value as a function of the direction and the energy of incident photons. The measurements for the source consist of a series of dose measurements in air performed at each voltage with varying filter thicknesses and materials in front of the x-ray tube. The measurements for the detector are acquisitions of projection images using the same filters and several tube voltages. The proposed procedure has been applied to calibrate and assess the accuracy of simple models of the source and the detector of three commercial kV CBCT units. If the CBCT system models had been calibrated differently, the current procedure would have been exclusively used to validate the models. Several high-purity attenuation filters of aluminum, copper, and silver combined with a dosimeter which is sensitive to the range of voltages of interest were used. A sensitivity analysis of the model has also been conducted for each parameter of the source and the detector models. Results: Average deviations between experimental and theoretical dose values are below 1.5% after calibration for the three x-ray sources. The predicted energy deposited in the detector agrees with experimental data within 4% for all imaging systems. Conclusions: The authors developed and applied an experimental procedure to calibrate and validate any model of the source and the detector of a CBCT unit. The present protocol has been successfully applied to three x-ray imaging systems. The minimum requirements in terms of material and equipment would make its implementation
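
    One ingredient of such a source calibration is recovering an effective attenuation coefficient from the filtered dose series; a deliberately simplified, monoenergetic sketch (the full procedure fits a spectrum, and these readings are invented):

      import numpy as np

      # Hypothetical relative dose readings behind aluminum filters of growing
      # thickness; for a single effective energy, dose ~ I0 * exp(-mu * t)
      thickness_mm = np.array([0, 1, 2, 4, 6, 8.0])
      dose = np.array([100, 74, 56, 33, 20, 12.5])

      slope, log_I0 = np.polyfit(thickness_mm, np.log(dose), 1)
      mu_eff = -slope  # effective attenuation coefficient per mm of Al
      print(f"effective mu = {mu_eff:.3f} /mm, fitted I0 = {np.exp(log_I0):.1f}")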

  11. Simultaneous grouping and ranking with combination of SOM and TOPSIS for selection of preferable analytical procedure for furan determination in food.

    PubMed

    Jędrkiewicz, Renata; Tsakovski, Stefan; Lavenu, Aurore; Namieśnik, Jacek; Tobiszewski, Marek

    2018-02-01

    A novel methodology for grouping and ranking with application of self-organizing maps and multicriteria decision analysis is presented. The dataset consists of 22 objects, analytical procedures applied to furan determination in food samples, described by 10 variables relating to their analytical performance and environmental and economic aspects. Multivariate statistical analysis allows the amount of input data for the ranking analysis to be limited. The assessment results show that the most beneficial procedures are based on microextraction techniques with GC-MS final determination. We show how the information obtained from the two tools complements each other, and discuss the applicability of combining grouping and ranking. Copyright © 2017 Elsevier B.V. All rights reserved.
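
    The ranking half of the methodology (TOPSIS) is compact enough to sketch; the decision matrix, criteria directions, and weights below are invented stand-ins, not the paper's 22-procedure dataset:

      import numpy as np

      # Minimal TOPSIS sketch: 4 hypothetical procedures x 3 criteria
      # (e.g. LOD, solvent use, recovery); True marks benefit criteria
      X = np.array([[0.5, 10.0, 95.0],
                    [1.0,  5.0, 90.0],
                    [0.2, 20.0, 99.0],
                    [0.8,  8.0, 85.0]])
      benefit = np.array([False, False, True])  # lower is better except recovery
      w = np.array([0.4, 0.3, 0.3])             # assumed criterion weights

      R = X / np.sqrt((X**2).sum(axis=0))       # vector normalization
      V = R * w
      ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
      anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
      d_pos = np.sqrt(((V - ideal)**2).sum(axis=1))
      d_neg = np.sqrt(((V - anti)**2).sum(axis=1))
      closeness = d_neg / (d_pos + d_neg)       # 1 = ideal, 0 = anti-ideal
      print("ranking (best first):", np.argsort(-closeness))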

  12. Is Earth-based scaling a valid procedure for calculating heat flows for Mars?

    NASA Astrophysics Data System (ADS)

    Ruiz, Javier; Williams, Jean-Pierre; Dohm, James M.; Fernández, Carlos; López, Valle

    2013-09-01

    Heat flow is a very important parameter for constraining the thermal evolution of a planetary body. Several procedures for calculating heat flows for Mars from geophysical or geological proxies have been used, which are valid for the time when the structures used as indicators were formed. The more common procedures are based on estimates of lithospheric strength (the effective elastic thickness of the lithosphere or the depth to the brittle-ductile transition). On the other hand, several works by Kargel and co-workers have estimated martian heat flows by scaling the present-day terrestrial heat flow to Mars, but the values obtained in this way are much higher than those deduced from lithospheric strength. In order to explain the discrepancy, a recent paper by Rodriguez et al. (Rodriguez, J.A.P., Kargel, J.S., Tanaka, K.L., Crown, D.A., Berman, D.C., Fairén, A.G., Baker, V.R., Furfaro, R., Candelaria, P., Sasaki, S. [2011]. Icarus 213, 150-194) criticized the heat flow calculations for ancient Mars presented by Ruiz et al. (Ruiz, J., Williams, J.-P., Dohm, J.M., Fernández, C., López, V. [2009]. Icarus 207, 631-637) and other studies calculating ancient martian heat flows from lithospheric strength estimates, and cast doubt on the validity of the results obtained by these works. Here, however, we demonstrate that the discrepancy is due to computational and conceptual errors made by Kargel and co-workers, and we conclude that scaling from terrestrial heat flow values is not a valid procedure for estimating reliable heat flows for Mars.

  13. Validity of a semantically cued recall procedure for the mini-mental state examination.

    PubMed

    Yuspeh, R L; Vanderploeg, R D; Kershaw, D A

    1998-10-01

    The validity of supplementing the three-item recall portion of the Mini-Mental State Examination (MMSE) with a cued recall procedure, to help specify the nature of patients' memory problems, was examined. Subjects were 247 individuals representing three diagnostic groups: Alzheimer's disease (AD), subcortical vascular ischemic dementia (SVaD), and normal controls. Individuals were administered a battery of neuropsychological tests, including the MMSE, as part of a comprehensive evaluation for the presence of dementia or other neurologic disorder. MMSE performance differed among groups. Three-item free recall performance also differed among groups, with post hoc analyses revealing that the AD and SVaD groups were more impaired than controls but did not differ significantly from each other. Following cued recall of the three MMSE items, the groups again differed, with post hoc analyses showing that AD patients failed to benefit from cues, whereas SVaD patients performed significantly better, comparably to control subjects. Significant correlations between MMSE three-item cued recall performance and other memory measures demonstrated concurrent validity. Consistent with previous research indicating that SVaD is associated with memory encoding and retrieval deficits, whereas AD is associated with consolidation and storage problems, the present study supported the validity of the cued recall procedure for the three MMSE items in helping to distinguish between patients with AD and those with a vascular dementia with primarily subcortical pathology. Despite these findings, a more extensive battery of neuropsychological measures is still recommended to consistently assess subtle diagnostic differences in these memory processes.

  14. Analytical Performance Modeling and Validation of Intel’s Xeon Phi Architecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chunduri, Sudheer; Balaprakash, Prasanna; Morozov, Vitali

    Modeling the performance of scientific applications on emerging hardware plays a central role in achieving extreme-scale computing goals. Analytical models that capture the interaction between applications and hardware characteristics are attractive because even a reasonably accurate model can be useful for performance tuning before the hardware is made available. In this paper, we develop a hardware model for Intel’s second-generation Xeon Phi architecture code-named Knights Landing (KNL) for the SKOPE framework. We validate the KNL hardware model by projecting the performance of mini-benchmarks and application kernels. The results show that our KNL model can project the performance with prediction errors of 10% to 20%. The hardware model also provides informative recommendations for code transformations and tuning.

  15. Analytical validation of a novel multiplex test for detection of advanced adenoma and colorectal cancer in symptomatic patients.

    PubMed

    Dillon, Roslyn; Croner, Lisa J; Bucci, John; Kairs, Stefanie N; You, Jia; Beasley, Sharon; Blimline, Mark; Carino, Rochele B; Chan, Vicky C; Cuevas, Danissa; Diggs, Jeff; Jennings, Megan; Levy, Jacob; Mina, Ginger; Yee, Alvin; Wilcox, Bruce

    2018-05-30

    Early detection of colorectal cancer (CRC) is key to reducing associated mortality. Despite the importance of early detection, approximately 40% of individuals in the United States between the ages of 50 and 75 have never been screened for CRC. The low compliance with colonoscopy and fecal-based screening may be addressed with a non-invasive alternative such as a blood-based test. We describe here the analytical validation of a multiplexed blood-based assay that measures the plasma concentrations of 15 proteins to assess advanced adenoma (AA) and CRC risk in symptomatic patients. The test was developed on an electrochemiluminescent immunoassay platform employing four multi-marker panels, to be implemented in the clinic as a laboratory developed test (LDT). Under the Clinical Laboratory Improvement Amendments (CLIA) and College of American Pathologists (CAP) regulations, a United States-based clinical laboratory utilizing an LDT must establish performance characteristics relating to analytical validity prior to releasing patient test results. This report describes a series of studies demonstrating the precision, accuracy, analytical sensitivity, and analytical specificity for each of the 15 assays, as required by CLIA/CAP. In addition, the report describes studies characterizing each assay's dynamic range, parallelism, tolerance to common interfering substances, spike recovery, and stability to sample freeze-thaw cycles. Upon completion of the analytical characterization, a clinical accuracy study was performed to evaluate concordance of AA and CRC classifier model calls using the analytical method intended for use in the clinic. Of 434 symptomatic patient samples tested, the percent agreement with the original CRC and AA calls was 87% and 92%, respectively. All studies followed CLSI guidelines and met the regulatory requirements for implementation of a new LDT. The results provide the analytical evidence to support the implementation of the novel multi-marker test as

  16. 40 CFR Appendix D to Part 63 - Alternative Validation Procedure for EPA Waste and Wastewater Methods

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 14 2010-07-01 2010-07-01 false Alternative Validation Procedure for EPA Waste and Wastewater Methods D Appendix D to Part 63 Protection of Environment ENVIRONMENTAL... POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) Pt. 63, App. D Appendix D to Part 63—Alternative Validation...

  17. 40 CFR Appendix D to Part 63 - Alternative Validation Procedure for EPA Waste and Wastewater Methods

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 14 2011-07-01 2011-07-01 false Alternative Validation Procedure for EPA Waste and Wastewater Methods D Appendix D to Part 63 Protection of Environment ENVIRONMENTAL... POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) Pt. 63, App. D Appendix D to Part 63—Alternative Validation...

  18. 40 CFR Appendix D to Part 63 - Alternative Validation Procedure for EPA Waste and Wastewater Methods

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 15 2014-07-01 2014-07-01 false Alternative Validation Procedure for EPA Waste and Wastewater Methods D Appendix D to Part 63 Protection of Environment ENVIRONMENTAL... POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) Pt. 63, App. D Appendix D to Part 63—Alternative Validation...

  19. 40 CFR Appendix D to Part 63 - Alternative Validation Procedure for EPA Waste and Wastewater Methods

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 15 2013-07-01 2013-07-01 false Alternative Validation Procedure for EPA Waste and Wastewater Methods D Appendix D to Part 63 Protection of Environment ENVIRONMENTAL... POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) Pt. 63, App. D Appendix D to Part 63—Alternative Validation...

  20. 40 CFR Appendix D to Part 63 - Alternative Validation Procedure for EPA Waste and Wastewater Methods

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 15 2012-07-01 2012-07-01 false Alternative Validation Procedure for EPA Waste and Wastewater Methods D Appendix D to Part 63 Protection of Environment ENVIRONMENTAL... POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) Pt. 63, App. D Appendix D to Part 63—Alternative Validation...

  1. NCI-FDA Interagency Oncology Task Force Workshop Provides Guidance for Analytical Validation of Protein-based Multiplex Assays | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    An NCI-FDA Interagency Oncology Task Force (IOTF) Molecular Diagnostics Workshop was held on October 30, 2008 in Cambridge, MA, to discuss requirements for analytical validation of protein-based multiplex technologies in the context of their intended use. This workshop, developed through NCI's Clinical Proteomic Technologies for Cancer initiative and the FDA, focused on technology-specific analytical validation processes to be addressed prior to use in clinical settings. Making this workshop unique, a case study approach was used to discuss issues related to

  2. Analysis procedures and subjective flight results of a simulator validation and cue fidelity experiment

    NASA Technical Reports Server (NTRS)

    Carr, Peter C.; Mckissick, Burnell T.

    1988-01-01

    A joint experiment to investigate simulator validation and cue fidelity was conducted by the Dryden Flight Research Facility of NASA Ames Research Center (Ames-Dryden) and NASA Langley Research Center. The primary objective was to validate the use of a closed-loop pilot-vehicle mathematical model as an analytical tool for optimizing the tradeoff between simulator fidelity requirements and simulator cost. The validation process includes comparing model predictions with simulation and flight test results to evaluate various hypotheses for differences in motion and visual cues and information transfer. A group of five pilots flew air-to-air tracking maneuvers in the Langley differential maneuvering simulator and visual motion simulator and in an F-14 aircraft at Ames-Dryden. The simulators used motion and visual cueing devices including a g-seat, a helmet loader, wide field-of-view horizon, and a motion base platform.

  3. Characterization of the olfactory impact around a wastewater treatment plant: optimization and validation of a hydrogen sulfide determination procedure based on passive diffusion sampling.

    PubMed

    Colomer, Fernando Llavador; Espinós-Morató, Héctor; Iglesias, Enrique Mantilla; Pérez, Tatiana Gómez; Campos-Candel, Andreu; Lozano, Caterina Coll

    2012-08-01

    A monitoring program based on an indirect method was conducted to assess the approximate olfactory impact of several wastewater treatment plants (in the present work, only one is shown). The method uses H2S passive sampling with Palmes-type diffusion tubes impregnated with silver nitrate, and fluorometric analysis employing fluorescein mercuric acetate. The analytical procedure was validated in an exposure chamber. Exposure periods of at least 4 days are recommended. The quantification limit of the procedure is 0.61 ppb for a 5-day sampling, which allows the H2S immission (ground concentration) level to be measured within its low odor threshold, from 0.5 to 300 ppb. Experimental results suggest an exposure time greater than 4 days, while the recovery efficiency of the procedure, 93.0 ± 1.8%, appears not to depend on the amount of H2S collected by the samplers within their application range. The repeatability, expressed as relative standard deviation, is lower than 7%, which is within the limits normally accepted for this type of sampler. Statistical comparison showed that this procedure and the reference method provide analogous accuracy. The proposed procedure was applied in two experimental campaigns, one intensive and the other extensive, and concentrations within the H2S low odor threshold were quantified at each sampling point. From these results, it can be concluded that the procedure shows good potential for monitoring the olfactory impact around facilities where H2S emissions are dominant.

  4. Analytical Method Validation of High-Performance Liquid Chromatography and Stability-Indicating Study of Medroxyprogesterone Acetate Intravaginal Sponges

    PubMed Central

    Batrawi, Nidal; Wahdan, Shorouq; Abualhasan, Murad

    2017-01-01

    Medroxyprogesterone acetate is widely used in veterinary medicine as intravaginal dosage for the synchronization of breeding cycle in ewes and goats. The main goal of this study was to develop reverse-phase high-performance liquid chromatography method for the quantification of medroxyprogesterone acetate in veterinary vaginal sponges. A single high-performance liquid chromatography/UV isocratic run was used for the analytical assay of the active ingredient medroxyprogesterone. The chromatographic system consisted of a reverse-phase C18 column as the stationary phase and a mixture of 60% acetonitrile and 40% potassium dihydrogen phosphate buffer as the mobile phase; the pH was adjusted to 5.6. The method was validated according to the International Council for Harmonisation (ICH) guidelines. Forced degradation studies were also performed to evaluate the stability-indicating properties and specificity of the method. Medroxyprogesterone was eluted at 5.9 minutes. The linearity of the method was confirmed in the range of 0.0576 to 0.1134 mg/mL (R2 > 0.999). The limit of quantification was shown to be 3.9 µg/mL. Precision and accuracy ranges were found to be %RSD <0.2 and 98% to 102%, respectively. Medroxyprogesterone capacity factor value of 2.1, tailing factor value of 1.03, and resolution value of 3.9 were obtained in accordance with ICH guidelines. Based on the obtained results, a rapid, precise, accurate, sensitive, and cost-effective analysis procedure was proposed for quantitative determination of medroxyprogesterone in vaginal sponges. This analytical method is the only available method to analyse medroxyprogesterone in veterinary intravaginal dosage form. PMID:28469407
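
    The reported suitability figures follow from the standard chromatographic definitions; in the sketch below, the void time and peak widths are back-calculated guesses chosen only to reproduce the quoted values, not measurements from the paper:

      # System-suitability arithmetic (standard USP/ICH definitions)
      t0, tr = 1.9, 5.9                 # assumed void time and retention time (min)
      k = (tr - t0) / t0                # capacity factor -> ~2.1
      w05, f = 0.21, 0.102              # assumed width at 5% height and front half-width
      T = w05 / (2 * f)                 # tailing factor -> ~1.03
      tr1, tr2, w1, w2 = 4.7, 5.9, 0.30, 0.31
      Rs = 2 * (tr2 - tr1) / (w1 + w2)  # resolution vs. nearest peak -> ~3.9
      print(f"k = {k:.1f}, T = {T:.2f}, Rs = {Rs:.1f}")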

  5. CTEPP STANDARD OPERATING PROCEDURE FOR PREPARATION OF SURROGATE RECOVERY STANDARD AND INTERNAL STANDARD SOLUTIONS FOR NEUTRAL TARGET ANALYTES (SOP-5.25)

    EPA Science Inventory

    This standard operating procedure describes the method used for preparing internal standard, surrogate recovery standard and calibration standard solutions for neutral analytes used for gas chromatography/mass spectrometry analysis.

  6. Analysis of Thiodiglycol: Validation of Semi-Volatile Analysis by HPLC-MS/MS by EPA Method MS777

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owens, J; Koester, C

    The Environmental Protection Agency's (EPA) Region 5 Chicago Regional Laboratory (CRL) developed a method for the analysis of thiodiglycol, the breakdown product of the sulfur mustard HD, in water by high performance liquid chromatography tandem mass spectrometry (HPLC-MS/MS), titled Method EPA MS777 (hereafter referred to as EPA CRL SOP MS777). This draft standard operating procedure (SOP) was distributed to multiple EPA laboratories and to Lawrence Livermore National Laboratory, which was tasked to serve as a reference laboratory for EPA's Environmental Reference Laboratory Network (ERLN) and to develop and validate analytical procedures. The primary objective of this study was to verifymore » the analytical procedures described in MS777 for analysis of thiodiglycol in aqueous samples. The gathered data from this study will be used to: (1) demonstrate analytical method performance; (2) generate quality control acceptance criteria; and (3) revise the SOP to provide a validated method that would be available for use during a homeland security event. The data contained in this report will be compiled, by EPA CRL, with data generated by other EPA Regional laboratories so that performance metrics of Method EPA MS777 can be determined.« less

  7. Using formal methods for content validation of medical procedure documents.

    PubMed

    Cota, Érika; Ribeiro, Leila; Bezerra, Jonas Santos; Costa, Andrei; da Silva, Rosiana Estefane; Cota, Gláucia

    2017-08-01

    We propose the use of a formal approach to support content validation of a standard operating procedure (SOP) for a therapeutic intervention. Such an approach provides a useful tool to identify ambiguities, omissions and inconsistencies, and improves the applicability and efficacy of documents in the health settings. We apply and evaluate a methodology originally proposed for the verification of software specification documents to a specific SOP. The verification methodology uses the graph formalism to model the document. Semi-automatic analysis identifies possible problems in the model and in the original document. The verification is an iterative process that identifies possible faults in the original text that should be revised by its authors and/or specialists. The proposed method was able to identify 23 possible issues in the original document (ambiguities, omissions, redundant information, and inaccuracies, among others). The formal verification process aided the specialists to consider a wider range of usage scenarios and to identify which instructions form the kernel of the proposed SOP and which ones represent additional or required knowledge that are mandatory for the correct application of the medical document. By using the proposed verification process, a simpler and yet more complete SOP could be produced. As consequence, during the validation process the experts received a more mature document and could focus on the technical aspects of the procedure itself. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. CTEPP STANDARD OPERATING PROCEDURE FOR DETECTION AND QUANTIFICATION OF TARGET ANALYTES BY GAS CHROMATOGRAPHY/MASS SPECTROMETRY (GC/MS) (SOP-5.24)

    EPA Science Inventory

    This standard operating procedure describes the method used for the determination of target analytes in sample extracts and related quality assurance/quality control sample extracts generated in the CTEPP study.

  9. Experimental validation of an analytical kinetic model for edge-localized modes in JET-ITER-like wall

    NASA Astrophysics Data System (ADS)

    Guillemaut, C.; Metzger, C.; Moulton, D.; Heinola, K.; O’Mullane, M.; Balboa, I.; Boom, J.; Matthews, G. F.; Silburn, S.; Solano, E. R.; JET contributors

    2018-06-01

    The design and operation of future fusion devices relying on H-mode plasmas requires reliable modelling of edge-localized modes (ELMs) for precise prediction of divertor target conditions. An extensive experimental validation of simple analytical predictions of the time evolution of target plasma loads during ELMs has been carried out here in more than 70 JET-ITER-like wall H-mode experiments with a wide range of conditions. Comparisons of these analytical predictions with diagnostic measurements of target ion flux density, power density, impact energy and electron temperature during ELMs are presented in this paper and show excellent agreement. The analytical predictions tested here are made with the ‘free-streaming’ kinetic model (FSM) which describes ELMs as a quasi-neutral plasma bunch expanding along the magnetic field lines into the Scrape-Off Layer without collisions. Consequences of the FSM on energy reflection and deposition on divertor targets during ELMs are also discussed.
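
    For intuition, the free-streaming picture can be reduced to a one-line pulse shape: a particle arriving at a target a connection length L away at time t must have left the crash with parallel speed v = L/t, and weighting arrivals by a Maxwellian gives the characteristic ELM wash. This is my own simplification with invented parameters, not the paper's full FSM formulation:

      import numpy as np

      # Rough free-streaming sketch: an ELM releases an areal density N of
      # particles at t = 0 with Maxwellian parallel-velocity spread v_th;
      # ballistic transit to a target at connection length L means arrivals
      # at time t had speed v = L/t, so flux(t) = N * g(L/t) * L / t**2.
      N, L, v_th = 1e19, 30.0, 3e5       # m^-2, m, m/s (all assumed)
      t = np.linspace(5e-6, 1e-3, 2000)  # s
      v = L / t
      flux = N * L / (np.sqrt(2 * np.pi) * v_th * t**2) * np.exp(-v**2 / (2 * v_th**2))
      print(f"target flux peaks {t[flux.argmax()] * 1e6:.0f} us after the crash")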

  10. Analytical method validation to evaluate dithiocarbamates degradation in biobeds in South of Brazil.

    PubMed

    Vareli, Catiucia S; Pizzutti, Ionara R; Gebler, Luciano; Cardoso, Carmem D; Gai, Daniela S H; Fontana, Marlos E Z

    2018-07-01

    In order to evaluate the efficiency of biobeds for DTC degradation, the aim of this study was to apply, optimize and validate a method to determine dithiocarbamate (mancozeb) in biobeds using gas chromatography-tandem mass spectrometry (GC-MS). The DTC pesticide mancozeb was hydrolysed in a tin(II) chloride solution at 1.5% in HCl (4 mol L⁻¹) for 1 h in a water bath at 80 °C, and the CS2 formed was extracted into isooctane. After cooling, 1 mL of the organic layer was transferred to an autosampler vial and analyzed by GC-MS. A complete validation study was performed and the following parameters were assessed: linearity of the analytical curve (r²), estimated method and instrument limits of detection and quantification (LODm, LODi, LOQm and LOQi, respectively), accuracy (recovery%), precision (RSD%) and matrix effects. Recovery experiments were carried out with a standard spiking solution of the DTC pesticide thiram. Blank biobed (biomixture) samples were spiked at three levels corresponding to CS2 concentrations of 1, 3 and 5 mg kg⁻¹, with seven replicates each (n = 7). The method presented satisfactory accuracy, with recoveries within the range of 89-96% and RSD ≤ 11%. The analytical curves were linear in the concentration range of 0.05-10 µg CS2 mL⁻¹ (r² > 0.9946). LODm and LOQm were 0.1 and 0.5 mg CS2 kg⁻¹, respectively, and the calculated matrix effects were not significant (≤ 20%). The validated method was applied to 80 biomixture samples from sixteen different biobeds (collected at five sampling times) over fourteen months. Ten percent of the samples presented CS2 concentrations below the LOD (0.1 mg CS2 kg⁻¹) and 49% of them showed results below the LOQ (0.5 mg CS2 kg⁻¹), which demonstrates the biobeds' capability to degrade DTC. Copyright © 2018 Elsevier B.V. All rights reserved.
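
    The recovery and matrix-effect figures quoted above are computed as sketched below; all numbers are invented stand-ins for the spiked-biomixture data:

      import numpy as np

      # Hypothetical found concentrations for a 1 mg CS2/kg spike (n = 7)
      spiked = np.array([0.92, 0.89, 0.96, 0.91, 0.95, 0.93, 0.90])
      nominal = 1.0
      rec = 100 * spiked / nominal
      print(f"recovery = {rec.mean():.0f}% (RSD {100*rec.std(ddof=1)/rec.mean():.1f}%)")

      # Matrix effect: slope of matrix-matched curve vs. solvent-only curve
      slope_matrix, slope_solvent = 1.08e4, 0.99e4  # assumed calibration slopes
      me = 100 * (slope_matrix / slope_solvent - 1)
      print(f"matrix effect = {me:+.0f}%")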

  11. In situ impulse test: an experimental and analytical evaluation of data interpretation procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1975-08-01

    Special experimental field testing and analytical studies were undertaken at Fort Lawton in Seattle, Washington, to study 'close-in' wave propagation and evaluate data interpretation procedures for a new in situ impulse test. This test was developed to determine the shear wave velocity and dynamic modulus of soils underlying potential nuclear power plant sites. The test is different from conventional geophysical testing in that the velocity variation with strain is determined for each test. In general, strains between 10⁻¹ and 10⁻³ percent are achieved. The experimental field work consisted of performing special tests in a large test sand fill to obtain detailed 'close-in' data. Six recording transducers were placed at various points on the energy source, while approximately 37 different transducers were installed within the soil fill, all within 7 feet of the energy source. Velocity measurements were then taken simultaneously under controlled test conditions to study shear wave propagation phenomenology and help evaluate data interpretation procedures. Typical test data are presented along with detailed descriptions of the results.

  12. Development, validation and evaluation of an analytical method for the determination of monomeric and oligomeric procyanidins in apple extracts.

    PubMed

    Hollands, Wendy J; Voorspoels, Stefan; Jacobs, Griet; Aaby, Kjersti; Meisland, Ane; Garcia-Villalba, Rocio; Tomas-Barberan, Francisco; Piskula, Mariusz K; Mawson, Deborah; Vovk, Irena; Needs, Paul W; Kroon, Paul A

    2017-04-28

    There is a lack of data for individual oligomeric procyanidins in apples and apple extracts. Our aim was to develop, validate and evaluate an analytical method for the separation, identification and quantification of monomeric and oligomeric flavanols in apple extracts. To achieve this, we prepared two types of flavanol extracts from freeze-dried apples; one was an epicatechin-rich extract containing ∼30% (w/w) monomeric (-)-epicatechin which also contained oligomeric procyanidins (Extract A); the second was an oligomeric procyanidin-rich extract depleted of epicatechin (Extract B). The parameters considered for method optimisation were HPLC columns and conditions, sample heating, mass of extract and dilution volumes. The performance characteristics considered for method validation included standard linearity, method sensitivity, precision and trueness. Eight laboratories participated in the method evaluation. Chromatographic separation of the analytes was best achieved utilizing a HILIC column with a binary mobile phase consisting of acidic acetonitrile and acidic aqueous methanol. The final method showed linearity for epicatechin in the range 5-100 μg/mL with a correlation coefficient >0.999. Intra-day and inter-day precision of the analytes ranged from 2 to 6% and 2 to 13%, respectively. Up to dp3, the trueness of the method was >95% but decreased with increasing dp. Within-laboratory precision showed RSD values <5 and 10% for monomers and oligomers, respectively. Between-laboratory precision was 4 and 15% (Extract A) and 7 and 30% (Extract B) for monomers and oligomers, respectively. An analytical method for the separation, identification and quantification of procyanidins in an apple extract was developed, validated and assessed. The results of the inter-laboratory evaluation indicate that the method is reliable and reproducible. Copyright © 2017. Published by Elsevier B.V.
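    A minimal sketch of the linearity check reported above: fit a calibration line over the validated 5-100 μg/mL epicatechin range and compute the correlation coefficient. The peak-area values are hypothetical.

    ```python
    import numpy as np

    # Hypothetical epicatechin calibration over the validated 5-100 ug/mL range
    conc = np.array([5.0, 10.0, 25.0, 50.0, 75.0, 100.0])        # ug/mL
    area = np.array([51.0, 103.0, 254.0, 512.0, 759.0, 1015.0])  # peak areas

    slope, intercept = np.polyfit(conc, area, 1)
    pred = slope * conc + intercept
    r2 = 1.0 - np.sum((area - pred) ** 2) / np.sum((area - area.mean()) ** 2)
    print(f"slope {slope:.3f}, intercept {intercept:.2f}, r^2 {r2:.5f}")
    ```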

  13. March 2017 Grenada Manufacturing, LLC Data Validation Reports and Analytical Laboratory Reports for the Main Plant Building Vapor Intrusion Sampling

    EPA Pesticide Factsheets

    Data Validation Reports and Full Analytical Lab Reports for Indoor Air, Ambient Air and Sub-slab samples taken during the facility vapor intrusion investigation in March 2017 at the Grenada Manufacturing plant

  14. A Response to "Measuring Students' Writing Ability on a Computer Analytic Developmental Scale: An Exploratory Validity Study"

    ERIC Educational Resources Information Center

    Reutzel, D. Ray; Mohr, Kathleen A. J.

    2014-01-01

    In this response to "Measuring Students' Writing Ability on a Computer Analytic Developmental Scale: An Exploratory Validity Study," the authors agree that assessments should seek parsimony in both theory and application wherever possible. Doing so allows maximal dissemination and implementation while minimizing costs. The Writing…

  15. Validation of the analytical method for the simultaneous determination of selected polybrominated diphenyl ethers, polychlorinated biphenyls and organochlorine pesticides in human blood serum by gas chromatography with microelectron capture detector.

    PubMed

    Matuszak, Małgorzata; Minorczyk, Maria; Góralczyk, Katarzyna; Hernik, Agnieszka; Struciński, Paweł; Liszewska, Monika; Czaja, Katarzyna; Korcz, Wojciech; Łyczewska, Monika; Ludwicki, Jan K

    2016-01-01

    Polybrominated diphenyl ethers (PBDEs), like other persistent organic pollutants such as polychlorinated biphenyls (PCBs) and organochlorine pesticides (OCPs), pose a significant hazard to human health, mainly due to interference with the endocrine system and carcinogenic effects. Humans are exposed to these substances mainly through food of animal origin. These pollutants are detected globally in human matrices, which requires reliable and simple analytical methods that enable further studies to assess the exposure of specific human populations to these compounds. The purpose of this study was to modify and validate an analytical procedure for the simultaneous determination of selected PBDEs, PCBs and OCPs in human blood serum samples. The analytical measurement was performed by GC-µECD following preparation of serum samples (denaturation, multiple extraction, lipid removal). The identity of the compounds was confirmed by GC-MS. The method was characterised by appropriate linearity and good repeatability (CV below 20%). The recoveries ranged from 52.9 to 125.0%, depending on the compound and level of fortification. The limit of quantification was set at 0.03 ng mL⁻¹ of serum. The modified analytical method proved to be suitable for the simultaneous determination of selected PBDEs, PCBs and OCPs in human blood serum by GC-µECD with good precision.

  16. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...

  17. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this Part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...

  18. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...

  19. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...

  20. Spatial distribution of cosmetic-procedure businesses in two U.S. cities: a pilot mapping and validation study.

    PubMed

    Austin, S Bryn; Gordon, Allegra R; Kennedy, Grace A; Sonneville, Kendrin R; Blossom, Jeffrey; Blood, Emily A

    2013-12-06

    Cosmetic procedures have proliferated rapidly over the past few decades, with over $11 billion spent on cosmetic surgeries and other minimally invasive procedures and another $2.9 billion spent on U.V. indoor tanning in 2012 in the United States alone. While research interest is increasing in tandem with the growth of the industry, methods have yet to be developed to identify and geographically locate the myriad types of businesses purveying cosmetic procedures. Geographic location of cosmetic-procedure businesses is a critical element in understanding the public health impact of this industry; however no studies we are aware of have developed valid and feasible methods for spatial analyses of these types of businesses. The aim of this pilot validation study was to establish the feasibility of identifying businesses offering surgical and minimally invasive cosmetic procedures and to characterize the spatial distribution of these businesses. We developed and tested three methods for creating a geocoded list of cosmetic-procedure businesses in Boston (MA) and Seattle (WA), USA, comparing each method on sensitivity and staff time required per confirmed cosmetic-procedure business. Methods varied substantially. Our findings represent an important step toward enabling rigorous health-linked spatial analyses of the health implications of this little-understood industry.

  1. Spatial Distribution of Cosmetic-Procedure Businesses in Two U.S. Cities: A Pilot Mapping and Validation Study

    PubMed Central

    Austin, S. Bryn; Gordon, Allegra R.; Kennedy, Grace A.; Sonneville, Kendrin R.; Blossom, Jeffrey; Blood, Emily A.

    2013-01-01

    Cosmetic procedures have proliferated rapidly over the past few decades, with over $11 billion spent on cosmetic surgeries and other minimally invasive procedures and another $2.9 billion spent on U.V. indoor tanning in 2012 in the United States alone. While research interest is increasing in tandem with the growth of the industry, methods have yet to be developed to identify and geographically locate the myriad types of businesses purveying cosmetic procedures. Geographic location of cosmetic-procedure businesses is a critical element in understanding the public health impact of this industry; however no studies we are aware of have developed valid and feasible methods for spatial analyses of these types of businesses. The aim of this pilot validation study was to establish the feasibility of identifying businesses offering surgical and minimally invasive cosmetic procedures and to characterize the spatial distribution of these businesses. We developed and tested three methods for creating a geocoded list of cosmetic-procedure businesses in Boston (MA) and Seattle (WA), USA, comparing each method on sensitivity and staff time required per confirmed cosmetic-procedure business. Methods varied substantially. Our findings represent an important step toward enabling rigorous health-linked spatial analyses of the health implications of this little-understood industry. PMID:24322394

  2. A confirmatory factor analytic validation of the Tinnitus Handicap Inventory.

    PubMed

    Kleinstäuber, Maria; Frank, Ina; Weise, Cornelia

    2015-03-01

    Because the postulated three-factor structure of the internationally widely used Tinnitus Handicap Inventory (THI) has not yet been confirmed by a confirmatory factor analytic approach, this was the central aim of the current study. From a clinical setting, N=373 patients with chronic tinnitus completed the THI and further questionnaires assessing tinnitus-related and psychological variables. In order to analyze the psychometric properties of the THI, confirmatory factor analysis (CFA) and correlational analyses were conducted. CFA provided statistically significant support for a better fit of the data to the hypothesized three-factor structure (RMSEA=.049, WRMR=1.062, CFI=.965, TLI=.961) than to a general factor model (RMSEA=.062, WRMR=1.258, CFI=.942, TLI=.937). The calculation of Cronbach's alpha as an indicator of internal consistency revealed satisfactory values (.80-.91), with the exception of the catastrophic subscale (.65). High positive correlations of the THI and its subscales with other measures of tinnitus distress, anxiety, and depression, high negative correlations with tinnitus acceptance, moderate positive correlations with anxiety sensitivity, sleeping difficulties, and tinnitus loudness, and small correlations with the Big Five personality dimensions confirmed construct validity. Results show that the THI is a highly reliable and valid measure of tinnitus-related handicap. In contrast to the results of previous exploratory analyses, the current findings support a three-factor rather than a unifactorial structure. Future research is needed to replicate this result in different tinnitus populations. Copyright © 2015 Elsevier Inc. All rights reserved.
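    The internal-consistency figures above are Cronbach's alpha values. A minimal computation of alpha from an item-score matrix is sketched below; the 0-4 response data are synthetic, not the study's.

    ```python
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """alpha = k/(k-1) * (1 - sum of item variances / variance of total
        score) for an (n_respondents, n_items) score matrix."""
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1.0 - item_var / total_var)

    # Synthetic 0-4 Likert responses: 6 patients on a 4-item subscale
    rng = np.random.default_rng(0)
    base = rng.integers(0, 5, size=(6, 1))               # shared severity level
    scores = np.clip(base + rng.integers(-1, 2, size=(6, 4)), 0, 4).astype(float)
    print(f"alpha = {cronbach_alpha(scores):.2f}")
    ```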

  3. Class-modelling in food analytical chemistry: Development, sampling, optimisation and validation issues - A tutorial.

    PubMed

    Oliveri, Paolo

    2017-08-22

    Qualitative data modelling is a fundamental branch of pattern recognition, with many applications in analytical chemistry, and embraces two main families: discriminant and class-modelling methods. The first strategy is appropriate when at least two classes are meaningfully defined in the problem under study, while the second strategy is the right choice when the focus is on a single class. For this reason, class-modelling methods are also referred to as one-class classifiers. Although most of the issues in the food analytical field would be properly addressed by class-modelling strategies, the use of such techniques is rather limited and, in many cases, discriminant methods are forcibly used for one-class problems, introducing a bias in the outcomes. Key aspects related to the development, optimisation and validation of suitable class models for the characterisation of food products are critically analysed and discussed. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Pre-analytical and analytical aspects affecting clinical reliability of plasma glucose results.

    PubMed

    Pasqualetti, Sara; Braga, Federica; Panteghini, Mauro

    2017-07-01

    The measurement of plasma glucose (PG) plays a central role in recognizing disturbances in carbohydrate metabolism, with established decision limits that are globally accepted. This requires that PG results are reliable and unequivocally valid no matter where they are obtained. To control the pre-analytical variability of PG and prevent in vitro glycolysis, the use of citrate as a rapidly effective glycolysis inhibitor has been proposed. However, the commercial availability of several tubes, with studies showing different performance, has created confusion among users. Moreover, and more importantly, studies have shown that tubes promptly inhibiting glycolysis give PG results that are significantly higher than tubes containing sodium fluoride only (used in the majority of studies that generated the current PG cut-points), with a different clinical classification of subjects. From the analytical point of view, to be equivalent among different measuring systems, PG results should be traceable to a recognized higher-order reference via the implementation of an unbroken metrological hierarchy. In doing this, it is important that manufacturers of measuring systems consider the uncertainty accumulated through the different steps of the selected traceability chain. In particular, PG results should fulfil analytical performance specifications defined to fit the intended clinical application. Since PG is under tight homeostatic control, its biological variability may be used to define these limits. Alternatively, given the central diagnostic role of the analyte, an outcome model showing the impact of the analytical performance of the test on clinical classifications of subjects can be used. Using these specifications, performance assessment studies employing commutable control materials with values assigned by a reference procedure have shown that the quality of PG measurements is often far from desirable and that problems are exacerbated using point-of-care devices.

  5. Method development and validation for simultaneous quantification of 15 drugs of abuse and prescription drugs and 7 of their metabolites in whole blood relevant in the context of driving under the influence of drugs--usefulness of multi-analyte calibration.

    PubMed

    Steuer, Andrea E; Forss, Anna-Maria; Dally, Annika M; Kraemer, Thomas

    2014-11-01

    In the context of driving under the influence of drugs (DUID), not only common drugs of abuse may have an influence, but also medications with similar mechanisms of action. Simultaneous quantification of a variety of drugs and medications relevant in this context allows faster and more effective analyses. Therefore, multi-analyte approaches have gained more and more popularity in recent years. Usually, calibration curves for such procedures contain a mixture of all analytes, which might lead to mutual interferences. In this study we investigated whether the use of such mixtures leads to reliable results for authentic samples containing only one or two analytes. Five hundred microliters of whole blood were extracted by routine solid-phase extraction (SPE, HCX). Analysis was performed on an ABSciex 3200 QTrap instrument with ESI+ in scheduled MRM mode. The method was fully validated according to international guidelines including selectivity, recovery, matrix effects, accuracy and precision, stabilities, and limit of quantification. The selected SPE provided recoveries >60% for all analytes except 6-monoacetylmorphine (MAM) with coefficients of variation (CV) below 15% or 20% for quality controls (QC) LOW and HIGH, respectively. Ion suppression >30% was found for benzoylecgonine, hydrocodone, hydromorphone, MDA, oxycodone, and oxymorphone at QC LOW, however CVs were always below 10% (n=6 different whole blood samples). Accuracy and precision criteria were fulfilled for all analytes except for MAM. Systematic investigation of accuracy determined for QC MED in a multi-analyte mixture compared to samples containing only single analytes revealed no relevant differences for any analyte, indicating that a multi-analyte calibration is suitable for the presented method. Comparison of approximately 60 samples to a former GC-MS method showed good correlation. The newly validated method was successfully applied to more than 1600 routine samples and 3 proficiency tests

  6. Enhanced Oceanic Operations Human-In-The-Loop In-Trail Procedure Validation Simulation Study

    NASA Technical Reports Server (NTRS)

    Murdoch, Jennifer L.; Bussink, Frank J. L.; Chamberlain, James P.; Chartrand, Ryan C.; Palmer, Michael T.; Palmer, Susan O.

    2008-01-01

    The Enhanced Oceanic Operations Human-In-The-Loop In-Trail Procedure (ITP) Validation Simulation Study investigated the viability of an ITP designed to enable oceanic flight level changes that would not otherwise be possible. Twelve commercial airline pilots with current oceanic experience flew a series of simulated scenarios involving either standard or ITP flight level change maneuvers and provided subjective workload ratings, assessments of ITP validity and acceptability, and objective performance measures associated with the appropriate selection, request, and execution of ITP flight level change maneuvers. In the majority of scenarios, subject pilots correctly assessed the traffic situation, selected an appropriate response (i.e., either a standard flight level change request, an ITP request, or no request), and executed their selected flight level change procedure, if any, without error. Workload ratings for ITP maneuvers were acceptable and not substantially higher than for standard flight level change maneuvers, and, for the majority of scenarios and subject pilots, subjective acceptability ratings and comments for ITP were generally high and positive. Qualitatively, the ITP was found to be valid and acceptable. However, the error rates for ITP maneuvers were higher than for standard flight level changes, and these errors may have design implications for both the ITP and the study's prototype traffic display. These errors and their implications are discussed.

  7. Assessing Women's Responses to Sexual Threat: Validity of a Virtual Role-Play Procedure

    ERIC Educational Resources Information Center

    Jouriles, Ernest N.; Rowe, Lorelei Simpson; McDonald, Renee; Platt, Cora G.; Gomez, Gabriella S.

    2011-01-01

    This study evaluated the validity of a role-play procedure that uses virtual reality technology to assess women's responses to sexual threat. Forty-eight female undergraduate students were randomly assigned to either a standard, face-to-face role-play (RP) or a virtual role-play (VRP) of a sexually coercive situation. A multimethod assessment…

  8. Characterization of the olfactory impact around a wastewater treatment plant: Optimization and validation of a hydrogen sulfide determination procedure based on passive diffusion sampling.

    PubMed

    Colomer, Fernando Llavador; Espinós-Morató, Héctor; Iglesias, Enrique Mantilla; Pérez, Tatiana Gómez; Campos-Candel, Andreu; Coll Lozano, Caterina

    2012-08-01

    A monitoring program based on an indirect method was conducted to assess the olfactory impact of several wastewater treatment plants (only one is shown in the present work). The method uses H₂S passive sampling with Palmes-type diffusion tubes impregnated with silver nitrate, and fluorometric analysis employing fluorescein mercuric acetate. The analytical procedure was validated in an exposure chamber. Exposure periods of at least 4 days are recommended. The quantification limit of the procedure is 0.61 ppb for a 5-day sampling, which allows the H₂S immission (ground concentration) level to be measured within its low odor threshold, from 0.5 to 300 ppb. Experimental results suggest an exposure time greater than 4 days, while the recovery efficiency of the procedure, 93.0 ± 1.8%, seems not to depend on the amount of H₂S collected by the samplers within their application range. The repeatability, expressed as relative standard deviation, is lower than 7%, which is within the limits normally accepted for this type of sampler. Statistical comparison showed that this procedure and the reference method provide analogous accuracy. The proposed procedure was applied in two experimental campaigns, one intensive and the other extensive, and concentrations within the H₂S low odor threshold were quantified at each sampling point. From these results, it can be concluded that the procedure shows good potential for monitoring the olfactory impact around facilities where H₂S emissions are dominant.
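    Palmes-type passive samplers are usually evaluated with Fick's first law: the mass collected over the exposure relates to the ambient concentration as C = mL/(DAt). The sketch below illustrates that calculation; the tube geometry, diffusivity and collected mass are illustrative assumptions, not the calibration used in this study.

    ```python
    import math

    # Fick's first law for a Palmes-type tube: C = m * L / (D * A * t).
    # All values are illustrative assumptions, not this study's calibration.
    m = 2.0e-9                      # g of H2S collected over the exposure
    L = 7.1                         # cm, diffusion path length of the tube
    A = math.pi * 0.45 ** 2         # cm^2, cross-section for a ~0.9 cm bore
    D = 0.17                        # cm^2 s^-1, approx. diffusivity of H2S in air
    t = 5 * 24 * 3600               # s, the recommended 5-day exposure

    C = m * L / (D * A * t)         # ambient concentration, g cm^-3
    C_ppb = C * 1e12 / 1.42         # 1 g cm^-3 = 1e12 ug m^-3; ~1.42 ug m^-3 per ppb
    print(f"H2S ~ {C_ppb:.2f} ppb")
    ```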

  9. Target analyte quantification by isotope dilution LC-MS/MS directly referring to internal standard concentrations--validation for serum cortisol measurement.

    PubMed

    Maier, Barbara; Vogeser, Michael

    2013-04-01

    Isotope dilution LC-MS/MS methods used in the clinical laboratory typically involve multi-point external calibration in each analytical series. Our aim was to test the hypothesis that determination of target analyte concentrations directly derived from the relation of the target analyte peak area to the peak area of a corresponding stable isotope labelled internal standard compound [direct isotope dilution analysis (DIDA)] may not be inferior to conventional external calibration with respect to accuracy and reproducibility. Quality control samples and human serum pools were analysed in a comparative validation protocol for cortisol as an exemplary analyte by LC-MS/MS. Accuracy and reproducibility were compared between quantification either involving a six-point external calibration function, or a result calculation merely based on peak area ratios of unlabelled and labelled analyte. Both quantification approaches resulted in similar accuracy and reproducibility. For specified analytes, reliable analyte quantification directly derived from the ratio of peak areas of labelled and unlabelled analyte, without the need for a time-consuming multi-point calibration series, is possible. This DIDA approach is of considerable practical importance for the application of LC-MS/MS in the clinical laboratory, where short turnaround times often have high priority.
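    The DIDA calculation itself is a one-liner: the concentration follows from the analyte-to-internal-standard peak-area ratio. A minimal sketch, assuming a response factor of 1 between labelled and unlabelled compound and using illustrative numbers:

    ```python
    # Peak areas and internal standard concentration are illustrative values,
    # assuming a response factor of 1 between labelled and unlabelled cortisol.
    area_analyte = 48230.0     # peak area of unlabelled cortisol
    area_istd = 51010.0        # peak area of the labelled internal standard
    conc_istd = 150.0          # nmol/L of internal standard added to the sample

    conc_analyte = (area_analyte / area_istd) * conc_istd
    print(f"cortisol = {conc_analyte:.1f} nmol/L")
    ```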

  10. A competency based selection procedure for Dutch postgraduate GP training: a pilot study on validity and reliability.

    PubMed

    Vermeulen, Margit I; Tromp, Fred; Zuithoff, Nicolaas P A; Pieters, Ron H M; Damoiseaux, Roger A M J; Kuyvenhoven, Marijke M

    2014-12-01

    Background: Historically, semi-structured interviews (SSI) have been the core of the Dutch selection for postgraduate general practice (GP) training. This paper describes a pilot study of a newly designed competency-based selection procedure that assesses whether candidates have the competencies required to complete GP training. The objective was to explore reliability and validity aspects of the instruments developed. The new selection procedure, comprising the National GP Knowledge Test (LHK), a situational judgement test (SJT), a patterned behaviour descriptive interview (PBDI) and a simulated encounter (SIM), was piloted alongside the current procedure. Forty-seven candidates volunteered for both procedures. The admission decision was based on the results of the current procedure. Study participants hardly differed from the other candidates. The mean scores of the candidates on the LHK and SJT were 21.9% (SD 8.7) and 83.8% (SD 3.1), respectively. The mean self-reported competency scores (PBDI) were higher than the observed competencies (SIM): 3.7 (SD 0.5) and 2.9 (SD 0.6), respectively. Content-related competencies showed low correlations with one another when measured with different instruments, whereas more diverse competencies measured by a single instrument showed strong to moderate correlations. Moreover, a moderate correlation between LHK and SJT was found. The internal consistencies (intraclass correlation, ICC) of LHK and SJT were poor, while the ICCs of PBDI and SIM showed acceptable levels of reliability. Findings on the content validity and reliability of these new instruments are promising for realizing a competency-based procedure. Further development of the instruments and research on predictive validity should be pursued.

  11. Probability of identification: a statistical model for the validation of qualitative botanical identification methods.

    PubMed

    LaBudde, Robert A; Harnly, James M

    2012-01-01

    A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer, or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive nontarget (undesirable) material. This report describes development and validation studies for a BIM that uses the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis closely follow those used for the probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts for botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single-collaborator and multicollaborative study examples are given.
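    A minimal sketch of the POI statistic: the observed proportion of replicates identified, here paired with a Wilson score interval as one common choice of confidence interval (the paper's exact interval procedure may differ):

    ```python
    import math

    def poi_wilson(identified: int, replicates: int, z: float = 1.96):
        """POI as the observed proportion identified, with a Wilson score CI."""
        p = identified / replicates
        denom = 1.0 + z**2 / replicates
        centre = (p + z**2 / (2 * replicates)) / denom
        half = z * math.sqrt(p * (1 - p) / replicates
                             + z**2 / (4 * replicates**2)) / denom
        return p, centre - half, centre + half

    # Hypothetical: 11 of 12 replicates of the target botanical identified
    p, lo, hi = poi_wilson(11, 12)
    print(f"POI = {p:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
    ```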

  12. An analytic solution for numerical modeling validation in electromagnetics: the resistive sphere

    NASA Astrophysics Data System (ADS)

    Swidinsky, Andrei; Liu, Lifei

    2017-11-01

    We derive the electromagnetic response of a resistive sphere to an electric dipole source buried in a conductive whole space. The solution consists of an infinite series of spherical Bessel functions and associated Legendre polynomials, and follows the well-studied problem of a conductive sphere buried in a resistive whole space in the presence of a magnetic dipole. Our result is particularly useful for controlled-source electromagnetic problems using a grounded electric dipole transmitter and can be used to check numerical methods of calculating the response of resistive targets (such as finite difference, finite volume, finite element and integral equation). While we elect to focus on the resistive sphere in our examples, the expressions in this paper are completely general and allow for arbitrary source frequency, sphere radius, transmitter position, receiver position and sphere/host conductivity contrast so that conductive target responses can also be checked. Commonly used mesh validation techniques consist of comparisons against other numerical codes, but such solutions may not always be reliable or readily available. Alternatively, the response of simple 1-D models can be tested against well-known whole space, half-space and layered earth solutions, but such an approach is inadequate for validating models with curved surfaces. We demonstrate that our theoretical results can be used as a complementary validation tool by comparing analytic electric fields to those calculated through a finite-element analysis; the software implementation of this infinite series solution is made available for direct and immediate application.
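    Any implementation of such a series solution must verify convergence of the truncated sum. The sketch below shows a generic convergence check using SciPy's spherical Bessel functions and Legendre polynomials; the coefficients a_n are placeholders, not the paper's actual expansion coefficients:

    ```python
    from scipy.special import spherical_jn, eval_legendre

    def partial_sum(n_max: int, kr: float, cos_theta: float) -> float:
        """Partial sum of a spherical-harmonic series; a_n are placeholders."""
        total = 0.0
        for n in range(n_max + 1):
            a_n = (2 * n + 1) / (n + 1) ** 2          # placeholder coefficient
            total += a_n * spherical_jn(n, kr) * eval_legendre(n, cos_theta)
        return total

    for n_max in (5, 10, 20, 40):                     # check truncation convergence
        print(n_max, partial_sum(n_max, kr=2.0, cos_theta=0.5))
    ```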

  13. Pilot In-Trail Procedure Validation Simulation Study

    NASA Technical Reports Server (NTRS)

    Bussink, Frank J. L.; Murdoch, Jennifer L.; Chamberlain, James P.; Chartrand, Ryan; Jones, Kenneth M.

    2008-01-01

    A Human-In-The-Loop experiment was conducted at the National Aeronautics and Space Administration (NASA) Langley Research Center (LaRC) to investigate the viability of the In-Trail Procedure (ITP) concept from a flight crew perspective, by placing participating airline pilots in a simulated oceanic flight environment. The test subject pilots used new onboard avionics equipment that provided improved information about nearby traffic and enabled them, when specific criteria were met, to request an ITP flight level change referencing one or two nearby aircraft that might otherwise block the flight level change. The subject pilots' subjective assessments of ITP validity and acceptability were measured via questionnaires and discussions, and their objective performance in appropriately selecting, requesting, and performing ITP flight level changes was evaluated for each simulated flight scenario. Objective performance and subjective workload assessment data from the experiment's test conditions were analyzed for statistical and operational significance and are reported in the paper. Based on these results, suggestions are made to further improve the ITP.

  14. General Procedure for the Easy Calculation of pH in an Introductory Course of General or Analytical Chemistry

    ERIC Educational Resources Information Center

    Cepriá, Gemma; Salvatella, Luis

    2014-01-01

    All pH calculations for simple acid-base systems used in introductory courses on general or analytical chemistry can be carried out by using a general procedure requiring the use of predominance diagrams. In particular, the pH is calculated as the sum of an independent term equaling the average pKₐ values of the acids involved in the…
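    For comparison with the kind of shortcut such a procedure yields, the sketch below checks the familiar weak-acid approximation pH = ½(pKₐ − log₁₀ C) against the exact charge-balance cubic; this is not the authors' predominance-diagram recipe, and the acid and concentration are illustrative:

    ```python
    import numpy as np

    Ka, C, Kw = 10 ** -4.76, 0.010, 1e-14    # acetic acid at 0.010 M (illustrative)

    # Exact: positive root of [H]^3 + Ka*[H]^2 - (Kw + Ka*C)*[H] - Ka*Kw = 0
    roots = np.roots([1.0, Ka, -(Kw + Ka * C), -Ka * Kw])
    h = max(r.real for r in roots if abs(r.imag) < 1e-12 and r.real > 0)

    pH_exact = -np.log10(h)
    pH_approx = 0.5 * (4.76 - np.log10(C))   # pH = (pKa - log10 C) / 2
    print(f"exact pH {pH_exact:.2f} vs approximate {pH_approx:.2f}")
    ```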

  15. Analytical performance of a bronchial genomic classifier.

    PubMed

    Hu, Zhanzhi; Whitney, Duncan; Anderson, Jessica R; Cao, Manqiu; Ho, Christine; Choi, Yoonha; Huang, Jing; Frink, Robert; Smith, Kate Porta; Monroe, Robert; Kennedy, Giulia C; Walsh, P Sean

    2016-02-26

    The current standard practice of lung lesion diagnosis often leads to inconclusive results, requiring additional diagnostic follow-up procedures that are invasive and often unnecessary due to the high benign rate in such lesions (Chest 143:e78S-e92, 2013). The Percepta bronchial genomic classifier was developed and clinically validated to provide more accurate classification of lung nodules and lesions that are inconclusive by bronchoscopy, using bronchial brushing specimens (N Engl J Med 373:243-51, 2015, BMC Med Genomics 8:18, 2015). The analytical performance of the Percepta test is reported here. Analytical performance studies were designed to characterize the stability of RNA in bronchial brushing specimens during collection and shipment; analytical sensitivity, defined as input RNA mass; analytical specificity (i.e. potentially interfering substances), as tested on blood and genomic DNA; and assay performance, including intra-run, inter-run, and inter-laboratory reproducibility. RNA content within bronchial brushing specimens preserved in RNAprotect is stable for up to 20 days at 4 °C with no changes in RNA yield or integrity. Analytical sensitivity studies demonstrated tolerance to variation in RNA input (157 ng to 243 ng). Analytical specificity studies utilizing cancer-positive and cancer-negative samples mixed with either blood (up to 10% input mass) or genomic DNA (up to 10% input mass) demonstrated no assay interference. The test is reproducible from RNA extraction through to the Percepta test result, including variation across operators, runs, reagent lots, and laboratories (standard deviation of 0.26 for scores on a >6-unit scale). The analytical sensitivity, analytical specificity and robustness of the Percepta test were successfully verified, supporting its suitability for clinical use.

  16. Applicability of bioanalysis of multiple analytes in drug discovery and development: review of select case studies including assay development considerations.

    PubMed

    Srinivas, Nuggehally R

    2006-05-01

    The development of sound bioanalytical method(s) is of paramount importance during the process of drug discovery and development, culminating in a marketing approval. Although the bioanalytical procedure(s) originally developed during the discovery stage may not necessarily be fit to support the drug development scenario, they may be suitably modified and validated, as deemed necessary. Several reviews have appeared over the years describing analytical approaches, including the various techniques, detection systems and automation tools that are available for effective separation and for enhanced selectivity and sensitivity in the quantitation of many analytes. The intention of this review is to cover various key areas where analytical method development becomes necessary during different stages of the drug discovery research and development process. The key areas covered in this article with relevant case studies include: (a) simultaneous assay for parent compound and metabolites that are purported to display pharmacological activity; (b) bioanalytical procedures for determination of multiple drugs in combating a disease; (c) analytical measurement of chirality aspects in pharmacokinetics, metabolism and biotransformation investigations; (d) drug monitoring for therapeutic benefits and/or occupational hazard; (e) analysis of drugs from complex and/or less frequently used matrices; (f) analytical determination during in vitro experiments (metabolism and permeability related) and in situ intestinal perfusion experiments; (g) determination of a major metabolite as a surrogate for the parent molecule; (h) analytical approaches for universal determination of CYP450 probe substrates and metabolites; (i) analytical applicability to prodrug evaluations (simultaneous determination of prodrug, parent and metabolites); (j) quantitative determination of parent compound and/or phase II metabolite(s) via direct or indirect approaches; (k) applicability in analysis of multiple compounds in select

  17. A Meta-Analytic Review of the Cover-Copy-Compare and Variations of This Self-Management Procedure

    ERIC Educational Resources Information Center

    Joseph, Laurice M.; Konrad, Moira; Cates, Gary; Vajcner, Terra; Eveleigh, Elisha; Fishley, Katelyn M.

    2012-01-01

    Studies that examined cover-copy-compare (CCC) and variations of this procedure were reviewed and analyzed. This review revealed a substantial number of studies that validated the use of CCC across spelling and math skills and across students with and without disabilities. A meta-analysis of findings indicated that CCC and variations of this…

  18. Analytical and experimental procedures for determining propagation characteristics of millimeter-wave gallium arsenide microstrip lines

    NASA Technical Reports Server (NTRS)

    Romanofsky, Robert R.

    1989-01-01

    In this report, a thorough analytical procedure is developed for evaluating the frequency-dependent loss characteristics and effective permittivity of microstrip lines. The technique is based on the measured reflection coefficient of microstrip resonator pairs. Experimental data, including quality factor Q, effective relative permittivity, and fringing for 50-Ω lines on gallium arsenide (GaAs) from 26.5 to 40.0 GHz, are presented. The effects of an imperfect open circuit, coupling losses, and loading of the resonant frequency are considered. A cosine-tapered ridge-guide test fixture is described. It was found to be well suited to the device characterization.

  19. A long-term validation of the modernised DC-ARC-OES solid-sample method.

    PubMed

    Flórián, K; Hassler, J; Förster, O

    2001-12-01

    The validation procedure based on the ISO 17025 standard has been used to study and illustrate both the long-term stability of the calibration process of the DC-ARC solid-sample spectrometric method and the main validation criteria of the method. In calculating the validation characteristics that depend on linearity (calibration), the fulfilment of prerequisite criteria such as normality and homoscedasticity was also checked. To decide whether there are any trends in the time variation of the analytical signal, the Neumann trend test was also applied and evaluated. Finally, a comparison with similar validation data of the ETV-ICP-OES method was carried out.
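    The trend test mentioned above is commonly implemented as the von Neumann ratio, the mean squared successive difference divided by the sample variance; values near 2 are consistent with a trend-free sequence. A minimal sketch with synthetic calibration data (assumed here, not the paper's):

    ```python
    import numpy as np

    def von_neumann_ratio(x) -> float:
        """Mean squared successive difference over the sample variance;
        ~2 for a trend-free sequence, well below 2 when a trend is present."""
        x = np.asarray(x, dtype=float)
        msd = np.sum(np.diff(x) ** 2) / (len(x) - 1)
        return msd / x.var(ddof=1)

    # Synthetic monthly calibration-line slopes, with and without drift
    rng = np.random.default_rng(1)
    stable = rng.normal(1.00, 0.02, 24)
    drifting = stable + np.linspace(0.0, 0.15, 24)
    print(von_neumann_ratio(stable), von_neumann_ratio(drifting))
    ```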

  20. Validation of Immunohistochemical Assays for Integral Biomarkers in the NCI-MATCH EAY131 Clinical Trial.

    PubMed

    Khoury, Joseph D; Wang, Wei-Lien; Prieto, Victor G; Medeiros, L Jeffrey; Kalhor, Neda; Hameed, Meera; Broaddus, Russell; Hamilton, Stanley R

    2018-02-01

    Biomarkers that guide therapy selection are gaining unprecedented importance as targeted therapy options increase in scope and complexity. In conjunction with high-throughput molecular techniques, therapy-guiding biomarker assays based upon immunohistochemistry (IHC) have a critical role in cancer care in that they inform about the expression status of a protein target. Here, we describe the validation procedures for four clinical IHC biomarker assays (PTEN, RB, MLH1, and MSH2) for use as integral biomarkers in the nationwide NCI-Molecular Analysis for Therapy Choice (NCI-MATCH) EAY131 clinical trial. Validation procedures were developed through an iterative process based on collective experience and adaptation of broad guidelines from the FDA. The steps included primary antibody selection; assay optimization; development of assay interpretation criteria incorporating biological considerations; and expected staining patterns, including indeterminate results, orthogonal validation, and tissue validation. Following assay lockdown, patient samples and cell lines were used for analytic and clinical validation. The assays were then approved as laboratory-developed tests and used for clinical trial decisions for treatment selection. Calculations of sensitivity and specificity were undertaken using various definitions of gold-standard references, and external validation was required for the PTEN IHC assay. In conclusion, validation of IHC biomarker assays critical for guiding therapy in clinical trials is feasible using comprehensive preanalytic, analytic, and postanalytic steps. Implementation of standardized guidelines provides a useful framework for validating IHC biomarker assays that allow for reproducibility across institutions for routine clinical use. Clin Cancer Res; 24(3); 521-31. ©2017 AACR.

  1. A New Analytic Framework for Moderation Analysis --- Moving Beyond Analytic Interactions

    PubMed Central

    Tang, Wan; Yu, Qin; Crits-Christoph, Paul; Tu, Xin M.

    2009-01-01

    Conceptually, a moderator is a variable that modifies the effect of a predictor on a response. Analytically, the common approach used in most moderation analyses is to add analytic interactions involving the predictor and moderator in the form of cross-variable products and to test the significance of such terms. The narrow scope of such a procedure is inconsistent with the broader conceptual definition of moderation, leading to confusion in the interpretation of study findings. In this paper, we develop a new approach to the analytic procedure that is consistent with the concept of moderation. The proposed framework defines moderation as a process that modifies an existing relationship between the predictor and the outcome, rather than simply a test of a predictor by moderator interaction. The approach is illustrated with data from a real study. PMID:20161453
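    The conventional product-term test that the paper argues is too narrow looks like this in practice; the data are synthetic and the estimator is ordinary least squares:

    ```python
    import numpy as np

    # Synthetic data with a true interaction effect of 0.4
    rng = np.random.default_rng(42)
    n = 200
    x = rng.normal(size=n)                       # predictor
    m = rng.normal(size=n)                       # moderator
    y = 1.0 + 0.5 * x + 0.3 * m + 0.4 * x * m + rng.normal(scale=0.5, size=n)

    # OLS with the cross-variable product term; its coefficient is the usual test
    X = np.column_stack([np.ones(n), x, m, x * m])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (n - X.shape[1])
    se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
    print(f"interaction estimate {beta[3]:.3f} (SE {se[3]:.3f})")
    ```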

  2. Automatic-heuristic and executive-analytic processing during reasoning: Chronometric and dual-task considerations.

    PubMed

    De Neys, Wim

    2006-06-01

    Human reasoning has been shown to overly rely on intuitive, heuristic processing instead of a more demanding analytic inference process. Four experiments tested the central claim of current dual-process theories that analytic operations involve time-consuming executive processing whereas the heuristic system would operate automatically. Participants solved conjunction fallacy problems and indicative and deontic selection tasks. Experiment 1 established that making correct analytic inferences demanded more processing time than did making heuristic inferences. Experiment 2 showed that burdening the executive resources with an attention-demanding secondary task decreased correct, analytic responding and boosted the rate of conjunction fallacies and indicative matching card selections. Results were replicated in Experiments 3 and 4 with a different secondary-task procedure. Involvement of executive resources for the deontic selection task was less clear. Findings validate basic processing assumptions of the dual-process framework and complete the correlational research programme of K. E. Stanovich and R. F. West (2000).

  3. Green analytical chemistry introduction to chloropropanols determination at no economic and analytical performance costs?

    PubMed

    Jędrkiewicz, Renata; Orłowski, Aleksander; Namieśnik, Jacek; Tobiszewski, Marek

    2016-01-15

    In this study we perform a ranking of analytical procedures for 3-monochloropropane-1,2-diol determination in soy sauces by the PROMETHEE method. Multicriteria decision analysis was performed for three different scenarios - metrological, economic and environmental - by applying different weights to the decision-making criteria. All three scenarios indicate the capillary electrophoresis-based procedure as the most preferable. Apart from that, the details of the ranking results differ among the three scenarios. A second run of rankings was done for scenarios that included only metrological, only economic or only environmental criteria, neglecting the others. These results show that the green analytical chemistry-based selection correlates with the economic one, while there is no correlation with the metrological one. This implies that green analytical chemistry can be brought into laboratories without analytical performance costs, and it is even supported by economic reasons. Copyright © 2015 Elsevier B.V. All rights reserved.
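    PROMETHEE rankings of the kind described can be sketched compactly. The example below implements PROMETHEE II net flows with the 'usual' (strict dominance) preference function; the criteria, weights and scores are invented for illustration and do not reproduce the paper's decision matrix:

    ```python
    import numpy as np

    def promethee_ii(scores, weights, maximize):
        """PROMETHEE II net flows with the 'usual' preference function:
        P(a,b) = 1 when a strictly beats b on a criterion, else 0."""
        n = scores.shape[0]
        signed = np.where(maximize, scores, -scores)   # costs become benefits
        phi = np.zeros(n)
        for a in range(n):
            for b in range(n):
                if a != b:
                    phi[a] += weights @ (signed[a] > signed[b])
                    phi[a] -= weights @ (signed[b] > signed[a])
        return phi / (n - 1)

    # Invented decision matrix: three procedures scored on
    # (recovery %, cost per sample, solvent volume) -- not the paper's data
    scores = np.array([[95.0, 120.0, 50.0],
                       [90.0,  60.0, 10.0],
                       [98.0, 200.0, 80.0]])
    weights = np.array([0.4, 0.3, 0.3])
    maximize = np.array([True, False, False])
    print(promethee_ii(scores, weights, maximize))     # higher net flow is better
    ```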

  4. Quality control of colonoscopy procedures: a prospective validated method for the evaluation of professional practices applicable to all endoscopic units.

    PubMed

    Coriat, R; Pommaret, E; Chryssostalis, A; Viennot, S; Gaudric, M; Brezault, C; Lamarque, D; Roche, H; Verdier, D; Parlier, D; Prat, F; Chaussade, S

    2009-02-01

    To produce valid information, an evaluation of professional practices has to assess the quality of all practices before, during and after the procedure under study. Several auditing techniques have been proposed for colonoscopy. The purpose of this work is to describe a straightforward, original, validated method for the prospective evaluation of professional practices in the field of colonoscopy, applicable in all endoscopy units without increasing staff workload. Pertinent quality-control criteria (14 items) were identified by the endoscopists at the Cochin Hospital and were compatible with findings in the available literature, guidelines proposed by the Superior Health Authority, and application in any endoscopy unit. Prospective routine data were collected and the methodology was validated by evaluating 50 colonoscopies every quarter for one year. The relevance of the criteria was assessed using data collected during four separate periods. The standard checklist was complete for 57% of the colonoscopy procedures. The colonoscopy procedure was appropriate according to national guidelines in 94% of cases. The following observations were particularly noteworthy: the quality of the colonic preparation was insufficient for 9% of the procedures; complete colonoscopy was achieved for 93% of patients; and 0.38 adenomas and 0.045 carcinomas were identified per colonoscopy. This simple and reproducible method can be used for valid quality-control audits in all endoscopy units. In France, unit-wide application of this method enables endoscopists to validate 100 of the 250 points required for continuous medical training. This is a quality-control tool that can be applied annually, using a random month, to evaluate any changes in routine practices.

  5. Dynamic variational asymptotic procedure for laminated composite shells

    NASA Astrophysics Data System (ADS)

    Lee, Chang-Yong

    Unlike published shell theories, the main two parts of this thesis are devoted to the asymptotic construction of a refined theory for composite laminated shells that is valid over a wide range of frequencies and wavelengths. The resulting theory is applicable to shells in which each layer is made of materials with monoclinic symmetry. It enables one to analyze shell dynamic responses within both long-wavelength, low- and high-frequency vibration regimes. It also leads to energy functionals that combine positive definiteness with sufficient simplicity for all wavelengths. This whole procedure was first performed analytically. From the insight gained from the procedure, a finite element version of the analysis was then developed, and a corresponding computer program, DVAPAS, was written. DVAPAS can obtain the generalized 2-D constitutive law and accurately recover the 3-D results for stress and strain in composite shells. Independent work will be needed to develop the corresponding 2-D surface analysis associated with the present theory and to continue towards full verification and validation of the present process by comparison with available published works.

  6. Validation of biomarkers of food intake-critical assessment of candidate biomarkers.

    PubMed

    Dragsted, L O; Gao, Q; Scalbert, A; Vergères, G; Kolehmainen, M; Manach, C; Brennan, L; Afman, L A; Wishart, D S; Andres Lacueva, C; Garcia-Aloy, M; Verhagen, H; Feskens, E J M; Praticò, G

    2018-01-01

    Biomarkers of food intake (BFIs) are a promising tool for limiting misclassification in nutrition research where more subjective dietary assessment instruments are used. They may also be used to assess compliance to dietary guidelines or to a dietary intervention. Biomarkers therefore hold promise for direct and objective measurement of food intake. However, the number of comprehensively validated biomarkers of food intake is limited to just a few. Many new candidate biomarkers emerge from metabolic profiling studies and from advances in food chemistry. Furthermore, candidate food intake biomarkers may also be identified based on extensive literature reviews such as described in the guidelines for Biomarker of Food Intake Reviews (BFIRev). To systematically and critically assess the validity of candidate biomarkers of food intake, it is necessary to outline and streamline an optimal and reproducible validation process. A consensus-based procedure was used to provide and evaluate a set of the most important criteria for systematic validation of BFIs. As a result, a validation procedure was developed including eight criteria, plausibility, dose-response, time-response, robustness, reliability, stability, analytical performance, and inter-laboratory reproducibility. The validation has a dual purpose: (1) to estimate the current level of validation of candidate biomarkers of food intake based on an objective and systematic approach and (2) to pinpoint which additional studies are needed to provide full validation of each candidate biomarker of food intake. This position paper on biomarker of food intake validation outlines the second step of the BFIRev procedure but may also be used as such for validation of new candidate biomarkers identified, e.g., in food metabolomic studies.

  7. Background contamination by coplanar polychlorinated biphenyls (PCBs) in trace level high resolution gas chromatography/high resolution mass spectrometry (HRGC/HRMS) analytical procedures.

    PubMed

    Ferrario, J; Byrne, C; Dupuy, A E

    1997-06-01

    The addition of the "dioxin-like" polychlorinated biphenyl (PCB) congeners to the assessment of risk associated with the 2,3,7,8-chlorine substituted dioxins and furans has dramatically increased the number of laboratories worldwide that are developing analytical procedures for their detection and quantitation. Most of these procedures are based on established sample preparation and analytical techniques employing high resolution gas chromatography/high resolution mass spectrometry (HRGC/HRMS), which are used for the analyses of dioxin/furans at low parts-per-trillion (ppt) levels. A significant and widespread problem that arises when using these sample preparation procedures for the analysis of coplanar PCBs is the presence of background levels of these congeners. Industrial processes, urban incineration, leaking electrical transformers, hazardous waste accidents, and improper waste disposal practices have released appreciable quantities of PCBs into the environment. This contamination has resulted in the global distribution of these compounds via the atmosphere and their ubiquitous presence in ambient air. The background presence of these compounds in method blanks must be addressed when determining the exact concentrations of these and other congeners in environmental samples. In this study reliable procedures were developed to accurately define these background levels and assess their variability over the course of the study. The background subtraction procedures developed and employed increase the probability that the values reported accurately represent the concentrations found in the samples and were not biased due to this background contamination.

  8. Background contamination by coplanar polychlorinated biphenyls (PCBs) in trace level high resolution gas chromatography/high resolution mass spectrometry (HRGC/HRMS) analytical procedures

    NASA Technical Reports Server (NTRS)

    Ferrario, J.; Byrne, C.; Dupuy, A. E. Jr

    1997-01-01

    The addition of the "dioxin-like" polychlorinated biphenyl (PCB) congeners to the assessment of risk associated with the 2,3,7,8-chlorine substituted dioxins and furans has dramatically increased the number of laboratories worldwide that are developing analytical procedures for their detection and quantitation. Most of these procedures are based on established sample preparation and analytical techniques employing high resolution gas chromatography/high resolution mass spectrometry (HRGC/HRMS), which are used for the analyses of dioxin/furans at low parts-per-trillion (ppt) levels. A significant and widespread problem that arises when using these sample preparation procedures for the analysis of coplanar PCBs is the presence of background levels of these congeners. Industrial processes, urban incineration, leaking electrical transformers, hazardous waste accidents, and improper waste disposal practices have released appreciable quantities of PCBs into the environment. This contamination has resulted in the global distribution of these compounds via the atmosphere and their ubiquitous presence in ambient air. The background presence of these compounds in method blanks must be addressed when determining the exact concentrations of these and other congeners in environmental samples. In this study reliable procedures were developed to accurately define these background levels and assess their variability over the course of the study. The background subtraction procedures developed and employed increase the probability that the values reported accurately represent the concentrations found in the samples and were not biased due to this background contamination.

  9. Pre-analytical issues in the haemostasis laboratory: guidance for the clinical laboratories.

    PubMed

    Magnette, A; Chatelain, M; Chatelain, B; Ten Cate, H; Mullier, F

    2016-01-01

    Ensuring quality has become a daily requirement in laboratories. In haemostasis, even more than in other disciplines of biology, quality is determined by a pre-analytical step that encompasses all procedures, starting with the formulation of the medical question, and includes patient preparation, sample collection, handling, transportation, processing, and storage until the time of analysis. This step, based on a variety of manual activities, is the most vulnerable part of the total testing process; it is a major determinant of the reliability and validity of results in haemostasis and constitutes the most important source of erroneous or uninterpretable results. Pre-analytical errors may occur throughout the testing process and arise from unsuitable, inappropriate or wrongly handled procedures. Problems may arise during the collection of blood specimens, such as misidentification of the sample, use of inadequate devices or needles, incorrect order of draw, prolonged tourniquet placement, unsuccessful attempts to locate the vein, incorrect use of additive tubes, collection of samples of unsuitable quality or quantity, inappropriate mixing of a sample, etc. Some factors can alter the result of a sample constituent after collection, during transportation, preparation and storage. Laboratory errors can often have serious adverse consequences. Lack of standardized procedures for sample collection accounts for most of the errors encountered within the total testing process. They can also have clinical consequences as well as a significant impact on patient care, especially those related to specialized tests, as these are often considered "diagnostic". Controlling pre-analytical variables is critical, since this has a direct influence on the quality of results and on their clinical reliability. The accurate standardization of the pre-analytical phase is of pivotal importance for achieving reliable results of coagulation tests and should reduce the side effects of the influence

  10. Analytical validation of an explicit finite element model of a rolling element bearing with a localised line spall

    NASA Astrophysics Data System (ADS)

    Singh, Sarabjeet; Howard, Carl Q.; Hansen, Colin H.; Köpke, Uwe G.

    2018-03-01

    In this paper, the numerically modelled vibration response of a rolling element bearing with a localised outer raceway line spall is presented. The results were obtained from a finite element (FE) model of the defective bearing solved using an explicit dynamics FE software package, LS-DYNA. Time domain vibration signals of the bearing obtained directly from the FE modelling were processed further to estimate time-frequency and frequency domain results, such as the spectrogram and power spectrum, using standard signal processing techniques pertinent to the vibration-based monitoring of rolling element bearings. A logical approach to the analysis of the numerically modelled results was developed with the aim of presenting an analytical validation of the modelled results. While the time and frequency domain analyses of the results show that the FE model generates accurate bearing kinematics and defect frequencies, the time-frequency analysis highlights the simulation of distinct low- and high-frequency characteristic vibration signals associated with the unloading and reloading of the rolling elements as they move into and out of the defect, respectively. Favourable agreement between the numerical and analytical results demonstrates the validity of the explicit FE modelling of the bearing.
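
    The signal-processing chain named above (power spectrum and spectrogram estimated from a time-domain signal) is standard; a minimal Python sketch is shown below. The sampling rate, defect frequency and resonance are invented stand-ins for the FE output, not values from the paper.

        import numpy as np
        from scipy import signal

        fs = 48_000                      # Hz, illustrative sampling rate
        t = np.arange(0, 1.0, 1 / fs)
        # Synthetic stand-in for the FE time-domain signal: impacts at an
        # assumed outer-race defect frequency (87 Hz) ringing a 4 kHz resonance
        impacts = (np.sin(2 * np.pi * 87 * t) > 0.999).astype(float)
        ring = np.exp(-2000 * t[:500]) * np.sin(2 * np.pi * 4000 * t[:500])
        x = np.convolve(impacts, ring, mode="same")
        x += 0.01 * np.random.default_rng(0).standard_normal(t.size)

        f, Pxx = signal.welch(x, fs=fs, nperseg=4096)               # power spectrum
        fs_, ts_, Sxx = signal.spectrogram(x, fs=fs, nperseg=1024)  # time-frequency map
        print(f"dominant spectral content near {f[np.argmax(Pxx)]:.0f} Hz")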

  11. Negations in syllogistic reasoning: evidence for a heuristic-analytic conflict.

    PubMed

    Stupple, Edward J N; Waterhouse, Eleanor F

    2009-08-01

    An experiment utilizing response time measures was conducted to test dominant processing strategies in syllogistic reasoning with the expanded quantifier set proposed by Roberts (2005). Through adding negations to existing quantifiers it is possible to change problem surface features without altering logical validity. Biases based on surface features such as atmosphere, matching, and the probability heuristics model (PHM; Chater & Oaksford, 1999; Wetherick & Gilhooly, 1995) would not be expected to show variance in response latencies, but participant responses should be highly sensitive to changes in the surface features of the quantifiers. In contrast, according to analytic accounts such as mental models theory and mental logic (e.g., Johnson-Laird & Byrne, 1991; Rips, 1994) participants should exhibit increased response times for negated premises, but not be overly impacted upon by the surface features of the conclusion. Data indicated that the dominant response strategy was based on a matching heuristic, but also provided evidence of a resource-demanding analytic procedure for dealing with double negatives. The authors propose that dual-process theories offer a stronger account of these data whereby participants employ competing heuristic and analytic strategies and fall back on a heuristic response when analytic processing fails.

  12. Analytical Validation of Accelerator Mass Spectrometry for Pharmaceutical Development: the Measurement of Carbon-14 Isotope Ratio.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keck, B D; Ognibene, T; Vogel, J S

    2010-02-05

    Accelerator mass spectrometry (AMS) is an isotope-based measurement technology that utilizes carbon-14 labeled compounds in the pharmaceutical development process to measure compounds at very low concentrations, empowers microdosing as an investigational tool, and extends the utility of ¹⁴C labeled compounds to dramatically lower levels. It is a form of isotope ratio mass spectrometry that can provide either measurements of total compound equivalents or, when coupled to separation technology such as chromatography, quantitation of specific compounds. The properties of AMS as a measurement technique are investigated here, and the parameters of method validation are shown. AMS, independent of any separation technique to which it may be coupled, is shown to be accurate, linear, precise, and robust. As the sensitivity and universality of AMS is constantly being explored and expanded, this work underpins many areas of pharmaceutical development, including drug metabolism as well as absorption, distribution and excretion of pharmaceutical compounds, as a fundamental step in drug development. The validation parameters for pharmaceutical analyses were examined for the accelerator mass spectrometry measurement of the ¹⁴C/C ratio, independent of chemical separation procedures. The isotope ratio measurement was specific (owing to the ¹⁴C label), stable across sample storage conditions for at least one year, and linear over 4 orders of magnitude, with an analytical range from one tenth Modern to at least 2000 Modern (instrument specific). Further, accuracy was excellent (between 1% and 3%), while precision, expressed as coefficient of variation, was between 1% and 6%, determined primarily by radiocarbon content and the time spent analyzing a sample. Sensitivity, expressed as LOD and LLOQ, was 1 and 10 attomoles of carbon-14 (which can be expressed as compound equivalents), and for a typical small molecule labeled at 10% incorporation with ¹⁴C corresponds to 30

  13. Background Contamination by Coplanar Polychlorinated Biphenyls (PCBS) in Trace Level High Resolution Gas Chromatography/High Resolution Mass Spectrometry (HRGC/HRMS) Analytical Procedures

    EPA Science Inventory

    The addition of the "dioxin-like" polychlorinated biphenyl (PCB) congeners to the assessment of risk associated with the 2,3,7,8-chlorine substituted dioxins and furans has dramatically increased the number of laboratories worldwide that are developing analytical procedures for t...

  14. Approaching the investigation of plasma turbulence through a rigorous verification and validation procedure: A practical example

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ricci, P., E-mail: paolo.ricci@epfl.ch; Riva, F.; Theiler, C.

    In the present work, a Verification and Validation procedure is presented and applied, showing through a practical example how it can contribute to advancing our physics understanding of plasma turbulence. Bridging the gap between plasma physics and other scientific domains, in particular the computational fluid dynamics community, a rigorous methodology for the verification of a plasma simulation code is presented, based on the method of manufactured solutions. This methodology verifies that the model equations are correctly solved, within the order of accuracy of the numerical scheme. The technique to carry out a solution verification is described to provide a rigorous estimate of the uncertainty affecting the numerical results. A methodology for plasma turbulence code validation is also discussed, focusing on quantitative assessment of the agreement between experiments and simulations. The Verification and Validation methodology is then applied to the study of plasma turbulence in the basic plasma physics experiment TORPEX [Fasoli et al., Phys. Plasmas 13, 055902 (2006)], considering both two-dimensional and three-dimensional simulations carried out with the GBS code [Ricci et al., Plasma Phys. Controlled Fusion 54, 124047 (2012)]. The validation procedure allows progress in the understanding of the turbulent dynamics in TORPEX, by pinpointing the presence of a turbulent regime transition due to the competition between the resistive and ideal interchange instabilities.
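
    The code-verification idea, solving for a manufactured (exactly known) solution and checking the observed order of accuracy against the nominal order of the scheme, can be sketched on a toy problem. The 1D Poisson equation below is purely illustrative and is unrelated to the GBS model equations.

        import numpy as np

        # Manufactured solution u(x) = sin(pi x) on [0, 1] for -u'' = f,
        # which forces f(x) = pi**2 * sin(pi x) and u(0) = u(1) = 0.
        def solve_and_error(n):
            x = np.linspace(0.0, 1.0, n + 1)
            h = 1.0 / n
            f = np.pi**2 * np.sin(np.pi * x[1:-1])
            # Second-order central-difference Laplacian with Dirichlet BCs
            A = (np.diag(np.full(n - 1, 2.0))
                 - np.diag(np.ones(n - 2), 1)
                 - np.diag(np.ones(n - 2), -1)) / h**2
            u = np.linalg.solve(A, f)
            return np.max(np.abs(u - np.sin(np.pi * x[1:-1])))

        e_coarse, e_fine = solve_and_error(32), solve_and_error(64)
        print(f"observed order = {np.log2(e_coarse / e_fine):.2f} (nominal: 2)")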

  15. Evaluation of three different validation procedures regarding the accuracy of template-guided implant placement: an in vitro study.

    PubMed

    Vasak, Christoph; Strbac, Georg D; Huber, Christian D; Lettner, Stefan; Gahleitner, André; Zechner, Werner

    2015-02-01

    The study aims to evaluate the accuracy of the NobelGuide™ (Medicim/Nobel Biocare, Göteborg, Sweden) concept while maximally reducing the influence of clinical and surgical parameters. Moreover, the study compared and validated two validation procedures against a reference method. Overall, 60 implants were placed in 10 artificial edentulous mandibles according to the NobelGuide™ protocol. For merging the pre- and postoperative DICOM data sets, three different fusion methods (Triple Scan Technique, NobelGuide™ Validation software, and AMIRA® software [VSG - Visualization Sciences Group, Burlington, MA, USA] as reference) were applied. Discrepancies between the virtual and the actual implant positions were measured. The mean deviations measured with AMIRA® were 0.49 mm (implant shoulder), 0.69 mm (implant apex), and 1.98° (implant axis). The Triple Scan Technique as well as the NobelGuide™ Validation software revealed similar deviations compared with the reference method. A significant correlation between angular and apical deviations was seen (r = 0.53; p < .001). A greater implant diameter was associated with greater deviations (p = .03). The Triple Scan Technique as a system-independent validation procedure as well as the NobelGuide™ Validation software are in accordance with the AMIRA® software. The NobelGuide™ system showed similar or smaller spatial and angular deviations compared with others. © 2013 Wiley Periodicals, Inc.
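
    Deviation metrics of the kind reported above (shoulder distance, apex distance, axis angle) reduce to simple vector arithmetic once the planned and actual implant positions are expressed in a common coordinate frame. The sketch below uses invented coordinates; it is not the AMIRA®/NobelGuide™ computation itself.

        import numpy as np

        def implant_deviations(plan_shoulder, plan_apex, act_shoulder, act_apex):
            """Return shoulder deviation (mm), apex deviation (mm) and the
            angular deviation (degrees) between the implant axes."""
            d_shoulder = np.linalg.norm(act_shoulder - plan_shoulder)
            d_apex = np.linalg.norm(act_apex - plan_apex)
            ax_p = (plan_apex - plan_shoulder) / np.linalg.norm(plan_apex - plan_shoulder)
            ax_a = (act_apex - act_shoulder) / np.linalg.norm(act_apex - act_shoulder)
            angle = np.degrees(np.arccos(np.clip(np.dot(ax_p, ax_a), -1.0, 1.0)))
            return d_shoulder, d_apex, angle

        # Illustrative planned vs. placed positions (mm)
        s, a, ang = implant_deviations(np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, -11.0]),
                                       np.array([0.4, 0.2, 0.1]), np.array([0.7, 0.3, -10.9]))
        print(f"shoulder {s:.2f} mm, apex {a:.2f} mm, axis {ang:.2f} deg")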

  16. MetaKTSP: a meta-analytic top scoring pair method for robust cross-study validation of omics prediction analysis.

    PubMed

    Kim, SungHwan; Lin, Chien-Wei; Tseng, George C

    2016-07-01

    Supervised machine learning is widely applied to transcriptomic data to predict disease diagnosis, prognosis or survival. Robust and interpretable classifiers with high accuracy are usually favored for their clinical and translational potential. The top scoring pair (TSP) algorithm is an example that applies a simple rank-based algorithm to identify rank-altered gene pairs for classifier construction. Although many classification methods perform well in cross-validation of a single expression profile, performance usually degrades greatly in cross-study validation (i.e. when the prediction model is established in a training study and applied to an independent test study) for all machine learning methods, including TSP. The failure of cross-study validation has largely diminished the potential translational and clinical value of the models. The purpose of this article is to develop a meta-analytic top scoring pair (MetaKTSP) framework that combines multiple transcriptomic studies and generates a robust prediction model applicable to independent test studies. We proposed two frameworks, by averaging TSP scores or by combining P-values from individual studies, to select the top gene pairs for model construction. We applied the proposed methods to simulated data sets and three large-scale real applications in breast cancer, idiopathic pulmonary fibrosis and pan-cancer methylation. The results showed superior cross-study validation accuracy and biomarker selection for the new meta-analytic framework. In conclusion, combining multiple omics data sets in the public domain increases the robustness and accuracy of the classification model, which will ultimately improve disease understanding and clinical treatment decisions to benefit patients. An R package MetaKTSP is available online (http://tsenglab.biostat.pitt.edu/software.htm). ctseng@pitt.edu Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved.
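
    The core TSP score, and the "average TSP scores across studies" variant described above, can be sketched compactly. The following is a schematic illustration with synthetic data, not the MetaKTSP package itself; the variable names and the simulation are assumptions.

        import numpy as np

        def tsp_score(X, y, i, j):
            """|P(gene_i < gene_j | class 1) - P(gene_i < gene_j | class 0)|."""
            lt = X[:, i] < X[:, j]
            return abs(lt[y == 1].mean() - lt[y == 0].mean())

        def meta_tsp_score(studies, i, j):
            """MetaKTSP-style aggregation: average the pair's TSP score over studies."""
            return float(np.mean([tsp_score(X, y, i, j) for X, y in studies]))

        rng = np.random.default_rng(1)
        studies = []
        for _ in range(3):                       # three synthetic 'studies'
            y = rng.integers(0, 2, 40)           # 40 samples, binary class label
            X = rng.normal(size=(40, 5))         # 5 'genes'
            X[y == 1, 0] += 1.5                  # pair (0, 1) is rank-altered in class 1
            studies.append((X, y))
        print(f"meta TSP score for pair (0, 1): {meta_tsp_score(studies, 0, 1):.2f}")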

  17. Pre-trial inter-laboratory analytical validation of the FOCUS4 personalised therapy trial.

    PubMed

    Richman, Susan D; Adams, Richard; Quirke, Phil; Butler, Rachel; Hemmings, Gemma; Chambers, Phil; Roberts, Helen; James, Michelle D; Wozniak, Sue; Bathia, Riya; Pugh, Cheryl; Maughan, Timothy; Jasani, Bharat

    2016-01-01

    Molecular characterisation of tumours is increasing personalisation of cancer therapy, tailored to an individual and their cancer. FOCUS4 is a molecularly stratified clinical trial for patients with advanced colorectal cancer. During an initial 16-week period of standard first-line chemotherapy, tumour tissue will undergo several molecular assays, with the results used for cohort allocation, then randomisation. Laboratories in Leeds and Cardiff will perform the molecular testing. The results of a rigorous pre-trial inter-laboratory analytical validation are presented and discussed. Wales Cancer Bank supplied FFPE tumour blocks from 97 mCRC patients with consent for use in further research. Both laboratories processed each sample according to an agreed definitive FOCUS4 laboratory protocol, reporting results directly to the MRC Trial Management Group for independent cross-referencing. Pyrosequencing analysis of mutation status at KRAS codons12/13/61/146, NRAS codons12/13/61, BRAF codon600 and PIK3CA codons542/545/546/1047, generated highly concordant results. Two samples gave discrepant results; in one a PIK3CA mutation was detected only in Leeds, and in the other, a PIK3CA mutation was only detected in Cardiff. pTEN and mismatch repair (MMR) protein expression was assessed by immunohistochemistry (IHC) resulting in 6/97 discordant results for pTEN and 5/388 for MMR, resolved upon joint review. Tumour heterogeneity was likely responsible for pyrosequencing discrepancies. The presence of signet-ring cells, necrosis, mucin, edge-effects and over-counterstaining influenced IHC discrepancies. Pre-trial assay analytical validation is essential to ensure appropriate selection of patients for targeted therapies. This is feasible for both mutation testing and immunohistochemical assays and must be built into the workup of such trials. ISRCTN90061564. Published by the BMJ Publishing Group Limited.

  18. Development and validation of a multi-analyte method for the regulatory control of carotenoids used as feed additives in fish and poultry feed.

    PubMed

    Vincent, Ursula; Serano, Federica; von Holst, Christoph

    2017-08-01

    Carotenoids are used in animal nutrition mainly as sensory additives that favourably affect the colour of fish, birds and food of animal origin. Various analytical methods exist for their quantification in compound feed, reflecting the different physico-chemical characteristics of the carotenoids and the corresponding feed additives. They may be natural products or specific formulations containing the target carotenoids produced by chemical synthesis. In this study a multi-analyte method was developed that can be applied to the determination of all 10 carotenoids currently authorised within the European Union for compound feedingstuffs. The method functions regardless of whether the carotenoids have been added to the compound feed via natural products or specific formulations. It comprises three steps: (1) digestion of the feed sample with an enzyme; (2) pressurised liquid extraction; and (3) quantification of the analytes by reversed-phase HPLC coupled to a photodiode array detector in the visible range. The method was single-laboratory validated for poultry and fish feed covering a mass fraction range of the target analytes from 2.5 to 300 mg kg⁻¹. The following method performance characteristics were obtained: the recovery rate varied from 82% to 129%, and precision, expressed as the relative standard deviation of intermediate precision, varied from 1.6% to 15%. Based on the acceptable performance obtained in the validation study, the multi-analyte method is considered fit for the intended purpose.
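
    The two performance characteristics reported above, recovery and intermediate-precision RSD, are straightforward to compute from replicate results on a fortified sample. The spiking level and replicate values below are invented for illustration.

        import numpy as np

        spike_level = 50.0   # mg/kg, hypothetical fortification level
        # Replicate results obtained on different days (intermediate precision)
        measured = np.array([47.1, 52.3, 49.8, 51.0, 44.9, 53.6])

        recovery = 100.0 * measured.mean() / spike_level
        rsd = 100.0 * measured.std(ddof=1) / measured.mean()
        print(f"recovery = {recovery:.1f}%, intermediate-precision RSD = {rsd:.1f}%")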

  19. Toxicologic evaluation of analytes from Tank 241-C-103

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahlum, D.D.; Young, J.Y.; Weller, R.E.

    1994-11-01

    Westinghouse Hanford Company requested PNL to assemble a toxicology review panel (TRP) to evaluate analytical data compiled by WHC and provide advice concerning potential health effects associated with exposure to tank-vapor constituents. The team's objectives were to (1) review procedures used for sampling vapors from tanks, (2) identify constituents in tank-vapor samples that could be related to symptoms reported by workers, (3) evaluate the toxicological implications of those constituents by comparison to established toxicological databases, (4) provide advice for additional analytical efforts, and (5) support other activities as requested by WHC. The TRP represents a wide range of expertise, including toxicology, industrial hygiene, and occupational medicine. The TRP prepared a list of target analytes that chemists at the Oregon Graduate Institute/Sandia (OGI), Oak Ridge National Laboratory (ORNL), and PNL used to establish validated methods for quantitative analysis of head-space vapors from Tank 241-C-103. This list was used by the analytical laboratories to develop appropriate analytical methods for samples from Tank 241-C-103. Target compounds on the list included acetone, acetonitrile, ammonia, benzene, 1,3-butadiene, butanal, n-butanol, hexane, 2-hexanone, methylene chloride, nitric oxide, nitrogen dioxide, nitrous oxide, dodecane, tridecane, propanenitrile, sulfur oxide, tributyl phosphate, and vinylidene chloride. The TRP considered constituent concentrations, current exposure limits, reliability of data relative to toxicity, consistency of the analytical data, and whether the material was carcinogenic or teratogenic. A final consideration in the analyte selection process was to include representative chemicals for each class of compounds found.

  20. Reliability and Validity of a Procedure to Measure Diagnostic Reasoning and Problem-Solving Skills Taught in Predoctoral Orthodontic Education.

    ERIC Educational Resources Information Center

    Albanese, Mark A.; Jacobs, Richard M.

    1990-01-01

    The reliability and validity of a procedure to measure diagnostic-reasoning and problem-solving skills taught in predoctoral orthodontic education were studied using 68 second year dental students. The procedure includes stimulus material and 33 multiple-choice items. It is a feasible way of assessing problem-solving skills in dentistry education…

  1. Wetting boundary condition for the color-gradient lattice Boltzmann method: Validation with analytical and experimental data

    NASA Astrophysics Data System (ADS)

    Akai, Takashi; Bijeljic, Branko; Blunt, Martin J.

    2018-06-01

    In the color gradient lattice Boltzmann model (CG-LBM), a fictitious-density wetting boundary condition has been widely used because of its ease of implementation. However, as we show, this may lead to inaccurate results in some cases. In this paper, a new scheme for the wetting boundary condition is proposed which can handle complicated 3D geometries. The validity of our method for static problems is demonstrated by comparing the simulated results to analytical solutions in 2D and 3D geometries with curved boundaries. Then, capillary rise simulations are performed to study dynamic problems where the three-phase contact line moves. The results are compared to experimental results in the literature (Heshmati and Piri, 2014). If a constant contact angle is assumed, the simulations agree with the analytical solution based on the Lucas-Washburn equation. However, to match the experiments, we need to implement a dynamic contact angle that varies with the flow rate.
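
    For the capillary-rise comparison described above, the Lucas-Washburn picture balances capillary pressure against viscous (Poiseuille) drag and gravity. A minimal numerical sketch, assuming illustrative fluid and tube parameters rather than those of the cited experiments, is:

        import numpy as np

        # Illustrative parameters: water in a 0.5 mm radius capillary
        sigma, theta, r = 0.072, np.radians(30.0), 0.5e-3   # N/m, contact angle, m
        mu, rho, g = 1.0e-3, 1000.0, 9.81                   # Pa*s, kg/m^3, m/s^2

        def capillary_rise(t_end, dt=1e-6):
            """Integrate dh/dt = dP * r**2 / (8 mu h), dP = 2 sigma cos(theta)/r - rho g h,
            using u = h**2 so that du/dt = dP * r**2 / (4 mu), which removes the
            1/h singularity at t = 0."""
            u = 0.0
            for _ in range(round(t_end / dt)):
                dp = 2.0 * sigma * np.cos(theta) / r - rho * g * np.sqrt(u)
                u += dp * r**2 / (4.0 * mu) * dt
            return np.sqrt(u)

        t = 1.0e-3  # s, early time where the gravity-free Washburn asymptote applies
        washburn = np.sqrt(sigma * r * np.cos(theta) * t / (2.0 * mu))
        print(f"numerical h = {capillary_rise(t)*1e3:.2f} mm, Washburn = {washburn*1e3:.2f} mm")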

  2. Assessing the Reliability and Validity of Multi-Attribute Utility Procedures: An Application of the Theory of Generalizability

    DTIC Science & Technology

    1975-07-01

    AD-A016 282. Assessing the Reliability and Validity of Multi-Attribute Utility Procedures: An Application of the Theory of Generalizability. ... more complicated and use data from actual experiments. Example 1: Analysis of raters making importance judgments about attributes. In MAU studies ... generalizability of JUDGE as contrasted to UASC. To do this, we will reanalyze the data for each system separately. This is valid since the initial

  3. VALIDATION OF STANDARD ANALYTICAL PROTOCOL FOR ...

    EPA Pesticide Factsheets

    There is a growing concern with the potential for terrorist use of chemical weapons to cause civilian harm. In the event of an actual or suspected outdoor release of chemically hazardous material in a large area, the extent of contamination must be determined. This requires a system with the ability to prepare and quickly analyze a large number of contaminated samples for the traditional chemical agents, as well as numerous toxic industrial chemicals. Liquid samples (both aqueous and organic), solid samples (e.g., soil), vapor samples (e.g., air) and mixed state samples, all ranging from household items to deceased animals, may require some level of analyses. To meet this challenge, the U.S. Environmental Protection Agency (U.S. EPA) National Homeland Security Research Center, in collaboration with experts from across U.S. EPA and other Federal Agencies, initiated an effort to identify analytical methods for the chemical and biological agents that could be used to respond to a terrorist attack or a homeland security incident. U.S. EPA began development of standard analytical protocols (SAPs) for laboratory identification and measurement of target agents in case of a contamination threat. These methods will be used to help assist in the identification of existing contamination, the effectiveness of decontamination, as well as clearance for the affected population to reoccupy previously contaminated areas. One of the first SAPs developed was for the determin

  4. ProMIS augmented reality training of laparoscopic procedures face validity.

    PubMed

    Botden, Sanne M B I; Buzink, Sonja N; Schijven, Marlies P; Jakimowicz, Jack J

    2008-01-01

    Conventional video trainers lack the ability to assess the trainee objectively, but offer modalities that are often missing in virtual reality simulation, such as realistic haptic feedback. The ProMIS augmented reality laparoscopic simulator retains the benefit of a traditional box trainer, by using original laparoscopic instruments and tactile tasks, but additionally generates objective measures of performance. Fifty-five participants performed a "basic skills" and "suturing and knot-tying" task on ProMIS, after which they filled out a questionnaire regarding realism, haptics, and didactic value of the simulator, on a 5-point-Likert scale. The participants were allotted to 2 experience groups: "experienced" (>50 procedures and >5 sutures; N = 27), and "moderately experienced" (<50 procedures and <5 sutures; N = 28). General consensus among all participants, particularly the experienced, was that ProMIS is a useful tool for training (mean: 4.67, SD: 0.48). It was considered very realistic (mean: 4.44, SD: 0.66), with good haptics (mean: 4.10, SD: 0.97) and didactic value (mean 4.10, SD: 0.65). This study established the face validity of the ProMIS augmented reality simulator for "basic skills" and "suturing and knot-tying" tasks. ProMIS was considered a good tool for training in laparoscopic skills for surgical residents and surgeons.

  5. Bio-analytical method development and validation of Rasagiline by high performance liquid chromatography tandem mass spectrometry detection and its application to pharmacokinetic study

    PubMed Central

    Konda, Ravi Kumar; Chandu, Babu Rao; Challa, B.R.; Kothapalli, Chandrasekhar B.

    2012-01-01

    The most suitable bio-analytical method based on liquid–liquid extraction has been developed and validated for quantification of Rasagiline in human plasma. Rasagiline-13C3 mesylate was used as an internal standard for Rasagiline. Zorbax Eclipse Plus C18 (2.1 mm×50 mm, 3.5 μm) column provided chromatographic separation of analyte followed by detection with mass spectrometry. The method involved simple isocratic chromatographic condition and mass spectrometric detection in the positive ionization mode using an API-4000 system. The total run time was 3.0 min. The proposed method has been validated with the linear range of 5–12000 pg/mL for Rasagiline. The intra-run and inter-run precision values were within 1.3%–2.9% and 1.6%–2.2% respectively for Rasagiline. The overall recovery for Rasagiline and Rasagiline-13C3 mesylate analog was 96.9% and 96.7% respectively. This validated method was successfully applied to the bioequivalence and pharmacokinetic study of human volunteers under fasting condition. PMID:29403764

  6. An analytical procedure for the determination of aluminum used in antiperspirants on human skin in Franz™ diffusion cell.

    PubMed

    Guillard, Olivier; Fauconneau, Bernard; Favreau, Frédéric; Marrauld, Annie; Pineau, Alain

    2012-04-01

    A local case report of hyperaluminemia (aluminum concentration: 3.88 µmol/L) in a woman using an aluminum-containing antiperspirant for 4 years raises the question of possible transdermal uptake of aluminum salt as a future public health problem. Prior to studying the transdermal uptake of three commercialized cosmetic formulas, an analytical assay of aluminum (Al) in chlorohydrate form (ACH) by Zeeman Electrothermal Atomic Absorption Spectrophotometry (ZEAAS) in a clean room was optimized and validated. This analysis was performed with different media on human skin using a Franz™ diffusion cell. The detection and quantification limits were set at ≤ 3 µg/L. Precision analysis, as within-run (n = 12) and between-run (n = 15-68 days), yielded CVs ≤ 6%. The high analytical sensitivity (2-3 µg/L) and low variability should allow an in vitro study of the transdermal uptake of ACH.

  7. Analytical validation of an ultra low-cost mobile phone microplate reader for infectious disease testing.

    PubMed

    Wang, Li-Ju; Naudé, Nicole; Demissie, Misganaw; Crivaro, Anne; Kamoun, Malek; Wang, Ping; Li, Lei

    2018-07-01

    Most mobile health (mHealth) diagnostic devices for laboratory tests only analyze one sample at a time, which is not suitable for large-volume serology testing, especially in low-resource settings with a shortage of health professionals. In this study, we developed an ultra-low-cost, clinically accurate mobile phone microplate reader (mReader), and clinically validated this optical device for 12 infectious disease tests. The mReader optically reads 96 samples on a microplate at one time. 771 de-identified patient samples were tested in 12 serology assays for bacterial/viral infections. The mReader and the clinical instrument blindly read and analyzed all tests in parallel. The analytical accuracy and the diagnostic performance of the mReader were evaluated across the clinical reportable categories by comparison with clinical laboratory testing results. The mReader exhibited 97.59-99.90% analytical accuracy and <5% coefficient of variation (CV). The positive percent agreement (PPA) in all 12 tests achieved 100%, negative percent agreement (NPA) was higher than 83% except for one test (42.86%), and overall percent agreement (OPA) ranged from 89.33% to 100%. We envision that the mReader can benefit underserved areas/populations and low-resource settings in rural clinics/hospitals at a low cost (~$50 USD) with clinical-level analytical quality. It has the potential to improve health access, speed up healthcare delivery, and reduce health and education disparities by providing access to a low-cost spectrophotometer. Copyright © 2018 Elsevier B.V. All rights reserved.
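
    PPA, NPA and OPA as used above follow directly from a 2x2 comparison of the device calls against the clinical reference method. The sketch below uses invented counts for a single hypothetical assay.

        def percent_agreement(tp, fp, fn, tn):
            """Positive/negative/overall percent agreement vs. a reference method."""
            ppa = 100.0 * tp / (tp + fn)   # reference-positive samples called positive
            npa = 100.0 * tn / (tn + fp)   # reference-negative samples called negative
            opa = 100.0 * (tp + tn) / (tp + fp + fn + tn)
            return ppa, npa, opa

        # Hypothetical counts for one serology assay: mReader vs. clinical instrument
        ppa, npa, opa = percent_agreement(tp=58, fp=3, fn=0, tn=14)
        print(f"PPA {ppa:.1f}%, NPA {npa:.1f}%, OPA {opa:.1f}%")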

  8. Validating Vegetable Production Unit (VPU) Plants, Protocols, Procedures and Requirements (P3R) using Currently Existing Flight Resources

    NASA Technical Reports Server (NTRS)

    Bingham, Gail; Bates, Scott; Bugbee, Bruce; Garland, Jay; Podolski, Igor; Levinskikh, Rita; Sychev, Vladimir; Gushin, Vadim

    2009-01-01

    Validating Vegetable Production Unit (VPU) Plants, Protocols, Procedures and Requirements (P3R) Using Currently Existing Flight Resources (Lada-VPU-P3R) is a study to advance the technology required for plant growth in microgravity and to research related food safety issues. Lada-VPU-P3R also investigates the non-nutritional value to the flight crew of developing plants on-orbit. The Lada-VPU-P3R uses the Lada hardware on the ISS and falls under a cooperative agreement between National Aeronautics and Space Administration (NASA) and the Russian Federal Space Association (FSA). Research Summary: Validating Vegetable Production Unit (VPU) Plants, Protocols, Procedures and Requirements (P3R) Using Currently Existing Flight Resources (Lada-VPU-P3R) will optimize hardware and

  9. What Do You Think You Are Measuring? A Mixed-Methods Procedure for Assessing the Content Validity of Test Items and Theory-Based Scaling

    PubMed Central

    Koller, Ingrid; Levenson, Michael R.; Glück, Judith

    2017-01-01

    The valid measurement of latent constructs is crucial for psychological research. Here, we present a mixed-methods procedure for improving the precision of construct definitions, determining the content validity of items, evaluating the representativeness of items for the target construct, generating test items, and analyzing items on a theoretical basis. To illustrate the mixed-methods content-scaling-structure (CSS) procedure, we analyze the Adult Self-Transcendence Inventory, a self-report measure of wisdom (ASTI, Levenson et al., 2005). A content-validity analysis of the ASTI items was used as the basis of psychometric analyses using multidimensional item response models (N = 1215). We found that the new procedure produced important suggestions concerning five subdimensions of the ASTI that were not identifiable using exploratory methods. The study shows that the application of the suggested procedure leads to a deeper understanding of latent constructs. It also demonstrates the advantages of theory-based item analysis. PMID:28270777

  10. A Process Analytical Technology (PAT) approach to control a new API manufacturing process: development, validation and implementation.

    PubMed

    Schaefer, Cédric; Clicq, David; Lecomte, Clémence; Merschaert, Alain; Norrant, Edith; Fotiadu, Frédéric

    2014-03-01

    Pharmaceutical companies are progressively adopting and introducing the Process Analytical Technology (PAT) and Quality-by-Design (QbD) concepts promoted by the regulatory agencies, aiming to build quality directly into the product by combining thorough scientific understanding and quality risk management. An analytical method based on near-infrared (NIR) spectroscopy was developed as a PAT tool to control on-line an API (active pharmaceutical ingredient) manufacturing crystallization step, during which the API and residual solvent contents need to be precisely determined to reach the predefined seeding point. An original methodology based on the QbD principles was designed to conduct the development and validation of the NIR method and to ensure that it is fit for its intended use. On this basis, partial least squares (PLS) models were developed and optimized using chemometric methods. The method was fully validated according to the ICH Q2(R1) guideline and using the accuracy profile approach. The dosing ranges were established as 9.0-12.0% w/w for the API and 0.18-1.50% w/w for the residual methanol. As the variability of the sampling method and the reference method are by nature included in the variability obtained for the NIR method during the validation phase, a real-time process monitoring exercise was performed to prove its fitness for purpose. The implementation of this in-process control (IPC) method on the industrial plant from the launch of the new API synthesis process will enable automatic control of the final crystallization step in order to ensure a predefined quality level of the API. In addition, several valuable benefits are expected, including reduction of the process time and suppression of a rather difficult sampling step and tedious off-line analyses. © 2013 Published by Elsevier B.V.
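
    A minimal sketch of the chemometric core, a PLS model regressing NIR spectra onto API content with cross-validation, is given below. The synthetic spectra, band position and noise level are assumptions; the published method additionally involves QbD-driven optimization and accuracy-profile validation.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(0)
        api = rng.uniform(9.0, 12.0, 60)               # % w/w, the validated API range
        wl = np.linspace(1100, 2500, 200)              # nm, synthetic NIR wavelength axis
        band = np.exp(-0.5 * ((wl - 1700) / 40) ** 2)  # one API-correlated absorption band
        X = np.outer(api, band) + 0.05 * rng.standard_normal((60, 200))

        pls = PLSRegression(n_components=3)
        pred = cross_val_predict(pls, X, api, cv=10).ravel()
        rmsecv = np.sqrt(np.mean((pred - api) ** 2))
        print(f"RMSECV = {rmsecv:.3f} % w/w")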

  11. TOPEX Microwave Radiometer - Thermal design verification test and analytical model validation

    NASA Technical Reports Server (NTRS)

    Lin, Edward I.

    1992-01-01

    The testing of the TOPEX Microwave Radiometer (TMR) is described in terms of hardware development based on the modeling and thermal vacuum testing conducted. The TMR and the vacuum-test facility are described, and the thermal verification test included a hot steady-state segment, a cold steady-state segment, and a cold survival-mode segment, totalling 65 hours. A graphic description is given of the test history in terms of temperature tracking, and two multinode TMR test-chamber models are compared to the test results. Large discrepancies between the test data and the model predictions are attributed to contact conductance, effective emittance from the multilayer insulation, and heat leaks related to deviations from the flight configuration. The TMR thermal testing/modeling effort is shown to provide technical corrections for the procedure outlined, and the need for validating predictive models is underscored.

  12. Validation of an updated fractionation and indirect speciation procedure for inorganic arsenic in oxic and suboxic soils and sediments.

    PubMed

    Lock, Alan; Wallschläger, Dirk; McMurdo, Colin; Tyler, Laura; Belzile, Nelson; Spiers, Graeme

    2016-12-01

    A sequential extraction procedure (SEP) for the speciation analysis of As(III) and As(V) in oxic and suboxic soils and sediments was validated using a natural lake sediment and three certified reference materials, as well as spike recoveries of As(III) and As(V). Many of the extraction steps have been previously validated, making the procedure useful for comparisons to similar previous SEP studies. The novel aspect of this research is the validation for the SEP to maintain As(III) and As(V) species. The proposed five-step extraction procedure includes the extraction agents (NH4)2SO4, NH4H2PO4, H3PO4 + NH2OH·HCl, oxalate + ascorbic acid (heated), and HNO3 + HCl + HF, targeting operationally defined easily exchangeable, strongly sorbed, amorphous Fe oxide bound, crystalline Fe oxide bound, and residual As fractions, respectively. The third extraction step, H3PO4 + NH2OH·HCl, has not been previously validated for fraction selectivity. We present evidence for this extraction step to target As complexed with amorphous Fe oxides when used in the SEP proposed here. All solutions were analyzed on ICP-MS. The greatest concentrations of As were extracted from the amorphous Fe oxide fraction, and the dominant species was As(V). Lake sediment materials were found to have higher As(III) concentrations than the soil materials. Because different soils/sediments have different chemical characteristics, maintenance of As species during extractions must be validated for specific soil/sediment types using spiking experiments. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Analytical methods manual for the Mineral Resource Surveys Program, U.S. Geological Survey

    USGS Publications Warehouse

    Arbogast, Belinda F.

    1996-01-01

    The analytical methods validated by the Mineral Resource Surveys Program, Geologic Division, are the subject of this manual. This edition replaces the methods portion of Open-File Report 90-668, published in 1990. Newer methods may be used which have been approved by the quality assurance (QA) project and are on file with the QA coordinator. This manual is intended primarily for use by laboratory scientists; it can also assist laboratory users to evaluate the data they receive. The analytical methods are written in a step-by-step approach so that they may be used as a training tool and provide detailed documentation of the procedures for quality assurance. A "Catalog of Services" is available for customer (submitter) use, with brief listings of: the element(s)/species determined, method of determination, reference to cite, contact person, summary of the technique, and analyte concentration range. For a copy please contact the Branch office at (303) 236-1800 or fax (303) 236-3200.

  14. Solid sorbent air sampling and analytical procedure for methyl-, dimethyl-, ethyl-, and diethylamine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elskamp, C.J.; Schultz, G.R.

    1986-01-01

    A sampling and analytical procedure for methyl-, dimethyl-, ethyl-, and diethylamine was developed in order to avoid problems typically encountered in the sampling and analysis of low molecular weight aliphatic amines. Samples are collected with adsorbent tubes containing Amberlite XAD-7 resin coated with the derivatizing reagent, NBD chloride (7-chloro-4-nitrobenzo-2-oxa-1,3-diazole). Analysis is performed by high performance liquid chromatography with the use of a fluorescence and/or UV/visible detector. All four amines can be monitored simultaneously, and neither collection nor storage is affected by humidity. Samples are stable at room temperature for at least two weeks. The methodology has been tested for each of the four amines at sample loadings equivalent to air concentration ranges of 0.5 to 30 ppm for a sample volume of 10 liters. The method shows promise for determining other airborne primary and secondary low molecular weight aliphatic amines.

  15. Analytic Validation of RNA In Situ Hybridization (RISH) for AR and AR-V7 Expression in Human Prostate Cancer

    PubMed Central

    Guedes, Liana B.; Morais, Carlos L.; Almutairi, Fawaz; Haffner, Michael C.; Zheng, Qizhi; Isaacs, John T.; Antonarakis, Emmanuel S.; Lu, Changxue; Tsai, Harrison; Luo, Jun; De Marzo, Angelo M.; Lotan, Tamara L.

    2016-01-01

    Purpose RNA expression of androgen receptor splice variants may be a biomarker of resistance to novel androgen deprivation therapies in castrate resistant prostate cancer (CRPC). We analytically validated an RNA in situ hybridization (RISH) assay for total AR and AR-V7 for use in formalin fixed paraffin embedded (FFPE) prostate tumors. Experimental Design We used prostate cell lines and xenografts to validate chromogenic RISH to detect RNA containing AR exon 1 (AR-E1, surrogate for total AR RNA species) and cryptic exon 3 (AR-CE3, surrogate for AR-V7 expression). RISH signals were quantified in FFPE primary tumors and CRPC specimens, comparing to known AR and AR-V7 status by immunohistochemistry and RT-PCR. Results The quantified RISH results correlated significantly with total AR and AR-V7 levels by RT-PCR in cell lines, xenografts and autopsy metastases. Both AR-E1 and AR-CE3 RISH signals were localized in nuclear punctae in addition to the expected cytoplasmic speckles. Compared to admixed benign glands, AR-E1 expression was significantly higher in primary tumor cells with a median fold increase of 3.0 and 1.4 in two independent cohorts (p<0.0001 and p=0.04, respectively). While AR-CE3 expression was detectable in primary prostatic tumors, levels were substantially higher in a subset of CRPC metastases and cell lines, and were correlated with AR-E1 expression. Conclusions RISH for AR-E1 and AR-CE3 is an analytically valid method to examine total AR and AR-V7 RNA levels in FFPE tissues. Future clinical validation studies are required to determine whether AR RISH is a prognostic or predictive biomarker in specific clinical contexts. PMID:27166397

  16. A new procedure for investigating three-dimensional stress fields in a thin plate with a through-the-thickness crack

    NASA Astrophysics Data System (ADS)

    Yi, Dake; Wang, TzuChiang

    2018-06-01

    In the paper, a new procedure is proposed to investigate three-dimensional fracture problems of a thin elastic plate with a long through-the-thickness crack under remote uniform tensile loading. The new procedure includes a new analytical method and highly accurate finite element simulations. In the theoretical analysis, three-dimensional Maxwell stress functions are employed in order to derive the three-dimensional crack tip fields. Based on the theoretical analysis, an equation which describes the relationship among the three-dimensional J-integral J(z), the stress intensity factor K(z) and the tri-axial stress constraint level Tz(z) is derived first. In the finite element simulations, a fine mesh including 153360 elements is constructed to compute the stress field near the crack front, J(z) and Tz(z). Numerical results show that in the plane very close to the free surface, the K field solution is still valid for in-plane stresses. Comparison with the numerical results shows that the analytical results are valid.

  17. Validation of NASA Thermal Ice Protection Computer Codes. Part 1; Program Overview

    NASA Technical Reports Server (NTRS)

    Miller, Dean; Bond, Thomas; Sheldon, David; Wright, William; Langhals, Tammy; Al-Khalil, Kamel; Broughton, Howard

    1996-01-01

    The Icing Technology Branch at NASA Lewis has been involved in an effort to validate two thermal ice protection codes developed at the NASA Lewis Research Center: LEWICE/Thermal (electrothermal de-icing and anti-icing) and ANTICE (hot-gas and electrothermal anti-icing). The Thermal Code Validation effort was designated as a priority during a 1994 'peer review' of the NASA Lewis icing program and was implemented as a cooperative effort with industry. During April 1996, the first of a series of experimental validation tests was conducted in the NASA Lewis Icing Research Tunnel (IRT). The purpose of the April 1996 test was to validate the electrothermal predictive capabilities of both LEWICE/Thermal and ANTICE. A heavily instrumented test article was designed and fabricated for this test, with the capability of simulating electrothermal de-icing and anti-icing modes of operation. Thermal measurements were then obtained over a range of test conditions, for comparison with analytical predictions. This paper presents an overview of the test, including a detailed description of: (1) the validation process; (2) test article design; (3) test matrix development; and (4) test procedures. Selected experimental results are presented for de-icing and anti-icing modes of operation. Finally, the status of the validation effort is summarized. Detailed comparisons between analytical predictions and experimental results are contained in the following two papers: 'Validation of NASA Thermal Ice Protection Computer Codes: Part 2 - The Validation of LEWICE/Thermal' and 'Validation of NASA Thermal Ice Protection Computer Codes: Part 3 - The Validation of ANTICE'.

  18. Valid analytical performance specifications for combined analytical bias and imprecision for the use of common reference intervals.

    PubMed

    Hyltoft Petersen, Per; Lund, Flemming; Fraser, Callum G; Sandberg, Sverre; Sölétormos, György

    2018-01-01

    Background Many clinical decisions are based on comparison of patient results with reference intervals. Therefore, an estimation of the analytical performance specifications for the quality that would be required to allow sharing common reference intervals is needed. The International Federation of Clinical Chemistry (IFCC) recommended a minimum of 120 reference individuals to establish reference intervals. This number implies a certain level of quality, which could then be used for defining analytical performance specifications as the maximum combination of analytical bias and imprecision required for sharing common reference intervals, the aim of this investigation. Methods Two methods were investigated for defining the maximum combination of analytical bias and imprecision that would give the same quality of common reference intervals as the IFCC recommendation. Method 1 is based on a formula for the combination of analytical bias and imprecision and Method 2 is based on the Microsoft Excel formula NORMINV including the fractional probability of reference individuals outside each limit and the Gaussian variables of mean and standard deviation. The combinations of normalized bias and imprecision are illustrated for both methods. The formulae are identical for Gaussian and log-Gaussian distributions. Results Method 2 gives the correct results with a constant percentage of 4.4% for all combinations of bias and imprecision. Conclusion The Microsoft Excel formula NORMINV is useful for the estimation of analytical performance specifications for both Gaussian and log-Gaussian distributions of reference intervals.
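
    Method 2 translates directly into any environment with a Gaussian quantile/CDF function (scipy's norm.ppf is the NORMINV analogue). The sketch below computes the fraction of results falling outside nominal 95% reference limits for a given combination of bias and imprecision, in units normalized to the reference-population standard deviation; it illustrates the type of calculation, not the paper's exact formula, which is tied to the 120-sample quality level.

        from scipy.stats import norm

        # norm.ppf is the NORMINV analogue: norm.ppf(0.975) ~ 1.96
        def fraction_outside(bias, imprecision, z=norm.ppf(0.975)):
            """Fraction of results outside the original reference limits when the
            measurement adds bias and imprecision, both expressed in units of the
            reference-population standard deviation (illustrative formulation)."""
            spread = (1.0 + imprecision**2) ** 0.5
            return norm.cdf((-z - bias) / spread) + norm.sf((z - bias) / spread)

        for bias, cv in [(0.0, 0.0), (0.25, 0.0), (0.0, 0.5), (0.2, 0.3)]:
            print(f"bias={bias:.2f}, imprecision={cv:.2f} -> "
                  f"{100 * fraction_outside(bias, cv):.2f}% outside")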

  19. 40 CFR 600.108-08 - Analytical gases.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... ECONOMY AND GREENHOUSE GAS EXHAUST EMISSIONS OF MOTOR VEHICLES Fuel Economy and Carbon-Related Exhaust Emission Test Procedures § 600.108-08 Analytical gases. The analytical gases for all fuel economy testing...

  20. 40 CFR 600.108-08 - Analytical gases.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... ECONOMY AND GREENHOUSE GAS EXHAUST EMISSIONS OF MOTOR VEHICLES Fuel Economy and Carbon-Related Exhaust Emission Test Procedures § 600.108-08 Analytical gases. The analytical gases for all fuel economy testing...

  1. Reliability and validity of procedure-based assessments in otolaryngology training.

    PubMed

    Awad, Zaid; Hayden, Lindsay; Robson, Andrew K; Muthuswamy, Keerthini; Tolley, Neil S

    2015-06-01

    To investigate the reliability and construct validity of procedure-based assessment (PBA) in assessing performance and progress in otolaryngology training. Retrospective database analysis using a national electronic database. We analyzed PBAs of otolaryngology trainees in North London, from core trainees (CTs) to specialty trainees (STs). The tool contains six multi-item domains: consent, planning, preparation, exposure/closure, technique, and postoperative care, rated as "satisfactory" or "development required," in addition to an overall performance rating (pS) of 1 to 4. The individual domain score, overall calculated score (cS), and number of "development-required" items were calculated for each PBA. Receiver operating characteristic analysis helped determine sensitivity and specificity. A total of 3,152 otolaryngology PBAs from 46 otolaryngology trainees were analyzed. PBA reliability was high (Cronbach's α 0.899), and sensitivity approached 99%. cS correlated positively with pS and level in training (rs: +0.681 and +0.324, respectively). STs had higher cS and pS than CTs (93% ± 0.6 and 3.2 ± 0.03 vs. 71% ± 3.1 and 2.3 ± 0.08, respectively; P < .001). cS and pS increased from CT1 to ST8, showing construct validity (rs: +0.348 and +0.354, respectively; P < .001). The technical skill domain had the highest utilization (98% of PBAs) and was the best predictor of cS and pS (rs: +0.96 and +0.66, respectively). PBA is reliable and valid for assessing otolaryngology trainees' performance and progress at all levels. It is highly sensitive in identifying competent trainees. The tool is used in a formative and feedback capacity. The technical domain is the best predictor and should be given close attention. © 2014 The American Laryngological, Rhinological and Otological Society, Inc.
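
    Cronbach's α, the internal-consistency statistic quoted above, is computed from item variances and the variance of the summed score. The sketch below uses simulated binary domain ratings; the data-generating assumptions are invented, not the study's PBA records.

        import numpy as np

        def cronbach_alpha(items):
            """Cronbach's alpha for an (n_assessments, n_items) score matrix:
            alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
            k = items.shape[1]
            item_var = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1.0 - item_var / total_var)

        # Simulated ratings: 100 assessments x 6 domains, driven by a common
        # latent 'competence' so that the items correlate (all values invented)
        rng = np.random.default_rng(3)
        competence = rng.normal(size=(100, 1))
        ratings = (competence + 0.5 * rng.normal(size=(100, 6)) > 0).astype(float)
        print(f"Cronbach's alpha = {cronbach_alpha(ratings):.3f}")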

  2. Automatic computer procedure for generating exact and analytical kinetic energy operators based on the polyspherical approach: General formulation and removal of singularities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ndong, Mamadou; Lauvergnat, David; Nauts, André

    2013-11-28

    We present new techniques for an automatic computation of the kinetic energy operator in analytical form. These techniques are based on the use of the polyspherical approach and are extended to take into account Cartesian coordinates as well. An automatic procedure is developed where analytical expressions are obtained by symbolic calculations. This procedure is a full generalization of the one presented in Ndong et al. [J. Chem. Phys. 136, 034107 (2012)]. The correctness of the new implementation is analyzed by comparison with results obtained from the TNUM program. We give several illustrations that could be useful for users of the code. In particular, we discuss some cyclic compounds which are important in photochemistry. Among others, we show that choosing a well-adapted parameterization and decomposition into subsystems can allow one to avoid singularities in the kinetic energy operator. We also discuss a relation between polyspherical and Z-matrix coordinates: this comparison could be helpful for building an interface between the new code and a quantum chemistry package.

  3. Analytical procedures for the determination of selected trace elements in peat and plant samples by inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Krachler, Michael; Mohl, Carola; Emons, Hendrik; Shotyk, William

    2002-08-01

    A simple, robust and reliable analytical procedure for the determination of 15 elements, namely Ca, V, Cr, Mn, Co, Ni, Cu, Zn, Rb, Ag, Cd, Ba, Tl, Th and U, in peat and plant materials by inductively coupled plasma-quadrupole mass spectrometry (ICP-QMS) was developed. Powdered sample aliquots of approximately 220 mg were dissolved with various acid mixtures in a microwave-heated high-pressure autoclave capable of digesting 40 samples simultaneously. The selection of appropriate amounts of digestion acids (nitric acid, hydrofluoric acid or tetrafluoroboric acid) was crucial to obtain accurate results. The optimized acid mixture for digestion of plant and peat samples consisted of 3 ml HNO3 and 0.1 ml HBF4. An ultrasonic nebulizer with an additional membrane desolvation unit was found beneficial for the determination of Co, Ni, Ag, Tl, Th and U, allowing a dry sample aerosol to be aspirated into the ICP-QMS. A pneumatic cross-flow nebulizer served as the sample introduction device for the other elements. Internal standardization was achieved with ¹⁰³Rh for all elements, except for Th, whose ICP-QMS signals were corrected with ¹⁰³Rh and ¹⁸⁵Re. Quality control was ascertained by analysis of the certified plant reference material GBW 07602 Bush Branches and Leaves. In almost all cases HNO3 alone could not fully liberate the analytes of interest from the peat or plant matrix, probably because of the silicates present. After adding small amounts (0.05-0.1 ml) of either HF or HBF4 to the digestion mixture, the concentrations quantified by ICP-QMS generally increased significantly, in the case of Rb by up to 80%. Further increasing the volumes of HF or HBF4, in turn, resulted in a loss of recoveries of almost all elements, in some cases amounting to approximately 60%. The successful analytical procedures were applied to the determination of two bulk peat materials. In general, good agreement was found between the concentrations obtained and results from an inter-laboratory trial or from instrumental
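
    Internal standardization as described above scales each analyte signal by the recovery of the ¹⁰³Rh internal standard. A minimal sketch with invented count rates:

        def internal_std_correct(analyte_cps, rh_cps_sample, rh_cps_standard):
            """Scale the analyte intensity by the 103Rh recovery to compensate
            for matrix effects and instrumental drift (invented count rates)."""
            return analyte_cps * (rh_cps_standard / rh_cps_sample)

        raw = 5.2e4  # counts/s for Tl in a hypothetical peat digest
        corrected = internal_std_correct(raw, rh_cps_sample=8.8e5, rh_cps_standard=1.0e6)
        print(f"corrected intensity: {corrected:.3e} counts/s")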

  4. Semi-analytical Model for Estimating Absorption Coefficients of Optically Active Constituents in Coastal Waters

    NASA Astrophysics Data System (ADS)

    Wang, D.; Cui, Y.

    2015-12-01

    The objectives of this paper are to validate the applicability of a multi-band quasi-analytical algorithm (QAA) for retrieving absorption coefficients of optically active constituents in turbid coastal waters, and to further improve the model using a proposed semi-analytical model (SAA). In the SAA model, ap(531) and ag(531) are derived semi-analytically, in contrast to the retrieval procedure of the QAA model, where ap(531) and ag(531) are derived from the empirical retrieval results of a(531) and a(551). The two models are calibrated and evaluated against datasets taken from 19 independent cruises on the West Florida Shelf in 1999-2003, provided by SeaBASS. The results indicate that the SAA model produces superior performance to the QAA model in absorption retrieval. Using the SAA model to retrieve absorption coefficients of optically active constituents from the West Florida Shelf decreases the random uncertainty of estimation by >23.05% relative to the QAA model. This study demonstrates the potential of the SAA model for estimating absorption coefficients of optically active constituents even in turbid coastal waters. Keywords: Remote sensing; Coastal Water; Absorption Coefficient; Semi-analytical Model

  5. Analytical Approach Validation for the Spin-Stabilized Satellite Attitude

    NASA Technical Reports Server (NTRS)

    Zanardi, Maria Cecilia F. P. S.; Garcia, Roberta Veloso; Kuga, Helio Koiti

    2007-01-01

    An analytical approach to spin-stabilized spacecraft attitude prediction is presented for the influence of residual magnetic torques on a satellite in an elliptical orbit. Assuming a quadrupole model for the Earth's magnetic field, an analytical averaging method is applied to obtain the mean residual torque over every orbital period. The orbit mean anomaly is used to compute the average components of the residual torque in the spacecraft body frame reference system. The theory is developed for time variations in the orbital elements, giving rise to many curvature integrals. It is observed that the residual magnetic torque has no component along the spin axis. The inclusion of this torque in the rotational motion differential equations of a spin-stabilized spacecraft yields conditions to derive an analytical solution. The solution shows that the residual torque does not affect the spin velocity magnitude, contributing only to the precession and the drift of the spin axis of the spacecraft. The theory developed has been applied to the Brazilian spin-stabilized satellites, which are quite appropriate for verification and comparison of the theory with the data generated and processed by the Satellite Control Center of the Brazil National Research Institute. The results show the period over which the analytical solution can be used for attitude propagation, within the dispersion range of the attitude determination system performance of the Satellite Control Center of the Brazil National Research Institute.

  6. Analytical aids in land management planning

    Treesearch

    David R. Betters

    1978-01-01

    Quantitative techniques may be applied to aid in completing various phases of land management planning. Analytical procedures which have been used include a procedure for public involvement, PUBLIC; a matrix information generator, MAGE5; an allocation procedure, linear programming (LP); and an input-output economic analysis (EA). These techniques have proven useful in...

  7. A validity generalization procedure to test relations between intrinsic and extrinsic motivation and influence tactics.

    PubMed

    Barbuto, John E; Moss, Jennifer A

    2006-08-01

    The relations of intrinsic and extrinsic motivation to the use of consultative, legitimating, and pressure influence tactics were examined using validity generalization procedures. Five to seven field studies with cumulative samples exceeding 800 were used to test each relationship. Significant relations were found between agents' intrinsic motivation and their use of consultative influence tactics, and between agents' extrinsic motivation and their use of legitimating influence tactics.

  8. Scattering From the Finite-Length, Dielectric Circular Cylinder. Part 2 - On the Validity of an Analytical Solution for Characterizing Backscattering from Tree Trunks at P-Band

    DTIC Science & Technology

    2015-09-01

    The accuracy of an analytical solution for characterizing the backscattering responses of circular cylindrical tree trunks located above a dielectric ground is assessed, with validation against a full-wave solution for untapered, linearly tapered, and nonlinearly tapered circular cylindrical trunks.

  9. Development of a new calibration procedure and its experimental validation applied to a human motion capture system.

    PubMed

    Royo Sánchez, Ana Cristina; Aguilar Martín, Juan José; Santolaria Mazo, Jorge

    2014-12-01

    Motion capture systems are often used for checking and analyzing human motion in biomechanical applications. It is important, in this context, that the systems provide the best possible accuracy. Among existing capture systems, optical systems are those with the highest accuracy. In this paper, the development of a new calibration procedure for optical human motion capture systems is presented. The performance and effectiveness of the new calibration procedure are also checked by experimental validation. The new calibration procedure consists of two stages. In the first stage, initial estimates of the intrinsic and extrinsic parameters are sought. The camera calibration method used in this stage is the one proposed by Tsai. These parameters are determined from the camera characteristics, the spatial position of the camera, and the center of the capture volume. In the second stage, a simultaneous nonlinear optimization of all parameters is performed to identify the optimal values, which minimize the objective function. The objective function, in this case, minimizes two errors. The first error is the distance error between two markers placed on a wand. The second error is the error in position and orientation of the retroreflective markers of a static calibration object. The real co-ordinates of the two objects are calibrated on a co-ordinate measuring machine (CMM). The OrthoBio system is used to validate the new calibration procedure. The resulting errors are 90% lower than those from the previous calibration software and broadly comparable with results from a similarly configured Vicon system.
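
    The second calibration stage, nonlinear least-squares refinement against the known wand inter-marker distance, can be sketched with a deliberately reduced parameter set. Below, a single global scale factor stands in for the full intrinsic/extrinsic parameter vector; the wand length, marker positions and the 2% initial scale error are all invented.

        import numpy as np
        from scipy.optimize import least_squares

        WAND_LENGTH = 500.0  # mm, known distance between the two wand markers

        def residuals(params, pts_a, pts_b):
            """Wand-distance residuals for a single global scale factor, a
            stand-in for the full intrinsic/extrinsic vector of stage two."""
            scale = params[0]
            return np.linalg.norm(scale * (pts_a - pts_b), axis=1) - WAND_LENGTH

        rng = np.random.default_rng(2)
        a = rng.uniform(0.0, 2000.0, (200, 3))          # marker A positions, mm
        d = rng.normal(size=(200, 3))
        d /= np.linalg.norm(d, axis=1, keepdims=True)
        b = a + 0.98 * WAND_LENGTH * d                  # initial calibration 2% off in scale

        fit = least_squares(residuals, x0=[1.0], args=(a, b))
        print(f"refined scale = {fit.x[0]:.4f} (true: {1 / 0.98:.4f})")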

  10. Request Pattern, Pre-Analytical and Analytical Conditions of Urinalysis in Primary Care: Lessons from a One-Year Large-Scale Multicenter Study.

    PubMed

    Salinas, Maria; Lopez-Garrigos, Maite; Flores, Emilio; Leiva-Salinas, Carlos

    2018-06-01

    To study the urinalysis request, pre-analytical sample conditions, and analytical procedures. Laboratories were asked to provide the number of primary care urinalyses requested, and to fill out a questionnaire regarding pre-analytical conditions and analytical procedures. 110 laboratories participated in the study. 232.5 urinalyses/1,000 inhabitants were reported. 75.4% used the first morning urine. The sample reached the laboratory in less than 2 hours in 18.8%, between 2 - 4 hours in 78.3%, and between 4 - 6 hours in the remaining 2.9%. 92.5% combined the use of test strip and particle analysis, and only 7.5% used the strip exclusively. All participants except one performed automated particle analysis depending on strip results; in 16.2% the procedure was only manual. Urinalysis was highly requested. There was a lack of compliance with guidelines regarding time between micturition and analysis that usually involved the combination of strip followed by particle analysis.

  11. A validated UPLC-MS/MS method for flibanserin in plasma and its pharmacokinetic interaction with bosentan in rats.

    PubMed

    Iqbal, Muzaffar; Ezzeldin, Essam; Rezk, Naser L; Bajrai, Amal A; Al-Rashood, Khalid A

    2018-04-25

    The purpose of this study was the development, validation and application of an ultra-performance liquid chromatography (UPLC)-ESI-MS/MS method for the quantitation of flibanserin in plasma samples. After extraction of the analyte from plasma with diethyl ether, separation was performed on a UPLC C18 column using a mobile phase of 10 mM ammonium formate-acetonitrile (30:70, v/v) with isocratic elution at 0.3 ml/min. The multiple reaction monitoring transitions of m/z 391.13→161.04 and 384.20→253.06 were used for detection of the analyte and the internal standard (quetiapine), respectively. The calibration curves were linear (r ≥0.995) over the 0.22-555 ng/ml concentration range, and all validation results were within the acceptable range as per US FDA guidelines. The assay procedure was fully validated and successfully applied in a pharmacokinetic interaction study of flibanserin with bosentan in rats.

  12. Analytical Plan for Roman Glasses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strachan, Denis M.; Buck, Edgar C.; Mueller, Karl T.

    Roman glasses that have been in the sea or underground for about 1800 years can serve as the independent “experiment” that is needed for validation of codes and models that are used in performance assessment. Two sets of Roman-era glasses have been obtained for this purpose. One set comes from the sunken vessel the Iulia Felix; the second from recently excavated glasses from a Roman villa in Aquileia, Italy. The specimens contain glass artifacts and attached sediment or soil. In the case of the Iulia Felix glasses quite a lot of analytical work has been completed at the University of Padova, but from an archaeological perspective. The glasses from Aquileia have not been so carefully analyzed, but they are similar to other Roman glasses. Both glass and sediment or soil need to be analyzed and are the subject of this analytical plan. The glasses need to be analyzed with the goal of validating the model used to describe glass dissolution. The sediment and soil need to be analyzed to determine the profile of elements released from the glass. This latter need represents a significant analytical challenge because of the trace quantities that need to be analyzed. Both pieces of information will yield important information useful in the validation of the glass dissolution model and the chemical transport code(s) used to determine the migration of elements once released from the glass. In this plan, we outline the analytical techniques that should be useful in obtaining the needed information and suggest a useful starting point for this analytical effort.

  13. 7 CFR 91.23 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    7 CFR § 91.23 (Method Manuals): Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...

  14. 7 CFR 91.23 - Analytical methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    7 CFR § 91.23 (Method Manuals): Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...

  15. 7 CFR 91.23 - Analytical methods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    7 CFR § 91.23 (Method Manuals): Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...

  16. URANS simulations of the tip-leakage cavitating flow with verification and validation procedures

    NASA Astrophysics Data System (ADS)

    Cheng, Huai-yu; Long, Xin-ping; Liang, Yun-zhi; Long, Yun; Ji, Bin

    2018-04-01

    In the present paper, the Vortex Identified Zwart-Gerber-Belamri (VIZGB) cavitation model coupled with the SST-CC turbulence model is used to investigate the unsteady tip-leakage cavitating flow induced by a NACA0009 hydrofoil. A qualitative comparison between the numerical and experimental results is made. To quantitatively evaluate the reliability of the numerical data, verification and validation (V&V) procedures are applied. Errors of the numerical results are estimated with seven error estimators based on the Richardson extrapolation method. It is shown that though a strict validation cannot be achieved, a reasonable prediction of the gross characteristics of the tip-leakage cavitating flow can be obtained. Based on the numerical results, the influence of the cavitation on the tip-leakage vortex (TLV) is discussed, which indicates that the cavitation accelerates the fusion of the TLV and the tip-separation vortex (TSV). Moreover, the trajectory of the TLV, when cavitation occurs, is close to the side wall.
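
    Richardson extrapolation of the kind underlying such V&V error estimators can be sketched in a few lines; the solution values and refinement ratio below are illustrative, not from the paper.

```python
# Sketch of a Richardson-extrapolation error estimate: f1, f2, f3 are
# solutions on fine, medium, and coarse grids with constant refinement ratio r.
import math

def richardson(f1, f2, f3, r):
    """Observed order p, extrapolated value, and estimated fine-grid error."""
    p = math.log(abs(f3 - f2) / abs(f2 - f1)) / math.log(r)
    f_exact = f1 + (f1 - f2) / (r**p - 1.0)  # Richardson extrapolation
    return p, f_exact, abs(f1 - f_exact)

p, f_ex, err = richardson(f1=0.971, f2=0.962, f3=0.934, r=2.0)
print(f"observed order {p:.2f}, extrapolated {f_ex:.4f}, error {err:.4f}")
```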

  17. Guideline for obtaining valid consent for gastrointestinal endoscopy procedures.

    PubMed

    Everett, Simon M; Griffiths, Helen; Nandasoma, U; Ayres, Katie; Bell, Graham; Cohen, Mike; Thomas-Gibson, Siwan; Thomson, Mike; Naylor, Kevin M T

    2016-10-01

    Much has changed since the last guideline of 2008, both in endoscopy and in the practice of obtaining informed consent, and it is vital that all endoscopists who are responsible for performing invasive and increasingly risky procedures are aware of the requirements for obtaining valid consent. This guideline is restricted to GI endoscopy but covers elective and acute or emergency procedures. Few clinical trials have been carried out in relation to informed consent, but most areas are informed by guidance from the General Medical Council (GMC) and/or are enshrined in legislation. Following an iterative voting process, a series of recommendations has been drawn up that covers the majority of situations that will be encountered by endoscopists. This is not exhaustive, and where doubt exists we have described where legal advice is likely to be required. This document relates to the law and endoscopy practice in the UK; where there is variation between the four devolved countries this is pointed out, and endoscopists must be aware of the law where they practice. The recommendations are divided into consent for patients with and without capacity, and we provide sections on the provision of information and the consent process for patients in a variety of situations. This guideline is intended for use by all practitioners who request or perform GI endoscopy, or are involved in the pathway of such patients. If followed, we hope this document will enhance the experience of patients attending for endoscopy in UK units. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  18. Validation of a method to detect cocaine and its metabolites in nails by gas chromatography-mass spectrometry.

    PubMed

    Valente-Campos, Simone; Yonamine, Mauricio; de Moraes Moreau, Regina Lucia; Silva, Ovandir Alves

    2006-06-02

    The objective of the present work was to compare previously published methods and provide validation data for the simultaneous detection of cocaine (COC), benzoylecgonine (BE) and norcocaine (NCOC) in nail. Finger- and toenail samples (5 mg) were cut into very small pieces and submitted to an initial procedure for external decontamination. Methanol (3 ml) was used to release the analytes from the matrix. A cleanup step was performed by solid-phase extraction (SPE), and the residue was derivatized with pentafluoropropionic anhydride/pentafluoropropanol (PFPA/PFP). Gas chromatography-mass spectrometry (GC-MS) was used to detect the analytes in selected ion monitoring (SIM) mode. The validation parameters were recovery, intra- and inter-assay precision, and limit of detection (LOD). The limits of detection were 3.5 ng/mg for NCOC and 3.0 ng/mg for COC and BE. Good intra-assay precision was observed for all detected substances (coefficient of variation (CV) <11%). The inter-assay precision for norcocaine and benzoylecgonine was <4%. Deuterated internal standards were used for the intra- and inter-assay precision studies. Toenail and fingernail samples from eight declared cocaine users were submitted to the validated method.

  19. Development and Validation of a Reliable and Robust Method for the Analysis of Cannabinoids and Terpenes in Cannabis.

    PubMed

    Giese, Matthew W; Lewis, Mark A; Giese, Laura; Smith, Kevin M

    2015-01-01

    The requirements for an acceptable cannabis assay have changed dramatically over the years, resulting in a large number of laboratories using a diverse array of analytical methodologies that have not been properly validated. Due to the lack of sufficiently validated methods, we conducted a single-laboratory validation study for the determination of cannabinoids and terpenes in a variety of commonly occurring cultivars. The procedure involves high-throughput homogenization to prepare the sample extract, which is then profiled for cannabinoids and terpenes by HPLC-diode array detector and GC-flame ionization detector, respectively. Spike recovery studies for terpenes in the range of 0.03-1.5% were carried out with analytical standards, while recovery studies for Δ9-tetrahydrocannabinolic acid, cannabidiolic acid, Δ9-tetrahydrocannabivarinic acid, and cannabigerolic acid and their neutral counterparts in the range of 0.3-35% were carried out using cannabis extracts. In general, accuracy at all levels was within 5%, and RSDs were less than 3%. The interday and intraday repeatabilities of the procedure were evaluated with five different cultivars of varying chemotype, again resulting in acceptable RSDs. As an example of the application of this assay, it was used to illustrate the variability seen in cannabis coming from very advanced indoor cultivation operations.

  20. Development and in-line validation of a Process Analytical Technology to facilitate the scale up of coating processes.

    PubMed

    Wirges, M; Funke, A; Serno, P; Knop, K; Kleinebudde, P

    2013-05-05

    Incorporation of an active pharmaceutical ingredient (API) into the coating layer of film-coated tablets is a method mainly used to formulate fixed-dose combinations. Uniform and precise spray-coating of an API represents a substantial challenge, which can be overcome by applying Raman spectroscopy as a process analytical tool. In the pharmaceutical industry, Raman spectroscopy is still mainly used as a bench-top laboratory analytical method and is usually not implemented in the production process. Concerning application in the production process, many scientific approaches stop at the level of feasibility studies and never make the step to production-scale process applications. The present work puts the scale-up of an active coating process into focus, a step of the highest importance during pharmaceutical development. Active coating experiments were performed at lab and production scale. Using partial least squares (PLS), a multivariate model was constructed by correlating in-line measured Raman spectral data with the coated amount of API. By transferring this model, implemented for a lab-scale process, to a production-scale process, the robustness of this analytical method, and thus its applicability as a Process Analytical Technology (PAT) tool for correct endpoint determination in pharmaceutical manufacturing, could be shown. Finally, this method was validated according to the European Medicines Agency (EMA) guideline with respect to the special requirements of the applied in-line model development strategy. Copyright © 2013 Elsevier B.V. All rights reserved.

  1. Validity of plant fiber length measurement : a review of fiber length measurement based on kenaf as a model

    Treesearch

    James S. Han; Theodore Mianowski; Yi-yu Lin

    1999-01-01

    The efficacy of fiber length measurement techniques such as digitizing, the Kajaani procedure, and NIH Image is compared in order to determine the optimal tool. Kenaf bast fibers, aspen fibers, and red pine fibers were collected from different anatomical parts, and the fiber lengths were compared using various analytical tools. A statistical analysis on the validity of the...

  2. Procedure for the Selection and Validation of a Calibration Model I-Description and Application.

    PubMed

    Desharnais, Brigitte; Camirand-Lemyre, Félix; Mireault, Pascal; Skinner, Cameron D

    2017-05-01

    Calibration model selection is required for all quantitative methods in toxicology and more broadly in bioanalysis. This typically involves selecting the equation order (quadratic or linear) and the weighting factor correctly modeling the data. A mis-selection of the calibration model will generate lower quality control (QC) accuracy, with errors of up to 154%. Unfortunately, simple tools to perform this selection and tests to validate the resulting model are lacking. We present a stepwise, analyst-independent scheme for selection and validation of calibration models. The success rate of this scheme is on average 40% higher than a traditional "fit and check the QC accuracy" method of selecting the calibration model. Moreover, the process was completely automated through a script (available in Supplemental Data 3) running in RStudio (free, open-source software). The need for weighting was assessed through an F-test using the variances of the upper limit of quantification and lower limit of quantification replicate measurements. When weighting was required, the choice between 1/x and 1/x² was determined by calculating which option generated the smallest spread of weighted normalized variances. Finally, model order was selected through a partial F-test. The chosen calibration model was validated through Cramer-von Mises or Kolmogorov-Smirnov normality testing of the standardized residuals. Performance of the different tests was assessed using 50 simulated data sets per possible calibration model (e.g., linear-no weight, quadratic-no weight, linear-1/x, etc.). This first of two papers describes the tests, procedures and outcomes of the developed scheme using real LC-MS-MS results for the quantification of cocaine and naltrexone. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
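
    The two decision rules described above can be sketched as follows, assuming replicate calibrator measurements at the LLOQ and ULOQ; this is a simplified reading of the scheme, not the authors' RStudio script.

```python
# Sketch: F-test on LLOQ vs ULOQ replicate variances to decide on weighting,
# and a partial F-test to decide between linear and quadratic models.
import numpy as np
from scipy import stats

def needs_weighting(lloq_reps, uloq_reps, alpha=0.05):
    """Two-sided F-test comparing replicate variances at the range extremes.

    Assumes the ULOQ variance is the larger, as is typical for
    heteroscedastic bioanalytical data."""
    F = np.var(uloq_reps, ddof=1) / np.var(lloq_reps, ddof=1)
    p = 2.0 * (1.0 - stats.f.cdf(F, len(uloq_reps) - 1, len(lloq_reps) - 1))
    return p < alpha

def quadratic_warranted(x, y, alpha=0.05):
    """Partial F-test: does adding an x^2 term significantly reduce RSS?"""
    X1 = np.vstack([np.ones_like(x), x]).T          # linear design matrix
    X2 = np.vstack([np.ones_like(x), x, x**2]).T    # quadratic design matrix
    rss = lambda X: np.sum((y - X @ np.linalg.lstsq(X, y, rcond=None)[0])**2)
    F = (rss(X1) - rss(X2)) / (rss(X2) / (len(x) - 3))
    return (1.0 - stats.f.cdf(F, 1, len(x) - 3)) < alpha
```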

  3. Validation of an isotope dilution, ICP-MS method based on internal mass bias correction for the determination of trace concentrations of Hg in sediment cores.

    PubMed

    Ciceri, E; Recchia, S; Dossi, C; Yang, L; Sturgeon, R E

    2008-01-15

    The development and validation of a method for the determination of mercury in sediments using a sector field inductively coupled plasma mass spectrometer (SF-ICP-MS) for detection is described. The use of isotope dilution (ID) calibration is shown to solve analytical problems related to matrix composition. Mass bias is corrected using an internal mass bias correction technique, validated against the traditional standard bracketing method. The overall analytical protocol is validated against the NRCC PACS-2 marine sediment CRM. The estimated limit of detection is 12 ng/g. The proposed procedure was applied to the analysis of a real sediment core sampled at a depth of 160 m in Lake Como, where Hg concentrations ranged from 66 to 750 ng/g.

  4. Validating Semi-analytic Models of High-redshift Galaxy Formation Using Radiation Hydrodynamical Simulations

    NASA Astrophysics Data System (ADS)

    Côté, Benoit; Silvia, Devin W.; O’Shea, Brian W.; Smith, Britton; Wise, John H.

    2018-05-01

    We use a cosmological hydrodynamic simulation calculated with Enzo and the semi-analytic galaxy formation model (SAM) GAMMA to address the chemical evolution of dwarf galaxies in the early universe. The long-term goal of the project is to better understand the origin of metal-poor stars and the formation of dwarf galaxies and the Milky Way halo by cross-validating these theoretical approaches. We combine GAMMA with the merger tree of the most massive galaxy found in the hydrodynamic simulation and compare the star formation rate, the metallicity distribution function (MDF), and the age–metallicity relationship predicted by the two approaches. We found that the SAM can reproduce the global trends of the hydrodynamic simulation. However, there are degeneracies between the model parameters, and more constraints (e.g., star formation efficiency, gas flows) need to be extracted from the simulation to isolate the correct semi-analytic solution. Stochastic processes such as bursty star formation histories and star formation triggered by supernova explosions cannot be reproduced by the current version of GAMMA. Non-uniform mixing in the galaxy’s interstellar medium, coming primarily from self-enrichment by local supernovae, causes a broadening in the MDF that can be emulated in the SAM by convolving its predicted MDF with a Gaussian function having a standard deviation of ∼0.2 dex. We found that the most massive galaxy in the simulation retains nearly 100% of its baryonic mass within its virial radius, in agreement with what is needed in GAMMA to reproduce the global trends of the simulation.
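
    The Gaussian broadening described above amounts to a one-dimensional convolution of the predicted MDF; a toy sketch follows (the MDF below is synthetic, not GAMMA output).

```python
# Sketch: convolve a semi-analytic MDF with a Gaussian of sigma ~ 0.2 dex
# to emulate non-uniform mixing in the interstellar medium.
import numpy as np
from scipy.ndimage import gaussian_filter1d

bin_width = 0.05                                   # dex per [Fe/H] bin
feh = np.arange(-4.0, 0.0, bin_width)
mdf = np.exp(-0.5 * ((feh + 1.5) / 0.4) ** 2)      # toy predicted MDF

mdf_smooth = gaussian_filter1d(mdf, sigma=0.2 / bin_width)
mdf_smooth /= mdf_smooth.sum() * bin_width         # renormalize to unit area
```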

  5. Development and Analytical Validation of an Immunoassay for Quantifying Serum Anti-Pertussis Toxin Antibodies Resulting from Bordetella pertussis Infection

    PubMed Central

    Menzies, Sandra L.; Kadwad, Vijay; Pawloski, Lucia C.; Lin, Tsai-Lien; Baughman, Andrew L.; Martin, Monte; Tondella, Maria Lucia C.; Meade, Bruce D.

    2009-01-01

    Adequately sensitive and specific methods to diagnose pertussis in adolescents and adults are not widely available. Currently, no Food and Drug Administration-approved diagnostic assays are available for the serodiagnosis of Bordetella pertussis. Since concentrations of B. pertussis-specific antibodies tend to be high during the later phases of disease, a simple, rapid, easily transferable serodiagnostic test was developed. This article describes test development, initial evaluation of a prototype kit enzyme-linked immunosorbent assay (ELISA) in an interlaboratory collaborative study, and analytical validation. The data presented here demonstrate that the kit met all prespecified criteria for precision, linearity, and accuracy for samples with anti-pertussis toxin (PT) immunoglobulin G (IgG) antibody concentrations in the range of 50 to 150 ELISA units (EU)/ml, the range believed to be most relevant for serodiagnosis. The assay met the precision and linearity criteria for a wider range, namely, from 50 to 200 EU/ml; however, the accuracy criterion was not met at 200 EU/ml. When the newly adopted World Health Organization International Standard for pertussis antiserum (human) reference reagent was used to evaluate accuracy, the accuracy criteria were met from 50 to 200 international units/ml. In conclusion, the IgG anti-PT ELISA met all assay validation parameters within the range considered most relevant for serodiagnosis. This ELISA was developed and analytically validated as a user-friendly kit that can be used in both qualitative and quantitative formats. The technology for producing the kit is transferable to public health laboratories. PMID:19864485

  6. Design for validation: An approach to systems validation

    NASA Technical Reports Server (NTRS)

    Carter, William C.; Dunham, Janet R.; Laprie, Jean-Claude; Williams, Thomas; Howden, William; Smith, Brian; Lewis, Carl M. (Editor)

    1989-01-01

    Every complex system built is validated in some manner. Computer validation begins with review of the system design. As systems became too complicated for one person to review, validation began to rely on the application of ad hoc methods by many individuals. As the cost of changes mounted and the expense of failure increased, more organized procedures became essential. Attempts at devising and carrying out those procedures showed that validation is indeed a difficult technical problem. The successful transformation of the validation process into a systematic series of formally sound, integrated steps is necessary if the liability inherent in future digital-system-based avionic and space systems is to be minimized. A suggested framework and timetable for the transformation are presented. Basic working definitions of two pivotal ideas (validation and the system life cycle) are provided, showing how the two concepts interact. Many examples are given of past and present validation activities by NASA and others. A conceptual framework is presented for the validation process. Finally, important areas are listed for ongoing development of the validation process at NASA Langley Research Center.

  7. Development and validation of a liquid chromatography-tandem mass spectrometry analytical method for the therapeutic drug monitoring of eight novel anticancer drugs.

    PubMed

    Herbrink, M; de Vries, N; Rosing, H; Huitema, A D R; Nuijen, B; Schellens, J H M; Beijnen, J H

    2018-04-01

    To support therapeutic drug monitoring of patients with cancer, a fast and accurate method for simultaneous quantification of the registered anticancer drugs afatinib, axitinib, ceritinib, crizotinib, dabrafenib, enzalutamide, regorafenib and trametinib in human plasma using liquid chromatography-tandem mass spectrometry was developed and validated. Human plasma samples were collected from treated patients and stored at -20°C. Analytes and internal standards (stable isotopically labeled analytes) were extracted with acetonitrile. An equal amount of 10 mM NH4HCO3 was added to the supernatant to yield the final extract. A 2 μL aliquot of this extract was injected onto a C18 column, gradient elution was applied, and triple-quadrupole mass spectrometry in positive-ion mode was used for detection. All results were within the acceptance criteria of the latest US Food and Drug Administration guidance and European Medicines Agency guidelines on method validation, except for the carry-over of ceritinib and crizotinib, which was corrected for by the injection order of samples. Additional stability tests were carried out for axitinib and dabrafenib in relation to their reported photostability. In conclusion, the described method to simultaneously quantify the eight selected anticancer drugs in human plasma was successfully validated and applied for therapeutic drug monitoring in cancer patients treated with these drugs. Copyright © 2017 John Wiley & Sons, Ltd.

  8. Analytical procedure for characterization of medieval wall-paintings by X-ray fluorescence spectrometry, laser ablation inductively coupled plasma mass spectrometry and Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Syta, Olga; Rozum, Karol; Choińska, Marta; Zielińska, Dobrochna; Żukowska, Grażyna Zofia; Kijowska, Agnieszka; Wagner, Barbara

    2014-11-01

    An analytical procedure for the comprehensive chemical characterization of samples from medieval Nubian wall-paintings by means of portable X-ray fluorescence (pXRF), laser ablation inductively coupled plasma mass spectrometry (LA-ICPMS) and Raman spectroscopy (RS) is proposed in this work. The procedure was used for elemental and molecular investigations of samples from archeological excavations in Nubia (modern southern Egypt and northern Sudan). Numerous remains of churches with painted decorations dating back to the 7th-14th centuries have been excavated in the region of the medieval kingdoms of Nubia, but many aspects of this art and its technology are still unknown. Samples from the selected archeological sites (Faras, Old Dongola and Banganarti) were analyzed in the form of transfers (n = 26), small fragments collected during the excavations (n = 35) and cross sections (n = 15). XRF was used to collect data on elemental composition, LA-ICPMS allowed mapping of selected elements, and RS was used to obtain molecular information about the samples. The preliminary results indicated the usefulness of the proposed analytical procedure for distinguishing substances from both the surface and sub-surface domains of the wall-paintings. The possibility of identifying raw materials from the wall-paintings will be exploited in further systematic archeometric studies devoted to a detailed comparison of the various historic Nubian centers.

  9. Are we really measuring what we say we're measuring? Using video techniques to supplement traditional construct validation procedures.

    PubMed

    Podsakoff, Nathan P; Podsakoff, Philip M; Mackenzie, Scott B; Klinger, Ryan L

    2013-01-01

    Several researchers have persuasively argued that the most important evidence to consider when assessing construct validity is whether variations in the construct of interest cause corresponding variations in the measures of the focal construct. Unfortunately, the literature provides little practical guidance on how researchers can go about testing this. Therefore, the purpose of this article is to describe how researchers can use video techniques to test whether their scales measure what they purport to measure. First, we discuss how researchers can develop valid manipulations of the focal construct that they hope to measure. Next, we explain how to design a study to use this manipulation to test the validity of the scale. Finally, comparing and contrasting traditional and contemporary perspectives on validation, we discuss the advantages and limitations of video-based validation procedures. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  10. Pre-analytical and analytical variation of drug determination in segmented hair using ultra-performance liquid chromatography-tandem mass spectrometry.

    PubMed

    Nielsen, Marie Katrine Klose; Johansen, Sys Stybe; Linnet, Kristian

    2014-01-01

    Assessment of the total uncertainty of analytical methods for the measurement of drugs in human hair has mainly been derived from the analytical variation. However, in hair analysis several other sources of uncertainty contribute to the total uncertainty. Particularly in segmental hair analysis, pre-analytical variations associated with sampling and segmentation may be significant factors in the assessment of the total uncertainty budget. The aim of this study was to develop and validate a method for the analysis of 31 common drugs in hair using ultra-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS), with focus on the assessment of both the analytical and pre-analytical sampling variations. The validated method was specific, accurate (80-120%), and precise (CV≤20%) across a wide linear concentration range from 0.025 to 25 ng/mg for most compounds. The analytical variation was estimated to be less than 15% for almost all compounds. The method was successfully applied to 25 segmented hair specimens from deceased drug addicts showing a broad pattern of poly-drug use. The pre-analytical sampling variation was estimated from genuine duplicate measurements of two bundles of hair collected from each subject, after subtraction of the analytical component. For the most frequently detected analytes, the pre-analytical variation was estimated at 26-69%. Thus, the pre-analytical variation was 3-7-fold larger than the analytical variation (7-13%) and hence the dominant component in the total variation (29-70%). The present study demonstrates the importance of including the pre-analytical variation in the assessment of the total uncertainty budget and in the setting of the 95%-uncertainty interval (±2CVT). Excluding the pre-analytical sampling variation could significantly affect the interpretation of results from segmental hair analysis. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
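
    Under the usual assumption that independent variance components add (CV_T² = CV_pre² + CV_an²), the pre-analytical component can be backed out from duplicate bundles as sketched below; the concentrations are invented for illustration.

```python
# Sketch: estimate pre-analytical variation from genuine duplicate hair
# bundles by subtracting the analytical component in quadrature.
import numpy as np

a = np.array([0.81, 1.95, 0.33, 4.10])   # bundle 1 per subject (ng/mg)
b = np.array([1.20, 1.40, 0.52, 2.90])   # bundle 2 per subject (ng/mg)

# CV from duplicates: sqrt(mean((difference/mean)^2) / 2)
d, m = a - b, (a + b) / 2.0
cv_total = np.sqrt(np.mean((d / m) ** 2) / 2.0)

cv_analytical = 0.10                      # e.g. from method validation
cv_pre = np.sqrt(max(cv_total**2 - cv_analytical**2, 0.0))
print(f"CV_T={cv_total:.0%}, CV_pre={cv_pre:.0%}, interval=±{2*cv_total:.0%}")
```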

  11. Analytical procedure for the determination of Ethyl Lauroyl Arginate (LAE) to assess the kinetics and specific migration from a new antimicrobial active food packaging.

    PubMed

    Pezo, Davinson; Navascués, Beatriz; Salafranca, Jesús; Nerín, Cristina

    2012-10-01

    Ethyl Lauroyl Arginate (LAE) is a cationic tensioactive compound, soluble in water, with a wide activity spectrum against moulds and bacteria. LAE has been incorporated as an antimicrobial agent into packaging materials for food contact, and these materials are required to comply with the specific migration criteria. In this paper, an analytical procedure has been developed and optimized for the analysis of LAE in food simulants after migration tests. It consists of the formation of an ion pair between LAE and the inorganic complex [Co(SCN)4]2- in aqueous solution, followed by liquid-liquid extraction into a suitable organic solvent and UV-Vis absorbance measurement. In order to evaluate possible interferences, the ion pair was also analyzed by high performance liquid chromatography with UV-Vis detection. Both procedures provided similar analytical characteristics, with linear ranges from 1.10 to 25.00 mg kg(-1), linearity higher than 0.9886, limits of detection and quantification of 0.33 and 1.10 mg kg(-1), respectively, accuracy better than 1% as relative error, and precision better than 3.6% expressed as RSD. Optimization of the analytical techniques, the thermal and chemical stability of LAE, and the migration kinetics of LAE from experimental active packaging are reported and discussed. Copyright © 2012 Elsevier B.V. All rights reserved.
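
    Limits of detection and quantification of the kind quoted above are commonly estimated from calibration data in the ICH fashion (LOD = 3.3·s/S, LOQ = 10·s/S); the sketch below uses toy data and is not necessarily how the authors computed their values.

```python
# Sketch: ICH-style LOD/LOQ from the residual SD (s) and slope (S) of a
# calibration line fitted to toy absorbance data.
import numpy as np

x = np.array([1.1, 5.0, 10.0, 15.0, 25.0])         # concentration (mg/kg)
y = np.array([0.021, 0.098, 0.200, 0.290, 0.500])  # absorbance

S, b0 = np.polyfit(x, y, 1)                        # slope and intercept
s = np.sqrt(np.sum((y - (S * x + b0))**2) / (len(x) - 2))  # residual SD

print(f"LOD = {3.3 * s / S:.2f} mg/kg, LOQ = {10 * s / S:.2f} mg/kg")
```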

  12. LC-MS/MS analytical procedure to quantify tris(nonylphenyl)phosphite, as a source of the endocrine disruptors 4-nonylphenols, in food packaging materials.

    PubMed

    Mottier, Pascal; Frank, Nancy; Dubois, Mathieu; Tarres, Adrienne; Bessaire, Thomas; Romero, Roman; Delatour, Thierry

    2014-01-01

    Tris(nonylphenyl)phosphite (TNPP), an antioxidant used in polyethylene resins for food applications, is problematic since it is a source of the endocrine-disrupting chemicals 4-nonylphenols (4NP) upon migration into packaged foods. As a response to concerns surrounding the presence of 4NP-based compounds in packaging materials, some resin producers and additive suppliers have decided to eliminate TNPP from formulations. This paper describes an analytical procedure to verify the "TNPP-free" statement in multilayer laminates used for bag-in-box packaging. The method involves extraction of TNPP from laminates with organic solvents followed by detection/quantification by LC-MS/MS using the atmospheric pressure chemical ionisation (APCI) mode. A further acidic treatment of the latter extract allows the release of 4NP from any extracted TNPP. 4NP is then analysed by LC-MS/MS using electrospray ionisation (ESI) mode. This two-step analytical procedure ensures not only TNPP quantification in laminates, but also allows the flagging of other possible sources of 4NP in such packaging materials, typically non-intentionally added substances (NIAS). The limits of quantification were 0.50 and 0.48 µg dm⁻² for TNPP and 4NP in laminates, respectively, with recoveries ranging between 87% and 114%. Use of such analytical methodologies in quality control operations has pointed to a lack of traceability at the packaging supplier level and cross-contamination of extrusion equipment at the converter level when TNPP-containing laminates have been processed on the same machine beforehand.

  13. Recent trends in analytical procedures in forensic toxicology.

    PubMed

    Van Bocxlaer, Jan F

    2005-12-01

    Forensic toxicology is a very demanding discipline, heavily dependent on good analytical techniques, which is why new trends appear continuously. In the past years, LC-MS has revolutionized target compound analysis and has become the trend, also in toxicology. In LC-MS screening analysis, things are less straightforward and several approaches exist. One promising approach, based on accurate LC-MS-TOF mass measurements and elemental-formula-based library searches, is discussed. This way of screening has already proven its applicability, but at the same time it became obvious that a single accurate mass measurement lacks some specificity when using large compound libraries. CE, too, is a re-emerging approach: the increasingly polar and ionic molecules encountered make it a worthwhile addition to, e.g., LC, as illustrated for the analysis of GHB. A third recent trend is the use of MALDI mass spectrometry for small molecules. It is promising for its ease of use and high throughput. Unfortunately, reports of disappointment and of accomplishment, e.g., the quantitative analysis of LSD as discussed here, alternate, and it remains to be seen whether MALDI will really establish itself. Indeed, not all new trends will prove themselves, but the mere fact that many appear in the world of analytical toxicology nowadays is, in itself, encouraging for the future of (forensic) toxicology.
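
    At its core, the accurate-mass library search mentioned above reduces to a tolerance match in ppm; a toy sketch with a three-entry library follows (the masses are standard monoisotopic [M+H]+ values, but the library and tolerance are illustrative).

```python
# Sketch: match a measured accurate mass against a compound library
# within a ppm tolerance, as in LC-MS-TOF screening.
LIBRARY = {                  # monoisotopic [M+H]+ masses (Da)
    "cocaine": 304.1543,
    "morphine": 286.1438,
    "GHB": 105.0546,
}

def search(measured_mz, tol_ppm=5.0):
    """Return (name, mass error in ppm) for every hit within tolerance."""
    return [
        (name, 1e6 * (measured_mz - m) / m)
        for name, m in LIBRARY.items()
        if abs(measured_mz - m) / m * 1e6 <= tol_ppm
    ]

print(search(304.1550))      # -> [('cocaine', 2.3...)]
```

    With a three-entry library this match is unambiguous; the specificity problem discussed in the abstract arises because a large library contains many candidates within the same ppm window of a single measured mass.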

  14. Extension and validation of an analytical model for in vivo PET verification of proton therapy—a phantom and clinical study

    NASA Astrophysics Data System (ADS)

    Attanasi, F.; Knopf, A.; Parodi, K.; Paganetti, H.; Bortfeld, T.; Rosso, V.; Del Guerra, A.

    2011-08-01

    The interest in positron emission tomography (PET) as a tool for treatment verification in proton therapy has become widespread in recent years, and several research groups worldwide are currently investigating the clinical implementation. After the first off-line investigation with a PET/CT scanner at MGH (Boston, USA), attention is now focused on an in-room PET application immediately after treatment in order to also detect shorter-lived isotopes, such as ¹⁵O and ¹³N, minimizing isotope washout and avoiding patient repositioning errors. Clinical trials are being conducted by means of commercially available PET systems, and other tests are planned using application-dedicated tomographs. Parallel to the experimental investigation and new hardware development, great interest has been shown in the development of fast procedures to provide feedback on the delivered dose from reconstructed PET images. Since the thresholds of the inelastic nuclear reactions leading to tissue β+-activation fall within the energy range of 15-20 MeV, the distal activity fall-off is correlated with, but not directly matched to, the distal fall-off of the dose distribution. Moreover, the physical interactions leading to β+-activation and energy deposition are of a different nature. All these facts make it essential to further develop accurate and fast methodologies capable of predicting, on the basis of the planned dose distribution, the expected PET images to be compared with actual PET measurements, thus providing clinical feedback on the correctness of the dose delivery and of the irradiation field position. The aim of this study has been to validate an analytical model and to implement and evaluate it in a fast and flexible framework able to locally predict such activity distributions, taking the reference planning CT and planned dose directly as inputs. The results achieved in this study for phantoms and clinical cases highlighted the potential of the implemented method to predict expected

  15. Analytical validation of quantitative immunohistochemical assays of tumor infiltrating lymphocyte biomarkers.

    PubMed

    Singh, U; Cui, Y; Dimaano, N; Mehta, S; Pruitt, S K; Yearley, J; Laterza, O F; Juco, J W; Dogdas, B

    2018-06-04

    FOXP3 IHC assays met the fit-for-purpose analytical acceptance validation criteria and can be used to support clinical studies.

  16. Validation of standard operating procedures in a multicenter retrospective study to identify -omics biomarkers for chronic low back pain.

    PubMed

    Dagostino, Concetta; De Gregori, Manuela; Gieger, Christian; Manz, Judith; Gudelj, Ivan; Lauc, Gordan; Divizia, Laura; Wang, Wei; Sim, Moira; Pemberton, Iain K; MacDougall, Jane; Williams, Frances; Van Zundert, Jan; Primorac, Dragan; Aulchenko, Yurii; Kapural, Leonardo; Allegri, Massimo

    2017-01-01

    Chronic low back pain (CLBP) is one of the most common medical conditions, ranking as the greatest contributor to global disability and accounting for huge societal costs based on the Global Burden of Disease 2010 study. Large genetic and -omics studies provide a promising avenue for the screening, development and validation of biomarkers useful for personalized diagnosis and treatment (precision medicine). Multicentre studies are needed for such an effort, and a standardized and homogeneous approach is vital for recruitment of large numbers of participants among different centres (clinical and laboratories) to obtain robust and reproducible results. To date, no validated standard operating procedures (SOPs) for genetic/-omics studies in chronic pain have been developed. In this study, we validated an SOP model that will be used in the multicentre (5 centres) retrospective "PainOmics" study, funded by the European Community in the 7th Framework Programme, which aims to develop new biomarkers for CLBP through three different -omics approaches: genomics, glycomics and activomics. The SOPs describe the specific procedures for (1) blood collection, (2) sample processing and storage, (3) shipping details and (4) cross-check testing and validation before assays that all the centres involved in the study have to follow. Multivariate analysis revealed the absolute specificity and homogeneity of the samples collected by the five centres for all genetics, glycomics and activomics analyses. The SOPs used in our multicenter study have been validated. Hence, they could represent an innovative tool for the correct management and collection of reliable samples in other large-omics-based multicenter studies.

  17. Multielement trace determination in SiC powders: assessment of interlaboratory comparisons aimed at the validation and standardization of analytical procedures with direct solid sampling based on ETV ICP OES and DC arc OES.

    PubMed

    Matschat, Ralf; Hassler, Jürgen; Traub, Heike; Dette, Angelika

    2005-12-01

    Results from the interlaboratory comparisons were used to determine the repeatability and the reproducibility (expressed as standard deviations) of both methods. The calculation was carried out according to the related standard. The results are specified and discussed in this paper, as are the optimized analytical conditions determined and used by the authors of this paper. For both methods, the repeatability relative standard deviations were <25%, usually ~10%, and the reproducibility relative standard deviations were <35%, usually ~15%. These results were regarded as satisfactory for both methods, which are intended for rapid analysis of materials for which decomposition is difficult and time-consuming. Also described are some results from an interlaboratory comparison used to certify one of the materials previously used for validation in both interlaboratory comparisons. Thirty laboratories (from eight countries) participated in this interlaboratory comparison for certification. As examples, accepted results are shown from laboratories that used ETV ICP OES or DC arc OES and had performed calibrations using solutions or oxides, respectively. The certified mass fractions of the certified reference materials were also compared with the mass fractions determined in the interlaboratory comparisons performed within the framework of method standardization. Good agreement was found for most of the analytes.

  18. Two important limitations relating to the spiking of environmental samples with contaminants of emerging concern: How close to the real analyte concentrations are the reported recovered values?

    PubMed

    Michael, Costas; Bayona, Josep Maria; Lambropoulou, Dimitra; Agüera, Ana; Fatta-Kassinos, Despo

    2017-06-01

    Occurrence and effects of contaminants of emerging concern pose a special challenge to environmental scientists. The investigation of these effects requires reliable, valid, and comparable analytical data. To this effect, two critical aspects concerning the limitations of the produced analytical data are raised herein. The first relates to the inherent difficulty of analyzing environmental samples when, in many cases, the form(s) in which the contaminant is present in the sample is unknown. Thus, the produced analytical data can only refer to the amount of the free contaminant, ignoring the amount present in other forms, e.g., chelated or conjugated. The other important aspect refers to the way the spiking procedure is generally performed to determine the recovery of the analytical method. Spiking environmental samples, in particular solid samples, with standard solution followed by immediate extraction, as is the common practice, can lead to an overestimation of the recovery. This is because no time is given to the system to establish possible equilibria between the solid matter (inorganic and/or organic) and the contaminant. Therefore, the spiking procedure needs to be reconsidered by including a study of the extractable amount of the contaminant versus the time elapsed between spiking and extraction of the sample. Such a study can become an element of the validation package of the method.

  19. Analytical Validation of the ReEBOV Antigen Rapid Test for Point-of-Care Diagnosis of Ebola Virus Infection

    PubMed Central

    Cross, Robert W.; Boisen, Matthew L.; Millett, Molly M.; Nelson, Diana S.; Oottamasathien, Darin; Hartnett, Jessica N.; Jones, Abigal B.; Goba, Augustine; Momoh, Mambu; Fullah, Mohamed; Bornholdt, Zachary A.; Fusco, Marnie L.; Abelson, Dafna M.; Oda, Shunichiro; Brown, Bethany L.; Pham, Ha; Rowland, Megan M.; Agans, Krystle N.; Geisbert, Joan B.; Heinrich, Megan L.; Kulakosky, Peter C.; Shaffer, Jeffrey G.; Schieffelin, John S.; Kargbo, Brima; Gbetuwa, Momoh; Gevao, Sahr M.; Wilson, Russell B.; Saphire, Erica Ollmann; Pitts, Kelly R.; Khan, Sheik Humarr; Grant, Donald S.; Geisbert, Thomas W.; Branco, Luis M.; Garry, Robert F.

    2016-01-01

    Background. Ebola virus disease (EVD) is a severe viral illness caused by Ebola virus (EBOV). The 2013–2016 EVD outbreak in West Africa is the largest recorded, with >11 000 deaths. Development of the ReEBOV Antigen Rapid Test (ReEBOV RDT) was expedited to provide a point-of-care test for suspected EVD cases. Methods. Recombinant EBOV viral protein 40 antigen was used to derive polyclonal antibodies for RDT and enzyme-linked immunosorbent assay development. ReEBOV RDT limits of detection (LOD), specificity, and interference were analytically validated on the basis of Food and Drug Administration (FDA) guidance. Results. The ReEBOV RDT specificity estimate was 95% for donor serum panels and 97% for donor whole-blood specimens. The RDT demonstrated sensitivity to 3 species of Ebolavirus (Zaire ebolavirus, Sudan ebolavirus, and Bundibugyo ebolavirus) associated with human disease, with no cross-reactivity by pathogens associated with non-EBOV febrile illness, including malaria parasites. Interference testing exhibited no reactivity by medications in common use. The LOD for antigen was 4.7 ng/test in serum and 9.4 ng/test in whole blood. Quantitative reverse transcription–polymerase chain reaction testing of nonhuman primate samples determined the range to be equivalent to 3.0 × 10⁵–9.0 × 10⁸ genomes/mL. Conclusions. The analytical validation presented here contributed to the ReEBOV RDT being the first antigen-based assay to receive FDA and World Health Organization emergency use authorization for this EVD outbreak, in February 2015. PMID:27587634

  20. Validation of Procedures for Monitoring Crewmember Immune Function

    NASA Technical Reports Server (NTRS)

    Crucian, Brian; Stowe, Raymond; Mehta, Satish; Uchakin, Peter; Quiriarte, Heather; Pierson, Duane; Sams, Clarence

    2008-01-01

    There is ample evidence to suggest that space flight leads to immune system dysregulation. This may be a result of microgravity, confinement, physiological stress, radiation, environment or other mission-associated factors. The clinical risk (if any) from prolonged immune dysregulation during exploration-class space flight has not yet been determined, but may include increased incidence of infection, allergy, hypersensitivity, hematological malignancy or altered wound healing. Each of the clinical events resulting from immune dysfunction has the potential to impact mission critical objectives during exploration-class missions. To date, precious little in-flight immune data has been generated to assess this phenomenon. The majority of recent flight immune studies have been post-flight assessments, which may not accurately reflect the in-flight status of immunity as it resolves over prolonged flight. There are no procedures currently in place to monitor immune function or its effect on crew health. The objective of this Supplemental Medical Objective (SMO) is to develop and validate an immune monitoring strategy consistent with operational flight requirements and constraints. This SMO will assess immunity, latent viral reactivation and physiological stress during both short and long duration flights. Upon completion, it is expected that any clinical risks resulting from the adverse effects of space flight on the human immune system will have been determined. In addition, a flight-compatible immune monitoring strategy will have been developed with which countermeasures validation could be performed. This study will determine, to the best level allowed by current technology, the in-flight status of crewmembers' immune systems. The in-flight samples will allow a distinction between legitimate in-flight alterations and the physiological stresses of landing and readaptation which are believed to alter R+0 assessments. The overall status of the immune system during flight

  1. Reverberation Chamber Uniformity Validation and Radiated Susceptibility Test Procedures for the NASA High Intensity Radiated Fields Laboratory

    NASA Technical Reports Server (NTRS)

    Koppen, Sandra V.; Nguyen, Truong X.; Mielnik, John J.

    2010-01-01

    The NASA Langley Research Center's High Intensity Radiated Fields Laboratory has developed a capability based on the RTCA/DO-160F Section 20 guidelines for radiated electromagnetic susceptibility testing in reverberation chambers. Phase 1 of the test procedure utilizes mode-tuned stirrer techniques and E-field probe measurements to validate chamber uniformity, determines chamber loading effects, and defines a radiated susceptibility test process. The test procedure is segmented into numbered operations that are largely software controlled. This document is intended as a laboratory test reference and includes diagrams of test setups, equipment lists, as well as test results and analysis. Phase 2 of development is discussed.

  2. Trends in Analytical Scale Separations.

    ERIC Educational Resources Information Center

    Jorgenson, James W.

    1984-01-01

    Discusses recent developments in the instrumentation and practice of analytical scale operations. Emphasizes detection devices and procedures in gas chromatography, liquid chromatography, electrophoresis, supercritical fluid chromatography, and field-flow fractionation. (JN)

  3. A closer look at diagnosis in clinical dental practice: part 1. Reliability, validity, specificity and sensitivity of diagnostic procedures.

    PubMed

    Pretty, Iain A; Maupomé, Gerardo

    2004-04-01

    Dentists are involved in diagnosing disease in every aspect of their clinical practice. A range of tests, systems, guides and equipment--which can be generally referred to as diagnostic procedures--are available to aid in diagnostic decision making. In this era of evidence-based dentistry, and given the increasing demand for diagnostic accuracy and properly targeted health care, it is important to assess the value of such diagnostic procedures. Doing so allows dentists to weight appropriately the information these procedures supply, to purchase new equipment if it proves more reliable than existing equipment or even to discard a commonly used procedure if it is shown to be unreliable. This article, the first in a 6-part series, defines several concepts used to express the usefulness of diagnostic procedures, including reliability and validity, and describes some of their operating characteristics (statistical measures of performance), in particular, specificity and sensitivity. Subsequent articles in the series will discuss the value of diagnostic procedures used in daily dental practice and will compare today's most innovative procedures with established methods.

  4. The MCNP6 Analytic Criticality Benchmark Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.

    2016-06-16

    Analytical benchmarks provide an invaluable tool for verifying computer codes used to simulate neutron transport. Several collections of analytical benchmark problems [1-4] are used routinely in the verification of production Monte Carlo codes such as MCNP® [5,6]. Verification of a computer code is a necessary prerequisite to the more complex validation process. The verification process confirms that a code performs its intended functions correctly. The validation process involves determining the absolute accuracy of code results vs. nature. In typical validations, results are computed for a set of benchmark experiments using a particular methodology (code, cross-section data with uncertainties, and modeling) and compared to the measured results from the set of benchmark experiments. The validation process determines bias, bias uncertainty, and possibly additional margins. Verification is generally performed by the code developers, while validation is generally performed by code users for a particular application space. The VERIFICATION_KEFF suite of criticality problems [1,2] was originally a set of 75 criticality problems found in the literature for which exact analytical solutions are available. Even though the spatial and energy detail is necessarily limited in analytical benchmarks, typically to a few regions or energy groups, the exact solutions obtained can be used to verify that the basic algorithms, mathematics, and methods used in complex production codes perform correctly. The present work has focused on revisiting this benchmark suite. A thorough review of the problems resulted in discarding some of them as not suitable for MCNP benchmarking. For the remaining problems, many of them were reformulated to permit execution in either multigroup mode or in the normal continuous-energy mode for MCNP. Execution of the benchmarks in continuous-energy mode provides a significant advance to MCNP verification methods.

  5. Portable microwave assisted extraction: An original concept for green analytical chemistry.

    PubMed

    Perino, Sandrine; Petitcolas, Emmanuel; de la Guardia, Miguel; Chemat, Farid

    2013-11-08

    This paper describes a portable microwave-assisted extraction (PMAE) apparatus for the extraction of bioactive compounds, especially essential oils and aromas, directly in a crop or forest. The developed procedure, based on the concept of green analytical chemistry, is appropriate for obtaining direct in-field information about the level of essential oils in natural samples and for illustrating green chemistry teaching and research. The efficiency of this experiment was validated for the extraction of essential oil of rosemary directly in a crop; it yields quantitative information on the essential oil content similar to that obtained by conventional methods in the laboratory. Copyright © 2013 Elsevier B.V. All rights reserved.

  6. Hasse diagram as a green analytical metrics tool: ranking of methods for benzo[a]pyrene determination in sediments.

    PubMed

    Bigus, Paulina; Tsakovski, Stefan; Simeonov, Vasil; Namieśnik, Jacek; Tobiszewski, Marek

    2016-05-01

    This study presents an application of the Hasse diagram technique (HDT) as an assessment tool to select the most appropriate analytical procedures according to their greenness or the best analytical performance. The dataset consists of analytical procedures for benzo[a]pyrene determination in sediment samples, described by 11 variables concerning their greenness and analytical performance. Two HDT analyses were performed: the first with metrological variables and the second with "green" variables as input data. The two analyses ranked different analytical procedures as the most valuable, suggesting that green analytical chemistry is not in accordance with metrology when benzo[a]pyrene in sediment samples is determined. The HDT can be a good decision-support tool for choosing the proper analytical procedure with respect to green analytical chemistry principles and analytical performance merits.
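
    At its core, the HDT orders procedures by simple dominance; a minimal sketch follows (criteria are oriented so that lower is better, and the scores are invented, not the study's).

```python
# Sketch: Hasse-diagram dominance -- p dominates q if p is at least as good
# on every criterion and strictly better on at least one (lower = better).
procedures = {               # e.g. (solvent use, LOD, energy)
    "method_A": (5.0, 0.10, 2.0),
    "method_B": (3.0, 0.10, 1.0),
    "method_C": (4.0, 0.05, 3.0),
}

def dominates(p, q):
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

# non-dominated procedures sit at the top of the Hasse diagram
top = [n for n, p in procedures.items()
       if not any(dominates(q, p) for m, q in procedures.items() if m != n)]
print(top)                   # -> ['method_B', 'method_C']
```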

  7. The validation of procedures to assess prevocational task preferences in retarded adults.

    PubMed

    Mithaug, D E; Hanawalt, D A

    1978-01-01

    Three severely retarded young adults between the ages of 19 and 21 years participated in a prevocational training program, and worked regularly on six different tasks during the scheduled six-hour day. The study attempted to assess each subject's preferences for the six tasks: collating, stuffing, sorting, pulley assembly, flour-sifter assembly, and circuit-board stuffing. In Phase I, the procedure consisted of randomly pairing each task with all other tasks in a two-choice situation that required the subjects to select one task from each pair combination to work for a seven-minute period. The selection procedure consisted of presenting two representative task objects on a tray and requesting the subject to pick up one object and place it on the work table. The object selected represented the task worked for that period. The 15 possible pair combinations were presented randomly every two days for a period of 34 days to determine the preferences. During the validation phase (Phase II), each subject's least- and most-preferred tasks were paired separately with moderately-preferred tasks. As expected, these manipulations confirmed the baseline data, as choices for the moderately-preferred tasks decreased when consistently paired with the preferred tasks and increased when consistently paired with the least-preferred tasks.

  8. [Validation of measurement methods and estimation of uncertainty of measurement of chemical agents in the air at workstations].

    PubMed

    Dobecki, Marek

    2012-01-01

    This paper reviews the requirements for measurement methods for chemical agents in the air at workstations. European standards, which have the status of Polish standards, comprise requirements and information on sampling strategy, measuring techniques, types of samplers, sampling pumps, and methods of occupational exposure evaluation for a given technological process. Measurement methods, including air sampling and the analytical procedure in the laboratory, should be appropriately validated before intended use. In the validation process, selected methods are tested and an uncertainty budget is set up. The validation procedure that should be implemented in the laboratory was presented in this paper, together with suitable statistical tools and the major components of uncertainty to be taken into consideration. Methods of quality control, including sampling and laboratory analyses, were discussed. The relative expanded uncertainty of each measurement, expressed as a percentage, should not exceed the limit values set depending on the type of occupational exposure (short-term or long-term) and the magnitude of exposure to chemical agents in the work environment.
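
    A minimal uncertainty-budget sketch in the spirit of the paper: independent relative standard uncertainties combined in quadrature, then expanded with a coverage factor k = 2; the component names and values are illustrative, not limits from the standards.

```python
# Sketch: combine uncertainty components in quadrature and expand (k = 2).
import math

components = {               # relative standard uncertainties (fractions)
    "air sampling (flow rate, time)": 0.05,
    "sample preparation":             0.03,
    "instrumental analysis":          0.04,
    "calibration standards":          0.02,
}

u_c = math.sqrt(sum(u**2 for u in components.values()))
U = 2.0 * u_c                # expanded uncertainty, ~95 % coverage
print(f"combined u_c = {u_c:.1%}, expanded U = {U:.1%}")
```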

  9. "In situ" extraction of essential oils by use of Dean-Stark glassware and a Vigreux column inside a microwave oven: a procedure for teaching green analytical chemistry.

    PubMed

    Chemat, Farid; Perino-Issartier, Sandrine; Petitcolas, Emmanuel; Fernandez, Xavier

    2012-08-01

    One of the principal objectives of sustainable and green processing development remains the dissemination and teaching of green chemistry in colleges, high schools, and academic laboratories. This paper describes simple glassware that illustrates the phenomenon of extraction, using a conventional microwave oven as the energy source, as a process for green analytical chemistry. The glassware, comprising a Dean-Stark apparatus (for extraction of aromatic plant material and recovery of essential oils and distilled water) and a Vigreux column (as an air-cooled condenser inside the microwave oven), was designed as an in-situ extraction vessel inside a microwave oven. The efficiency of this experiment was validated for extraction of essential oils from 30 g of fresh orange peel, a by-product of orange juice production. Every laboratory throughout the world can use this equipment. The microwave power is 100 W and the irradiation time 15 min. The method is performed at atmospheric pressure without added solvent or water and furnishes essential oils similar to those obtained by conventional hydro- or steam distillation. By use of GC-MS, 22 compounds in orange peel were separated and identified; the main compounds were limonene (72.1%), β-pinene (8.4%), and γ-terpinene (6.9%). This procedure is appropriate for the teaching laboratory, does not require any special microwave equipment, and enables students to learn the skills of extraction and of chromatographic and spectroscopic analysis. They are also exposed to a dramatic visual example of rapid, sustainable, and green extraction of an essential oil, and are introduced to successful sustainable and green analytical chemistry.

  10. Nanomaterials in consumer products: a challenging analytical problem.

    PubMed

    Contado, Catia

    2015-01-01

    Many products used in everyday life are made with the assistance of nanotechnologies. Cosmetics, pharmaceuticals, sunscreens, and powdered foods are only a few examples of end products containing nano-sized particles (NPs), generally added to improve product quality. To evaluate correctly the benefits versus the risks of engineered nanomaterials, and consequently to legislate in favor of consumer protection, it is necessary to know the hazards connected with the exposure levels. This information calls for transversal studies and a number of different competences. From an analytical point of view, the identification, quantification, and characterization of NPs in food matrices and in cosmetic or personal care products pose significant challenges, because NPs are usually present at low concentration levels and the matrices in which they are dispersed are complex and often incompatible with the analytical instruments that would be required for their detection and characterization. This paper focuses on analytical techniques suitable for the detection, characterization, and quantification of NPs in food and cosmetic products, and reports their recent application in characterizing specific metal and metal-oxide NPs in these two important industrial and market sectors. What clearly emerges from this research is the need for a characterization of NPs that is as complete as possible, combining complementary information about different metrics and ideally achieved through validated procedures. More work should be done to produce standardized materials, to set up methodologies for determining number-based size distributions, and to obtain quantitative data about NPs in such complex matrices.

  11. Nanomaterials in consumer products: a challenging analytical problem

    NASA Astrophysics Data System (ADS)

    Contado, Catia

    2015-08-01

    Many products used in everyday life are made with the assistance of nanotechnologies. Cosmetics, pharmaceuticals, sunscreens, and powdered foods are only a few examples of end products containing nano-sized particles (NPs), generally added to improve product quality. To evaluate correctly the benefits versus the risks of engineered nanomaterials, and consequently to legislate in favor of consumer protection, it is necessary to know the hazards connected with the exposure levels. This information calls for transversal studies and a number of different competences. From an analytical point of view, the identification, quantification, and characterization of NPs in food matrices and in cosmetic or personal care products pose significant challenges, because NPs are usually present at low concentration levels and the matrices in which they are dispersed are complex and often incompatible with the analytical instruments that would be required for their detection and characterization. This paper focuses on analytical techniques suitable for the detection, characterization, and quantification of NPs in food and cosmetic products, and reports their recent application in characterizing specific metal and metal-oxide NPs in these two important industrial and market sectors. What clearly emerges from this research is the need for a characterization of NPs that is as complete as possible, combining complementary information about different metrics and ideally achieved through validated procedures. More work should be done to produce standardized materials, to set up methodologies for determining number-based size distributions, and to obtain quantitative data about NPs in such complex matrices.

  12. Nanomaterials in consumer products: a challenging analytical problem

    PubMed Central

    Contado, Catia

    2015-01-01

    Many products used in everyday life are made with the assistance of nanotechnologies. Cosmetics, pharmaceuticals, sunscreens, and powdered foods are only a few examples of end products containing nano-sized particles (NPs), generally added to improve product quality. To evaluate correctly the benefits versus the risks of engineered nanomaterials, and consequently to legislate in favor of consumer protection, it is necessary to know the hazards connected with the exposure levels. This information calls for transversal studies and a number of different competences. From an analytical point of view, the identification, quantification, and characterization of NPs in food matrices and in cosmetic or personal care products pose significant challenges, because NPs are usually present at low concentration levels and the matrices in which they are dispersed are complex and often incompatible with the analytical instruments that would be required for their detection and characterization. This paper focuses on analytical techniques suitable for the detection, characterization, and quantification of NPs in food and cosmetic products, and reports their recent application in characterizing specific metal and metal-oxide NPs in these two important industrial and market sectors. What clearly emerges from this research is the need for a characterization of NPs that is as complete as possible, combining complementary information about different metrics and ideally achieved through validated procedures. More work should be done to produce standardized materials, to set up methodologies for determining number-based size distributions, and to obtain quantitative data about NPs in such complex matrices. PMID:26301216

  13. The NIH analytical methods and reference materials program for dietary supplements.

    PubMed

    Betz, Joseph M; Fisher, Kenneth D; Saldanha, Leila G; Coates, Paul M

    2007-09-01

    The quality of botanical products is a great uncertainty facing consumers, clinicians, regulators, and researchers. Definitions of quality abound, and include specifications for sanitation, adventitious agents (pesticides, metals, weeds), and content of natural chemicals. Because dietary supplements (DS) are often complex mixtures, they pose analytical challenges and method validation may be difficult. In response to product quality concerns and the need for validated and publicly available methods for DS analysis, the US Congress directed the Office of Dietary Supplements (ODS) at the National Institutes of Health (NIH) to accelerate an ongoing methods-validation process, and the Dietary Supplements Methods and Reference Materials Program was created. The program was constructed from stakeholder input and incorporates several federal procurement and granting mechanisms in a coordinated and interlocking framework. The framework facilitates the validation of analytical methods, analytical standards, and reference materials.

  14. Surveillance of emerging drugs of abuse in Hong Kong: validation of an analytical tool.

    PubMed

    Tang, Magdalene H Y; Ching, C K; Tse, M L; Ng, Carol; Lee, Caroline; Chong, Y K; Wong, Watson; Mak, Tony W L

    2015-04-01

    This study aimed to validate a locally developed chromatography-based method for monitoring emerging drugs of abuse whilst performing regular drug testing in abusers. It was a cross-sectional study conducted in eleven regional hospitals, seven social service units, and a tertiary-level clinical toxicology laboratory in Hong Kong. A total of 972 drug abusers and high-risk individuals were recruited from acute, rehabilitation, and high-risk settings between 1 November 2011 and 31 July 2013; a subset of the participants was of South Asian ethnicity. In total, 2000 urine or hair specimens were collected. The main outcome measure was proof of concept that surveillance of emerging drugs of abuse can be performed whilst conducting routine drug-of-abuse testing in patients. The method was successfully applied to the 2000 samples, with three emerging drugs of abuse detected in five samples: PMMA (paramethoxymethamphetamine), TFMPP [1-(3-trifluoromethylphenyl)piperazine], and methcathinone. The method also detected conventional drugs of abuse, with codeine, methadone, heroin, methamphetamine, and ketamine the most frequently detected. Other findings included significantly higher rates of opiate use (heroin, methadone, and codeine) among South Asian participants, and significantly higher detection rates of ketamine and cocaine in acute subjects than in the rehabilitation population. This locally developed analytical method is thus a valid tool for simultaneous surveillance of emerging drugs of abuse and routine drug monitoring of patients at minimal additional cost and effort. Continued, proactive surveillance and early identification of emerging drugs will facilitate prompt clinical, social, and legislative management.

  15. Clinical-scale validation of a new efficient procedure for cryopreservation of ex vivo expanded cord blood hematopoietic stem and progenitor cells.

    PubMed

    Duchez, Pascale; Rodriguez, Laura; Chevaleyre, Jean; De La Grange, Philippe Brunet; Ivanovic, Zoran

    2016-12-01

    Survival of ex vivo expanded hematopoietic stem cells (HSC) and progenitor cells is low with the standard cryopreservation procedure. We recently showed that the efficiency of cryopreservation of these cells may be greatly enhanced by adding a serum-free, xeno-free culture medium (HP01, Macopharma), which improves the antioxidant and biochemical properties of the cryopreservation solution. Here we present the clinical-scale validation of this cryopreservation procedure. Hematopoietic cells expanded in clinical-scale cultures were cryopreserved using the new HP01-based procedure. The viability, apoptosis rate and number of functional committed progenitors (methyl-cellulose colony-forming cell test), short-term repopulating HSCs (primary recipient NSG mice) and long-term HSCs (secondary recipient NSG mice) were tested before and after thawing. The efficiency of the clinical-scale procedure reproduced the efficiency of cryopreservation obtained earlier in miniature-sample experiments. Furthermore, full preservation of short- and long-term HSCs was obtained under clinical-scale conditions. Because the results obtained at clinical-scale volume are comparable to our earlier results in miniature-scale cultures, the clinical-scale procedure should be considered validated. It allows cryopreservation of the whole ex vivo expanded culture content while conserving full short- and long-term HSC activity. Copyright © 2016 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.

  16. Development and analytical validation of a radioimmunoassay for the measurement of feline pancreatic lipase immunoreactivity in serum

    PubMed Central

    2004-01-01

    Pancreatitis is recognized as an important cause of morbidity and mortality in cats, but diagnosis remains difficult in many cases. As a first step in trying to identify a better diagnostic tool for feline pancreatitis, the objective of this project was to develop and analytically validate a radioimmunoassay for the measurement of feline pancreatic lipase immunoreactivity (fPLI). Feline pancreatic lipase (fPL) was purified from pancreatic tissue and antiserum against fPL was raised in rabbits. Tracer was produced by iodination of fPL using the chloramine T method. A radioimmunoassay was established and analytically validated by determination of sensitivity, dilutional parallelism, spiking recovery, intra-assay variability, and inter-assay variability. A control range for fPLI in cat serum was established from 30 healthy cats using the central 95th percentile. The sensitivity of the assay was 1.2 μg/L. Observed-to-expected ratios for serial dilutions ranged from 98.8% to 164.3% for 3 different serum samples. Observed-to-expected ratios for spiking recovery ranged from 76.9% to 147.6% for 3 different serum samples. Coefficients of variation for intra- and inter-assay variability for 4 different serum samples were 10.1%, 4.5%, 2.2%, and 3.9% and 24.4%, 15.8%, 16.6%, and 21.3%, respectively. A reference range for fPLI was established as 1.2 to 3.8 μg/L. We conclude that the assay described is sensitive, accurate, and precise, with limited linearity at the lower end and limited reproducibility at the lower and higher ends of the working range. Further studies to evaluate the clinical usefulness of this assay are needed and in progress. PMID:15581227
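
    The validation statistics reported above (coefficients of variation, observed-to-expected ratios, and a central-95% reference interval) follow standard arithmetic; the sketch below reproduces that arithmetic on invented numbers, not the study's raw data.

```python
# Sketch of the validation arithmetic used above: coefficient of
# variation, observed-to-expected recovery, and a central-95% reference
# interval. All data values are illustrative.
import numpy as np

replicates = np.array([3.1, 3.4, 3.0, 3.3])          # one serum sample
cv = 100 * replicates.std(ddof=1) / replicates.mean()

observed, expected = 4.9, 4.2                         # spiking recovery
oe_ratio = 100 * observed / expected

healthy = np.random.default_rng(0).normal(2.5, 0.6, 30)  # 30 healthy cats
low, high = np.percentile(healthy, [2.5, 97.5])       # central 95%

print(f"intra-assay CV: {cv:.1f}%  O/E: {oe_ratio:.1f}%  "
      f"reference range: {low:.1f}-{high:.1f} ug/L")
```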

  17. 37Cl/35Cl isotope ratio analysis in perchlorate by ion chromatography/multicollector-ICPMS: Analytical performance and implication for biodegradation studies.

    PubMed

    Zakon, Yevgeni; Ronen, Zeev; Halicz, Ludwik; Gelman, Faina

    2017-10-01

    In the present study we propose a new analytical method for 37Cl/35Cl analysis in perchlorate by ion chromatography (IC) coupled to multicollector inductively coupled plasma mass spectrometry (MC-ICPMS). The accuracy of the analytical method was validated by analysis of the international perchlorate standard materials USGS-37 and USGS-38; an analytical precision better than ±0.4‰ was achieved. 37Cl/35Cl isotope ratio analysis in perchlorate during a laboratory biodegradation experiment, with microbial cultures enriched from contaminated soil in Israel, resulted in an isotope enrichment factor ε37Cl = −13.3 ± 1‰, which falls in the range reported previously for perchlorate biodegradation by pure microbial cultures. The proposed analytical method may significantly simplify the procedure for isotope analysis of perchlorate currently applied in environmental studies. Copyright © 2017. Published by Elsevier Ltd.
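
    Enrichment factors of this kind are conventionally obtained by fitting the Rayleigh equation, ln(R/R0) = (ε/1000)·ln(f), to measured isotope ratios; the sketch below does so on invented data chosen to yield roughly the ε value reported above.

```python
# Sketch: isotope enrichment factor from the Rayleigh equation,
#   ln(R/R0) = (epsilon/1000) * ln(f),
# with R the 37Cl/35Cl ratio and f the remaining perchlorate fraction.
# The data points are invented to give epsilon close to the cited value.
import numpy as np

f = np.array([1.0, 0.8, 0.5, 0.3, 0.1])                     # remaining fraction
ratio = np.array([1.0000, 1.0030, 1.0093, 1.0161, 1.0311])  # R / R0

slope, intercept = np.polyfit(np.log(f), np.log(ratio), 1)
epsilon = 1000 * slope
print(f"epsilon_37Cl = {epsilon:.1f} per mil")              # ~ -13.3 per mil
```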

  18. Clean Water Act Analytical Methods

    EPA Pesticide Factsheets

    EPA publishes laboratory analytical methods (test procedures) that are used by industries and municipalities to analyze the chemical, physical and biological components of wastewater and other environmental samples required by the Clean Water Act.

  19. Laboratory Workhorse: The Analytical Balance.

    ERIC Educational Resources Information Center

    Clark, Douglas W.

    1979-01-01

    This report explains the importance of various analytical balances in the water or wastewater laboratory. Stressed is the proper procedure for utilizing the equipment as well as the mechanics involved in its operation. (CS)

  20. Configuration and validation of an analytical model predicting secondary neutron radiation in proton therapy using Monte Carlo simulations and experimental measurements.

    PubMed

    Farah, J; Bonfrate, A; De Marzi, L; De Oliveira, A; Delacroix, S; Martinetti, F; Trompier, F; Clairand, I

    2015-05-01

    This study focuses on the configuration and validation of an analytical model predicting leakage neutron doses in proton therapy. Using Monte Carlo (MC) calculations, a facility-specific analytical model was built to reproduce out-of-field neutron doses while separately accounting for the contribution of intra-nuclear cascade, evaporation, epithermal and thermal neutrons. This model was first trained to reproduce in-water neutron absorbed doses and in-air neutron ambient dose equivalents, H*(10), calculated using MCNPX. Its capacity in predicting out-of-field doses at any position not involved in the training phase was also checked. The model was next expanded to enable a full 3D mapping of H*(10) inside the treatment room, tested in a clinically relevant configuration and finally consolidated with experimental measurements. Following the literature approach, the work first proved that it is possible to build a facility-specific analytical model that efficiently reproduces in-water neutron doses and in-air H*(10) values with a maximum difference less than 25%. In addition, the analytical model succeeded in predicting out-of-field neutron doses in the lateral and vertical direction. Testing the analytical model in clinical configurations proved the need to separate the contribution of internal and external neutrons. The impact of modulation width on stray neutrons was found to be easily adjustable while beam collimation remains a challenging issue. Finally, the model performance agreed with experimental measurements with satisfactory results considering measurement and simulation uncertainties. Analytical models represent a promising solution that substitutes for time-consuming MC calculations when assessing doses to healthy organs. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  1. Quantification of endocrine disruptors and pesticides in water by gas chromatography-tandem mass spectrometry. Method validation using weighted linear regression schemes.

    PubMed

    Mansilha, C; Melo, A; Rebelo, H; Ferreira, I M P L V O; Pinho, O; Domingues, V; Pinho, C; Gameiro, P

    2010-10-22

    A multi-residue methodology based on solid phase extraction followed by gas chromatography-tandem mass spectrometry was developed for trace analysis of 32 compounds in water matrices, including estrogens and several pesticides from different chemical families, some of them with endocrine-disrupting properties. Matrix standard calibration solutions were prepared by adding known amounts of the analytes to a residue-free sample to compensate for the matrix-induced chromatographic response enhancement observed for certain pesticides. Validation was done mainly according to the International Conference on Harmonisation recommendations, as well as some European and American validation guidelines with specifications for pesticide analysis and/or GC-MS methodology. As the assumption of homoscedasticity was not met for the analytical data, a weighted least squares linear regression procedure was applied as a simple and effective way to counteract the greater influence of the greater concentrations on the fitted regression line, improving accuracy at the lower end of the calibration curve. The method was considered validated for 31 compounds after consistent evaluation of the key analytical parameters: specificity, linearity, limit of detection and quantification, range, precision, accuracy, extraction efficiency, stability and robustness. Copyright © 2010 Elsevier B.V. All rights reserved.
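
    A minimal sketch of a weighted least squares calibration follows, assuming the common 1/x² weighting scheme; the calibration points and the unknown response are illustrative, not the paper's data.

```python
# Sketch of a weighted least squares calibration with 1/x^2 weights,
# a common choice when variance grows with concentration
# (heteroscedastic data). Calibration points are illustrative.
import numpy as np

x = np.array([0.5, 1, 5, 10, 50, 100.0])            # concentration (ug/L)
y = np.array([0.9, 2.1, 10.3, 19.8, 101.5, 198.0])  # instrument response

w = 1.0 / x**2                       # down-weights the high standards
X = np.column_stack([np.ones_like(x), x])
W = np.diag(w)
# Solve the weighted normal equations (X'WX) beta = X'Wy
intercept, slope = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
print(f"y = {intercept:.4f} + {slope:.4f} x")

# Back-calculate an unknown concentration from its response:
unknown_response = 15.0
print("estimated concentration:", (unknown_response - intercept) / slope)
```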

  2. [Pre-analytical stage for biomarker assessment in breast cancer: 2014 update of the GEFPICS' guidelines in France].

    PubMed

    MacGrogan, Gaëtan; Mathieu, Marie-Christine; Poulet, Bruno; Penault-Llorca, Frédérique; Vincent-Salomon, Anne; Roger, Pascal; Treilleux, Isabelle; Valent, Alexander; Antoine, Martine; Becette, Véronique; Bor, Catherine; Brabencova, Eva; Charafe-Jauffret, Emmanuelle; Chenard, Marie-Pierre; Dauplat, Marie-Mélanie; Delrée, Paul; Devouassoux, Mojgan; Fiche, Maryse; Fondrevelle, Marie-Eve; Fridman, Viviana; Garbar, Christian; Genin, Pascal; Ghnassia, Jean-Pierre; Haudebourg, Juliette; Laberge-Le Couteulx, Sophie; Loussouarn, Delphine; Maran-Gonzalez, Aurélie; Marcy, Myriam; Michenet, Patrick; Sagan, Christine; Trassard, Martine; Verriele, Véronique; Arnould, Laurent; Lacroix-Triki, Magali

    2014-10-01

    Biomarker assessment of breast cancer tumor samples is part of the routine workflow of pathology laboratories. International guidelines have recently been updated, with special regard to the pre-analytical steps that are critical for the quality of immunohistochemical and in situ hybridization procedures, whatever the biomarker analyzed. Fixation and specimen handling protocols must be standardized, validated and carefully tracked. Cooperation and training of the personnel involved in the specimen workflow (e.g. radiologists, surgeons, nurses, technicians and pathologists) are of paramount importance. The GEFPICS update of the recommendations herein details and comments on the different steps of the pre-analytical process. Application of these guidelines and participation in quality assurance programs are mandatory to ensure the correct evaluation of oncotheranostic biomarkers. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  3. Analytical Validation of the ReEBOV Antigen Rapid Test for Point-of-Care Diagnosis of Ebola Virus Infection.

    PubMed

    Cross, Robert W; Boisen, Matthew L; Millett, Molly M; Nelson, Diana S; Oottamasathien, Darin; Hartnett, Jessica N; Jones, Abigal B; Goba, Augustine; Momoh, Mambu; Fullah, Mohamed; Bornholdt, Zachary A; Fusco, Marnie L; Abelson, Dafna M; Oda, Shunichiro; Brown, Bethany L; Pham, Ha; Rowland, Megan M; Agans, Krystle N; Geisbert, Joan B; Heinrich, Megan L; Kulakosky, Peter C; Shaffer, Jeffrey G; Schieffelin, John S; Kargbo, Brima; Gbetuwa, Momoh; Gevao, Sahr M; Wilson, Russell B; Saphire, Erica Ollmann; Pitts, Kelly R; Khan, Sheik Humarr; Grant, Donald S; Geisbert, Thomas W; Branco, Luis M; Garry, Robert F

    2016-10-15

    Ebola virus disease (EVD) is a severe viral illness caused by Ebola virus (EBOV). The 2013-2016 EVD outbreak in West Africa is the largest recorded, with >11 000 deaths. Development of the ReEBOV Antigen Rapid Test (ReEBOV RDT) was expedited to provide a point-of-care test for suspected EVD cases. Recombinant EBOV viral protein 40 antigen was used to derive polyclonal antibodies for RDT and enzyme-linked immunosorbent assay development. ReEBOV RDT limits of detection (LOD), specificity, and interference were analytically validated on the basis of Food and Drug Administration (FDA) guidance. The ReEBOV RDT specificity estimate was 95% for donor serum panels and 97% for donor whole-blood specimens. The RDT demonstrated sensitivity to 3 species of Ebolavirus (Zaire ebolavirus, Sudan ebolavirus, and Bundibugyo ebolavirus) associated with human disease, with no cross-reactivity by pathogens associated with non-EBOV febrile illness, including malaria parasites. Interference testing exhibited no reactivity by medications in common use. The LOD for antigen was 4.7 ng/test in serum and 9.4 ng/test in whole blood. Quantitative reverse transcription-polymerase chain reaction testing of nonhuman primate samples determined the range to be equivalent to 3.0 × 10⁵ to 9.0 × 10⁸ genomes/mL. The analytical validation presented here contributed to the ReEBOV RDT being the first antigen-based assay to receive FDA and World Health Organization emergency use authorization for this EVD outbreak, in February 2015. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail journals.permissions@oup.com.

  4. Validation of Multilevel Constructs: Validation Methods and Empirical Findings for the EDI

    ERIC Educational Resources Information Center

    Forer, Barry; Zumbo, Bruno D.

    2011-01-01

    The purposes of this paper are to highlight the foundations of multilevel construct validation, describe two methodological approaches and associated analytic techniques, and then apply these approaches and techniques to the multilevel construct validation of a widely-used school readiness measure called the Early Development Instrument (EDI;…

  5. Delphi Method Validation of a Procedural Performance Checklist for Insertion of an Ultrasound-Guided Internal Jugular Central Line.

    PubMed

    Hartman, Nicholas; Wittler, Mary; Askew, Kim; Manthey, David

    2016-01-01

    Placement of ultrasound-guided central lines is a critical skill for physicians in several specialties. Improving the quality of care surrounding this procedure demands rigorous measurement of competency, and validated tools to assess performance are essential. Using an iterative, modified Delphi technique with experts in multiple disciplines across the United States, the study team created a 30-item checklist designed to assess competency in the placement of ultrasound-guided internal jugular central lines. Cronbach α was .94, indicating an excellent degree of internal consistency. Further validation of this checklist will require its implementation in simulated and clinical environments. © The Author(s) 2014.
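
    Cronbach's α, the internal-consistency statistic cited above, is computed as α = k/(k−1)·(1 − Σσᵢ²/σ_total²) over the k checklist items; the sketch below applies this formula to an invented rater-by-item score matrix.

```python
# Sketch: Cronbach's alpha for a k-item checklist,
#   alpha = k/(k-1) * (1 - sum(item variances) / variance(total score)).
# The rater-by-item score matrix (1-5 ratings) is invented.
import numpy as np

scores = np.array([      # rows: raters, columns: checklist items
    [2, 3, 2, 3, 2],
    [4, 5, 4, 4, 5],
    [1, 2, 1, 1, 2],
    [3, 3, 3, 4, 3],
])
k = scores.shape[1]
item_var = scores.var(axis=0, ddof=1).sum()
total_var = scores.sum(axis=1).var(ddof=1)
alpha = k / (k - 1) * (1 - item_var / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")   # ~0.97 for this toy matrix
```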

  6. A new analytical method for quantification of olive and palm oil in blends with other vegetable edible oils based on the chromatographic fingerprints from the methyl-transesterified fraction.

    PubMed

    Jiménez-Carvelo, Ana M; González-Casado, Antonio; Cuadros-Rodríguez, Luis

    2017-03-01

    A new analytical method was developed for the quantification of olive oil and palm oil in blends with other vegetable edible oils (canola, safflower, corn, peanut, seeds, grapeseed, linseed, sesame and soybean) using normal-phase liquid chromatography and chemometric tools. The procedure for obtaining the chromatographic fingerprint of the methyl-transesterified fraction of each blend is described. The multivariate quantification methods used were Partial Least Squares Regression (PLS-R) and Support Vector Regression (SVR). The quantification results were evaluated by several parameters, such as the Root Mean Square Error of Validation (RMSEV), Mean Absolute Error of Validation (MAEV) and Median Absolute Error of Validation (MdAEV). It has to be highlighted that, with the newly proposed analytical method, the chromatographic analysis takes only eight minutes; the results obtained showed the potential of this method and allowed quantification of mixtures of olive oil and palm oil with other vegetable oils. Copyright © 2016 Elsevier B.V. All rights reserved.
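
    The modelling and error-metric workflow described above can be sketched as follows, assuming scikit-learn's PLSRegression and synthetic data standing in for real chromatographic fingerprints; RMSEV, MAEV, and MdAEV are computed on an external validation subset.

```python
# Sketch of PLS-based quantification from chromatographic fingerprints
# and the validation-error metrics named above (RMSEV, MAEV, MdAEV).
# Synthetic fingerprints stand in for real chromatograms.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(40, 200))           # 40 blends x 200 retention points
y = rng.uniform(0, 100, size=40)         # % olive oil in the blend
X[:, 10] += 0.05 * y                     # embed a concentration signal

model = PLSRegression(n_components=5).fit(X[:30], y[:30])
pred = model.predict(X[30:]).ravel()     # external validation set
err = pred - y[30:]

rmsev = np.sqrt(np.mean(err**2))
maev = np.mean(np.abs(err))
mdaev = np.median(np.abs(err))
print(f"RMSEV={rmsev:.2f}  MAEV={maev:.2f}  MdAEV={mdaev:.2f}")
```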

  7. Design optimization of an axial-field eddy-current magnetic coupling based on magneto-thermal analytical model

    NASA Astrophysics Data System (ADS)

    Fontchastagner, Julien; Lubin, Thierry; Mezani, Smaïl; Takorabet, Noureddine

    2018-03-01

    This paper presents a design optimization of an axial-flux eddy-current magnetic coupling. The design procedure is based on a torque formula derived from a 3D analytical model and a population-based algorithm. The main objective is to determine the best design in terms of magnet volume in order to transmit a torque between two movers while ensuring a low slip speed and a good efficiency. The torque formula is very accurate and computationally efficient, and is valid for any slip speed value. Nevertheless, in order to solve more realistic problems and take into account thermal effects on the torque, a thermal model based on convection heat transfer coefficients is also established and used in the design optimization procedure. Results show the effectiveness of the proposed methodology.
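
    The optimization pattern described (a population algorithm driving an analytical torque formula) might be sketched as below with SciPy's differential_evolution; the torque expression here is an invented placeholder, not the paper's 3D formula, and the bounds and constraint values are illustrative.

```python
# Sketch of the optimization pattern only: minimize magnet volume
# subject to a minimum transmitted torque, enforced by a penalty term.
# The torque() function is a made-up placeholder, NOT the 3D formula.
import numpy as np
from scipy.optimize import differential_evolution

def torque(r_out, thickness):            # placeholder torque model (N m)
    return 4.0e6 * thickness * r_out**3

def objective(x):
    r_out, thickness = x
    volume = np.pi * r_out**2 * thickness            # magnet volume proxy
    deficit = max(0.0, 50.0 - torque(r_out, thickness))  # require T >= 50 N m
    return volume + 1e-4 * deficit**2                # penalized objective

result = differential_evolution(objective,
                                bounds=[(0.02, 0.10),    # outer radius (m)
                                        (0.002, 0.02)])  # magnet thickness (m)
print(result.x, result.fun)
```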

  8. Approach to method development and validation in capillary electrophoresis for enantiomeric purity testing of active basic pharmaceutical ingredients.

    PubMed

    Sokoliess, Torsten; Köller, Gerhard

    2005-06-01

    A chiral capillary electrophoresis system allowing the determination of the enantiomeric purity of an investigational new drug was developed using a generic method development approach for basic analytes. The method was optimized in terms of type and concentration of both cyclodextrin (CD) and electrolyte, buffer pH, temperature, voltage, and rinsing procedure. Optimal chiral separation of the analyte was obtained using an electrolyte with 2.5% carboxymethyl-beta-CD in 25 mM NaH2PO4 (pH 4.0). Interchanging the inlet and outlet vials after each run improved the method's precision. To assure the method's suitability for the control of enantiomeric impurities in pharmaceutical quality control, its specificity, linearity, precision, accuracy, and robustness were validated according to the requirements of the International Conference on Harmonization. The usefulness of our generic method development approach for the validation of robustness was demonstrated.

  9. A Big Data Analytics Methodology Program in the Health Sector

    ERIC Educational Resources Information Center

    Lawler, James; Joseph, Anthony; Howell-Barber, H.

    2016-01-01

    The benefits of Big Data Analytics are cited frequently in the literature. However, the difficulties of implementing Big Data Analytics can limit the number of organizational projects. In this study, the authors evaluate business, procedural and technical factors in the implementation of Big Data Analytics, applying a methodology program. Focusing…

  10. Analytic tests and their relation to jet fuel thermal stability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heneghan, S.P.; Kauffman, R.E.

    1995-05-01

    The evaluation of jet fuel thermal stability (TS) by simple analytic procedures has long been a goal of fuels chemists. The reason is obvious: if the analytic chemist can determine which types of material cause his test to respond, the refiners will know which materials to remove to improve stability. Complicating this quest is the lack of an acceptable quantitative TS test with which to compare any analytic procedures. To circumvent this problem, we recently compiled the results of TS tests for 12 fuels using six separate test procedures. The results, covering a range of flow and temperature conditions, show that TS is not as dependent on test conditions as previously thought. Also, comparing the results from these tests with several analytic procedures shows that either a measure of the number of phenols or the total sulfur present in jet fuels is strongly indicative of the TS. The phenols have been measured using a cyclic voltammetry technique and the polar material by gas chromatography (atomic emission detection) following a solid phase extraction on silica gel. The polar material has been identified as mainly phenols (by mass spectrometry identification). Measures of the total acid number or peroxide concentration have little correlation with TS.

  11. Validation of an Analytical Method for Determination of 13 priority polycyclic aromatic hydrocarbons in mineral water using dispersive liquid-liquid microextraction and GC-MS

    PubMed Central

    Sadeghi, Ramezan; Kobarfard, Farzad; Yazdanpanah, Hassan; Eslamizad, Samira; Bayat, Mitra

    2016-01-01

    Dispersive liquid-liquid microextraction (DLLME) combined with gas chromatography-mass spectrometry (GC-MS) was used for the extraction and determination of 13 polycyclic aromatic hydrocarbons (PAHs) in mineral water samples. In this procedure, a suitable combination of extraction solvent (500 µL chloroform) and disperser solvent (1000 µL acetone) was quickly injected into the water sample (10.00 mL) using a Hamilton syringe. After centrifugation, 500 µL of the lower organic phase was dried under a gentle stream of nitrogen, re-dissolved in chloroform and injected into the GC-MS. Chloroform and acetone were found to be the best extraction and disperser solvents, respectively. Validation of the method was performed using spiked calibration curves. The enrichment factor ranged from 93 to 129 and the recovery ranged from 71 to 90%. The linear ranges for all the PAHs were 0.10-2.80 ng mL⁻¹. The relative standard deviations (RSDs) of the PAHs in water, using anthracene-d10 as internal standard, were in the range of 4-11% for most of the analytes (n = 3). Limits of detection (LOD) for the different PAHs were between 0.03 and 0.1 ng mL⁻¹. The method was successfully applied to the analysis of PAHs in mineral water samples collected from Tehran. PMID:27610156
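
    The two figures of merit reported above are related by simple arithmetic: the enrichment factor is EF = C_final/C_0 and the extraction recovery is ER% = EF·(V_final/V_aq)·100. A minimal sketch with invented values:

```python
# Sketch of the DLLME figures of merit with invented values: the final
# extract is assumed to be reconstituted in a small volume after drying.
c0 = 1.0         # analyte concentration in the water sample (ng/mL)
c_final = 100.0  # concentration in the final reconstituted extract
v_aq = 10.0      # volume of the water sample (mL)
v_final = 0.08   # volume of the final extract (mL)

ef = c_final / c0                    # enrichment factor
er = ef * (v_final / v_aq) * 100     # extraction recovery, %
print(f"EF = {ef:.0f}, recovery = {er:.0f}%")
```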

  12. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR PERFORMANCE OF COMPUTER SOFTWARE: VERIFICATION AND VALIDATION (UA-D-2.0)

    EPA Science Inventory

    The purpose of this SOP is to define the procedures used for the initial and periodic verification and validation of computer programs used during the Arizona NHEXAS project and the "Border" study. Keywords: Computers; Software; QA/QC.

    The National Human Exposure Assessment Sur...

  13. Considerations regarding the validation of chromatographic mass spectrometric methods for the quantification of endogenous substances in forensics.

    PubMed

    Hess, Cornelius; Sydow, Konrad; Kueting, Theresa; Kraemer, Michael; Maas, Alexandra

    2018-02-01

    The prerequisite for the correct evaluation of forensic toxicological results in daily routine work and scientific studies is reliable analytical data based on validated methods. Validation of a method gives the analyst tools to estimate the efficacy and reliability of the analytical method. Without validation, data might be contested in court and lead to unjustified legal consequences for a defendant. Therefore, new analytical methods to be used in forensic toxicology require careful method development and validation of the final method. Until now, there have been no publications on the validation of chromatographic mass spectrometric methods for the detection of endogenous substances, although endogenous analytes can be important in forensic toxicology (alcohol consumption markers, congener alcohols, gamma-hydroxybutyric acid, human insulin and C-peptide, creatinine, postmortem clinical parameters). For these analytes, conventional validation instructions cannot be followed completely. In this paper, important practical considerations in analytical method validation for endogenous substances are discussed, which may serve as guidance for scientists wishing to develop and validate analytical methods for analytes produced naturally in the human body. In particular, the validation parameters calibration model, analytical limits, accuracy (bias and precision), matrix effects, and recovery have to be approached differently. The highest attention should be paid to selectivity experiments. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Analytical study of comet nucleus samples

    NASA Technical Reports Server (NTRS)

    Albee, A. L.

    1989-01-01

    Analytical procedures for studying and handling frozen (130 K) core samples of comet nuclei are discussed. These methods include neutron activation analysis, X-ray fluorescence analysis, and high-resolution mass spectroscopy.

  15. Nonlinear modelling of high-speed catenary based on analytical expressions of cable and truss elements

    NASA Astrophysics Data System (ADS)

    Song, Yang; Liu, Zhigang; Wang, Hongrui; Lu, Xiaobing; Zhang, Jing

    2015-10-01

    Due to the intrinsic nonlinear characteristics and complex structure of the high-speed catenary system, a modelling method is proposed based on analytical expressions of nonlinear cable and truss elements. The calculation procedure for solving the initial equilibrium state is based on the Newton-Raphson iteration method, from which the deformed configuration of the catenary system as well as the initial length of each wire can be calculated. The accuracy and validity of the computed initial equilibrium state are verified by comparison with the separate model method, the absolute nodal coordinate formulation, and other methods in the previous literature. Then, the proposed model is combined with a lumped pantograph model and a dynamic simulation procedure is proposed, whose accuracy is guaranteed by multiple iterative calculations in each time step. The dynamic performance of the proposed model is validated by comparison with EN 50318, the results of finite element method software, and a SIEMENS simulation report. Finally, the influence of the catenary design parameters (such as the reserved sag and pre-tension) on the dynamic performance is preliminarily analysed using the proposed model.
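
    The Newton-Raphson procedure for an initial equilibrium state amounts to iterating u ← u − J⁻¹R(u) on the nonlinear equilibrium residual until it vanishes. The sketch below applies this to a toy two-wire node under gravity, with a numerically differenced Jacobian; the toy residual is ours, not the paper's catenary model.

```python
# Generic Newton-Raphson sketch for a nonlinear static equilibrium:
# one node hanging between two elastic wires anchored at (-1, 0) and
# (1, 0), under a vertical load. All constants are illustrative.
import numpy as np

def residual(u, k1=100.0, k2=80.0, w=5.0):
    x, y = u
    L1, L2 = np.hypot(x + 1, y), np.hypot(x - 1, y)
    f1 = k1 * (L1 - 1.0)                 # wire tensions (natural length 1)
    f2 = k2 * (L2 - 1.0)
    d1 = np.array([x + 1, y]) / L1       # unit vectors anchor -> node
    d2 = np.array([x - 1, y]) / L2
    return -f1 * d1 - f2 * d2 + np.array([0.0, -w])

def jacobian(f, u, eps=1e-7):            # central finite differences
    J = np.zeros((2, 2))
    for j in range(2):
        du = np.zeros(2)
        du[j] = eps
        J[:, j] = (f(u + du) - f(u - du)) / (2 * eps)
    return J

u = np.array([0.1, -0.5])                # initial guess
for _ in range(50):
    r = residual(u)
    if np.linalg.norm(r) < 1e-10:
        break
    u -= np.linalg.solve(jacobian(residual, u), r)
print("equilibrium position:", u)
```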

  16. Analytical Tools to Improve Optimization Procedures for Lateral Flow Assays

    PubMed Central

    Hsieh, Helen V.; Dantzler, Jeffrey L.; Weigl, Bernhard H.

    2017-01-01

    Immunochromatographic or lateral flow assays (LFAs) are inexpensive, easy-to-use, point-of-care medical diagnostic tests that are found in arenas ranging from a doctor's office in Manhattan to a rural medical clinic in low-resource settings. The simplicity of the LFA itself belies the complex task of optimization required to make the test sensitive, rapid and easy to use. Currently, manufacturers develop LFAs by empirical optimization of material components (e.g., analytical membranes, conjugate pads and sample pads), biological reagents (e.g., antibodies, blocking reagents and buffers) and the design of delivery geometry. In this paper, we review conventional optimization and then focus on the latter, outlining analytical tools, such as dynamic light scattering and optical biosensors, as well as methods, such as microfluidic flow design and mechanistic models. We are applying these tools to find non-obvious optima of lateral flow assays for improved sensitivity, specificity and manufacturing robustness. PMID:28555034

  17. Analytical validation of a reference laboratory ELISA for the detection of feline leukemia virus p27 antigen.

    PubMed

    Buch, Jesse S; Clark, Genevieve H; Cahill, Roberta; Thatcher, Brendon; Smith, Peter; Chandrashekar, Ramaswamy; Leutenegger, Christian M; O'Connor, Thomas P; Beall, Melissa J

    2017-09-01

    Feline leukemia virus (FeLV) is an oncogenic retrovirus of cats. Immunoassays for the p27 core protein of FeLV aid in the detection of FeLV infections. Commercial microtiter-plate ELISAs have rapid protocols and visual result interpretation, limiting their usefulness in high-throughput situations. The purpose of our study was to validate the PetChek FeLV 15 ELISA, which is designed for the reference laboratory, and incorporates sequential, orthogonal screening and confirmatory protocols. A cutoff for the screening assay was established with 100% accuracy using 309 feline samples (244 negative, 65 positive) defined by the combined results of FeLV PCR and an independent reference p27 antigen ELISA. Precision of the screening assay was measured using a panel of 3 samples (negative, low-positive, and high-positive). The intra-assay coefficient of variation (CV) was 3.9-7.9%; the inter-assay CV was 6.0-8.6%. For the confirmatory assay, the intra-assay CV was 3.0-4.7%, and the inter-assay CV was 7.4-9.7%. The analytical sensitivity for p27 antigen was 3.7 ng/mL for inactivated whole FeLV and 1.2 ng/mL for purified recombinant FeLV p27. Analytical specificity was demonstrated based on the absence of cross-reactivity to related retroviruses. No interference was observed for samples containing added bilirubin, hemoglobin, or lipids. Based on these results, the new high-throughput design of the PetChek FeLV 15 ELISA makes it suitable for use in reference laboratory settings and maintains overall analytical performance.

  18. A two-dimensional analytical model and experimental validation of garter stitch knitted shape memory alloy actuator architecture

    NASA Astrophysics Data System (ADS)

    Abel, Julianna; Luntz, Jonathan; Brei, Diann

    2012-08-01

    Active knits are a unique architectural approach to meeting emerging smart structure needs for distributed high strain actuation with simultaneous force generation. This paper presents an analytical state-based model for predicting the actuation response of a shape memory alloy (SMA) garter knit textile. Garter knits generate significant contraction against moderate to large loads when heated, due to the continuous interlocked network of loops of SMA wire. For this knit architecture, the states of operation are defined on the basis of the thermal and mechanical loading of the textile, the resulting phase change of the SMA, and the load path followed to that state. Transitions between these operational states induce either stick or slip frictional forces depending upon the state and path, which affect the actuation response. A load-extension model of the textile is derived for each operational state using elastica theory and Euler-Bernoulli beam bending for the large deformations within a loop of wire based on the stress-strain behavior of the SMA material. This provides kinematic and kinetic relations which scale to form analytical transcendental expressions for the net actuation motion against an external load. This model was validated experimentally for an SMA garter knit textile over a range of applied forces with good correlation for both the load-extension behavior in each state as well as the net motion produced during the actuation cycle (250% recoverable strain and over 50% actuation). The two-dimensional analytical model of the garter stitch active knit provides the ability to predict the kinetic actuation performance, providing the basis for the design and synthesis of large stroke, large force distributed actuators that employ this novel architecture.

  19. Analytical Validation of a Portable Mass Spectrometer Featuring Interchangeable, Ambient Ionization Sources for High Throughput Forensic Evidence Screening

    NASA Astrophysics Data System (ADS)

    Lawton, Zachary E.; Traub, Angelica; Fatigante, William L.; Mancias, Jose; O'Leary, Adam E.; Hall, Seth E.; Wieland, Jamie R.; Oberacher, Herbert; Gizzi, Michael C.; Mulligan, Christopher C.

    2017-06-01

    Forensic evidentiary backlogs are indicative of the growing need for cost-effective, high-throughput instrumental methods. One such emerging technology that shows high promise in meeting this demand, while also allowing on-site forensic investigation, is portable mass spectrometric (MS) instrumentation, particularly that which enables coupling to ambient ionization techniques. While the benefits of rapid, on-site screening of contraband can be anticipated, the inherent legal implications of field-collected data necessitate that the analytical performance of the technology employed be commensurate with accepted techniques. To this end, comprehensive analytical validation studies are required before broad adoption by forensic practitioners can be considered, and such studies are the focus of this work. Pertinent performance characteristics such as throughput, selectivity, accuracy/precision, method robustness, and ruggedness have been investigated. Reliability in the form of false positive/negative response rates is also assessed, examining the effect of variables such as user training and experience level. To provide flexibility toward broad chemical evidence analysis, a suite of rapidly interchangeable ion sources has been developed and characterized through the analysis of common illicit chemicals and emerging threats like substituted phenethylamines.

  20. Modern analytical chemistry in the contemporary world

    NASA Astrophysics Data System (ADS)

    Šíma, Jan

    2016-12-01

    Students not familiar with chemistry tend to misinterpret analytical chemistry as some kind of sorcery in which analytical chemists, working as modern wizards, handle magical black boxes able to provide fascinating results. This view is evidently improper and misleading. Therefore, the position of modern analytical chemistry among the sciences and in the contemporary world is discussed. Its interdisciplinary character, and the necessity of collaboration between analytical chemists and other experts in order to effectively solve the actual problems of human society and the environment, are emphasized. The importance of analytical method validation for obtaining accurate and precise results is highlighted. Invalid results are not only useless; they can often even be fatal (e.g., in clinical laboratories). The curriculum of analytical chemistry at schools and universities is discussed; it should be much broader than traditional equilibrium chemistry coupled with a simple description of individual analytical methods. Above all, the schooling of analytical chemistry should closely connect theory and practice.

  1. An analytic performance model of disk arrays and its application

    NASA Technical Reports Server (NTRS)

    Lee, Edward K.; Katz, Randy H.

    1991-01-01

    As disk arrays become widely used, tools for understanding and analyzing their performance become increasingly important. In particular, performance models can be invaluable in both configuring and designing disk arrays. Accurate analytic performance models are preferable to other types of models because they can be quickly evaluated, are applicable under a wide range of system and workload parameters, and can be manipulated by a range of mathematical techniques. Unfortunately, analytic performance models of disk arrays are difficult to formulate due to the presence of queuing and fork-join synchronization; a disk array request is broken up into independent disk requests which must all complete to satisfy the original request. We develop, validate, and apply an analytic performance model for disk arrays. We derive simple equations for approximating their utilization, response time, and throughput. We then validate the analytic model via simulation and investigate the accuracy of each approximation used in deriving the analytic model. Finally, we apply the analytic model to derive an equation for the optimal unit of data striping in disk arrays.
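
    As a generic illustration of this kind of model (not the paper's actual equations), the sketch below treats each disk as an M/M/1 queue and approximates the fork-join response of a request striped over k disks by the expected maximum of k i.i.d. exponential waits, E[max] ≈ H_k·E[T].

```python
# Generic queueing sketch, not the paper's exact equations: each disk
# is an M/M/1 queue; the fork-join response of a request split across
# k disks is crudely approximated via the k-th harmonic number H_k.
import math

def mm1(arrival_rate, service_rate):
    rho = arrival_rate / service_rate          # utilization
    assert rho < 1, "queue is unstable"
    t = 1.0 / (service_rate - arrival_rate)    # mean response time
    return rho, t

def fork_join_response(arrival_rate, service_rate, k):
    _, t = mm1(arrival_rate, service_rate)
    h_k = sum(1.0 / i for i in range(1, k + 1))
    return h_k * t                             # E[max of k waits] estimate

rho, t_single = mm1(arrival_rate=80.0, service_rate=100.0)  # req/s per disk
print(f"per-disk utilization {rho:.0%}, response {1000*t_single:.1f} ms")
print(f"fork-join over 4 disks: {1000*fork_join_response(80, 100, 4):.1f} ms")
```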

  2. Nonclinical dose formulation analysis method validation and sample analysis.

    PubMed

    Whitmire, Monica Lee; Bryan, Peter; Henry, Teresa R; Holbrook, John; Lehmann, Paul; Mollitor, Thomas; Ohorodnik, Susan; Reed, David; Wietgrefe, Holly D

    2010-12-01

    Nonclinical dose formulation analysis methods are used to confirm test article concentration and homogeneity in formulations and determine formulation stability in support of regulated nonclinical studies. There is currently no regulatory guidance for nonclinical dose formulation analysis method validation or sample analysis. Regulatory guidance for the validation of analytical procedures has been developed for drug product/formulation testing; however, verification of the formulation concentrations falls under the framework of GLP regulations (not GMP). The only current related regulatory guidance is the bioanalytical guidance for method validation. The fundamental parameters that overlap between bioanalysis and formulation analysis validations include: recovery, accuracy, precision, specificity, selectivity, carryover, sensitivity, and stability. Divergence between bioanalytical and drug product validations typically centers on the acceptance criteria used. As the dose formulation samples are not true "unknowns", the concept of quality control samples covering the entire range of the standard curve, serving as the indication of confidence in the data generated from the "unknown" study samples, may not always be necessary. Also, the standard bioanalytical acceptance criteria may not be directly applicable, especially when the determined concentration does not match the target concentration. This paper attempts to reconcile the different practices being performed in the community and to provide recommendations of best practices and proposed acceptance criteria for nonclinical dose formulation method validation and sample analysis.

  3. Multivariate optimization and validation of an analytical methodology by RP-HPLC for the determination of losartan potassium in capsules.

    PubMed

    Bonfilio, Rudy; Tarley, César Ricardo Teixeira; Pereira, Gislaine Ribeiro; Salgado, Hérida Regina Nunes; de Araújo, Magali Benjamim

    2009-11-15

    This paper describes the optimization and validation of an analytical methodology for the determination of losartan potassium in capsules by HPLC using 2^(5-1) fractional factorial and Doehlert designs. This multivariate approach allows a considerable improvement in chromatographic performance using fewer experiments, without additional cost for columns or other equipment. The HPLC method utilized potassium phosphate buffer (pH 6.2; 58 mmol L⁻¹)-acetonitrile (65:35, v/v) as the mobile phase, pumped at a flow rate of 1.0 mL min⁻¹. An octylsilane column (100 mm × 4.6 mm i.d., 5 µm) maintained at 35 °C was used as the stationary phase. UV detection was performed at 254 nm. The method was validated according to the ICH guidelines, showing accuracy, precision (intra-day and inter-day relative standard deviation (R.S.D.) values <2.0%), selectivity, robustness and linearity (r = 0.9998) over a concentration range from 30 to 70 mg L⁻¹ of losartan potassium. The limits of detection and quantification were 0.114 and 0.420 mg L⁻¹, respectively. The validated method may be used to quantify losartan potassium in capsules and to determine the stability of this drug.
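
    A 2^(5-1) fractional factorial design of the kind named above can be constructed directly: build the full 2⁴ design in coded (−1/+1) units and generate the fifth factor from the defining relation E = ABCD, giving a 16-run, resolution V design. A minimal sketch:

```python
# Sketch: construct a 2^(5-1) fractional factorial design in coded
# units by generating the fifth factor from the relation E = ABCD.
import itertools
import numpy as np

full = np.array(list(itertools.product([-1, 1], repeat=4)))  # 16 runs, A-D
e = full.prod(axis=1, keepdims=True)                         # E = ABCD
design = np.hstack([full, e])                                # 16 runs x 5 factors

print(design.shape)       # (16, 5): half of the 32-run full factorial
print(design[:4])
```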

  4. A new method for constructing analytic elements for groundwater flow.

    NASA Astrophysics Data System (ADS)

    Strack, O. D.

    2007-12-01

    The analytic element method is based upon the superposition of analytic functions that are defined throughout the infinite domain and can be used to meet a variety of boundary conditions. Analytic elements have been used successfully for a number of problems, mainly dealing with the Poisson equation (see, e.g., Theory and Applications of the Analytic Element Method, Reviews of Geophysics, 41, 2/1005, 2003, by O.D.L. Strack). The majority of these analytic elements consist of functions that exhibit jumps along lines or curves. Such linear analytic elements have also been developed for other partial differential equations, e.g., the modified Helmholtz equation and the heat equation, and were constructed by integrating elementary solutions, the point sink and the point doublet, along a line. This approach is limiting for two reasons. First, it requires the existence of the elementary solutions; second, the integration tends to limit the range of solutions that can be obtained. We present a procedure for generating analytic elements that requires merely the existence of a harmonic function with the desired properties; such functions exist in abundance. The procedure generalizes this harmonic function in such a way that the resulting expression satisfies the applicable differential equation. The approach is applied, along with numerical examples, to the modified Helmholtz equation and the heat equation, while it is noted that the method is in no way restricted to these equations. The procedure is carried out entirely in terms of complex variables, using Wirtinger calculus.
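
    For orientation, the classic Poisson/Laplace case superposes complex potentials such as a uniform flow plus a point sink, Ω(z) = −Q₀z + (Q/2π)·ln(z − z_w); the sketch below evaluates such a superposition, with all parameter values invented.

```python
# Sketch of classic analytic elements for the Laplace case: complex
# potential of a uniform flow plus a point sink (well), superposed as
# in the analytic element method. Parameter values are illustrative.
import numpy as np

Q0 = 1.0                   # uniform far-field discharge
Q = 5.0                    # well extraction rate
zw = 1.0 + 0.5j            # well location in the complex plane

def omega(z):
    return -Q0 * z + Q / (2 * np.pi) * np.log(z - zw)

z = 2.0 + 1.0j
phi = omega(z).real        # discharge potential
psi = omega(z).imag        # stream function
print(f"Phi = {phi:.4f}, Psi = {psi:.4f}")
```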

  5. Safety validation test equipment operation

    NASA Astrophysics Data System (ADS)

    Kurosaki, Tadaaki; Watanabe, Takashi

    1992-08-01

    An overview is presented of the activities conducted on safety validation test equipment operation for materials used in NASA manned missions. Safety validation tests, such as flammability, odor, and offgassing tests, were conducted in accordance with NASA-NHB-8060.1C using test subjects common with those used by NASA, and the equipment used was qualified for its functions and performance in accordance with NASDA-CR-99124, 'Safety Validation Test Qualification Procedures.' Test procedure systems were established by preparing 'Common Procedures for Safety Validation Test' as well as test procedures for flammability, offgassing, and odor tests. A test operation organization chaired by the General Manager of the Parts and Material Laboratory of NASDA (National Space Development Agency of Japan) was established, and the test leaders and operators in the organization were qualified in accordance with the specified procedures. One hundred and one tests have been conducted so far by the Parts and Material Laboratory in response to requests submitted by manufacturers through the Space Station Group and the Safety and Product Assurance for Manned Systems Office.

  6. Experimentally validated mathematical model of analyte uptake by permeation passive samplers.

    PubMed

    Salim, F; Ioannidis, M; Górecki, T

    2017-11-15

    A mathematical model describing the sampling process in a permeation-based passive sampler was developed and evaluated numerically. The model was applied to the Waterloo Membrane Sampler (WMS), which employs a polydimethylsiloxane (PDMS) membrane as a permeation barrier, and an adsorbent as a receiving phase. Samplers of this kind are used for sampling volatile organic compounds (VOC) from air and soil gas. The model predicts the spatio-temporal variation of sorbed and free analyte concentrations within the sampler components (membrane, sorbent bed and dead volume), from which the uptake rate throughout the sampling process can be determined. A gradual decline in the uptake rate during the sampling process is predicted, which is more pronounced when sampling higher concentrations. Decline of the uptake rate can be attributed to diminishing analyte concentration gradient within the membrane, which results from resistance to mass transfer and the development of analyte concentration gradients within the sorbent bed. The effects of changing the sampler component dimensions on the rate of this decline in the uptake rate can be predicted from the model. Performance of the model was evaluated experimentally for sampling of toluene vapors under controlled conditions. The model predictions proved close to the experimental values. The model provides a valuable tool to predict changes in the uptake rate during sampling, to assign suitable exposure times at different analyte concentration levels, and to optimize the dimensions of the sampler in a manner that minimizes these changes during the sampling period.
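
    The qualitative behaviour described (uptake rate decaying as the receiving phase loads up) can be reproduced with a simple 1-D finite-difference sketch of Fickian diffusion through the membrane into a finite, well-mixed receiving phase; all dimensions and constants below are illustrative, and this is not the authors' model.

```python
# Sketch: explicit 1-D finite-difference diffusion through a membrane
# into a finite, well-mixed receiving phase whose rising concentration
# erodes the gradient, and hence the uptake rate. Constants illustrative.
import numpy as np

D, L, n = 1e-10, 1e-4, 50            # diffusivity (m2/s), thickness (m), nodes
dx = L / (n - 1)
dt = 0.4 * dx**2 / D                 # below the explicit stability limit
c = np.zeros(n)
c[0] = 1.0                           # outer face held at air-side concentration
sink, sink_cap = 0.0, 5e-3           # receiving-phase load and capacity term

for step in range(200_000):
    flux_in = D * (c[-2] - c[-1]) / dx       # flux into receiving phase
    sink += flux_in * dt
    c[-1] = sink / sink_cap                  # free concentration rises
    c[1:-1] += D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
    if step % 50_000 == 0:
        print(f"t={step*dt:8.0f} s  uptake rate ~ {flux_in:.3e}")
```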

  7. The Shedler-Westen Assessment Procedure (SWAP): Evaluating Psychometric Questions about Its Reliability, Validity, and Impact of Its Fixed Score Distribution

    ERIC Educational Resources Information Center

    Blagov, Pavel S.; Bi, Wu; Shedler, Jonathan; Westen, Drew

    2012-01-01

    The Shedler-Westen Assessment Procedure (SWAP) is a personality assessment instrument designed for use by expert clinical assessors. Critics have raised questions about its psychometrics, most notably its validity across observers and situations, the impact of its fixed score distribution on research findings, and its test-retest reliability. We…

  8. Teachable, high-content analytics for live-cell, phase contrast movies.

    PubMed

    Alworth, Samuel V; Watanabe, Hirotada; Lee, James S J

    2010-09-01

    CL-Quant is a new solution platform for broad, high-content, live-cell image analysis. Powered by novel machine learning technologies and teach-by-example interfaces, CL-Quant provides a platform for the rapid development and application of scalable, high-performance, and fully automated analytics for a broad range of live-cell microscopy imaging applications, including label-free phase contrast imaging. The authors used CL-Quant to teach off-the-shelf universal analytics, called standard recipes, for cell proliferation, wound healing, cell counting, and cell motility assays using phase contrast movies collected on the BioStation CT and BioStation IM platforms. Similar to application modules, standard recipes are intended to work robustly across a wide range of imaging conditions without requiring customization by the end user. The authors validated the performance of the standard recipes by comparing their performance with truth created manually, or by custom analytics optimized for each individual movie (and therefore yielding the best possible result for the image), and validated by independent review. The validation data show that the standard recipes' performance is comparable with the validated truth with low variation. The data validate that the CL-Quant standard recipes can provide robust results without customization for live-cell assays in broad cell types and laboratory settings.

  9. Current projects in Pre-analytics: where to go?

    PubMed

    Sapino, Anna; Annaratone, Laura; Marchiò, Caterina

    2015-01-01

    The current clinical practice of tissue handling and sample preparation is multifaceted and lacks strict standardisation: this scenario leads to significant variability in the quality of clinical samples. Poor tissue preservation has a detrimental effect, leading to morphological artefacts, hampering the reproducibility of immunocytochemical and molecular diagnostic results (protein expression, DNA gene mutations, RNA gene expression) and affecting research outcomes with irreproducible gene expression and post-transcriptional data. Altogether, this limits the opportunity to share and pool national databases into common European databases. At the European level, standardisation of pre-analytical steps is just beginning, and issues regarding bio-specimen collection and management are still debated. A joint (public-private) project on the standardisation of tissue handling in pre-analytical procedures has recently been funded in Italy with the aim of proposing novel approaches to the neglected issue of pre-analytical procedures. In this chapter, we show how investing in pre-analytics may affect both public health problems and practical innovation in solid tumour processing.

  10. Analytical Applications of Monte Carlo Techniques.

    ERIC Educational Resources Information Center

    Guell, Oscar A.; Holcombe, James A.

    1990-01-01

    Described are analytical applications of the theory of random processes, in particular solutions obtained by using statistical procedures known as Monte Carlo techniques. Supercomputer simulations, sampling, integration, ensemble, annealing, and explicit simulation are discussed. (CW)
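
    A minimal example of the simplest of these techniques, Monte Carlo integration; the integrand and sample size are arbitrary choices for illustration.

    ```python
    import random

    # Monte Carlo estimate of the integral of f(x) = x**2 over [0, 1] (exact: 1/3),
    # by averaging f at uniformly sampled points; a toy example of the techniques
    # surveyed, not code from the article.
    random.seed(42)
    n = 100_000
    estimate = sum(random.random() ** 2 for _ in range(n)) / n
    print(f"MC estimate: {estimate:.4f}   exact: {1 / 3:.4f}")
    ```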

  11. Analytical and numerical analysis of frictional damage in quasi brittle materials

    NASA Astrophysics Data System (ADS)

    Zhu, Q. Z.; Zhao, L. Y.; Shao, J. F.

    2016-07-01

    Frictional sliding and crack growth are the two main dissipation processes in quasi-brittle materials. Frictional sliding along closed cracks is the origin of macroscopic plastic deformation, while crack growth induces material damage. The main difficulty of modeling is to account for the inherent coupling between these two processes. Various models and associated numerical algorithms have been proposed, but there are so far no analytical solutions, even for simple loading paths, for the validation of such algorithms. In this paper, we first present a micro-mechanical model taking into account the damage-friction coupling for a large class of quasi-brittle materials. The model is formulated by combining a linear homogenization procedure with the Mori-Tanaka scheme and the irreversible thermodynamics framework. As an original contribution, a series of analytical solutions of stress-strain relations are developed for various loading paths. Based on the micro-mechanical model, two numerical integration algorithms are examined. The first involves a coupled friction/damage correction scheme, which is consistent with the coupled nature of the constitutive model. The second contains a friction/damage decoupling scheme with two consecutive steps: the friction correction followed by the damage correction. With the analytical solutions as reference results, the two algorithms are assessed through a series of numerical tests. It is found that the decoupled correction scheme efficiently guarantees systematic numerical convergence.

  12. Analytic modeling of aerosol size distributions

    NASA Technical Reports Server (NTRS)

    Deepack, A.; Box, G. P.

    1979-01-01

    Mathematical functions commonly used for representing aerosol size distributions are studied parametrically. Methods for obtaining best fit estimates of the parameters are described. A catalog of graphical plots depicting the parametric behavior of the functions is presented along with procedures for obtaining analytical representations of size distribution data by visual matching of the data with one of the plots. Examples of fitting the same data with equal accuracy by more than one analytic model are also given.
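
    A hedged sketch of fitting one standard analytic model, a lognormal number distribution, to synthetic size-distribution data; the parameter names and every numeric value are illustrative, not from the paper.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Fit a lognormal number distribution n(D) to synthetic aerosol data.
    # N_t = total number, D_g = median diameter, ln_sg = ln(geometric std dev).
    def lognormal(D, N_t, D_g, ln_sg):
        return (N_t / (np.sqrt(2.0 * np.pi) * ln_sg * D)
                * np.exp(-np.log(D / D_g) ** 2 / (2.0 * ln_sg ** 2)))

    D = np.logspace(-2, 1, 40)                        # diameters, micrometers
    truth = lognormal(D, 1000.0, 0.3, np.log(1.8))    # "true" distribution (assumed)
    rng = np.random.default_rng(0)
    data = truth * (1.0 + 0.05 * rng.standard_normal(D.size))   # 5% noise

    popt, _ = curve_fit(lognormal, D, data, p0=[500.0, 0.1, 0.5])
    print("fitted N_t, D_g, ln(sigma_g):", popt)
    ```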

  13. Examination of efficacious, efficient, and socially valid error-correction procedures to teach sight words and prepositions to children with autism spectrum disorder.

    PubMed

    Kodak, Tiffany; Campbell, Vincent; Bergmann, Samantha; LeBlanc, Brittany; Kurtz-Nelson, Eva; Cariveau, Tom; Haq, Shaji; Zemantic, Patricia; Mahon, Jacob

    2016-09-01

    Prior research shows that learners have idiosyncratic responses to error-correction procedures during instruction. Thus, assessments that identify error-correction strategies to include in instruction can aid practitioners in selecting individualized, efficacious, and efficient interventions. The current investigation conducted an assessment to compare 5 error-correction procedures that have been evaluated in the extant literature and are common in instructional practice for children with autism spectrum disorder (ASD). Results showed that the assessment identified efficacious and efficient error-correction procedures for all participants, and 1 procedure was efficient for 4 of the 5 participants. To examine the social validity of error-correction procedures, participants selected among efficacious and efficient interventions in a concurrent-chains assessment. We discuss the results in relation to prior research on error-correction procedures and current instructional practices for learners with ASD. © 2016 Society for the Experimental Analysis of Behavior.

  14. Thermodynamics of Gas Turbine Cycles with Analytic Derivatives in OpenMDAO

    NASA Technical Reports Server (NTRS)

    Gray, Justin; Chin, Jeffrey; Hearn, Tristan; Hendricks, Eric; Lavelle, Thomas; Martins, Joaquim R. R. A.

    2016-01-01

    A new equilibrium thermodynamics analysis tool was built based on the CEA method using the OpenMDAO framework. The new tool provides forward and adjoint analytic derivatives for use with gradient-based optimization algorithms. The new tool was validated against the original CEA code to ensure an accurate analysis, and the analytic derivatives were validated against finite-difference approximations. Performance comparisons between analytic and finite-difference methods showed a significant speed advantage for the analytic methods. To further test the new analysis tool, a sample optimization was performed to find the optimal air-fuel equivalence ratio, φ, maximizing combustion temperature for a range of different pressures. Collectively, the results demonstrate the viability of the new tool to serve as the thermodynamic backbone for future work on a full propulsion modeling tool.
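
    The derivative check the abstract mentions can be illustrated on a stand-in function; the real tool differentiates the CEA thermodynamic model, not this toy f.

    ```python
    # Validate an analytic derivative against a central finite-difference
    # approximation, in the spirit of the check described. f is a placeholder.
    def f(x):
        return x ** 3 - 2.0 * x

    def dfdx(x):                  # analytic derivative of f
        return 3.0 * x ** 2 - 2.0

    x, h = 1.5, 1e-6
    fd = (f(x + h) - f(x - h)) / (2.0 * h)   # central difference
    print(f"analytic: {dfdx(x):.8f}   finite difference: {fd:.8f}")
    ```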

  15. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... person or laboratory using a test procedure (analytical method) in this Part. (2) Chemistry of the method... (analytical method) provided that the chemistry of the method or the determinative technique is not changed... prevent efficient recovery of organic pollutants and prevent the method from meeting QC requirements, the...

  16. Development of procedure for measurement of Pb isotope ratios in seawater by application of seaFAST sample pre-treatment system and Sector Field Inductively Coupled Plasma Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Vassileva, Emilia; Wysocka, Irena

    2016-12-01

    Anthropogenic Pb in the oceans, derived from high-temperature industrial processes, fuel combustion and incineration, can have an isotopic signature distinct from naturally occurring Pb supplied by rock weathering. To identify the different pollution sources accurately and to quantify their relative contributions, Pb isotope ratios are widely used. Because of the high salt content (approximately 3.5% total dissolved solids) and very low levels of Pb (typically 1 to 100 ng L⁻¹) in seawater, the determination of Pb isotope ratios requires preliminary matrix separation and analyte preconcentration. An analytical protocol for the measurement of Pb isotope ratios in seawater combining the seaFAST sample pre-treatment system and Sector Field Inductively Coupled Plasma Mass Spectrometry (SF ICP-MS) was developed. The seaFAST system was advantageous because of its completely closed working cycle and the small volumes of chemicals introduced in the pre-treatment step, resulting in very low detection limits and procedural blanks. The preconcentration/matrix separation step was also crucial for minimizing isobaric and matrix interferences from the seawater. To differentiate between anthropogenic and natural Pb sources, particular attention was paid to the determination of the 204Pb isotope because of its implications for some geological interpretations. The analytical procedure was validated according to the recommendations of the ISO/IEC 17025 standard by processing the common Pb isotope reference material NIST SRM 981. All major sources of uncertainty were identified and propagated together following the ISO/GUM guidelines. Estimating the total uncertainty associated with each measurement result was a fundamental tool for ranking the main sources of possible bias. The developed analytical procedure was applied to coastal and open seawater samples collected in different regions of

  17. Selecting Statistical Quality Control Procedures for Limiting the Impact of Increases in Analytical Random Error on Patient Safety.

    PubMed

    Yago, Martín

    2017-05-01

    QC planning based on risk management concepts can reduce the probability of harming patients due to an undetected out-of-control error condition. It does this by selecting appropriate QC procedures to decrease the number of erroneous results reported. The selection can be easily made by using published nomograms for simple QC rules when the out-of-control condition results in increased systematic error. However, increases in random error also occur frequently and are difficult to detect, which can result in erroneously reported patient results. A statistical model was used to construct charts for the 1ks and X/χ² rules. The charts relate the increase in the number of unacceptable patient results reported due to an increase in random error with the capability of the measurement procedure. They thus allow for QC planning based on the risk of patient harm due to the reporting of erroneous results. 1ks rules are simple, all-around rules. Their ability to deal with increases in within-run imprecision is minimally affected by the possible presence of significant, stable, between-run imprecision. X/χ² rules perform better when the number of controls analyzed during each QC event is increased to improve QC performance. Using nomograms simplifies the selection of statistical QC procedures to limit the number of erroneous patient results reported due to an increase in analytical random error. The selection largely depends on the presence or absence of stable between-run imprecision. © 2017 American Association for Clinical Chemistry.
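
    A hedged simulation of the underlying idea: an undetected increase in random error inflates the number of reported results that exceed the allowable total error, and a simple 1ks-style rule catches only part of it. TEa, k, and the run structure below are illustrative assumptions, not the paper's model.

    ```python
    import numpy as np

    # Count unacceptable patient results (|error| > TEa) reported under a 1ks-style
    # QC rule (reject the run if any control exceeds k SDs) as random error grows.
    # All quantities are in units of the in-control SD; values are illustrative.
    rng = np.random.default_rng(1)
    TEa, k, n_controls, n_patients = 3.0, 2.5, 2, 50

    for error_factor in (1.0, 1.5, 2.0):      # fold increase in random error
        runs, unacceptable, rejected = 10_000, 0, 0
        for _ in range(runs):
            controls = rng.normal(0.0, error_factor, n_controls)
            if np.any(np.abs(controls) > k):
                rejected += 1                  # run flagged: results not reported
                continue
            patients = rng.normal(0.0, error_factor, n_patients)
            unacceptable += int(np.sum(np.abs(patients) > TEa))
        print(f"SD x{error_factor}: {rejected / runs:5.1%} runs rejected, "
              f"{unacceptable} unacceptable results reported")
    ```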

  18. PESTICIDE ANALYTICAL METHODS TO SUPPORT DUPLICATE-DIET HUMAN EXPOSURE MEASUREMENTS

    EPA Science Inventory

    Historically, analytical methods for determination of pesticides in foods have been developed in support of regulatory programs and are specific to food items or food groups. Most of the available methods have been developed, tested and validated for relatively few analytes an...

  19. 40 CFR 761.392 - Preparing validation study samples.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... establish a surface concentration to be included in the standard operating procedure. The surface levels of... Under § 761.79(d)(4) § 761.392 Preparing validation study samples. (a)(1) To validate a procedure to... surfaces must be ≥20 µg/100 cm2. (2) To validate a procedure to decontaminate a specified surface...

  20. 40 CFR 761.392 - Preparing validation study samples.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... establish a surface concentration to be included in the standard operating procedure. The surface levels of... Under § 761.79(d)(4) § 761.392 Preparing validation study samples. (a)(1) To validate a procedure to... surfaces must be ≥20 µg/100 cm2. (2) To validate a procedure to decontaminate a specified surface...

  1. 40 CFR 761.392 - Preparing validation study samples.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... establish a surface concentration to be included in the standard operating procedure. The surface levels of... Under § 761.79(d)(4) § 761.392 Preparing validation study samples. (a)(1) To validate a procedure to... surfaces must be ≥20 µg/100 cm2. (2) To validate a procedure to decontaminate a specified surface...

  2. Analytic validation and real-time clinical application of an amplicon-based targeted gene panel for advanced cancer

    PubMed Central

    Wing, Michele R.; Reeser, Julie W.; Smith, Amy M.; Reeder, Matthew; Martin, Dorrelyn; Jewell, Benjamin M.; Datta, Jharna; Miya, Jharna; Monk, J. Paul; Mortazavi, Amir; Otterson, Gregory A.; Goldberg, Richard M.; VanDeusen, Jeffrey B.; Cole, Sharon; Dittmar, Kristin; Jaiswal, Sunny; Kinzie, Matthew; Waikhom, Suraj; Freud, Aharon G.; Zhou, Xiao-Ping; Chen, Wei; Bhatt, Darshna; Roychowdhury, Sameek

    2017-01-01

    Multiplex somatic testing has emerged as a strategy to test patients with advanced cancer. We demonstrate our analytic validation approach for a gene hotspot panel and real-time prospective clinical application for any cancer type. The TruSight Tumor 26 assay amplifies 85 somatic hotspot regions across 26 genes. Using cell line and tumor mixes, we observed that 100% of the 14,715 targeted bases had at least 1000x raw coverage. We determined the sensitivity (100%, 95% CI: 96-100%), positive predictive value (100%, 95% CI: 96-100%), reproducibility (100% concordance), and limit of detection (3% variant allele frequency at 1000x read depth) of this assay to detect single nucleotide variants and small insertions and deletions. Next, we applied the assay prospectively in a clinical tumor sequencing study to evaluate 174 patients with metastatic or advanced cancer, including frozen tumors, formalin-fixed tumors, and enriched peripheral blood mononuclear cells in hematologic cancers. We reported one or more somatic mutations in 89 (53%) of the sequenced tumors (167 passing quality filters). Forty-three of these patients (26%) had mutations that would enable eligibility for targeted therapies. This study demonstrates the validity and feasibility of applying TruSight Tumor 26 for pan-cancer testing using multiple specimen types. PMID:29100271
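
    Confidence intervals of the form "100% (95% CI: 96-100%)" are what an exact (Clopper-Pearson) binomial interval gives when every one of roughly 90 positives is detected; the 90/90 count below is an assumption chosen for illustration, not a figure from the paper.

    ```python
    from scipy.stats import beta

    # Exact (Clopper-Pearson) two-sided CI for a binomial proportion.
    def clopper_pearson(x, n, alpha=0.05):
        lo = beta.ppf(alpha / 2.0, x, n - x + 1) if x > 0 else 0.0
        hi = beta.ppf(1.0 - alpha / 2.0, x + 1, n - x) if x < n else 1.0
        return lo, hi

    lo, hi = clopper_pearson(90, 90)   # 90 of 90 variants detected (assumed count)
    print(f"sensitivity 100%, 95% CI: {lo:.0%}-{hi:.0%}")
    ```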

  3. Verification and Validation of EnergyPlus Phase Change Material Model for Opaque Wall Assemblies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tabares-Velasco, P. C.; Christensen, C.; Bianchi, M.

    2012-08-01

    Phase change materials (PCMs) represent a technology that may reduce peak loads and HVAC energy consumption in buildings. A few building energy simulation programs have the capability to simulate PCMs, but their accuracy has not been completely tested. This study shows the procedure used to verify and validate the PCM model in EnergyPlus using a similar approach as dictated by ASHRAE Standard 140, which consists of analytical verification, comparative testing, and empirical validation. This process was valuable, as two bugs were identified and fixed in the PCM model, and version 7.1 of EnergyPlus will have a validated PCM model. Preliminary results using whole-building energy analysis show that careful analysis should be done when designing PCMs in homes, as their thermal performance depends on several variables such as PCM properties and location in the building envelope.

  4. New procedure of quantitative mapping of Ti and Al released from dental implant and Mg, Ca, Fe, Zn, Cu, Mn as physiological elements in oral mucosa by LA-ICP-MS.

    PubMed

    Sajnóg, Adam; Hanć, Anetta; Koczorowski, Ryszard; Barałkiewicz, Danuta

    2017-12-01

    A new procedure for the determination of elements derived from titanium implants and of physiological elements in soft tissues by laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) is presented. The analytical procedure involved the preparation of in-house matrix-matched solid standards with analyte addition, based on the certified reference material (CRM) MODAS-4 Cormorant Tissue. Addition of gelatin, serving as a binding agent, substantially improved the physical properties of the standards. The performance of the analytical method was assessed and validated by calculating parameters such as precision, detection limits, trueness and recovery of the analyte addition using an additional CRM, ERM-BB184 Bovine Muscle. The analyte addition was additionally confirmed by microwave digestion of the solid standards and analysis by solution nebulization ICP-MS. The detection limits range from 1.8 µg g⁻¹ to 450 µg g⁻¹ for Mn and Ca, respectively. The precision values range from 7.3% to 42% for Al and Zn, respectively. The estimated recoveries of the analyte addition lie within 83%-153% for Mn and Cu, respectively. Oral mucosa samples taken from patients treated with titanium dental implants were examined using the developed analytical method. Standards and tissue samples were cryocut into 30 µm thin sections. LA-ICP-MS yielded two-dimensional maps of the distribution of elements in the tested samples, which revealed a high content of Ti and Al derived from the implants. Optical microscope photographs showed numerous micrometre-sized particles in the oral mucosa samples, which suggests that they are residues from the implantation procedure. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Median of patient results as a tool for assessment of analytical stability.

    PubMed

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György

    2015-06-15

    In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor long-term analytical stability as a supplement to the internal control procedures is often needed. Patient data from daily internal control schemes were used for monthly appraisal of analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes run on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable source of material, as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple and reliable tool to monitor the steadiness of analytical practice. Copyright © 2015 Elsevier B.V. All rights reserved.
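
    A minimal sketch of the monitoring idea: compare each month's median of patient results against an allowable-bias specification derived from biological variation. The analyte, the simulated drift, and the 0.3% specification are illustrative assumptions.

    ```python
    import numpy as np

    # Monthly patient-median monitoring against an allowable-bias specification.
    # Baseline median, drift, and the bias limit are assumed for illustration.
    rng = np.random.default_rng(7)
    baseline_median = 140.0            # e.g. a sodium-like analyte, mmol/L
    allowable_bias_pct = 0.3           # desirable specification (assumed)

    for month in range(1, 7):
        drift = 0.1 * month            # simulated slow calibration drift
        results = rng.normal(baseline_median + drift, 2.0, size=800)
        median = float(np.median(results))
        bias_pct = 100.0 * (median - baseline_median) / baseline_median
        flag = "FLAG" if abs(bias_pct) > allowable_bias_pct else "ok"
        print(f"month {month}: median {median:7.2f}  bias {bias_pct:+.2f}%  {flag}")
    ```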

  6. Extended Analytic Device Optimization Employing Asymptotic Expansion

    NASA Technical Reports Server (NTRS)

    Mackey, Jonathan; Sehirlioglu, Alp; Dynsys, Fred

    2013-01-01

    Analytic optimization of a thermoelectric junction often introduces several simplifying assumptions, including constant material properties, fixed known hot and cold shoe temperatures, and thermally insulated leg sides. In fact all of these simplifications will have an effect on device performance, ranging from negligible to significant depending on conditions. Numerical methods, such as Finite Element Analysis or iterative techniques, are often used to perform more detailed analysis and account for these simplifications. While numerical methods may stand as a suitable solution scheme, they are weak in gaining physical understanding and only serve to optimize through iterative searching techniques. Analytic and asymptotic expansion techniques can be used to solve the governing system of thermoelectric differential equations with fewer or less severe assumptions than the classic case. Analytic methods can provide meaningful closed form solutions and generate better physical understanding of the conditions for when simplifying assumptions may be valid. In obtaining the analytic solutions a set of dimensionless parameters, which characterize all thermoelectric couples, is formulated and provides the limiting cases for validating assumptions. Presentation includes optimization of both classic rectangular couples as well as practically and theoretically interesting cylindrical couples using optimization parameters physically meaningful to a cylindrical couple. Solutions incorporate the physical behavior for i) thermal resistance of hot and cold shoes, ii) variable material properties with temperature, and iii) lateral heat transfer through leg sides.

  7. Evolution of microbiological analytical methods for dairy industry needs

    PubMed Central

    Sohier, Danièle; Pavan, Sonia; Riou, Armelle; Combrisson, Jérôme; Postollec, Florence

    2014-01-01

    Traditionally, culture-based methods have been used to enumerate microbial populations in dairy products. Recent developments in molecular methods now enable faster and more sensitive analyses than classical microbiology procedures. These molecular tools allow a detailed characterization of cell physiological states and bacterial fitness and thus, offer new perspectives to integration of microbial physiology monitoring to improve industrial processes. This review summarizes the methods described to enumerate and characterize physiological states of technological microbiota in dairy products, and discusses the current deficiencies in relation to the industry’s needs. Recent studies show that Polymerase chain reaction-based methods can successfully be applied to quantify fermenting microbes and probiotics in dairy products. Flow cytometry and omics technologies also show interesting analytical potentialities. However, they still suffer from a lack of validation and standardization for quality control analyses, as reflected by the absence of performance studies and official international standards. PMID:24570675

  9. Validation Process Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lewis, John E.; English, Christine M.; Gesick, Joshua C.

    This report documents the validation process as applied to projects awarded through Funding Opportunity Announcements (FOAs) within the U.S. Department of Energy Bioenergy Technologies Office (DOE-BETO). It describes the procedures used to protect and verify project data, as well as the systematic framework used to evaluate and track performance metrics throughout the life of the project. This report also describes the procedures used to validate the proposed process design, cost data, analysis methodologies, and supporting documentation provided by the recipients.

  10. Validation of the Modified Surgeon Periorbital Rating of Edema and Ecchymosis (SPREE) Questionnaire: A Prospective Analysis of Facial Plastic and Reconstructive Surgery Procedures.

    PubMed

    Oliver, Jeremie D; Menapace, Deanna; Younes, Ahmed; Recker, Chelsey; Hamilton, Grant; Friedman, Oren

    2018-02-01

    Although periorbital edema and ecchymosis are commonly encountered after facial plastic and reconstructive surgery procedures, there is currently no validated grading scale to qualify these findings. In this study, the modified "Surgeon Periorbital Rating of Edema and Ecchymosis (SPREE)" questionnaire is used as a grading scale for patients undergoing facial plastic surgery procedures. This article aims to validate a uniform grading scale for periorbital edema and ecchymosis using the modified SPREE questionnaire in the postoperative period. This is a prospective study including 82 patients at two routine postoperative visits (second and seventh postoperative days), wherein the staff and resident physicians, physician assistants (PAs), patients, and any accompanying adults were asked to use the modified SPREE questionnaire to score edema and ecchymosis of each eye of the patient who had undergone a plastic surgery procedure. Interrater and intrarater agreements were then examined. Cohen's kappa coefficient was calculated to measure intrarater and interrater agreement between health care professionals (staff physicians and resident physicians); staff physicians and PAs; and staff physicians, patients, and accompanying adults. Good to excellent agreement was identified between staff physicians and resident physicians, as well as between staff physicians and PAs. There was, however, poor agreement between staff physicians, patients, and accompanying adults. In addition, excellent agreement was found for intraobserver reliability during same-day visits. The modified SPREE questionnaire is a validated grading system for use by health care professionals to reliably rate periorbital edema and ecchymosis in the postoperative period. Validation of the modified SPREE questionnaire may improve consistency in medical literature reporting and related outcomes reporting in the future.

  11. Discordance between net analyte signal theory and practical multivariate calibration.

    PubMed

    Brown, Christopher D

    2004-08-01

    Lorber's concept of net analyte signal is reviewed in the context of classical and inverse least-squares approaches to multivariate calibration. It is shown that, in the presence of device measurement error, the classical and inverse calibration procedures have radically different theoretical prediction objectives, and the assertion that the popular inverse least-squares procedures (including partial least squares, principal components regression) approximate Lorber's net analyte signal vector in the limit is disproved. Exact theoretical expressions for the prediction error bias, variance, and mean-squared error are given under general measurement error conditions, which reinforce the very discrepant behavior between these two predictive approaches, and Lorber's net analyte signal theory. Implications for multivariate figures of merit and numerous recently proposed preprocessing treatments involving orthogonal projections are also discussed.

  12. Validation of the Mass-Extraction-Window for Quantitative Methods Using Liquid Chromatography High Resolution Mass Spectrometry.

    PubMed

    Glauser, Gaétan; Grund, Baptiste; Gassner, Anne-Laure; Menin, Laure; Henry, Hugues; Bromirski, Maciej; Schütz, Frédéric; McMullen, Justin; Rochat, Bertrand

    2016-03-15

    A paradigm shift is underway in the field of quantitative liquid chromatography-mass spectrometry (LC-MS) analysis thanks to the arrival of recent high-resolution mass spectrometers (HRMS). The capability of HRMS to perform sensitive and reliable quantifications of a large variety of analytes in HR-full scan mode is showing that it is now realistic to perform quantitative and qualitative analysis with the same instrument. Moreover, HR-full scan acquisition offers a global view of sample extracts and allows retrospective investigations as virtually all ionized compounds are detected with a high sensitivity. In time, the versatility of HRMS together with the increasing need for relative quantification of hundreds of endogenous metabolites should promote a shift from triple-quadrupole MS to HRMS. However, a current "pitfall" in quantitative LC-HRMS analysis is the lack of HRMS-specific guidance for validated quantitative analyses. Indeed, false positive and false negative HRMS detections are rare, albeit possible, if inadequate parameters are used. Here, we investigated two key parameters for the validation of LC-HRMS quantitative analyses: the mass accuracy (MA) and the mass-extraction-window (MEW) that is used to construct the extracted-ion-chromatograms. We propose MA-parameters, graphs, and equations to calculate rational MEW width for the validation of quantitative LC-HRMS methods. MA measurements were performed on four different LC-HRMS platforms. Experimentally determined MEW values ranged between 5.6 and 16.5 ppm and depended on the HRMS platform, its working environment, the calibration procedure, and the analyte considered. The proposed procedure provides a fit-for-purpose MEW determination and prevents false detections.
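
    The arithmetic behind a MEW is simple: a window width in ppm is converted into m/z bounds for the extracted-ion chromatogram. A worked example follows; the m/z value is illustrative, not from the paper.

    ```python
    # Convert a mass-extraction-window (MEW) in ppm into the m/z bounds used to
    # build an extracted-ion chromatogram; a worked example, not the paper's code.
    def mew_bounds(mz, mew_ppm):
        half_width = mz * mew_ppm / 2.0 / 1e6   # symmetric window around target m/z
        return mz - half_width, mz + half_width

    lo, hi = mew_bounds(609.2812, 10.0)         # illustrative precursor, 10 ppm MEW
    print(f"XIC window: {lo:.4f} - {hi:.4f}")
    ```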

  13. Validating a faster method for reconstitution of Crotalidae Polyvalent Immune Fab (ovine).

    PubMed

    Gerring, David; King, Thomas R; Branton, Richard

    2013-07-01

    Reconstitution of CroFab(®) (Crotalidae Polyvalent Immune Fab [ovine]) lyophilized drug product was previously performed using 10 mL sterile water for injection (WFI) followed by up to 36 min of gentle swirling of the vial. CroFab has been clinically demonstrated to be most effective when administered within 6 h of snake envenomation, and improved clinical outcomes are correlated with quicker administration. An alternate reconstitution method was devised, using 18 mL 0.9% saline with manual inversion, with the goal of shortening reconstitution time while maintaining a high-quality, efficacious product. An analytical study was designed to compare the physicochemical properties of 3 separate batches of CroFab when reconstituted using the standard procedure (10 mL WFI with gentle swirling) and a modified rapid procedure using 18 mL 0.9% saline and manual inversion. The physical and chemical characteristics of the same 3 batches were assessed using various analytical methodologies associated with routine quality control release testing. In addition, further analytical methodologies were applied in order to elucidate possible structural changes that might be induced by the changed reconstitution procedure. Batches A, B, and C required mean reconstitution times of 25 min 51 s using the label method and 3 min 07 s (an 88.0% mean decrease) using the modified method. Physicochemical characteristics (color and clarity, pH, purity, protein content, potency) were found to be highly comparable. Characterization assays (dynamic light scattering, analytical ultracentrifugation, LC-MS, SDS-PAGE and circular dichroism spectroscopy) were also all found to be comparable between methods. When comparing CroFab batches reconstituted using the labeled and modified methods, the physicochemical and biological (potency) characteristics of CroFab were not significantly changed when challenged by the various standard analytical methodologies applied in routine quality control analysis

  14. Guided-inquiry laboratory experiments to improve students' analytical thinking skills

    NASA Astrophysics Data System (ADS)

    Wahyuni, Tutik S.; Analita, Rizki N.

    2017-12-01

    This study aims to improve the quality of experiment implementation and the analytical thinking skills of undergraduate students through guided-inquiry laboratory experiments. The study was classroom action research conducted in three cycles with 38 undergraduate students in the second semester of the Biology Education Department of the State Islamic Institute (SII) of Tulungagung, as part of a Chemistry for Biology course. The research instruments were lesson plans, learning observation sheets and the undergraduate students' experimental procedures. Research data were analyzed using a quantitative-descriptive method. The increase in analytical thinking skills was measured using normalized gain scores and a statistical paired t-test. The results showed that the guided-inquiry laboratory experiment model improved both the quality of experiment implementation and analytical thinking skills. The N-gain score for analytical thinking skills increased, although only by 0.03, in the low-increase category, as indicated by the experimental reports. Some undergraduate students had difficulty detecting the relation of one part to another and to an overall structure. The findings suggest that giving feedback on procedural knowledge and experimental reports is important, and that revising the experimental procedure and supplementing it with scaffolding questions is also needed.

  15. Determination of proline in honey: comparison between official methods, optimization and validation of the analytical methodology.

    PubMed

    Truzzi, Cristina; Annibaldi, Anna; Illuminati, Silvia; Finale, Carolina; Scarponi, Giuseppe

    2014-05-01

    The study compares official spectrophotometric methods for the determination of proline content in honey - those of the International Honey Commission (IHC) and the Association of Official Analytical Chemists (AOAC) - with the original Ough method. Results show that the extra time-consuming treatment stages added by the IHC method with respect to the Ough method are pointless. We demonstrate that the AOAC method proves to be the best in terms of accuracy and time saving. The optimized waiting time for the absorbance recording is set at 35 min from the removal of the reaction tubes from the boiling bath used in the sample treatment. The optimized method was validated in the matrix: linearity up to 1800 mg L⁻¹, limit of detection 20 mg L⁻¹, limit of quantification 61 mg L⁻¹. The method was applied to 43 unifloral honey samples from the Marche region, Italy. Copyright © 2013 Elsevier Ltd. All rights reserved.
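
    Detection and quantification limits of this kind are commonly derived from a calibration line via the ICH formulas LOD = 3.3 s/slope and LOQ = 10 s/slope, with s the residual standard deviation; the calibration data below are synthetic, not the honey measurements.

    ```python
    import numpy as np

    # LOD/LOQ from a linear calibration (ICH-style 3.3*s/slope and 10*s/slope).
    # Concentrations and absorbances are synthetic illustration values.
    conc = np.array([200.0, 600.0, 1000.0, 1400.0, 1800.0])   # mg/L
    absb = np.array([0.11, 0.32, 0.52, 0.74, 0.95])

    slope, intercept = np.polyfit(conc, absb, 1)
    residuals = absb - (slope * conc + intercept)
    s = np.sqrt(np.sum(residuals ** 2) / (conc.size - 2))     # residual SD
    print(f"LOD = {3.3 * s / slope:.0f} mg/L,  LOQ = {10 * s / slope:.0f} mg/L")
    ```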

  16. THE VALIDITY OF HUMAN AND COMPUTERIZED WRITING ASSESSMENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronald L. Boring

    2005-09-01

    This paper summarizes an experiment designed to assess the validity of essay grading between holistic and analytic human graders and a computerized grader based on latent semantic analysis. The validity of the grade was gauged by the extent to which the student’s knowledge of the topic correlated with the grader’s expert knowledge. To assess knowledge, Pathfinder networks were generated by the student essay writers, the holistic and analytic graders, and the computerized grader. It was found that the computer generated grades more closely matched the definition of valid grading than did human generated grades.

  17. An analytical design procedure for the determination of effective leading edge extensions on thick delta wings

    NASA Technical Reports Server (NTRS)

    Ghaffari, F.; Chaturvedi, S. K.

    1984-01-01

    An analytical design procedure for leading edge extensions (LEE) was developed for thick delta wings. This LEE device is designed to be mounted to a wing along the pseudo-stagnation stream surface associated with the attached flow design lift coefficient of greater than zero. The intended purpose of this device is to improve the aerodynamic performance of high subsonic and low supersonic aircraft at incidences above that of attached flow design lift coefficient, by using a vortex system emanating along the leading edges of the device. The low pressure associated with these vortices would act on the LEE upper surface and the forward facing area at the wing leading edges, providing an additional lift and effective leading edge thrust recovery. The first application of this technique was to a thick, round edged, twisted and cambered wing of approximately triangular planform having a sweep of 58 deg and aspect ratio of 2.30. The panel aerodynamics and vortex lattice method with suction analogy computer codes were employed to determine the pseudo-stagnation stream surface and an optimized LEE planform shape.

  18. Validated spectroscopic methods for determination of anti-histaminic drug azelastine in pure form: Analytical application for quality control of its pharmaceutical preparations.

    PubMed

    El-Masry, Amal A; Hammouda, Mohammed E A; El-Wasseef, Dalia R; El-Ashry, Saadia M

    2018-02-15

    Two simple, sensitive, rapid, validated and cost-effective spectroscopic methods were established for quantification of the antihistaminic drug azelastine (AZL) in bulk powder as well as in pharmaceutical dosage forms. In the first method (A), the absorbance difference between acidic and basic solutions was measured at 228 nm, whereas in the second method (B), the binary complex formed between AZL and Eosin Y in acetate buffer solution (pH 3) was measured at 550 nm. Different criteria that have a critical influence on the intensity of absorption were studied in depth and optimized so as to achieve the highest absorption. The proposed methods obeyed Beer's law in the concentration ranges of 2.0-20.0 μg·mL⁻¹ and 0.5-15.0 μg·mL⁻¹, with % recovery ± S.D. of (99.84 ± 0.87) and (100.02 ± 0.78) for methods (A) and (B), respectively. Furthermore, the proposed methods were easily applied for quality control of pharmaceutical preparations without any interference from co-formulated additives, and the analytical results were compatible with those obtained by the comparison method, with no significant difference as confirmed by Student's t-test and the variance-ratio F-test. Validation of the proposed methods was performed according to the ICH guidelines in terms of linearity, limit of quantification, limit of detection, accuracy, precision and specificity, and the analytical results were persuasive. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Sewage-based epidemiology in monitoring the use of new psychoactive substances: Validation and application of an analytical method using LC-MS/MS.

    PubMed

    Kinyua, Juliet; Covaci, Adrian; Maho, Walid; McCall, Ann-Kathrin; Neels, Hugo; van Nuijs, Alexander L N

    2015-09-01

    Sewage-based epidemiology (SBE) employs the analysis of sewage to detect and quantify drug use within a community. While SBE has been applied repeatedly for the estimation of classical illicit drugs, only a few studies have investigated new psychoactive substances (NPS). These compounds mimic the effects of illicit drugs through slight modifications to the chemical structures of controlled illicit drugs. We describe the optimization, validation, and application of an analytical method using liquid chromatography coupled to positive electrospray tandem mass spectrometry (LC-ESI-MS/MS) for the determination of seven NPS in sewage: methoxetamine (MXE), butylone, ethylone, methylone, methiopropamine (MPA), 4-methoxymethamphetamine (PMMA), and 4-methoxyamphetamine (PMA). Sample preparation was performed using solid-phase extraction (SPE) with Oasis MCX cartridges. The LC separation was done with a HILIC column (150 × 3 mm, 5 µm), which ensured good resolution of the analytes with a total run time of 19 min. The lower limit of quantification (LLOQ) was between 0.5 and 5 ng/L for all compounds. The method was validated by evaluating the following parameters: sensitivity, selectivity, linearity, accuracy, precision, recoveries and matrix effects. The method was applied to sewage samples collected from sewage treatment plants in Belgium and Switzerland, in which all investigated compounds were detected except MPA and PMA. Furthermore, a consistent presence of MXE was observed in most of the sewage samples, at levels higher than the LLOQ. Copyright © 2015 John Wiley & Sons, Ltd.

  20. A hypothesis-driven physical examination learning and assessment procedure for medical students: initial validity evidence.

    PubMed

    Yudkowsky, Rachel; Otaki, Junji; Lowenstein, Tali; Riddle, Janet; Nishigori, Hiroshi; Bordage, Georges

    2009-08-01

    Diagnostic accuracy is maximised by having clinical signs and diagnostic hypotheses in mind during the physical examination (PE). This diagnostic reasoning approach contrasts with the rote, hypothesis-free screening PE learned by many medical students. A hypothesis-driven PE (HDPE) learning and assessment procedure was developed to provide targeted practice and assessment in anticipating, eliciting and interpreting critical aspects of the PE in the context of diagnostic challenges. This study was designed to obtain initial content validity evidence, performance and reliability estimates, and impact data for the HDPE procedure. Nineteen clinical scenarios were developed, covering 160 PE manoeuvres. A total of 66 Year 3 medical students prepared for and encountered three clinical scenarios during required formative assessments. For each case, students listed anticipated positive PE findings for two plausible diagnoses before examining the patient; examined a standardised patient (SP) simulating one of the diagnoses; received immediate feedback from the SP, and documented their findings and working diagnosis. The same students later encountered some of the scenarios during their Year 4 clinical skills examination. On average, Year 3 students anticipated 65% of the positive findings, correctly performed 88% of the PE manoeuvres and documented 61% of the findings. Year 4 students anticipated and elicited fewer findings overall, but achieved proportionally more discriminating findings, thereby more efficiently achieving a diagnostic accuracy equivalent to that of students in Year 3. Year 4 students performed better on cases on which they had received feedback as Year 3 students. Twelve cases would provide a reliability of 0.80, based on discriminating checklist items only. The HDPE provided medical students with a thoughtful, deliberate approach to learning and assessing PE skills in a valid and reliable manner.
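
    A projection like "twelve cases would provide a reliability of 0.80" can be illustrated with the Spearman-Brown formula, assuming a single-case reliability of 0.25; the study's own analysis may rest on a different model, so this is a hedged sketch only.

    ```python
    # Spearman-Brown projection of reliability as the number of cases grows.
    # r1 = 0.25 is an assumed single-case reliability chosen so that 12 cases
    # project to 0.80; the study's actual reliability model may differ.
    def spearman_brown(r1, n):
        return n * r1 / (1.0 + (n - 1) * r1)

    r1 = 0.25
    for n in (3, 6, 12):
        print(f"{n:2d} cases -> projected reliability {spearman_brown(r1, n):.2f}")
    ```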

  1. Using the Multiple-Choice Procedure to Measure the Relative Reinforcing Efficacy of Gambling: Initial Validity Evidence Among College Students.

    PubMed

    Butler, Leon H; Irons, Jessica G; Bassett, Drew T; Correia, Christopher J

    2018-06-01

    The multiple choice procedure (MCP) is used to assess the relative reinforcing value of concurrently available stimuli. The MCP was originally developed to assess the reinforcing value of drugs; the current within-subjects study employed the MCP to assess the reinforcing value of gambling behavior. Participants (N = 323) completed six versions of the MCP that presented hypothetical choices between money to be used while gambling ($10 or $25) versus escalating amounts of guaranteed money available immediately or after delays of either 1 week or 1 month. Results suggest that choices on the MCP are correlated with other measures of gambling behavior, thus providing concurrent validity data for using the MCP to quantify the relative reinforcing value of gambling. The MCP for gambling also displayed sensitivity to reinforcer magnitude and delay effects, which provides evidence of criterion validity. The results are consistent with a behavioral economic model of addiction and suggest that the MCP could be a valid tool for future research on gambling behavior.

  2. Modification and validation of an analytical source model for external beam radiotherapy Monte Carlo dose calculations.

    PubMed

    Davidson, Scott E; Cui, Jing; Kry, Stephen; Deasy, Joseph O; Ibbott, Geoffrey S; Vicic, Milos; White, R Allen; Followill, David S

    2016-08-01

    A dose calculation tool, which combines the accuracy of the dose planning method (DPM) Monte Carlo code with the versatility of a practical analytical multisource model and which was previously reported, has been improved and validated for the Varian 6 and 10 MV linear accelerators (linacs). The calculation tool can be used to calculate doses in advanced clinical application studies. One shortcoming of current clinical trials that report dose from patient plans is the lack of a standardized dose calculation methodology. Because commercial treatment planning systems (TPSs) have their own dose calculation algorithms and the clinical trial participant who uses these systems is responsible for commissioning the beam model, variation exists in the reported calculated dose distributions. Today's modern linac is manufactured to tight specifications so that variability within a linac model is quite low. The expectation is that a single dose calculation tool for a specific linac model can be used to accurately recalculate dose from patient plans that have been submitted to the clinical trial community from any institution. The calculation tool would provide for a more meaningful outcome analysis. The analytical source model was described by a primary point source, a secondary extra-focal source, and a contaminant electron source. Off-axis energy softening and fluence effects were also included. The additions of hyperbolic functions have been incorporated into the model to correct for the changes in output and in electron contamination with field size. A multileaf collimator (MLC) model is included to facilitate phantom and patient dose calculations. An offset to the MLC leaf positions was used to correct for the rudimentary assumed primary point source. Dose calculations of the depth dose and profiles for field sizes 4 × 4 to 40 × 40 cm agree with measurement within 2% of the maximum dose or 2 mm distance to agreement (DTA) for 95% of the data points tested. The model was

  3. Modification and validation of an analytical source model for external beam radiotherapy Monte Carlo dose calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davidson, Scott E., E-mail: sedavids@utmb.edu

    Purpose: A dose calculation tool, which combines the accuracy of the dose planning method (DPM) Monte Carlo code with the versatility of a practical analytical multisource model and which was previously reported, has been improved and validated for the Varian 6 and 10 MV linear accelerators (linacs). The calculation tool can be used to calculate doses in advanced clinical application studies. One shortcoming of current clinical trials that report dose from patient plans is the lack of a standardized dose calculation methodology. Because commercial treatment planning systems (TPSs) have their own dose calculation algorithms and the clinical trial participant who uses these systems is responsible for commissioning the beam model, variation exists in the reported calculated dose distributions. Today's modern linac is manufactured to tight specifications so that variability within a linac model is quite low. The expectation is that a single dose calculation tool for a specific linac model can be used to accurately recalculate dose from patient plans that have been submitted to the clinical trial community from any institution. The calculation tool would provide for a more meaningful outcome analysis. Methods: The analytical source model was described by a primary point source, a secondary extra-focal source, and a contaminant electron source. Off-axis energy softening and fluence effects were also included. The additions of hyperbolic functions have been incorporated into the model to correct for the changes in output and in electron contamination with field size. A multileaf collimator (MLC) model is included to facilitate phantom and patient dose calculations. An offset to the MLC leaf positions was used to correct for the rudimentary assumed primary point source. Results: Dose calculations of the depth dose and profiles for field sizes 4 × 4 to 40 × 40 cm agree with measurement within 2% of the maximum dose or 2 mm distance to agreement (DTA) for 95% of the

  4. Predictive analytics and child protection: constraints and opportunities.

    PubMed

    Russell, Jesse

    2015-08-01

    This paper considers how predictive analytics might inform, assist, and improve decision making in child protection. Predictive analytics builds on recent increases in data quantity and data diversity, along with advances in computing technology. While the use of data and statistical modeling is not new to child protection decision making, it is experiencing growth, and efforts to leverage predictive analytics for better decision making in child protection are increasing. Past experiences, constraints, and opportunities are reviewed. For predictive analytics to make the most impact on child protection practice and outcomes, it must embrace established criteria of validity, equity, reliability, and usefulness. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. Sample Collection Procedures and Strategies

    EPA Pesticide Factsheets

    Individuals responsible for collecting environmental and building material samples following a contamination incident can use these procedures to plan for and/or collect samples for analysis using the analytical methods listed in EPA's SAM.

  6. In Defense of an Instrument-Based Approach to Validity

    ERIC Educational Resources Information Center

    Hood, S. Brian

    2012-01-01

    Paul E. Newton argues in favor of a conception of validity, viz, "the consensus definition of validity," according to which the extension of the predicate "is valid" is a subset of "assessment-based decision-making procedure[s], which [are] underwritten by an argument that the assessment procedure can be used to measure the attribute entailed by…

  7. Development and validation of an UHPLC-MS/MS method for β2-agonists quantification in human urine and application to clinical samples.

    PubMed

    Bozzolino, Cristina; Leporati, Marta; Gani, Federica; Ferrero, Cinzia; Vincenti, Marco

    2018-02-20

    A fast analytical method for the simultaneous detection of 24 β2-agonists in human urine was developed and validated. The method covers not only the most commonly administered therapeutic drugs but also potentially abused β2-agonists. The procedure is based on enzymatic deconjugation with β-glucuronidase followed by SPE clean-up using mixed-phase cartridges with both ion-exchange and lipophilic properties. Instrumental analysis conducted by UHPLC-MS/MS allowed high peak resolution and rapid chromatographic separation, with reduced time and costs. The method was fully validated according to ISO 17025:2005 principles. The following parameters were determined for each analyte: specificity, selectivity, linearity, limit of detection, limit of quantification, precision, accuracy, matrix effect, recovery and carry-over. The method was tested on real samples obtained from patients subjected to clinical treatment under chronic or acute therapy with formoterol, indacaterol, salbutamol, or salmeterol. The drugs were administered using pressurized metered-dose inhalers. All β2-agonists administered to the patients were detected in the real samples, and the method proved adequate to accurately measure the concentration of these analytes. The observed analytical data are discussed with reference to the administered dose and the duration of therapy. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Analytical Ultrasonics in Materials Research and Testing

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1986-01-01

    Research results in analytical ultrasonics for characterizing structural materials from metals and ceramics to composites are presented. General topics covered by the conference included: status and advances in analytical ultrasonics for characterizing material microstructures and mechanical properties; status and prospects for ultrasonic measurements of microdamage, degradation, and underlying morphological factors; status and problems in precision measurements of frequency-dependent velocity and attenuation for materials analysis; procedures and requirements for automated, digital signal acquisition, processing, analysis, and interpretation; incentives for analytical ultrasonics in materials research and materials processing, testing, and inspection; and examples of progress in ultrasonics for interrelating microstructure, mechanical properties, and dynamic response.

  9. Validation of the Classroom Behavior Inventory

    ERIC Educational Resources Information Center

    Blunden, Dale; And Others

    1974-01-01

    Factor-analytic methods were used to assess the construct validity of the Classroom Behavior Inventory, a scale for rating behaviors associated with hyperactivity. The Classroom Behavior Inventory measures three dimensions of behavior: Hyperactivity, Hostility, and Sociability. Significant concurrent validity was obtained for only one Classroom Behavior…

  10. Control Chart on Semi Analytical Weighting

    NASA Astrophysics Data System (ADS)

    Miranda, G. S.; Oliveira, C. C.; Silva, T. B. S. C.; Stellato, T. B.; Monteiro, L. R.; Marques, J. R.; Faustino, M. G.; Soares, S. M. V.; Ulrich, J. C.; Pires, M. A. F.; Cotrim, M. E. B.

    2018-03-01

    Semi-analytical balance verification aims to assess balance performance using graphs that illustrate measurement dispersion through time, and to demonstrate that measurements were performed in a reliable manner. This study presents internal quality control of a semi-analytical balance (GEHAKA BG400) using control charts. From 2013 to 2016, two weight standards were monitored before any balance operation. This work evaluated whether any significant difference or bias appeared in the weighing procedure over time, in order to check the reliability of the generated data. This work also exemplifies how control intervals are established.
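
    A minimal sketch of such a control chart for check-weighings of a standard mass, with the centerline and ±3 SD limits estimated from a baseline period; the readings are simulated, not GEHAKA BG400 data.

    ```python
    import numpy as np

    # Shewhart-style control chart for daily check-weighings of a standard mass.
    # Baseline readings set the centerline and +/- 3 SD limits; values simulated.
    rng = np.random.default_rng(3)
    baseline = rng.normal(100.0000, 0.0002, size=30)       # grams (assumed)
    center, sd = baseline.mean(), baseline.std(ddof=1)
    ucl, lcl = center + 3.0 * sd, center - 3.0 * sd

    new_readings = rng.normal(100.0001, 0.0002, size=5)    # later daily checks
    for day, w in enumerate(new_readings, 1):
        status = "in control" if lcl <= w <= ucl else "OUT OF CONTROL"
        print(f"day {day}: {w:.5f} g  {status}")
    ```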

  11. Relative frequencies of constrained events in stochastic processes: An analytical approach.

    PubMed

    Rusconi, S; Akhmatskaya, E; Sokolovski, D; Ballard, N; de la Cal, J C

    2015-10-01

    The stochastic simulation algorithm (SSA) and the corresponding Monte Carlo (MC) method are among the most common approaches for studying stochastic processes. They rely on knowledge of interevent probability density functions (PDFs) and on information about dependencies between all possible events. Analytical representations of a PDF are difficult to specify in advance in many real-life applications. Knowing the shapes of PDFs, and using experimental data, different optimization schemes can be applied in order to evaluate probability density functions and, therefore, the properties of the studied system. Such methods, however, are computationally demanding, and often not feasible. We show that, in the case where experimentally accessed properties are directly related to the frequencies of the events involved, it may be possible to replace the heavy Monte Carlo core of optimization schemes with an analytical solution. Such a replacement not only provides a more accurate estimation of the properties of the process, but also reduces the simulation time by a factor on the order of the sample size (at least ≈10⁴). The proposed analytical approach is valid for any choice of PDF. The accuracy, computational efficiency, and advantages of the method over MC procedures are demonstrated in an exactly solvable case and in the evaluation of branching fractions in controlled radical polymerization (CRP) of acrylic monomers. This polymerization can be modeled by a constrained stochastic process. Constrained systems are quite common, and this makes the method useful for various applications.
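
    The idea of replacing an MC core with an analytic result can be seen in a small exactly solvable case: for two competing exponential events with rates r1 and r2, the relative frequency of the first event is r1/(r1+r2). The rates below are arbitrary.

    ```python
    import random

    # Two competing exponential events: the Monte Carlo relative frequency of
    # event 1 should converge to the analytic value r1 / (r1 + r2).
    random.seed(0)
    r1, r2, n = 2.0, 5.0, 100_000
    wins = sum(random.expovariate(r1) < random.expovariate(r2) for _ in range(n))
    print(f"MC: {wins / n:.4f}   analytic: {r1 / (r1 + r2):.4f}")
    ```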

  12. Development of a solvent-free analytical method for paracetamol quantitative determination in Blood Brain Barrier in vitro model.

    PubMed

    Langlois, Marie-Hélène; Vekris, Antonios; Bousses, Christine; Mordelet, Elodie; Buhannic, Nathalie; Séguard, Céline; Couraud, Pierre-Olivier; Weksler, Babette B; Petry, Klaus G; Gaudin, Karen

    2015-04-15

    A Reversed Phase-High Performance Liquid Chromatography/Diode Array Detection method was developed and validated for paracetamol quantification in cell culture fluid from an in vitro Blood Brain Barrier model. The chromatographic method and sample preparation were developed using only aqueous solvents. The column was an XTerra RP18 150 × 4.6 mm, 3.5 μm with a guard column XTerra RP18 20 × 4.6 mm, 3.5 μm at 35 °C; the mobile phase was 20 mM formate buffer at pH 4 (100% aqueous) at a flow rate of 1 mL/min. Detection was at 242 nm and the injection volume was 10 μL. Validation was performed using the accuracy profile approach. The analytical procedure was validated with acceptance limits of ± 10% over a concentration range from 1 to 58 mg L⁻¹. The procedure was then used routinely to determine paracetamol concentrations in a Blood Brain Barrier in vitro model. Application of the Unither paracetamol formulation in the Blood Brain Barrier model allowed the determination and comparison of the transcellular passage of paracetamol at 37 °C and 4 °C, which excludes paracellular or nonspecific leakage. Copyright © 2015 Elsevier B.V. All rights reserved.
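
    The accuracy profile mentioned above compares a β-expectation tolerance interval of the relative error at each concentration level against the ±10% acceptance limits. The sketch below is a simplified single-series version (a full accuracy profile separates within- and between-run variance components); the replicate values are hypothetical.

    ```python
    import numpy as np
    from scipy import stats

    def tolerance_interval(measured, nominal, beta=0.95):
        """Simplified beta-expectation tolerance interval for the relative
        error (%) at one level, treating all replicates as a single series."""
        rel_err = 100 * (np.asarray(measured, dtype=float) - nominal) / nominal
        n = rel_err.size
        mean, sd = rel_err.mean(), rel_err.std(ddof=1)
        k = stats.t.ppf((1 + beta) / 2, df=n - 1) * np.sqrt(1 + 1 / n)
        return mean, mean - k * sd, mean + k * sd

    # Hypothetical replicates (mg/L) at a 10 mg/L nominal level
    bias, low, high = tolerance_interval([9.8, 10.1, 9.9, 10.2, 10.0, 9.7], nominal=10.0)
    accepted = low > -10 and high < 10   # the study's +/-10% acceptance limits
    print(f"bias={bias:+.1f}%, interval=[{low:.1f}%, {high:.1f}%], accepted={accepted}")
    ```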

  13. Comparison of analytical and predictive methods for water, protein, fat, sugar, and gross energy in marine mammal milk.

    PubMed

    Oftedal, O T; Eisert, R; Barrell, G K

    2014-01-01

    Mammalian milks may differ greatly in composition from cow milk, and these differences may affect the performance of analytical methods. High-fat, high-protein milks with a preponderance of oligosaccharides, such as those produced by many marine mammals, present a particular challenge. We compared the performance of several methods against reference procedures using Weddell seal (Leptonychotes weddellii) milk of highly varied composition (by reference methods: 27-63% water, 24-62% fat, 8-12% crude protein, 0.5-1.8% sugar). A microdrying step preparatory to carbon-hydrogen-nitrogen (CHN) gas analysis slightly underestimated water content and had a higher repeatability relative standard deviation (RSDr) than did reference oven drying at 100 °C. Compared with a reference macro-Kjeldahl protein procedure, the CHN (or Dumas) combustion method had a somewhat higher RSDr (1.56 vs. 0.60%) but correlation between methods was high (0.992), means were not different (CHN: 17.2±0.46% dry matter basis; Kjeldahl 17.3±0.49% dry matter basis), there were no significant proportional or constant errors, and predictive performance was high. A carbon stoichiometric procedure based on CHN analysis failed to adequately predict fat (reference: Röse-Gottlieb method) or total sugar (reference: phenol-sulfuric acid method). Gross energy content, calculated from energetic factors and results from reference methods for fat, protein, and total sugar, accurately predicted gross energy as measured by bomb calorimetry. We conclude that the CHN (Dumas) combustion method and calculation of gross energy are acceptable analytical approaches for marine mammal milk, but fat and sugar require separate analysis by appropriate analytic methods and cannot be adequately estimated by carbon stoichiometry. Some other alternative methods: low-temperature drying for water determination; Bradford, Lowry, and biuret methods for protein; the Folch and the Bligh and Dyer methods for fat; and enzymatic and reducing

  14. 21 CFR 1270.31 - Written procedures.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... procedures prepared and followed for all significant steps in the infectious disease testing process under... procedures prepared, validated, and followed for prevention of infectious disease contamination or cross...

  15. 21 CFR 1270.31 - Written procedures.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... procedures prepared and followed for all significant steps in the infectious disease testing process under... procedures prepared, validated, and followed for prevention of infectious disease contamination or cross...

  16. 21 CFR 1270.31 - Written procedures.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... procedures prepared and followed for all significant steps in the infectious disease testing process under... procedures prepared, validated, and followed for prevention of infectious disease contamination or cross...

  17. 21 CFR 1270.31 - Written procedures.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... procedures prepared and followed for all significant steps in the infectious disease testing process under... procedures prepared, validated, and followed for prevention of infectious disease contamination or cross...

  18. A factor analytic investigation of the Mercy Evaluation of Multiple Sclerosis.

    PubMed

    Merz, Zachary C; Wright, John D; Vander Wal, Jillon S; Gfeller, Jeffrey D

    2018-01-23

    Neurocognitive deficits are commonly an accompanying feature of Multiple Sclerosis (MS). A brief yet comprehensive neuropsychological battery is desirable for assessing the extent of these deficits. Therefore, the present study examined the validity of the Mercy Evaluation of Multiple Sclerosis (MEMS) for use with the MS population. Archival data from individuals diagnosed with MS (N = 378) by independent neurologists were examined. Cognitive domains assessed included processing speed and attention; learning and memory; and visuospatial, language, and executive functioning. A mean battery index was calculated to provide a general indicator of cognitive impairment within the current sample. Overall performance across participants was found to be in the lower limits of the average range. Results of factor analytic statistical procedures yielded a four-factor solution, accounting for 67% of total variance within the MEMS. Four neurocognitive measures exhibited the highest sensitivity in detecting cognitive impairment, constituting a psychometrically established brief cognitive screening battery, which accounted for 83% of total variance within the mean battery index score. Overall, the results of the current study suggest appropriate construct validity of the MEMS for use with individuals with MS, and provide support for previously established cognitive batteries.

  19. Validity Studies of the Filial Anxiety Scale.

    ERIC Educational Resources Information Center

    Murray, Paul D.; And Others

    1996-01-01

    Factor analytic and construct validity studies were conducted to explore the validity of Cicirelli's 13-item Filial Anxiety Scale (FAS). The State-Trait Anxiety Inventory and the Marlowe-Crowne Social Desirability Scale were a part of the investigation. Results offer support for the validity of the FAS subscales and the FAS' usefulness as an…

  20. The use of analytical models in human-computer interface design

    NASA Technical Reports Server (NTRS)

    Gugerty, Leo

    1991-01-01

    Some of the many analytical models in human-computer interface design that are currently being developed are described. The usefulness of analytical models for human-computer interface design is evaluated. Can the use of analytical models be recommended to interface designers? The answer, based on the empirical research summarized here, is: not at this time. There are too many unanswered questions concerning the validity of models and their ability to meet the practical needs of design organizations.

  1. Analytical procedure for the determination of very volatile organic compounds (C3-C6) in indoor air.

    PubMed

    Schieweck, Alexandra; Gunschera, Jan; Varol, Deniz; Salthammer, Tunga

    2018-05-01

    The substance group of very volatile organic compounds (VVOCs) is moving into the focus of indoor air analysis, given ongoing regulations at international and European levels targeting indoor air quality and human health. However, there exists at present no validated analytical method for the identification and quantification of VVOCs in indoor air. Therefore, the present study targeted the development of an analytical method to sample the maximum possible quantity of VVOCs in indoor air on solid sorbents, with subsequent analysis by thermal desorption and coupled gas chromatography/mass spectrometry (TDS-GC/MS). For this purpose, it was necessary to investigate the performance of available sorbents and to optimize the parameters of the GC/MS analysis. Stainless steel tubes filled with Carbograph 5TD were applied successfully for low-volume sampling (2-4 l) with minimal breakthrough (< 1%). With the developed method, VVOCs between C3 and C6 of different volatility and polarity can be detected even in trace quantities with low limits of quantitation (LOQ; 1-3 μg m⁻³). Limitations occur for low molecular weight compounds ≤C3, especially for polar substances such as carboxylic acids and for some aldehydes and alcohols. Consequently, established methods for the quantification of these compounds in indoor air cannot be fully substituted yet. At least three different analytical techniques are needed to cover the large spectrum of relevant VVOCs in indoor air. In addition, unexpected reaction products might occur and need to be taken into account to avoid misinterpretation of chromatographic signals. Graphical abstract: Solid sorbent sampling of VVOCs (C3-C6) in indoor air with subsequent TDS-GC/MS analysis.

  2. Initial interlaboratory validation of an analytical method for the determination of lead in canned tuna to be used for monitoring and regulatory purposes.

    PubMed

    Santiago, E C; Bello, F B B

    2003-06-01

    The Association of Official Analytical Chemists (AOAC) Standard Method 972.23 (dry ashing and flame atomic absorption spectrophotometry (FAAS)), applied to the analysis of lead in tuna, was validated in three selected local laboratories to determine the acceptability of the method to both the Codex Alimentarius Commission (Codex) and the European Union (EU) Commission for monitoring lead in canned tuna. Initial validation showed that the standard AOAC method as performed in the three participating laboratories cannot satisfy the Codex/EU proposed criteria for the method detection limit for monitoring lead in fish at the present regulation level of 0.5 mg kg⁻¹. Modification of the standard method by chelation/concentration of the digest solution before FAAS analysis showed that the modified method has the potential to meet Codex/EU criteria on sensitivity, accuracy and precision at the specified regulation level.

  3. Separation, identification, quantification, and method validation of anthocyanins in botanical supplement raw materials by HPLC and HPLC-MS.

    PubMed

    Chandra, A; Rana, J; Li, Y

    2001-08-01

    A method has been established and validated for identification and quantification of individual, as well as total, anthocyanins by HPLC and LC/ES-MS in botanical raw materials used in the herbal supplement industry. The anthocyanins were separated and identified on the basis of their respective M⁺ (cation) using LC/ES-MS. Separated anthocyanins were individually calculated against one commercially available anthocyanin external standard (cyanidin-3-glucoside chloride) and expressed as its equivalents. Amounts of each anthocyanin calculated as external standard equivalents were then multiplied by a molecular-weight correction factor to afford their specific quantities. The experimental procedures and the use of molecular-weight correction factors are substantiated and validated using Balaton tart cherry and elderberry as templates. Cyanidin-3-glucoside chloride has been widely used in the botanical industry to calculate total anthocyanins. In our studies on tart cherry and elderberry, its use as external standard followed by use of molecular-weight correction factors should provide relatively accurate results for total anthocyanins, because of the presence of cyanidin as their major anthocyanidin backbone. The method proposed here is simple and has a direct sample preparation procedure without any solid-phase extraction. It enables selection and use of commercially available anthocyanins as external standards for quantification of specific anthocyanins in the sample matrix irrespective of their commercial availability as analytical standards. It can be used as a template and applied for similar quantification in several anthocyanin-containing raw materials for routine quality control procedures, thus providing consistency in analytical testing of botanical raw materials used for manufacturing efficacious and true-to-the-label nutritional supplements.
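
    The quantification scheme described above reduces to a one-point external calibration followed by a molecular-weight correction. A minimal sketch is given below; the peak areas and the analyte molecular weight are hypothetical placeholders.

    ```python
    def anthocyanin_amount(peak_area, std_area, std_conc, mw_analyte, mw_standard):
        """Quantify an individual anthocyanin against a single external standard,
        then apply a molecular-weight correction factor."""
        as_standard_equiv = peak_area / std_area * std_conc    # one-point calibration
        return as_standard_equiv * (mw_analyte / mw_standard)  # MW correction

    # Hypothetical areas; 484.8 g/mol is the MW of cyanidin-3-glucoside chloride
    conc = anthocyanin_amount(peak_area=1.25e5, std_area=9.8e4, std_conc=25.0,
                              mw_analyte=449.4, mw_standard=484.8)
    print(f"{conc:.1f} mg/L of the specific anthocyanin")
    ```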

  4. Raman spectroscopy for the analytical quality control of low-dose break-scored tablets.

    PubMed

    Gómez, Diego A; Coello, Jordi; Maspoch, Santiago

    2016-05-30

    Quality control of solid dosage forms involves the analysis of end products according to well-defined criteria, including the assessment of the uniformity of dosage units (UDU). However, in the case of break-scored tablets, given that tablet splitting is widespread as a means to adjust doses, the uniform distribution of the active pharmaceutical ingredient (API) in all the possible fractions of the tablet must be assessed. A general procedure to address both issues, using Raman spectroscopy, is presented. It is based on the acquisition of a collection of spectra in different regions of the tablet, which can later be selected to determine the amount of API in the potential fractions that can result after splitting. The procedure has been applied to two commercial products, Sintrom 1 and Sintrom 4, with API (acenocoumarol) mass proportions of 2% and 0.7%, respectively. Partial Least Squares (PLS) calibration models were constructed for the quantification of acenocoumarol in whole tablets using HPLC as a reference analytical method. Once validated, the calibration models were used to determine the API content in the different potential fragments of the scored Sintrom 4 tablets. Fragment mass measurements were also performed to estimate the range of masses of the halves and quarters that could result after tablet splitting. The results show that Raman spectroscopy can be an alternative analytical procedure for assessing the uniformity of content, both in whole tablets and in their potential fragments, and that Sintrom 4 tablets can be accurately split into halves, although some caution is required when considering fragmentation into quarters. A practical alternative to the UDU test for the assessment of tablet fragments is proposed. Copyright © 2016 Elsevier B.V. All rights reserved.
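
    As a sketch of the PLS calibration workflow described above, the snippet below fits a PLS model relating spectra to HPLC reference values and reports a cross-validated error; the synthetic data, component count, and use of RMSECV are assumptions for illustration, not the study's actual model.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    # Synthetic stand-ins: rows are preprocessed Raman spectra, y is the
    # HPLC reference API content (mg) of each calibration tablet
    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 500))         # 40 tablets x 500 wavenumber channels
    y = rng.uniform(3.6, 4.4, size=40)     # around a 4 mg label claim

    pls = PLSRegression(n_components=5)
    y_cv = cross_val_predict(pls, X, y, cv=5).ravel()  # cross-validated predictions
    rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
    print(f"RMSECV = {rmsecv:.3f} mg")
    ```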

  5. Systematic investigation of ion suppression and enhancement effects of fourteen stable-isotope-labeled internal standards by their native analogues using atmospheric-pressure chemical ionization and electrospray ionization and the relevance for multi-analyte liquid chromatographic/mass spectrometric procedures.

    PubMed

    Remane, Daniela; Wissenbach, Dirk K; Meyer, Markus R; Maurer, Hans H

    2010-04-15

    In clinical and forensic toxicology, multi-analyte procedures are very useful for quantifying drugs and poisons of different classes in one run. For liquid chromatographic/tandem mass spectrometric (LC/MS/MS) multi-analyte procedures, often only a limited number of stable-isotope-labeled internal standards (SIL-ISs) are available. If an SIL-IS is used for quantification of other analytes, it must be verified that the co-eluting native analyte does not influence its ionization. Therefore, the effect of ion suppression and enhancement of fourteen SIL-ISs caused by their native analogues has been studied. It could be shown that the native analyte concentration influenced the extent of ion suppression and enhancement effects, leading to more suppression with increasing analyte concentration, especially when electrospray ionization (ESI) was used. Using atmospheric-pressure chemical ionization (APCI), methanolic solutions showed mainly enhancement effects, whereas, with one exception, no ion suppression and enhancement effects occurred when plasma extracts were used under these conditions. Such differences were not observed using ESI. With ESI, eleven SIL-ISs showed relevant suppression effects, but only one analyte showed suppression effects when APCI was used. The presented study showed that ion suppression and enhancement tests using matrix-based samples of different sources are essential for the selection of ISs, particularly if used for several analytes, to avoid incorrect quantification. In conclusion, only SIL-ISs should be selected for which no suppression and enhancement effects can be observed. If not enough ISs are free of ionization interferences, a different ionization technique should be considered. Copyright © 2010 John Wiley & Sons, Ltd.

  6. Validation of a Career Information Retrieval Procedure.

    ERIC Educational Resources Information Center

    Kenoyer, Charles E.

    Research examined the relative desirability of categories of information supplied by a keysort search procedure, as judged by users. The Career Information System was used as the framework for obtaining occupational information organized around the concept of the Worker Trait Group (WTG). Four groups of students in each of grades 9 through 12…

  7. Diosgenin: Recent Highlights on Pharmacology and Analytical Methodology.

    PubMed

    Jesus, Mafalda; Martins, Ana P J; Gallardo, Eugenia; Silvestre, Samuel

    2016-01-01

    Diosgenin, a steroidal sapogenin, occurs abundantly in plants such as Dioscorea alata, Smilax china, and Trigonella foenum graecum. This bioactive phytochemical is not only used as an important starting material for the preparation of several steroidal drugs in the pharmaceutical industry, but has also revealed high potential and interest in the treatment of various types of disorders such as cancer, hypercholesterolemia, inflammation, and several types of infections. Due to its pharmacological and industrial importance, several extraction and analytical procedures have been developed and applied over the years to isolate, detect, and quantify diosgenin, not only in its natural sources and pharmaceutical compositions, but also in animal matrices for pharmacodynamic, pharmacokinetic, and toxicological studies. Among these, HPLC coupled to different detectors is the most commonly described analytical procedure for this compound. However, other alternative methods have also been published. Thus, the present review aims to provide collective information on the most recent pharmacological data on diosgenin and on the most relevant analytical techniques used to isolate, detect, and quantify this compound.

  8. Diosgenin: Recent Highlights on Pharmacology and Analytical Methodology

    PubMed Central

    2016-01-01

    Diosgenin, a steroidal sapogenin, occurs abundantly in plants such as Dioscorea alata, Smilax china, and Trigonella foenum graecum. This bioactive phytochemical is not only used as an important starting material for the preparation of several steroidal drugs in the pharmaceutical industry, but has also revealed high potential and interest in the treatment of various types of disorders such as cancer, hypercholesterolemia, inflammation, and several types of infections. Due to its pharmacological and industrial importance, several extraction and analytical procedures have been developed and applied over the years to isolate, detect, and quantify diosgenin, not only in its natural sources and pharmaceutical compositions, but also in animal matrices for pharmacodynamic, pharmacokinetic, and toxicological studies. Among these, HPLC coupled to different detectors is the most commonly described analytical procedure for this compound. However, other alternative methods have also been published. Thus, the present review aims to provide collective information on the most recent pharmacological data on diosgenin and on the most relevant analytical techniques used to isolate, detect, and quantify this compound. PMID:28116217

  9. Automated dynamic analytical model improvement for damped structures

    NASA Technical Reports Server (NTRS)

    Fuh, J. S.; Berman, A.

    1985-01-01

    A method is described to improve a linear, nonproportionally damped analytical model of a structure. The procedure finds the smallest changes in the analytical model such that the improved model matches the measured modal parameters. Features of the method are: (1) the ability to properly treat complex-valued modal parameters of a damped system; (2) applicability to realistically large structural models; and (3) computational efficiency, achieved without eigensolutions or inversion of a large matrix.

  10. Validation of QuEChERS analytical technique for organochlorines and synthetic pyrethroids in fruits and vegetables using GC-ECD.

    PubMed

    Dubey, J K; Patyal, S K; Sharma, Ajay

    2018-03-19

    In the present-day scenario of increasing awareness and concern about pesticides, it is very important to ensure the quality of the data generated in pesticide residue analysis. To impart confidence in the products, quality assurance and quality control are used as an integral part of quality management. To ensure better quality of results in pesticide residue analysis, validation of the analytical methods to be used is extremely important. Keeping in view the importance of method validation, the QuEChERS (quick, easy, cheap, effective, rugged, and safe) multiresidue method for extraction of 13 organochlorines and seven synthetic pyrethroids in fruits and vegetables, followed by GC-ECD for quantification, was validated so that this method could be used for analysis of samples received in the laboratory. The method has been validated as per the guidelines issued by SANCO (from the French Santé for health and Consommateurs for consumers) in accordance with their document SANCO/XXXX/2013. The parameters analyzed, viz. linearity, specificity, repeatability, reproducibility, and ruggedness, were found to have acceptable values with a percent RSD of less than 10%. The limit of quantification (LOQ) was established to be 0.01 mg kg⁻¹ for the organochlorines and 0.05 mg kg⁻¹ for the synthetic pyrethroids. The uncertainty of the measurement (MU) for all these compounds ranged between 1 and 10%. Matrix-matched calibration was used to compensate for the matrix effect on the quantification of the compounds. The overall recovery of the method ranged between 80 and 120%. These results demonstrate the applicability and acceptability of this method for routine estimation of residues of these 20 pesticides in fruits and vegetables by the laboratory.
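
    A minimal sketch of the per-level recovery and repeatability calculations behind such a validation is shown below; the replicate results are hypothetical, and the targets in the printout echo the acceptance criteria quoted above.

    ```python
    import numpy as np

    def recovery_and_rsd(measured, spiked):
        """Recovery (%) and repeatability RSD (%) from spiked replicates."""
        m = np.asarray(measured, dtype=float)
        recovery = 100 * m.mean() / spiked
        rsd = 100 * m.std(ddof=1) / m.mean()
        return recovery, rsd

    # Hypothetical replicate results (mg/kg) for a 0.05 mg/kg spike
    rec, rsd = recovery_and_rsd([0.047, 0.052, 0.049, 0.051, 0.048], spiked=0.05)
    print(f"recovery={rec:.0f}% (target 80-120%), RSD={rsd:.1f}% (target <10%)")
    ```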

  11. Sensitivity of fish density estimates to standard analytical procedures applied to Great Lakes hydroacoustic data

    USGS Publications Warehouse

    Kocovsky, Patrick M.; Rudstam, Lars G.; Yule, Daniel L.; Warner, David M.; Schaner, Ted; Pientka, Bernie; Deller, John W.; Waterfield, Holly A.; Witzel, Larry D.; Sullivan, Patrick J.

    2013-01-01

    Standardized methods of data collection and analysis ensure quality and facilitate comparisons among systems. We evaluated the importance of three recommendations from the Standard Operating Procedure for hydroacoustics in the Laurentian Great Lakes (GLSOP) on density estimates of target species: noise subtraction; setting volume backscattering strength (Sv) thresholds from user-defined minimum target strength (TS) of interest (TS-based Sv threshold); and calculation of an index for multiple targets (Nv index) to identify and remove biased TS values. Eliminating noise had the predictable effect of decreasing density estimates in most lakes. Using the TS-based Sv threshold decreased fish densities in the middle and lower layers in the deepest lakes with abundant invertebrates (e.g., Mysis diluviana). Correcting for biased in situ TS increased measured density by up to 86% in the shallower lakes, which had the highest fish densities. The current recommendations by the GLSOP significantly influence acoustic density estimates, but the degree of importance is lake dependent. Applying GLSOP recommendations, whether in the Laurentian Great Lakes or elsewhere, will improve our ability to compare results among lakes. We recommend further development of standards, including minimum TS and analytical cell size, for reducing the effect of biased in situ TS on density estimates.

  12. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR PERFORMANCE OF COMPUTER SOFTWARE: VERIFICATION AND VALIDATION (IIT-A-2.0)

    EPA Science Inventory

    The purpose of this SOP is to define the procedures for the initial and periodic verification and validation of computer programs. The programs are used during the Arizona NHEXAS project and Border study at the Illinois Institute of Technology (IIT) site. Keywords: computers; s...

  13. Analytical model for screening potential CO2 repositories

    USGS Publications Warehouse

    Okwen, R.T.; Stewart, M.T.; Cunningham, J.A.

    2011-01-01

    Assessing potential repositories for geologic sequestration of carbon dioxide using numerical models can be complicated, costly, and time-consuming, especially when faced with the challenge of selecting a repository from a multitude of potential repositories. This paper presents a set of simple analytical equations (model), based on the work of previous researchers, that could be used to evaluate the suitability of candidate repositories for subsurface sequestration of carbon dioxide. We considered the injection of carbon dioxide at a constant rate into a confined saline aquifer via a fully perforated vertical injection well. The validity of the analytical model was assessed via comparison with the TOUGH2 numerical model. The metrics used in comparing the two models include (1) spatial variations in formation pressure and (2) vertically integrated brine saturation profile. The analytical model and TOUGH2 show excellent agreement in their results when similar input conditions and assumptions are applied in both. The analytical model neglects capillary pressure and the pressure dependence of fluid properties. However, simulations in TOUGH2 indicate that little error is introduced by these simplifications. Sensitivity studies indicate that the agreement between the analytical model and TOUGH2 depends strongly on (1) the residual brine saturation, (2) the difference in density between carbon dioxide and resident brine (buoyancy), and (3) the relationship between relative permeability and brine saturation. The results achieved suggest that the analytical model is valid when the relationship between relative permeability and brine saturation is linear or quasi-linear and when the irreducible saturation of brine is zero or very small. © 2011 Springer Science+Business Media B.V.
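
    As a rough illustration of this class of screening model (not the authors' actual equations), the sketch below evaluates a line-source, Theis-type overpressure solution for constant-rate single-phase injection into a confined aquifer; all parameter values are assumptions.

    ```python
    import numpy as np
    from scipy.special import exp1

    def injection_overpressure(r, t, q, mu, k, b, phi, ct):
        """Line-source overpressure (Pa) for constant-rate injection into a
        confined aquifer of thickness b (single-phase simplification)."""
        u = phi * mu * ct * r**2 / (4 * k * t)
        return q * mu / (4 * np.pi * k * b) * exp1(u)

    # Illustrative values: 0.01 m^3/s injected into a 50 m thick aquifer
    dp = injection_overpressure(r=100.0, t=86400 * 365, q=0.01, mu=5e-4,
                                k=1e-13, b=50.0, phi=0.2, ct=1e-9)
    print(f"overpressure at 100 m after 1 year: {dp / 1e5:.1f} bar")
    ```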

  14. A New Project-Based Lab for Undergraduate Environmental and Analytical Chemistry

    ERIC Educational Resources Information Center

    Adami, Gianpiero

    2006-01-01

    A new project-based lab was developed for third year undergraduate chemistry students based on real world applications. The experience suggests that the total analytical procedure (TAP) project offers a stimulating alternative for delivering science skills and developing a greater interest for analytical chemistry and environmental sciences and…

  15. Analytical solutions for tomato peeling with combined heat flux and convective boundary conditions

    NASA Astrophysics Data System (ADS)

    Cuccurullo, G.; Giordano, L.; Metallo, A.

    2017-11-01

    Peeling of tomatoes by radiative heating is a valid alternative to steam or lye, which are expensive and polluting methods. Suitable energy densities are required in order to realize short-time operations, thus involving only a thin layer under the tomato surface. This paper aims to predict the temperature field in rotating tomatoes exposed to the source irradiation. Therefore, a 1D unsteady analytical model is presented, which involves a semi-infinite slab subjected to time-dependent heating while convective heat transfer takes place on the exposed surface. In order to account for the tomato rotation, the heat source is described as the positive half-wave of a sinusoidal function. The problem being linear, the solution is derived following the Laplace Transform Method. In addition, an easy-to-handle solution for the problem at hand is presented, which assumes a differentiable function for approximating the source while neglecting convective cooling, the latter contribution turning out to be negligible in this context. Satisfactory agreement between the two analytical solutions is found; therefore, an easy procedure for a proper design of the dry heating system can be set up, avoiding the use of numerical simulations.

  16. Proposed Core Competencies and Empirical Validation Procedure in Competency Modeling: Confirmation and Classification.

    PubMed

    Baczyńska, Anna K; Rowiński, Tomasz; Cybis, Natalia

    2016-01-01

    Competency models provide insight into key skills which are common to many positions in an organization. Moreover, there is a range of competencies that is used by many companies. Researchers have developed core competency terminology to underline their cross-organizational value. The article presents a theoretical model of core competencies consisting of two main higher-order competencies called performance and entrepreneurship. Each of them consists of three elements: the performance competency includes cooperation, organization of work and goal orientation, while entrepreneurship includes innovativeness, calculated risk-taking and pro-activeness. However, there is a lack of empirical validation of competency concepts in organizations, and this would seem crucial for obtaining reliable results from organizational research. We propose a two-step empirical validation procedure: (1) confirmatory factor analysis, and (2) classification of employees. The sample consisted of 636 respondents (M = 44.5; SD = 15.1). Participants were administered a questionnaire developed for the study purpose. The reliability, measured by Cronbach's alpha, ranged from 0.60 to 0.83 for the six scales. Next, we tested the model using a confirmatory factor analysis. The two separate, single models of performance and entrepreneurial orientations fit the data quite well, while a complex model based on the two single concepts needs further research. In the classification of employees based on the two higher-order competencies we obtained four main groups of employees. Their profiles relate to those found in the literature, including so-called niche finders and top performers. Some proposals for organizations are discussed.
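
    As an illustration of the scale-reliability step reported above, the sketch below computes Cronbach's alpha from a respondent-by-item score matrix; the data are simulated, not the study's.

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha from an (n_respondents x n_items) score matrix."""
        X = np.asarray(items, dtype=float)
        k = X.shape[1]
        item_vars = X.var(axis=0, ddof=1).sum()
        total_var = X.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars / total_var)

    rng = np.random.default_rng(42)
    latent = rng.normal(size=(200, 1))                      # shared trait
    scores = latent + rng.normal(scale=1.0, size=(200, 6))  # six noisy items
    print(f"alpha = {cronbach_alpha(scores):.2f}")
    ```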

  17. Proposed Core Competencies and Empirical Validation Procedure in Competency Modeling: Confirmation and Classification

    PubMed Central

    Baczyńska, Anna K.; Rowiński, Tomasz; Cybis, Natalia

    2016-01-01

    Competency models provide insight into key skills which are common to many positions in an organization. Moreover, there is a range of competencies that is used by many companies. Researchers have developed core competency terminology to underline their cross-organizational value. The article presents a theoretical model of core competencies consisting of two main higher-order competencies called performance and entrepreneurship. Each of them consists of three elements: the performance competency includes cooperation, organization of work and goal orientation, while entrepreneurship includes innovativeness, calculated risk-taking and pro-activeness. However, there is a lack of empirical validation of competency concepts in organizations, and this would seem crucial for obtaining reliable results from organizational research. We propose a two-step empirical validation procedure: (1) confirmatory factor analysis, and (2) classification of employees. The sample consisted of 636 respondents (M = 44.5; SD = 15.1). Participants were administered a questionnaire developed for the study purpose. The reliability, measured by Cronbach's alpha, ranged from 0.60 to 0.83 for the six scales. Next, we tested the model using a confirmatory factor analysis. The two separate, single models of performance and entrepreneurial orientations fit the data quite well, while a complex model based on the two single concepts needs further research. In the classification of employees based on the two higher-order competencies we obtained four main groups of employees. Their profiles relate to those found in the literature, including so-called niche finders and top performers. Some proposals for organizations are discussed. PMID:27014111

  18. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR PERFORMANCE OF COMPUTER SOFTWARE: VERIFICATION AND VALIDATION (UA-D-2.0)

    EPA Science Inventory

    The purpose of this SOP is to define the procedures used for the initial and periodic verification and validation of computer programs used during the Arizona NHEXAS project and the Border study. Keywords: Computers; Software; QA/QC.

    The U.S.-Mexico Border Program is sponsored ...

  19. Analyticity and Features of Semantic Interaction.

    ERIC Educational Resources Information Center

    Steinberg, Danny D.

    The findings reported in this paper are the result of an experiment to determine the empirical validity of such semantic concepts as analytic, synthetic, and contradictory. Twenty-eight university students were presented with 156 sentences to assign to one of four semantic categories: (1) synthetic ("The dog is a poodle"), (2) analytic…

  20. Bio-Oil Analysis Laboratory Procedures | Bioenergy | NREL

    Science.gov Websites

    NREL develops standard laboratory analytical procedures for bio-oil analysis. These procedures have been validated and allow for reliable bio-oil analysis, including determination of the different hydroxyl groups (-OH) in pyrolysis bio-oil: aliphatic-OH, phenolic-OH, and carboxylic-OH.

  1. Development and validation of a simple high-performance liquid chromatography analytical method for simultaneous determination of phytosterols, cholesterol and squalene in parenteral lipid emulsions.

    PubMed

    Novak, Ana; Gutiérrez-Zamora, Mercè; Domenech, Lluís; Suñé-Negre, Josep M; Miñarro, Montserrat; García-Montoya, Encarna; Llop, Josep M; Ticó, Josep R; Pérez-Lozano, Pilar

    2018-02-01

    A simple analytical method for simultaneous determination of phytosterols, cholesterol and squalene in lipid emulsions was developed owing to increased interest in their clinical effects. Method development was based on commonly used stationary phases (C18, C8 and phenyl) and mobile phases (mixtures of acetonitrile, methanol and water) under isocratic conditions. Differences in stationary phases resulted in peak overlapping or coelution of different peaks. The best separation of all analyzed compounds was achieved on Zorbax Eclipse XDB C8 (150 × 4.6 mm, 5 μm; Agilent) with ACN-H2O-MeOH, 80:19.5:0.5 (v/v/v). In order to achieve a shorter analysis time, the method was further optimized and a gradient separation was established. The optimized analytical method was validated and tested for routine use in lipid emulsion analyses. Copyright © 2017 John Wiley & Sons, Ltd.

  2. Validation of Mission Plans Through Simulation

    NASA Astrophysics Data System (ADS)

    St-Pierre, J.; Melanson, P.; Brunet, C.; Crabtree, D.

    2002-01-01

    The purpose of a spacecraft mission planning system is to automatically generate safe and optimized mission plans for a single spacecraft, or for several functioning in unison. The system verifies user input syntax, conformance to commanding constraints, absence of duty cycle violations, timing conflicts, state conflicts, etc. Present-day constraint-based systems with state-based predictive models use verification rules derived from expert knowledge. A familiar solution found in Mission Operations Centers is to complement the planning system with a high-fidelity spacecraft simulator. Often a dedicated workstation, the simulator is frequently used for operator training and procedure validation, and may be interfaced to actual control stations with command and telemetry links. While there are distinct advantages to having a planning system offer realistic operator training using the actual flight control console, physical verification of data transfer across layers, and procedure validation, experience has revealed some drawbacks and inefficiencies in ground segment operations. With these considerations, two simulation-based mission plan validation projects are under way at the Canadian Space Agency (CSA): RVMP and ViSION. The tools proposed in these projects will automatically run scenarios and provide execution reports to operations planning personnel, prior to actual command upload. This can provide an important safeguard against system or human errors that can only be detected with high-fidelity, interdependent spacecraft models running concurrently. The core element common to these projects is a spacecraft simulator, built with off-the-shelf components such as CAE's Real-Time Object-Based Simulation Environment (ROSE) technology, MathWork's MATLAB/Simulink, and Analytical Graphics' Satellite Tool Kit (STK). To complement these tools, additional components were developed, such as an emulated Spacecraft Test and Operations Language (STOL) interpreter and CCSDS TM

  3. Procedural validity of the AUDADIS-5 depression, anxiety and post-traumatic stress disorder modules: substance abusers and others in the general population*

    PubMed Central

    Hasin, Deborah S.; Shmulewitz, Dvora; Stohl, Malka; Greenstein, Eliana; Aivadyan, Christina; Morita, Kara; Saha, Tulshi; Aharonovich, Efrat; Jung, Jeesun; Zhang, Haitao; Nunes, Edward V.; Grant, Bridget F.

    2016-01-01

    Background Little is known about the procedural validity of lay-administered, fully-structured assessments of depressive, anxiety and post-traumatic stress (PTSD) disorders in the general population as determined by comparison to clinical re-appraisal, and whether this differs between current regular substance abusers and others. We evaluated the procedural validity of the Alcohol Use Disorder and Associated Disabilities Interview Schedule, DSM-5 Version (AUDADIS-5) assessment of these disorders through clinician re-interviews. Methods Test-retest design among respondents from the National Epidemiologic Survey on Alcohol and Related Conditions-III (NESARC-III): (264 current regular substance abusers, 447 others). Clinicians blinded to AUDADIS-5 results administered the semi-structured Psychiatric Research Interview for Substance and Mental Disorders, DSM-5 version (PRISM-5). AUDADIS-5/PRISM-5 concordance was indicated by kappa (κ) for diagnoses and intraclass correlation coefficients (ICC) for dimensional measures (DSM-5 symptom or criterion counts). Results were compared between current regular substance abusers and others. Results AUDADIS-5 and PRISM-5 concordance for DSM-5 depressive disorders, anxiety disorders and PTSD was generally fair to moderate (κ =0.24–0.59), with concordance on dimensional scales much better (ICC=0.53–0.81). Concordance differed little between regular substance abusers and others. Conclusions AUDADIS-5/PRISM-5 concordance indicated procedural validity for the AUDADIS-5 among substance abusers and others, suggesting that AUDADIS-5 diagnoses of DSM-5 depressive, anxiety and PTSD diagnoses are informative measures in both groups in epidemiologic studies. The stronger concordance on dimensional measures supports the current movement towards dimensional psychopathology measures, suggesting that such measures provide important information for research in the NESARC-III and other datasets, and possibly for clinical purposes as well. PMID
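
    A minimal sketch of the two concordance statistics used above is given below: Cohen's kappa for paired diagnoses and a one-way, single-rater ICC for paired criterion counts. The paired values are hypothetical, and the one-way ICC variant is an assumption (the abstract does not specify the formula).

    ```python
    import numpy as np
    from sklearn.metrics import cohen_kappa_score

    # Hypothetical paired results: lay interview vs. clinician reinterview
    dx_lay = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
    dx_clin = [1, 0, 0, 1, 0, 1, 1, 0, 1, 0]
    print(f"kappa = {cohen_kappa_score(dx_lay, dx_clin):.2f}")

    def icc_oneway(x, y):
        """One-way random, single-rater ICC(1,1) for two raters."""
        data = np.column_stack([x, y]).astype(float)
        n, k = data.shape
        grand = data.mean()
        msb = k * ((data.mean(axis=1) - grand) ** 2).sum() / (n - 1)
        msw = ((data - data.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
        return (msb - msw) / (msb + (k - 1) * msw)

    counts_lay = [3, 1, 5, 4, 0, 2, 6, 1, 4, 2]
    counts_clin = [4, 1, 4, 5, 1, 3, 6, 0, 4, 2]
    print(f"ICC = {icc_oneway(counts_lay, counts_clin):.2f}")
    ```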

  4. Interlaboratory study for the validation of an ecotoxicological procedure to monitor the quality of septic sludge received at a wastewater treatment plant.

    PubMed

    Robidoux, P Y; Choucri, A; Bastien, C; Sunahara, G I; López-Gastey, J

    2001-01-01

    Septic tank sludge is regularly hauled to the Montreal Urban Community (MUC) wastewater treatment plant. It is then discharged and mixed with the wastewater inflow before entering the primary chemical treatment process. An ecotoxicological procedure integrating chemical and toxicological analyses has recently been developed and applied to screen for the illicit discharge of toxic substances in septic sludge. The toxicity tests used were the Microtox, the bacterial-respiration, and the lettuce (Lactuca sativa) root elongation tests. In order to validate the applicability of the proposed procedure, a two-year interlaboratory study was carried out. In general, the results obtained by two independent laboratories (MUC and the Centre d'expertise en analyse environnementale du Quebec) were comparable and reproducible. Some differences were found using the Microtox test. Organic (e.g., phenol and formaldehyde) and inorganic (e.g., nickel and cyanide) spiked septic sludges were detected with good reliability and high efficiency. The relative efficiency in detecting spiked substances was > 70%, confirming the results of previous studies. In addition, the respiration test was the most efficient toxicological tool for detecting spiked substances, whereas the Microtox was the least efficient (< 15%). Efficiencies in detecting spiked contaminants were also similar for both laboratories. These results support the data presented earlier and contribute to the validation of the ecotoxicological procedure used by the MUC to screen toxicity in septic sludge.

  5. ANALYTICAL PROCEDURES FOR CHARACTERIZING UNREGULATED EMISSIONS FROM VEHICLES USING MIDDLE-DISTILLATE FUELS

    EPA Science Inventory

    This research program was initiated with the objective of developing, codifying and testing a group of chemical analytical methods for measuring toxic compounds in the exhaust of distillate-fueled engines (i.e. diesel, gas turbine, Stirling, or Rankine cycle powerplants). It is a ...

  6. The SCALE Verified, Archived Library of Inputs and Data - VALID

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marshall, William BJ J; Rearden, Bradley T

    The Verified, Archived Library of Inputs and Data (VALID) at ORNL contains high quality, independently reviewed models and results that improve confidence in analysis. VALID is developed and maintained according to a procedure of the SCALE quality assurance (QA) plan. This paper reviews the origins of the procedure and its intended purpose, the philosophy of the procedure, some highlights of its implementation, and the future of the procedure and associated VALID library. The original focus of the procedure was the generation of high-quality models that could be archived at ORNL and applied to many studies. The review process associated with model generation minimized the chances of errors in these archived models. Subsequently, the scope of the library and procedure was expanded to provide high quality, reviewed sensitivity data files for deployment through the International Handbook of Evaluated Criticality Safety Benchmark Experiments (IHECSBE). Sensitivity data files for approximately 400 such models are currently available. The VALID procedure and library continue fulfilling these multiple roles. The VALID procedure is based on the quality assurance principles of ISO 9001 and nuclear safety analysis. Some of these key concepts include: independent generation and review of information, generation and review by qualified individuals, use of appropriate references for design data and documentation, and retrievability of the models, results, and documentation associated with entries in the library. Some highlights of the detailed procedure are discussed to provide background on its implementation and to indicate limitations of data extracted from VALID for use by the broader community. Specifically, external users of data generated within VALID must take responsibility for ensuring that the files are used within the QA framework of their organization and that use is appropriate. The future plans for the VALID library include expansion to include additional

  7. Quantitative detection of caffeine in human skin by confocal Raman spectroscopy--A systematic in vitro validation study.

    PubMed

    Franzen, Lutz; Anderski, Juliane; Windbergs, Maike

    2015-09-01

    For rational development and evaluation of dermal drug delivery, knowledge of the rate and extent of substance penetration into human skin is essential. However, current analytical procedures are destructive, labor-intensive and lack a defined spatial resolution. In this context, confocal Raman microscopy bears the potential to overcome current limitations in drug depth profiling. Confocal Raman microscopy has already proved its suitability for the acquisition of qualitative penetration profiles, but a comprehensive investigation regarding its suitability for quantitative measurements inside human skin is still missing. In this work, we present a systematic validation study to deploy confocal Raman microscopy for quantitative drug depth profiling in human skin. After validating our Raman microscopic setup, we established an experimental procedure that allows correlating the Raman signal of a model drug with its controlled concentration in human skin. To overcome current drawbacks in drug depth profiling, we evaluated different modes of peak correlation for quantitative Raman measurements and offer a suitable operating procedure for quantitative drug depth profiling in human skin. In conclusion, we demonstrate the potential of confocal Raman microscopy for quantitative drug depth profiling in human skin as a valuable alternative to destructive state-of-the-art techniques. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. How to conduct External Quality Assessment Schemes for the pre-analytical phase?

    PubMed

    Kristensen, Gunn B B; Aakre, Kristin Moberg; Kristoffersen, Ann Helen; Sandberg, Sverre

    2014-01-01

    In laboratory medicine, several studies have described the most frequent errors in the different phases of the total testing process, and a large proportion of these errors occur in the pre-analytical phase. For decades, External Quality Assessment (EQA) organizations operating in most countries have run schemes for the analytical phase that register errors and give subsequent feedback to the participants. The aim of this paper is to present an overview of different types of EQA schemes for the pre-analytical phase and to give examples of some existing schemes. So far, very few EQA organizations have focused on the pre-analytical phase, and most EQA organizations do not offer pre-analytical EQA schemes (EQAS). Pre-analytical EQAS are more difficult to perform and standardize, and accreditation bodies do not ask laboratories for results from such schemes. However, some ongoing EQA programs for the pre-analytical phase do exist, and some examples are given in this paper. The methods used can be divided into three types: collecting information about pre-analytical laboratory procedures; circulating real samples to collect information about interferences that might affect the measurement procedure; or registering actual laboratory errors and relating these to quality indicators. These three types have different focuses and different implementation challenges, and a combination of the three is probably necessary to detect and monitor the wide range of errors occurring in the pre-analytical phase.

  9. FDA Bacteriological Analytical Manual, Chapter 10, 2003: Listeria monocytogenes

    EPA Pesticide Factsheets

    FDA Bacteriological Analytical Manual, Chapter 10 describes procedures for analysis of food samples and may be adapted for assessment of solid, particulate, aerosol, liquid and water samples containing Listeria monocytogenes.

  10. Niosh analytical methods for Set G

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1976-12-01

    Industrial Hygiene sampling and analytical monitoring methods validated under the joint NIOSH/OSHA Standards Completion Program for Set G are contained herein. Monitoring methods for the following compounds are included: butadiene, heptane, ketene, methyl cyclohexane, octachloronaphthalene, pentachloronaphthalene, petroleum distillates, propylene dichloride, turpentine, dioxane, hexane, LPG, naphtha (coal tar), octane, pentane, propane, and Stoddard solvent.

  11. The contribution of Raman spectroscopy to the analytical quality control of cytotoxic drugs in a hospital environment: eliminating the exposure risks for staff members and their work environment.

    PubMed

    Bourget, Philippe; Amin, Alexandre; Vidal, Fabrice; Merlette, Christophe; Troude, Pénélope; Baillet-Guffroy, Arlette

    2014-08-15

    The purpose of the study was to perform a comparative analysis of the technical performance, respective costs and environmental effect of two invasive analytical methods (HPLC and UV/visible-FTIR) as compared to a new non-invasive analytical technique (Raman spectroscopy). Three pharmacotherapeutic models were used to compare the analytical performances of the three techniques. Statistical inter-method correlation analysis was performed using non-parametric rank correlation tests. The study's economic component combined calculations relative to the depreciation of the equipment and the estimated cost of an AQC unit of work. In all cases, the analytical validation parameters of the three techniques were satisfactory, and strong correlations between the two spectroscopic techniques and HPLC were found. In addition, Raman spectroscopy was found to be superior to the other techniques on numerous key criteria, including complete safety for operators and their occupational environment, a non-invasive procedure, no need for consumables, and a low operating cost. Finally, Raman spectroscopy appears superior for technical, economic and environmental objectives, as compared with the other, invasive analytical methods. Copyright © 2014 Elsevier B.V. All rights reserved.

  12. Validity of temporomandibular disorder examination procedures for assessment of temporomandibular joint status.

    PubMed

    Schmitter, Marc; Kress, Bodo; Leckel, Michael; Henschel, Volkmar; Ohlmann, Brigitte; Rammelsberg, Peter

    2008-06-01

    This hypothesis-generating study was performed to determine which items in the Research Diagnostic Criteria for Temporomandibular Disorders (RDC/TMD) and additional diagnostic tests have the best predictive accuracy for joint-related diagnoses. One hundred forty-nine TMD patients and 43 symptom-free subjects were examined in clinical examinations and with magnetic resonance imaging (MRI). The importance of each variable of the clinical examination for correct joint-related diagnosis was assessed by using MRI diagnoses. For this purpose, "random forest" statistical software (based on classification trees) was used. Maximum unassisted jaw opening, maximum assisted jaw opening, history of locked jaw, joint sound with and without compression, joint pain, facial pain, pain on palpation of the lateral pterygoid area, and overjet proved suitable for distinguishing between subtypes of joint-related TMD. Measurement of excursion, protrusion, and midline deviation were less important. The validity of clinical TMD examination procedures can be enhanced by using the 16 variables of greatest importance identified in this study. In addition to other variables, maximum unassisted and assisted opening and a history of locked jaw were important when assessing the status of the TMJ.
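
    As an illustration of the variable-ranking approach described above, the sketch below fits a random forest to simulated examination items against a reference label and ranks the items by impurity-based importance; the data, labels, and item count are assumptions.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # Simulated stand-ins: rows are patients, columns are examination items,
    # y is an MRI-derived joint diagnosis used as the reference label
    rng = np.random.default_rng(0)
    X = rng.normal(size=(192, 16))
    y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(size=192)) > 0

    forest = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)
    ranking = np.argsort(forest.feature_importances_)[::-1]
    print("most important items:", ranking[:4])
    ```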

  13. Determining passive cooling limits in CPV using an analytical thermal model

    NASA Astrophysics Data System (ADS)

    Gualdi, Federico; Arenas, Osvaldo; Vossier, Alexis; Dollet, Alain; Aimez, Vincent; Arès, Richard

    2013-09-01

    We propose an original thermal analytical model aiming to predict the practical limits of passive cooling systems for high concentration photovoltaic modules. The analytical model is described and validated by comparison with a commercial 3D finite element model. The limiting performances of flat plate cooling systems in natural convection are then derived and discussed.
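
    For a feel of what such an analytical thermal model estimates, the sketch below evaluates a lumped series-resistance expression for cell temperature under natural convection; the resistance split and all numerical values are illustrative assumptions, not the authors' model.

    ```python
    def cell_temperature(t_ambient, q_waste, r_spread, h_conv, area):
        """Lumped model: ambient temperature plus waste heat times the series
        conduction/spreading and natural-convection resistances."""
        r_total = r_spread + 1.0 / (h_conv * area)  # K/W
        return t_ambient + q_waste * r_total

    # Illustrative: ~20 W of waste heat, h ~ 5 W/m^2.K for natural convection
    t_cell = cell_temperature(t_ambient=25.0, q_waste=20.0,
                              r_spread=0.5, h_conv=5.0, area=0.04)
    print(f"estimated cell temperature: {t_cell:.0f} C")
    ```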

  14. Analytical and Clinical Validation of a Digital Sequencing Panel for Quantitative, Highly Accurate Evaluation of Cell-Free Circulating Tumor DNA

    PubMed Central

    Zill, Oliver A.; Sebisanovic, Dragan; Lopez, Rene; Blau, Sibel; Collisson, Eric A.; Divers, Stephen G.; Hoon, Dave S. B.; Kopetz, E. Scott; Lee, Jeeyun; Nikolinakos, Petros G.; Baca, Arthur M.; Kermani, Bahram G.; Eltoukhy, Helmy; Talasaz, AmirAli

    2015-01-01

    Next-generation sequencing of cell-free circulating solid tumor DNA addresses two challenges in contemporary cancer care. First, this method of massively parallel and deep sequencing enables assessment of a comprehensive panel of genomic targets from a single sample, and second, it obviates the need for repeat invasive tissue biopsies. Digital Sequencing™ is a novel method for high-quality sequencing of circulating tumor DNA simultaneously across a comprehensive panel of over 50 cancer-related genes with a simple blood test. Here we report the analytic and clinical validation of the gene panel. Analytic sensitivity down to 0.1% mutant allele fraction is demonstrated via serial dilution studies of known samples. Near-perfect analytic specificity (> 99.9999%) enables complete coverage of many genes without the false positives typically seen with traditional sequencing assays at mutant allele fractions below 5%. We compared digital sequencing of plasma-derived cell-free DNA to tissue-based sequencing on 165 consecutive matched samples from five outside centers in patients with stage III-IV solid tumor cancers. Clinical sensitivity of plasma-derived NGS was 85.0%, comparable to 80.7% sensitivity for tissue. The assay success rate on 1,000 consecutive samples in clinical practice was 99.8%. Digital sequencing of plasma-derived DNA is indicated in advanced cancer patients to prevent repeated invasive biopsies when the initial biopsy is inadequate, unobtainable for genomic testing, or uninformative, or when the patient's cancer has progressed despite treatment. Its clinical utility is derived from reduction in the costs, complications and delays associated with invasive tissue biopsies for genomic testing. PMID:26474073

  15. Chemometric and biological validation of a capillary electrophoresis metabolomic experiment of Schistosoma mansoni infection in mice.

    PubMed

    Garcia-Perez, Isabel; Angulo, Santiago; Utzinger, Jürg; Holmes, Elaine; Legido-Quigley, Cristina; Barbas, Coral

    2010-07-01

    Metabonomic and metabolomic studies are increasingly utilized for biomarker identification in different fields, including the biology of infection. The confluence of improved analytical platforms and the availability of powerful multivariate analysis software has rendered the multiparameter profiles generated by these omics platforms a user-friendly alternative to established analysis methods, where the quality and practice of a procedure are well defined. However, unlike traditional assays, validation methods for these new multivariate profiling tools have yet to be established. We propose a validation for models obtained by CE fingerprinting of urine from mice infected with the blood fluke Schistosoma mansoni. We analysed urine samples from two sets of mice infected in an inter-laboratory experiment in which different infection methods and animal husbandry procedures were employed, in order to establish the core biological response to a S. mansoni infection. CE data were analysed using principal component analysis. Validation of the scores consisted of permutation scrambling (100 repetitions) and a manual validation method, using a third of the samples (not included in the model) as a test or prediction set. The validation yielded 100% specificity and 100% sensitivity, demonstrating the robustness of these models with respect to deciphering metabolic perturbations in the mouse due to a S. mansoni infection. A total of 20 metabolites across the two experiments were identified that significantly discriminated between S. mansoni-infected and noninfected control samples. Only one of these metabolites, allantoin, was identified as behaving differently in the two experiments. This study shows the reproducibility of CE-based metabolic profiling methods for disease characterization and screening and highlights the importance of much-needed validation strategies in the emerging field of metabolomics.
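
    A minimal sketch of the validation strategy described above (dimension reduction followed by permutation scrambling of the class labels) is given below, using synthetic fingerprints; the PCA/LDA pipeline, sample sizes, and permutation count are assumptions.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import permutation_test_score

    # Synthetic CE fingerprints: 30 infected vs. 30 control urine profiles
    rng = np.random.default_rng(7)
    X = rng.normal(size=(60, 120))
    X[:30, :5] += 1.0                     # infection-related metabolite shift
    y = np.array([1] * 30 + [0] * 30)

    scores = PCA(n_components=5).fit_transform(X)
    acc, perm_scores, pvalue = permutation_test_score(
        LinearDiscriminantAnalysis(), scores, y,
        n_permutations=100, cv=5, random_state=7)
    print(f"accuracy={acc:.2f}, permutation p-value={pvalue:.3f}")
    ```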

  16. A Meta-Analytic Investigation of Fiedler's Contingency Model of Leadership Effectiveness.

    ERIC Educational Resources Information Center

    Strube, Michael J.; Garcia, Joseph E.

    According to Fiedler's Contingency Model of Leadership Effectiveness, group performance is a function of the leader-situation interaction. A review of past validations has found several problems associated with the model. Meta-analytic techniques were applied to the Contingency Model in order to assess the validation evidence quantitatively. The…

  17. An interlaboratory transfer of a multi-analyte assay between continents.

    PubMed

    Georgiou, Alexandra; Dong, Kelly; Hughes, Stephen; Barfield, Matthew

    2015-01-01

    Alex has worked at GlaxoSmithKline for the past 15 years and currently works within the bioanalytical and toxicokinetic group in the United Kingdom. Alex's role in previous years has been the in-house support of preclinical and clinical bioanalysis, from method development through to sample analysis activities, as well as acting as PI for GLP bioanalysis and toxicokinetics. For the past two years, Alex has applied this analytical and regulatory experience to focus on the outsourcing of preclinical bioanalysis, toxicokinetics and clinical bioanalysis, working closely with multiple bioanalytical and in-life CRO partners worldwide. Alex works to support DMPK and Safety Assessment outsourcing activities for GSK across multiple therapeutic areas, from the first GLP study through to late-stage clinical PK studies. Transfer and cross-validation of an existing analytical assay between a laboratory providing current analytical support and a laboratory needed for new or additional support can present the bioanalyst with numerous challenges. These challenges can be technical or logistical in nature and may prove significant when transferring an assay between laboratories on different continents. Part of GlaxoSmithKline's strategy to improve confidence in providing quality data is to cross-validate between laboratories. If the cross-validation fails predefined acceptance criteria, a subsequent investigation follows, which may itself prove challenging. The importance of thorough planning and good communication throughout assay transfer, cross-validation and any subsequent investigations is illustrated in this case study.

  18. Using Analytic Hierarchy Process in Textbook Evaluation

    ERIC Educational Resources Information Center

    Kato, Shigeo

    2014-01-01

    This study demonstrates the application of the analytic hierarchy process (AHP) in English language teaching materials evaluation, focusing in particular on its potential for systematically integrating different components of evaluation criteria in a variety of teaching contexts. AHP is a measurement procedure wherein pairwise comparisons are made…

  19. Validity Issues in Clinical Assessment.

    ERIC Educational Resources Information Center

    Foster, Sharon L.; Cone, John D.

    1995-01-01

    Validation issues that arise with measures of constructs and behavior are addressed with reference to general reasons for using assessment procedures in clinical psychology. A distinction is made between the representational phase of validity assessment and the elaborative validity phase in which the meaning and utility of scores are examined.…

  20. Recent developments in computer vision-based analytical chemistry: A tutorial review.

    PubMed

    Capitán-Vallvey, Luis Fermín; López-Ruiz, Nuria; Martínez-Olmos, Antonio; Erenas, Miguel M; Palma, Alberto J

    2015-10-29

    Chemical analysis based on colour changes recorded with imaging devices is gaining increasing interest. This is due to several significant advantages, such as simplicity of use and the fact that it is easily combined with portable, widely distributed imaging devices, resulting in user-friendly analytical procedures in many areas that demand out-of-lab applications for in situ and real-time monitoring. This tutorial review covers computer vision-based analytical chemistry (CVAC) procedures and systems from 2005 to 2015, the period in which 87.5% of the papers on this topic were published. The background regarding colour spaces and recent analytical system architectures of interest in analytical chemistry is presented in the form of a tutorial. Moreover, issues regarding images, such as the influence of illuminants, and the most relevant techniques for processing and analysing digital images are addressed. Some of the most relevant applications are then detailed, highlighting their main characteristics. Finally, our opinion about future perspectives is discussed. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. Surrogate matrix and surrogate analyte approaches for definitive quantitation of endogenous biomolecules.

    PubMed

    Jones, Barry R; Schultz, Gary A; Eckstein, James A; Ackermann, Bradley L

    2012-10-01

    Quantitation of biomarkers by LC-MS/MS is complicated by the presence of endogenous analytes. This challenge is most commonly overcome by calibration using an authentic standard spiked into a surrogate matrix devoid of the target analyte. A second approach involves use of a stable-isotope-labeled standard as a surrogate analyte to allow calibration in the actual biological matrix. For both methods, parallelism between calibration standards and the target analyte in biological matrix must be demonstrated in order to ensure accurate quantitation. In this communication, the surrogate matrix and surrogate analyte approaches are compared for the analysis of five amino acids in human plasma: alanine, valine, methionine, leucine and isoleucine. In addition, methodology based on standard addition is introduced, which enables a robust examination of parallelism in both surrogate analyte and surrogate matrix methods prior to formal validation. Results from additional assays are presented to introduce the standard-addition methodology and to highlight the strengths and weaknesses of each approach. For the analysis of amino acids in human plasma, comparable precision and accuracy were obtained by the surrogate matrix and surrogate analyte methods. Both assays were well within tolerances prescribed by regulatory guidance for validation of xenobiotic assays. When stable-isotope-labeled standards are readily available, the surrogate analyte approach allows for facile method development. By comparison, the surrogate matrix method requires greater up-front method development; however, this deficit is offset by the long-term advantage of simplified sample analysis.
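
    Standard addition, introduced above as a parallelism check, backs out the endogenous concentration from the x-intercept of a spiked-response line. A generic numerical sketch, not the authors' exact protocol; the spike levels and responses are invented:

      import numpy as np

      def endogenous_conc(added, response):
          # Fit response = m*added + b; the endogenous level is b/m,
          # i.e. the magnitude of the x-intercept of the addition line.
          m, b = np.polyfit(added, response, 1)
          return b / m

      spikes = np.array([0.0, 5.0, 10.0, 20.0])   # added analyte, ug/mL
      signal = np.array([4.1, 6.0, 8.2, 12.1])    # instrument response
      print(f"endogenous ~ {endogenous_conc(spikes, signal):.1f} ug/mL")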

  2. 21 CFR 820.75 - Process validation.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Process validation. 820.75 Section 820.75 Food and... QUALITY SYSTEM REGULATION Production and Process Controls § 820.75 Process validation. (a) Where the... validated with a high degree of assurance and approved according to established procedures. The validation...

  3. 21 CFR 820.75 - Process validation.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Process validation. 820.75 Section 820.75 Food and... QUALITY SYSTEM REGULATION Production and Process Controls § 820.75 Process validation. (a) Where the... validated with a high degree of assurance and approved according to established procedures. The validation...

  4. 21 CFR 820.75 - Process validation.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Process validation. 820.75 Section 820.75 Food and... QUALITY SYSTEM REGULATION Production and Process Controls § 820.75 Process validation. (a) Where the... validated with a high degree of assurance and approved according to established procedures. The validation...

  5. 21 CFR 820.75 - Process validation.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Process validation. 820.75 Section 820.75 Food and... QUALITY SYSTEM REGULATION Production and Process Controls § 820.75 Process validation. (a) Where the... validated with a high degree of assurance and approved according to established procedures. The validation...

  6. 21 CFR 820.75 - Process validation.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Process validation. 820.75 Section 820.75 Food and... QUALITY SYSTEM REGULATION Production and Process Controls § 820.75 Process validation. (a) Where the... validated with a high degree of assurance and approved according to established procedures. The validation...

  7. Analytic H I-to-H2 Photodissociation Transition Profiles

    NASA Astrophysics Data System (ADS)

    Bialy, Shmuel; Sternberg, Amiel

    2016-05-01

    We present a simple analytic procedure for generating atomic (H I) to molecular (H2) density profiles for optically thick hydrogen gas clouds illuminated by far-ultraviolet radiation fields. Our procedure is based on the analytic theory for the structure of one-dimensional H I/H2 photon-dominated regions presented by Sternberg et al. Depth-dependent atomic and molecular density fractions may be computed for arbitrary gas density, far-ultraviolet field intensity, the metallicity-dependent H2 formation rate coefficient, and the dust absorption cross section in the Lyman-Werner photodissociation band. We use our procedure to generate a set of H I-to-H2 transition profiles for a wide range of conditions, from the weak- to strong-field limits, and from super-solar down to low metallicities. We show that if presented as functions of dust optical depth, the H I and H2 density profiles depend primarily on the dimensionless Sternberg "αG parameter" that determines the dust optical depth associated with the total photodissociated H I column. We derive a universal analytic formula for the H I-to-H2 transition points as a function of αG alone. Our formula will be useful for interpreting emission-line observations of H I/H2 interfaces, for estimating star formation thresholds, and for sub-grid components in hydrodynamics simulations.
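
    For orientation, the Sternberg et al. theory ties αG to a dust optical depth through the total photodissociated H I column; quoting the relation as it is commonly written (readers should consult the cited papers for the exact transition-point formula derived here):

      \sigma_g N_{\rm HI} \;=\; \ln\!\left[\frac{\alpha G}{2} + 1\right],

    where σg is the dust absorption cross section per hydrogen nucleus in the Lyman-Werner band and N_HI is the total atomic column maintained by photodissociation.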

  8. Validity and Measurement

    ERIC Educational Resources Information Center

    Maraun, Michael D.

    2012-01-01

    As illuminated forcefully by Professor Newton's provocative analytical and historical excursion, as long as tests are employed to practical ends (prediction, selection, etc.) there is little cause for the metatheoretic angst that occasions rounds of papers on the topic of validity. But then, also, there seems little need, within this context of…

  9. Low/Medium Volatile Data Validation

    EPA Pesticide Factsheets

    Document designed to offer data reviewers guidance in determining the validity of analytical data generated through the US EPA Contract Laboratory Program Statement of Work ISM01.X Inorganic Superfund Methods (Multi-Media, Multi-Concentration)

  10. Do placebo based validation standards mimic real batch products behaviour? Case studies.

    PubMed

    Bouabidi, A; Talbi, M; Bouklouze, A; El Karbane, M; Bourichi, H; El Guezzar, M; Ziemons, E; Hubert, Ph; Rozet, E

    2011-06-01

    Analytical method validation is a mandatory step to evaluate the ability of developed methods to provide accurate results for their routine application. Validation usually involves validation standards or quality control samples that are prepared in placebo or reconstituted matrix made of a mixture of all the ingredients composing the drug product except the active substance or the analyte under investigation. However, one of the main concerns with this approach is that it may miss an important source of variability that comes from the manufacturing process. The question that remains at the end of the validation step is the transferability of quantitative performance from validation standards to real, authentic drug product samples. In this work, this topic is investigated through three case studies. Three analytical methods were validated using the commonly spiked placebo validation standards at several concentration levels, as well as using samples coming from authentic batches (tablets and syrups). The results showed that, depending on the type of response function used as the calibration curve, there were varying degrees of difference in the accuracy of results obtained with the two types of samples. Nonetheless, the use of spiked placebo validation standards was shown to mimic relatively well the quantitative behaviour of the analytical methods with authentic batch samples. Adding these authentic batch samples into the validation design may help the analyst select and confirm the most fit-for-purpose calibration curve and thus increase the accuracy and reliability of the results generated by the method in routine application. Copyright © 2011 Elsevier B.V. All rights reserved.

  11. The Case for Adopting Server-side Analytics

    NASA Astrophysics Data System (ADS)

    Tino, C.; Holmes, C. P.; Feigelson, E.; Hurlburt, N. E.

    2017-12-01

    The standard method for accessing Earth and space science data relies on a scheme developed decades ago: data residing in one or many data stores must be parsed out and shipped via internet lines or physical transport to the researcher, who in turn locally stores the data for analysis. The analysis tasks are varied and include visualization, parameterization, and comparison with or assimilation into physics models. In many cases this process is inefficient and unwieldy as the data sets become larger and demands on the analysis tasks become more sophisticated and complex. For about a decade, several groups have explored a new paradigm. The names applied to it include "data analytics", "climate analytics", and "server-side analytics". The general concept is that, in close network proximity to the data store, there will be a tailored processing capability appropriate to the type and use of the data served. The user of the server-side analytics will operate on the data with numerical procedures. The procedures can be accessed via canned code, a scripting processor, or an analysis package such as Matlab, IDL or R. Results of the analytics processes will then be relayed via the internet to the user. In practice, these results will be of much lower volume, easier to transport to and store locally by the user, and easier for the user to interoperate with data sets from other remote data stores. The user can also iterate on the processing call to tailor the results as needed. A major component of server-side analytics could be to provide sets of tailored results to end users in order to eliminate the repetitive preconditioning that is often required with these data sets and which drives much of the throughput challenge. NASA's Big Data Task Force studied this issue. This paper will present the results of this study, including examples of SSAs that are being developed and demonstrated, and suggestions for architectures that might be developed for
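
    A hedged sketch of the request-response pattern described above: the computation is expressed as a small job description, executed next to the data store, and only the reduced result crosses the network. The endpoint URL, dataset name, parameters, and response shape are all hypothetical, chosen only to illustrate the paradigm:

      import requests  # hypothetical server-side analytics endpoint

      # Ask the server, sitting next to the data store, to compute monthly
      # means over a large dataset and return only the small result.
      resp = requests.post(
          "https://example.org/api/analytics",    # hypothetical URL
          json={
              "dataset": "sst_global_daily",       # hypothetical dataset id
              "operation": "mean",
              "group_by": "month",
              "bbox": [-40.0, 10.0, -20.0, 30.0],  # lon/lat subset
          },
          timeout=300,
      )
      monthly_means = resp.json()["values"]        # kilobytes, not terabytes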

  12. Validating and determining the weight of items used for evaluating clinical governance implementation based on analytic hierarchy process model.

    PubMed

    Hooshmand, Elaheh; Tourani, Sogand; Ravaghi, Hamid; Vafaee Najar, Ali; Meraji, Marziye; Ebrahimipour, Hossein

    2015-04-08

    The purpose of implementing a system such as Clinical Governance (CG) is to integrate, establish and globalize distinct policies in order to improve quality through increasing the professional knowledge and accountability of healthcare professionals toward providing clinical excellence. Since CG is related to change, and change requires money and time, CG implementation has to be focused on the priority areas in the most dire need of change. The purpose of the present study was to validate and determine the significance of items used for evaluating CG implementation. The present study was descriptive-quantitative in method and design. Items used for evaluating CG implementation were first validated by the Delphi method and then compared with one another and ranked based on the Analytical Hierarchy Process (AHP) model. The items validated for evaluating CG implementation in Iran include performance evaluation, training and development, personnel motivation, clinical audit, clinical effectiveness, risk management, resource allocation, policies and strategies, external audit, information system management, research and development, CG structure, implementation prerequisites, the management of patients' non-medical needs, complaints, and patients' participation in the treatment process. The most important items by degree of significance were training and development, performance evaluation, and risk management. The least important were the management of patients' non-medical needs, patients' participation in the treatment process, and research and development. The fundamental requirements of CG implementation include an effective policy at the national level, avoiding perfectionism, using the expertise and potential of the entire country, and coordination of this model with other quality improvement models such as accreditation and patient safety. © 2015 by Kerman University of Medical Sciences.
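
    The AHP step described above reduces a matrix of pairwise comparisons to a weight vector (the principal eigenvector) and checks judgment consistency. A generic numpy sketch, not the authors' implementation; the 3-item comparison matrix is invented:

      import numpy as np

      def ahp_weights(A):
          # Priority weights = principal eigenvector of the pairwise matrix A,
          # plus Saaty's consistency ratio (CR < 0.1 is conventionally OK).
          vals, vecs = np.linalg.eig(A)
          k = np.argmax(vals.real)
          w = np.abs(vecs[:, k].real)
          w /= w.sum()
          n = A.shape[0]
          ci = (vals[k].real - n) / (n - 1)              # consistency index
          ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.12)  # random index (truncated table)
          return w, ci / ri

      # Hypothetical comparison: training vs performance evaluation vs risk mgmt.
      A = np.array([[1.0, 2.0, 3.0],
                    [1/2, 1.0, 2.0],
                    [1/3, 1/2, 1.0]])
      w, cr = ahp_weights(A)
      print(np.round(w, 3), f"CR = {cr:.3f}")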

  13. A Validated Method for the Quality Control of Andrographis paniculata Preparations.

    PubMed

    Karioti, Anastasia; Timoteo, Patricia; Bergonzi, Maria Camilla; Bilia, Anna Rita

    2017-10-01

    Andrographis paniculata is a herbal drug of Asian traditional medicine largely employed for the treatment of several diseases. Recently, it has been introduced in Europe for the prophylactic and symptomatic treatment of the common cold and as an ingredient of dietary supplements. The active principles are diterpenes, with andrographolide as the main representative. In the present study, an analytical protocol was developed for the determination of the main constituents in the herb and preparations of A. paniculata. Three different extraction protocols (methanol extraction using a modified Soxhlet procedure, maceration under ultrasonication, and decoction) were tested. Ultrasonication achieved the highest content of analytes. HPLC conditions were optimized in terms of solvent mixtures, time course, and temperature. A reversed-phase C18 column eluted with a gradient system consisting of acetonitrile and acidified water, including an isocratic step, at 30 °C was used. The HPLC method was validated for linearity, limits of quantitation and detection, repeatability, precision, and accuracy. The overall method was validated for precision and accuracy over at least three different concentration levels. The relative standard deviation was less than 1.13%, whereas recovery was between 95.50% and 97.19%. The method also proved suitable for the determination of a large number of commercial samples and was proposed to the European Pharmacopoeia for the quality control of Andrographidis herba. Georg Thieme Verlag KG Stuttgart · New York.

  14. Analytical Modeling of Groundwater Seepages to St. Lucie Estuary

    NASA Astrophysics Data System (ADS)

    Lee, J.; Yeh, G.; Hu, G.

    2008-12-01

    In this paper, six analytical models describing the hydraulic interaction of stream-aquifer systems were applied to the St. Lucie Estuary (SLE). These are analytical solutions for: (1) flow from a finite aquifer to a canal, (2) flow from an infinite aquifer to a canal, (3) the linearized Laplace system in a seepage surface, (4) wave propagation in the aquifer, (5) potential flow through stratified unconfined aquifers, and (6) flow through stratified confined aquifers. Input data for the analytical solutions were obtained from monitoring wells and river stages at seepage-meter sites. Four transects in the study area are available: Club Med, Harbour Ridge, Lutz/MacMillan, and Pendarvis Cove, located in the St. Lucie River. The analytical models were first calibrated with seepage meter measurements and then used to estimate groundwater discharges into the St. Lucie River. From this process, analytical relationships between the seepage rate and river stages and/or groundwater tables were established to predict the seasonal and monthly variation in groundwater seepage into the SLE. The seepage rate estimates from the analytical models agreed well with measured data in some cases but only fairly in others. This is not unexpected, because analytical solutions rest on simplifying assumptions that are more valid in some cases than in others. From analytical calculations, it is possible to predict approximate seepage rates in the study domain when the assumptions underlying these analytical models hold. The finite and infinite aquifer models and the linearized Laplace method performed well for the Pendarvis Cove and Lutz/MacMillan sites but only fairly for the other two. The wave propagation model gave very good agreement in phase but only fair agreement in magnitude at all four sites. The stratified unconfined and confined aquifer models gave similarly good agreement with measurements at three sites but poor agreement at the Club Med site. None of

  15. Quantitative determination and sampling of azathioprine residues for cleaning validation in production area.

    PubMed

    Fazio, Tatiana Tatit; Singh, Anil Kumar; Kedor-Hackmann, Erika Rosa Maria; Santoro, Maria Inês Rocha Miritello

    2007-03-12

    Cleaning validation is an integral part of current good manufacturing practices in any pharmaceutical industry. Nowadays, azathioprine and several other pharmacologically potent pharmaceuticals are manufactured in the same production area. A carefully designed cleaning validation and its evaluation can ensure that residues of azathioprine will not carry over and cross-contaminate the subsequent product. The aim of this study was to validate a simple analytical method for verification of residual azathioprine on equipment used in the production area and to confirm the efficiency of the cleaning procedure. The HPLC method was validated on an LC system using a Nova-Pak C18 column (3.9 mm x 150 mm, 4 microm) and methanol-water-acetic acid (20:80:1, v/v/v) as mobile phase at a flow rate of 1.0 mL min(-1). UV detection was made at 280 nm. The calibration curve was linear over a concentration range from 2.0 to 22.0 microg mL(-1) with a correlation coefficient of 0.9998. The detection limit (DL) and quantitation limit (QL) were 0.09 and 0.29 microg mL(-1), respectively. The intra-day and inter-day precision expressed as relative standard deviation (R.S.D.) were below 2.0%. The mean recovery of the method was 99.19%. The mean extraction recovery from manufacturing equipment was 83.5%. The developed UV spectrophotometric method could only be used as a limit method to qualify or reject the cleaning procedure in the production area. Nevertheless, the simplicity of the spectrophotometric method makes it useful for routine analysis of azathioprine residues on cleaned surfaces and as an alternative to the proposed HPLC method.
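
    The DL and QL quoted above are consistent with the usual ICH Q2(R1) estimates from the calibration curve, LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation of the regression and S its slope. A minimal sketch; the calibration numbers below are placeholders, not the study's raw data:

      import numpy as np

      def lod_loq(conc, signal):
          # ICH Q2(R1)-style estimates from a linear calibration:
          # LOD = 3.3*sigma/S, LOQ = 10*sigma/S.
          S, b = np.polyfit(conc, signal, 1)
          resid = signal - (S * conc + b)
          sigma = resid.std(ddof=2)      # residual SD (n - 2 fitted parameters)
          return 3.3 * sigma / S, 10 * sigma / S

      c = np.array([2.0, 6.0, 10.0, 14.0, 18.0, 22.0])          # ug/mL
      y = np.array([41.0, 122.5, 203.0, 285.1, 365.2, 447.0])   # peak areas
      lod, loq = lod_loq(c, y)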

  16. Analytical validation of viral CNS Flow Chip kit for detection of acute meningitis and encephalitis.

    PubMed

    Pérez-Ruiz, Mercedes; Pedrosa-Corral, Irene; Sanbonmatsu-Gámez, Sara; Gómez-Camarasa, Cristina; Navarro-Marí, José María

    2018-06-12

    A new molecular assay (Viral CNS Flow Chip kit, Master Diagnóstica, Spain) has been developed for the detection of eight viruses causing acute meningitis and encephalitis, i.e. herpes simplex viruses 1-2, varicella zoster virus, human enterovirus, human parechovirus, Toscana virus, human cytomegalovirus and Epstein-Barr virus. The new assay is a multiplex one-step RT-PCR followed by automatic flow-through hybridization, colorimetric detection and image analysis. The limit of detection was 50 copies/reaction for human enterovirus and 10 copies/reaction for the other seven viruses. The analytical validation was performed with nucleic acids extracted from 268 cerebrospinal fluid samples, and the results were compared with routine molecular assays. An excellent coefficient of agreement was observed between V-CNS and the routine assays [kappa index: 0.948 (95% CI: 0.928-0.968)]. The overall sensitivity and specificity were 95.9% (95% CI: 91.2-98.3%) and 99.9% (95% CI: 99.6-100%), respectively. The Viral CNS Flow Chip kit is an efficient multiplex platform for the detection of the main viruses involved in acute meningitis and encephalitis. The inclusion of a TOSV genome target may improve the laboratory diagnosis of viral neurological infections in endemic areas. Copyright © 2018 Elsevier B.V. All rights reserved.
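
    The agreement statistics reported above derive from a 2x2 comparison against the routine assays. A generic sketch of how sensitivity, specificity, and Cohen's kappa fall out of such a table; the counts below are invented, not the study's:

      def agreement_stats(tp, fp, fn, tn):
          # Sensitivity, specificity, and Cohen's kappa from a 2x2 table.
          n = tp + fp + fn + tn
          sens = tp / (tp + fn)
          spec = tn / (tn + fp)
          po = (tp + tn) / n                                         # observed agreement
          pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2  # chance agreement
          kappa = (po - pe) / (1 - pe)
          return sens, spec, kappa

      print(agreement_stats(tp=94, fp=1, fn=4, tn=169))  # hypothetical counts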

  17. Validating An Analytic Completeness Model for Kepler Target Stars Based on Flux-level Transit Injection Experiments

    NASA Astrophysics Data System (ADS)

    Catanzarite, Joseph; Burke, Christopher J.; Li, Jie; Seader, Shawn; Haas, Michael R.; Batalha, Natalie; Henze, Christopher; Christiansen, Jessie; Kepler Project, NASA Advanced Supercomputing Division

    2016-06-01

    The Kepler Mission is developing an Analytic Completeness Model (ACM) to estimate detection completeness contours as a function of exoplanet radius and period for each target star. Accurate completeness contours are necessary for robust estimation of exoplanet occurrence rates. The main components of the ACM for a target star are: detection efficiency as a function of SNR, the window function (WF), and the one-sigma depth function (OSDF) (ref. Burke et al. 2015). The WF captures the falloff in transit detection probability at long periods that is determined by the observation window (the duration over which the target star has been observed). The OSDF is the transit depth (in parts per million) that yields an SNR of unity for the full transit train. It is a function of period, and accounts for the time-varying properties of the noise and for missing or deweighted data. We are performing flux-level transit injection (FLTI) experiments on selected Kepler target stars with the goal of refining and validating the ACM. "Flux-level" injection machinery inserts exoplanet transit signatures directly into the flux time series, as opposed to "pixel-level" injection, which inserts transit signatures into the individual pixels using the pixel response function. See Jie Li's poster, ID #2493668, "Flux-level transit injection experiments with the NASA Pleiades Supercomputer," for details, including performance statistics. Since FLTI is affordable for only a small subset of the Kepler targets, the ACM is designed to apply to most Kepler target stars. We validate this model using "deep" FLTI experiments, with ~500,000 injection realizations on each of a small number of targets, and "shallow" FLTI experiments, with ~2,000 injection realizations on each of many targets. From the results of these experiments, we identify anomalous targets, model their behavior and refine the ACM accordingly. In this presentation, we discuss progress in validating and refining the ACM, and we

  18. Note: Model identification and analysis of bivalent analyte surface plasmon resonance data.

    PubMed

    Tiwari, Purushottam Babu; Üren, Aykut; He, Jin; Darici, Yesim; Wang, Xuewen

    2015-10-01

    Surface plasmon resonance (SPR) is a widely used, affinity-based, label-free biophysical technique to investigate biomolecular interactions. The extraction of rate constants requires accurate identification of the particular binding model. The bivalent analyte model involves coupled non-linear differential equations, and no clear procedure to identify the bivalent analyte mechanism has been established. In this report, we propose a unique signature for the bivalent analyte model. This signature can be used to distinguish the bivalent analyte model from other biphasic models. The proposed method is demonstrated using experimentally measured SPR sensorgrams.
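
    A hedged numerical sketch of the coupled ODEs behind the bivalent analyte scheme (A + L <-> AL with ka1/kd1, then AL + L <-> AL2 with ka2/kd2). The rate constants, Rmax, and statistical factors of 2 follow one common parameterization and are illustrative only; evaluation packages differ in conventions, and this is not the authors' identification procedure:

      import numpy as np
      from scipy.integrate import solve_ivp

      def bivalent(t, y, C, Rmax, ka1, kd1, ka2, kd2):
          AL, AL2 = y
          L = Rmax - AL - 2.0 * AL2        # free surface sites (AL2 occupies two)
          dAL = 2.0 * ka1 * C * L - kd1 * AL - ka2 * AL * L + 2.0 * kd2 * AL2
          dAL2 = ka2 * AL * L - 2.0 * kd2 * AL2
          return [dAL, dAL2]

      params = dict(C=50e-9, Rmax=100.0, ka1=1e5, kd1=1e-3, ka2=1e3, kd2=1e-2)
      sol = solve_ivp(bivalent, (0.0, 300.0), [0.0, 0.0],
                      args=tuple(params.values()), dense_output=True)
      t = np.linspace(0.0, 300.0, 200)
      response = sol.sol(t).sum(axis=0)    # SPR signal ~ AL + AL2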

  19. Validation of Procedures for Monitoring Crewmember Immune Function SDBI-1900, SMO-015 - Integrated Immune

    NASA Technical Reports Server (NTRS)

    Crucian, Brian; Stowe, Raymond; Mehta, Satish; Uchakin, Peter; Nehlsen-Cannarella, Sandra; Morukov, Boris; Pierson, Duane; Sams, Clarence

    2007-01-01

    There is ample evidence to suggest that space flight leads to immune system dysregulation. This may be a result of microgravity, confinement, physiological stress, radiation, environment or other mission-associated factors. The clinical risks of prolonged immune dysregulation during space flight are not yet determined, but may include increased incidence of infection, allergy, hypersensitivity, hematological malignancy or altered wound healing. Each of the clinical events resulting from immune dysfunction has the potential to impact mission-critical objectives during exploration-class missions. To date, precious little in-flight immune data has been generated to assess this phenomenon. The majority of recent flight immune studies have been post-flight assessments, which may not accurately reflect the in-flight condition. There are no procedures currently in place to monitor immune function or its effect on crew health. The objective of this Supplemental Medical Objective (SMO) is to develop and validate an immune monitoring strategy consistent with operational flight requirements and constraints. This SMO will assess the clinical risks resulting from the adverse effects of space flight on the human immune system and will validate a flight-compatible immune monitoring strategy. Characterization of the clinical risk and the development of a monitoring strategy are necessary prerequisites to validating countermeasures. This study will determine, to the best level allowed by current technology, the in-flight status of crewmembers' immune systems. Pre-flight, in-flight and post-flight assessments of immune status, immune function, viral reactivation and physiological stress will be performed. The in-flight samples will allow a distinction between legitimate in-flight alterations and the physiological stresses of landing and readaptation, which are believed to alter landing-day assessments. The overall status of the immune system during flight (activation

  20. Validation of an analytical method for the quantitative determination of selenium in bacterial biomass by ultraviolet-visible spectrophotometry.

    PubMed

    Mörschbächer, Ana Paula; Dullius, Anja; Dullius, Carlos Henrique; Bandt, Cassiano Ricardo; Kuhn, Daniel; Brietzke, Débora Tairini; Malmann Kuffel, Fernando José; Etgeton, Henrique Pretto; Altmayer, Taciélen; Gonçalves, Tamara Engelmann; Oreste, Eliézer Quadro; Ribeiro, Anderson Schwingel; de Souza, Claucia Fernanda Volken; Hoehne, Lucélia

    2018-07-30

    The present paper describes the validation of a spectrophotometric method based on molecular absorption in the ultraviolet-visible (UV-Vis) region for selenium (Se) determination in bacterial biomass produced by lactic acid bacteria (LAB). The method was found to be suitable for the target application and presented a linearity range from 0.025 to 0.250 mg/L Se. The slope (angular coefficient) and intercept (linear coefficient) of the calibration equation were 1.0678 and 0.0197, respectively, and the linear correlation coefficient (R²) was 0.9991. Analyte recovery exceeded 96% with a relative standard deviation (RSD) below 3%. The Se contents in LAB ranged from 0.01 to 20 mg/g. The Se contents in the bacterial biomass determined by UV-Vis were not significantly different (p > 0.05) from those determined by graphite furnace atomic absorption spectrometry. Thus, Se can be quantified in LAB biomass using this relatively simpler technique. Copyright © 2018 Elsevier Ltd. All rights reserved.
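
    Routine use of such a calibration line is simply inversion of y = m*x + b plus a recovery check. A minimal sketch using the coefficients reported above; the absorbance reading and spike level are hypothetical:

      M, B = 1.0678, 0.0197      # slope and intercept of the validated curve

      def conc_from_abs(a):
          # Invert the calibration line y = M*x + B to get mg/L Se.
          return (a - B) / M

      def recovery_pct(measured, spiked):
          return 100.0 * measured / spiked

      se = conc_from_abs(0.150)              # hypothetical absorbance reading
      print(f"{se:.4f} mg/L, recovery = {recovery_pct(se, 0.125):.1f} %")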

  1. 40 CFR 86.214-94 - Analytical gases.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) CONTROL OF EMISSIONS FROM NEW AND IN-USE HIGHWAY VEHICLES AND ENGINES Emission Regulations for 1994 and Later Model Year Gasoline-Fueled New Light-Duty Vehicles, New Light-Duty Trucks and New Medium-Duty Passenger Vehicles; Cold Temperature Test Procedures § 86.214-94 Analytical gases. The provisions of § 86...

  2. New procedure for multielemental speciation analysis of five toxic species: As(III), As(V), Cr(VI), Sb(III) and Sb(V) in drinking water samples by advanced hyphenated technique HPLC/ICP-DRC-MS.

    PubMed

    Marcinkowska, Monika; Komorowicz, Izabela; Barałkiewicz, Danuta

    2016-05-12

    An analytical procedure dedicated to the multielemental determination of the toxic species As(III), As(V), Cr(VI), Sb(III) and Sb(V) in drinking water samples using high performance liquid chromatography hyphenated to inductively coupled plasma mass spectrometry (HPLC/ICP-DRC-MS) was developed. Optimization of the detection and separation conditions was conducted. A dynamic reaction cell (DRC) with oxygen as the reaction gas was used in the experiments. The analytical signals obtained for species separated by anion-exchange chromatography were symmetrical. The mobile phase consisted of 3 mM EDTANa2 and 36 mM ammonium nitrate. Full separation of the species, in the forms H3AsO3, H2AsO4(-), SbO2(-), Sb(OH)6(-) and CrO4(2-), was achieved in 15 min using a gradient elution program. Detailed validation of the analytical procedure proved the reliability of the analytical measurements. The procedure was characterized by high precision, in the range from 1.7% to 2.4%. Detection limits (LD) were 0.067 μg L(-1), 0.068 μg L(-1), 0.098 μg L(-1), 0.083 μg L(-1) and 0.038 μg L(-1) for As(III), As(V), Cr(VI), Sb(III) and Sb(V), respectively. The recoveries obtained, in the range of 91%-110%, confirmed the absence of interference effects on the analytical signals. The applicability of the proposed procedure was tested on drinking water samples with mineralization up to 650 mg L(-1). Copyright © 2016 Elsevier B.V. All rights reserved.

  3. Procedure-specific assessment tool for flexible pharyngo-laryngoscopy: gathering validity evidence and setting pass-fail standards.

    PubMed

    Melchiors, Jacob; Petersen, K; Todsen, T; Bohr, A; Konge, Lars; von Buchwald, Christian

    2018-06-01

    The attainment of specific identifiable competencies is the primary measure of progress in the modern medical education system. The system therefore requires a feasible method for accurately assessing competence. Evidence of validity needs to be gathered before an assessment tool can be implemented in the training and assessment of physicians. According to contemporary validity theory, this evidence must be gathered from specific sources in a structured and rigorous manner. Flexible pharyngo-laryngoscopy (FPL) is central to the otorhinolaryngologist. We aim to evaluate the flexible pharyngo-laryngoscopy assessment tool (FLEXPAT) created in a previous study and to establish a pass-fail level for proficiency. Eighteen physicians with different levels of experience (novices, intermediates, and experienced) were recruited to the study. Each performed an FPL on two patients. These procedures were video recorded, blinded, and assessed by two specialists. The score was expressed as the percentage of the maximum possible score. Cronbach's α was used to analyze internal consistency of the data, and a generalizability analysis was performed. The scores of the three groups were explored, and a pass-fail level was determined using the contrasting-groups standard-setting method. Internal consistency was strong, with a Cronbach's α of 0.86. We found a generalizability coefficient of 0.72, sufficient for moderate-stakes assessment. We found a significant difference between the novice and experienced groups (p < 0.001) and a strong correlation between experience and score (Pearson's r = 0.75). The pass-fail level was established at 72% of the maximum score. Applying this pass-fail level in the test population resulted in half of the intermediate group receiving a failing score. We gathered validity evidence for the FLEXPAT according to the contemporary framework as described by Messick. Our results support a claim of validity and are
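
    The contrasting-groups method sets the cut score where the score distributions of the two groups cross. One common implementation fits a normal density to each group and solves for the intersection; the sketch below does exactly that, with invented scores (a generic reconstruction, not the authors' computation):

      import numpy as np

      def contrasting_groups_cutoff(fail_scores, pass_scores):
          # Cut score at the intersection of two normal densities fitted
          # to the contrasting groups.
          m1, s1 = np.mean(fail_scores), np.std(fail_scores, ddof=1)
          m2, s2 = np.mean(pass_scores), np.std(pass_scores, ddof=1)
          # Solve N(x; m1, s1) = N(x; m2, s2): a quadratic in x.
          a = 1 / s1**2 - 1 / s2**2
          b = 2 * (m2 / s2**2 - m1 / s1**2)
          c = m1**2 / s1**2 - m2**2 / s2**2 - 2 * np.log(s2 / s1)
          roots = np.roots([a, b, c])
          # Keep the root that lies between the two group means.
          return min(roots, key=lambda r: abs(r - (m1 + m2) / 2))

      novices = [45, 52, 58, 61, 66, 70]   # hypothetical % scores
      experts = [74, 78, 82, 85, 90, 93]
      print(f"pass-fail ~ {contrasting_groups_cutoff(novices, experts):.0f} %")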

  4. 4-Nonylphenol (NP) in food-contact materials: analytical methodology and occurrence.

    PubMed

    Fernandes, A R; Rose, M; Charlton, C

    2008-03-01

    Nonylphenol is a recognized environmental contaminant, but it is unclear whether its occurrence in food arises only through environmental pathways or also during the processing or packaging of food, as there are reports that indicate that materials in contact with food such as rubber products and polyvinylchloride wraps can contain nonylphenol. A review of the literature has highlighted the scarcity of robust analytical methodology or data on the occurrence of nonylphenol in packaging materials. This paper describes a methodology for the determination of nonylphenol in a variety of packaging materials, which includes plastics, paper and rubber. The method uses either Soxhlet extraction or dissolution followed by solvent extraction (depending on the material type), followed by purification using adsorption chromatography. Procedures were internally standardized using 13C-labelled nonylphenol and the analytes were measured by gas chromatography-mass spectrometry. The method is validated and data relating to quality parameters such as limits of detection, recovery, precision and linearity of measurement are provided. Analysis of a range of 25 food-contact materials found nonylphenol at concentrations of 64-287 microg g(-1) in some polystyrene and polyvinylchloride samples. Far lower concentrations (<0.03-1.4 microg g(-1)) were detected in the other materials. It is possible that occurrence at the higher levels has the potential for migration to food.

  5. Analytical validation and establishment of reference intervals for a 'high-sensitivity' cardiac troponin-T assay in horses.

    PubMed

    Shields, E; Seiden-Long, I; Massie, S; Passante, S; Leguillette, R

    2016-06-13

    Cardiac troponin-I assays have been validated in horses. 'High-sensitivity' cardiac troponin assays are now the standard in human cardiology. The aims were to appropriately validate the 'high-sensitivity' cardiac troponin-T (hscTnT) assay for clinical use in horses, establish reference intervals, determine the biological variation, and demonstrate assay utility in selected clinical cases. Analytical validation of the Roche hscTnT assay included within- and between-run precision, linear dose response, limit of quantitation (LoQ), stability, and comparison with cTnI (iSTAT). Reference intervals and biological variation were determined using adult, healthy Non-Competition Horses (N = 125) and Racing-Thoroughbreds (N = 178). HscTnT levels were measured in two horses with cardiac pathology. The hscTnT assay demonstrates acceptable within-run (L1 = 6.5 ng/L, CV 14.9%; L2 = 10.1 ng/L, CV 8.7%; L3 = 15.3 ng/L, CV 5.4%) and between-run precision (L1 = 12.2 ng/L, CV 8.4%; L2 = 57.0 ng/L, CV 8.4%; L3 = 256.0 ng/L, CV 9.0%). The assay was linear from 3 to 391 ng/L. The LoQ was validated at 3 ng/L. Samples demonstrated insignificant decay over a freeze-thaw cycle. Comparison with the cTnI assay showed excellent correlation (range: 8.0-3535.0 ng/L, R² = 0.9996). Reference intervals: the upper 95th and 99th percentiles of the hscTnT population distribution were 6.8 and 16.2 ng/L in Non-Competition Horses, and 14.0 and 23.2 ng/L in Racing-Thoroughbreds. Between-breed, diurnal, and between-day variation was below the LoQ. Two clinical cases with presumed cardiac pathology had hscTnT levels of 220.9 ng/L and 5723.0 ng/L. This benchmark study is the first to comply with CLSI guidelines, thus further establishing the performance characteristics of the hscTnT assay and reference intervals in healthy horses. The two clinical cases further demonstrated the clinical utility of the assay.
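
    The reference limits quoted above are, in essence, upper percentiles of the healthy-population distribution. A minimal nonparametric sketch with simulated values standing in for the 125 Non-Competition Horse measurements (the lognormal parameters are invented):

      import numpy as np

      def reference_percentiles(values, pcts=(95, 99)):
          # Nonparametric upper reference limits from a healthy population.
          return {p: float(np.percentile(values, p)) for p in pcts}

      rng = np.random.default_rng(1)
      hsctnt = rng.lognormal(mean=1.2, sigma=0.6, size=125)  # simulated ng/L
      print(reference_percentiles(hsctnt))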

  6. Validation and Uncertainty Estimation of an Ecofriendly and Stability-Indicating HPLC Method for Determination of Diltiazem in Pharmaceutical Preparations

    PubMed Central

    Sadeghi, Fahimeh; Navidpour, Latifeh; Bayat, Sima; Afshar, Minoo

    2013-01-01

    A green, simple, and stability-indicating RP-HPLC method was developed for the determination of diltiazem in topical preparations. The separation was based on a C18 analytical column using a mobile phase consisting of ethanol and phosphoric acid solution (pH = 2.5) (35:65, v/v). Column temperature was set at 50°C and quantitation was achieved with UV detection at 240 nm. In forced degradation studies, the drug was subjected to oxidation, hydrolysis, photolysis, and heat. The method was validated for specificity, selectivity, linearity, precision, accuracy, and robustness. The applied procedure was found to be linear in the diltiazem concentration range of 0.5–50 μg/mL (r² = 0.9996). Precision was evaluated by replicate analysis, in which relative standard deviation (RSD) values for areas were below 2.0%. The recoveries obtained (99.25%–101.66%) ensured the accuracy of the developed method. The degradation products as well as the pharmaceutical excipients were well resolved from the pure drug. The expanded uncertainty (5.63%) of the method was also estimated from method validation data. Accordingly, the proposed validated and sustainable procedure proved suitable for routine analysis and stability studies of diltiazem in pharmaceutical preparations. PMID:24163778
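
    The expanded uncertainty quoted above follows the usual GUM convention: standard uncertainty contributions estimated from the validation data are combined in quadrature and multiplied by a coverage factor, commonly k = 2 for approximately 95% confidence:

      u_c = \sqrt{\textstyle\sum_i u_i^{2}}, \qquad U = k\,u_c \quad (k = 2).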

  7. Transport of a decay chain in homogeneous porous media: analytical solutions.

    PubMed

    Bauer, P; Attinger, S; Kinzelbach, W

    2001-06-01

    With the aid of integral transforms, analytical solutions for the transport of a decay chain in homogeneous porous media are derived. Unidirectional steady-state flow and radial steady-state flow in single and multiple porosity media are considered. At least in the Laplace domain, all solutions can be written in closed analytical formulae. In part, the solutions can also be inverted analytically. If not, analytical calculation of the steady-state concentration distributions, evaluation of temporal moments and numerical inversion are still possible. Formulae for several simple boundary conditions are given and visualized in this paper. The derived novel solutions are widely applicable and are very useful for the validation of numerical transport codes.
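
    The sequential structure that makes closed-form Laplace-domain solutions possible is visible in the governing equations: each chain member obeys an advection-dispersion equation forced by the decay of its parent. A generic one-dimensional statement (retardation factors omitted for brevity; this is the standard form, not a reproduction of the paper's exact equations):

      \frac{\partial c_i}{\partial t} + v\,\frac{\partial c_i}{\partial x}
        - D\,\frac{\partial^2 c_i}{\partial x^2} + \lambda_i c_i
        = \lambda_{i-1}\, c_{i-1},

    so that in Laplace space each transformed concentration ĉ_i(x, s) satisfies a linear ODE whose source term is the already-solved ĉ_{i-1}, and the chain can be worked through member by member.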

  8. The Importance of Method Selection in Determining Product Integrity for Nutrition Research

    PubMed Central

    Mudge, Elizabeth M; Brown, Paula N

    2016-01-01

    The American Herbal Products Association estimates that there are as many as 3000 plant species in commerce. The FDA estimates that there are about 85,000 dietary supplement products in the marketplace. The pace of product innovation far exceeds that of analytical methods development and validation, with new ingredients, matrixes, and combinations resulting in an analytical community that has been unable to keep up. This has led to a lack of validated analytical methods for dietary supplements and to inappropriate method selection where methods do exist. Only after rigorous validation procedures to ensure that methods are fit for purpose should they be used in a routine setting to verify product authenticity and quality. By following systematic procedures and establishing performance requirements for analytical methods before method development and validation, methods can be developed that are both valid and fit for purpose. This review summarizes advances in method selection, development, and validation regarding herbal supplement analysis and provides several documented examples of inappropriate method selection and application. PMID:26980823

  9. Procedural Error and Task Interruption

    DTIC Science & Technology

    2016-09-30

    …red for research on errors and individual differences. Results indicate predictive validity for fluid intelligence and specific forms of work… Subject terms: procedural error, task interruption, individual differences, fluid intelligence, sleep deprivation. …and individual differences. It generates rich data on several kinds of errors, including procedural errors in which steps are skipped or repeated.

  10. 42 CFR 456.655 - Validation of showings.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 4 2013-10-01 2013-10-01 false Validation of showings. 456.655 Section 456.655... Showing of an Effective Institutional Utilization Control Program § 456.655 Validation of showings. (a) The Administrator will periodically validate showings submitted under § 456.654. Validation procedures...

  11. 42 CFR 456.655 - Validation of showings.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 4 2012-10-01 2012-10-01 false Validation of showings. 456.655 Section 456.655... Showing of an Effective Institutional Utilization Control Program § 456.655 Validation of showings. (a) The Administrator will periodically validate showings submitted under § 456.654. Validation procedures...

  12. 42 CFR 456.655 - Validation of showings.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 4 2014-10-01 2014-10-01 false Validation of showings. 456.655 Section 456.655... Showing of an Effective Institutional Utilization Control Program § 456.655 Validation of showings. (a) The Administrator will periodically validate showings submitted under § 456.654. Validation procedures...

  13. 42 CFR 456.655 - Validation of showings.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 4 2011-10-01 2011-10-01 false Validation of showings. 456.655 Section 456.655... Showing of an Effective Institutional Utilization Control Program § 456.655 Validation of showings. (a) The Administrator will periodically validate showings submitted under § 456.654. Validation procedures...

  14. STANDARDIZATION AND VALIDATION OF MICROBIOLOGICAL METHODS FOR EXAMINATION OF BIOSOLIDS

    EPA Science Inventory

    The objective of this presentation is to discuss pathogens of concern in biosolids, the analytical techniques used to evaluate microorganisms in biosolids, and to discuss standardization and validation of analytical protocols for microbes within a complex matrix. Implications of ...

  15. Validation of design procedure and performance modeling of a heat and fluid transport field experiment in the unsaturated zone

    NASA Astrophysics Data System (ADS)

    Nir, A.; Doughty, C.; Tsang, C. F.

    Validation methods that were developed in the context of the deterministic concepts of past generations often cannot be applied directly to environmental problems, which may be characterized by limited reproducibility of results and highly complex models. Instead, validation is interpreted here as a series of activities, including both theoretical and experimental tests, designed to enhance our confidence in the capability of a proposed model to describe some aspect of reality. We examine the validation process applied to a project concerned with heat and fluid transport in porous media, in which mathematical modeling, simulation, and results of field experiments are evaluated in order to determine the feasibility of a system for seasonal thermal energy storage in shallow unsaturated soils. Technical details of the field experiments are not included, but appear in previous publications. Validation activities are divided into three stages. The first stage, carried out prior to the field experiments, is concerned with modeling the relevant physical processes, optimization of the heat-exchanger configuration and the shape of the storage volume, and multi-year simulation. Subjects requiring further theoretical and experimental study are identified at this stage. The second stage encompasses the planning and evaluation of the initial field experiment. Simulations are made to determine the experimental time scale and optimal sensor locations. Soil thermal parameters and temperature boundary conditions are estimated using an inverse method. Then results of the experiment are compared with model predictions using different parameter values and modeling approximations. In the third stage, results of an experiment performed under different boundary conditions are compared to predictions made by the models developed in the second stage. Various aspects of this theoretical and experimental field study are described as examples of the verification and validation procedure. There is no

  16. Statistically qualified neuro-analytic failure detection method and system

    DOEpatents

    Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.

    2002-03-02

    An apparatus and method for monitoring a process involve development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two stages: deterministic model adaptation and stochastic modification of the adapted model. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation-error minimization technique. Stochastic model modification involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates the measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system. Illustrative of the method and apparatus, the method is applied to a peristaltic pump system.
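
    A hedged sketch of the hybrid structure described here: a fixed analytic (first-principles) model plus a small network trained only on the residual, so the learned part captures the unknown process characteristics. This is an illustrative reconstruction, not the patented procedure; plain least-squares on a radial-basis layer stands in for the scaled equation-error technique, and the pump model and data are invented:

      import numpy as np

      def analytic_model(u):
          # Known physics: nominal pump flow proportional to speed (assumed).
          return 0.8 * u

      def rbf_features(u, centers, width=0.5):
          return np.exp(-((u[:, None] - centers[None, :]) / width) ** 2)

      # Synthetic data: measured flow droops below the analytic model at speed.
      u = np.linspace(0.0, 5.0, 50)
      y = 0.8 * u - 0.05 * u**2 + 0.01 * np.random.default_rng(2).normal(size=u.size)

      centers = np.linspace(0.0, 5.0, 8)
      Phi = rbf_features(u, centers)
      resid = y - analytic_model(u)
      w, *_ = np.linalg.lstsq(Phi, resid, rcond=None)  # fit network to residual only

      def sqna_predict(u_new):
          # Hybrid prediction: analytic part + learned residual correction.
          return analytic_model(u_new) + rbf_features(u_new, centers) @ w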

  17. Development and Validation of a Learning Analytics Framework: Two Case Studies Using Support Vector Machines

    ERIC Educational Resources Information Center

    Ifenthaler, Dirk; Widanapathirana, Chathuranga

    2014-01-01

    Interest in collecting and mining large sets of educational data on student background and performance to conduct research on learning and instruction has developed as an area generally referred to as learning analytics. Higher education leaders are recognizing the value of learning analytics for improving not only learning and teaching but also…

  18. Statistically Qualified Neuro-Analytic system and Method for Process Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.

    1998-11-04

    An apparatus and method for monitoring a process involves development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two steps: deterministic model adaptation and stochastic model adaptation. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation-error minimization technique. Stochastic model adaptation involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system.

  19. Analytical methods in the high conversion reactor core design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zeggel, W.; Oldekop, W.; Axmann, J.K.

    High conversion reactor (HCR) design methods have been used at the Technical University of Braunschweig (TUBS) with the technological support of Kraftwerk Union (KWU). The present state and objectives of this cooperation between KWU and TUBS in the field of HCRs have been described using existing design models and current activities aimed at further development and validation of the codes. The hard physical and thermal-hydraulic boundary conditions of pressurized water reactor (PWR) cores with a high degree of fuel utilization result from the tight packing of the HCR fuel rods and the high fissionable plutonium content of the fuel. In terms of design, the problem will be solved with rod bundles whose fuel rods are adjusted by helical spacers to the proposed small rod pitches. These HCR properties require novel computational models for neutron physics, thermal hydraulics, and fuel rod design. By means of a survey of the codes, the analytical procedure for present-day HCR core design is presented. The design programs are currently under intensive development, as design tools with a solid scientific foundation and with essential parameters that are widely valid, as required for a promising optimization of the HCR core. Design results and a survey of future HCR development are given. In this connection, the reoptimization of the PWR core in the direction of an HCR is considered a fascinating scientific task, with respect to both economic and safety aspects.

  20. Relations among Conceptual Knowledge, Procedural Knowledge, and Procedural Flexibility in Two Samples Differing in Prior Knowledge

    ERIC Educational Resources Information Center

    Schneider, Michael; Rittle-Johnson, Bethany; Star, Jon R.

    2011-01-01

    Competence in many domains rests on children developing conceptual and procedural knowledge, as well as procedural flexibility. However, research on the developmental relations between these different types of knowledge has yielded unclear results, in part because little attention has been paid to the validity of the measures or to the effects of…

  1. 40 CFR 89.408 - Post-test procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Post-test procedures. 89.408 Section... Procedures § 89.408 Post-test procedures. (a) A hangup check is recommended at the completion of the last...) Record the post-test data specified in § 89.405(f). (e) For a valid test, the zero and span checks...

  2. 40 CFR 89.408 - Post-test procedures.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 21 2013-07-01 2013-07-01 false Post-test procedures. 89.408 Section... Procedures § 89.408 Post-test procedures. (a) A hangup check is recommended at the completion of the last...) Record the post-test data specified in § 89.405(f). (e) For a valid test, the zero and span checks...

  3. 40 CFR 89.408 - Post-test procedures.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Post-test procedures. 89.408 Section... Procedures § 89.408 Post-test procedures. (a) A hangup check is recommended at the completion of the last...) Record the post-test data specified in § 89.405(f). (e) For a valid test, the zero and span checks...

  4. 40 CFR 89.408 - Post-test procedures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 21 2012-07-01 2012-07-01 false Post-test procedures. 89.408 Section... Procedures § 89.408 Post-test procedures. (a) A hangup check is recommended at the completion of the last...) Record the post-test data specified in § 89.405(f). (e) For a valid test, the zero and span checks...

  5. 40 CFR 89.408 - Post-test procedures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Post-test procedures. 89.408 Section 89... Procedures § 89.408 Post-test procedures. (a) A hangup check is recommended at the completion of the last...) Record the post-test data specified in § 89.405(f). (e) For a valid test, the zero and span checks...

  6. Elemental composition of edible nuts: fast optimization and validation procedure of an ICP-OES method.

    PubMed

    Tošić, Snežana B; Mitić, Snežana S; Velimirović, Dragan S; Stojanović, Gordana S; Pavlović, Aleksandra N; Pecev-Marinković, Emilija T

    2015-08-30

    An inductively coupled plasma-optical emission spectrometry method for the rapid simultaneous determination of 19 elements in edible nuts (walnuts: Juglans nigra; almonds: Prunus dulcis; hazelnuts: Corylus avellana; Brazil nuts: Bertholletia excelsa; cashews: Anacardium occidentale; pistachios: Pistacia vera; and peanuts: Arachis hypogaea) available on Serbian markets was optimized and validated through the selection of instrumental parameters and of analytical lines free from spectral interference and with the lowest matrix effects. The analysed macro-elements were present in the following descending order: Na > Mg > Ca > K. Of all the trace elements, the tested samples showed the highest content of Fe. The micro-element Se was detected in all the nut samples. The toxic elements As, Cd and Pb were either not detected or present below the limit of detection. One-way analysis of variance, Student's t-test, Tukey's HSD post hoc test and hierarchical agglomerative cluster analysis were applied in the statistical analysis of the results. Based on the detected contents of the analysed elements, it can be concluded that nuts may be a good additional source of minerals as micronutrients. © 2014 Society of Chemical Industry.
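
    As a hedged illustration of the statistical treatment named above (one-way ANOVA followed by Tukey's HSD post hoc test), the following Python sketch runs both tests on hypothetical element contents for three nut types; the data, group names, and replicate counts are invented for illustration only.

        import numpy as np
        from scipy import stats
        from statsmodels.stats.multicomp import pairwise_tukeyhsd

        # Hypothetical Fe contents (mg/kg dry weight), five replicates per nut type.
        fe = {
            "walnut":   [28.1, 27.5, 29.0, 28.4, 27.9],
            "almond":   [35.2, 34.8, 36.1, 35.5, 34.9],
            "hazelnut": [30.3, 29.8, 31.0, 30.5, 30.1],
        }

        # One-way ANOVA across the three groups.
        f_stat, p_val = stats.f_oneway(*fe.values())
        print(f"ANOVA: F = {f_stat:.1f}, p = {p_val:.2g}")

        # Tukey's HSD post hoc test on the pooled observations.
        values = np.concatenate(list(fe.values()))
        groups = np.repeat(list(fe.keys()), [len(v) for v in fe.values()])
        print(pairwise_tukeyhsd(values, groups))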

  7. Analytical and simulator study of advanced transport

    NASA Technical Reports Server (NTRS)

    Levison, W. H.; Rickard, W. W.

    1982-01-01

    An analytic methodology, based on the optimal-control pilot model, was demonstrated for assessing longitudinal-axis handling qualities of transport aircraft on final approach. Calibration of the methodology is largely in terms of closed-loop performance requirements, rather than specific vehicle response characteristics, and is based on a combination of published criteria, pilot preferences, physical limitations, and engineering judgment. Six longitudinal-axis approach configurations were studied, covering a range of handling-qualities problems, including the presence of flexible aircraft modes. The analytical procedure was used to obtain predictions of Cooper-Harper ratings, a scalar quadratic performance index, and rms excursions of important system variables.

  8. Cu determination in crude oil distillation products by atomic absorption and inductively coupled plasma mass spectrometry after analyte transfer to aqueous solution

    NASA Astrophysics Data System (ADS)

    Kowalewska, Zofia; Ruszczyńska, Anna; Bulska, Ewa

    2005-03-01

    Cu was determined in a wide range of petroleum products from crude oil distillation using flame atomic absorption spectrometry (FAAS), electrothermal atomic absorption spectrometry (ETAAS) and inductively coupled plasma mass spectrometry (ICP-MS). Different sample preparation procedures were evaluated: (i) mineralization with sulfuric acid in an open system, (ii) mineralization in a closed microwave system, (iii) combustion in a hydrogen-oxygen flame in the Wickbold apparatus, (iv) matrix evaporation followed by acid dissolution, and (v) acidic extraction. All of the above procedures transfer the analyte into an aqueous solution for the analytical measurement step. It was found that the application of FAAS was limited to the analysis of the heaviest petroleum products with high Cu content. In ICP-MS, the use of the internal reference method (with Rh or In as the internal reference element) was required to eliminate matrix effects in the analysis of extracts and of the concentrated solutions of mineralized heavy petroleum products. The detection limits (in original samples) were, respectively, 10, 86, 3.3, 0.9 and 0.4 ng g⁻¹ for procedures (i)-(v) with ETAAS detection, and 10, 78, 1.1 and 0.5 ng g⁻¹ for procedures (i)-(iii) and (v) with ICP-MS detection. The procedures recommended here were validated by recovery experiments, by analysis of certified reference materials, and by comparison of the results obtained for a given sample in different ways. The Cu content in the analyzed samples was: 50-110 ng g⁻¹ in crude oil, < 0.4-6 ng g⁻¹ in gasoline, < 0.5-2 ng g⁻¹ in atmospheric oil, < 6-100 ng g⁻¹ in heavy vacuum oil and 140-300 ng g⁻¹ in distillation residue.
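
    A minimal sketch, with invented count rates, of the internal-reference correction mentioned for the ICP-MS measurements: the analyte signal is rescaled by the suppression of the internal reference element (Rh or In) to compensate for matrix effects. The numbers are assumptions, not data from the study.

        # Hypothetical ICP-MS count rates; the ~20% Rh suppression mimics a heavy matrix.
        cu_raw = 1.50e4        # 63Cu signal in the sample, counts/s
        rh_sample = 8.0e4      # 103Rh spiked into the sample, counts/s
        rh_standard = 1.0e5    # 103Rh in the matrix-free calibration standard, counts/s

        # Scale the Cu signal by the loss of the internal reference signal.
        cu_corrected = cu_raw * rh_standard / rh_sample
        print(f"matrix-corrected Cu signal: {cu_corrected:.3g} counts/s")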

  9. 29 CFR 1607.7 - Use of other validity studies.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... EMPLOYEE SELECTION PROCEDURES (1978) General Principles § 1607.7 Use of other validity studies. A. Validity studies not conducted by the user. Users may, under certain circumstances, support the use of selection... described in test manuals. While publishers of selection procedures have a professional obligation to...

  10. 29 CFR 1607.7 - Use of other validity studies.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... EMPLOYEE SELECTION PROCEDURES (1978) General Principles § 1607.7 Use of other validity studies. A. Validity studies not conducted by the user. Users may, under certain circumstances, support the use of selection... described in test manuals. While publishers of selection procedures have a professional obligation to...

  11. 29 CFR 1607.7 - Use of other validity studies.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... EMPLOYEE SELECTION PROCEDURES (1978) General Principles § 1607.7 Use of other validity studies. A. Validity studies not conducted by the user. Users may, under certain circumstances, support the use of selection... described in test manuals. While publishers of selection procedures have a professional obligation to...

  12. [Validation and verification of microbiology methods].

    PubMed

    Camaró-Sala, María Luisa; Martínez-García, Rosana; Olmos-Martínez, Piedad; Catalá-Cuenca, Vicente; Ocete-Mochón, María Dolores; Gimeno-Cardona, Concepción

    2015-01-01

    Clinical microbiologists should ensure, to the maximum level allowed by scientific and technical development, the reliability of their results. This implies that, in addition to meeting the technical criteria that ensure their validity, tests must be performed under conditions that allow comparable results to be obtained regardless of the laboratory that performs them. In this sense, the use of recognized and accepted reference methods is the most effective tool for providing these guarantees. The activities related to verification and validation of analytical methods have become very important, as there is continuous development and updating of techniques, increasingly complex analytical equipment, and an interest among professionals in ensuring the quality of processes and results. The definitions of validation and verification are described, along with the different types of validation/verification, the types of methods, and the level of validation necessary depending on the degree of standardization. The situations in which validation/verification is mandatory and/or recommended are discussed, including those particularly related to validation in microbiology. The paper stresses the importance of promoting the use of reference strains and standard controls in microbiology, as well as the importance of participation in External Quality Assessment programs to demonstrate technical competence. Emphasis is placed on how to calculate some of the parameters required for validation/verification, such as accuracy and precision. The development of these concepts can be found in the SEIMC microbiological procedure number 48: «Validation and verification of microbiological methods» (www.seimc.org/protocols/microbiology). Copyright © 2013 Elsevier España, S.L.U. y Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.

  13. Determination of Total Carbohydrates in Algal Biomass: Laboratory Analytical Procedure (LAP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Wychen, Stefanie; Laurens, Lieve M. L.

    This procedure uses two-step sulfuric acid hydrolysis to hydrolyze the polymeric forms of carbohydrates in algal biomass into monomeric subunits. The monomers are then quantified by either HPLC or a suitable spectrophotometric method.
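
    A minimal sketch, with assumed HPLC results and sample quantities, of how monomer concentrations are typically converted to a total carbohydrate content after such a hydrolysis; the anhydro corrections (162/180 for hexoses, 132/150 for pentoses) account for the water added when the polymers are hydrolyzed. Exact factors and bookkeeping should be taken from the LAP itself.

        # Hypothetical HPLC results for the hydrolysate, mg/mL.
        monomers = {"glucose": 1.00, "xylose": 0.20}
        hexoses = {"glucose"}     # corrected by 162/180; pentoses by 132/150
        volume_ml = 87.0          # hydrolysate volume (assumed)
        sample_mg = 300.0         # dry biomass weighed into the hydrolysis (assumed)

        total_mg = sum(
            conc * volume_ml * (162 / 180 if sugar in hexoses else 132 / 150)
            for sugar, conc in monomers.items()
        )
        print(f"total carbohydrates = {total_mg / sample_mg:.1%} of dry biomass")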

  14. Field validation of the DNPH method for aldehydes and ketones. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Workman, G.S.; Steger, J.L.

    1996-04-01

    A stationary source emission test method for selected aldehydes and ketones has been validated. The method employs a sampling train with impingers containing 2,4-dinitrophenylhydrazine (DNPH) to derivatize the analytes. The resulting hydrazones are recovered and analyzed by high-performance liquid chromatography. Nine analytes were studied; the method was validated for formaldehyde, acetaldehyde, propionaldehyde, acetophenone and isophorone. Acrolein, methyl ethyl ketone, methyl isobutyl ketone, and quinone did not meet the validation criteria. The study employed the validation techniques described in EPA Method 301, which uses train spiking to determine bias and collocated sampling trains to determine precision. The studies were carried out at a plywood veneer dryer and a polyester manufacturing plant.

  15. 78 FR 56718 - Draft Guidance for Industry on Bioanalytical Method Validation; Availability

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-13

    ...] Draft Guidance for Industry on Bioanalytical Method Validation; Availability AGENCY: Food and Drug... availability of a draft guidance for industry entitled ``Bioanalytical Method Validation.'' The draft guidance is intended to provide recommendations regarding analytical method development and validation for the...

  16. Large-Scale Interlaboratory Study to Develop, Analytically Validate and Apply Highly Multiplexed, Quantitative Peptide Assays to Measure Cancer-Relevant Proteins in Plasma*

    PubMed Central

    Abbatiello, Susan E.; Schilling, Birgit; Mani, D. R.; Zimmerman, Lisa J.; Hall, Steven C.; MacLean, Brendan; Albertolle, Matthew; Allen, Simon; Burgess, Michael; Cusack, Michael P.; Gosh, Mousumi; Hedrick, Victoria; Held, Jason M.; Inerowicz, H. Dorota; Jackson, Angela; Keshishian, Hasmik; Kinsinger, Christopher R.; Lyssand, John; Makowski, Lee; Mesri, Mehdi; Rodriguez, Henry; Rudnick, Paul; Sadowski, Pawel; Sedransk, Nell; Shaddox, Kent; Skates, Stephen J.; Kuhn, Eric; Smith, Derek; Whiteaker, Jeffery R.; Whitwell, Corbin; Zhang, Shucha; Borchers, Christoph H.; Fisher, Susan J.; Gibson, Bradford W.; Liebler, Daniel C.; MacCoss, Michael J.; Neubert, Thomas A.; Paulovich, Amanda G.; Regnier, Fred E.; Tempst, Paul; Carr, Steven A.

    2015-01-01

    -MRM-MS using stable isotope dilution, with appropriate attention to analytical validation and appropriate quality control measures, enables sensitive, specific, reproducible, and quantitative measurements of proteins and peptides in complex biological matrices such as plasma. PMID:25693799

  17. Towards objective hand hygiene technique assessment: validation of the ultraviolet-dye-based hand-rubbing quality assessment procedure.

    PubMed

    Lehotsky, Á; Szilágyi, L; Bánsághi, S; Szerémy, P; Wéber, G; Haidegger, T

    2017-09-01

    Ultraviolet-spectrum markers are widely used for hand hygiene quality assessment, although their microbiological validation has not been established. A microbiology-based assessment of the procedure was therefore conducted. Twenty-five artificial hand models underwent initial full contamination and disinfection with UV-dyed hand-rub solution; digital imaging under UV light, microbiological sampling and cultivation, and digital imaging of the cultivated flora were then performed. Paired images of each hand model were registered by a software tool, and the UV-marked regions were compared with the pathogen-free sites pixel by pixel. Statistical evaluation revealed that the method indicates correctly disinfected areas with 95.05% sensitivity and 98.01% specificity. Copyright © 2017 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.
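
    A minimal sketch, with simulated masks rather than real images, of the pixel-by-pixel comparison described above: the UV-marked region is treated as the prediction and the pathogen-free region from cultivation as the reference, from which sensitivity and specificity follow.

        import numpy as np

        rng = np.random.default_rng(0)
        uv_marked = rng.random((100, 100)) > 0.3   # True = pixel covered by dyed hand rub
        germ_free = uv_marked.copy()
        flip = rng.random((100, 100)) < 0.03       # simulate 3% disagreement between methods
        germ_free[flip] = ~germ_free[flip]

        tp = np.sum(uv_marked & germ_free)         # marked and actually pathogen-free
        fn = np.sum(~uv_marked & germ_free)        # pathogen-free but unmarked
        tn = np.sum(~uv_marked & ~germ_free)       # unmarked and still contaminated
        fp = np.sum(uv_marked & ~germ_free)        # marked but contaminated

        print(f"sensitivity = {tp / (tp + fn):.1%}")
        print(f"specificity = {tn / (tn + fp):.1%}")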

  18. Determination of Total Solids and Ash in Algal Biomass: Laboratory Analytical Procedure (LAP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Wychen, Stefanie; Laurens, Lieve M. L.

    2016-01-13

    This procedure describes the methods used to determine the amount of moisture or total solids present in a freeze-dried algal biomass sample, as well as the ash content. A traditional convection-oven drying procedure is covered for total solids content, and a dry oxidation method at 575 °C is covered for ash content.
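
    A minimal sketch, with invented weights, of the gravimetric arithmetic behind this LAP: total solids from the oven-drying step and ash from the 575 °C oxidation step. The reporting bases are assumptions; the LAP defines the exact ones.

        # Hypothetical crucible weighings, grams.
        tare = 1.0000             # empty crucible
        wet = tare + 0.5000       # crucible + sample as received
        dry = tare + 0.4650       # after convection-oven drying
        ashed = tare + 0.0120     # after dry oxidation at 575 °C

        total_solids = (dry - tare) / (wet - tare)   # fraction of sample as received
        ash = (ashed - tare) / (dry - tare)          # fraction of dry weight
        print(f"total solids = {total_solids:.1%}, ash = {ash:.1%}")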

  19. STANDARD OPERATING PROCEDURE FOR QUALITY ASSURANCE IN ANALYTICAL CHEMISTRY METHODS DEVELOPMENT

    EPA Science Inventory

    The Environmental Protection Agency's (EPA) Office of Research and Development (ORD) is engaged in the development, demonstration, and validation of new or newly adapted methods of analysis for environmentally related samples. Recognizing that a "one size fits all" approach to qu...

  20. Selecting Statistical Procedures for Quality Control Planning Based on Risk Management.

    PubMed

    Yago, Martín; Alcover, Silvia

    2016-07-01

    According to the traditional approach to statistical QC planning, the performance of QC procedures is assessed in terms of the probability of rejecting an analytical run that contains critical-size errors (PEDC). Recently, the maximum expected increase in the number of unacceptable patient results reported during the presence of an undetected out-of-control error condition [Max E(NUF)] has been proposed as an alternative QC performance measure because it relates more directly to the current introduction of risk management concepts for QC planning in the clinical laboratory. We used a statistical model to investigate the relationship between PEDC and Max E(NUF) for simple QC procedures widely used in clinical laboratories, and to construct charts relating Max E(NUF) to the capability of the analytical process that allow QC planning based on the risk of harm to a patient due to the reporting of erroneous results. A QC procedure shows nearly the same Max E(NUF) value when used for controlling analytical processes with the same capability, and there is a close relationship between PEDC and Max E(NUF) for simple QC procedures; therefore, the value of PEDC can be estimated from the value of Max E(NUF) and vice versa. QC procedures selected for their high PEDC value are also characterized by a low value of Max E(NUF). The PEDC value can thus be used for estimating the probability of patient harm, allowing appropriate QC procedures to be selected in QC planning based on risk management. © 2016 American Association for Clinical Chemistry.
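
    As a hedged illustration of the kind of calculation behind a rejection-probability measure such as PEDC, the sketch below evaluates a simple 1-3s rule (reject the run if any of n control results falls outside ±3 SD) for an assumed critical systematic error; the rule, sigma value, and numbers are assumptions for illustration, not the paper's model.

        from scipy.stats import norm

        def p_reject_1_3s(shift_sd, n_controls=2):
            """Probability that at least one of n controls exceeds the +/-3 SD
            limits when a systematic shift of `shift_sd` (in SD units) is present."""
            p_within = norm.cdf(3 - shift_sd) - norm.cdf(-3 - shift_sd)
            return 1 - p_within ** n_controls

        # Critical systematic error for an assumed 4-sigma process: 4 - 1.65 SD.
        se_crit = 4 - 1.65
        print(f"P(reject | critical error) = {p_reject_1_3s(se_crit):.2f}")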

  1. SRC-I demonstration plant analytical laboratory methods manual. Final technical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klusaritz, M.L.; Tewari, K.C.; Tiedge, W.F.

    1983-03-01

    This manual is a compilation of analytical procedures required for operation of a Solvent-Refined Coal (SRC-I) demonstration or commercial plant. Each method reproduced in full includes a detailed procedure, a list of equipment and reagents, safety precautions, and, where possible, a precision statement. Procedures for the laboratory's environmental and industrial hygiene modules are not included. Required American Society for Testing and Materials (ASTM) methods are cited, and ICRC's suggested modifications to these methods for handling coal-derived products are provided.

  2. PFOA and PFOS: Analytics

    EPA Pesticide Factsheets

    This presentation describes the drivers for the development of Method 537, the extraction and analytical procedure, performance data, holding-time data, and detection limits. The purpose of this presentation is to provide an overview of EPA drinking water Method 537 to the U.S. EPA Drinking Water Workshop participants.

  3. A Complete Validated Learning Analytics Framework: Designing Issues from Data Preparation Perspective

    ERIC Educational Resources Information Center

    Tlili, Ahmed; Essalmi, Fathi; Jemni, Mohamed; Kinshuk; Chen, Nian-Shing

    2018-01-01

    With the rapid growth of online education in recent years, Learning Analytics (LA) has gained increasing attention from researchers and educational institutions as an area which can improve the overall effectiveness of learning experiences. However, the lack of guidelines on what should be taken into consideration during application of LA hinders…

  4. Simulation validation and management

    NASA Astrophysics Data System (ADS)

    Illgen, John D.

    1995-06-01

    Illgen Simulation Technologies, Inc. (ISTI), has been working on interactive verification and validation programs for the past six years. As a result, it has evolved a methodology that has been adopted and successfully implemented by a number of different verification and validation programs. This methodology employs computer-assisted software engineering (CASE) tools to reverse-engineer source code and produce analytical outputs (flow charts and tables) that aid the engineer/analyst in the verification and validation process. We have found that the use of CASE tools saves time, which equates to improvements in both schedule and cost. This paper describes the ISTI-developed methodology and how CASE tools are used in its support. Case studies are also discussed.

  5. On the relationship between the causal-inference and meta-analytic paradigms for the validation of surrogate endpoints.

    PubMed

    Alonso, Ariel; Van der Elst, Wim; Molenberghs, Geert; Buyse, Marc; Burzykowski, Tomasz

    2015-03-01

    The increasing cost of drug development has raised the demand for surrogate endpoints when evaluating new drugs in clinical trials. However, over the years, it has become clear that surrogate endpoints need to be statistically evaluated and deemed valid before they can be used as substitutes for "true" endpoints in clinical studies. Nowadays, two paradigms, based on causal inference and meta-analysis, dominate the scene. Nonetheless, although the literature emanating from these paradigms is wide, the relationship between them has until now largely been left unexplored. In the present work, we discuss the conceptual framework underlying both approaches and study the relationship between them using theoretical elements and the analysis of a real case study. Furthermore, we show that the meta-analytic approach can be embedded within a causal-inference framework, and we heuristically justify why surrogate endpoints successfully evaluated using the meta-analytic approach will often be appealing from a causal-inference perspective as well. A newly developed and user-friendly R package, Surrogate, is provided to carry out the evaluation exercise. © 2014, The International Biometric Society.

  6. Nonequilibrium chemistry boundary layer integral matrix procedure

    NASA Technical Reports Server (NTRS)

    Tong, H.; Buckingham, A. C.; Morse, H. L.

    1973-01-01

    The development of an analytic procedure for the calculation of nonequilibrium boundary-layer flows over surfaces of arbitrary catalycity is described. An existing equilibrium boundary layer integral matrix code was extended to include nonequilibrium chemistry while retaining all of the general boundary-condition features built into the original code. For particular application to the pitch plane of shuttle-type vehicles, an approximate procedure was developed to estimate the nonequilibrium and nonisentropic state at the edge of the boundary layer.

  7. Validation study and routine control monitoring of moist heat sterilization procedures.

    PubMed

    Shintani, Hideharu

    2012-06-01

    The proposed approach to validation of steam sterilization in autoclaves follows the basic life-cycle concepts applicable to all validation programs: understand the function of the sterilization process, develop and understand the cycles that carry out the process, and define a suitable test or series of tests to confirm that the function of the process is suitably ensured by the structure provided. Sterilization of product, and of components and parts that come into direct contact with sterilized product, is the most critical of pharmaceutical processes. Consequently, this process requires a most rigorous and detailed approach to validation. An understanding of the process requires a basic understanding of microbial death, the parameters that govern that death, the accepted definition of sterility, and the relationship between that definition and the sterilization parameters. Autoclaves and support systems need to be designed, installed, and qualified in a manner that ensures their continued reliability. Lastly, the test program must be complete and definitive. In this paper, in addition to the validation study, the documentation of IQ, OQ, and PQ is described in concrete terms.
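
    A minimal sketch, with assumed probe readings and the conventional reference values (121.1 °C, z = 10 °C), of the physical lethality (F0) summation that typically underlies moist-heat cycle qualification; acceptance limits and reference parameters must come from the validated cycle itself.

        # F0 = sum over time of 10 ** ((T - 121.1) / z) * dt, assumed z = 10 °C.
        temps_c = [100.0, 110.0, 118.0, 121.0, 121.5, 121.3, 119.0, 112.0]
        dt_min = 0.5   # sampling interval of the temperature probe, minutes
        z = 10.0

        f0 = sum(10 ** ((t - 121.1) / z) * dt_min for t in temps_c)
        print(f"F0 = {f0:.2f} equivalent minutes at 121.1 °C")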

  8. Development of analytical cell support for vitrification at the West Valley Demonstration Project. Topical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barber, F.H.; Borek, T.T.; Christopher, J.Z.

    1997-12-01

    Analytical and Process Chemistry (A&PC) support is essential to the high-level waste vitrification campaign at the West Valley Demonstration Project (WVDP). A&PC characterizes the waste, providing information necessary to formulate the recipe for the target radioactive glass product. High-level waste (HLW) samples are prepared and analyzed in the analytical cells (ACs) and Sample Storage Cell (SSC) on the third floor of the main plant. The high levels of radioactivity in the samples require handling them in the shielded cells with remote manipulators. The analytical hot cells and third-floor laboratories were refurbished to ensure optimal uninterrupted operation during the vitrification campaign. New and modified instrumentation, tools, sample preparation and analysis techniques, equipment, and training were required for A&PC to support vitrification. Analytical Cell Mockup Units (ACMUs) were designed to facilitate method development, scientist and technician training, and planning for analytical process flow. The ACMUs were fabricated and installed to simulate the analytical cell environment and dimensions. New techniques, equipment, and tools could be evaluated in the ACMUs without the consequences of generating or handling radioactive waste. Tools were fabricated, handling and disposal of wastes was addressed, and spatial arrangements for equipment were refined. As a result of the work at the ACMUs, the remote preparation and analysis methods, equipment, and tools were ready for installation into the ACs and SSC in July 1995. Before use in the hot cells, all remote methods had been validated and four to eight technicians were trained on each. Fine-tuning of the procedures has been ongoing at the ACs based on input from A&PC technicians. Working at the ACs presents greater challenges than development at the ACMUs did. The ACMU work and further refinements in the ACs have resulted in a reduction in analysis turnaround time (TAT).

  9. Automated Predictive Big Data Analytics Using Ontology Based Semantics.

    PubMed

    Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A

    2015-10-01

    Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm), and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise, which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as in documenting the rationale for the techniques and models selected. To formally describe the modeling techniques, models, and results, we developed the Analytics Ontology, which supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology.

  10. Automated Predictive Big Data Analytics Using Ontology Based Semantics

    PubMed Central

    Nural, Mustafa V.; Cotterell, Michael E.; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A.

    2017-01-01

    Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm), and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise, which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as in documenting the rationale for the techniques and models selected. To formally describe the modeling techniques, models, and results, we developed the Analytics Ontology, which supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology. PMID:29657954

  11. Review of Pre-Analytical Errors in Oral Glucose Tolerance Testing in a Tertiary Care Hospital.

    PubMed

    Nanda, Rachita; Patel, Suprava; Sahoo, Sibashish; Mohapatra, Eli

    2018-03-13

    The pre-pre-analytical and pre-analytical phases account for a major share of laboratory errors. This study considered a very common procedure, the oral glucose tolerance test, to identify pre-pre-analytical errors. Quality indicators provide evidence of quality, support accountability, and help laboratory personnel in decision making. The aim of this research was to evaluate the pre-analytical performance of the oral glucose tolerance test procedure. An observational study was conducted over a period of three months in the phlebotomy and accessioning unit of our laboratory, using a questionnaire that examined pre-pre-analytical errors through a scoring system. The pre-analytical phase was analyzed for each sample collected, as per seven quality indicators. About 25% of the population answered incorrectly the question that tested knowledge of patient preparation. QI-1, the appropriateness of the test result, had the highest error rate. Although QI-5, for sample collection, had a low error rate, it is a very important indicator, as any wrongly collected sample can alter the test result. Evaluating the pre-analytical and pre-pre-analytical phases is essential and must be conducted routinely on a yearly basis to identify errors, take corrective action, and facilitate the gradual introduction of such evaluation into routine practice.
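
    A minimal sketch, with invented tallies and illustrative indicator names (not the study's exact definitions), of how per-indicator error rates of the kind reported above can be summarized from an audit.

        samples_audited = 480
        qi_errors = {                       # hypothetical error counts per indicator
            "QI-1 appropriateness of the test result": 37,
            "QI-5 sample collection": 4,
            "QI-7 sample transport": 11,
        }
        for qi, n in qi_errors.items():
            print(f"{qi}: {n}/{samples_audited} = {n / samples_audited:.1%}")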

  12. Progress and development of analytical methods for gibberellins.

    PubMed

    Pan, Chaozhi; Tan, Swee Ngin; Yong, Jean Wan Hong; Ge, Liya

    2017-01-01

    Gibberellins, as a group of phytohormones, exhibit a wide variety of bio-functions in plant growth and development and have been used to increase crop yields. Many analytical procedures, therefore, have been developed for determining the types and levels of endogenous and exogenous gibberellins. As plant tissues contain gibberellins in trace amounts (usually at the level of nanograms per gram fresh weight or even lower), the sample pre-treatment steps (extraction, pre-concentration, and purification) for gibberellins are reviewed in detail. The primary focus of this comprehensive review is on the various analytical methods designed to meet the requirements of gibberellin analyses in complex matrices, with particular emphasis on high-throughput analytical methods, such as gas chromatography, liquid chromatography, and capillary electrophoresis, mostly combined with mass spectrometry. The advantages and drawbacks of each described analytical method are discussed. The overall aim of this review is to provide a comprehensive and critical view of the different analytical methods nowadays employed to analyze gibberellins in complex sample matrices, and of their foreseeable trends. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. L-shaped piezoelectric motor--part II: analytical modeling.

    PubMed

    Avirovik, Dragan; Karami, M Amin; Inman, Daniel; Priya, Shashank

    2012-01-01

    This paper develops an analytical model for an L-shaped piezoelectric motor. The motor structure has been described in detail in Part I of this study. The coupling of the bending vibration modes of the bimorphs results in an elliptical motion at the tip. The emphasis of this paper is on the development of a precise analytical model that can predict the dynamic behavior of the motor based on its geometry. The motor was first modeled mechanically to identify the natural frequencies and mode shapes of the structure. Next, an electromechanical model of the motor was developed to take the piezoelectric effect into account, and the dynamics of the L-shaped piezoelectric motor were obtained as a function of voltage and frequency. Finally, the analytical model was validated by comparison with experimental results and the finite element method (FEM). © 2012 IEEE.

  14. 21 CFR 606.100 - Standard operating procedures.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... or hepatitis C virus (HCV) infection when tested under § 610.40 of this chapter, or when a blood... viral clearance procedures; (iii) To notify consignees to quarantine in-date blood and blood components... are manufactured using validated viral clearance procedures; (iv) To determine the suitability for...

  15. Integrating Water Quality and River Rehabilitation Management - A Decision-Analytical Perspective

    NASA Astrophysics Data System (ADS)

    Reichert, P.; Langhans, S.; Lienert, J.; Schuwirth, N.

    2009-04-01

    Integrative river management involves difficult decisions about alternative measures to improve the ecological state of rivers. For this reason, it seems useful to apply knowledge from the decision sciences to support river management. We discuss how decision-analytical elements can be employed to design an integrated river management procedure. An important aspect of this procedure is to clearly separate scientific predictions of the consequences of alternatives from the objectives to be achieved by river management. The key elements of the suggested procedure are (i) the quantitative elicitation of objectives from different stakeholder groups, (ii) the compilation of the current scientific knowledge about the consequences of the effects of the suggested measures in the form of a probabilistic mathematical model, and (iii) the use of these predictions and valuations to prioritize alternatives, uncover conflicting objectives, support the design of better alternatives, and improve the transparency of communication about the chosen management strategy. The development of this procedure led to insights regarding the steps necessary for rational decision-making in river management, to guidelines on the use of decision-analytical techniques for performing these steps, and to new insights about the application of decision-analytical techniques in general. In particular, consideration of the spatial distribution of the effects of measures and the potential added value of connected rehabilitated river reaches favors measures that have a positive effect beyond a single river reach. As these effects only propagate within the river network, a river-basin-oriented management concept emerges as a consequence of a rational decision support procedure, rather than as an a priori management paradigm. There are also limitations to the support that can be expected from the decision-analytical perspective. It will not provide the

  16. Analytic theory of orbit contraction

    NASA Technical Reports Server (NTRS)

    Vinh, N. X.; Longuski, J. M.; Busemann, A.; Culp, R. D.

    1977-01-01

    The motion of a satellite in orbit subject to atmospheric forces and the motion of a reentry vehicle are both governed by gravitational and aerodynamic forces. This suggests the derivation of a uniform set of equations applicable to both cases. For the case of satellite motion, by a proper transformation and by the method of averaging, a technique appropriate for long-duration flight, the classical nonlinear differential equation describing the contraction of the major axis is derived. A rigorous analytic solution is used to integrate this equation with a high degree of accuracy, using Poincaré's method of small parameters and Lagrange's expansion to express the major axis explicitly as a function of the eccentricity. The solution is uniformly valid for moderate and small eccentricities. For highly eccentric orbits, the asymptotic equation is derived directly from the general equation. Numerical solutions were generated to display the accuracy of the analytic theory.

  17. External Standards or Standard Addition? Selecting and Validating a Method of Standardization

    NASA Astrophysics Data System (ADS)

    Harvey, David T.

    2002-05-01

    A common feature of many problem-based laboratories in analytical chemistry is a lengthy independent project involving the analysis of "real-world" samples. Students research the literature, adapting and developing a method suitable for their analyte, sample matrix, and problem scenario. Because these projects encompass the complete analytical process, students must consider issues such as obtaining a representative sample, selecting a method of analysis, developing a suitable standardization, validating results, and implementing appropriate quality assessment/quality control practices. Most textbooks and monographs suitable for an undergraduate course in analytical chemistry, however, provide only limited coverage of these important topics. The need for short laboratory experiments emphasizing important facets of method development, such as selecting a method of standardization, is evident. The experiment reported here, which is suitable for an introductory course in analytical chemistry, illustrates the importance of matrix effects when selecting a method of standardization. Students also learn how a spike recovery is used to validate an analytical method, and they gain practical experience with the difference between performing an external standardization and a standard addition.
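
    A minimal sketch, with invented signals and concentrations, contrasting the two tasks in this experiment: determining a concentration by standard addition (extrapolating the fitted line to its x-intercept) and validating the method with a spike recovery.

        import numpy as np

        # Standard addition: equal sample aliquots spiked with increasing analyte.
        added = np.array([0.0, 1.0, 2.0, 3.0])           # added concentration, ppm
        signal = np.array([0.212, 0.411, 0.607, 0.805])  # instrument response

        slope, intercept = np.polyfit(added, signal, 1)
        c_sample = intercept / slope          # magnitude of the x-intercept
        print(f"sample concentration ~ {c_sample:.2f} ppm")

        # Spike recovery: re-analyze after a known 2.00 ppm spike.
        c_found_spiked = 3.05                 # ppm, hypothetical measurement
        recovery = (c_found_spiked - c_sample) / 2.00 * 100
        print(f"spike recovery = {recovery:.0f}%  (near 100% supports the method)")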

  18. An analytical model of leakage neutron equivalent dose for passively-scattered proton radiotherapy and validation with measurements.

    PubMed

    Schneider, Christopher; Newhauser, Wayne; Farah, Jad

    2015-05-18

    Exposure to stray neutrons increases the risk of second cancer development after proton therapy. Previously reported analytical models of this exposure were difficult to configure and had not been investigated below 100 MeV proton energy. The purposes of this study were to test an analytical model of neutron equivalent dose per therapeutic absorbed dose (H/D) at 75 MeV and to improve the model by reducing the number of configuration parameters and making it continuous in proton energy from 100 to 250 MeV. To develop the analytical model, we used previously published H/D values in water from Monte Carlo simulations of a general-purpose beamline for proton energies from 100 to 250 MeV. We also configured and tested the model on in-air neutron equivalent doses measured for a 75 MeV ocular beamline. Predicted H/D values from the analytical model and Monte Carlo simulations agreed well from 100 to 250 MeV (10% average difference). Predicted H/D values from the analytical model also agreed well with measurements at 75 MeV (15% average difference). The results indicate that analytical models can give fast, reliable calculations of neutron exposure after proton therapy. This ability is absent in treatment planning systems but vital to second cancer risk estimation.
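
    A minimal sketch, with invented H/D values, of the agreement metric quoted above: the average relative difference between model predictions and reference (Monte Carlo or measured) values.

        import numpy as np

        h_d_model = np.array([2.1, 1.6, 1.2, 0.9])   # H/D from the analytical model, mSv/Gy
        h_d_ref = np.array([2.3, 1.5, 1.1, 1.0])     # reference H/D values, mSv/Gy

        rel_diff = np.abs(h_d_model - h_d_ref) / h_d_ref
        print(f"average relative difference = {rel_diff.mean():.0%}")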

  19. The Importance of Method Selection in Determining Product Integrity for Nutrition Research.

    PubMed

    Mudge, Elizabeth M; Betz, Joseph M; Brown, Paula N

    2016-03-01

    The American Herbal Products Association estimates that there are as many as 3000 plant species in commerce. The FDA estimates that there are about 85,000 dietary supplement products in the marketplace. The pace of product innovation far exceeds that of analytical method development and validation; new ingredients, matrixes, and combinations have resulted in an analytical community that has been unable to keep up. This has led to a lack of validated analytical methods for dietary supplements and to inappropriate method selection where methods do exist. Only after rigorous validation procedures to ensure that methods are fit for purpose should they be used in a routine setting to verify product authenticity and quality. By following systematic procedures and establishing performance requirements for analytical methods before method development and validation, methods can be developed that are both valid and fit for purpose. This review summarizes advances in method selection, development, and validation regarding herbal supplement analysis and provides several documented examples of inappropriate method selection and application. © 2016 American Society for Nutrition.

  20. Differential Validation of a Path Analytic Model of University Dropout.

    ERIC Educational Resources Information Center

    Winteler, Adolf

    Tinto's conceptual schema of college dropout forms the theoretical framework for the development of a model of university student dropout intention. This study validated Tinto's model in two different departments within a single university. Analyses were conducted on a sample of 684 college freshmen in the Education and Economics Department. A…