Sample records for factor analytic procedure

  1. The transfer of analytical procedures.

    PubMed

    Ermer, J; Limberger, M; Lis, K; Wätzig, H

    2013-11-01

    Analytical method transfers are certainly among the most discussed topics in the GMP-regulated sector. However, they are surprisingly little regulated in detail. General information is provided by USP, WHO, and ISPE in particular. Most recently, the EU emphasized the importance of analytical transfer by including it in the draft of the revised GMP Guideline. In this article, an overview and comparison of these guidelines is provided. The key to success for method transfers is excellent communication between the sending and receiving units. To facilitate this communication, procedures, flow charts, and checklists for responsibilities, success factors, transfer categories, the transfer plan and report, strategies in case of failed transfers, and tables with acceptance limits are provided here, together with a comprehensive glossary. Potential pitfalls are described so that they can be avoided. In order to assure an efficient and sustainable transfer of analytical procedures, a practically relevant and scientifically sound evaluation with corresponding acceptance criteria is crucial. Various strategies and statistical tools, such as significance tests, absolute acceptance criteria, and equivalence tests, are thoroughly described and compared in detail with examples. Significance tests should be avoided: the success criterion is not statistical significance but analytical relevance. Depending on a risk assessment of the analytical procedure in question, statistical equivalence tests are recommended, because they include both a practically relevant acceptance limit and direct control of the statistical risks. However, for lower-risk procedures, a simple comparison of the transfer performance parameters to absolute limits is also regarded as sufficient. Copyright © 2013 Elsevier B.V. All rights reserved.
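
    The abstract recommends equivalence testing over significance testing for transfer evaluation. As an illustration only, the sketch below runs a generic two one-sided t-tests (TOST) comparison of a sending and a receiving laboratory against a ±2% acceptance limit; the limit, the data, and alpha are invented assumptions, not values from the article.

      # Hedged sketch: two one-sided t-tests (TOST) for an analytical method transfer.
      # The acceptance limit (+/-2% of the sending-lab mean) and the data are
      # illustrative assumptions, not values taken from the article.
      import numpy as np
      from scipy import stats

      sending   = np.array([99.8, 100.2, 99.9, 100.1, 100.0, 99.7])   # % label claim
      receiving = np.array([100.4, 100.1, 100.6, 100.3, 100.2, 100.5])

      theta = 0.02 * sending.mean()          # equivalence margin: +/-2% (assumption)
      diff = receiving.mean() - sending.mean()
      n1, n2 = len(sending), len(receiving)
      sp2 = ((n1 - 1) * sending.var(ddof=1) + (n2 - 1) * receiving.var(ddof=1)) / (n1 + n2 - 2)
      se = np.sqrt(sp2 * (1 / n1 + 1 / n2))
      df = n1 + n2 - 2

      t_lower = (diff + theta) / se          # H0: diff <= -theta
      t_upper = (diff - theta) / se          # H0: diff >= +theta
      p_lower = 1 - stats.t.cdf(t_lower, df)
      p_upper = stats.t.cdf(t_upper, df)
      p_tost = max(p_lower, p_upper)         # both one-sided tests must reject

      print(f"mean difference = {diff:.3f}, margin = +/-{theta:.3f}")
      print(f"TOST p-value = {p_tost:.4f} -> equivalent at alpha=0.05: {p_tost < 0.05}")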

  2. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    PubMed

    Płotka-Wasylka, J

    2018-05-01

    A new means of assessing analytical protocols with respect to green analytical chemistry attributes has been developed. The new tool, called the Green Analytical Procedure Index (GAPI), evaluates the green character of an entire analytical methodology, from sample collection to final determination. It was created using tools such as the National Environmental Methods Index (NEMI) and the Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams is used to evaluate and quantify the environmental impact of each step of an analytical methodology, colored from green through yellow to red to depict low, medium, and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples and in polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible overview to the user/reader but also offers exhaustive information on the evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.

  3. Laboratory Analytical Procedures | Bioenergy | NREL

    Science.gov Websites

    ... analytical procedures (LAPs) to provide validated methods for biofuels and pyrolysis bio-oils research. Biomass Compositional Analysis: these lab procedures provide tested and accepted methods for performing ...

  4. Analytical procedure validation and the quality by design paradigm.

    PubMed

    Rozet, Eric; Lebrun, Pierre; Michiels, Jean-François; Sondag, Perceval; Scherder, Tara; Boulanger, Bruno

    2015-01-01

    Since the adoption of the ICH Q8 document concerning the development of pharmaceutical processes following a quality by design (QbD) approach, there have been many discussions on the opportunity for analytical procedure development to follow a similar approach. While the development and optimization of analytical procedures following QbD principles have been largely discussed and described, the place of analytical procedure validation in this framework has not been clarified. This article aims to show that analytical procedure validation is fully integrated into the QbD paradigm and is an essential step in developing analytical procedures that are effectively fit for purpose. Adequate statistical methodologies also have their role to play, such as design of experiments, statistical modeling, and probabilistic statements. The outcome of analytical procedure validation is also an analytical procedure design space, from which a control strategy can be set.

  5. 40 CFR 140.5 - Analytical procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 140.5 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS (CONTINUED) MARINE SANITATION DEVICE STANDARD § 140.5 Analytical procedures. In determining the composition and quality of effluent discharge from marine sanitation devices, the procedures contained in 40 CFR part 136...

  6. Multicriteria decision analysis in ranking of analytical procedures for aldrin determination in water.

    PubMed

    Tobiszewski, Marek; Orłowski, Aleksander

    2015-03-27

    The study presents the possibility of applying multi-criteria decision analysis (MCDA) when choosing analytical procedures with low environmental impact. A type of MCDA, the Preference Ranking Organization Method for Enrichment Evaluations (PROMETHEE), was chosen as a versatile tool that meets the requirements of analytical chemists as decision makers. Twenty-five analytical procedures for aldrin determination in water samples (as an example) were selected as input alternatives for the MCDA. Nine criteria describing the alternatives were chosen from different groups: metrological, economic and, most importantly, environmental impact. The weights for each criterion were obtained from questionnaires sent to experts, giving three different scenarios for the MCDA results. The results of the analysis show that PROMETHEE is a very promising tool for choosing an analytical procedure with respect to its greenness. The rankings for all three scenarios placed solid-phase microextraction and liquid-phase microextraction based procedures high, while liquid-liquid extraction, solid-phase extraction and stir bar sorptive extraction based procedures were placed low in the ranking. The results show that although some of the experts do not intentionally choose green analytical chemistry procedures, their MCDA choices are in accordance with green chemistry principles. The PROMETHEE ranking results were compared with more widely accepted green analytical chemistry tools, NEMI and the Eco-Scale. As PROMETHEE involved more factors than NEMI, the assessment results were only weakly correlated. In contrast, the results of the Eco-Scale assessment were well correlated, as both methodologies involve similar assessment criteria. Copyright © 2015 Elsevier B.V. All rights reserved.
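
    For readers unfamiliar with PROMETHEE, the sketch below computes PROMETHEE II net outranking flows for a toy decision matrix with the simple "usual" preference function; the alternatives, criteria, weights, and criterion directions are invented for illustration and are not the 25 procedures or expert weights evaluated in the study.

      # Hedged sketch of a PROMETHEE II ranking with the "usual" preference function.
      # The decision matrix, weights, and criterion directions are invented examples,
      # not the 25 aldrin procedures or expert-derived weights from the study.
      import numpy as np

      # rows = alternatives (analytical procedures), columns = criteria
      X = np.array([[0.5, 12.0, 85.0],     # e.g. LOD, solvent volume, recovery
                    [0.2, 30.0, 90.0],
                    [0.8,  5.0, 80.0],
                    [0.4, 18.0, 95.0]])
      weights = np.array([0.4, 0.3, 0.3])          # must sum to 1
      maximize = np.array([False, False, True])    # minimize LOD and solvent, maximize recovery

      n, m = X.shape
      phi_plus = np.zeros(n)
      phi_minus = np.zeros(n)
      for a in range(n):
          for b in range(n):
              if a == b:
                  continue
              # usual preference function: 1 if a is strictly better than b on criterion j
              better = np.where(maximize, X[a] > X[b], X[a] < X[b]).astype(float)
              pi_ab = np.dot(weights, better)      # aggregated preference of a over b
              phi_plus[a] += pi_ab / (n - 1)
              phi_minus[b] += pi_ab / (n - 1)

      phi_net = phi_plus - phi_minus               # PROMETHEE II net flow
      ranking = np.argsort(-phi_net)
      print("net flows:", np.round(phi_net, 3))
      print("ranking (best to worst alternative index):", ranking)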

  7. Validation of an advanced analytical procedure applied to the measurement of environmental radioactivity.

    PubMed

    Thanh, Tran Thien; Vuong, Le Quang; Ho, Phan Long; Chuong, Huynh Dinh; Nguyen, Vo Hoang; Tao, Chau Van

    2018-04-01

    In this work, an advanced analytical procedure was applied to calculate radioactivity in spiked water samples in a close-geometry gamma spectroscopy setup. It included the MCNP-CP code in order to calculate the coincidence summing correction factor (CSF). The CSF results were validated against a deterministic method using the ETNA code for both p-type HPGe detectors, showing good agreement between the two codes. Finally, the validity of the developed procedure was confirmed by a proficiency test in which the activities of various radionuclides were calculated. The radioactivity measurements obtained with both detectors using the advanced analytical procedure received "Accepted" status in the proficiency test. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. Orthogonal Higher Order Structure of the WISC-IV Spanish Using Hierarchical Exploratory Factor Analytic Procedures

    ERIC Educational Resources Information Center

    McGill, Ryan J.; Canivez, Gary L.

    2016-01-01

    As recommended by Carroll, the present study examined the factor structure of the Wechsler Intelligence Scale for Children-Fourth Edition Spanish (WISC-IV Spanish) normative sample using higher order exploratory factor analytic techniques not included in the WISC-IV Spanish Technical Manual. Results indicated that the WISC-IV Spanish subtests were…

  9. Digital forensics: an analytical crime scene procedure model (ACSPM).

    PubMed

    Bulbul, Halil Ibrahim; Yavuzcan, H Guclu; Ozel, Mesut

    2013-12-10

    In order to ensure that digital evidence is collected, preserved, examined, or transferred in a manner that safeguards its accuracy and reliability, law enforcement and digital forensic units must establish and maintain an effective quality assurance system. The very first part of this system is standard operating procedures (SOPs) and/or models, conforming to chain-of-custody requirements, that rely on the digital forensics "process-phase-procedure-task-subtask" sequence. An acceptable and thorough digital forensics (DF) process depends on sequential DF phases, each phase depends on sequential DF procedures, and each procedure in turn depends on tasks and subtasks. There are numerous DF process models in the literature that define DF phases, but no DF model has been identified that defines the phase-based sequential procedures for the crime scene. The analytical crime scene procedure model (ACSPM) that we suggest in this paper is intended to fill this gap. The proposed analytical procedure model for digital investigations at a crime scene is developed and defined for crime scene practitioners, with the main focus on crime scene digital forensic procedures rather than on the whole digital investigation process and phases that end up in a court. When reviewing the relevant literature and consulting with law enforcement agencies, only device-based charts specific to a particular device and/or more general approaches to digital evidence management models from crime scene to court were found. After analyzing the needs of law enforcement organizations and realizing the absence of a crime scene digital investigation procedure model for crime scene activities, we decided to inspect the relevant literature in an analytical way. The outcome of this inspection is the suggested model explained here, which is intended to provide guidance for thorough and secure implementation of digital forensic procedures at a crime scene. In digital forensic

  10. Assessment of passive drag in swimming by numerical simulation and analytical procedure.

    PubMed

    Barbosa, Tiago M; Ramos, Rui; Silva, António J; Marinho, Daniel A

    2018-03-01

    The aim was to compare passive drag during underwater gliding obtained by numerical simulation and by an analytical procedure. An Olympic swimmer was scanned by computed tomography and modelled gliding at a 0.75-m depth in the streamlined position. Steady-state computational fluid dynamics (CFD) analyses were performed in Fluent. A set of analytical procedures was selected concurrently. Friction drag (Df), pressure drag (Dpr), total passive drag force (Df+pr) and drag coefficient (CD) were computed between 1.3 and 2.5 m·s⁻¹ by both techniques. Df+pr ranged from 45.44 to 144.06 N with CFD and from 46.03 to 167.06 N with the analytical procedure (differences: 1.28% to 13.77%). CD ranged between 0.698 and 0.622 by CFD and between 0.657 and 0.644 by the analytical procedures (differences: 0.40-6.30%). Linear regression models showed a very high association for Df+pr plotted in absolute values (R² = 0.98) and after log-log transformation (R² = 0.99). CD also showed a very high adjustment for both absolute (R² = 0.97) and log-log plots (R² = 0.97). The bias for Df+pr was 8.37 N, and 0.076 N after logarithmic transformation. Df represented between 15.97% and 18.82% of Df+pr by CFD and between 14.66% and 16.21% by the analytical procedures. Therefore, despite the bias, analytical procedures offer a feasible way of gaining insight into a swimmer's hydrodynamic characteristics.
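
    As a worked illustration of the kind of analytical estimate compared with CFD here, the sketch below computes friction drag from the ITTC-1957 friction line over an assumed wetted surface area and pressure drag from an assumed drag coefficient and frontal area; the areas, coefficients, and water properties are generic assumptions, not the values derived from the scanned swimmer in the paper.

      # Hedged sketch of an analytical passive-drag estimate for a gliding swimmer.
      # Wetted area, frontal area, CD and water properties are generic assumptions,
      # not the values derived from the swimmer's CT scan in the paper.
      import numpy as np

      rho = 998.0          # water density, kg/m^3
      nu = 1.0e-6          # kinematic viscosity, m^2/s
      L = 2.3              # stretched body length, m (assumption)
      S_wet = 1.9          # wetted surface area, m^2 (assumption)
      A_front = 0.08       # frontal cross-sectional area, m^2 (assumption)
      CD_pressure = 0.30   # pressure-drag coefficient on frontal area (assumption)

      v = np.arange(1.3, 2.6, 0.3)                     # glide velocities, m/s
      Re = v * L / nu
      Cf = 0.075 / (np.log10(Re) - 2.0) ** 2           # ITTC-1957 friction line
      D_friction = 0.5 * rho * S_wet * Cf * v ** 2
      D_pressure = 0.5 * rho * A_front * CD_pressure * v ** 2
      D_total = D_friction + D_pressure

      for vi, df, dp, dt in zip(v, D_friction, D_pressure, D_total):
          print(f"v={vi:.1f} m/s  Df={df:6.2f} N  Dpr={dp:6.2f} N  Df+pr={dt:6.2f} N "
                f"(friction share {100*df/dt:4.1f}%)")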

  11. Scenes for Social Information Processing in Adolescence: Item and factor analytic procedures for psychometric appraisal.

    PubMed

    Vagos, Paula; Rijo, Daniel; Santos, Isabel M

    2016-04-01

    Relatively little is known about the measures used to investigate the validity and applications of social information processing theory. The Scenes for Social Information Processing in Adolescence includes items built using a participatory approach to evaluate the attribution of intent, emotion intensity, response evaluation, and response decision steps of social information processing. We evaluated a sample of 802 Portuguese adolescents (61.5% female; mean age = 16.44 years) using this instrument. Item analysis and exploratory and confirmatory factor analytic procedures were used for psychometric examination. Two measures for attribution of intent (hostile and neutral) were produced, along with three emotion measures focused on negative emotional states, eight response evaluation measures, and four response decision measures, including prosocial and impaired social behavior. All of these measures achieved good internal consistency values and fit indicators. Boys seemed to favor and choose overt and relational aggression behaviors more often; girls conveyed higher levels of neutral attribution, sadness, assertiveness, and passiveness. The Scenes for Social Information Processing in Adolescence achieved adequate psychometric results and seems a valuable alternative for evaluating social information processing, even if it is essential to continue investigating its internal and external validity. (c) 2016 APA, all rights reserved.

  12. An analytical procedure to assist decision-making in a government research organization

    Treesearch

    H. Dean Claxton; Giuseppe Rensi

    1972-01-01

    An analytical procedure to help management decision-making in planning government research is described. The objectives, activities, and restrictions of a government research organization are modeled in a consistent analytical framework. Theory and methodology are drawn from economics and mathematical programming. The major analytical aspects distinguishing research...

  13. An analytic survey of signing inventory procedures in Virginia.

    DOT National Transportation Integrated Search

    1972-01-01

    An analytic survey was made of the highway signing and sign-maintenance inventory systems in each of the districts of the Virginia Department of Highways. Of particular concern in reviewing the procedures was the format of the inventory forms, the ap...

  14. 40 CFR 87.82 - Sampling and analytical procedures for measuring smoke exhaust emissions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) Definitions. Test Procedures for Engine Smoke Emissions (Aircraft Gas Turbine Engines) § 87.82 Sampling and analytical procedures for measuring smoke exhaust...

  15. An analytical SMASH procedure (ASP) for sensitivity-encoded MRI.

    PubMed

    Lee, R F; Westgate, C R; Weiss, R G; Bottomley, P A

    2000-05-01

    The simultaneous acquisition of spatial harmonics (SMASH) method of imaging with detector arrays can reduce the number of phase-encoding steps, and hence MRI scan time, several-fold. The original approach utilized numerical gradient-descent fitting of the coil sensitivity profiles to create a set of composite spatial harmonics to replace the phase-encoding steps. Here, an analytical approach for generating the harmonics is presented. A transform is derived to project the harmonics onto a set of sensitivity profiles. A sequence of Fourier, Hilbert, and inverse Fourier transforms is then applied to analytically eliminate spatially dependent phase errors from the different coils while fully preserving the spatial encoding. By combining the transform and phase correction, the original numerical image reconstruction method can be replaced by an analytical SMASH procedure (ASP). The approach also allows simulation of SMASH imaging, revealing a criterion for the ratio of the detector sensitivity profile width to the detector spacing that produces optimal harmonic generation. When the detector geometry is suboptimal, a group of quasi-harmonics arises, which can be corrected and restored to pure harmonics. The simulation also reveals high-order harmonic modulation effects, and a demodulation procedure is presented that enables application of ASP to a large number of detectors. The method is demonstrated on a phantom and on humans using a standard 4-channel phased-array MRI system. Copyright 2000 Wiley-Liss, Inc.
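
    To make the idea of composite spatial harmonics concrete, the sketch below reproduces the numerical fitting step that ASP replaces: given simulated Gaussian coil sensitivity profiles, least-squares weights are found so that the weighted sum of profiles approximates the target harmonic exp(i m Δk x). The coil geometry, profile widths, and the use of least squares (rather than gradient descent or the paper's analytical Fourier/Hilbert projection) are illustrative assumptions.

      # Hedged sketch of the numerical SMASH harmonic-fitting step that ASP replaces.
      # Coil geometry and Gaussian sensitivity profiles are simulated assumptions;
      # the paper's analytical Fourier/Hilbert projection is not reproduced here.
      import numpy as np

      fov = 0.3                                     # field of view, m
      nx = 256
      x = np.linspace(-fov / 2, fov / 2, nx)
      n_coils = 4
      centers = np.linspace(-fov / 2, fov / 2, n_coils)
      width = fov / n_coils                         # profile width ~ coil spacing
      S = np.exp(-((x[None, :] - centers[:, None]) ** 2) / (2 * width ** 2))  # (coils, x)

      dk = 2 * np.pi / fov                          # phase-encoding step in k-space
      for m in range(0, n_coils):                   # harmonics 0 .. n_coils-1
          target = np.exp(1j * m * dk * x)          # desired composite harmonic
          # least-squares fit: sum_k w_k S_k(x) ~= exp(i m dk x)
          w, *_ = np.linalg.lstsq(S.T.astype(complex), target, rcond=None)
          composite = S.T @ w
          err = np.linalg.norm(composite - target) / np.linalg.norm(target)
          print(f"harmonic m={m}: relative fit error = {err:.3f}")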

  16. 14 CFR 34.64 - Sampling and analytical procedures for measuring gaseous exhaust emissions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION AIRCRAFT FUEL VENTING AND EXHAUST EMISSION REQUIREMENTS FOR TURBINE ENGINE POWERED AIRPLANES Test Procedures for Engine Exhaust Gaseous Emissions (Aircraft and Aircraft Gas Turbine Engines) § 34.64 Sampling and analytical procedures for measuring gaseous exhaust emissions. The...

  17. 14 CFR 34.64 - Sampling and analytical procedures for measuring gaseous exhaust emissions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION AIRCRAFT FUEL VENTING AND EXHAUST EMISSION REQUIREMENTS FOR TURBINE ENGINE POWERED AIRPLANES Test Procedures for Engine Exhaust Gaseous Emissions (Aircraft and Aircraft Gas Turbine Engines) § 34.64 Sampling and analytical procedures for measuring gaseous exhaust emissions. The...

  18. Harmonization of strategies for the validation of quantitative analytical procedures. A SFSTP proposal--Part I.

    PubMed

    Hubert, Ph; Nguyen-Huu, J-J; Boulanger, B; Chapuzet, E; Chiap, P; Cohen, N; Compagnon, P-A; Dewé, W; Feinberg, M; Lallier, M; Laurentie, M; Mercier, N; Muzard, G; Nivet, C; Valat, L

    2004-11-15

    This paper is the first part of a summary report of a new commission of the Société Française des Sciences et Techniques Pharmaceutiques (SFSTP). The main objective of this commission was the harmonization of approaches for the validation of quantitative analytical procedures. Indeed, the principle of the validation of these procedures is today widespread in all the domains of activity where measurements are made. Nevertheless, the simple question of the acceptability or not of an analytical procedure for a given application remains incompletely resolved in several cases, despite the various regulations relating to good practices (GLP, GMP, ...) and other documents of normative character (ISO, ICH, FDA, ...). There are many official documents describing the criteria of validation to be tested, but they do not propose any experimental protocol and most often limit themselves to general concepts. For those reasons, two previous SFSTP commissions elaborated validation guides to concretely help the industrial scientists in charge of drug development to apply those regulatory recommendations. If these first two guides widely contributed to the use and progress of analytical validation, they nevertheless present weaknesses regarding the conclusions of the performed statistical tests and the decisions to be made with respect to the acceptance limits defined by the use of an analytical procedure. The present paper proposes to revisit the very bases of analytical validation in order to develop a harmonized approach, notably by distinguishing the diagnosis rules from the decision rules. The latter rule is based on the use of the accuracy profile, uses the notion of total error, and simplifies the approach to the validation of an analytical procedure while controlling the risk associated with its use. Thanks to this novel validation approach, it is possible to unambiguously demonstrate the fitness for purpose of a new method as stated in all regulatory
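
    As a simplified illustration of the accuracy-profile idea (total error compared with acceptance limits), the sketch below computes, for each concentration level, the relative bias and a β-expectation tolerance interval from replicate recoveries and checks it against ±10% acceptance limits. It uses a single variance component per level rather than the full intermediate-precision model of the SFSTP guide, and the data and limits are invented.

      # Hedged sketch of an accuracy profile: per-level bias + beta-expectation
      # tolerance interval vs. acceptance limits. Uses one variance component per
      # level (no between-series term, unlike the full SFSTP model); data invented.
      import numpy as np
      from scipy import stats

      acceptance = 10.0       # +/-10% acceptance limits (assumption)
      beta = 0.95             # proportion of future results expected inside the limits

      # replicate recoveries (%) at three concentration levels, invented data
      levels = {
          "low":    np.array([96.1, 98.0, 97.2, 95.8, 97.5, 96.9]),
          "medium": np.array([99.2, 100.4, 99.8, 100.1, 99.5, 100.0]),
          "high":   np.array([101.3, 102.0, 101.1, 101.8, 100.9, 101.5]),
      }

      for name, rec in levels.items():
          n = len(rec)
          bias = rec.mean() - 100.0                        # relative bias, %
          s = rec.std(ddof=1)
          # beta-expectation tolerance interval for a single future result
          k = stats.t.ppf((1 + beta) / 2, n - 1) * np.sqrt(1 + 1 / n)
          lo, hi = bias - k * s, bias + k * s
          ok = (lo > -acceptance) and (hi < acceptance)
          print(f"{name:6s} bias={bias:+5.2f}%  tolerance interval=[{lo:+5.2f}, {hi:+5.2f}]%  "
                f"within +/-{acceptance:.0f}%: {ok}")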

  19. ESTIMATING UNCERTAINITIES IN FACTOR ANALYTIC MODELS

    EPA Science Inventory

    When interpreting results from factor analytic models as used in receptor modeling, it is important to quantify the uncertainties in those results. For example, if the presence of a species on one of the factors is necessary to interpret the factor as originating from a certain ...

  20. Critical Factors in Data Governance for Learning Analytics

    ERIC Educational Resources Information Center

    Elouazizi, Noureddine

    2014-01-01

    This paper identifies some of the main challenges of data governance modelling in the context of learning analytics for higher education institutions, and discusses the critical factors for designing data governance models for learning analytics. It identifies three fundamental common challenges that cut across any learning analytics data…

  1. Pre-analytical and analytical factors influencing Alzheimer's disease cerebrospinal fluid biomarker variability.

    PubMed

    Fourier, Anthony; Portelius, Erik; Zetterberg, Henrik; Blennow, Kaj; Quadrio, Isabelle; Perret-Liaudet, Armand

    2015-09-20

    A panel of cerebrospinal fluid (CSF) biomarkers, including total Tau (t-Tau), Tau phosphorylated at residue 181 (p-Tau) and β-amyloid peptides (Aβ42 and Aβ40), is frequently used as an aid in Alzheimer's disease (AD) diagnosis for young patients with cognitive impairment, for predicting prodromal AD in mild cognitive impairment (MCI) subjects, for AD discrimination in atypical clinical phenotypes, and for inclusion/exclusion and stratification of patients in clinical trials. Due to variability in absolute levels between laboratories, there is no consensus on a medical cut-off value for the CSF AD signature. Thus, for full implementation of this core AD biomarker panel in clinical routine, this issue has to be solved. The variability can be explained by both pre-analytical and analytical factors. For example, the plastic tubes used for CSF collection and storage, the lack of reference material, and the variability of the analytical protocols were identified as important sources of variability. The aim of this review is to highlight these pre-analytical and analytical factors and to describe the efforts made to counteract them in order to establish cut-off values for core CSF AD biomarkers. This review gives the current state of recommendations. Copyright © 2015. Published by Elsevier B.V.

  2. Analytical and experimental procedures for determining propagation characteristics of millimeter-wave gallium arsenide microstrip lines

    NASA Technical Reports Server (NTRS)

    Romanofsky, Robert R.

    1989-01-01

    In this report, a thorough analytical procedure is developed for evaluating the frequency-dependent loss characteristics and effective permittivity of microstrip lines. The technique is based on the measured reflection coefficient of microstrip resonator pairs. Experimental data, including quality factor Q, effective relative permittivity, and fringing for 50-Ω lines on gallium arsenide (GaAs) from 26.5 to 40.0 GHz, are presented. The effects of an imperfect open circuit, coupling losses, and loading of the resonant frequency are considered. A cosine-tapered ridge-guide test fixture is described; it was found to be well suited to this device characterization.
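
    As a back-of-the-envelope companion to this report, the sketch below extracts an effective permittivity from the resonant frequency of an ideal open-ended half-wavelength microstrip resonator and a loaded Q from the 3-dB bandwidth; it ignores the fringing, coupling losses, and imperfect open circuit that the report explicitly corrects for, and all numbers are invented.

      # Hedged sketch: idealized extraction of effective permittivity and loaded Q
      # from a half-wavelength microstrip resonator. Ignores fringing, coupling loss
      # and imperfect-open effects treated in the report; all numbers are invented.
      c = 299_792_458.0          # speed of light, m/s

      L_res = 1.55e-3            # physical resonator length, m (assumption)
      f_r = 30.0e9               # measured resonant frequency, Hz (assumption)
      bw_3dB = 0.12e9            # 3-dB bandwidth of the resonance, Hz (assumption)

      # half-wavelength condition: L = lambda_g / 2 = c / (2 * f_r * sqrt(eps_eff))
      eps_eff = (c / (2.0 * L_res * f_r)) ** 2
      Q_loaded = f_r / bw_3dB

      print(f"effective relative permittivity ~ {eps_eff:.2f}")
      print(f"loaded quality factor Q_L ~ {Q_loaded:.0f}")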

  3. 14 CFR 34.64 - Sampling and analytical procedures for measuring gaseous exhaust emissions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION AIRCRAFT FUEL VENTING AND EXHAUST EMISSION REQUIREMENTS FOR TURBINE... Turbine Engines) § 34.64 Sampling and analytical procedures for measuring gaseous exhaust emissions. The...

  4. Consistency of FMEA used in the validation of analytical procedures.

    PubMed

    Oldenhof, M T; van Leeuwen, J F; Nauta, M J; de Kaste, D; Odekerken-Rombouts, Y M C F; Vredenbregt, M J; Weda, M; Barends, D M

    2011-02-20

    In order to explore the consistency of the outcome of a Failure Mode and Effects Analysis (FMEA) in the validation of analytical procedures, an FMEA was carried out by two different teams. The two teams applied two separate FMEAs to a High Performance Liquid Chromatography-Diode Array Detection-Mass Spectrometry (HPLC-DAD-MS) analytical procedure used in the quality control of medicines. Each team was free to define its own ranking scales for the severity (S), occurrence (O), and detection (D) of failure modes. We calculated Risk Priority Numbers (RPNs) and identified failure modes above the 90th percentile of RPN values as needing urgent corrective action; failure modes falling between the 75th and 90th percentiles of RPN values were identified as needing necessary corrective action. Team 1 and Team 2 identified five and six failure modes needing urgent corrective action, respectively, with two being identified by both. Of the failure modes needing necessary corrective action, about a third were identified by both teams. These results show inconsistency in the outcome of the FMEA. To improve consistency, we recommend that FMEA always be carried out under the supervision of an experienced FMEA facilitator and that the FMEA team have at least two members with competence in the analytical method to be validated. However, the FMEAs of both teams contained valuable information that was not identified by the other team, indicating that this inconsistency is not always a drawback. Copyright © 2010 Elsevier B.V. All rights reserved.
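
    To make the ranking step concrete, the sketch below computes risk priority numbers from S, O and D scores and flags failure modes above the 90th and between the 75th and 90th RPN percentiles, mirroring the thresholds described in the abstract; the failure modes and scores themselves are invented.

      # Hedged sketch: RPN = S * O * D and percentile-based prioritisation as in the
      # abstract (>90th percentile: urgent; 75th-90th: necessary). Scores invented.
      import numpy as np

      failure_modes = ["wrong mobile phase", "column degradation", "injector carryover",
                       "detector drift", "integration error", "sample mix-up",
                       "wrong dilution", "MS tuning off", "temperature excursion",
                       "data transcription"]
      S = np.array([8, 6, 5, 4, 7, 9, 7, 6, 3, 8])   # severity (1-10)
      O = np.array([3, 5, 4, 6, 2, 2, 4, 3, 5, 2])   # occurrence (1-10)
      D = np.array([4, 3, 6, 2, 5, 3, 2, 6, 4, 7])   # detection (1-10, 10 = hard to detect)

      rpn = S * O * D
      p75, p90 = np.percentile(rpn, [75, 90])

      for name, value in sorted(zip(failure_modes, rpn), key=lambda t: -t[1]):
          if value > p90:
              action = "urgent corrective action"
          elif value > p75:
              action = "necessary corrective action"
          else:
              action = "monitor"
          print(f"{name:22s} RPN={value:3d}  -> {action}")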

  5. Validation protocol of analytical procedures for quantification of drugs in polymeric systems for parenteral administration: dexamethasone phosphate disodium microparticles.

    PubMed

    Martín-Sabroso, Cristina; Tavares-Fernandes, Daniel Filipe; Espada-García, Juan Ignacio; Torres-Suárez, Ana Isabel

    2013-12-15

    In this work, a protocol is developed to validate analytical procedures for the quantification of drug substances formulated in polymeric systems, covering both the drug entrapped in the polymeric matrix (assay: content test) and the drug released from the systems (assay: dissolution test). This protocol is applied to the validation of two isocratic HPLC analytical procedures for the analysis of dexamethasone phosphate disodium microparticles for parenteral administration. Preparation of authentic samples and artificially "spiked" and "unspiked" samples is described. Specificity (the ability to quantify dexamethasone phosphate disodium in the presence of constituents of the dissolution medium and other microparticle constituents), linearity, accuracy and precision are evaluated in the range from 10 to 50 μg mL⁻¹ for the assay: content test procedure and from 0.25 to 10 μg mL⁻¹ for the assay: dissolution test procedure. The robustness of the analytical method for extracting the drug from the microparticles is also assessed. The validation protocol developed allows us to conclude that both analytical methods are suitable for their intended purpose, but the lack of proportionality of the assay: dissolution analytical method should be taken into account. The validation protocol designed in this work could be applied to the validation of any analytical procedure for the quantification of drugs formulated in controlled-release polymeric microparticles. Copyright © 2013 Elsevier B.V. All rights reserved.
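
    As a minimal illustration of the linearity, accuracy and precision checks described for the assay: content range, the sketch below fits a calibration line over 10-50 μg mL⁻¹ and then computes recovery and RSD for spiked samples; the peak areas and spiked concentrations are invented, and the dexamethasone-specific acceptance criteria are not reproduced.

      # Hedged sketch of linearity / accuracy / precision checks for an HPLC assay
      # over 10-50 ug/mL. Peak areas and spiked-sample data are invented examples.
      import numpy as np

      # calibration standards (ug/mL) and peak areas (arbitrary units), invented
      conc = np.array([10, 20, 30, 40, 50], dtype=float)
      area = np.array([152.0, 301.5, 455.2, 600.8, 752.1])

      slope, intercept = np.polyfit(conc, area, 1)
      pred = slope * conc + intercept
      r2 = 1 - np.sum((area - pred) ** 2) / np.sum((area - area.mean()) ** 2)
      print(f"linearity: slope={slope:.2f}, intercept={intercept:.2f}, R^2={r2:.4f}")

      # spiked samples at a nominal 30 ug/mL, invented replicate areas
      nominal = 30.0
      spiked_area = np.array([452.9, 457.4, 454.1, 450.6, 455.8, 453.3])
      back_calc = (spiked_area - intercept) / slope
      recovery = 100 * back_calc / nominal
      rsd = 100 * back_calc.std(ddof=1) / back_calc.mean()
      print(f"accuracy: mean recovery = {recovery.mean():.1f}%")
      print(f"precision: RSD = {rsd:.2f}%")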

  6. Rheoencephalographic and electroencephalographic measures of cognitive workload: analytical procedures.

    PubMed

    Montgomery, L D; Montgomery, R W; Guisado, R

    1995-05-01

    This investigation demonstrates the feasibility of mental workload assessment by rheoencephalographic (REG) and multichannel electroencephalographic (EEG) monitoring. During the performance of this research, unique testing, analytical and display procedures were developed for REG and EEG monitoring that extend the current state of the art and provide valuable tools for the study of cerebral circulatory and neural activity during cognition. REG records are analyzed to provide indices of the right and left hemisphere hemodynamic changes that take place during each test sequence. The EEG data are modeled using regression techniques and mathematically transformed to provide energy-density distributions of the scalp electrostatic field. These procedures permit concurrent REG/EEG cognitive testing not possible with current techniques. The introduction of a system for recording and analysis of cognitive REG/EEG test sequences facilitates the study of learning and memory disorders, dementia and other encephalopathies.

  7. Rheoencephalographic and electroencephalographic measures of cognitive workload: analytical procedures

    NASA Technical Reports Server (NTRS)

    Montgomery, L. D.; Montgomery, R. W.; Guisado, R.

    1995-01-01

    This investigation demonstrates the feasibility of mental workload assessment by rheoencephalographic (REG) and multichannel electroencephalographic (EEG) monitoring. During the performance of this research, unique testing, analytical and display procedures were developed for REG and EEG monitoring that extend the current state of the art and provide valuable tools for the study of cerebral circulatory and neural activity during cognition. REG records are analyzed to provide indices of the right and left hemisphere hemodynamic changes that take place during each test sequence. The EEG data are modeled using regression techniques and mathematically transformed to provide energy-density distributions of the scalp electrostatic field. These procedures permit concurrent REG/EEG cognitive testing not possible with current techniques. The introduction of a system for recording and analysis of cognitive REG/EEG test sequences facilitates the study of learning and memory disorders, dementia and other encephalopathies.

  8. Significance Testing in Confirmatory Factor Analytic Models.

    ERIC Educational Resources Information Center

    Khattab, Ali-Maher; Hocevar, Dennis

    Traditionally, confirmatory factor analytic models are tested against a null model of total independence. Using randomly generated factors in a matrix of 46 aptitude tests, this approach is shown to be unlikely to reject even random factors. An alternative null model, based on a single general factor, is suggested. In addition, an index of model…

  9. Applying Behavior Analytic Procedures to Effectively Teach Literacy Skills in the Classroom

    ERIC Educational Resources Information Center

    Joseph, Laurice M.; Alber-Morgan, Sheila; Neef, Nancy

    2016-01-01

    The purpose of this article is to discuss the application of behavior analytic procedures for advancing and evaluating methods for teaching literacy skills in the classroom. Particularly, applied behavior analysis has contributed substantially to examining the relationship between teacher behavior and student literacy performance. Teacher…

  10. Methods for Estimating Uncertainty in Factor Analytic Solutions

    EPA Science Inventory

    The EPA PMF (Environmental Protection Agency positive matrix factorization) version 5.0 and the underlying multilinear engine-executable ME-2 contain three methods for estimating uncertainty in factor analytic models: classical bootstrap (BS), displacement of factor elements (DI...

  11. 40 CFR 87.82 - Sampling and analytical procedures for measuring smoke exhaust emissions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Sampling and analytical procedures for measuring smoke exhaust emissions. 87.82 Section 87.82 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF AIR POLLUTION FROM AIRCRAFT AND AIRCRAFT ENGINES...

  12. 40 CFR 87.64 - Sampling and analytical procedures for measuring gaseous exhaust emissions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Sampling and analytical procedures for measuring gaseous exhaust emissions. 87.64 Section 87.64 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF AIR POLLUTION FROM AIRCRAFT AND AIRCRAFT ENGINES...

  13. 40 CFR 87.82 - Sampling and analytical procedures for measuring smoke exhaust emissions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Sampling and analytical procedures for measuring smoke exhaust emissions. 87.82 Section 87.82 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF AIR POLLUTION FROM AIRCRAFT AND AIRCRAFT ENGINES...

  14. 40 CFR 87.64 - Sampling and analytical procedures for measuring gaseous exhaust emissions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Sampling and analytical procedures for measuring gaseous exhaust emissions. 87.64 Section 87.64 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF AIR POLLUTION FROM AIRCRAFT AND AIRCRAFT ENGINES...

  15. 21 CFR 530.24 - Procedure for announcing analytical methods for drug residue quantification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) ANIMAL DRUGS, FEEDS, AND RELATED PRODUCTS EXTRALABEL DRUG USE IN ANIMALS Specific Provisions Relating to Extralabel Use of Animal and Human Drugs in Food-Producing Animals § 530.24 Procedure for announcing analytical methods for drug residue quantification. (a...

  16. Development of an Analytical Procedure for the Determination of Multiclass Compounds for Forensic Veterinary Toxicology.

    PubMed

    Sell, Bartosz; Sniegocki, Tomasz; Zmudzki, Jan; Posyniak, Andrzej

    2018-04-01

    Reported here is a new analytical multiclass method based on the QuEChERS technique, which has proven to be effective in diagnosing fatal poisoning cases in animals. This method has been developed for the determination of analytes in liver samples, comprising rodenticides, carbamate and organophosphorus pesticides, coccidiostats and mycotoxins. The procedure entails addition of acetonitrile and sodium acetate to 2 g of homogenized liver sample. The mixture was shaken intensively and centrifuged for phase separation, after which the organic phase was transferred into a tube containing sorbents (PSA and C18) and magnesium sulfate and centrifuged; the supernatant was then filtered and analyzed by liquid chromatography tandem mass spectrometry. A validation of the procedure was performed. Repeatability variation coefficients <15% were achieved for most of the analyzed substances. Analytical conditions allowed for a successful separation of a variety of poisons, with a typical screening detection limit at ≤10 μg/kg levels. The method was used to investigate more than 100 animal poisoning incidents and proved useful in animal forensic toxicology casework.

  17. The Analytical Pragmatic Structure of Procedural Due Process: A Framework for Inquiry in Administrative Decision Making.

    ERIC Educational Resources Information Center

    Fisher, James E.; Sealey, Ronald W.

    The study describes the analytical pragmatic structure of concepts and applies this structure to the legal concept of procedural due process. This structure consists of form, purpose, content, and function. The study conclusions indicate that the structure of the concept of procedural due process, or any legal concept, is not the same as the…

  18. A Big Data Analytics Methodology Program in the Health Sector

    ERIC Educational Resources Information Center

    Lawler, James; Joseph, Anthony; Howell-Barber, H.

    2016-01-01

    The benefits of Big Data Analytics are cited frequently in the literature. However, the difficulties of implementing Big Data Analytics can limit the number of organizational projects. In this study, the authors evaluate business, procedural and technical factors in the implementation of Big Data Analytics, applying a methodology program. Focusing…

  19. Extension Procedures for Confirmatory Factor Analysis

    ERIC Educational Resources Information Center

    Nagy, Gabriel; Brunner, Martin; Lüdtke, Oliver; Greiff, Samuel

    2017-01-01

    We present factor extension procedures for confirmatory factor analysis that provide estimates of the relations of common and unique factors with external variables that do not undergo factor analysis. We present identification strategies that build upon restrictions of the pattern of correlations between unique factors and external variables. The…

  20. Analysis of plant gums and saccharide materials in paint samples: comparison of GC-MS analytical procedures and databases

    PubMed Central

    2012-01-01

    Background: Saccharide materials have been used for centuries as binding media, to paint, write and illuminate manuscripts and to apply metallic leaf decorations. Although the technical literature often reports on the use of plant gums as binders, several other saccharide materials can actually be encountered in paint samples, not only as major binders but also as additives. In the literature, there are a variety of analytical procedures that utilize GC-MS to characterize saccharide materials in paint samples; however, the chromatographic profiles are often extremely different, and it is impossible to compare them and reliably identify the paint binder. Results: This paper presents a comparison between two different analytical procedures based on GC-MS for the analysis of saccharide materials in works of art. The research presented here evaluates the influence of the analytical procedure used and how it impacts the sugar profiles obtained from the analysis of paint samples that contain saccharide materials. The procedures have been developed, optimised and systematically used to characterise plant gums at the Getty Conservation Institute in Los Angeles, USA (GCI) and at the Department of Chemistry and Industrial Chemistry of the University of Pisa, Italy (DCCI). The main steps of the analytical procedures and their optimisation are discussed. Conclusions: The results presented highlight that the two methods give comparable sugar profiles, whether the samples analysed are simple raw materials, pigmented and unpigmented paint replicas, or paint samples collected from centuries-old polychrome art objects. A common database of sugar profiles of reference materials commonly found in paint samples was thus compiled. The database also presents data from those materials that contain only a minor saccharide fraction. This database highlights how many sources of saccharides can be found in a paint sample, representing an important step forward in the problem of

  1. Analysis of plant gums and saccharide materials in paint samples: comparison of GC-MS analytical procedures and databases.

    PubMed

    Lluveras-Tenorio, Anna; Mazurek, Joy; Restivo, Annalaura; Colombini, Maria Perla; Bonaduce, Ilaria

    2012-10-10

    Saccharide materials have been used for centuries as binding media, to paint, write and illuminate manuscripts and to apply metallic leaf decorations. Although the technical literature often reports on the use of plant gums as binders, several other saccharide materials can actually be encountered in paint samples, not only as major binders but also as additives. In the literature, there are a variety of analytical procedures that utilize GC-MS to characterize saccharide materials in paint samples; however, the chromatographic profiles are often extremely different, and it is impossible to compare them and reliably identify the paint binder. This paper presents a comparison between two different analytical procedures based on GC-MS for the analysis of saccharide materials in works of art. The research presented here evaluates the influence of the analytical procedure used and how it impacts the sugar profiles obtained from the analysis of paint samples that contain saccharide materials. The procedures have been developed, optimised and systematically used to characterise plant gums at the Getty Conservation Institute in Los Angeles, USA (GCI) and at the Department of Chemistry and Industrial Chemistry of the University of Pisa, Italy (DCCI). The main steps of the analytical procedures and their optimisation are discussed. The results presented highlight that the two methods give comparable sugar profiles, whether the samples analysed are simple raw materials, pigmented and unpigmented paint replicas, or paint samples collected from centuries-old polychrome art objects. A common database of sugar profiles of reference materials commonly found in paint samples was thus compiled. The database also presents data from those materials that contain only a minor saccharide fraction. This database highlights how many sources of saccharides can be found in a paint sample, representing an important step forward in the problem of identifying polysaccharide binders in

  2. Analytical procedures for water-soluble vitamins in foods and dietary supplements: a review.

    PubMed

    Blake, Christopher J

    2007-09-01

    Water-soluble vitamins include the B-group vitamins and vitamin C. In order to correctly monitor water-soluble vitamin content in fortified foods for compliance purposes, as well as to establish accurate data banks, an accurate and precise analytical method is a prerequisite. For many years, microbiological assays have been used for the analysis of B vitamins. However, they are no longer considered the gold standard in vitamin analysis, as many studies have revealed their deficiencies. This review describes the current status of analytical methods, including microbiological assays and spectrophotometric, biosensor and chromatographic techniques. In particular, it describes the current status of the official methods and highlights some new developments in chromatographic procedures and detection methods. An overview of multivitamin extractions and analyses for foods and supplements is also given.

  3. Analysis of Pre-Analytic Factors Affecting the Success of Clinical Next-Generation Sequencing of Solid Organ Malignancies.

    PubMed

    Chen, Hui; Luthra, Rajyalakshmi; Goswami, Rashmi S; Singh, Rajesh R; Roy-Chowdhuri, Sinchita

    2015-08-28

    Application of next-generation sequencing (NGS) technology to routine clinical practice has enabled characterization of personalized cancer genomes to identify patients likely to respond to targeted therapy. The proper selection of tumor sample for downstream NGS-based mutational analysis is critical to generate accurate results and to guide therapeutic intervention. However, multiple pre-analytic factors come into play in determining the success of NGS testing. In this review, we discuss pre-analytic requirements for AmpliSeq PCR-based sequencing using the Ion Torrent Personal Genome Machine (PGM) (Life Technologies), an NGS platform that is often used by clinical laboratories for sequencing solid tumors because of its low input DNA requirement from formalin-fixed, paraffin-embedded tissue. The success of NGS mutational analysis is affected not only by the input DNA quantity but also by several other factors, including the specimen type, the DNA quality, and the tumor cellularity. Here, we review tissue requirements for solid tumor NGS-based mutational analysis, including procedure types, tissue types, tumor volume and fraction, decalcification, and treatment effects.

  4. Differences in metabolite profiles caused by pre-analytical blood processing procedures.

    PubMed

    Nishiumi, Shin; Suzuki, Makoto; Kobayashi, Takashi; Yoshida, Masaru

    2018-05-01

    Recently, the use of metabolomic analysis of human serum and plasma for biomarker discovery and disease diagnosis in clinical studies has been increasing. The feasibility of using a metabolite biomarker for disease diagnosis is strongly dependent on the metabolite's stability during pre-analytical blood processing procedures, such as serum or plasma sampling and sample storage prior to centrifugation. However, the influence of blood processing procedures on the stability of metabolites has not been fully characterized. In the present study, we compared the levels of metabolites in matched human serum and plasma samples using gas chromatography coupled with mass spectrometry and liquid chromatography coupled with mass spectrometry. In addition, we evaluated the changes in plasma metabolite levels induced by storage at room temperature or at a cold temperature prior to centrifugation. As a result, it was found that 76 metabolites exhibited significant differences between their serum and plasma levels. Furthermore, the pre-centrifugation storage conditions significantly affected the plasma levels of 45 metabolites. These results highlight the importance of blood processing procedures during metabolome analysis, which should be considered during biomarker discovery and the subsequent use of biomarkers for disease diagnosis. Copyright © 2017 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.

  5. A factor analytic investigation of the Mercy Evaluation of Multiple Sclerosis.

    PubMed

    Merz, Zachary C; Wright, John D; Vander Wal, Jillon S; Gfeller, Jeffrey D

    2018-01-23

    Neurocognitive deficits are commonly an accompanying feature of multiple sclerosis (MS). A brief yet comprehensive neuropsychological battery is desirable for assessing the extent of these deficits. Therefore, the present study examined the validity of the Mercy Evaluation of Multiple Sclerosis (MEMS) for use with the MS population. Archival data from individuals diagnosed with MS (N = 378) by independent neurologists were examined. Cognitive domains assessed included processing speed and attention, learning and memory, visuospatial, language, and executive functioning. A mean battery index was calculated to provide a general indicator of cognitive impairment within the current sample. Overall performance across participants was found to be in the lower limits of the average range. Factor analytic procedures yielded a four-factor solution, accounting for 67% of the total variance within the MEMS. Four neurocognitive measures exhibited the highest sensitivity in detecting cognitive impairment, constituting a psychometrically established brief cognitive screening battery that accounted for 83% of the total variance within the mean battery index score. Overall, the results of the current study suggest appropriate construct validity of the MEMS for use with individuals with MS and provide support for previously established cognitive batteries.
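
    For readers who want to see what such a factor analytic step looks like in code, the sketch below fits a factor analysis with a fixed number of factors to simulated, standardized test scores and reports the loadings and the share of variance they explain; the simulated data and the choice of scikit-learn's FactorAnalysis (rather than the extraction and rotation choices used in the study) are assumptions.

      # Hedged sketch of an exploratory factor analysis step on simulated test scores.
      # Data are simulated; the study's actual extraction and rotation choices are
      # not reproduced -- this only illustrates the general procedure.
      import numpy as np
      from sklearn.decomposition import FactorAnalysis
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(0)
      n_subjects, n_tests, n_factors = 378, 12, 4

      # simulate scores driven by 4 latent cognitive factors plus noise
      true_loadings = rng.normal(0, 1, size=(n_tests, n_factors))
      latent = rng.normal(0, 1, size=(n_subjects, n_factors))
      scores = latent @ true_loadings.T + rng.normal(0, 0.7, size=(n_subjects, n_tests))

      z = StandardScaler().fit_transform(scores)
      fa = FactorAnalysis(n_components=n_factors, random_state=0).fit(z)

      loadings = fa.components_.T                      # (tests, factors)
      communalities = (loadings ** 2).sum(axis=1)
      explained = communalities.sum() / n_tests        # proportion of total variance
      print("loadings shape:", loadings.shape)
      print(f"variance explained by the {n_factors}-factor solution: {explained:.0%}")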

  6. Laboratory, Field, and Analytical Procedures for Using ...

    EPA Pesticide Factsheets

    Regardless of the remedial technology invoked to address contaminated sediments in the environment, there is a critical need to have tools for assessing the effectiveness of the remedy. In the past, these tools have included chemical and biomonitoring of the water column and sediments, toxicity testing and bioaccumulation studies performed on site sediments, and application of partitioning, transport and fate modeling. All of these tools served as lines of evidence for making informed environmental management decisions at contaminated sediment sites. In the last ten years, a new tool for assessing remedial effectiveness has gained a great deal of attention. Passive sampling offers a tool capable of measuring the freely dissolved concentration (Cfree) of legacy contaminants in water and sediments. In addition to assessing the effectiveness of the remedy, passive sampling can be applied for a variety of other contaminated sediments site purposes involved with performing the preliminary assessment and site inspection, conducting the remedial investigation and feasibility study, preparing the remedial design, and assessing the potential for contaminant bioaccumulation. While there is a distinct need for using passive sampling at contaminated sediments sites and several previous documents and research articles have discussed various aspects of passive sampling, there has not been definitive guidance on the laboratory, field and analytical procedures for using pas

  7. A unified procedure for meta-analytic evaluation of surrogate end points in randomized clinical trials

    PubMed Central

    Dai, James Y.; Hughes, James P.

    2012-01-01

    The meta-analytic approach to evaluating surrogate end points assesses the predictiveness of treatment effect on the surrogate toward treatment effect on the clinical end point based on multiple clinical trials. Definition and estimation of the correlation of treatment effects were developed in linear mixed models and later extended to binary or failure time outcomes on a case-by-case basis. In a general regression setting that covers nonnormal outcomes, we discuss in this paper several metrics that are useful in the meta-analytic evaluation of surrogacy. We propose a unified 3-step procedure to assess these metrics in settings with binary end points, time-to-event outcomes, or repeated measures. First, the joint distribution of estimated treatment effects is ascertained by an estimating equation approach; second, the restricted maximum likelihood method is used to estimate the means and the variance components of the random treatment effects; finally, confidence intervals are constructed by a parametric bootstrap procedure. The proposed method is evaluated by simulations and applications to 2 clinical trials. PMID:22394448
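
    The sketch below illustrates the trial-level idea behind such a meta-analytic evaluation in a deliberately simplified form: simulated per-trial estimates of the treatment effect on the surrogate and on the clinical end point are regressed on each other, and a parametric bootstrap gives a confidence interval for the trial-level R². The estimating-equation and REML steps of the proposed 3-step procedure are not reproduced, and all data are simulated.

      # Hedged, simplified sketch of trial-level surrogacy evaluation: regress
      # clinical-endpoint effects on surrogate effects across trials and bootstrap
      # the trial-level R^2. The paper's estimating-equation + REML machinery is
      # not reproduced; all data are simulated.
      import numpy as np

      rng = np.random.default_rng(1)
      n_trials = 12

      # simulated per-trial estimated treatment effects (alpha: surrogate, beta: clinical)
      alpha = rng.normal(0.5, 0.3, n_trials)
      beta = 0.2 + 0.9 * alpha + rng.normal(0, 0.10, n_trials)

      def trial_level_r2(a, b):
          slope, intercept = np.polyfit(a, b, 1)
          resid = b - (slope * a + intercept)
          return 1 - resid.var(ddof=0) / b.var(ddof=0)

      r2_hat = trial_level_r2(alpha, beta)

      # parametric bootstrap: refit on trials resampled from the fitted model
      slope, intercept = np.polyfit(alpha, beta, 1)
      sigma = np.std(beta - (slope * alpha + intercept), ddof=2)
      boot = []
      for _ in range(2000):
          beta_star = intercept + slope * alpha + rng.normal(0, sigma, n_trials)
          boot.append(trial_level_r2(alpha, beta_star))
      lo, hi = np.percentile(boot, [2.5, 97.5])

      print(f"trial-level R^2 = {r2_hat:.3f}, 95% bootstrap CI = [{lo:.3f}, {hi:.3f}]")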

  8. Dimensions of Early Speech Sound Disorders: A Factor Analytic Study

    ERIC Educational Resources Information Center

    Lewis, Barbara A.; Freebairn, Lisa A.; Hansen, Amy J.; Stein, Catherine M.; Shriberg, Lawrence D.; Iyengar, Sudha K.; Taylor, H. Gerry

    2006-01-01

    The goal of this study was to classify children with speech sound disorders (SSD) empirically, using factor analytic techniques. Participants were 3-7-year olds enrolled in speech/language therapy (N=185). Factor analysis of an extensive battery of speech and language measures provided support for two distinct factors, representing the skill…

  9. Analytic Couple Modeling Introducing Device Design Factor, Fin Factor, Thermal Diffusivity Factor, and Inductance Factor

    NASA Technical Reports Server (NTRS)

    Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred

    2014-01-01

    A set of convenient thermoelectric device solutions has been derived in order to capture a number of factors that previously could only be resolved with numerical techniques. The concise conversion efficiency equations derived from the governing equations provide intuitive and straightforward design guidelines. These guidelines allow for better device design without requiring detailed numerical modeling. The analytical modeling accounts for factors such as i) variable temperature boundary conditions, ii) lateral heat transfer, iii) temperature-variable material properties, and iv) transient operation. New dimensionless parameters, similar to the figure of merit, are introduced, including the device design factor, fin factor, thermal diffusivity factor, and inductance factor. These new device factors allow for a straightforward description of phenomena generally captured only with numerical work. As an example, a device design factor of 0.38, which accounts for the thermal resistance of the hot and cold shoes, can be used to calculate a conversion efficiency of 2.28, while the ideal conversion efficiency based on the figure of merit alone would be 6.15. Likewise, an ideal couple with an efficiency of 6.15 will be reduced to 5.33 when lateral heat transfer is accounted for with a fin factor of 1.0.
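
    For orientation, the sketch below evaluates only the classical ideal conversion efficiency of a thermoelectric couple from its figure of merit and the hot/cold junction temperatures; the paper's device design factor, fin factor, and other corrections are not reproduced, and the ZT values and temperatures are illustrative assumptions.

      # Hedged sketch: ideal thermoelectric conversion efficiency from the figure of
      # merit. The paper's device design, fin, diffusivity and inductance factors are
      # not reproduced; ZT values and junction temperatures are illustrative assumptions.
      import numpy as np

      def ideal_efficiency(zt_mean, t_hot, t_cold):
          """Classical maximum efficiency of an ideal thermoelectric couple."""
          carnot = (t_hot - t_cold) / t_hot
          root = np.sqrt(1.0 + zt_mean)
          return carnot * (root - 1.0) / (root + t_cold / t_hot)

      t_hot, t_cold = 800.0, 400.0     # K, assumptions
      for zt in (0.5, 1.0, 2.0):
          eff = ideal_efficiency(zt, t_hot, t_cold)
          print(f"ZT={zt:.1f}: ideal efficiency = {100 * eff:.2f}%")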

  10. 77 FR 39895 - New Analytic Methods and Sampling Procedures for the United States National Residue Program for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-06

    ... Analytic Methods and Sampling Procedures for the United States National Residue Program for Meat, Poultry... implementing several multi-residue methods for analyzing samples of meat, poultry, and egg products for animal.... These modern, high-efficiency methods will conserve resources and provide useful and reliable results...

  11. An analytically based numerical method for computing view factors in real urban environments

    NASA Astrophysics Data System (ADS)

    Lee, Doo-Il; Woo, Ju-Wan; Lee, Sang-Hyun

    2018-01-01

    A view factor is an important morphological parameter used in parameterizing the in-canyon radiative energy exchange process as well as in characterizing local climate over urban environments. For a realistic representation of the in-canyon radiative processes, a complete set of view factors at the horizontal and vertical surfaces of urban facets is required. Various analytical and numerical methods have been suggested to determine view factors for urban environments, but most methods provide only the sky-view factor at the ground level of a specific location or assume a simplified morphology of complex urban environments. In this study, a numerical method that can determine the sky-view factors (ψ_ga and ψ_wa) and wall-view factors (ψ_gw and ψ_ww) at horizontal and vertical surfaces is presented for application to real urban morphology; it is derived from an analytical formulation of the view factor between two blackbody surfaces of arbitrary geometry. The established numerical method is validated against analytical sky-view factor estimates for ideal street canyon geometries, demonstrating its accuracy with errors of less than 0.2%. Using a three-dimensional building database, the numerical method is also demonstrated to be applicable to determining the sky-view factors at the horizontal (roofs and roads) and vertical (walls) surfaces of real urban environments. The results suggest that the analytically based numerical method can be used for the radiative process parameterization of urban numerical models as well as for the characterization of local urban climate.
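
    To illustrate the kind of validation described here, the sketch below estimates the floor-averaged sky-view factor of an idealized infinite street canyon by Monte Carlo ray casting and compares it with the analytical crossed-strings result sqrt(1 + (H/W)^2) - H/W; the canyon dimensions and sample count are arbitrary, and this 2-D toy is far simpler than the paper's method for real 3-D building data.

      # Hedged sketch: floor-averaged sky-view factor of an idealized infinite street
      # canyon. A Monte Carlo ray-casting estimate is checked against the analytical
      # crossed-strings result sqrt(1+(H/W)^2) - H/W. 2-D toy only; not the paper's
      # 3-D method for real building data.
      import numpy as np

      def svf_analytical(h, w):
          return np.sqrt(1.0 + (h / w) ** 2) - h / w

      def svf_monte_carlo(h, w, n=200_000, seed=0):
          rng = np.random.default_rng(seed)
          x = rng.uniform(0.0, w, n)                   # points on the canyon floor
          # cosine-weighted direction from vertical: pdf(phi) = cos(phi)/2 on [-pi/2, pi/2]
          phi = np.arcsin(2.0 * rng.uniform(0.0, 1.0, n) - 1.0)
          run = h * np.tan(np.abs(phi))                # horizontal travel to reach roof level
          escapes = np.where(phi >= 0.0, run < (w - x), run < x)
          return escapes.mean()

      for aspect in (0.5, 1.0, 2.0):                   # H/W aspect ratios
          h, w = aspect, 1.0
          mc = svf_monte_carlo(h, w)
          an = svf_analytical(h, w)
          print(f"H/W={aspect:.1f}: Monte Carlo SVF={mc:.3f}, analytical={an:.3f}")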

  12. Development of an analytical procedure to study linear alkylbenzenesulphonate (LAS) degradation in sewage sludge-amended soils.

    PubMed

    Comellas, L; Portillo, J L; Vaquero, M T

    1993-12-24

    A procedure for determining linear alkylbenzenesulphonates (LASs) in sewage sludge and amended soils has been developed. Extraction by sample treatment with 0.5 M potassium hydroxide in methanol under reflux was compared with a previously described Soxhlet extraction procedure with methanol and solid sodium hydroxide added to the sample. Repeatability results were similar, with savings in extraction time, solvents, and evaporation time. A clean-up method involving a C18 cartridge has been developed. Analytes were quantified by a reversed-phase HPLC method with UV and fluorescence detectors. Recoveries obtained were higher than 84%. The resulting procedure was applied to soils amended with high doses of sewage sludge (15%) and increasing quantities of added LASs. Degradation data for a 116-day period are presented.

  13. Analytical Ultrasonics in Materials Research and Testing

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1986-01-01

    Research results in analytical ultrasonics for characterizing structural materials, from metals and ceramics to composites, are presented. General topics covered by the conference included: status and advances in analytical ultrasonics for characterizing material microstructures and mechanical properties; status and prospects for ultrasonic measurements of microdamage, degradation, and underlying morphological factors; status and problems in precision measurements of frequency-dependent velocity and attenuation for materials analysis; procedures and requirements for automated, digital signal acquisition, processing, analysis, and interpretation; incentives for analytical ultrasonics in materials research and materials processing, testing, and inspection; and examples of progress in ultrasonics for interrelating microstructure, mechanical properties, and dynamic response.

  14. New test techniques and analytical procedures for understanding the behavior of advanced propellers

    NASA Technical Reports Server (NTRS)

    Stefko, G. L.; Bober, L. J.; Neumann, H. E.

    1983-01-01

    Analytical procedures and experimental techniques were developed to improve the capability to design advanced high speed propellers. Some results from the propeller lifting line and lifting surface aerodynamic analysis codes are compared with propeller force data, probe data and laser velocimeter data. In general, the code comparisons with data indicate good qualitative agreement. A rotating propeller force balance demonstrated good accuracy and reduced test time by 50 percent. Results from three propeller flow visualization techniques are shown which illustrate some of the physical phenomena occurring on these propellers.

  15. Exploring the Different Trajectories of Analytical Thinking Ability Factors: An Application of the Second-Order Growth Curve Factor Model

    ERIC Educational Resources Information Center

    Saengprom, Narumon; Erawan, Waraporn; Damrongpanit, Suntonrapot; Sakulku, Jaruwan

    2015-01-01

    The purposes of this study were 1) to compare analytical thinking ability by testing the same sets of students five times and 2) to develop and verify whether students' analytical thinking ability corresponds to a second-order growth curve factor model. Samples were 1,093 eighth-grade students. The results revealed that 1) analytical thinking ability scores…
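
    For readers unfamiliar with the model class: a second-order (curve-of-factors) growth model places a first-order analytical-thinking factor at each of the five occasions and lets latent intercept and slope factors describe how those factors change over time. A generic specification (illustrative notation only, not the authors' parameterization) is:

    ```latex
    % Generic second-order ("curve of factors") growth model -- a sketch,
    % not the study's actual measurement design or time coding.
    \begin{aligned}
      y_{jt} &= \tau_j + \lambda_j\,\eta_t + \varepsilon_{jt},
          && j = 1,\dots,p \ \text{(indicators at occasion } t\text{)}\\
      \eta_t &= \alpha + a_t\,\beta + \zeta_t,
          && t = 1,\dots,5,\quad a_t = t-1 ,
    \end{aligned}
    ```

    where η_t is the analytical-thinking factor at occasion t and α and β are the second-order intercept and slope factors whose means and variances describe growth.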

  16. Measuring myokines with cardiovascular functions: pre-analytical variables affecting the analytical output.

    PubMed

    Lombardi, Giovanni; Sansoni, Veronica; Banfi, Giuseppe

    2017-08-01

    In the last few years, a growing number of molecules have been associated with an endocrine function of the skeletal muscle. Circulating myokine levels, in turn, have been associated with several pathophysiological conditions, including cardiovascular ones. However, data from different studies are often not completely comparable or even discordant. This is due, at least in part, to the whole set of situations related to the preparation of the patient prior to blood sampling, the blood sampling procedure, and sample processing and/or storage. This entire process constitutes the pre-analytical phase. The importance of the pre-analytical phase is often overlooked, yet in routine diagnostics about 70% of errors arise in this phase. Moreover, errors made during the pre-analytical phase are carried over into the analytical phase and affect the final output. In research, for example, when samples are collected over a long time and by different laboratories, standardized procedures for sample collection and correct procedures for sample storage are essential. In this review, we discuss the pre-analytical variables potentially affecting the measurement of myokines with cardiovascular functions.

  17. Pre-analytical issues in the haemostasis laboratory: guidance for the clinical laboratories.

    PubMed

    Magnette, A; Chatelain, M; Chatelain, B; Ten Cate, H; Mullier, F

    2016-01-01

    Ensuring quality has become a daily requirement in laboratories. In haemostasis, even more than in other disciplines of biology, quality is determined by a pre-analytical step that encompasses all procedures, starting with the formulation of the medical question, and includes patient preparation, sample collection, handling, transportation, processing, and storage until time of analysis. This step, based on a variety of manual activities, is the most vulnerable part of the total testing process and is a major component of the reliability and validity of results in haemostasis and constitutes the most important source of erroneous or un-interpretable results. Pre-analytical errors may occur throughout the testing process and arise from unsuitable, inappropriate or wrongly handled procedures. Problems may arise during the collection of blood specimens such as misidentification of the sample, use of inadequate devices or needles, incorrect order of draw, prolonged tourniquet placing, unsuccessful attempts to locate the vein, incorrect use of additive tubes, collection of unsuitable samples for quality or quantity, inappropriate mixing of a sample, etc. Some factors can alter the result of a sample constituent after collection during transportation, preparation and storage. Laboratory errors can often have serious adverse consequences. Lack of standardized procedures for sample collection accounts for most of the errors encountered within the total testing process. They can also have clinical consequences as well as a significant impact on patient care, especially those related to specialized tests as these are often considered as "diagnostic". Controlling pre-analytical variables is critical since this has a direct influence on the quality of results and on their clinical reliability. The accurate standardization of the pre-analytical phase is of pivotal importance for achieving reliable results of coagulation tests and should reduce the side effects of the influence

  18. Standard operating procedures for pre-analytical handling of blood and urine for metabolomic studies and biobanks.

    PubMed

    Bernini, Patrizia; Bertini, Ivano; Luchinat, Claudio; Nincheri, Paola; Staderini, Samuele; Turano, Paola

    2011-04-01

    (1)H NMR metabolic profiling of urine, serum and plasma has been used to monitor the impact of the pre-analytical steps on the sample quality and stability in order to propose standard operating procedures (SOPs) for deposition in biobanks. We analyzed the quality of serum and plasma samples as a function of the elapsed time (t = 0-4 h) between blood collection and processing and of the time from processing to freezing (up to 24 h). The stability of the urine metabolic profile over time (up to 24 h) at various storage temperatures was monitored as a function of the different pre-analytical treatments like pre-storage centrifugation, filtration, and addition of the bacteriostatic preservative sodium azide. Appreciable changes in the profiles, reflecting changes in the concentration of a number of metabolites, were detected and discussed in terms of chemical and enzymatic reactions for both blood and urine samples. Appropriate procedures for blood derivatives collection and urine preservation/storage that allow maintaining as much as possible the original metabolic profile of the fresh samples emerge, and are proposed as SOPs for biobanking.

  19. Transcription factor-based biosensors enlightened by the analyte

    PubMed Central

    Fernandez-López, Raul; Ruiz, Raul; de la Cruz, Fernando; Moncalián, Gabriel

    2015-01-01

    Whole cell biosensors (WCBs) have multiple applications for environmental monitoring, detecting a wide range of pollutants. WCBs depend critically on the sensitivity and specificity of the transcription factor (TF) used to detect the analyte. We describe the mechanism of regulation and the structural and biochemical properties of TF families that are used, or could be used, for the development of environmental WCBs. Focusing on the chemical nature of the analyte, we review TFs that respond to aromatic compounds (XylS-AraC, XylR-NtrC, and LysR), metal ions (MerR, ArsR, DtxR, Fur, and NikR) or antibiotics (TetR and MarR). Analyzing the structural domains involved in DNA recognition, we highlight the similarities in the DNA binding domains (DBDs) of these TF families. In contrast to the DBDs, the wide range of analytes detected by TFs results in a diversity of structures at the effector binding domain. The modular architecture of TFs opens the possibility of engineering TFs with hybrid DNA and effector specificities. Yet, the lack of a crisp correlation between structural domains and specific functions makes this a challenging task. PMID:26191047

  20. Transcription factor-based biosensors enlightened by the analyte.

    PubMed

    Fernandez-López, Raul; Ruiz, Raul; de la Cruz, Fernando; Moncalián, Gabriel

    2015-01-01

    Whole cell biosensors (WCBs) have multiple applications for environmental monitoring, detecting a wide range of pollutants. WCBs depend critically on the sensitivity and specificity of the transcription factor (TF) used to detect the analyte. We describe the mechanism of regulation and the structural and biochemical properties of TF families that are used, or could be used, for the development of environmental WCBs. Focusing on the chemical nature of the analyte, we review TFs that respond to aromatic compounds (XylS-AraC, XylR-NtrC, and LysR), metal ions (MerR, ArsR, DtxR, Fur, and NikR) or antibiotics (TetR and MarR). Analyzing the structural domains involved in DNA recognition, we highlight the similarities in the DNA binding domains (DBDs) of these TF families. In contrast to the DBDs, the wide range of analytes detected by TFs results in a diversity of structures at the effector binding domain. The modular architecture of TFs opens the possibility of engineering TFs with hybrid DNA and effector specificities. Yet, the lack of a crisp correlation between structural domains and specific functions makes this a challenging task.

  1. Human Factors Considerations for Area Navigation Departure and Arrival Procedures

    NASA Technical Reports Server (NTRS)

    Barhydt, Richard; Adams, Catherine A.

    2006-01-01

    Area navigation (RNAV) procedures are being implemented in the United States and around the world as part of a transition to a performance-based navigation system. These procedures are providing significant benefits and have also caused some human factors issues to emerge. Under sponsorship from the Federal Aviation Administration (FAA), the National Aeronautics and Space Administration (NASA) has undertaken a project to document RNAV-related human factors issues and propose areas for further consideration. The component focusing on RNAV Departure and Arrival Procedures involved discussions with expert users, a literature review, and a focused review of the NASA Aviation Safety Reporting System (ASRS) database. Issues were found to include aspects of air traffic control and airline procedures, aircraft systems, and procedure design. Major findings suggest the need for specific instrument procedure design guidelines that consider the effects of human performance. Ongoing industry and government activities to address air-ground communication terminology, design improvements, and chart-database commonality are strongly encouraged. A review of factors contributing to RNAV in-service errors would likely lead to improved system design and operational performance.

  2. Analytical Procedures for Testability.

    DTIC Science & Technology

    1983-01-01

    Beat Internal Classifications", AD: A018516. "A System of Computer Aided Diagnosis with Blood Serum Chemistry Tests and Bayesian Statistics", AD: 786284...6 LIST OF TALS .. 1. Truth Table ......................................... 49 2. Covering Problem .............................. 93 3. Primary and...quential classification procedure in a coronary care ward is evaluated. In the toxicology field "A System of Computer Aided Diagnosis with Blood Serum

  3. Simultaneous grouping and ranking with combination of SOM and TOPSIS for selection of preferable analytical procedure for furan determination in food.

    PubMed

    Jędrkiewicz, Renata; Tsakovski, Stefan; Lavenu, Aurore; Namieśnik, Jacek; Tobiszewski, Marek

    2018-02-01

    A novel methodology for grouping and ranking with application of self-organizing maps and multicriteria decision analysis is presented. The dataset consists of 22 objects, namely analytical procedures applied to furan determination in food samples. They are described by 10 variables referring to their analytical performance and to environmental and economic aspects. Multivariate statistical analysis makes it possible to limit the amount of input data for the ranking analysis. The assessment results show that the most beneficial procedures are based on microextraction techniques with GC-MS final determination. We show how the information obtained from the two tools complements each other, and the applicability of combining grouping and ranking is also discussed. Copyright © 2017 Elsevier B.V. All rights reserved.
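
    The abstract does not reproduce the implementation, so as a reminder of how the ranking half of such a workflow operates, here is a minimal classical TOPSIS sketch (the SOM grouping step and the study's actual criteria and weights are not reproduced; the toy data are hypothetical).

    ```python
    import numpy as np

    def topsis(X, weights, benefit):
        """Rank alternatives (rows of X) with the classical TOPSIS procedure.

        X        : (n_alternatives, n_criteria) performance matrix
        weights  : criterion weights, summing to 1
        benefit  : booleans, True where larger values are better
                   (False for cost-type criteria such as solvent use or price)

        Returns closeness coefficients; higher = closer to the ideal point.
        """
        X = np.asarray(X, dtype=float)
        w = np.asarray(weights, dtype=float)
        benefit = np.asarray(benefit, dtype=bool)

        # vector normalisation of each criterion column, then weighting
        V = w * X / np.linalg.norm(X, axis=0)

        # ideal and anti-ideal points, direction depending on criterion type
        ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
        anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))

        d_best  = np.linalg.norm(V - ideal, axis=1)
        d_worst = np.linalg.norm(V - anti,  axis=1)
        return d_worst / (d_best + d_worst)

    # toy example: 3 hypothetical procedures x 3 criteria (recovery %, LOD, cost)
    scores = topsis([[95, 0.2, 10], [88, 0.05, 25], [99, 0.5, 5]],
                    weights=[0.5, 0.3, 0.2],
                    benefit=[True, False, False])
    print(np.argsort(scores)[::-1])   # indices ordered from best to worst
    ```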

  4. CTEPP STANDARD OPERATING PROCEDURE FOR PREPARATION OF SURROGATE RECOVERY STANDARD AND INTERNAL STANDARD SOLUTIONS FOR NEUTRAL TARGET ANALYTES (SOP-5.25)

    EPA Science Inventory

    This standard operating procedure describes the method used for preparing internal standard, surrogate recovery standard and calibration standard solutions for neutral analytes used for gas chromatography/mass spectrometry analysis.

  5. CTEPP STANDARD OPERATING PROCEDURE FOR DETECTION AND QUANTIFICATION OF TARGET ANALYTES BY GAS CHROMATOGRAPHY/MASS SPECTROMETRY (GC/MS) (SOP-5.24)

    EPA Science Inventory

    This standard operating procedure describes the method used for the determination of target analytes in sample extracts and related quality assurance/quality control sample extracts generated in the CTEPP study.

  6. FACTOR ANALYTIC MODELS OF CLUSTERED MULTIVARIATE DATA WITH INFORMATIVE CENSORING

    EPA Science Inventory

    This paper describes a general class of factor analytic models for the analysis of clustered multivariate data in the presence of informative missingness. We assume that there are distinct sets of cluster-level latent variables related to the primary outcomes and to the censoring…

  7. In situ impulse test: an experimental and analytical evaluation of data interpretation procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1975-08-01

    Special experimental field testing and analytical studies were undertaken at Fort Lawton in Seattle, Washington, to study "close-in" wave propagation and evaluate data interpretation procedures for a new in situ impulse test. This test was developed to determine the shear wave velocity and dynamic modulus of soils underlying potential nuclear power plant sites. The test is different from conventional geophysical testing in that the velocity variation with strain is determined for each test. In general, strains between 10^-1 and 10^-3 percent are achieved. The experimental field work consisted of performing special tests in a large test sand fill to obtain detailed "close-in" data. Six recording transducers were placed at various points on the energy source, while approximately 37 different transducers were installed within the soil fill, all within 7 feet of the energy source. Velocity measurements were then taken simultaneously under controlled test conditions to study shear wave propagation phenomenology and help evaluate data interpretation procedures. Typical test data are presented along with detailed descriptions of the results.

  8. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...

  9. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this Part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...

  10. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...

  11. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...

  12. Influence of Pre-Analytical Factors on Thymus- and Activation-Regulated Chemokine Quantitation in Plasma

    PubMed Central

    Zhao, Xuemei; Delgado, Liliana; Weiner, Russell; Laterza, Omar F.

    2015-01-01

    Thymus- and activation-regulated chemokine (TARC) in serum/plasma associates with the disease activity of atopic dermatitis (AD), and is a promising tool for assessing the response to the treatment of the disease. TARC also exists within platelets, with elevated levels detectable in AD patients. We examined the effects of pre-analytical factors on the quantitation of TARC in human EDTA plasma. TARC levels in platelet-free plasma were significantly lower than those in platelet-containing plasma. After freeze-thaw, TARC levels increased in platelet-containing plasma, but remained unchanged in platelet-free plasma, suggesting TARC was released from the platelets during the freeze-thaw process. In contrast, TARC levels were stable in serum independent of freeze-thaw. These findings underscore the importance of pre-analytical factors to TARC quantitation. Plasma TARC levels should be measured in platelet-free plasma for accurate quantitation. Pre-analytical factors influence the quantitation, interpretation, and implementation of circulating TARC as a biomarker for the development of AD therapeutics. PMID:28936246

  13. Abdominoplasty: Risk Factors, Complication Rates, and Safety of Combined Procedures.

    PubMed

    Winocour, Julian; Gupta, Varun; Ramirez, J Roberto; Shack, R Bruce; Grotting, James C; Higdon, K Kye

    2015-11-01

    Among aesthetic surgery procedures, abdominoplasty is associated with a higher complication rate, but previous studies are limited by small sample sizes or single-institution experience. A cohort of patients who underwent abdominoplasty between 2008 and 2013 was identified from the CosmetAssure database. Major complications were recorded. Univariate and multivariate analysis was performed evaluating risk factors, including age, smoking, body mass index, sex, diabetes, type of surgical facility, and combined procedures. The authors identified 25,478 abdominoplasties from 183,914 procedures in the database. Of these, 8,975 patients had abdominoplasty alone and 16,503 underwent additional procedures. The number of complications recorded was 1,012 (4.0 percent overall rate versus 1.4 percent in other aesthetic surgery procedures). Of these, 31.5 percent were hematomas, 27.2 percent were infections, and 20.2 percent were suspected or confirmed venous thromboembolism. On multivariate analysis, significant risk factors (p < 0.05) included male sex (relative risk, 1.8), age 55 years or older (1.4), body mass index greater than or equal to 30 (1.3), multiple procedures (1.5), and procedure performance in a hospital or surgical center versus an office-based surgical suite (1.6). Combined procedures increased the risk of complication (abdominoplasty alone, 3.1 percent; with liposuction, 3.8 percent; breast procedure, 4.3 percent; liposuction and breast procedure, 4.6 percent; body-contouring procedure, 6.8 percent; liposuction and body-contouring procedure, 10.4 percent). Abdominoplasty is associated with a higher complication rate compared with other aesthetic procedures. Combined procedures can significantly increase complication rates and should be considered carefully in higher risk patients. Level of evidence: Risk, II.

  14. General Procedure for the Easy Calculation of pH in an Introductory Course of General or Analytical Chemistry

    ERIC Educational Resources Information Center

    Cepriá, Gemma; Salvatella, Luis

    2014-01-01

    All pH calculations for simple acid-base systems used in introductory courses on general or analytical chemistry can be carried out by using a general procedure requiring the use of predominance diagrams. In particular, the pH is calculated as the sum of an independent term equaling the average pKa values of the acids involved in the…
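
    As a worked illustration of the average-pKa term (an example of the general idea, not taken from the paper): for a solution of an amphiprotic species such as HCO3- (e.g., sodium hydrogen carbonate), the predominance-diagram shortcut gives

    ```latex
    % Amphiprotic species HA^- lying between pK_{a1} and pK_{a2};
    % carbonic acid values used for illustration.
    \mathrm{pH} \;\approx\; \tfrac{1}{2}\!\left(\mathrm{p}K_{a1}+\mathrm{p}K_{a2}\right)
                \;=\; \tfrac{1}{2}\,(6.35 + 10.33) \;\approx\; 8.3
    ```

    to a first approximation independently of the salt concentration.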

  15. Validating Analytical Methods

    ERIC Educational Resources Information Center

    Ember, Lois R.

    1977-01-01

    The procedures utilized by the Association of Official Analytical Chemists (AOAC) to develop, evaluate, and validate analytical methods for the analysis of chemical pollutants are detailed. Methods validated by AOAC are used by the EPA and FDA in their enforcement programs and are granted preferential treatment by the courts. (BT)

  16. Factor-Analytic and Individualized Approaches to Constructing Brief Measures of ADHD Behaviors

    ERIC Educational Resources Information Center

    Volpe, Robert J.; Gadow, Kenneth D.; Blom-Hoffman, Jessica; Feinberg, Adam B.

    2009-01-01

    Two studies were performed to examine a factor-analytic and an individualized approach to creating short progress-monitoring measures from the longer "ADHD-Symptom Checklist-4" (ADHD-SC4). In Study 1, teacher ratings on items of the ADHD:Inattentive (IA) and ADHD:Hyperactive-Impulsive (HI) scales of the ADHD-SC4 were factor analyzed in a normative…

  17. Analytic model for academic research productivity having factors, interactions and implications

    PubMed Central

    2011-01-01

    Financial support is dear in academia and will tighten further. How can the research mission be accomplished within new restraints? A model is presented for evaluating source components of academic research productivity. It comprises six factors: funding; investigator quality; efficiency of the research institution; the research mix of novelty, incremental advancement, and confirmatory studies; analytic accuracy; and passion. Their interactions produce output and patterned influences between factors. Strategies for optimizing output are enabled. PMID:22130145

  18. A New Analytic Framework for Moderation Analysis --- Moving Beyond Analytic Interactions

    PubMed Central

    Tang, Wan; Yu, Qin; Crits-Christoph, Paul; Tu, Xin M.

    2009-01-01

    Conceptually, a moderator is a variable that modifies the effect of a predictor on a response. Analytically, a common approach as used in most moderation analyses is to add analytic interactions involving the predictor and moderator in the form of cross-variable products and test the significance of such terms. The narrow scope of such a procedure is inconsistent with the broader conceptual definition of moderation, leading to confusion in interpretation of study findings. In this paper, we develop a new approach to the analytic procedure that is consistent with the concept of moderation. The proposed framework defines moderation as a process that modifies an existing relationship between the predictor and the outcome, rather than simply a test of a predictor by moderator interaction. The approach is illustrated with data from a real study. PMID:20161453
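
    For reference, the conventional product-term specification that the authors argue is too narrow reads (generic notation, not theirs):

    ```latex
    Y = \beta_0 + \beta_1 X + \beta_2 M + \beta_3\,(X \times M) + \varepsilon,
    \qquad
    \frac{\partial\,\mathrm{E}[Y \mid X, M]}{\partial X} = \beta_1 + \beta_3 M ,
    ```

    so the usual test of moderation reduces to H0: β3 = 0; the framework proposed in the paper treats moderation more broadly than this single cross-product test.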

  19. Boolean Minimization and Algebraic Factorization Procedures for Fully Testable Sequential Machines

    DTIC Science & Technology

    1989-09-01

    [Only fragments of this report's abstract are indexed: the title "Boolean Minimization and Algebraic Factorization Procedures for Fully Testable Sequential Machines" by Srinivas Devadas and Kurt Keutzer; a funding acknowledgement under contract number N00014-87-K-0825; and author contact information for Devadas (Department of Electrical Engineering and Computer Science, Room 36-…, MA 02139; (617) 253-0292).]

  20. Pre-analytical Factors Influence Accuracy of Urine Spot Iodine Assessment in Epidemiological Surveys.

    PubMed

    Doggui, Radhouene; El Ati-Hellal, Myriam; Traissac, Pierre; El Ati, Jalila

    2018-03-26

    Urinary iodine concentration (UIC) is commonly used to assess the iodine status of subjects in epidemiological surveys. As pre-analytical factors are an important source of measurement error and studies about this phase are scarce, our objective was to assess the influence of urine sampling conditions on UIC, i.e., whether the child ate breakfast or not, the urine void rank of the day, and the time span between the last meal and urine collection. A nationwide, two-stage, stratified, cross-sectional study including 1560 children (6-12 years) was performed in 2012. UIC was determined by the Sandell-Kolthoff method. Pre-analytical factors were assessed from the children's mothers by using a questionnaire. Associations between iodine status and pre-analytical factors were adjusted for one another and for socio-economic characteristics by multivariate linear and multinomial regression models (RPR: relative prevalence ratios). Skipping breakfast prior to morning urine sampling decreased UIC by 40 to 50 μg/L, and the proportion of UIC < 100 μg/L was higher among children who had skipped breakfast (RPR = 3.2 [1.0-10.4]). In unadjusted analyses, UIC was lower among children sampled more than 5 h after their last meal. UIC decreased with the rank of the urine void (e.g., first vs. second, P < 0.001); the proportion of UIC < 100 μg/L was also greater among 4th-rank samples (vs. second: RPR = 2.1 [1.1-4.0]). Subjects' breakfast status and urine void rank should be accounted for when assessing iodine status. Providing recommendations to standardize pre-analytical factors is a key step toward improving the accuracy and comparability of survey results for assessing iodine status from spot urine samples. These recommendations have to be evaluated by future research.

  1. Green analytical chemistry introduction to chloropropanols determination at no economic and analytical performance costs?

    PubMed

    Jędrkiewicz, Renata; Orłowski, Aleksander; Namieśnik, Jacek; Tobiszewski, Marek

    2016-01-15

    In this study we perform a ranking of analytical procedures for 3-monochloropropane-1,2-diol determination in soy sauces by the PROMETHEE method. Multicriteria decision analysis was performed for three different scenarios - metrological, economic and environmental - by application of different weights to the decision-making criteria. All three scenarios indicate a capillary electrophoresis-based procedure as the most preferable, although the details of the ranking results differ between the scenarios. A second run of rankings was done for scenarios that include only metrological, only economic or only environmental criteria, neglecting the others. These results show that the green analytical chemistry-based selection correlates with the economic one, while there is no correlation with the metrological one. This implies that green analytical chemistry can be brought into laboratories without analytical performance costs, and that it is even supported by economic reasons. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. An Exploratory Investigation of the Factor Structure of the Reynolds Intellectual Assessment Scales (RIAS)

    ERIC Educational Resources Information Center

    Dombrowski, Stefan C.; Watkins, Marley W.; Brogan, Michael J.

    2009-01-01

    This study investigated the factor structure of the Reynolds Intellectual Assessment Scales (RIAS) using rigorous exploratory factor analytic and factor extraction procedures. The results of this study indicate that the RIAS is a single factor test. Despite these results, higher order factor analysis using the Schmid-Leiman procedure indicates…
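
    Since the abstract is truncated before the Schmid-Leiman result is stated, a reminder of what that procedure computes may help. The sketch below is the textbook transformation for a single second-order factor (an illustration only, not the analysis run in the study; loadings are hypothetical).

    ```python
    import numpy as np

    def schmid_leiman(L1, gamma):
        """Schmid-Leiman transformation for a single second-order factor.

        L1    : (p, k) first-order pattern matrix (items x group factors)
        gamma : (k,)   loadings of the k first-order factors on the
                       second-order (general) factor

        Returns (g, G): g = item loadings on the general factor,
        G = residualized (orthogonalized) group-factor loadings.
        Assumes standardized factors and uncorrelated second-order residuals.
        """
        L1 = np.asarray(L1, float)
        gamma = np.asarray(gamma, float)
        g = L1 @ gamma                        # general-factor loadings
        G = L1 * np.sqrt(1.0 - gamma**2)      # residualized group loadings
        return g, G

    # hypothetical 6 items loading on 2 group factors
    L1 = np.array([[.7, .0], [.6, .1], [.8, .0],
                   [.0, .7], [.1, .6], [.0, .8]])
    g, G = schmid_leiman(L1, gamma=np.array([.8, .7]))
    print(np.round(g, 2), np.round(G, 2), sep="\n")
    ```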

  3. ASVCP quality assurance guidelines: control of general analytical factors in veterinary laboratories.

    PubMed

    Flatland, Bente; Freeman, Kathy P; Friedrichs, Kristen R; Vap, Linda M; Getzy, Karen M; Evans, Ellen W; Harr, Kendal E

    2010-09-01

    Owing to lack of governmental regulation of veterinary laboratory performance, veterinarians ideally should demonstrate a commitment to self-monitoring and regulation of laboratory performance from within the profession. In response to member concerns about quality management in veterinary laboratories, the American Society for Veterinary Clinical Pathology (ASVCP) formed a Quality Assurance and Laboratory Standards (QAS) committee in 1996. This committee recently published updated and peer-reviewed Quality Assurance Guidelines on the ASVCP website. The Quality Assurance Guidelines are intended for use by veterinary diagnostic laboratories and veterinary research laboratories that are not covered by the US Food and Drug Administration Good Laboratory Practice standards (Code of Federal Regulations Title 21, Chapter 58). The guidelines have been divided into 3 reports on 1) general analytic factors for veterinary laboratory performance and comparisons, 2) hematology and hemostasis, and 3) clinical chemistry, endocrine assessment, and urinalysis. This report documents recommendations for control of general analytical factors within veterinary clinical laboratories and is based on section 2.1 (Analytical Factors Important In Veterinary Clinical Pathology, General) of the newly revised ASVCP QAS Guidelines. These guidelines are not intended to be all-inclusive; rather, they provide minimum guidelines for quality assurance and quality control for veterinary laboratory testing. It is hoped that these guidelines will provide a basis for laboratories to assess their current practices, determine areas for improvement, and guide continuing professional development and education efforts. ©2010 American Society for Veterinary Clinical Pathology.

  4. Analytical procedures for the determination of fuel combustion products, anti-corrosive compounds, and de-icing compounds in airport runoff water samples.

    PubMed

    Sulej, Anna Maria; Polkowska, Żaneta; Astel, Aleksander; Namieśnik, Jacek

    2013-12-15

    The purpose of this study is to propose and evaluate new procedures for the determination of fuel combustion products, anti-corrosive compounds and de-icing compounds in runoff water samples collected from airports located in different regions and characterized by different levels of activity, expressed as the number of flights and passengers per year. The most difficult step in the analytical procedure used for the determination of PAHs, benzotriazoles and glycols is the sample preparation stage, owing to the diverse matrix composition and the possibility of interference from components with similar physicochemical properties. In this study, five different versions of sample preparation using extraction techniques such as LLE and SPE were tested. In all examined runoff water samples collected from the airports, the presence of PAH compounds and glycols was observed; BT compounds were determined in the majority of the samples. Runoff water samples collected from the areas of Polish and British international airports as well as local airports had similar qualitative composition, but the quantitative composition of the analytes was very diverse. The new, validated analytical methodologies ensure that the information necessary for assessing the negative impact of airport activities on the environment can be obtained. © 2013 Elsevier B.V. All rights reserved.

  5. Clarivate Analytics: Continued Omnia vanitas Impact Factor Culture.

    PubMed

    Teixeira da Silva, Jaime A; Bernès, Sylvain

    2018-02-01

    This opinion paper takes aim at an error made recently by Clarivate Analytics in which it sent out an email that congratulated academics for becoming exclusive members of academia's most cited elite, the Highly Cited Researchers (HCRs). However, that email was sent out to an undisclosed number of non-HCRs, who were offered an apology shortly after, through a bulk mail, which tried to down-play the importance of the error, all the while praising the true HCRs. When Clarivate Analytics senior management was contacted, the company declined to offer an indication of the number of academics who had been contacted and erroneously awarded the HCR status. We believe that this regrettable blunder, together with the opacity offered by the company, fortify the corporate attitude about the value of the journal impact factor (JIF), and what it represents, namely a marketing tool that is falsely used to equate citations with quality, worth, or influence. The continued commercialization of metrics such as the JIF is at the heart of their use to assess the "quality" of a researcher, their work, or a journal, and contributes to a great extent to driving scientific activities towards a futile endeavor.

  6. Risk factors for postoperative urinary tract infection following midurethral sling procedures.

    PubMed

    Doganay, Melike; Cavkaytar, Sabri; Kokanali, Mahmut Kuntay; Ozer, Irfan; Aksakal, Orhan Seyfi; Erkaya, Salim

    2017-04-01

    To identify the potential risk factors for urinary tract infections following midurethral sling procedures. 556 women who underwent a midurethral sling procedure due to stress urinary incontinence over a four-year period were reviewed in this retrospective study. Of the study population, 280 women underwent TVT procedures and 276 women underwent TOT procedures. Patients were evaluated at 4-8 weeks postoperatively and were investigated for the occurrence of a urinary tract infection. Patients who experienced urinary tract infection were defined as cases, and patients who did not were defined as controls. All data were collected from medical records. A multivariate logistic regression model was used to identify the risk factors for urinary tract infection. Of 556 women, 58 (10.4%) were defined as cases while 498 (89.6%) were controls. The mean age of women in cases (57.8±12.9 years) was significantly greater than in controls (51.8±11.2 years) (p<0.001). The presence of menopausal status, previous abdominal surgery, preoperative antibiotic treatment due to urinary tract infection, concomitant vaginal hysterectomy and cystocele repair, TVT procedure and postoperative postvoiding residual bladder volume ≥100 ml were more common in cases than in controls. However, in the multivariate regression model, presence of preoperative urinary tract infection [OR (95% CI)=0.1 (0.1-0.7); p=0.013], TVT procedure [OR (95% CI)=8.4 (3.1-22.3); p=0.000] and postoperative postvoiding residual bladder volume ≥100 ml [OR (95% CI)=4.6 (1.1-19.2); p=0.036] were significant independent risk factors for urinary tract infection following midurethral slings. CONCLUSION: Urinary tract infection after midurethral sling procedures is a relatively common complication. The presence of preoperative urinary tract infection, TVT procedure and postoperative postvoiding residual bladder volume ≥100 ml may increase the risk of this complication. Identification of these factors could help surgeons to
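
    The adjusted odds ratios quoted above come from a standard multivariate logistic regression. A minimal sketch of that kind of analysis is shown below on simulated data; the variable names, effect sizes and sample are illustrative stand-ins, not the study's coded dataset.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # Simulated stand-in for the case-control table (hypothetical data).
    rng = np.random.default_rng(0)
    n = 500
    df = pd.DataFrame({
        "preop_uti":    rng.integers(0, 2, n),   # treated preoperative UTI
        "tvt":          rng.integers(0, 2, n),   # TVT (vs TOT) procedure
        "pvr_ge_100ml": rng.integers(0, 2, n),   # postvoid residual >= 100 ml
    })
    lin = -2.5 + 0.8 * df["preop_uti"] + 1.5 * df["tvt"] + 1.0 * df["pvr_ge_100ml"]
    df["uti"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-lin))).astype(int)

    # Multivariate logistic regression; adjusted odds ratios with 95% CIs,
    # the form of result quoted in the abstract (e.g. OR = 8.4 [3.1-22.3]).
    X = sm.add_constant(df[["preop_uti", "tvt", "pvr_ge_100ml"]])
    fit = sm.Logit(df["uti"], X).fit(disp=0)
    or_ci = np.exp(pd.concat([fit.params, fit.conf_int()], axis=1))
    or_ci.columns = ["OR", "CI 2.5%", "CI 97.5%"]
    print(or_ci.round(2))
    ```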

  7. Background contamination by coplanar polychlorinated biphenyls (PCBs) in trace level high resolution gas chromatography/high resolution mass spectrometry (HRGC/HRMS) analytical procedures.

    PubMed

    Ferrario, J; Byrne, C; Dupuy, A E

    1997-06-01

    The addition of the "dioxin-like" polychlorinated biphenyl (PCB) congeners to the assessment of risk associated with the 2,3,7,8-chlorine substituted dioxins and furans has dramatically increased the number of laboratories worldwide that are developing analytical procedures for their detection and quantitation. Most of these procedures are based on established sample preparation and analytical techniques employing high resolution gas chromatography/high resolution mass spectrometry (HRGC/HRMS), which are used for the analyses of dioxin/furans at low parts-per-trillion (ppt) levels. A significant and widespread problem that arises when using these sample preparation procedures for the analysis of coplanar PCBs is the presence of background levels of these congeners. Industrial processes, urban incineration, leaking electrical transformers, hazardous waste accidents, and improper waste disposal practices have released appreciable quantities of PCBs into the environment. This contamination has resulted in the global distribution of these compounds via the atmosphere and their ubiquitous presence in ambient air. The background presence of these compounds in method blanks must be addressed when determining the exact concentrations of these and other congeners in environmental samples. In this study reliable procedures were developed to accurately define these background levels and assess their variability over the course of the study. The background subtraction procedures developed and employed increase the probability that the values reported accurately represent the concentrations found in the samples and were not biased due to this background contamination.

  8. Background contamination by coplanar polychlorinated biphenyls (PCBs) in trace level high resolution gas chromatography/high resolution mass spectrometry (HRGC/HRMS) analytical procedures

    NASA Technical Reports Server (NTRS)

    Ferrario, J.; Byrne, C.; Dupuy, A. E. Jr

    1997-01-01

    The addition of the "dioxin-like" polychlorinated biphenyl (PCB) congeners to the assessment of risk associated with the 2,3,7,8-chlorine substituted dioxins and furans has dramatically increased the number of laboratories worldwide that are developing analytical procedures for their detection and quantitation. Most of these procedures are based on established sample preparation and analytical techniques employing high resolution gas chromatography/high resolution mass spectrometry (HRGC/HRMS), which are used for the analyses of dioxin/furans at low parts-per-trillion (ppt) levels. A significant and widespread problem that arises when using these sample preparation procedures for the analysis of coplanar PCBs is the presence of background levels of these congeners. Industrial processes, urban incineration, leaking electrical transformers, hazardous waste accidents, and improper waste disposal practices have released appreciable quantities of PCBs into the environment. This contamination has resulted in the global distribution of these compounds via the atmosphere and their ubiquitous presence in ambient air. The background presence of these compounds in method blanks must be addressed when determining the exact concentrations of these and other congeners in environmental samples. In this study reliable procedures were developed to accurately define these background levels and assess their variability over the course of the study. The background subtraction procedures developed and employed increase the probability that the values reported accurately represent the concentrations found in the samples and were not biased due to this background contamination.

  9. Groin hematoma after electrophysiological procedures-incidence and predisposing factors.

    PubMed

    Dalsgaard, Anja Borgen; Jakobsen, Christina Spåbæk; Riahi, Sam; Hjortshøj, Søren

    2014-10-01

    We evaluated the incidence and predisposing factors of groin hematomas after electrophysiological (EP) procedures. Prospective, observational study, enrolling consecutive patients after EP procedures (Atrial fibrillation: n = 151; Supraventricular tachycardia/Diagnostic EP: n = 82; Ventricular tachycardia: n = 18). Patients underwent manual compression for 10 min and 3 h post procedural bed rest. AF ablations were performed with INR 2-3, ACT > 300, and no protamine sulfate. Adhesive pressure dressings (APDs) were used if sheath size ≥ 10F; procedural time > 120 min; and BMI > 30. Patient-reported hematomas were recorded by a telephone follow-up after 2 weeks. Hematoma developed immediately in 26 patients (10%) and after 14 days significant hematoma was reported in 68 patients (27%). Regression analysis on sex, age, BMI 25, ACT 300, use of APD, sheath size and number, and complicated venous access was not associated with hematoma, either immediately after the procedure or after 14 days. Any hematoma presenting immediately after procedures was associated with patient-reported hematomas after 14 days, odds ratio 18.7 (CI 95%: 5.00-69.8; P < 0.001). Any hematoma immediately after EP procedures was the sole predictor of patient-reported hematoma after 2 weeks. Initiatives to prevent groin hematoma should focus on the procedure itself as well as post-procedural care.

  10. Background Contamination by Coplanar Polychlorinated Biphenyls (PCBS) in Trace Level High Resolution Gas Chromatography/High Resolution Mass Spectrometry (HRGC/HRMS) Analytical Procedures

    EPA Science Inventory

    The addition of the "dioxin-like" polychlorinated biphenyl (PCB) congeners to the assessment of risk associated with the 2,3,7,8-chlorine substituted dioxins and furans has dramatically increased the number of laboratories worldwide that are developing analytical procedures for t...

  11. Green Chemistry Metrics with Special Reference to Green Analytical Chemistry.

    PubMed

    Tobiszewski, Marek; Marć, Mariusz; Gałuszka, Agnieszka; Namieśnik, Jacek

    2015-06-12

    The concept of green chemistry is widely recognized in chemical laboratories. To properly measure an environmental impact of chemical processes, dedicated assessment tools are required. This paper summarizes the current state of knowledge in the field of development of green chemistry and green analytical chemistry metrics. The diverse methods used for evaluation of the greenness of organic synthesis, such as eco-footprint, E-Factor, EATOS, and Eco-Scale are described. Both the well-established and recently developed green analytical chemistry metrics, including NEMI labeling and analytical Eco-scale, are presented. Additionally, this paper focuses on the possibility of the use of multivariate statistics in evaluation of environmental impact of analytical procedures. All the above metrics are compared and discussed in terms of their advantages and disadvantages. The current needs and future perspectives in green chemistry metrics are also discussed.
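
    Several of the metrics surveyed reduce to simple mass ratios. As a minimal illustration (hypothetical numbers, not taken from the review), Sheldon's E-factor is just:

    ```python
    def e_factor(total_input_mass_kg, product_mass_kg):
        """Sheldon's E-factor: kg of waste generated per kg of product.
        Everything fed to the process that does not end up in the product
        (solvents, reagents, work-up materials) counts as waste."""
        waste = total_input_mass_kg - product_mass_kg
        return waste / product_mass_kg

    # hypothetical synthesis: 120 kg of total inputs giving 4 kg of product
    print(e_factor(120.0, 4.0))   # -> 29.0 kg waste per kg product
    ```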

  12. Factors Affecting Higher Order Thinking Skills of Students: A Meta-Analytic Structural Equation Modeling Study

    ERIC Educational Resources Information Center

    Budsankom, Prayoonsri; Sawangboon, Tatsirin; Damrongpanit, Suntorapot; Chuensirimongkol, Jariya

    2015-01-01

    The purpose of the research is to develop and identify the validity of factors affecting higher order thinking skills (HOTS) of students. The thinking skills can be divided into three types: analytical, critical, and creative thinking. This analysis is done by applying the meta-analytic structural equation modeling (MASEM) based on a database of…

  13. Current outcomes and risk factors for the Norwood procedure.

    PubMed

    Stasik, Chad N; Gelehrter, S; Goldberg, Caren S; Bove, Edward L; Devaney, Eric J; Ohye, Richard G

    2006-02-01

    Tremendous strides have been made in the outcomes for hypoplastic left heart syndrome and other functional single-ventricle malformations over the past 25 years. This progress relates primarily to improvements in survival for patients undergoing the Norwood procedure. Previous reports on risk factors have been based on smaller groups of patients or on data collected over relatively long periods of time, during which management has evolved. We analyzed our current results for the Norwood procedure with attention to risk factors for poor outcome. A single-institution review of all patients undergoing a Norwood procedure for a single-ventricle malformation from May 1, 2001, through April 30, 2003, was performed. Patient demographics, anatomy, clinical condition, associated anomalies, operative details, and outcomes were recorded. Of the 111 patients, there were 23 (21%) hospital deaths. Univariate analysis revealed noncardiac abnormalities (genetic or significant extracardiac diagnosis, P = .0018), gestational age (P = .03), diagnosis of unbalanced atrioventricular septal defect (P = .017), and weight of less than 2.5 kg (P = .0072) to be related to hospital death. On multivariate analysis, only weight of less than 2.5 kg and noncardiac abnormalities were found to be independent risk factors. Patients with either of these characteristics had a hospital survival of 52% (12/23), whereas those at standard risk had a survival of 86% (76/88). Although improvements in management might have lessened the effect of some of the traditionally reported risk factors related to variations in the cardiovascular anatomy, noncardiac abnormalities and low birth weight remain a future challenge for the physician caring for the patient with single-ventricle physiology.

  14. Numerical procedure to determine geometric view factors for surfaces occluded by cylinders

    NASA Technical Reports Server (NTRS)

    Sawyer, P. L.

    1978-01-01

    A numerical procedure was developed to determine geometric view factors between connected infinite strips occluded by any number of infinite circular cylinders. The procedure requires a two-dimensional cross-sectional model of the configuration of interest. The two-dimensional model consists of a convex polygon enclosing any number of circles. Each side of the polygon represents one strip, and each circle represents a circular cylinder. A description and listing of a computer program based on this procedure are included in this report. The program calculates geometric view factors between individual strips and between individual strips and the collection of occluding cylinders.
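
    The report's own algorithm is not reproduced here, but a compact way to see what such a calculation involves is a two-dimensional Monte Carlo sketch: sample cosine-weighted rays from the emitting strip and count those that reach the receiving strip without first hitting an occluding circle (the cylinder cross-section). All geometry below is hypothetical, and the ray-counting estimator is only an approximation to the deterministic procedure described in the abstract.

    ```python
    import numpy as np

    def ray_segment_t(o, d, p, q):
        """Return t > 0 where the ray o + t*d meets segment p-q, else inf."""
        A = np.column_stack((d, p - q))          # o + t*d = p + s*(q - p)
        if abs(np.linalg.det(A)) < 1e-12:
            return np.inf
        t, s = np.linalg.solve(A, p - o)
        return t if (t > 1e-9 and 0.0 <= s <= 1.0) else np.inf

    def ray_circle_t(o, d, c, r):
        """Return the smallest t > 0 where the ray hits circle (c, r), else inf."""
        oc = o - c
        b = np.dot(oc, d)                         # d is unit length
        disc = b * b - (np.dot(oc, oc) - r * r)
        if disc < 0.0:
            return np.inf
        for t in (-b - np.sqrt(disc), -b + np.sqrt(disc)):
            if t > 1e-9:
                return t
        return np.inf

    def strip_view_factor(strip_a, strip_b, circles=(), n_rays=50_000, seed=0):
        """Monte-Carlo view factor between two infinite strips (2-D cross-section),
        optionally occluded by circular cylinders given as ((cx, cy), radius).
        The emitter's normal is the left-hand normal of its p -> q direction;
        order the endpoints so this normal points toward the receiving strip.
        Statistical error scales as 1/sqrt(n_rays)."""
        rng = np.random.default_rng(seed)
        p, q = (np.asarray(v, float) for v in strip_a)
        pb, qb = (np.asarray(v, float) for v in strip_b)
        tvec = (q - p) / np.linalg.norm(q - p)
        nvec = np.array([-tvec[1], tvec[0]])      # left-hand normal

        hits = 0
        for u1, u2 in rng.random((n_rays, 2)):
            origin = p + u1 * (q - p)             # uniform point on the emitter
            theta = np.arcsin(2.0 * u2 - 1.0)     # cosine-weighted direction (2-D)
            d = np.cos(theta) * nvec + np.sin(theta) * tvec
            t_target = ray_segment_t(origin, d, pb, qb)
            if np.isfinite(t_target) and all(
                    ray_circle_t(origin, d, np.asarray(c, float), r) > t_target
                    for c, r in circles):
                hits += 1
        return hits / n_rays

    # Opposite walls of a unit-square cross-section; without occlusion the
    # crossed-strings value is sqrt(2) - 1 ~ 0.414.
    left  = ((0.0, 1.0), (0.0, 0.0))   # endpoint order makes the normal point +x
    right = ((1.0, 0.0), (1.0, 1.0))   # receiving strip, orientation irrelevant
    print(strip_view_factor(left, right))                                  # ~0.414
    print(strip_view_factor(left, right, circles=[((0.5, 0.5), 0.2)]))     # smaller
    ```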

  15. Beyond Engagement Analytics: Which Online Mixed-Data Factors Predict Student Learning Outcomes?

    ERIC Educational Resources Information Center

    Strang, Kenneth David

    2017-01-01

    This mixed-method study focuses on online learning analytics, a research area of importance. Several important student attributes and their online activities are examined to identify what seems to work best to predict higher grades. The purpose is to explore the relationships between student grade and key learning engagement factors using a large…

  16. Solid sorbent air sampling and analytical procedure for methyl-, dimethyl-, ethyl-, and diethylamine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elskamp, C.J.; Schultz, G.R.

    1986-01-01

    A sampling and analytical procedure for methyl-, dimethyl-, ethyl-, and diethylamine was developed in order to avoid problems typically encountered in the sampling and analysis of low molecular weight aliphatic amines. Samples are collected with adsorbent tubes containing Amberlite XAD-7 resin coated with the derivatizing reagent NBD chloride (7-chloro-4-nitrobenzo-2-oxa-1,3-diazole). Analysis is performed by high performance liquid chromatography with the use of a fluorescence and/or UV/visible detector. All four amines can be monitored simultaneously, and neither collection nor storage is affected by humidity. Samples are stable at room temperature for at least two weeks. The methodology has been tested for each of the four amines at sample loadings equivalent to air concentration ranges of 0.5 to 30 ppm for a sample volume of 10 liters. The method shows promise for determining other airborne primary and secondary low molecular weight aliphatic amines.

  17. Risk factors and serological markers of liver cirrhosis after Fontan procedure.

    PubMed

    Shimizu, Mikiko; Miyamoto, Kenji; Nishihara, Yunosuke; Izumi, Gaku; Sakai, Shuji; Inai, Kei; Nishikawa, Toshio; Nakanishi, Toshio

    2016-09-01

    Liver cirrhosis (LC), which may result in hepatic failure or cancer, has been reported in patients after the Fontan procedure. The purpose of this study was to clarify the frequency and histological characteristics of LC, and to evaluate the risk factors and serological markers of LC with Fontan circulation. A retrospective review of contrast-enhanced CT scans of the liver was carried out in 57 patients after the Fontan procedure. Patients were divided into two groups: LC group (n = 31) and no-LC group (n = 26). Age at Fontan procedure, duration after Fontan procedure, catheterization data, and history of failing Fontan circulation were compared between groups. Serological data including γ-GTP and hyaluronic acid were compared. Histology of autopsy specimens was assessed when available. Duration after Fontan procedure was significantly longer in the LC group than in the no-LC group. A history of failing Fontan circulation was more frequent in the LC group than in the no-LC group. There was no correlation between type of procedure (APC/Bjork/lateral tunnel/TCPC) and LC in this series. Serum hyaluronic acid, γ-GTP, and Forns index were significantly higher in the LC group. The significant risk factor for LC was duration after Fontan procedure (>20 years). In autopsy specimens, histopathological changes of LC were observed predominantly in the central venous area. LC diagnosed with CT is frequent in patients long after the Fontan procedure, especially after 20 years. Hyaluronic acid and γ-GTP could be useful markers to monitor the progression of liver fibrosis in Fontan patients.

  18. Risk Factors Analysis for Occurrence of Asymptomatic Bacteriuria After Endourological Procedures

    PubMed Central

    Junuzovic, Dzelaludin; Hasanbegovic, Munira

    2014-01-01

    Introduction: Endourological procedures are performed according to the principles of aseptic technique, yet urinary tract infections may still occur in a certain number of patients. Considering the risk of urinary tract infection, there is no consensus about the prophylactic use of antibiotics in endourological procedures. Goal: The objective of this study was to determine the connection between endourological procedures and the occurrence of urinary infections and to analyze the risk factors for urinary infection in patients who were hospitalized at the Urology Clinic of the Clinical Center University of Sarajevo (CCUS). Materials and Methods: The research was conducted as a prospective study on a sample of 208 patients of both genders who were hospitalized at the Urology Clinic of the CCUS and for whom an endourological procedure was indicated for diagnostic or therapeutic purposes. We analyzed data from patients' histories of illness, laboratory tests taken at admission and after the endourological procedures, and the surgical programs for endoscopic procedures. All patients were clinically examined prior to the endoscopic procedures, while after treatment attention was focused on symptoms of urinary tract infection. Results: Statistical analysis indicates that there is no significant difference in the presence of postoperative compared with preoperative bacteriuria, which implies that endourological procedures are safe in terms of urinary tract infections. Preoperatively, the most commonly isolated bacterium was Escherichia coli (30.9%) and postoperatively, Enterococcus faecalis (25%). Preoperative bacteriuria, duration of postoperative catheterization, and duration of hospitalization had a statistically significant effect on the occurrence of postoperative bacteriuria. Conclusion: In everyday urological practice, it is very important to identify and control risk factors for the development of urinary infection after

  19. Standard operating procedures for serum and plasma collection: early detection research network consensus statement standard operating procedure integration working group.

    PubMed

    Tuck, Melissa K; Chan, Daniel W; Chia, David; Godwin, Andrew K; Grizzle, William E; Krueger, Karl E; Rom, William; Sanda, Martin; Sorbara, Lynn; Stass, Sanford; Wang, Wendy; Brenner, Dean E

    2009-01-01

    Specimen collection is an integral component of clinical research. Specimens from subjects with various stages of cancers or other conditions, as well as those without disease, are critical tools in the hunt for biomarkers, predictors, or tests that will detect serious diseases earlier or more readily than currently possible. Analytic methodologies evolve quickly. Access to high-quality specimens, collected and handled in standardized ways that minimize potential bias or confounding factors, is key to the "bench to bedside" aim of translational research. It is essential that standard operating procedures, "the how" of creating the repositories, be defined prospectively when designing clinical trials. Small differences in the processing or handling of a specimen can have dramatic effects in analytical reliability and reproducibility, especially when multiplex methods are used. A representative working group, Standard Operating Procedures Internal Working Group (SOPIWG), comprised of members from across Early Detection Research Network (EDRN) was formed to develop standard operating procedures (SOPs) for various types of specimens collected and managed for our biomarker discovery and validation work. This report presents our consensus on SOPs for the collection, processing, handling, and storage of serum and plasma for biomarker discovery and validation.

  20. Proposal of a risk-factor-based analytical approach for integrating occupational health and safety into project risk evaluation.

    PubMed

    Badri, Adel; Nadeau, Sylvie; Gbodossou, André

    2012-09-01

    Excluding occupational health and safety (OHS) from project management is no longer acceptable. Numerous industrial accidents have exposed the ineffectiveness of conventional risk evaluation methods as well as negligence of risk factors having major impact on the health and safety of workers and nearby residents. Lack of reliable and complete evaluations from the beginning of a project generates bad decisions that could end up threatening the very existence of an organization. This article supports a systematic approach to the evaluation of OHS risks and proposes a new procedure based on the number of risk factors identified and their relative significance. A new concept called risk factor concentration along with weighting of risk factor categories as contributors to undesirable events are used in the analytical hierarchy process multi-criteria comparison model with Expert Choice(©) software. A case study is used to illustrate the various steps of the risk evaluation approach and the quick and simple integration of OHS at an early stage of a project. The approach allows continual reassessment of criteria over the course of the project or when new data are acquired. It was thus possible to differentiate the OHS risks from the risk of drop in quality in the case of the factory expansion project. Copyright © 2011 Elsevier Ltd. All rights reserved.
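
    The abstract relies on the analytical hierarchy process as implemented in Expert Choice. As a rough, library-free illustration of the underlying arithmetic (principal-eigenvector weights and the consistency ratio), one might write something like the sketch below; the matrix values are hypothetical, not the case study's judgements.

    ```python
    import numpy as np

    # Saaty's random consistency index for matrix sizes n = 1..10
    RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
          6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45, 10: 1.49}

    def ahp_weights(pairwise):
        """Priority weights and consistency ratio for an AHP pairwise matrix.

        pairwise : (n, n) reciprocal matrix of Saaty 1-9 judgements,
                   pairwise[i, j] = importance of criterion i over criterion j.
        Returns (weights, consistency_ratio); CR < 0.10 is the usual
        acceptance threshold (CR is only meaningful for n >= 3)."""
        A = np.asarray(pairwise, dtype=float)
        n = A.shape[0]
        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                          # principal eigenvector -> weights
        ci = (eigvals[k].real - n) / (n - 1)  # consistency index
        return w, ci / RI[n]

    # Hypothetical comparison of three risk-factor categories
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])
    w, cr = ahp_weights(A)
    print(np.round(w, 3), round(cr, 3))
    ```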

  1. 40 CFR 600.108-08 - Analytical gases.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... ECONOMY AND GREENHOUSE GAS EXHAUST EMISSIONS OF MOTOR VEHICLES Fuel Economy and Carbon-Related Exhaust Emission Test Procedures § 600.108-08 Analytical gases. The analytical gases for all fuel economy testing...

  2. 40 CFR 600.108-08 - Analytical gases.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... ECONOMY AND GREENHOUSE GAS EXHAUST EMISSIONS OF MOTOR VEHICLES Fuel Economy and Carbon-Related Exhaust Emission Test Procedures § 600.108-08 Analytical gases. The analytical gases for all fuel economy testing...

  3. Automatic computer procedure for generating exact and analytical kinetic energy operators based on the polyspherical approach: General formulation and removal of singularities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ndong, Mamadou; Lauvergnat, David; Nauts, André

    2013-11-28

    We present new techniques for an automatic computation of the kinetic energy operator in analytical form. These techniques are based on the use of the polyspherical approach and are extended to take into account Cartesian coordinates as well. An automatic procedure is developed where analytical expressions are obtained by symbolic calculations. This procedure is a full generalization of the one presented in Ndong et al. [J. Chem. Phys. 136, 034107 (2012)]. The correctness of the new implementation is analyzed by comparison with results obtained from the TNUM program. We give several illustrations that could be useful for users of the code. In particular, we discuss some cyclic compounds which are important in photochemistry. Among others, we show that choosing a well-adapted parameterization and decomposition into subsystems can allow one to avoid singularities in the kinetic energy operator. We also discuss a relation between polyspherical and Z-matrix coordinates: this comparison could be helpful for building an interface between the new code and a quantum chemistry package.

  4. Analytical procedures for the determination of selected trace elements in peat and plant samples by inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Krachler, Michael; Mohl, Carola; Emons, Hendrik; Shotyk, William

    2002-08-01

    A simple, robust and reliable analytical procedure for the determination of 15 elements, namely Ca, V, Cr, Mn, Co, Ni, Cu, Zn, Rb, Ag, Cd, Ba, Tl, Th and U, in peat and plant materials by inductively coupled plasma-quadrupole mass spectrometry (ICP-QMS) was developed. Powdered sample aliquots of approximately 220 mg were dissolved with various acid mixtures in a microwave-heated high-pressure autoclave capable of digesting 40 samples simultaneously. The selection of appropriate amounts of digestion acids (nitric acid, hydrofluoric acid or tetrafluoroboric acid) was crucial to obtain accurate results. The optimized acid mixture for digestion of plant and peat samples consisted of 3 ml HNO₃ and 0.1 ml HBF₄. An ultrasonic nebulizer with an additional membrane desolvation unit was found beneficial for the determination of Co, Ni, Ag, Tl, Th and U, allowing a dry sample aerosol to be aspirated into the ICP-QMS. A pneumatic cross-flow nebulizer served as sample introduction device for the other elements. Internal standardization was achieved with ¹⁰³Rh for all elements, except for Th, whose ICP-QMS signals were corrected using ¹⁰³Rh and ¹⁸⁵Re. Quality control was ascertained by analysis of the certified plant reference material GBW 07602 Bush Branches and Leaves. In almost all cases HNO₃ alone could not fully liberate the analytes of interest from the peat or plant matrix, probably because of the silicates present. After adding small amounts (0.05-0.1 ml) of either HF or HBF₄ to the digestion mixture, concentrations quantified by ICP-QMS generally increased significantly, in the case of Rb by up to 80%. Further increasing the volumes of HF or HBF₄, in turn, resulted in a loss of recovery for almost all elements, in some cases by approximately 60%. The successful analytical procedures were applied to the determination of two bulk peat materials. In general, good agreement between the found concentrations and results from an inter-laboratory trial or from instrumental
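
    A minimal sketch of the internal-standardization step mentioned above: analyte signals are rescaled by the recovery of the ¹⁰³Rh spike. All counts are hypothetical.

    ```python
    # Hypothetical illustration of internal standardization in ICP-QMS:
    # analyte counts are rescaled by the drift/recovery of the 103Rh spike.
    rh_expected = 1.0e5      # counts expected for the 103Rh internal standard
    rh_measured = 9.2e4      # counts actually measured in this sample

    raw_counts = {"Cu": 4.1e4, "Cd": 8.5e2, "Tl": 1.3e3}   # raw analyte counts
    correction = rh_expected / rh_measured

    corrected = {el: counts * correction for el, counts in raw_counts.items()}
    print(corrected)
    ```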

  5. A GRAPHICAL DIAGNOSTIC METHOD FOR ASSESSING THE ROTATION IN FACTOR ANALYTICAL MODELS OF ATMOSPHERIC POLLUTION. (R831078)

    EPA Science Inventory

    Factor analytic tools such as principal component analysis (PCA) and positive matrix factorization (PMF), suffer from rotational ambiguity in the results: different solutions (factors) provide equally good fits to the measured data. The PMF model imposes non-negativity of both...

  6. Analytical aids in land management planning

    Treesearch

    David R. Betters

    1978-01-01

    Quantitative techniques may be applied to aid in completing various phases of land management planning. Analytical procedures which have been used include a procedure for public involvement, PUBLIC; a matrix information generator, MAGE5; an allocation procedure, linear programming (LP); and an input-output economic analysis (EA). These techniques have proven useful in...

  7. Psychometric Structure of a Comprehensive Objective Structured Clinical Examination: A Factor Analytic Approach

    ERIC Educational Resources Information Center

    Volkan, Kevin; Simon, Steven R.; Baker, Harley; Todres, I. David

    2004-01-01

    Problem Statement and Background: While the psychometric properties of Objective Structured Clinical Examinations (OSCEs) have been studied, their latent structures have not been well characterized. This study examines a factor analytic model of a comprehensive OSCE and addresses implications for measurement of clinical performance. Methods: An…

  8. Latent structure of the Wisconsin Card Sorting Test: a confirmatory factor analytic study.

    PubMed

    Greve, Kevin W; Stickle, Timothy R; Love, Jeffrey M; Bianchini, Kevin J; Stanford, Matthew S

    2005-05-01

    The present study represents the first large scale confirmatory factor analysis of the Wisconsin Card Sorting Test (WCST). The results generally support the three factor solutions reported in the exploratory factor analysis literature. However, only the first factor, which reflects general executive functioning, is statistically sound. The secondary factors, while likely reflecting meaningful cognitive abilities, are less stable except when all subjects complete all 128 cards. It is likely that having two discontinuation rules for the WCST has contributed to the varied factor analytic solutions reported in the literature and early discontinuation may result in some loss of useful information. Continued multivariate research will be necessary to better clarify the processes underlying WCST performance and their relationships to one another.

  9. A Factor Analytic Study of a Scale Designed to Measure Death Anxiety.

    ERIC Educational Resources Information Center

    Thorson, James A.; Perkins, Mark

    A death anxiety scale developed in 1973 by Nehrke was administered to 655 adult subjects. Their responses were differentiated according to age, sex, race, and level of education. Data were also analyzed using the varimax rotated factor matrix procedure to determine significant factors that the scale was, in fact, measuring. Loadings on four…
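
    A minimal sketch of a varimax-rotated factor extraction of the kind described above, using scikit-learn on a placeholder item-response matrix; the number of items and the random data are purely illustrative.

    ```python
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    # Placeholder item-response matrix: 655 respondents x 30 hypothetical items
    # (the real scale responses would go here).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(655, 30))

    fa = FactorAnalysis(n_components=4, rotation="varimax")
    fa.fit(X)

    loadings = fa.components_.T          # items x factors loading matrix
    print(np.round(loadings[:5], 2))     # loadings of the first few items
    ```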

  10. Request Pattern, Pre-Analytical and Analytical Conditions of Urinalysis in Primary Care: Lessons from a One-Year Large-Scale Multicenter Study.

    PubMed

    Salinas, Maria; Lopez-Garrigos, Maite; Flores, Emilio; Leiva-Salinas, Carlos

    2018-06-01

    To study the urinalysis request, pre-analytical sample conditions, and analytical procedures. Laboratories were asked to provide the number of primary care urinalyses requested, and to fill out a questionnaire regarding pre-analytical conditions and analytical procedures. 110 laboratories participated in the study. 232.5 urinalyses/1,000 inhabitants were reported. 75.4% used the first morning urine. The sample reached the laboratory in less than 2 hours in 18.8%, between 2 - 4 hours in 78.3%, and between 4 - 6 hours in the remaining 2.9%. 92.5% combined the use of test strip and particle analysis, and only 7.5% used the strip exclusively. All participants except one performed automated particle analysis depending on strip results; in 16.2% the procedure was only manual. Urinalysis was highly requested. There was a lack of compliance with guidelines regarding time between micturition and analysis that usually involved the combination of strip followed by particle analysis.

  11. Preterm delivery risk factors in singletons born after in vitro fertilization procedures.

    PubMed

    Ban Frangez, Helena; Korosec, Sara; Verdenik, Ivan; Kotar, Vanja; Kladnik, Urska; Vrtacnik Bokal, Eda

    2014-05-01

    Women delivering singletons after in vitro fertilization (IVF) procedures have a greater risk of preterm delivery (PD). The aim of our study was to analyze PD risk factors and to identify those that could possibly be prevented. In our matched controlled study we analyzed 1127 singleton deliveries after IVF and transfer of fresh embryos performed at the University Medical Center Ljubljana between 1 January 2002 and 31 December 2010. For every delivery included in the study group we chose three consecutive controls matched by maternal age, parity and maternity hospital. The main outcome measure was PD (<37 weeks). Investigated variables were: previous PD (PPD), conization, pregestational diabetes mellitus, uterine anomaly, operation on the uterus, chronic renal disease, maternal age and parity, and body mass index (BMI). Variables investigated within the IVF group were: stimulation protocol, laboratory procedure, number of retrieved oocytes and number and quality of transferred embryos. The PD rate after IVF was 1.5 times higher than after natural conception (11.5% in the IVF group and 7.7% in the control group, p<0.001). Conization and chronic renal disease were shown to be significant risk factors for PD in both the IVF group and the naturally conceiving controls. BMI>30 was an important risk factor only in the IVF group (OR 1.86 (1.06-3.27) vs. 1.10 (0.67-1.80)) and PPD only in the controls (OR 1.83 (0.78-4.28) vs. 3.22 (1.55-6.67)). Among the investigated PD risk factors, an IVF procedure was shown to be the fifth most important one. On analyzing parameters of the ovarian stimulation and IVF procedure, no PD risk factor was identified. IVF was shown to be a significant risk factor for PD. In the IVF population, BMI plays a far more important role in PD than in the fertile population. In our research, PD recurrence in the IVF group was less than expected, which could perhaps be explained by the surgical correction of gynecological pathology and, where necessary, its

  12. A Bayesian Multi-Level Factor Analytic Model of Consumer Price Sensitivities across Categories

    ERIC Educational Resources Information Center

    Duvvuri, Sri Devi; Gruca, Thomas S.

    2010-01-01

    Identifying price sensitive consumers is an important problem in marketing. We develop a Bayesian multi-level factor analytic model of the covariation among household-level price sensitivities across product categories that are substitutes. Based on a multivariate probit model of category incidence, this framework also allows the researcher to…

  13. Human Factors and ISS Medical Systems: Highlights of Procedures and Equipment Findings

    NASA Technical Reports Server (NTRS)

    Byrne, V. E.; Hudy, C.; Smith, D.; Whitmore, M.

    2005-01-01

    As part of the Space Human Factors Engineering Critical Questions Roadmap, a three-year Technology Development Project (TDP) was funded by NASA Headquarters to examine emergency medical procedures on ISS. The overall aim of the emergency medical procedures project was to determine the human factors issues in the procedures, training, communications and equipment, and to recommend solutions that will improve the survival rate of crewmembers in the event of a medical emergency. Currently, each ISS crew remains on orbit for six-month intervals. As there is no standing requirement for a physician crewmember, the maintenance of crew health during that time is dependent on the individual crewmembers. Further, in the event of an emergency, the crew will need to provide prolonged maintenance care, as well as emergency treatment, to an injured crewmember while awaiting transport to Earth. In addition to the isolation of the crew, medical procedures must be carried out within the further limitations imposed by the physical environment of the space station. For example, in order to administer care on ISS without the benefit of gravity, the Crew Medical Officers (CMOs) must restrain the equipment required to perform the task, restrain the injured crewmember, and finally, restrain themselves. Both the physical environment and the physical space available further limit the technology that can be used onboard. Equipment must be compact, yet able to withstand high levels of radiation and function without gravity. The focus here is to highlight the human factors findings from our three-year project in the procedures and equipment areas, which have proven valuable to ISS and provide groundwork for human factors requirements for medical applications on exploration missions.

  14. The Structure of Temperament in Preschoolers: A Two-Stage Factor Analytic Approach

    PubMed Central

    Dyson, Margaret W.; Olino, Thomas M.; Durbin, C. Emily; Goldsmith, H. Hill; Klein, Daniel N.

    2012-01-01

    The structure of temperament traits in young children has been the subject of extensive debate, with separate models proposing different trait dimensions. This research has relied almost exclusively on parent-report measures. The present study used an alternative approach, a laboratory observational measure, to explore the structure of temperament in preschoolers. A 2-stage factor analytic approach, exploratory factor analyses (n = 274) followed by confirmatory factor analyses (n = 276), was used. We retrieved an adequately fitting model that consisted of 5 dimensions: Sociability, Positive Affect/Interest, Dysphoria, Fear/Inhibition, and Constraint versus Impulsivity. This solution overlaps with, but is also distinct from, the major models derived from parent-report measures. PMID:21859196

  15. 7 CFR 91.23 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 91.23 Section 91.23 Agriculture... SERVICES AND GENERAL INFORMATION Method Manuals § 91.23 Analytical methods. Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...

  16. 7 CFR 91.23 - Analytical methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 91.23 Section 91.23 Agriculture... SERVICES AND GENERAL INFORMATION Method Manuals § 91.23 Analytical methods. Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...

  17. 7 CFR 91.23 - Analytical methods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 91.23 Section 91.23 Agriculture... SERVICES AND GENERAL INFORMATION Method Manuals § 91.23 Analytical methods. Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...

  18. Procedures For Microbial-Ecology Laboratory

    NASA Technical Reports Server (NTRS)

    Huff, Timothy L.

    1993-01-01

    Microbial Ecology Laboratory Procedures Manual provides concise and well-defined instructions on routine technical procedures to be followed in microbiological laboratory to ensure safety, analytical control, and validity of results.

  19. A Class of Factor Analysis Estimation Procedures with Common Asymptotic Sampling Properties

    ERIC Educational Resources Information Center

    Swain, A. J.

    1975-01-01

    Considers a class of estimation procedures for the factor model. The procedures are shown to yield estimates possessing the same asymptotic sampling properties as those from estimation by maximum likelihood or generalized least squares, both special members of the class. General expressions for the derivatives needed for Newton-Raphson…

  20. Universal analytical scattering form factor for shell-, core-shell, or homogeneous particles with continuously variable density profile shape.

    PubMed

    Foster, Tobias

    2011-09-01

    A novel analytical and continuous density distribution function with a widely variable shape is reported and used to derive an analytical scattering form factor that allows us to universally describe the scattering from particles with the radial density profile of homogeneous spheres, shells, or core-shell particles. Composed by the sum of two Fermi-Dirac distribution functions, the shape of the density profile can be altered continuously from step-like via Gaussian-like or parabolic to asymptotically hyperbolic by varying a single "shape parameter", d. Using this density profile, the scattering form factor can be calculated numerically. An analytical form factor can be derived using an approximate expression for the original Fermi-Dirac distribution function. This approximation is accurate for sufficiently small rescaled shape parameters, d/R (R being the particle radius), up to values of d/R ≈ 0.1, and thus captures step-like, Gaussian-like, and parabolic as well as asymptotically hyperbolic profile shapes. It is expected that this form factor is particularly useful in a model-dependent analysis of small-angle scattering data since the applied continuous and analytical function for the particle density profile can be compared directly with the density profile extracted from the data by model-free approaches like the generalized inverse Fourier transform method. © 2011 American Chemical Society
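
    A minimal numerical sketch (not the paper's closed-form result) of the described construction: the radial density is the sum of two Fermi-Dirac profiles, and the spherically symmetric form factor F(q) = 4π ∫ ρ(r) r² sin(qr)/(qr) dr is evaluated by quadrature; radius, shape parameter and weights are arbitrary.

    ```python
    import numpy as np
    from scipy.integrate import quad

    R, d = 10.0, 1.0       # particle radius and shape parameter (arbitrary units)

    def rho(r, R=R, d=d, w=0.5):
        """Radial density: sum of two Fermi-Dirac profiles (illustrative weights)."""
        return (w / (1.0 + np.exp((r - 0.5 * R) / d))
                + (1.0 - w) / (1.0 + np.exp((r - R) / d)))

    def form_factor(q):
        """F(q) = 4*pi * integral of rho(r) * r^2 * sin(q r)/(q r) dr."""
        integrand = lambda r: rho(r) * r**2 * np.sinc(q * r / np.pi)
        val, _ = quad(integrand, 0.0, 5.0 * R, limit=200)
        return 4.0 * np.pi * val

    qs = np.linspace(1e-3, 2.0, 50)
    F = np.array([form_factor(q) for q in qs])
    print(F[:5] / F[0])        # normalized form factor at small q
    ```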

  1. An analytical approach for the calculation of stress-intensity factors in transformation-toughened ceramics

    NASA Astrophysics Data System (ADS)

    Müller, W. H.

    1990-12-01

    Stress-induced transformation toughening in Zirconia-containing ceramics is described analytically by means of a quantitative model: a Griffith crack which interacts with a transformed, circular Zirconia inclusion. Due to its volume expansion, a ZrO2 particle compresses the crack flanks, whereas a particle in front of the crack opens the flanks such that the crack will be attracted and finally absorbed. Erdogan's integral equation technique is applied to calculate the dislocation functions and the stress-intensity factors which correspond to these situations. In order to derive analytical expressions, the elastic constants of the inclusion and the matrix are assumed to be equal.

  2. Constraints on the [Formula: see text] form factor from analyticity and unitarity.

    PubMed

    Ananthanarayan, B; Caprini, I; Kubis, B

    Motivated by the discrepancies noted recently between the theoretical calculations of the electromagnetic [Formula: see text] form factor and certain experimental data, we investigate this form factor using analyticity and unitarity in a framework known as the method of unitarity bounds. We use a QCD correlator computed on the spacelike axis by operator product expansion and perturbative QCD as input, and exploit unitarity and the positivity of its spectral function, including the two-pion contribution that can be reliably calculated using high-precision data on the pion form factor. From this information, we derive upper and lower bounds on the modulus of the [Formula: see text] form factor in the elastic region. The results provide a significant check on those obtained with standard dispersion relations, confirming the existence of a disagreement with experimental data in the region around [Formula: see text].

  3. Analytical procedure for characterization of medieval wall-paintings by X-ray fluorescence spectrometry, laser ablation inductively coupled plasma mass spectrometry and Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Syta, Olga; Rozum, Karol; Choińska, Marta; Zielińska, Dobrochna; Żukowska, Grażyna Zofia; Kijowska, Agnieszka; Wagner, Barbara

    2014-11-01

    An analytical procedure for the comprehensive chemical characterization of samples from medieval Nubian wall-paintings by means of portable X-ray fluorescence (pXRF), laser ablation inductively coupled plasma mass spectrometry (LA-ICPMS) and Raman spectroscopy (RS) was proposed in this work. The procedure was used for elemental and molecular investigations of samples from archeological excavations in Nubia (modern southern Egypt and northern Sudan). Numerous remains of churches with painted decorations dating back to the 7th-14th centuries were excavated in the region of the medieval kingdoms of Nubia, but many aspects of this art and its technology are still unknown. Samples from the selected archeological sites (Faras, Old Dongola and Banganarti) were analyzed in the form of transfers (n = 26), small fragments collected during the excavations (n = 35) and cross sections (n = 15). XRF was used to collect data about elemental composition, LA-ICPMS allowed mapping of selected elements, while RS was used to get the molecular information about the samples. The preliminary results indicated the usefulness of the proposed analytical procedure for distinguishing the substances from both the surface and sub-surface domains of the wall-paintings. The possibility of identifying raw materials from the wall-paintings will be used in further systematic archeometric studies devoted to the detailed comparison of various historic Nubian centers.

  4. Cosmetic Liposuction: Preoperative Risk Factors, Major Complication Rates, and Safety of Combined Procedures.

    PubMed

    Kaoutzanis, Christodoulos; Gupta, Varun; Winocour, Julian; Layliev, John; Ramirez, Roberto; Grotting, James C; Higdon, Kent

    2017-06-01

    Liposuction is among the most commonly performed aesthetic procedures, and is being performed increasingly as an adjunct to other procedures. To report the incidence and risk factors of significant complications after liposuction, and to determine whether adding liposuction to other cosmetic surgical procedures impacts the complication risk. A prospective cohort of patients who underwent liposuction between 2008 and 2013 was identified from the CosmetAssure database. Primary outcome was occurrence of major complications requiring emergency room visit, hospital admission, or reoperation within 30 days of the operation. Univariate and multivariate analysis evaluated risk factors including age, gender, body mass index (BMI), smoking, diabetes, type of surgical facility, and combined procedures. Of the 31,010 liposuction procedures, only 11,490 (37.1%) were performed as a solitary procedure. Liposuction alone had a major complication rate of 0.7% with hematoma (0.15%), pulmonary complications (0.1%), infection (0.1%), and confirmed venous thromboembolism (VTE) (0.06%) being the most common. Independent predictors of major complications included combined procedures (Relative Risk (RR) 4.81), age (RR 1.01), BMI (RR 1.05), and procedures performed in hospitals (RR 1.36). When examining specifically other aesthetic procedures performed alone or with liposuction, combined procedures had a higher risk of confirmed VTE (RR 5.65), pulmonary complications (RR 2.72), and infection (RR 2.41), but paradoxically lower hematoma risk (RR 0.77) than solitary procedures. Liposuction performed alone is a safe procedure with a low risk of major complications. Combined procedures, especially on obese or older individuals, can significantly increase complication rates. The impact of liposuction on the risk of hematoma in combined procedures needs further investigation. © 2017 The American Society for Aesthetic Plastic Surgery, Inc. Reprints and permission: journals.permissions@oup.com

  5. Development of analytical procedures for the determination of hexavalent chromium in corrosion prevention coatings used in the automotive industry.

    PubMed

    Séby, F; Castetbon, A; Ortega, R; Guimon, C; Niveau, F; Barrois-Oudin, N; Garraud, H; Donard, O F X

    2008-05-01

    The European directive 2000/53/EC limits the use of Cr(VI) in vehicle manufacturing. Although a maximum of 2 g of Cr(VI) was authorised per vehicle for corrosion prevention coatings of key components, since July 2007 its use has been prohibited except for some particular applications. Therefore, the objective of this work was to develop direct analytical procedures for Cr(VI) determination in the different steel coatings used for screws. Instead of working directly with screws, the optimisation of the procedures was carried out with metallic plates homogeneously coated to improve the data comparability. Extraction of Cr(VI) from the metallic parts was performed by sonication. Two extraction solutions were tested: a direct water extraction solution used in standard protocols and an ammonium/ammonia buffer solution at pH 8.9. The extracts were further analysed for Cr speciation by high-performance liquid chromatography (HPLC) inductively coupled plasma (ICP) atomic emission spectrometry or HPLC ICP mass spectrometry depending on the concentration level. When possible, the coatings were also directly analysed by solid speciation techniques (X-ray photoelectron spectroscopy, XPS, and X-ray absorption near-edge structure, XANES) for validation of the results. Very good results between the different analytical approaches were obtained for the sample of coating made up of a heated paint containing Zn, Al and Cr when using the extracting buffer solution at pH 8.9. After a repeated four-step extraction procedure on the same portion test, taking into account the depth of the surface layer reached, good agreement with XPS and XANES results was obtained. In contrast, for the coatings composed of an alkaline Zn layer where Cr(VI) and Cr(III) are deposited, only the extraction procedure using water allowed the detection of Cr(VI). To elucidate the Cr(VI) reduction during extraction at pH 8.9, the reactivity of Cr(VI) towards different species of Zn generally present in the

  6. Analytical procedure for the determination of Ethyl Lauroyl Arginate (LAE) to assess the kinetics and specific migration from a new antimicrobial active food packaging.

    PubMed

    Pezo, Davinson; Navascués, Beatriz; Salafranca, Jesús; Nerín, Cristina

    2012-10-01

    Ethyl Lauroyl Arginate (LAE) is a cationic tensoactive compound, soluble in water, with a wide activity spectrum against moulds and bacteria. LAE has been incorporated as an antimicrobial agent into packaging materials for food contact, and these materials are required to comply with the specific migration criteria. In this paper, an analytical procedure has been developed and optimized for the analysis of LAE in food simulants after the migration tests. It consists of the formation of an ionic pair between LAE and the inorganic complex Co(SCN)₄²⁻ in aqueous solution, followed by a liquid-liquid extraction into a suitable organic solvent and further UV-Vis absorbance measurement. In order to evaluate possible interferences, the ionic pair has also been analyzed by high-performance liquid chromatography with UV-Vis detection. Both procedures provided similar analytical characteristics, with linear ranges from 1.10 to 25.00 mg kg⁻¹, linearity higher than 0.9886, limits of detection and quantification of 0.33 and 1.10 mg kg⁻¹, respectively, accuracy better than 1% as relative error and precision better than 3.6% expressed as RSD. Optimization of the analytical techniques, thermal and chemical stability of LAE, as well as migration kinetics of LAE from experimental active packaging are reported and discussed. Copyright © 2012 Elsevier B.V. All rights reserved.
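
    A minimal sketch of how detection and quantification limits of the kind quoted above can be estimated from a calibration curve (ICH-style 3.3σ/slope and 10σ/slope); the calibration points are hypothetical, and the authors' exact procedure may differ.

    ```python
    import numpy as np

    # Hypothetical calibration data: LAE concentration (mg/kg) vs. absorbance.
    conc = np.array([1.0, 2.5, 5.0, 10.0, 15.0, 25.0])
    resp = np.array([0.021, 0.050, 0.098, 0.201, 0.297, 0.502])

    slope, intercept = np.polyfit(conc, resp, 1)
    residuals = resp - (slope * conc + intercept)
    sigma = residuals.std(ddof=2)        # residual standard deviation of the fit

    lod = 3.3 * sigma / slope
    loq = 10.0 * sigma / slope
    print(f"slope={slope:.4f}, LOD={lod:.2f} mg/kg, LOQ={loq:.2f} mg/kg")
    ```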

  7. LC-MS/MS analytical procedure to quantify tris(nonylphenyl)phosphite, as a source of the endocrine disruptors 4-nonylphenols, in food packaging materials.

    PubMed

    Mottier, Pascal; Frank, Nancy; Dubois, Mathieu; Tarres, Adrienne; Bessaire, Thomas; Romero, Roman; Delatour, Thierry

    2014-01-01

    Tris(nonylphenyl)phosphite, an antioxidant used in polyethylene resins for food applications, is problematic since it is a source of the endocrine-disrupting chemicals 4-nonylphenols (4NP) upon migration into packaged foods. As a response to concerns surrounding the presence of 4NP-based compounds in packaging materials, some resin producers and additive suppliers have decided to eliminate TNPP from formulations. This paper describes an analytical procedure to verify the "TNPP-free" statement in multilayer laminates used for bag-in-box packaging. The method involves extraction of TNPP from laminates with organic solvents followed by detection/quantification by LC-MS/MS using the atmospheric pressure chemical ionisation (APCI) mode. A further acidic treatment of the latter extract allows the release of 4NP from potentially extracted TNPP. 4NP is then analysed by LC-MS/MS using electrospray ionisation (ESI) mode. This two-step analytical procedure ensures not only TNPP quantification in laminates, but also allows the flagging of other possible sources of 4NP in such packaging materials, typically as non-intentionally added substances (NIAS). The limits of quantification were 0.50 and 0.48 µg dm⁻² for TNPP and 4NP in laminates, respectively, with recoveries ranging between 87% and 114%. Usage of such analytical methodologies in quality control operations has pointed to a lack of traceability at the packaging supplier level and cross-contamination of extrusion equipment at the converter level, when TNPP-containing laminates are processed on the same machine beforehand.

  8. Recent trends in analytical procedures in forensic toxicology.

    PubMed

    Van Bocxlaer, Jan F

    2005-12-01

    Forensic toxicology is a very demanding discipline, heavily dependent on good analytical techniques. That is why new trends appear continuously. In the past years, LC-MS has revolutionized target compound analysis and has become the trend, also in toxicology. In LC-MS screening analysis, things are less straightforward and several approaches exist. One promising approach based on accurate LC-MS-TOF mass measurements and elemental-formula-based library searches is discussed. This way of screening has already proven its applicability, but at the same time it became obvious that a single accurate mass measurement lacks some specificity when using large compound libraries. CE, too, is a re-emerging approach. The increasingly polar and ionic molecules encountered make it a worthwhile addition to e.g. LC, as illustrated for the analysis of GHB. A third recent trend is the use of MALDI mass spectrometry for small molecules. It is promising for its ease of use and high throughput. Unfortunately, reports of disappointment but also of accomplishment, e.g. the quantitative analysis of LSD as discussed here, alternate, and it remains to be seen whether MALDI really will establish itself. Indeed, not all new trends will prove themselves, but the mere fact that many appear in the world of analytical toxicology nowadays is, in itself, encouraging for the future of (forensic) toxicology.

  9. A confirmatory factor analytic validation of the Tinnitus Handicap Inventory.

    PubMed

    Kleinstäuber, Maria; Frank, Ina; Weise, Cornelia

    2015-03-01

    Because the postulated three-factor structure of the internationally widely used Tinnitus Handicap Inventory (THI) has not yet been confirmed by a confirmatory factor analytic approach, this was the central aim of the current study. From a clinical setting, N=373 patients with chronic tinnitus completed the THI and further questionnaires assessing tinnitus-related and psychological variables. In order to analyze the psychometric properties of the THI, confirmatory factor analysis (CFA) and correlational analyses were conducted. CFA provided statistically significant support for a better fit of the data to the hypothesized three-factor structure (RMSEA=.049, WRMR=1.062, CFI=.965, TLI=.961) than to a general factor model (RMSEA=.062, WRMR=1.258, CFI=.942, TLI=.937). The calculation of Cronbach's alpha as an indicator of internal consistency revealed satisfactory values (.80-.91), with the exception of the catastrophic subscale (.65). High positive correlations of the THI and its subscales with other measures of tinnitus distress, anxiety, and depression, high negative correlations with tinnitus acceptance, moderate positive correlations with anxiety sensitivity, sleeping difficulties, and tinnitus loudness, and small correlations with the Big Five personality dimensions confirmed construct validity. Results show that the THI is a highly reliable and valid measure of tinnitus-related handicap. In contrast to results of previous exploratory analyses, the current findings support a three-factor rather than a unifactorial structure. Future research is needed to replicate this result in different tinnitus populations. Copyright © 2015 Elsevier Inc. All rights reserved.
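
    A minimal sketch of fitting and assessing a three-factor CFA model of this kind, assuming the third-party semopy package and hypothetical item columns thi1-thi15; the item-to-subscale assignment shown is illustrative, not the validated THI key.

    ```python
    import pandas as pd
    import semopy

    # Hypothetical data frame with one column per THI item (thi1 ... thi15 here).
    data = pd.read_csv("thi_items.csv")   # placeholder file name

    # Illustrative three-factor specification in lavaan-style syntax; the real
    # functional/emotional/catastrophic item assignment should be used instead.
    desc = """
    functional   =~ thi1 + thi2 + thi3 + thi4 + thi5
    emotional    =~ thi6 + thi7 + thi8 + thi9 + thi10
    catastrophic =~ thi11 + thi12 + thi13 + thi14 + thi15
    """

    model = semopy.Model(desc)
    model.fit(data)
    print(semopy.calc_stats(model).T)     # RMSEA, CFI, TLI and other fit indices
    ```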

  10. Trends in Analytical Scale Separations.

    ERIC Educational Resources Information Center

    Jorgenson, James W.

    1984-01-01

    Discusses recent developments in the instrumentation and practice of analytical scale operations. Emphasizes detection devices and procedures in gas chromatography, liquid chromatography, electrophoresis, supercritical fluid chromatography, and field-flow fractionation. (JN)

  11. Hasse diagram as a green analytical metrics tool: ranking of methods for benzo[a]pyrene determination in sediments.

    PubMed

    Bigus, Paulina; Tsakovski, Stefan; Simeonov, Vasil; Namieśnik, Jacek; Tobiszewski, Marek

    2016-05-01

    This study presents an application of the Hasse diagram technique (HDT) as the assessment tool to select the most appropriate analytical procedures according to their greenness or the best analytical performance. The dataset consists of analytical procedures for benzo[a]pyrene determination in sediment samples, which were described by 11 variables concerning their greenness and analytical performance. Two analyses with the HDT were performed: the first one with metrological variables and the second one with "green" variables as input data. Both HDT analyses ranked different analytical procedures as the most valuable, suggesting that green analytical chemistry is not in accordance with metrology when benzo[a]pyrene in sediment samples is determined. The HDT can be used as a good decision support tool to choose the proper analytical procedure concerning green analytical chemistry principles and analytical performance merits.
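
    A minimal sketch of the partial-order idea behind the HDT: one procedure is placed above another if it is at least as good on every criterion; networkx is assumed for the graph and its transitive reduction, and the procedure names and scores are hypothetical.

    ```python
    import itertools
    import networkx as nx

    # Hypothetical scores (higher = greener / better) for four procedures on
    # three criteria; the study itself used 11 greenness/performance variables.
    procedures = {
        "GC-MS":    (3, 2, 4),
        "LC-MS/MS": (4, 3, 3),
        "GC-FID":   (2, 2, 2),
        "HPLC-FLD": (3, 3, 4),
    }

    def dominates(a, b):
        """a dominates b if it is at least as good everywhere and not identical."""
        return all(x >= y for x, y in zip(a, b)) and a != b

    g = nx.DiGraph()
    g.add_nodes_from(procedures)
    for p, q in itertools.permutations(procedures, 2):
        if dominates(procedures[p], procedures[q]):
            g.add_edge(p, q)                     # p is ranked above q

    hasse = nx.transitive_reduction(g)           # keep only the covering relations
    print(sorted(hasse.edges()))
    ```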

  12. Systematic Review: Predisposing, Precipitating, Perpetuating, and Present Factors Predicting Anticipatory Distress to Painful Medical Procedures in Children

    PubMed Central

    Pillai Riddell, Rebecca R.; Khan, Maria; Calic, Masa; Taddio, Anna; Tablon, Paula

    2016-01-01

    Objective To conduct a systematic review of the factors predicting anticipatory distress to painful medical procedures in children. Methods A systematic search was conducted to identify studies with factors related to anticipatory distress to painful medical procedures in children aged 0–18 years. The search retrieved 7,088 articles to review against inclusion criteria. A total of 77 studies were included in the review. Results 31 factors were found to predict anticipatory distress to painful medical procedures in children. A narrative synthesis of the evidence was conducted, and a summary figure is presented. Conclusions Many factors were elucidated that contribute to the occurrence of anticipatory distress to painful medical procedures. The factors that appear to increase anticipatory distress are child psychopathology, difficult child temperament, parent distress promoting behaviors, parent situational distress, previous pain events, parent anticipation of distress, and parent anxious predisposition. Longitudinal and experimental research is needed to further elucidate these factors. PMID:26338981

  13. Flow cytometry for feline lymphoma: a retrospective study regarding pre-analytical factors possibly affecting the quality of samples.

    PubMed

    Martini, Valeria; Bernardi, Serena; Marelli, Priscilla; Cozzi, Marzia; Comazzi, Stefano

    2018-06-01

    Objectives Flow cytometry (FC) is becoming increasingly popular among veterinary oncologists for the diagnosis of lymphoma or leukaemia. It is accurate, fast and minimally invasive. Several studies of FC have been carried out in canine oncology and applied with great results, whereas there is limited knowledge and use of this technique in feline patients. This is mainly owing to the high prevalence of intra-abdominal lymphomas in this species and the difficulty associated with the diagnostic procedures needed to collect the sample. The purpose of the present study is to investigate whether any pre-analytical factor might affect the quality of suspected feline lymphoma samples for FC analysis. Methods Ninety-seven consecutive samples of suspected feline lymphoma were retrospectively selected from the authors' institution's FC database. The referring veterinarians were contacted and interviewed about several different variables, including signalment, appearance of the lesion, features of the sampling procedure and the experience of veterinarians performing the sampling. Statistical analyses were performed to assess the possible influence of these variables on the cellularity of the samples and the likelihood of it being finally processed for FC. Results Sample cellularity is a major factor in the likelihood of the sample being processed. Moreover, sample cellularity was significantly influenced by the needle size, with 21 G needles providing the highest cellularity. Notably, the sample cellularity and the likelihood of being processed did not vary between peripheral and intra-abdominal lesions. Approximately half of the cats required pharmacological restraint. Side effects were reported in one case only (transient swelling after peripheral lymph node sampling). Conclusions and relevance FC can be safely applied to cases of suspected feline lymphomas, including intra-abdominal lesions. A 21 G needle should be preferred for sampling. This study provides the basis for

  14. Clean Water Act Analytical Methods

    EPA Pesticide Factsheets

    EPA publishes laboratory analytical methods (test procedures) that are used by industries and municipalities to analyze the chemical, physical and biological components of wastewater and other environmental samples required by the Clean Water Act.

  15. Laboratory Workhorse: The Analytical Balance.

    ERIC Educational Resources Information Center

    Clark, Douglas W.

    1979-01-01

    This report explains the importance of various analytical balances in the water or wastewater laboratory. Stressed is the proper procedure for utilizing the equipment as well as the mechanics involved in its operation. (CS)

  16. Stability of Q-Factors across Two Data Collection Methods.

    ERIC Educational Resources Information Center

    Daniel, Larry G.

    The purpose of the present study was to determine how two different data collection techniques would affect the Q-factors derived from several factor analytic procedures. Faculty members (N=146) from seven middle schools responded to 61 items taken from an instrument designed to measure aspects of an idealized middle school culture; the instrument…

  17. Taxometric and Factor Analytic Models of Anxiety Sensitivity: Integrating Approaches to Latent Structural Research

    ERIC Educational Resources Information Center

    Bernstein, Amit; Zvolensky, Michael J.; Norton, Peter J.; Schmidt, Norman B.; Taylor, Steven; Forsyth, John P.; Lewis, Sarah F.; Feldner, Matthew T.; Leen-Feldner, Ellen W.; Stewart, Sherry H.; Cox, Brian

    2007-01-01

    This study represents an effort to better understand the latent structure of anxiety sensitivity (AS), as indexed by the 16-item Anxiety Sensitivity Index (ASI; S. Reiss, R. A. Peterson, M. Gursky, & R. J. McNally, 1986), by using taxometric and factor-analytic approaches in an integrative manner. Taxometric analyses indicated that AS has a…

  18. Life cycle management of analytical methods.

    PubMed

    Parr, Maria Kristina; Schmidt, Alexander H

    2018-01-05

    In modern process management, the life cycle concept gains more and more importance. It focuses on the total costs of the process, from investment to operation and finally retirement. In recent years, an increasing interest in this concept has also emerged for analytical procedures. The life cycle of an analytical method consists of design, development, validation (including instrumental qualification, continuous method performance verification and method transfer) and finally retirement of the method. Regulatory bodies also appear to have increased their awareness of life cycle management for analytical methods. Thus, the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH), as well as the United States Pharmacopeial Forum, are discussing new guidelines that include life cycle management of analytical methods. The US Pharmacopeia (USP) Validation and Verification expert panel has already proposed a new General Chapter 〈1220〉 "The Analytical Procedure Lifecycle" for integration into the USP. Furthermore, a growing interest in life cycle management is also seen in the non-regulated environment. Quality-by-design-based method development results in increased method robustness, which in turn reduces the effort needed for method performance verification and post-approval changes, and minimizes the risk of method-related out-of-specification results. This strongly contributes to reduced costs of the method during its life cycle. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Identifying environmental variables explaining genotype-by-environment interaction for body weight of rainbow trout (Oncorhynchus mykiss): reaction norm and factor analytic models.

    PubMed

    Sae-Lim, Panya; Komen, Hans; Kause, Antti; Mulder, Han A

    2014-02-26

    Identifying the relevant environmental variables that cause GxE interaction is often difficult when they cannot be experimentally manipulated. Two statistical approaches can be applied to address this question. When data on candidate environmental variables are available, GxE interaction can be quantified as a function of specific environmental variables using a reaction norm model. Alternatively, a factor analytic model can be used to identify the latent common factor that explains GxE interaction. This factor can be correlated with known environmental variables to identify those that are relevant. Previously, we reported a significant GxE interaction for body weight at harvest in rainbow trout reared on three continents. Here we explore their possible causes. Reaction norm and factor analytic models were used to identify which environmental variables (age at harvest, water temperature, oxygen, and photoperiod) may have caused the observed GxE interaction. Data on body weight at harvest was recorded on 8976 offspring reared in various locations: (1) a breeding environment in the USA (nucleus), (2) a recirculating aquaculture system in the Freshwater Institute in West Virginia, USA, (3) a high-altitude farm in Peru, and (4) a low-water temperature farm in Germany. Akaike and Bayesian information criteria were used to compare models. The combination of days to harvest multiplied with daily temperature (Day*Degree) and photoperiod were identified by the reaction norm model as the environmental variables responsible for the GxE interaction. The latent common factor that was identified by the factor analytic model showed the highest correlation with Day*Degree. Day*Degree and photoperiod were the environmental variables that differed most between Peru and other environments. Akaike and Bayesian information criteria indicated that the factor analytical model was more parsimonious than the reaction norm model. Day*Degree and photoperiod were identified as environmental
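
    A minimal sketch of a reaction-norm-style analysis in the spirit described above: body weight regressed on an environmental covariate with family-specific random slopes, using statsmodels; the file and column names (weight, day_degree, family) are hypothetical placeholders.

    ```python
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical data frame: one row per fish, with harvest weight, the
    # Day*Degree covariate of its rearing environment, and its full-sib family.
    df = pd.read_csv("trout_weights.csv")   # placeholder file name

    # Random intercept and random slope on Day*Degree per family: families whose
    # slopes differ are the ones driving the genotype-by-environment interaction.
    model = smf.mixedlm("weight ~ day_degree", df,
                        groups="family", re_formula="~day_degree")
    result = model.fit()
    print(result.summary())
    ```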

  20. Identifying environmental variables explaining genotype-by-environment interaction for body weight of rainbow trout (Oncorhynchus mykiss): reaction norm and factor analytic models

    PubMed Central

    2014-01-01

    Background Identifying the relevant environmental variables that cause GxE interaction is often difficult when they cannot be experimentally manipulated. Two statistical approaches can be applied to address this question. When data on candidate environmental variables are available, GxE interaction can be quantified as a function of specific environmental variables using a reaction norm model. Alternatively, a factor analytic model can be used to identify the latent common factor that explains GxE interaction. This factor can be correlated with known environmental variables to identify those that are relevant. Previously, we reported a significant GxE interaction for body weight at harvest in rainbow trout reared on three continents. Here we explore their possible causes. Methods Reaction norm and factor analytic models were used to identify which environmental variables (age at harvest, water temperature, oxygen, and photoperiod) may have caused the observed GxE interaction. Data on body weight at harvest was recorded on 8976 offspring reared in various locations: (1) a breeding environment in the USA (nucleus), (2) a recirculating aquaculture system in the Freshwater Institute in West Virginia, USA, (3) a high-altitude farm in Peru, and (4) a low-water temperature farm in Germany. Akaike and Bayesian information criteria were used to compare models. Results The combination of days to harvest multiplied with daily temperature (Day*Degree) and photoperiod were identified by the reaction norm model as the environmental variables responsible for the GxE interaction. The latent common factor that was identified by the factor analytic model showed the highest correlation with Day*Degree. Day*Degree and photoperiod were the environmental variables that differed most between Peru and other environments. Akaike and Bayesian information criteria indicated that the factor analytical model was more parsimonious than the reaction norm model. Conclusions Day*Degree and

  1. Use of fractional factorial design for optimization of digestion procedures followed by multi-element determination of essential and non-essential elements in nuts using ICP-OES technique.

    PubMed

    Momen, Awad A; Zachariadis, George A; Anthemidis, Aristidis N; Stratis, John A

    2007-01-15

    Two digestion procedures have been tested on nut samples for application in the determination of essential (Cr, Cu, Fe, Mg, Mn, Zn) and non-essential (Al, Ba, Cd, Pb) elements by inductively coupled plasma-optical emission spectrometry (ICP-OES). These included wet digestions with HNO₃/H₂SO₄ and HNO₃/H₂SO₄/H₂O₂. The latter is recommended for better analyte recoveries (relative error <11%). Two calibration procedures (aqueous standard and standard addition) were studied, and standard addition proved preferable for all analytes. Experimental designs for seven factors (HNO₃, H₂SO₄ and H₂O₂ volumes, digestion time, pre-digestion time, temperature of the hot plate and sample weight) were used for optimization of the sample digestion procedures. For this purpose a Plackett-Burman fractional factorial design, which involves eight experiments, was adopted. The factors HNO₃ and H₂O₂ volume, and the digestion time, were found to be the most important parameters. The instrumental conditions were also optimized (using a peanut matrix rather than aqueous standard solutions) considering radio-frequency (rf) incident power, nebulizer argon gas flow rate and sample uptake flow rate. The analytical performance, such as limits of detection (LOD < 0.74 µg g⁻¹), precision of the overall procedures (relative standard deviation between 2.0 and 8.2%) and accuracy (relative errors between 0.4 and 11%), was assessed statistically to evaluate the developed analytical procedures. The good agreement between measured and certified values for all analytes (relative error <11%) with respect to IAEA-331 (spinach leaves) and IAEA-359 (cabbage) indicates that the developed analytical method is well suited for further studies on the fate of major elements in nuts and possibly similar matrices.
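
    A minimal sketch of constructing the eight-run Plackett-Burman screening design for seven two-level factors from a Hadamard matrix; the factor names follow the abstract, and the coded -1/+1 levels would be mapped to the actual low/high settings.

    ```python
    import numpy as np
    from scipy.linalg import hadamard

    factors = ["HNO3 volume", "H2SO4 volume", "H2O2 volume", "digestion time",
               "pre-digestion time", "hot-plate temperature", "sample weight"]

    # An 8-run Plackett-Burman design for 7 two-level factors: take a Hadamard
    # matrix of order 8 and drop the all-ones column; rows are the 8 experiments,
    # entries -1/+1 are the coded low/high factor settings.
    H = hadamard(8)
    design = H[:, 1:]
    for run, row in enumerate(design, start=1):
        print(run, dict(zip(factors, row)))
    ```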

  2. Factors Influencing Employee Learning in Small Businesses

    ERIC Educational Resources Information Center

    Coetzer, Alan; Perry, Martin

    2008-01-01

    Purpose: The purpose of this research is to identify key factors influencing employee learning from the perspective of owners/managers. Design/methodology/research: Data were gathered from owners/managers in a total of 27 small manufacturing and services firms through interviews and analysed using content analytic procedures. Findings: The…

  3. An investigation of several factors involved in a finite difference procedure for analyzing the transonic flow about harmonically oscillating airfoils and wings

    NASA Technical Reports Server (NTRS)

    Ehlers, F. E.; Sebastian, J. D.; Weatherill, W. H.

    1979-01-01

    Analytical and empirical studies of a finite difference method for the solution of the transonic flow about harmonically oscillating wings and airfoils are presented. The procedure is based on separating the velocity potential into steady and unsteady parts and linearizing the resulting unsteady equations for small disturbances. Since sinusoidal motion is assumed, the unsteady equation is independent of time. Three finite difference investigations are discussed including a new operator for mesh points with supersonic flow, the effects on relaxation solution convergence of adding a viscosity term to the original differential equation, and an alternate and relatively simple downstream boundary condition. A method is developed which uses a finite difference procedure over a limited inner region and an approximate analytical procedure for the remaining outer region. Two investigations concerned with three-dimensional flow are presented. The first is the development of an oblique coordinate system for swept and tapered wings. The second derives the additional terms required to make row relaxation solutions converge when mixed flow is present. A finite span flutter analysis procedure is described using the two-dimensional unsteady transonic program with a full three-dimensional steady velocity potential.
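
    As a stand-in for the relaxation machinery discussed above (the transonic small-disturbance operator itself is considerably more involved), the sketch below shows point successive over-relaxation for the Laplace equation on a small grid.

    ```python
    import numpy as np

    # Successive over-relaxation for Laplace's equation on a square grid:
    # a toy stand-in for the relaxation sweeps used on the unsteady potential.
    n, omega, tol = 50, 1.8, 1e-6
    phi = np.zeros((n, n))
    phi[0, :] = 1.0                      # illustrative boundary condition

    for sweep in range(10000):
        max_change = 0.0
        for i in range(1, n - 1):
            for j in range(1, n - 1):
                new = 0.25 * (phi[i+1, j] + phi[i-1, j] + phi[i, j+1] + phi[i, j-1])
                change = omega * (new - phi[i, j])
                phi[i, j] += change
                max_change = max(max_change, abs(change))
        if max_change < tol:
            break
    print("converged after", sweep + 1, "sweeps")
    ```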

  4. Systematic Review: Predisposing, Precipitating, Perpetuating, and Present Factors Predicting Anticipatory Distress to Painful Medical Procedures in Children.

    PubMed

    Racine, Nicole M; Riddell, Rebecca R Pillai; Khan, Maria; Calic, Masa; Taddio, Anna; Tablon, Paula

    2016-03-01

    To conduct a systematic review of the factors predicting anticipatory distress to painful medical procedures in children. A systematic search was conducted to identify studies with factors related to anticipatory distress to painful medical procedures in children aged 0-18 years. The search retrieved 7,088 articles to review against inclusion criteria. A total of 77 studies were included in the review. 31 factors were found to predict anticipatory distress to painful medical procedures in children. A narrative synthesis of the evidence was conducted, and a summary figure is presented. Many factors were elucidated that contribute to the occurrence of anticipatory distress to painful medical procedures. The factors that appear to increase anticipatory distress are child psychopathology, difficult child temperament, parent distress promoting behaviors, parent situational distress, previous pain events, parent anticipation of distress, and parent anxious predisposition. Longitudinal and experimental research is needed to further elucidate these factors. © The Author 2015. Published by Oxford University Press on behalf of the Society of Pediatric Psychology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  5. Analytic tests and their relation to jet fuel thermal stability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heneghan, S.P.; Kauffman, R.E.

    1995-05-01

    The evaluation of jet fuel thermal stability (TS) by simple analytic procedures has long been a goal of fuels chemists. The reason is obvious: if the analytic chemist can determine which types of material cause his test to respond, the refiners will know which materials to remove to improve stability. Complicating this quest is the lack of an acceptable quantitative TS test with which to compare any analytic procedures. To circumvent this problem, we recently compiled the results of TS tests for 12 fuels using six separate test procedures. The results, covering a range of flow and temperature conditions, show that TS is not as dependent on test conditions as previously thought. Also, comparing the results from these tests with several analytic procedures shows that either a measure of the number of phenols or the total sulfur present in jet fuels is strongly indicative of the TS. The phenols have been measured using a cyclic voltammetry technique and the polar material by gas chromatography (atomic emission detection) following a solid phase extraction on silica gel. The polar material has been identified as mainly phenols (by mass spectrometry identification). Measures of the total acid number or peroxide concentration have little correlation with TS.

  6. Sensitivity of the diagnostic radiological index of protection to procedural factors in fluoroscopy.

    PubMed

    Jones, A Kyle; Pasciak, Alexander S; Wagner, Louis K

    2016-07-01

    To evaluate the sensitivity of the diagnostic radiological index of protection (DRIP), used to quantify the protective value of radioprotective garments, to procedural factors in fluoroscopy in an effort to determine an appropriate set of scatter-mimicking primary beams to be used in measuring the DRIP. Monte Carlo simulations were performed to determine the shape of the scattered x-ray spectra incident on the operator in different clinical fluoroscopy scenarios, including interventional radiology and interventional cardiology (IC). Two clinical simulations studied the sensitivity of the scattered spectrum to gantry angle and patient size, while technical factors were varied according to measured automatic dose rate control (ADRC) data. Factorial simulations studied the sensitivity of the scattered spectrum to gantry angle, field of view, patient size, and beam quality for constant technical factors. Average energy (Eavg) was the figure of merit used to condense fluence in each energy bin to a single numerical index. Beam quality had the strongest influence on the scattered spectrum in fluoroscopy. Many procedural factors affect the scattered spectrum indirectly through their effect on primary beam quality through ADRC, e.g., gantry angle and patient size. Lateral C-arm rotation, common in IC, increased the energy of the scattered spectrum, regardless of the direction of rotation. The effect of patient size on scattered radiation depended on ADRC characteristics, patient size, and procedure type. The scattered spectrum striking the operator in fluoroscopy is most strongly influenced by primary beam quality, particularly kV. Use cases for protective garments should be classified by typical procedural primary beam qualities, which are governed by the ADRC according to the impacts of patient size, anatomical location, and gantry angle.
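
    A minimal sketch of the average-energy figure of merit used to condense a fluence spectrum into a single index; the spectrum values are hypothetical.

    ```python
    import numpy as np

    # Hypothetical scattered-fluence spectrum: photon fluence per energy bin.
    energy_kev = np.arange(10, 125, 5)                 # bin centers in keV
    fluence = np.exp(-(energy_kev - 60.0) ** 2 / 800)  # illustrative spectral shape

    # Fluence-weighted mean energy of the scattered spectrum.
    e_avg = np.sum(energy_kev * fluence) / np.sum(fluence)
    print(f"E_avg = {e_avg:.1f} keV")
    ```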

  7. Computing sensitivity and selectivity in parallel factor analysis and related multiway techniques: the need for further developments in net analyte signal theory.

    PubMed

    Olivieri, Alejandro C

    2005-08-01

    Sensitivity and selectivity are important figures of merit in multiway analysis, regularly employed for comparison of the analytical performance of methods and for experimental design and planning. They are especially interesting in the second-order advantage scenario, where the latter property allows for the analysis of samples with a complex background, permitting analyte determination even in the presence of unsuspected interferences. Since no general theory exists for estimating the multiway sensitivity, Monte Carlo numerical calculations have been developed for estimating variance inflation factors, as a convenient way of assessing both sensitivity and selectivity parameters for the popular parallel factor (PARAFAC) analysis and also for related multiway techniques. When the second-order advantage is achieved, the existing expressions derived from net analyte signal theory are only able to adequately cover cases where a single analyte is calibrated using second-order instrumental data. However, they fail for certain multianalyte cases, or when third-order data are employed, calling for an extension of net analyte theory. The results have strong implications in the planning of multiway analytical experiments.
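
    A minimal sketch of the Monte Carlo idea described above, assuming the third-party tensorly package: noise is repeatedly added to a trilinear array, a PARAFAC model is refit, and the spread of the recovered sample-mode scores serves as a rough proxy for variance inflation (a careful analysis would also align component permutation and scale).

    ```python
    import numpy as np
    import tensorly as tl
    from tensorly.decomposition import parafac

    rng = np.random.default_rng(1)

    # Hypothetical trilinear data (samples x emission x excitation) built from
    # two components, standing in for second-order instrumental data.
    A, B, C = rng.random((10, 2)), rng.random((30, 2)), rng.random((20, 2))
    X = np.einsum('ir,jr,kr->ijk', A, B, C)

    scores = []
    for _ in range(50):                                   # Monte Carlo replicates
        noisy = X + 0.01 * rng.standard_normal(X.shape)
        cp = parafac(tl.tensor(noisy), rank=2, n_iter_max=200)
        s = tl.to_numpy(cp.factors[0])                    # sample-mode loadings
        scores.append(s / np.linalg.norm(s, axis=0))      # crude scale alignment
    print(np.std(np.stack(scores), axis=0).mean())        # spread of the scores
    ```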

  8. Defining dignity in terminally ill cancer patients: a factor-analytic approach.

    PubMed

    Hack, Thomas F; Chochinov, Harvey Max; Hassard, Thomas; Kristjanson, Linda J; McClement, Susan; Harlos, Mike

    2004-10-01

    The construct of 'dignity' is frequently raised in discussions about quality end of life care for terminal cancer patients, and is invoked by parties on both sides of the euthanasia debate. Lacking in this general debate has been an empirical explication of 'dignity' from the viewpoint of cancer patients themselves. The purpose of the present study was to use factor-analytic and regression methods to analyze dignity data gathered from 213 cancer patients having less than 6 months to live. Patients rated their sense of dignity, and completed measures of symptom distress and psychological well-being. The results showed that although the majority of patients had an intact sense of dignity, there were 99 (46%) patients who reported at least some, or occasional loss of dignity, and 16 (7.5%) patients who indicated that loss of dignity was a significant problem. The exploratory factor analysis yielded six primary factors: (1) Pain; (2) Intimate Dependency; (3) Hopelessness/Depression; (4) Informal Support Network; (5) Formal Support Network; and (6) Quality of Life. Subsequent regression analyses of modifiable factors produced a final two-factor (Hopelessness/Depression and Intimate Dependency) model of statistical significance. These results provide empirical support for the dignity model, and suggest that the provision of end of life care should include methods for treating depression, fostering hope, and facilitating functional independence. Copyright 2004 John Wiley & Sons, Ltd.

  9. Analytical study of comet nucleus samples

    NASA Technical Reports Server (NTRS)

    Albee, A. L.

    1989-01-01

    Analytical procedures for studying and handling frozen (130 K) core samples of comet nuclei are discussed. These methods include neutron activation analysis, X-ray fluorescence analysis, and high-resolution mass spectroscopy.

  10. Analytical Tools to Improve Optimization Procedures for Lateral Flow Assays

    PubMed Central

    Hsieh, Helen V.; Dantzler, Jeffrey L.; Weigl, Bernhard H.

    2017-01-01

    Immunochromatographic or lateral flow assays (LFAs) are inexpensive, easy to use, point-of-care medical diagnostic tests that are found in arenas ranging from a doctor’s office in Manhattan to a rural medical clinic in low resource settings. The simplicity in the LFA itself belies the complex task of optimization required to make the test sensitive, rapid and easy to use. Currently, the manufacturers develop LFAs by empirical optimization of material components (e.g., analytical membranes, conjugate pads and sample pads), biological reagents (e.g., antibodies, blocking reagents and buffers) and the design of delivery geometry. In this paper, we will review conventional optimization and then focus on the latter and outline analytical tools, such as dynamic light scattering and optical biosensors, as well as methods, such as microfluidic flow design and mechanistic models. We are applying these tools to find non-obvious optima of lateral flow assays for improved sensitivity, specificity and manufacturing robustness. PMID:28555034

  11. Taxometric and Factor Analytic Models of Anxiety Sensitivity among Youth: Exploring the Latent Structure of Anxiety Psychopathology Vulnerability

    ERIC Educational Resources Information Center

    Bernstein, Amit; Zvolensky, Michael J.; Stewart, Sherry; Comeau, Nancy

    2007-01-01

    This study represents an effort to better understand the latent structure of anxiety sensitivity (AS), a well-established affect-sensitivity individual difference factor, among youth by employing taxometric and factor analytic approaches in an integrative manner. Taxometric analyses indicated that AS, as indexed by the Child Anxiety Sensitivity…

  12. A new method for constructing analytic elements for groundwater flow.

    NASA Astrophysics Data System (ADS)

    Strack, O. D.

    2007-12-01

    The analytic element method is based upon the superposition of analytic functions that are defined throughout the infinite domain, and can be used to meet a variety of boundary conditions. Analytic elements have been used successfully for a number of problems, mainly dealing with the Poisson equation (see, e.g., Theory and Applications of the Analytic Element Method, Reviews of Geophysics, 41(2), 1005, 2003, by O.D.L. Strack). The majority of these analytic elements consist of functions that exhibit jumps along lines or curves. Such linear analytic elements have also been developed for other partial differential equations, e.g., the modified Helmholtz equation and the heat equation, and were constructed by integrating elementary solutions, the point sink and the point doublet, along a line. This approach is limiting for two reasons. First, it requires the existence of the elementary solutions, and, second, the integration tends to limit the range of solutions that can be obtained. We present a procedure for generating analytic elements that requires merely the existence of a harmonic function with the desired properties; such functions exist in abundance. The procedure to be presented is used to generalize this harmonic function in such a way that the resulting expression satisfies the applicable differential equation. The approach will be applied, along with numerical examples, to the modified Helmholtz equation and to the heat equation, while it is noted that the method is in no way restricted to these equations. The procedure is carried out entirely in terms of complex variables, using Wirtinger calculus.
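
    As a minimal illustration of the superposition principle the analytic element method rests on (not the new procedure proposed in the abstract), the sketch below superposes the classical complex potentials of uniform flow and point sinks (wells) and evaluates the resulting discharge potential and stream function; the aquifer data are hypothetical.

    ```python
    import numpy as np

    def well_potential(z, zw, Q):
        """Complex potential of a point sink (well) of discharge Q located at zw."""
        return Q / (2 * np.pi) * np.log(z - zw)

    def uniform_flow_potential(z, q0, angle=0.0):
        """Complex potential of uniform flow with specific discharge q0 at the given angle."""
        return -q0 * z * np.exp(-1j * angle)

    # Superposition of analytic elements: uniform flow plus two wells (hypothetical data)
    z = np.array([50 + 40j, 100 + 10j])            # evaluation points in the complex plane [m]
    omega = (uniform_flow_potential(z, q0=0.5)
             + well_potential(z, zw=30 + 30j, Q=100.0)
             + well_potential(z, zw=80 + 60j, Q=-60.0))
    phi = omega.real                               # discharge potential
    psi = omega.imag                               # stream function
    print(np.round(phi, 2), np.round(psi, 2))
    ```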

  13. Factors Affecting the Location of Road Emergency Bases in Iran Using Analytical Hierarchy Process (AHP).

    PubMed

    Bahadori, Mohammadkarim; Hajebrahimi, Ahmad; Alimohammadzadeh, Khalil; Ravangard, Ramin; Hosseini, Seyed Mojtaba

    2017-10-01

    To identify and prioritize factors affecting the location of road emergency bases in Iran using the Analytical Hierarchy Process (AHP). This was a mixed-method (quantitative-qualitative) study conducted in 2016. The participants were professionals and experts in pre-hospital and road emergency services working in the Health Deputy of the Iran Ministry of Health and Medical Education, selected using a purposive sampling method. First, the factors affecting the location of road emergency bases in Iran were identified through a literature review and interviews with the experts. The identified factors were then scored and prioritized from the professionals' and experts' viewpoints using the AHP technique and its pair-wise comparison questionnaire. The collected data were analyzed using MAXQDA 10.0 software for the answers to the open question and Expert Choice 10.0 software to determine the weights and priorities of the identified factors. From the viewpoints of the studied professionals and experts, eight factors were effective in locating the road emergency bases in Iran: distance from the next base, region population, topography and geographical situation of the region, the volume of road traffic, the existence of amenities such as water, electricity and gas and proximity to a village, accident-prone sites, university ownership of the base site, and proximity to a toll-house. Among these eight factors, "distance from the next base" and "region population" were the most important, with weights considerably greater than those of the other factors.
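
    A minimal sketch of the AHP weighting step described above follows: priority weights are derived from a reciprocal pairwise comparison matrix via its principal eigenvector, and a consistency ratio is computed against Saaty's random index. The 3 x 3 matrix is illustrative, not the study's eight-factor data.

    ```python
    import numpy as np

    def ahp_weights(pairwise):
        """Priority weights from a reciprocal pairwise comparison matrix
        (principal right eigenvector, normalised to sum to 1), plus the
        consistency ratio CR = CI / RI."""
        A = np.asarray(pairwise, dtype=float)
        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()
        n = A.shape[0]
        ci = (eigvals[k].real - n) / (n - 1)                   # consistency index
        ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24,
              7: 1.32, 8: 1.41}.get(n, 1.45)                   # Saaty's random index
        return w, ci / ri

    # Illustrative 3-criterion matrix on Saaty's 1-9 scale (not the study's data)
    A = [[1,   3,   5],
         [1/3, 1,   2],
         [1/5, 1/2, 1]]
    weights, cr = ahp_weights(A)
    print(np.round(weights, 3), f"CR = {cr:.3f}")   # CR below ~0.1 is usually deemed consistent
    ```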

  14. Determination of Slope Safety Factor with Analytical Solution and Searching Critical Slip Surface with Genetic-Traversal Random Method

    PubMed Central

    2014-01-01

    In current practice, to determine the safety factor of a slope with a two-dimensional circular potential failure surface, a common search method for the critical slip surface is the Genetic Algorithm (GA), while the slope safety factor is usually calculated with Fellenius' method of slices. However, GA needs to be validated with more numerical tests, and Fellenius' method of slices is only an approximate method, much like the finite element method. This paper proposes a new approach to determining the minimum slope safety factor: the safety factor is computed from an analytical solution, and the critical slip surface is searched with a Genetic-Traversal Random Method. The analytical solution is more accurate than Fellenius' method of slices. The Genetic-Traversal Random Method uses random picks to implement mutation, and a computer program was developed to automate the search. Comparison with other methods, such as the Slope/W software, indicates that the Genetic-Traversal Random Search Method can give a very low safety factor, about half of that given by the other methods. However, the minimum safety factor obtained with the Genetic-Traversal Random Search Method is very close to the lower-bound solutions of the slope safety factor given by the Ansys software. PMID:24782679
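
    For reference, the sketch below implements the Fellenius (ordinary) method of slices that the paper uses as its baseline; the slice geometry and soil parameters are hypothetical, and the paper's analytical solution and Genetic-Traversal Random search are not reproduced.

    ```python
    import numpy as np

    def fellenius_fs(weights, alphas_deg, base_lengths, c, phi_deg):
        """Safety factor by Fellenius' (ordinary) method of slices:
        FS = sum(c*l + W*cos(a)*tan(phi)) / sum(W*sin(a)),
        with no pore pressure and total-stress strength parameters.
        W in kN per unit width, base lengths in m, c in kPa, angles in degrees."""
        W = np.asarray(weights, float)
        a = np.radians(alphas_deg)
        l = np.asarray(base_lengths, float)
        phi = np.radians(phi_deg)
        resisting = np.sum(c * l + W * np.cos(a) * np.tan(phi))
        driving = np.sum(W * np.sin(a))
        return resisting / driving

    # Hypothetical 5-slice circular slip surface
    W = [120, 180, 200, 160, 90]        # slice weights per unit width [kN/m]
    alpha = [-10, 5, 20, 35, 50]        # base inclinations [deg]
    L = [2.1, 2.0, 2.0, 2.2, 2.6]       # base lengths [m]
    print(f"FS = {fellenius_fs(W, alpha, L, c=15.0, phi_deg=25.0):.2f}")
    ```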

  15. Current projects in Pre-analytics: where to go?

    PubMed

    Sapino, Anna; Annaratone, Laura; Marchiò, Caterina

    2015-01-01

    The current clinical practice of tissue handling and sample preparation is multifaceted and lacks strict standardisation: this scenario leads to significant variability in the quality of clinical samples. Poor tissue preservation has a detrimental effect, leading to morphological artefacts, hampering the reproducibility of immunocytochemical and molecular diagnostic results (protein expression, DNA gene mutations, RNA gene expression) and affecting research outcomes with irreproducible gene expression and post-transcriptional data. Altogether, this limits the opportunity to share and pool national databases into European common databases. At the European level, standardization of pre-analytical steps is just at the beginning, and issues regarding bio-specimen collection and management are still debated. A joint (public-private) project on the standardization of tissue handling in pre-analytical procedures has recently been funded in Italy with the aim of proposing novel approaches to the neglected issue of pre-analytical procedures. In this chapter, we will show how investing in pre-analytics may impact both public health problems and practical innovation in solid tumour processing.

  16. Analytical Applications of Monte Carlo Techniques.

    ERIC Educational Resources Information Center

    Guell, Oscar A.; Holcombe, James A.

    1990-01-01

    Described are analytical applications of the theory of random processes, in particular solutions obtained by using statistical procedures known as Monte Carlo techniques. Supercomputer simulations, sampling, integration, ensemble, annealing, and explicit simulation are discussed. (CW)

  17. Management of thyroid cytological material, pre-analytical procedures and bio-banking.

    PubMed

    Bode-Lesniewska, Beata; Cochand-Priollet, Beatrix; Straccia, Patrizia; Fadda, Guido; Bongiovanni, Massimo

    2018-06-09

    Thyroid nodules are common and increasingly detected due to recent advances in imaging techniques. However, clinically relevant thyroid cancer is rare and the mortality from aggressive thyroid cancer remains constant. FNAC (Fine Needle Aspiration Cytology) is a standard method for diagnosing thyroid malignancy and the discrimination of malignant nodules from goiter. As the examined nodules on thyroid FNAC are often small incidental findings, it is important to maintain a low rate of undetermined diagnoses requiring further clinical work up or surgery. The most important factors determining the accuracy of the cytological diagnosis and suitability for biobanking of thyroid FNACs are the quality of the sample and availability of adequate tissue for auxiliary studies. This article analyses technical aspects (pre-analytics) of performing thyroid FNACs, including image guidance and rapid on slide evaluation (ROSE), sample collection methods (conventional slides, liquid based methods (LBC), cell blocks) and storage (bio-banking). The spectrum of the special studies (immunocytochemistry on direct slides or LBC, immunohistochemistry on cell blocks and molecular methods) required for improving the precision of the cytological diagnosis of the thyroid nodules is discussed. This article is protected by copyright. All rights reserved.

  18. Analytic modeling of aerosol size distributions

    NASA Technical Reports Server (NTRS)

    Deepack, A.; Box, G. P.

    1979-01-01

    Mathematical functions commonly used for representing aerosol size distributions are studied parametrically. Methods for obtaining best fit estimates of the parameters are described. A catalog of graphical plots depicting the parametric behavior of the functions is presented along with procedures for obtaining analytical representations of size distribution data by visual matching of the data with one of the plots. Examples of fitting the same data with equal accuracy by more than one analytic model are also given.

  19. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... person or laboratory using a test procedure (analytical method) in this Part. (2) Chemistry of the method... (analytical method) provided that the chemistry of the method or the determinative technique is not changed... prevent efficient recovery of organic pollutants and prevent the method from meeting QC requirements, the...

  20. Selecting Statistical Quality Control Procedures for Limiting the Impact of Increases in Analytical Random Error on Patient Safety.

    PubMed

    Yago, Martín

    2017-05-01

    QC planning based on risk management concepts can reduce the probability of harming patients due to an undetected out-of-control error condition. It does this by selecting appropriate QC procedures to decrease the number of erroneous results reported. The selection can be easily made by using published nomograms for simple QC rules when the out-of-control condition results in increased systematic error. However, increases in random error also occur frequently and are difficult to detect, which can result in erroneously reported patient results. A statistical model was used to construct charts for the 1ks and X̄/χ² rules. The charts relate the increase in the number of unacceptable patient results reported due to an increase in random error with the capability of the measurement procedure. They thus allow for QC planning based on the risk of patient harm due to the reporting of erroneous results. 1ks rules are simple, all-around rules. Their ability to deal with increases in within-run imprecision is minimally affected by the possible presence of significant, stable, between-run imprecision. X̄/χ² rules perform better when the number of controls analyzed during each QC event is increased to improve QC performance. Using nomograms simplifies the selection of statistical QC procedures to limit the number of erroneous patient results reported due to an increase in analytical random error. The selection largely depends on the presence or absence of stable between-run imprecision. © 2017 American Association for Clinical Chemistry.
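
    A minimal sketch of a 1ks control rule follows, assuming the rule rejects an analytical run when a single control result deviates from the target mean by more than k standard deviations (the familiar Westgard 1-2s and 1-3s rules being the k = 2 and k = 3 cases); the values are illustrative and this is not the authors' risk model.

    ```python
    import numpy as np

    def rule_1ks(control_results, mean, sd, k=3.0):
        """Flag a QC event if any single control result lies outside mean +/- k*sd."""
        z = (np.asarray(control_results, float) - mean) / sd
        return bool(np.any(np.abs(z) > k))

    # Illustrative QC event with two control measurements (target mean 100, SD 2)
    print(rule_1ks([101.5, 104.8], mean=100.0, sd=2.0, k=2.0))  # True: 104.8 exceeds 2 SD
    ```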

  1. Median of patient results as a tool for assessment of analytical stability.

    PubMed

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György

    2015-06-15

    In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor long-term analytical stability as a supplement to the internal control procedures is often needed. Patient data from daily internal control schemes were used for monthly appraisal of the analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes analysed on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and the desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable sources of material as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple and reliable tool to monitor the steadiness of the analytical practice. Copyright © 2015 Elsevier B.V. All rights reserved.
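
    A minimal sketch of the patient-median check described above follows: monthly medians of patient results are compared, as percentage deviations from a reference median, against an allowable-bias specification derived from biological variation. The data and the limit are illustrative, not the study's.

    ```python
    import numpy as np
    import pandas as pd

    # Illustrative patient results for one analyte (not real laboratory data)
    rng = np.random.default_rng(1)
    dates = pd.date_range("2024-01-01", periods=180, freq="D")
    df = pd.DataFrame({"date": dates,
                       "result": rng.normal(loc=4.5, scale=0.6, size=len(dates))})

    monthly = df.set_index("date")["result"].resample("M").median()
    reference = monthly.iloc[0]                       # baseline month used as reference
    pct_dev = 100 * (monthly - reference) / reference

    allowable_bias_pct = 2.0                          # e.g. desirable bias from biological variation
    flags = pct_dev.abs() > allowable_bias_pct
    print(pd.DataFrame({"median": monthly.round(2),
                        "pct_dev": pct_dev.round(2),
                        "exceeds_limit": flags}))
    ```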

  2. Effective Work Procedure Design Using Discomfort and Effort Factor in Brick Stacking Operation - A Case Study

    NASA Astrophysics Data System (ADS)

    Rout, Biswaranjan; Dash, R. R.; Dhupal, D.

    2018-02-01

    In this work, the movement of the limbs and torso of a worker is planned so as to reduce the worker's fatigue and energy expenditure. A simulation model is generated to suit the procedure and comply with the constraints in the workspace. This requires verifying the capability of human postures and movements in different working conditions in order to evaluate the effectiveness of the new design. In this article a simple human performance measure is introduced that enables a mathematical model for evaluating a cost function. The basic scheme is to evaluate performance in the form of several cost factors using AI techniques. The two main cost factors taken into consideration are the discomfort factor and the effort factor in limb movements. The discomfort factor measures the level of discomfort associated with moving a given limb from its most neutral position to its final position, and the effort factor is a measure of the displacement of the corresponding limbs from their original positions. The basic aim is to optimize the movement of the limbs with respect to these cost functions. The effectiveness of the procedure is tested on the working procedure of workers stacking fly ash bricks in a local fly ash brick manufacturing unit. The objective is to find the optimized movement of the limbs that reduces the discomfort level and the effort required of the workers. The effectiveness of the procedure in this case study is illustrated with the obtained results.
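
    A minimal sketch of the kind of two-term cost function described above follows: a discomfort term penalising deviation of each joint from its neutral position and an effort term penalising displacement from the starting posture, both normalised by joint range. The weights, joint set and angles are hypothetical.

    ```python
    import numpy as np

    def posture_cost(q, q_neutral, q_start, joint_range,
                     w_discomfort=1.0, w_effort=1.0):
        """Weighted sum of a discomfort term (deviation from the neutral posture,
        normalised by joint range) and an effort term (displacement from the
        starting posture). All angle vectors are in degrees."""
        q, qn, q0, r = map(lambda x: np.asarray(x, float),
                           (q, q_neutral, q_start, joint_range))
        discomfort = np.sum(((q - qn) / r) ** 2)
        effort = np.sum(((q - q0) / r) ** 2)
        return w_discomfort * discomfort + w_effort * effort

    # Hypothetical 3-joint posture (e.g. shoulder, elbow, trunk), angles in degrees
    print(posture_cost(q=[60, 90, 20], q_neutral=[20, 70, 0],
                       q_start=[10, 80, 5], joint_range=[180, 150, 60]))
    ```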

  3. Discordance between net analyte signal theory and practical multivariate calibration.

    PubMed

    Brown, Christopher D

    2004-08-01

    Lorber's concept of net analyte signal is reviewed in the context of classical and inverse least-squares approaches to multivariate calibration. It is shown that, in the presence of device measurement error, the classical and inverse calibration procedures have radically different theoretical prediction objectives, and the assertion that the popular inverse least-squares procedures (including partial least squares, principal components regression) approximate Lorber's net analyte signal vector in the limit is disproved. Exact theoretical expressions for the prediction error bias, variance, and mean-squared error are given under general measurement error conditions, which reinforce the very discrepant behavior between these two predictive approaches, and Lorber's net analyte signal theory. Implications for multivariate figures of merit and numerous recently proposed preprocessing treatments involving orthogonal projections are also discussed.

  4. Analytic thinking promotes religious disbelief.

    PubMed

    Gervais, Will M; Norenzayan, Ara

    2012-04-27

    Scientific interest in the cognitive underpinnings of religious belief has grown in recent years. However, to date, little experimental research has focused on the cognitive processes that may promote religious disbelief. The present studies apply a dual-process model of cognitive processing to this problem, testing the hypothesis that analytic processing promotes religious disbelief. Individual differences in the tendency to analytically override initially flawed intuitions in reasoning were associated with increased religious disbelief. Four additional experiments provided evidence of causation, as subtle manipulations known to trigger analytic processing also encouraged religious disbelief. Combined, these studies indicate that analytic processing is one factor (presumably among several) that promotes religious disbelief. Although these findings do not speak directly to conversations about the inherent rationality, value, or truth of religious beliefs, they illuminate one cognitive factor that may influence such discussions.

  5. Evaluation of the risk factors associated with rectal neuroendocrine tumors: a big data analytic study from a health screening center.

    PubMed

    Pyo, Jeung Hui; Hong, Sung Noh; Min, Byung-Hoon; Lee, Jun Haeng; Chang, Dong Kyung; Rhee, Poong-Lyul; Kim, Jae Jun; Choi, Sun Kyu; Jung, Sin-Ho; Son, Hee Jung; Kim, Young-Ho

    2016-12-01

    Rectal neuroendocrine tumor (NET) is the most common NET in Asia. The risk factors associated with rectal NETs are unclear because of the overall low incidence rate of these tumors and the associated difficulty in conducting large epidemiological studies on rare cases. The aim of this study was to exploit the benefits of big data analytics to assess the risk factors associated with rectal NET. A retrospective case-control study was conducted, including 102 patients with histologically confirmed rectal NETs and 52,583 healthy controls who underwent screening colonoscopy at the Center for Health Promotion of the Samsung Medical Center in Korea between January 2002 and December 2012. Information on different risk factors was collected and logistic regression analysis was applied to identify predictive factors. Four factors were significantly associated with rectal NET: higher levels of cholesterol [odds ratio (OR) = 1.007, 95 % confidence interval (CI), 1.001-1.013, p = 0.016] and ferritin (OR = 1.502, 95 % CI, 1.167-1.935, p = 0.002), presence of metabolic syndrome (OR = 1.768, 95 % CI, 1.071-2.918, p = 0.026), and family history of cancer among first-degree relatives (OR = 1.664, 95 % CI, 1.019-2.718, p = 0.042). The findings of our study demonstrate the benefits of using big data analytics for research and clinical risk factor studies. Specifically, in this study, this analytical method was applied to identify higher levels of serum cholesterol and ferritin, metabolic syndrome, and family history of cancer as factors that may explain the increasing incidence and prevalence of rectal NET.
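
    As an illustration of how such adjusted odds ratios are typically obtained, the sketch below fits a logistic regression with statsmodels and exponentiates the coefficients and their confidence limits; the data frame and predictor names are placeholders, not the study's dataset.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # Placeholder data frame; column names are hypothetical, not the study's variables
    rng = np.random.default_rng(2)
    n = 1000
    df = pd.DataFrame({
        "cholesterol": rng.normal(190, 30, n),
        "ferritin": rng.normal(90, 40, n),
        "metabolic_syndrome": rng.integers(0, 2, n),
        "family_history": rng.integers(0, 2, n),
    })
    df["rectal_net"] = rng.integers(0, 2, n)      # binary outcome (placeholder)

    X = sm.add_constant(df[["cholesterol", "ferritin",
                            "metabolic_syndrome", "family_history"]])
    fit = sm.Logit(df["rectal_net"], X).fit(disp=0)

    odds_ratios = pd.DataFrame({"OR": np.exp(fit.params),
                                "CI_low": np.exp(fit.conf_int()[0]),
                                "CI_high": np.exp(fit.conf_int()[1])})
    print(odds_ratios.round(3))
    ```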

  6. Monte Carlo investigation of backscatter factors for skin dose determination in interventional neuroradiology procedures

    NASA Astrophysics Data System (ADS)

    Omar, Artur; Benmakhlouf, Hamza; Marteinsdottir, Maria; Bujila, Robert; Nowik, Patrik; Andreo, Pedro

    2014-03-01

    Complex interventional and diagnostic x-ray angiographic (XA) procedures may yield patient skin doses exceeding the threshold for radiation induced skin injuries. Skin dose is conventionally determined by converting the incident air kerma free-in-air into entrance surface air kerma, a process that requires the use of backscatter factors. Subsequently, the entrance surface air kerma is converted into skin kerma using mass energy-absorption coefficient ratios tissue-to-air, which for the photon energies used in XA is identical to the skin dose. The purpose of this work was to investigate how the cranial bone affects backscatter factors for the dosimetry of interventional neuroradiology procedures. The PENELOPE Monte Carlo system was used to calculate backscatter factors at the entrance surface of a spherical and a cubic water phantom that includes a cranial bone layer. The simulations were performed for different clinical x-ray spectra, field sizes, and thicknesses of the bone layer. The results show a reduction of up to 15% when a cranial bone layer is included in the simulations, compared with conventional backscatter factors calculated for a homogeneous water phantom. The reduction increases for thicker bone layers, softer incident beam qualities, and larger field sizes, indicating that, due to the increased photoelectric cross-section of cranial bone compared to water, the bone layer acts primarily as an absorber of low-energy photons. For neurointerventional radiology procedures, backscatter factors calculated at the entrance surface of a water phantom containing a cranial bone layer increase the accuracy of the skin dose determination.

  7. Guided-inquiry laboratory experiments to improve students' analytical thinking skills

    NASA Astrophysics Data System (ADS)

    Wahyuni, Tutik S.; Analita, Rizki N.

    2017-12-01

    This study aims to improve the quality of experiment implementation and the analytical thinking skills of undergraduate students through guided-inquiry laboratory experiments. The study was a classroom action research conducted in three cycles, carried out with 38 second-semester undergraduate students of the Biology Education Department of the State Islamic Institute (SII) of Tulungagung, as part of a Chemistry for Biology course. The research instruments were lesson plans, learning observation sheets, and the undergraduate students' experimental procedures. Research data were analyzed using a quantitative-descriptive method. The increase in analytical thinking skills was measured using the normalized gain score and a paired t-test. The results showed that the guided-inquiry laboratory experiment model was able to improve both the quality of experiment implementation and analytical thinking skills. The N-gain score for analytical thinking skills increased, although only by 0.03, which falls in the low-gain category, as indicated by the experimental reports. Some undergraduate students had difficulty detecting the relation of one part to another and to the overall structure. The findings suggest that giving feedback on procedural knowledge and on experimental reports is important, and that revising the experimental procedure by adding scaffolding questions is also needed.
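
    The normalized gain (N-gain) referred to above is not defined in the abstract; the usual Hake definition, stated here as an assumption about the score used, is

    ```latex
    \[
      \langle g \rangle \;=\; \frac{S_{\mathrm{post}} - S_{\mathrm{pre}}}{S_{\mathrm{max}} - S_{\mathrm{pre}}}
    \]
    ```

    with values below 0.3 conventionally classed as low gain, which is consistent with the reported value of 0.03.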

  8. An analytical design procedure for the determination of effective leading edge extensions on thick delta wings

    NASA Technical Reports Server (NTRS)

    Ghaffari, F.; Chaturvedi, S. K.

    1984-01-01

    An analytical design procedure for leading edge extensions (LEE) was developed for thick delta wings. This LEE device is designed to be mounted to a wing along the pseudo-stagnation stream surface associated with the attached flow design lift coefficient of greater than zero. The intended purpose of this device is to improve the aerodynamic performance of high subsonic and low supersonic aircraft at incidences above that of attached flow design lift coefficient, by using a vortex system emanating along the leading edges of the device. The low pressure associated with these vortices would act on the LEE upper surface and the forward facing area at the wing leading edges, providing an additional lift and effective leading edge thrust recovery. The first application of this technique was to a thick, round edged, twisted and cambered wing of approximately triangular planform having a sweep of 58 deg and aspect ratio of 2.30. The panel aerodynamics and vortex lattice method with suction analogy computer codes were employed to determine the pseudo-stagnation stream surface and an optimized LEE planform shape.

  9. A sensitive analytical procedure for monitoring acrylamide in environmental water samples by offline SPE-UPLC/MS/MS.

    PubMed

    Togola, Anne; Coureau, Charlotte; Guezennec, Anne-Gwenaëlle; Touzé, Solène

    2015-05-01

    The presence of acrylamide in natural systems is of concern from both environmental and health points of view. We developed an accurate and robust analytical procedure (offline solid phase extraction combined with UPLC/MS/MS) with a limit of quantification (20 ng L⁻¹) compatible with toxicity threshold values. The solid phase extraction (SPE) step, optimized with respect to the nature of the extraction phases, sampling volumes, and elution solvent, was validated according to ISO/IEC 17025 on groundwater, surface water, and industrial process water samples. Acrylamide is highly polar, which induces high variability during the SPE step and therefore requires the use of ¹³C-labeled acrylamide as an internal standard to guarantee the accuracy and robustness of the method (uncertainty of about 25 % (k = 2) at the limit of quantification). The specificity of the method and the stability of acrylamide were studied for these environmental media, and it was shown that the method is suitable for measuring acrylamide in environmental studies.

  10. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics

    PubMed Central

    2016-01-01

    Background: We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. Objective: To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. Methods: The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Results: Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence

  11. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics.

    PubMed

    Hoyt, Robert Eugene; Snider, Dallas; Thompson, Carla; Mantravadi, Sarita

    2016-10-11

    We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix

  12. Sample Collection Procedures and Strategies

    EPA Pesticide Factsheets

    Individuals responsible for collecting environmental and building material samples following a contamination incident can use these procedures to plan for and/or collect samples for analysis using the analytical methods listed in EPA's SAM

  13. Scattering from phase-separated vesicles. I. An analytical form factor for multiple static domains

    DOE PAGES

    Heberle, Frederick A.; Anghel, Vinicius N. P.; Katsaras, John

    2015-08-18

    This is the first in a series of studies considering elastic scattering from laterally heterogeneous lipid vesicles containing multiple domains. Unique among biophysical tools, small-angle neutron scattering can in principle give detailed information about the size, shape and spatial arrangement of domains. A general theory for scattering from laterally heterogeneous vesicles is presented, and the analytical form factor for static domains with arbitrary spatial configuration is derived, including a simplification for uniformly sized round domains. The validity of the model, including series truncation effects, is assessed by comparison with simulated data obtained from a Monte Carlo method. Several aspects of the analytical solution for scattering intensity are discussed in the context of small-angle neutron scattering data, including the effect of varying domain size and number, as well as solvent contrast. Finally, the analysis indicates that effects of domain formation are most pronounced when the vesicle's average scattering length density matches that of the surrounding solvent.

  14. Control Chart on Semi Analytical Weighting

    NASA Astrophysics Data System (ADS)

    Miranda, G. S.; Oliveira, C. C.; Silva, T. B. S. C.; Stellato, T. B.; Monteiro, L. R.; Marques, J. R.; Faustino, M. G.; Soares, S. M. V.; Ulrich, J. C.; Pires, M. A. F.; Cotrim, M. E. B.

    2018-03-01

    Semi-analytical balance verification aims to assess balance performance using graphs that illustrate measurement dispersion through time and to demonstrate that measurements were performed in a reliable manner. This study presents the internal quality control of a semi-analytical balance (GEHAKA BG400) using control charts. From 2013 to 2016, two weight standards were monitored before any balance operation. This work evaluated whether any significant difference or bias appeared in the weighing procedure over time, in order to check the reliability of the generated data. It also exemplifies how control intervals are established.
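
    A minimal sketch of an individuals control chart for repeated check-weighings of a mass standard follows: the centre line and 2s/3s limits are estimated from a baseline period, and later readings are flagged when they fall outside the action limits. The readings are simulated placeholders, not the GEHAKA BG400 data.

    ```python
    import numpy as np

    # Illustrative daily check-weighings of a 100 g standard (grams)
    rng = np.random.default_rng(3)
    readings = 100.0 + rng.normal(0, 0.0004, size=60)

    baseline = readings[:20]                 # baseline period used to set the limits
    centre, s = baseline.mean(), baseline.std(ddof=1)
    warning = (centre - 2 * s, centre + 2 * s)   # 2s warning limits
    action = (centre - 3 * s, centre + 3 * s)    # 3s action limits

    out_of_control = np.where((readings < action[0]) | (readings > action[1]))[0]
    print(f"centre = {centre:.5f} g")
    print(f"2s limits = {warning}, 3s limits = {action}")
    print("points beyond 3s:", out_of_control)
    ```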

  15. Systematic investigation of ion suppression and enhancement effects of fourteen stable-isotope-labeled internal standards by their native analogues using atmospheric-pressure chemical ionization and electrospray ionization and the relevance for multi-analyte liquid chromatographic/mass spectrometric procedures.

    PubMed

    Remane, Daniela; Wissenbach, Dirk K; Meyer, Markus R; Maurer, Hans H

    2010-04-15

    In clinical and forensic toxicology, multi-analyte procedures are very useful to quantify drugs and poisons of different classes in one run. For liquid chromatographic/tandem mass spectrometric (LC/MS/MS) multi-analyte procedures, often only a limited number of stable-isotope-labeled internal standards (SIL-ISs) are available. If an SIL-IS is used for quantification of other analytes, it must be excluded that the co-eluting native analyte influences its ionization. Therefore, the effect of ion suppression and enhancement of fourteen SIL-ISs caused by their native analogues has been studied. It could be shown that the native analyte concentration influenced the extent of ion suppression and enhancement effects leading to more suppression with increasing analyte concentration especially when electrospray ionization (ESI) was used. Using atmospheric-pressure chemical ionization (APCI), methanolic solution showed mainly enhancement effects, whereas no ion suppression and enhancement effect, with one exception, occurred when plasma extracts were used under these conditions. Such differences were not observed using ESI. With ESI, eleven SIL-ISs showed relevant suppression effects, but only one analyte showed suppression effects when APCI was used. The presented study showed that ion suppression and enhancement tests using matrix-based samples of different sources are essential for the selection of ISs, particularly if used for several analytes to avoid incorrect quantification. In conclusion, only SIL-ISs should be selected for which no suppression and enhancement effects can be observed. If not enough ISs are free of ionization interferences, a different ionization technique should be considered. 2010 John Wiley & Sons, Ltd.

  16. The analytical solution for drug delivery system with nonhomogeneous moving boundary condition

    NASA Astrophysics Data System (ADS)

    Saudi, Muhamad Hakimi; Mahali, Shalela Mohd; Harun, Fatimah Noor

    2017-08-01

    This paper discusses the development and analytical solution of a mathematical model of drug release from a swelling delivery device. The model is represented by a one-dimensional advection-diffusion equation with a nonhomogeneous moving boundary condition. The solution procedure consists of three major steps. Firstly, a steady-state solution method is applied to transform the nonhomogeneous moving boundary condition into a homogeneous one. Secondly, the Landau transformation technique is applied, which removes the advection term from the equation and transforms the moving boundary condition into a fixed boundary condition. Thirdly, the separation of variables method is used to find the analytical solution of the resulting initial boundary value problem. The results show that the swelling rate of the delivery device and the drug release rate are influenced by the value of the growth factor r.
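
    A generic form of the model described above, stated as an assumption since the abstract does not give the equations: a one-dimensional advection-diffusion equation on the swelling domain 0 < x < s(t), together with the Landau transformation ξ = x/s(t) that maps the moving domain onto the fixed interval [0, 1].

    ```latex
    \[
      \frac{\partial c}{\partial t} + v\,\frac{\partial c}{\partial x}
        = D\,\frac{\partial^{2} c}{\partial x^{2}},
      \qquad 0 < x < s(t),
    \]
    \[
      \xi = \frac{x}{s(t)} \in [0,1],
      \qquad
      \left.\frac{\partial c}{\partial t}\right|_{x}
        = \left.\frac{\partial c}{\partial t}\right|_{\xi}
          - \frac{\xi\,\dot{s}(t)}{s(t)}\,\frac{\partial c}{\partial \xi}.
    \]
    ```

    On the fixed ξ-domain the problem becomes amenable to the separation of variables step the abstract describes.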

  17. Diosgenin: Recent Highlights on Pharmacology and Analytical Methodology.

    PubMed

    Jesus, Mafalda; Martins, Ana P J; Gallardo, Eugenia; Silvestre, Samuel

    2016-01-01

    Diosgenin, a steroidal sapogenin, occurs abundantly in plants such as Dioscorea alata , Smilax China, and Trigonella foenum graecum . This bioactive phytochemical not only is used as an important starting material for the preparation of several steroidal drugs in the pharmaceutical industry, but has revealed also high potential and interest in the treatment of various types of disorders such as cancer, hypercholesterolemia, inflammation, and several types of infections. Due to its pharmacological and industrial importance, several extraction and analytical procedures have been developed and applied over the years to isolate, detect, and quantify diosgenin, not only in its natural sources and pharmaceutical compositions, but also in animal matrices for pharmacodynamic, pharmacokinetic, and toxicological studies. Within these, HPLC technique coupled to different detectors is the most commonly analytical procedure described for this compound. However, other alternative methods were also published. Thus, the present review aims to provide collective information on the most recent pharmacological data on diosgenin and on the most relevant analytical techniques used to isolate, detect, and quantify this compound as well.

  18. Diosgenin: Recent Highlights on Pharmacology and Analytical Methodology

    PubMed Central

    2016-01-01

    Diosgenin, a steroidal sapogenin, occurs abundantly in plants such as Dioscorea alata, Smilax China, and Trigonella foenum graecum. This bioactive phytochemical not only is used as an important starting material for the preparation of several steroidal drugs in the pharmaceutical industry, but has revealed also high potential and interest in the treatment of various types of disorders such as cancer, hypercholesterolemia, inflammation, and several types of infections. Due to its pharmacological and industrial importance, several extraction and analytical procedures have been developed and applied over the years to isolate, detect, and quantify diosgenin, not only in its natural sources and pharmaceutical compositions, but also in animal matrices for pharmacodynamic, pharmacokinetic, and toxicological studies. Within these, HPLC technique coupled to different detectors is the most commonly analytical procedure described for this compound. However, other alternative methods were also published. Thus, the present review aims to provide collective information on the most recent pharmacological data on diosgenin and on the most relevant analytical techniques used to isolate, detect, and quantify this compound as well. PMID:28116217

  19. Automated dynamic analytical model improvement for damped structures

    NASA Technical Reports Server (NTRS)

    Fuh, J. S.; Berman, A.

    1985-01-01

    A method is described to improve a linear nonproportionally damped analytical model of a structure. The procedure finds the smallest changes in the analytical model such that the improved model matches the measured modal parameters. Features of the method are: (1) the ability to properly treat complex-valued modal parameters of a damped system; (2) applicability to realistically large structural models; and (3) computational efficiency, without involving eigensolutions or inversion of a large matrix.

  20. Sensitivity of fish density estimates to standard analytical procedures applied to Great Lakes hydroacoustic data

    USGS Publications Warehouse

    Kocovsky, Patrick M.; Rudstam, Lars G.; Yule, Daniel L.; Warner, David M.; Schaner, Ted; Pientka, Bernie; Deller, John W.; Waterfield, Holly A.; Witzel, Larry D.; Sullivan, Patrick J.

    2013-01-01

    Standardized methods of data collection and analysis ensure quality and facilitate comparisons among systems. We evaluated the importance of three recommendations from the Standard Operating Procedure for hydroacoustics in the Laurentian Great Lakes (GLSOP) on density estimates of target species: noise subtraction; setting volume backscattering strength (Sv) thresholds from user-defined minimum target strength (TS) of interest (TS-based Sv threshold); and calculations of an index for multiple targets (Nv index) to identify and remove biased TS values. Eliminating noise had the predictable effect of decreasing density estimates in most lakes. Using the TS-based Sv threshold decreased fish densities in the middle and lower layers in the deepest lakes with abundant invertebrates (e.g., Mysis diluviana). Correcting for biased in situ TS increased measured density up to 86% in the shallower lakes, which had the highest fish densities. The current recommendations by the GLSOP significantly influence acoustic density estimates, but the degree of importance is lake dependent. Applying GLSOP recommendations, whether in the Laurentian Great Lakes or elsewhere, will improve our ability to compare results among lakes. We recommend further development of standards, including minimum TS and analytical cell size, for reducing the effect of biased in situ TS on density estimates.

  1. Pressure Ulcer Prevalence and Risk Factors among Prolonged Surgical Procedures in the OR

    PubMed Central

    Primiano, Mike; Friend, Michael; McClure, Connie; Nardi, Scott; Fix, Lisa; Schafer, Marianne; Savochka, Kathlyn; McNett, Molly

    2015-01-01

    Pressure ulcer formation related to positioning in the OR increases length of hospital stay and hospital costs, but there is little evidence documenting how positioning devices used in the OR influence pressure ulcer development when examined with traditional risk factors. The aim of this prospective cohort study was to identify prevalence of and risk factors associated with pressure ulcer development among patients undergoing surgical procedures lasting longer than three hours. Participants included all adult same-day admit patients scheduled for a three-hour surgical procedure during an eight-month period (N = 258). Data were gathered preoperatively, intraoperatively, and postoperatively on pressure ulcer risk factors. Bivariate analyses indicated that the type of positioning (ie, heels elevated) (χ2 = 7.897, P = .048), OR bed surface (ie, foam table pad) (χ2 = 15.848, P = .000), skin assessment in the postanesthesia care unit (χ2 = 41.652, P = .000), and male gender (χ2 = 6.984, P = .030) were associated with pressure ulcer development. Logistic regression analyses indicated that use of foam pad (B = 2.691, P = .024) and a lower day-one Braden score (B = .244, P = .003) were predictive of pressure ulcers. PMID:22118201

  2. ADRA2B Deletion Variant and Enhanced Cognitive Processing of Emotional Information: A Meta-Analytical Review.

    PubMed

    Xie, Weizhen; Cappiello, Marcus; Meng, Ming; Rosenthal, Robert; Zhang, Weiwei

    2018-05-08

    This meta-analytical review examines whether a deletion variant in ADRA2B, a gene that encodes α 2B adrenoceptor in the regulation of norepinephrine availability, influences cognitive processing of emotional information in human observers. Using a multilevel modeling approach, this meta-analysis of 16 published studies with a total of 2,752 participants showed that ADRA2B deletion variant was significantly associated with enhanced perceptual and cognitive task performance for emotional stimuli. In contrast, this genetic effect did not manifest in overall task performance when non-emotional content was used. Furthermore, various study-level factors, such as targeted cognitive processes (memory vs. attention/perception) and task procedures (recall vs. recognition), could moderate the size of this genetic effect. Overall, with increased statistical power and standardized analytical procedures, this meta-analysis has established the contributions of ADRA2B to the interactions between emotion and cognition, adding to the growing literature on individual differences in attention, perception, and memory for emotional information in the general population. Copyright © 2018 Elsevier Ltd. All rights reserved.
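
    For illustration, the sketch below implements a plain random-effects (DerSimonian-Laird) pooling step as a simpler stand-in for the multilevel model used in the review; the effect sizes and variances are placeholders, not the 16 studies analysed.

    ```python
    import numpy as np

    def dersimonian_laird(effects, variances):
        """Random-effects pooled estimate with the DerSimonian-Laird tau^2 estimator.
        Returns (pooled effect, standard error, tau^2)."""
        y = np.asarray(effects, float)
        v = np.asarray(variances, float)
        w = 1.0 / v                                   # fixed-effect weights
        y_fe = np.sum(w * y) / np.sum(w)
        q = np.sum(w * (y - y_fe) ** 2)               # Cochran's Q
        k = len(y)
        tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
        w_re = 1.0 / (v + tau2)                       # random-effects weights
        pooled = np.sum(w_re * y) / np.sum(w_re)
        se = np.sqrt(1.0 / np.sum(w_re))
        return pooled, se, tau2

    # Placeholder per-study standardized mean differences and their variances
    effects = [0.25, 0.10, 0.40, 0.18, 0.32]
    variances = [0.02, 0.05, 0.03, 0.04, 0.02]
    print(dersimonian_laird(effects, variances))
    ```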

  3. A New Project-Based Lab for Undergraduate Environmental and Analytical Chemistry

    ERIC Educational Resources Information Center

    Adami, Gianpiero

    2006-01-01

    A new project-based lab was developed for third year undergraduate chemistry students based on real world applications. The experience suggests that the total analytical procedure (TAP) project offers a stimulating alternative for delivering science skills and developing a greater interest for analytical chemistry and environmental sciences and…

  4. Human factors/ergonomics implications of big data analytics: Chartered Institute of Ergonomics and Human Factors annual lecture.

    PubMed

    Drury, Colin G

    2015-01-01

    In recent years, advances in sensor technology, connectedness and computational power have come together to produce huge data-sets. The treatment and analysis of these data-sets is known as big data analytics (BDA); the somewhat related term data mining is also used. Fields allied to human factors/ergonomics (HFE), e.g. statistics, have developed computational methods to derive meaningful, actionable conclusions from these databases. This paper examines BDA, often characterised by volume, velocity and variety, giving examples of successful BDA use. This examination provides context by considering examples of using BDA on human data, using BDA in HFE studies, and studies of how people perform BDA. Significant issues for HFE are the reliance of BDA on correlation rather than hypotheses and theory, the ethics of BDA and the use of HFE in data visualisation.

  5. ANALYTICAL PROCEDURES FOR CHARACTERIZING UNREGULATED EMISSIONS FROM VEHICLES USING MIDDLE-DISTILLATE FUELS

    EPA Science Inventory

    This research program was initiated with the objective of developing, codifying and testing a group of chemical analytical methods for measuring toxic compounds in the exhaust of distillate-fueled engines (i.e. diesel, gas turbine, Stirling, or Rankine cycle powerplants). It is a ...

  6. Replica Analysis for Portfolio Optimization with Single-Factor Model

    NASA Astrophysics Data System (ADS)

    Shinzato, Takashi

    2017-06-01

    In this paper, we use replica analysis to investigate the influence of correlation among the return rates of assets on the solution of the portfolio optimization problem. We consider the behavior of an optimal solution for the case where the return rate is described with a single-factor model and compare the findings obtained from our proposed methods with correlated return rates with those obtained with independent return rates. We then analytically assess the increase in the investment risk when correlation is included. Furthermore, we also compare our approach with analytical procedures for minimizing the investment risk from operations research.

  7. How to conduct External Quality Assessment Schemes for the pre-analytical phase?

    PubMed

    Kristensen, Gunn B B; Aakre, Kristin Moberg; Kristoffersen, Ann Helen; Sandberg, Sverre

    2014-01-01

    In laboratory medicine, several studies have described the most frequent errors in the different phases of the total testing process, and a large proportion of these errors occur in the pre-analytical phase. Schemes for registration of errors and subsequent feedback to the participants have been conducted for decades concerning the analytical phase by External Quality Assessment (EQA) organizations operating in most countries. The aim of the paper is to present an overview of different types of EQA schemes for the pre-analytical phase, and give examples of some existing schemes. So far, very few EQA organizations have focused on the pre-analytical phase, and most EQA organizations do not offer pre-analytical EQA schemes (EQAS). It is more difficult to perform and standardize pre-analytical EQAS, and accreditation bodies also do not ask the laboratories for results from such schemes. However, some ongoing EQA programs for the pre-analytical phase do exist, and some examples are given in this paper. The methods used can be divided into three different types: collecting information about pre-analytical laboratory procedures, circulating real samples to collect information about interferences that might affect the measurement procedure, or registering actual laboratory errors and relating these to quality indicators. These three types have different focuses and different challenges regarding implementation, and a combination of the three is probably necessary to be able to detect and monitor the wide range of errors occurring in the pre-analytical phase.

  8. FDA Bacteriological Analytical Manual, Chapter 10, 2003: Listeria monocytogenes

    EPA Pesticide Factsheets

    FDA Bacteriological Analytical Manual, Chapter 10 describes procedures for analysis of food samples and may be adapted for assessment of solid, particulate, aerosol, liquid and water samples containing Listeria monocytogenes.

  9. Main clinical, therapeutic and technical factors related to patient's maximum skin dose in interventional cardiology procedures

    PubMed Central

    Journy, N; Sinno-Tellier, S; Maccia, C; Le Tertre, A; Pirard, P; Pagès, P; Eilstein, D; Donadieu, J; Bar, O

    2012-01-01

    Objective The study aimed to characterise the factors related to the X-ray dose delivered to the patient's skin during interventional cardiology procedures. Methods We studied 177 coronary angiographies (CAs) and/or percutaneous transluminal coronary angioplasties (PTCAs) carried out in a French clinic on the same radiography table. The clinical and therapeutic characteristics, and the technical parameters of the procedures, were collected. The dose area product (DAP) and the maximum skin dose (MSD) were measured by an ionisation chamber (Diamentor; Philips, Amsterdam, The Netherlands) and radiosensitive film (Gafchromic; International Specialty Products Advanced Materials Group, Wayne, NJ). Multivariate analyses were used to assess the effects of the factors of interest on dose. Results The mean MSD and DAP were respectively 389 mGy and 65 Gy cm−2 for CAs, and 916 mGy and 69 Gy cm−2 for PTCAs. For 8% of the procedures, the MSD exceeded 2 Gy. Although a linear relationship between the MSD and the DAP was observed for CAs (r=0.93), a simple extrapolation of such a model to PTCAs would lead to an inadequate assessment of the risk, especially for the highest dose values. For PTCAs, the body mass index, the therapeutic complexity, the fluoroscopy time and the number of cine frames were independent explanatory factors of the MSD, whoever the practitioner was. Moreover, the effect of technical factors such as collimation, cinematography settings and X-ray tube orientations on the DAP was shown. Conclusion Optimising the technical options for interventional procedures and training staff on radiation protection might notably reduce the dose and ultimately avoid patient skin lesions. PMID:22457404

  10. Resilience: A Meta-Analytic Approach

    ERIC Educational Resources Information Center

    Lee, Ji Hee; Nam, Suk Kyung; Kim, A-Reum; Kim, Boram; Lee, Min Young; Lee, Sang Min

    2013-01-01

    This study investigated the relationship between psychological resilience and its relevant variables by using a meta-analytic method. The results indicated that the largest effect on resilience was found to stem from the protective factors, a medium effect from risk factors, and the smallest effect from demographic factors. (Contains 4 tables.)

  11. Behavior analytic approaches to problem behavior in intellectual disabilities.

    PubMed

    Hagopian, Louis P; Gregory, Meagan K

    2016-03-01

    The purpose of the current review is to summarize recent behavior analytic research on problem behavior in individuals with intellectual disabilities. We have focused our review on studies published from 2013 to 2015, but also included earlier studies that were relevant. Behavior analytic research on problem behavior continues to focus on the use and refinement of functional behavioral assessment procedures and function-based interventions. During the review period, a number of studies reported on procedures aimed at making functional analysis procedures more time efficient. Behavioral interventions continue to evolve, and there were several larger scale clinical studies reporting on multiple individuals. There was increased attention on the part of behavioral researchers to develop statistical methods for analysis of within subject data and continued efforts to aggregate findings across studies through evaluative reviews and meta-analyses. Findings support continued utility of functional analysis for guiding individualized interventions and for classifying problem behavior. Modifications designed to make functional analysis more efficient relative to the standard method of functional analysis were reported; however, these require further validation. Larger scale studies on behavioral assessment and treatment procedures provided additional empirical support for effectiveness of these approaches and their sustainability outside controlled clinical settings.

  12. Using Analytic Hierarchy Process in Textbook Evaluation

    ERIC Educational Resources Information Center

    Kato, Shigeo

    2014-01-01

    This study demonstrates the application of the analytic hierarchy process (AHP) in English language teaching materials evaluation, focusing in particular on its potential for systematically integrating different components of evaluation criteria in a variety of teaching contexts. AHP is a measurement procedure wherein pairwise comparisons are made…

  13. Recent developments in computer vision-based analytical chemistry: A tutorial review.

    PubMed

    Capitán-Vallvey, Luis Fermín; López-Ruiz, Nuria; Martínez-Olmos, Antonio; Erenas, Miguel M; Palma, Alberto J

    2015-10-29

    Chemical analysis based on colour changes recorded with imaging devices is gaining increasing interest. This is due to its several significant advantages, such as simplicity of use, and the fact that it is easily combinable with portable and widely distributed imaging devices, resulting in friendly analytical procedures in many areas that demand out-of-lab applications for in situ and real-time monitoring. This tutorial review covers computer vision-based analytical (CVAC) procedures and systems from 2005 to 2015, a period of time when 87.5% of the papers on this topic were published. The background regarding colour spaces and recent analytical system architectures of interest in analytical chemistry is presented in the form of a tutorial. Moreover, issues regarding images, such as the influence of illuminants, and the most relevant techniques for processing and analysing digital images are addressed. Some of the most relevant applications are then detailed, highlighting their main characteristics. Finally, our opinion about future perspectives is discussed. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. Microbial ecology laboratory procedures manual NASA/MSFC

    NASA Technical Reports Server (NTRS)

    Huff, Timothy L.

    1990-01-01

    An essential part of the efficient operation of any microbiology laboratory involved in sample analysis is a standard procedures manual. The purpose of this manual is to provide concise and well defined instructions on routine technical procedures involving sample analysis and methods for monitoring and maintaining quality control within the laboratory. Of equal importance is the safe operation of the laboratory. This manual outlines detailed procedures to be followed in the microbial ecology laboratory to assure safety, analytical control, and validity of results.

  15. Risk analysis of analytical validations by probabilistic modification of FMEA.

    PubMed

    Barends, D M; Oldenhof, M T; Vredenbregt, M J; Nauta, M J

    2012-05-01

    Risk analysis is a valuable addition to the validation of an analytical chemistry process, enabling the detection not only of technical risks but also of risks related to human failures. Failure Mode and Effect Analysis (FMEA) can be applied, using a categorical risk scoring of the occurrence, detection and severity of failure modes, and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequency and maintaining the categorical scoring of severity. In an example, the results of a traditional FMEA of a Near Infrared (NIR) analytical procedure used for the screening of suspected counterfeit tablets are re-interpreted by this probabilistic modification of FMEA. Using this probabilistic modification of FMEA, the frequency of occurrence of undetected failure mode(s) can be estimated quantitatively for each individual failure mode, for a set of failure modes, and for the full analytical procedure. Copyright © 2012 Elsevier B.V. All rights reserved.
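
    A minimal sketch of the two scoring approaches contrasted above, with invented failure modes, scores and frequencies: the traditional RPN from categorical scores, and a probabilistic variant in which the frequency of an undetected failure is estimated as the occurrence frequency times the probability of non-detection, summed over failure modes for the full procedure.

    ```python
    # Illustrative failure modes for a generic analytical procedure (not from the paper):
    # (name, occurrence score 1-10, detection score 1-10, severity score 1-10,
    #  estimated relative frequency of occurrence, estimated probability of detection)
    failure_modes = [
        ("wrong reference spectrum loaded", 4, 6, 8, 0.02, 0.70),
        ("sample mislabelled",              3, 7, 9, 0.01, 0.50),
        ("instrument drift not noticed",    5, 5, 6, 0.05, 0.80),
    ]

    # Traditional FMEA: Risk Priority Number from categorical scores.
    for name, occ, det, sev, f_occ, p_det in failure_modes:
        print(f"{name:35s} RPN = {occ * det * sev}")

    # Probabilistic modification: frequency of an undetected failure per analysis,
    # keeping severity as a categorical score.
    total_undetected = 0.0
    for name, occ, det, sev, f_occ, p_det in failure_modes:
        f_undet = f_occ * (1.0 - p_det)
        total_undetected += f_undet
        print(f"{name:35s} undetected failure frequency = {f_undet:.4f} (severity {sev})")

    print(f"whole procedure: expected undetected failures per analysis = {total_undetected:.4f}")
    ```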

  16. Analytic H I-to-H2 Photodissociation Transition Profiles

    NASA Astrophysics Data System (ADS)

    Bialy, Shmuel; Sternberg, Amiel

    2016-05-01

    We present a simple analytic procedure for generating atomic (H I) to molecular (H2) density profiles for optically thick hydrogen gas clouds illuminated by far-ultraviolet radiation fields. Our procedure is based on the analytic theory for the structure of one-dimensional H I/H2 photon-dominated regions, presented by Sternberg et al. Depth-dependent atomic and molecular density fractions may be computed for arbitrary gas density, far-ultraviolet field intensity, metallicity-dependent H2 formation rate coefficient, and dust absorption cross section in the Lyman-Werner photodissociation band. We use our procedure to generate a set of H I-to-H2 transition profiles for a wide range of conditions, from the weak- to strong-field limits, and from super-solar down to low metallicities. We show that if presented as functions of dust optical depth, the H I and H2 density profiles depend primarily on the Sternberg "αG parameter" (dimensionless) that determines the dust optical depth associated with the total photodissociated H I column. We derive a universal analytic formula for the H I-to-H2 transition points as a function of just αG. Our formula will be useful for interpreting emission-line observations of H I/H2 interfaces, for estimating star formation thresholds, and for sub-grid components in hydrodynamics simulations.

  17. The Case for Adopting Server-side Analytics

    NASA Astrophysics Data System (ADS)

    Tino, C.; Holmes, C. P.; Feigelson, E.; Hurlburt, N. E.

    2017-12-01

    The standard method for accessing Earth and space science data relies on a scheme developed decades ago: data residing in one or many data stores must be parsed out and shipped via internet lines or physical transport to the researcher, who in turn locally stores the data for analysis. The analysis tasks are varied and include visualization, parameterization, and comparison with or assimilation into physics models. In many cases this process is inefficient and unwieldy as the data sets become larger and demands on the analysis tasks become more sophisticated and complex. For about a decade, several groups have explored a new paradigm as an alternative to this model. The names applied to the paradigm include "data analytics", "climate analytics", and "server-side analytics". The general concept is that, in close network proximity to the data store, there will be a tailored processing capability appropriate to the type and use of the data served. The user of the server-side analytics will operate on the data with numerical procedures. The procedures can be accessed via canned code, a scripting processor, or an analysis package such as Matlab, IDL or R. Results of the analytics processes will then be relayed via the internet to the user. In practice, these results will be of much lower volume, easier to transport and store locally, and easier for the user to combine with data sets from other remote data stores. The user can also iterate on the processing call to tailor the results as needed. A major component of server-side analytics could be to provide sets of tailored results to end users in order to eliminate the repetitive preconditioning that is both often required with these data sets and which drives much of the throughput challenges. NASA's Big Data Task Force studied this issue. This paper will present the results of this study, including examples of SSAs that are being developed and demonstrated and suggestions for architectures that might be developed for
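
    A minimal sketch of the server-side analytics pattern described above, assuming a hypothetical HTTP endpoint and payload schema (the URL, dataset name and operation are placeholders, not a real service): the client ships a small analysis request to a service near the data store and receives only the reduced result.

    ```python
    import requests  # assumed generic HTTP client; endpoint and schema below are hypothetical

    payload = {
        "dataset": "example/solar_irradiance",          # hypothetical dataset identifier
        "operation": "daily_mean",                      # hypothetical canned procedure
        "time_range": ["2017-01-01", "2017-12-31"],
    }

    # Ship the processing request to the (hypothetical) analytics service co-located
    # with the data store, and receive only the reduced result.
    resp = requests.post("https://data-center.example.org/analytics", json=payload, timeout=60)
    resp.raise_for_status()
    summary = resp.json()                               # small result instead of the raw archive
    print(len(summary["values"]), "daily means returned")
    ```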

  18. Reverse transcription-polymerase chain reaction molecular testing of cytology specimens: Pre-analytic and analytic factors.

    PubMed

    Bridge, Julia A

    2017-01-01

    The introduction of molecular testing into cytopathology laboratory practice has expanded the types of samples considered feasible for identifying genetic alterations that play an essential role in cancer diagnosis and treatment. Reverse transcription-polymerase chain reaction (RT-PCR), a sensitive and specific technical approach for amplifying a defined segment of RNA after it has been reverse-transcribed into its DNA complement, is commonly used in clinical practice for the identification of recurrent or tumor-specific fusion gene events. Real-time RT-PCR (quantitative RT-PCR), a technical variation, also permits the quantitation of products generated during each cycle of the polymerase chain reaction process. This review addresses qualitative and quantitative pre-analytic and analytic considerations of RT-PCR as they relate to various cytologic specimens. An understanding of these aspects of genetic testing is central to attaining optimal results in the face of the challenges that cytology specimens may present. Cancer Cytopathol 2017;125:11-19. © 2016 American Cancer Society.
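
    For the quantitative use of real-time RT-PCR mentioned above, a common (though not review-specific) calculation is relative quantification by the 2^-ΔΔCt approach; the sketch below is a generic illustration with invented Ct values.

    ```python
    # Generic 2^-ΔΔCt relative quantification (illustrative Ct values; this is a
    # widely used real-time RT-PCR calculation, not one prescribed by the review).
    def relative_expression(ct_target_sample, ct_ref_sample, ct_target_control, ct_ref_control):
        d_ct_sample = ct_target_sample - ct_ref_sample      # normalise to a reference gene
        d_ct_control = ct_target_control - ct_ref_control
        dd_ct = d_ct_sample - d_ct_control
        return 2.0 ** (-dd_ct)                              # fold change relative to control

    print(relative_expression(24.1, 18.0, 27.5, 18.2))      # illustrative fold change
    ```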

  19. Quantifying the measurement uncertainty of results from environmental analytical methods.

    PubMed

    Moser, J; Wegscheider, W; Sperka-Gottlieb, C

    2001-07-01

    The Eurachem-CITAC Guide Quantifying Uncertainty in Analytical Measurement was put into practice in a public laboratory devoted to environmental analytical measurements. In doing so due regard was given to the provisions of ISO 17025 and an attempt was made to base the entire estimation of measurement uncertainty on available data from the literature or from previously performed validation studies. Most environmental analytical procedures laid down in national or international standards are the result of cooperative efforts and put into effect as part of a compromise between all parties involved, public and private, that also encompasses environmental standards and statutory limits. Central to many procedures is the focus on the measurement of environmental effects rather than on individual chemical species. In this situation it is particularly important to understand the measurement process well enough to produce a realistic uncertainty statement. Environmental analytical methods will be examined as far as necessary, but reference will also be made to analytical methods in general and to physical measurement methods where appropriate. This paper describes ways and means of quantifying uncertainty for frequently practised methods of environmental analysis. It will be shown that operationally defined measurands are no obstacle to the estimation process as described in the Eurachem/CITAC Guide if it is accepted that the dominating component of uncertainty comes from the actual practice of the method as a reproducibility standard deviation.
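
    A minimal sketch of the bottom-up combination of uncertainty components in the spirit of the Eurachem/CITAC approach described above; the component values, coverage factor and result are illustrative only.

    ```python
    import math

    # Illustrative relative standard uncertainties for a generic environmental method.
    components = {
        "reproducibility (from validation)": 0.030,
        "calibration standard purity":       0.005,
        "volumetric operations":             0.010,
        "recovery/bias correction":          0.015,
    }

    u_rel = math.sqrt(sum(u * u for u in components.values()))  # combined relative standard uncertainty
    result = 12.4                                               # e.g. mg/L, illustrative value
    u_c = u_rel * result                                        # combined standard uncertainty
    U = 2.0 * u_c                                               # expanded uncertainty, coverage factor k = 2

    print(f"result = {result} ± {U:.2f} mg/L (k = 2)")
    ```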
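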

  20. Problematic eating behaviors among bariatric surgical candidates: a psychometric investigation and factor analytic approach.

    PubMed

    Gelinas, Bethany L; Delparte, Chelsea A; Wright, Kristi D; Hart, Regan

    2015-01-01

    Psychological factors (e.g., anxiety, depression) are routinely assessed in bariatric pre-surgical programs, as high levels of psychopathology are consistently related to poor program outcomes (e.g., failure to lose significant weight pre-surgery, weight regain post-surgery). Behavioral factors related to poor program outcomes, and the ways in which behavioral and psychological factors interact, have received little attention in bariatric research and practice. Potentially problematic behavioral factors are queried by Section H of the Weight and Lifestyle Inventory (WALI-H), in which respondents indicate the relevance of certain eating behaviors to obesity. A factor analytic investigation of the WALI-H serves to improve the way in which this assessment tool is interpreted and used among bariatric surgical candidates, and subsequent moderation analyses serve to demonstrate potential compounding influences of psychopathology on eating behavior factors. Bariatric surgical candidates (n = 362) completed several measures of psychopathology and the WALI-H. Item responses from the WALI-H were subjected to principal axis factoring with oblique rotation. Results revealed a three-factor model including: (1) eating in response to negative affect, (2) overeating/desirability of food, and (3) eating in response to positive affect/social cues. All three behavioral factors of the WALI-H were significantly associated with measures of depression and anxiety. Moderation analyses revealed that depression did not moderate the relationship between anxiety and any eating behavior factor. Although single forms of psychopathology are related to eating behaviors, the combination of psychopathology does not appear to influence these problematic behaviors. Recommendations for pre-surgical assessment and treatment of bariatric surgical candidates are discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.
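
    A minimal sketch of principal axis factoring with an oblique rotation, the analysis named above, assuming the third-party Python package factor_analyzer and a hypothetical CSV of item responses (this is not the WALI-H data or the authors' code).

    ```python
    import pandas as pd
    from factor_analyzer import FactorAnalyzer  # assumed third-party package for exploratory factor analysis

    df = pd.read_csv("questionnaire_items.csv")  # hypothetical file of item responses (rows = respondents)

    # Principal axis factoring with an oblique (oblimin) rotation, three factors retained.
    fa = FactorAnalyzer(n_factors=3, method="principal", rotation="oblimin")
    fa.fit(df)

    loadings = pd.DataFrame(fa.loadings_, index=df.columns)
    print(loadings.round(2))                    # which items load on which factor
    print(fa.get_factor_variance())             # variance explained per factor
    ```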

  1. Note: Model identification and analysis of bivalent analyte surface plasmon resonance data.

    PubMed

    Tiwari, Purushottam Babu; Üren, Aykut; He, Jin; Darici, Yesim; Wang, Xuewen

    2015-10-01

    Surface plasmon resonance (SPR) is a widely used, affinity based, label-free biophysical technique to investigate biomolecular interactions. The extraction of rate constants requires accurate identification of the particular binding model. The bivalent analyte model involves coupled non-linear differential equations. No clear procedure to identify the bivalent analyte mechanism has been established. In this report, we propose a unique signature for the bivalent analyte model. This signature can be used to distinguish the bivalent analyte model from other biphasic models. The proposed method is demonstrated using experimentally measured SPR sensorgrams.
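
    A minimal sketch of the coupled non-linear differential equations of a generic bivalent analyte model of the kind referred to above; rate constants, statistical factors and the response definition vary between formulations, so the values and the one-analyte-per-complex response used here are illustrative assumptions, not the authors' model.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    ka1, kd1 = 1e5, 1e-2      # first binding step, A + B <-> AB (illustrative)
    ka2, kd2 = 1e3, 1e-3      # second binding step, AB + B <-> AB2 (illustrative)
    A = 50e-9                 # analyte concentration held constant during association
    B0 = 100.0                # free ligand sites at t = 0 (response units, illustrative)

    def rhs(t, y):
        B, AB, AB2 = y
        r1 = ka1 * A * B - kd1 * AB        # net rate of the first step
        r2 = ka2 * AB * B - kd2 * AB2      # net rate of the second step
        return [-r1 - r2, r1 - r2, r2]

    sol = solve_ivp(rhs, (0.0, 300.0), [B0, 0.0, 0.0], dense_output=True)
    t = np.linspace(0.0, 300.0, 301)
    B, AB, AB2 = sol.sol(t)
    response = AB + AB2                    # one analyte molecule per complex (one convention)
    print("final response:", response[-1].round(2))
    ```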

  2. 37Cl/35Cl isotope ratio analysis in perchlorate by ion chromatography/multicollector ICPMS: Analytical performance and implication for biodegradation studies.

    PubMed

    Zakon, Yevgeni; Ronen, Zeev; Halicz, Ludwik; Gelman, Faina

    2017-10-01

    In the present study we propose a new analytical method for 37Cl/35Cl analysis in perchlorate by Ion Chromatography (IC) coupled to Multicollector Inductively Coupled Plasma Mass Spectrometry (MC-ICPMS). The accuracy of the analytical method was validated by analysis of the international perchlorate standard materials USGS-37 and USGS-38; analytical precision better than ±0.4‰ was achieved. 37Cl/35Cl isotope ratio analysis in perchlorate during a laboratory biodegradation experiment with microbial cultures enriched from contaminated soil in Israel resulted in an isotope enrichment factor ε37Cl = -13.3 ± 1‰, which falls in the range reported previously for perchlorate biodegradation by pure microbial cultures. The proposed analytical method may significantly simplify the procedure for isotope analysis of perchlorate which is currently applied in environmental studies. Copyright © 2017. Published by Elsevier Ltd.

  3. Analytical electron microscopy in mineralogy; exsolved phases in pyroxenes

    USGS Publications Warehouse

    Nord, G.L.

    1982-01-01

    Analytical scanning transmission electron microscopy has been successfully used to characterize the structure and composition of lamellar exsolution products in pyroxenes. At operating voltages of 100 and 200 keV, microanalytical techniques of X-ray energy analysis, convergent-beam electron diffraction, and lattice imaging have been used to chemically and structurally characterize exsolution lamellae only a few unit cells wide. Quantitative X-ray energy analysis using ratios of peak intensities has been adopted for the U.S. Geological Survey AEM in order to study the compositions of exsolved phases and changes in compositional profiles as a function of time and temperature. The quantitative analysis procedure involves 1) removal of instrument-induced background, 2) reduction of contamination, and 3) measurement of correction factors obtained from a wide range of standard compositions. The peak-ratio technique requires that the specimen thickness at the point of analysis be thin enough to make absorption corrections unnecessary (i.e., to satisfy the "thin-foil criteria"). In pyroxenes, the calculated "maximum thicknesses" range from 130 to 1400 nm for the ratios Mg/Si, Fe/Si, and Ca/Si; these "maximum thicknesses" have been contoured in pyroxene composition space as a guide during analysis. Analytical spatial resolutions of 50-100 nm have been achieved in AEM at 200 keV from the composition-profile studies, and analytical reproducibility in AEM from homogeneous pyroxene standards is ±1.5 mol% endmember. © 1982.

  4. Pricing of common cosmetic surgery procedures: local economic factors trump supply and demand.

    PubMed

    Richardson, Clare; Mattison, Gennaya; Workman, Adrienne; Gupta, Subhas

    2015-02-01

    The pricing of cosmetic surgery procedures has long been thought to coincide with laws of basic economics, including the model of supply and demand. However, the highly variable prices of these procedures indicate that additional economic contributors are probable. The authors sought to reassess the fit of cosmetic surgery costs to the model of supply and demand and to determine the driving forces behind the pricing of cosmetic surgery procedures. Ten plastic surgery practices were randomly selected from each of 15 US cities of various population sizes. Average prices of breast augmentation, mastopexy, abdominoplasty, blepharoplasty, and rhytidectomy in each city were compared with economic and demographic statistics. The average price of cosmetic surgery procedures correlated substantially with population size (r = 0.767), cost-of-living index (r = 0.784), cost to own real estate (r = 0.714), and cost to rent real estate (r = 0.695) across the 15 US cities. Cosmetic surgery pricing also was found to correlate (albeit weakly) with household income (r = 0.436) and per capita income (r = 0.576). Virtually no correlations existed between pricing and the density of plastic surgeons (r = 0.185) or the average age of residents (r = 0.076). Results of this study demonstrate a correlation between costs of cosmetic surgery procedures and local economic factors. Cosmetic surgery pricing cannot be completely explained by the supply-and-demand model because no association was found between procedure cost and the density of plastic surgeons. © 2015 The American Society for Aesthetic Plastic Surgery, Inc. Reprints and permission: journals.permissions@oup.com.

  5. 40 CFR 86.214-94 - Analytical gases.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) CONTROL OF EMISSIONS FROM NEW AND IN-USE HIGHWAY VEHICLES AND ENGINES Emission Regulations for 1994 and Later Model Year Gasoline-Fueled New Light-Duty Vehicles, New Light-Duty Trucks and New Medium-Duty Passenger Vehicles; Cold Temperature Test Procedures § 86.214-94 Analytical gases. The provisions of § 86...

  6. The areal reduction factor: A new analytical expression for the Lazio Region in central Italy

    NASA Astrophysics Data System (ADS)

    Mineo, C.; Ridolfi, E.; Napolitano, F.; Russo, F.

    2018-05-01

    For the study and modeling of hydrological phenomena, both in urban and rural areas, a proper estimation of the areal reduction factor (ARF) is crucial. In this paper, we estimated the ARF from observed rainfall data as the ratio between the average rainfall occurring in a specific area and the point rainfall. Then, we compared the obtained ARF values with some of the most widespread empirical approaches in literature which are used when rainfall observations are not available. Results highlight that the literature formulations can lead to a substantial over- or underestimation of the ARF estimated from observed data. These findings can have severe consequences, especially in the design of hydraulic structures where empirical formulations are extensively applied. The aim of this paper is to present a new analytical relationship with an explicit dependence on the rainfall duration and area that can better represent the ARF-area trend over the area case of study. The analytical curve presented here can find an important application to estimate the ARF values for design purposes. The test study area is the Lazio Region (central Italy).
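
    A minimal sketch of the empirical ARF definition used above, the ratio of areal-average to point rainfall, computed from synthetic gauge data (the event depths and gauge network are invented; real applications would also fix a rainfall duration and area).

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_events, n_gauges = 200, 25
    # Synthetic event rainfall depths (mm) at the gauges covering the study area.
    rain = rng.gamma(shape=2.0, scale=10.0, size=(n_events, n_gauges))

    areal_mean = rain.mean(axis=1)          # average rainfall over the area, per event
    point_max = rain.max(axis=1)            # point rainfall (maximum gauge), per event
    arf_per_event = areal_mean / point_max  # empirical ARF, event by event

    print("mean empirical ARF:", arf_per_event.mean().round(3))
    ```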

  7. Correlation of the Capacity Factor in Vesicular Electrokinetic Chromatography with the Octanol:Water Partition Coefficient for Charged and Neutral Analytes

    PubMed Central

    Razak, J. L.; Cutak, B. J.; Larive, C. K.; Lunte, C. E.

    2008-01-01

    Purpose: The aim of this study was to develop a method based upon electrokinetic chromatography (EKC) using oppositely charged surfactant vesicles as a buffer modifier to estimate hydrophobicity (log P) for a range of neutral and charged compounds. Methods: Vesicles were formed from cetyltrimethylammonium bromide (CTAB) and sodium n-octyl sulfate (SOS). The size and polydispersity of the vesicles were characterized by electron microscopy, dynamic light scattering, and pulsed-field gradient NMR (PFG-NMR). PFG-NMR was also used to determine if ion-pairing between cationic analytes and free SOS monomer occurred. The CTAB/SOS vesicles were used as a buffer modifier in capillary electrophoresis (CE). The capacity factor (log k′) was calculated by determining the mobility of the analytes both in the presence and absence of vesicles. Log k′ was determined for 29 neutral and charged analytes. Results: There was a linear relationship between the log of capacity factor (log k′) and octanol/water partition coefficient (log P) for both neutral and basic species at pH 6.0, 7.3, and 10.2. This indicated that interaction between the cation and vesicle was dominated by hydrophobic forces. At pH 4.3, the log k′ values for the least hydrophobic basic analytes were higher than expected, indicating that electrostatic attraction as well as hydrophobic forces contributed to the overall interaction between the cation and vesicle. Anionic compounds could not be evaluated using this system. Conclusion: Vesicular electrokinetic chromatography (VEKC) using surfactant vesicles as buffer modifiers is a promising method for the estimation of hydrophobicity. PMID:11336344

  8. Initial Investigation of Factors Influencing Radiation Dose to Patients Undergoing Barium-Based Fluoroscopy Procedures in Tanzania.

    PubMed

    Ngaile, J E; Msaki, P K; Kazema, R R; Schreiner, L J

    2017-04-25

    The aim of this study was to investigate the nature and causes of the radiation dose imparted to patients undergoing barium-based X-ray fluoroscopy procedures in Tanzania and to compare these doses to those reported in the literature from other regions worldwide. The air kerma area product (KAP) for patients undergoing barium investigations of the gastrointestinal tract was obtained from four consultant hospitals. The KAP was determined using a flat transparent transmission ionization chamber. Mean values of KAP for barium swallow (BS), barium meal (BM) and barium enema (BE) were 2.79, 2.62 and 15.04 Gy cm², respectively. The mean values of KAP per hospital for the BS, BM and BE procedures varied by factors of up to 7.3, 1.6 and 2.0, respectively. Individual patient doses across the four consultant hospitals investigated differed by factors of up to 53, 29.5 and 12 for the BS, BM and BE procedures, respectively. The majority of the mean values of KAP were lower than the reported values for Ghana, Greece, Spain and the UK, while slightly higher than those reported for India. The observed wide variation of KAP values for the same fluoroscopy procedure within and among the hospitals was largely attributed to the dynamic nature of the procedures, the patient characteristics, the skills and experience of personnel, and the different examination protocols employed among hospitals. The observed wide variations of procedural protocols and patient doses within and across the hospitals call for the need to standardize examination protocols and optimize barium-based fluoroscopy procedures. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  9. Application of analytical methods in authentication and adulteration of honey.

    PubMed

    Siddiqui, Amna Jabbar; Musharraf, Syed Ghulam; Choudhary, M Iqbal; Rahman, Atta-Ur-

    2017-02-15

    Honey is synthesized from flower nectar and has been famous for its tremendous therapeutic potential since ancient times. Many factors influence the basic properties of honey, including the nectar-providing plant species, bee species, geographic area, and harvesting conditions. Quality and composition of honey are also affected by many other factors, such as overfeeding of bees with sucrose, harvesting prior to maturity, and adulteration with sugar syrups. Due to the complex nature of honey, it is often challenging to authenticate its purity and quality by using common methods such as physicochemical parameters, and more specialized procedures need to be developed. This article reviews the literature (between 2000 and 2016) on the use of analytical techniques, mainly NMR spectroscopy, for authentication of honey, its botanical and geographical origin, and adulteration by sugar syrups. NMR is a powerful technique and can be used as a fingerprinting technique to compare various samples. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Pre-analytical and analytical aspects affecting clinical reliability of plasma glucose results.

    PubMed

    Pasqualetti, Sara; Braga, Federica; Panteghini, Mauro

    2017-07-01

    The measurement of plasma glucose (PG) plays a central role in recognizing disturbances in carbohydrate metabolism, with established decision limits that are globally accepted. This requires that PG results are reliable and unequivocally valid no matter where they are obtained. To control the pre-analytical variability of PG and prevent in vitro glycolysis, the use of citrate as a rapidly effective glycolysis inhibitor has been proposed. However, the commercial availability of several tube types, together with studies showing different performance, has created confusion among users. Moreover, and more importantly, studies have shown that tubes promptly inhibiting glycolysis give PG results that are significantly higher than tubes containing sodium fluoride only (used in the majority of the studies generating the current PG cut-points), with a different clinical classification of subjects. From the analytical point of view, to be equivalent among different measuring systems, PG results should be traceable to a recognized higher-order reference via the implementation of an unbroken metrological hierarchy. In doing this, it is important that manufacturers of measuring systems consider the uncertainty accumulated through the different steps of the selected traceability chain. In particular, PG results should fulfil analytical performance specifications defined to fit the intended clinical application. Since PG is under tight homeostatic control, its biological variability may be used to define these limits. Alternatively, given the central diagnostic role of the analyte, an outcome model showing the impact of the analytical performance of the test on clinical classification of subjects can be used. Using these specifications, performance assessment studies employing commutable control materials with values assigned by a reference procedure have shown that the quality of PG measurements is often far from desirable and that problems are exacerbated when using point-of-care devices. Copyright © 2017 The Canadian

  11. Standardized Procedure Content And Data Structure Based On Human Factors Requirements For Computer-Based Procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bly, Aaron; Oxstrand, Johanna; Le Blanc, Katya L

    2015-02-01

    Most activities that involve human interaction with systems in a nuclear power plant are guided by procedures. Traditionally, the use of procedures has been a paper-based process that supports safe operation of the nuclear power industry. However, the nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. Advances in digital technology make computer-based procedures (CBPs) a valid option that provides further enhancement of safety by improving human performance related to procedure use. The transition from paper-based procedures (PBPs) to CBPs creates a need for a computer-based procedure system (CBPS). A CBPS needs to have the ability to perform logical operations in order to adjust to the inputs received from either users or real time data from plant status databases. Without the ability for logical operations the procedure is just an electronic copy of the paper-based procedure. In order to provide the CBPS with the information it needs to display the procedure steps to the user, special care is needed in the format used to deliver all data and instructions to create the steps. The procedure should be broken down into basic elements and formatted in a standard method for the CBPS. One way to build the underlying data architecture is to use an Extensible Markup Language (XML) schema, which utilizes basic elements to build each step in the smart procedure. The attributes of each step will determine the type of functionality that the system will generate for that step. The CBPS will provide the context for the step to deliver referential information, request a decision, or accept input from the user. The XML schema needs to provide all data necessary for the system to accurately perform each step without the need for the procedure writer to reprogram the CBPS. The research team at the Idaho National Laboratory has developed a prototype CBPS for field workers as well as
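
    A minimal sketch of the idea described above, building one procedure step from basic XML elements with Python's standard library; the element and attribute names are hypothetical and are not the Idaho National Laboratory schema.

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical structure for a single decision step in a computer-based procedure.
    step = ET.Element("step", id="3.1", type="decision")
    ET.SubElement(step, "instruction").text = "Verify pump discharge pressure is within band."
    ET.SubElement(step, "reference", kind="plant-data").text = "PI-1234"   # live value to display
    branch = ET.SubElement(step, "branch")
    ET.SubElement(branch, "onYes").text = "go-to:3.2"
    ET.SubElement(branch, "onNo").text = "go-to:3.5"

    # Serialize the step; a CBPS could read attributes like type="decision" to decide
    # what functionality (prompt, input field, reference display) to generate.
    print(ET.tostring(step, encoding="unicode"))
    ```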

  12. Human performance modeling for system of systems analytics: combat performance-shaping factors.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawton, Craig R.; Miller, Dwight Peter

    The US military has identified Human Performance Modeling (HPM) as a significant requirement and challenge of future systems modeling and analysis initiatives. To support this goal, Sandia National Laboratories (SNL) has undertaken a program of HPM as an integral augmentation to its system-of-system (SoS) analytics capabilities. The previous effort, reported in SAND2005-6569, evaluated the effects of soldier cognitive fatigue on SoS performance. The current effort began with a very broad survey of any performance-shaping factors (PSFs) that also might affect soldier performance in combat situations. The work included consideration of three different approaches to cognition modeling and how appropriate they would be for application to SoS analytics. The bulk of this report categorizes 47 PSFs into three groups (internal, external, and task-related) and provides brief descriptions of how each affects combat performance, according to the literature. The PSFs were then assembled into a matrix with 22 representative military tasks and assigned one of four levels of estimated negative impact on task performance, based on the literature. Blank versions of the matrix were then sent to two ex-military subject-matter experts to be filled out based on their personal experiences. Data analysis was performed to identify the consensus most influential PSFs. Results indicate that combat-related injury, cognitive fatigue, inadequate training, physical fatigue, thirst, stress, poor perceptual processing, and presence of chemical agents are among the PSFs with the most negative impact on combat performance.

  13. Big Data Analytics Methodology in the Financial Industry

    ERIC Educational Resources Information Center

    Lawler, James; Joseph, Anthony

    2017-01-01

    Firms in industry continue to be attracted by the benefits of Big Data Analytics. The benefits of Big Data Analytics projects may not be as evident as frequently indicated in the literature. The authors of the study evaluate factors in a customized methodology that may increase the benefits of Big Data Analytics projects. Evaluating firms in the…

  14. Latent Class Detection and Class Assignment: A Comparison of the MAXEIG Taxometric Procedure and Factor Mixture Modeling Approaches

    ERIC Educational Resources Information Center

    Lubke, Gitta; Tueller, Stephen

    2010-01-01

    Taxometric procedures such as MAXEIG and factor mixture modeling (FMM) are used in latent class clustering, but they have very different sets of strengths and weaknesses. Taxometric procedures, popular in psychiatric and psychopathology applications, do not rely on distributional assumptions. Their sole purpose is to detect the presence of latent…

  15. Development of coring procedures applied to Si, CdTe, and CIGS solar panels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moutinho, H. R.; Johnston, S.; To, B.

    Most of the research on the performance and degradation of photovoltaic modules is based on macroscale measurements of device parameters such as efficiency, fill factor, open-circuit voltage, and short-circuit current. Our goal is to develop the capabilities to allow us to study the degradation of these parameters in the micro- and nanometer scale and to relate our results to performance parameters. To achieve this objective, the first step is to be able to access small samples from specific areas of the solar panels without changing the properties of the material. In this paper, we describe two coring procedures that we developed and applied to Si, CIGS, and CdTe solar panels. In the first procedure, we cored full samples, whereas in the second we performed a partial coring that keeps the tempered glass intact. The cored samples were analyzed by different analytical techniques before and after coring, at the same locations, and no damage during the coring procedure was observed.

  16. Development of coring procedures applied to Si, CdTe, and CIGS solar panels

    DOE PAGES

    Moutinho, H. R.; Johnston, S.; To, B.; ...

    2018-01-04

    Most of the research on the performance and degradation of photovoltaic modules is based on macroscale measurements of device parameters such as efficiency, fill factor, open-circuit voltage, and short-circuit current. Our goal is to develop the capabilities to allow us to study the degradation of these parameters in the micro- and nanometer scale and to relate our results to performance parameters. To achieve this objective, the first step is to be able to access small samples from specific areas of the solar panels without changing the properties of the material. In this paper, we describe two coring procedures that we developed and applied to Si, CIGS, and CdTe solar panels. In the first procedure, we cored full samples, whereas in the second we performed a partial coring that keeps the tempered glass intact. The cored samples were analyzed by different analytical techniques before and after coring, at the same locations, and no damage during the coring procedure was observed.

  17. Analytic and numeric Green's functions for a two-dimensional electron gas in an orthogonal magnetic field

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cresti, Alessandro; Grosso, Giuseppe; Parravicini, Giuseppe Pastori

    2006-05-15

    We have derived closed analytic expressions for the Green's function of an electron in a two-dimensional electron gas threaded by a uniform perpendicular magnetic field, also in the presence of a uniform electric field and of a parabolic spatial confinement. A workable and powerful numerical procedure for the calculation of the Green's functions for a large infinitely extended quantum wire is considered exploiting a lattice model for the wire, the tight-binding representation for the corresponding matrix Green's function, and the Peierls phase factor in the Hamiltonian hopping matrix element to account for the magnetic field. The numerical evaluation of the Green's function has been performed by means of the decimation-renormalization method, and quite satisfactorily compared with the analytic results worked out in this paper. As an example of the versatility of the numerical and analytic tools here presented, the peculiar semilocal character of the magnetic Green's function is studied in detail because of its basic importance in determining magneto-transport properties in mesoscopic systems.

  18. Analytical and simulator study of advanced transport

    NASA Technical Reports Server (NTRS)

    Levison, W. H.; Rickard, W. W.

    1982-01-01

    An analytic methodology, based on the optimal-control pilot model, was demonstrated for assessing longitudinal-axis handling qualities of transport aircraft in final approach. Calibration of the methodology is largely in terms of closed-loop performance requirements, rather than specific vehicle response characteristics, and is based on a combination of published criteria, pilot preferences, physical limitations, and engineering judgment. Six longitudinal-axis approach configurations were studied covering a range of handling qualities problems, including the presence of flexible aircraft modes. The analytical procedure was used to obtain predictions of Cooper-Harper ratings, a scalar quadratic performance index, and rms excursions of important system variables.

  19. Determination of Total Carbohydrates in Algal Biomass: Laboratory Analytical Procedure (LAP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Wychen, Stefanie; Laurens, Lieve M. L.

    This procedure uses two-step sulfuric acid hydrolysis to hydrolyze the polymeric forms of carbohydrates in algal biomass into monomeric subunits. The monomers are then quantified by either HPLC or a suitable spectrophotometric method.

  20. Fast analytical model of MZI micro-opto-mechanical pressure sensor

    NASA Astrophysics Data System (ADS)

    Rochus, V.; Jansen, R.; Goyvaerts, J.; Neutens, P.; O’Callaghan, J.; Rottenberg, X.

    2018-06-01

    This paper presents a fast analytical procedure in order to design a micro-opto-mechanical pressure sensor (MOMPS) taking into account the mechanical nonlinearity and the optical losses. A realistic model of the photonic MZI is proposed, strongly coupled to a nonlinear mechanical model of the membrane. Based on the membrane dimensions, the residual stress, the position of the waveguide, the optical wavelength and the phase variation due to the opto-mechanical coupling, we derive an analytical model which allows us to predict the response of the total system. The effect of the nonlinearity and the losses on the total performance are carefully studied and measurements on fabricated devices are used to validate the model. Finally, a design procedure is proposed in order to realize fast design of this new type of pressure sensor.

  1. Irregular analytical errors in diagnostic testing - a novel concept.

    PubMed

    Vogeser, Michael; Seger, Christoph

    2018-02-23

    In laboratory medicine, routine periodic analyses for internal and external quality control measurements interpreted by statistical methods are mandatory for batch clearance. Data analysis of these process-oriented measurements allows for insight into random analytical variation and systematic calibration bias over time. However, in such a setting, any individual sample is not under individual quality control. The quality control measurements act only at the batch level. Quantitative or qualitative data derived for many effects and interferences associated with an individual diagnostic sample can compromise any analyte. It is obvious that a process for a quality-control-sample-based approach of quality assurance is not sensitive to such errors. To address the potential causes and nature of such analytical interference in individual samples more systematically, we suggest the introduction of a new term called the irregular (individual) analytical error. Practically, this term can be applied in any analytical assay that is traceable to a reference measurement system. For an individual sample an irregular analytical error is defined as an inaccuracy (which is the deviation from a reference measurement procedure result) of a test result that is so high it cannot be explained by measurement uncertainty of the utilized routine assay operating within the accepted limitations of the associated process quality control measurements. The deviation can be defined as the linear combination of the process measurement uncertainty and the method bias for the reference measurement system. Such errors should be coined irregular analytical errors of the individual sample. The measurement result is compromised either by an irregular effect associated with the individual composition (matrix) of the sample or an individual single sample associated processing error in the analytical process. Currently, the availability of reference measurement procedures is still highly limited, but LC
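
    A minimal sketch of the flagging criterion suggested above, comparing an individual sample's deviation from a reference measurement procedure result with an allowance built from the routine assay's measurement uncertainty and its known bias; the coverage factor and numbers are illustrative assumptions, not values from the paper.

    ```python
    # Illustrative decision rule for flagging an irregular (individual) analytical error.
    def is_irregular(routine_result, reference_result, u_routine, bias, k=3.0):
        """Flag deviations larger than k * u plus the absolute method bias (assumed criterion)."""
        deviation = abs(routine_result - reference_result)
        allowance = k * u_routine + abs(bias)
        return deviation > allowance

    # Example with invented numbers: routine result far from the reference value.
    print(is_irregular(routine_result=104.0, reference_result=88.0, u_routine=2.5, bias=1.0))
    ```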

  2. Determination of Total Solids and Ash in Algal Biomass: Laboratory Analytical Procedure (LAP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Wychen, Stefanie; Laurens, Lieve M. L.

    2016-01-13

    This procedure describes the methods used to determine the amount of moisture or total solids present in a freeze-dried algal biomass sample, as well as the ash content. A traditional convection oven drying procedure is covered for total solids content, and a dry oxidation method at 575 °C is covered for ash content.

  3. Selecting Statistical Procedures for Quality Control Planning Based on Risk Management.

    PubMed

    Yago, Martín; Alcover, Silvia

    2016-07-01

    According to the traditional approach to statistical QC planning, the performance of QC procedures is assessed in terms of its probability of rejecting an analytical run that contains critical size errors (PEDC). Recently, the maximum expected increase in the number of unacceptable patient results reported during the presence of an undetected out-of-control error condition [Max E(NUF)], has been proposed as an alternative QC performance measure because it is more related to the current introduction of risk management concepts for QC planning in the clinical laboratory. We used a statistical model to investigate the relationship between PEDC and Max E(NUF) for simple QC procedures widely used in clinical laboratories and to construct charts relating Max E(NUF) with the capability of the analytical process that allow for QC planning based on the risk of harm to a patient due to the report of erroneous results. A QC procedure shows nearly the same Max E(NUF) value when used for controlling analytical processes with the same capability, and there is a close relationship between PEDC and Max E(NUF) for simple QC procedures; therefore, the value of PEDC can be estimated from the value of Max E(NUF) and vice versa. QC procedures selected by their high PEDC value are also characterized by a low value for Max E(NUF). The PEDC value can be used for estimating the probability of patient harm, allowing for the selection of appropriate QC procedures in QC planning based on risk management. © 2016 American Association for Clinical Chemistry.
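
    A minimal Monte Carlo sketch of PEDC for a simple QC procedure (a 1_3s rule with two controls per run and an illustrative critical systematic error); this is a generic estimate of the probability of rejecting an out-of-control run, not the statistical model used in the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n_runs, n_controls = 100_000, 2
    critical_shift = 2.0          # systematic error in multiples of the analytical SD (illustrative)

    # Simulate control results in runs affected by the critical systematic error.
    controls = rng.normal(loc=critical_shift, scale=1.0, size=(n_runs, n_controls))

    # 1_3s rule: reject the run if any control falls outside +/- 3 SD of the target.
    rejected = (np.abs(controls) > 3.0).any(axis=1)
    print("estimated PEDC:", rejected.mean().round(3))
    ```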

  4. SRC-I demonstration plant analytical laboratory methods manual. Final technical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klusaritz, M.L.; Tewari, K.C.; Tiedge, W.F.

    1983-03-01

    This manual is a compilation of analytical procedures required for operation of a Solvent-Refined Coal (SRC-I) demonstration or commercial plant. Each method reproduced in full includes a detailed procedure, a list of equipment and reagents, safety precautions, and, where possible, a precision statement. Procedures for the laboratory's environmental and industrial hygiene modules are not included. Required American Society for Testing and Materials (ASTM) methods are cited, and ICRC's suggested modifications to these methods for handling coal-derived products are provided.

  5. PFOA and PFOS: Analytics | Science Inventory | US EPA

    EPA Pesticide Factsheets

    This presentation describes the drivers for development of Method 537, the extraction and analytical procedure, performance data, holding time data as well as detection limits. The purpose of this presentation is to provide an overview of EPA drinking water Method 537 to the U.S. EPA Drinking Water Workshop participants.

  6. The factor structure of the Alcohol Use Disorders Identification Test (AUDIT).

    PubMed

    Doyle, Suzanne R; Donovan, Dennis M; Kivlahan, Daniel R

    2007-05-01

    Past research assessing the factor structure of the Alcohol Use Disorders Identification Test (AUDIT) with various exploratory and confirmatory factor analytic techniques has identified one-, two-, and three-factor solutions. Because different factor analytic procedures may result in dissimilar findings, we examined the factor structure of the AUDIT using the same factor analytic technique on two new large clinical samples and on archival data from six samples studied in previous reports. Responses to the AUDIT were obtained from participants who met Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (DSM-IV), criteria for alcohol dependence in two large randomized clinical trials: the COMBINE (Combining Medications and Behavioral Interventions) Study (N = 1,337; 69% men) and Project MATCH (Matching Alcoholism Treatments to Client Heterogeneity; N = 1,711; 76% men). Supplementary analyses involved six correlation matrices of AUDIT data obtained from five previously published articles. Confirmatory factor analyses based on one-, two-, and three-factor models were conducted on the eight correlation matrices to assess the factor structure of the AUDIT. Across samples, analyses supported a correlated, two-factor solution representing alcohol consumption and alcohol-related consequences. The three-factor solution fit the data equally well, but two factors (alcohol dependence and harmful alcohol use) were highly correlated. The one-factor solution did not provide a good fit to the data. These findings support a two-factor solution for the AUDIT (alcohol consumption and alcohol-related consequences). The results contradict the original three-factor design of the AUDIT and the prevalent use of the AUDIT as a one-factor screening instrument with a single cutoff score.

  7. Learning the facts in medical school is not enough: which factors predict successful application of procedural knowledge in a laboratory setting?

    PubMed Central

    2013-01-01

    Background: Medical knowledge encompasses both conceptual (facts or “what” information) and procedural knowledge (“how” and “why” information). Conceptual knowledge is known to be an essential prerequisite for clinical problem solving. Primarily, medical students learn from textbooks and often struggle with the process of applying their conceptual knowledge to clinical problems. Recent studies address the question of how to foster the acquisition of procedural knowledge and its application in medical education. However, little is known about the factors which predict performance in procedural knowledge tasks. Which additional factors of the learner predict performance in procedural knowledge? Methods: Domain specific conceptual knowledge (facts) in clinical nephrology was provided to 80 medical students (3rd to 5th year) using electronic flashcards in a laboratory setting. Learner characteristics were obtained by questionnaires. Procedural knowledge in clinical nephrology was assessed by key feature problems (KFP) and problem solving tasks (PST) reflecting strategic and conditional knowledge, respectively. Results: Results in procedural knowledge tests (KFP and PST) correlated significantly with each other. In univariate analysis, performance in procedural knowledge (sum of KFP+PST) was significantly correlated with the results in (1) the conceptual knowledge test (CKT), (2) the intended future career as hospital based doctor, (3) the duration of clinical clerkships, and (4) the results in the written German National Medical Examination Part I on preclinical subjects (NME-I). After multiple regression analysis only clinical clerkship experience and NME-I performance remained independent influencing factors. Conclusions: Performance in procedural knowledge tests seems independent from the degree of domain specific conceptual knowledge above a certain level. Procedural knowledge may be fostered by clinical experience. More attention should be paid to the

  8. Learning the facts in medical school is not enough: which factors predict successful application of procedural knowledge in a laboratory setting?

    PubMed

    Schmidmaier, Ralf; Eiber, Stephan; Ebersbach, Rene; Schiller, Miriam; Hege, Inga; Holzer, Matthias; Fischer, Martin R

    2013-02-22

    Medical knowledge encompasses both conceptual (facts or "what" information) and procedural knowledge ("how" and "why" information). Conceptual knowledge is known to be an essential prerequisite for clinical problem solving. Primarily, medical students learn from textbooks and often struggle with the process of applying their conceptual knowledge to clinical problems. Recent studies address the question of how to foster the acquisition of procedural knowledge and its application in medical education. However, little is known about the factors which predict performance in procedural knowledge tasks. Which additional factors of the learner predict performance in procedural knowledge? Domain specific conceptual knowledge (facts) in clinical nephrology was provided to 80 medical students (3rd to 5th year) using electronic flashcards in a laboratory setting. Learner characteristics were obtained by questionnaires. Procedural knowledge in clinical nephrology was assessed by key feature problems (KFP) and problem solving tasks (PST) reflecting strategic and conditional knowledge, respectively. Results in procedural knowledge tests (KFP and PST) correlated significantly with each other. In univariate analysis, performance in procedural knowledge (sum of KFP+PST) was significantly correlated with the results in (1) the conceptual knowledge test (CKT), (2) the intended future career as hospital based doctor, (3) the duration of clinical clerkships, and (4) the results in the written German National Medical Examination Part I on preclinical subjects (NME-I). After multiple regression analysis only clinical clerkship experience and NME-I performance remained independent influencing factors. Performance in procedural knowledge tests seems independent from the degree of domain specific conceptual knowledge above a certain level. Procedural knowledge may be fostered by clinical experience. More attention should be paid to the interplay of individual clinical clerkship experiences

  9. Nonequilibrium chemistry boundary layer integral matrix procedure

    NASA Technical Reports Server (NTRS)

    Tong, H.; Buckingham, A. C.; Morse, H. L.

    1973-01-01

    The development of an analytic procedure for the calculation of nonequilibrium boundary layer flows over surfaces of arbitrary catalycities is described. An existing equilibrium boundary layer integral matrix code was extended to include nonequilibrium chemistry while retaining all of the general boundary condition features built into the original code. For particular application to the pitch-plane of shuttle type vehicles, an approximate procedure was developed to estimate the nonequilibrium and nonisentropic state at the edge of the boundary layer.

  10. Comparison of analytical and predictive methods for water, protein, fat, sugar, and gross energy in marine mammal milk.

    PubMed

    Oftedal, O T; Eisert, R; Barrell, G K

    2014-01-01

    Mammalian milks may differ greatly in composition from cow milk, and these differences may affect the performance of analytical methods. High-fat, high-protein milks with a preponderance of oligosaccharides, such as those produced by many marine mammals, present a particular challenge. We compared the performance of several methods against reference procedures using Weddell seal (Leptonychotes weddellii) milk of highly varied composition (by reference methods: 27-63% water, 24-62% fat, 8-12% crude protein, 0.5-1.8% sugar). A microdrying step preparatory to carbon-hydrogen-nitrogen (CHN) gas analysis slightly underestimated water content and had a higher repeatability relative standard deviation (RSDr) than did reference oven drying at 100°C. Compared with a reference macro-Kjeldahl protein procedure, the CHN (or Dumas) combustion method had a somewhat higher RSDr (1.56 vs. 0.60%) but correlation between methods was high (0.992), means were not different (CHN: 17.2±0.46% dry matter basis; Kjeldahl 17.3±0.49% dry matter basis), there were no significant proportional or constant errors, and predictive performance was high. A carbon stoichiometric procedure based on CHN analysis failed to adequately predict fat (reference: Röse-Gottlieb method) or total sugar (reference: phenol-sulfuric acid method). Gross energy content, calculated from energetic factors and results from reference methods for fat, protein, and total sugar, accurately predicted gross energy as measured by bomb calorimetry. We conclude that the CHN (Dumas) combustion method and calculation of gross energy are acceptable analytical approaches for marine mammal milk, but fat and sugar require separate analysis by appropriate analytic methods and cannot be adequately estimated by carbon stoichiometry. Some other alternative methods: low-temperature drying for water determination; Bradford, Lowry, and biuret methods for protein; the Folch and the Bligh and Dyer methods for fat; and enzymatic and reducing
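
    A minimal sketch of a gross energy calculation from proximate composition using energetic factors, as described above; the coefficients and the composition values below are illustrative assumptions, not the factors or data used in the study.

    ```python
    # Gross energy from fat, protein and sugar mass fractions using assumed energetic
    # factors (kcal per gram of each component); values are illustrative only.
    def gross_energy_kcal_per_g(fat, protein, sugar,
                                e_fat=9.11, e_protein=5.86, e_sugar=3.95):
        """Composition as mass fractions (g per g sample); returns kcal per g sample."""
        return fat * e_fat + protein * e_protein + sugar * e_sugar

    # Illustrative high-fat milk composition (wet basis).
    print(round(gross_energy_kcal_per_g(fat=0.45, protein=0.10, sugar=0.01), 2), "kcal/g")
    ```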

  11. Automated Predictive Big Data Analytics Using Ontology Based Semantics.

    PubMed

    Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A

    2015-10-01

    Predictive analytics in the big data era is taking on an ever increasingly important role. Issues related to choice on modeling technique, estimation procedure (or algorithm) and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models as well as the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology that supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics is used as a testbed for evaluating the use of semantic technology.

  12. Automated Predictive Big Data Analytics Using Ontology Based Semantics

    PubMed Central

    Nural, Mustafa V.; Cotterell, Michael E.; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A.

    2017-01-01

    Predictive analytics in the big data era is taking on an ever increasingly important role. Issues related to choice on modeling technique, estimation procedure (or algorithm) and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models as well as the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology that supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics is used as a testbed for evaluating the use of semantic technology. PMID:29657954

  13. Review of Pre-Analytical Errors in Oral Glucose Tolerance Testing in a Tertiary Care Hospital.

    PubMed

    Nanda, Rachita; Patel, Suprava; Sahoo, Sibashish; Mohapatra, Eli

    2018-03-13

    The pre-pre-analytical and pre-analytical phases account for a major share of the errors in a laboratory. The process considered here is a very common procedure, the oral glucose tolerance test, used to identify pre-pre-analytical errors. Quality indicators provide evidence of quality, support accountability and help in the decision making of laboratory personnel. The aim of this research is to evaluate the pre-analytical performance of the oral glucose tolerance test procedure. An observational study was conducted over a period of three months in the phlebotomy and accessioning unit of our laboratory, using a questionnaire that examined the pre-pre-analytical errors through a scoring system. The pre-analytical phase was analyzed for each sample collected as per seven quality indicators. About 25% of the population gave a wrong answer to the question that tested their knowledge of patient preparation. The appropriateness of the test result (QI-1) had the most errors. Although QI-5 for sample collection had a low error rate, it is a very important indicator as any wrongly collected sample can alter the test result. Evaluating the pre-analytical and pre-pre-analytical phases is essential and must be conducted routinely on a yearly basis to identify errors, take corrective action and facilitate their gradual introduction into routine practice.

  14. Analytical performance of a bronchial genomic classifier.

    PubMed

    Hu, Zhanzhi; Whitney, Duncan; Anderson, Jessica R; Cao, Manqiu; Ho, Christine; Choi, Yoonha; Huang, Jing; Frink, Robert; Smith, Kate Porta; Monroe, Robert; Kennedy, Giulia C; Walsh, P Sean

    2016-02-26

    The current standard practice of lung lesion diagnosis often leads to inconclusive results, requiring additional diagnostic follow up procedures that are invasive and often unnecessary due to the high benign rate in such lesions (Chest 143:e78S-e92, 2013). The Percepta bronchial genomic classifier was developed and clinically validated to provide more accurate classification of lung nodules and lesions that are inconclusive by bronchoscopy, using bronchial brushing specimens (N Engl J Med 373:243-51, 2015, BMC Med Genomics 8:18, 2015). The analytical performance of the Percepta test is reported here. Analytical performance studies were designed to characterize the stability of RNA in bronchial brushing specimens during collection and shipment; analytical sensitivity defined as input RNA mass; analytical specificity (i.e. potentially interfering substances) as tested on blood and genomic DNA; and assay performance studies including intra-run, inter-run, and inter-laboratory reproducibility. RNA content within bronchial brushing specimens preserved in RNAprotect is stable for up to 20 days at 4 °C with no changes in RNA yield or integrity. Analytical sensitivity studies demonstrated tolerance to variation in RNA input (157 ng to 243 ng). Analytical specificity studies utilizing cancer positive and cancer negative samples mixed with either blood (up to 10 % input mass) or genomic DNA (up to 10 % input mass) demonstrated no assay interference. The test is reproducible from RNA extraction through to Percepta test result, including variation across operators, runs, reagent lots, and laboratories (standard deviation of 0.26 for scores on > 6 unit scale). Analytical sensitivity, analytical specificity and robustness of the Percepta test were successfully verified, supporting its suitability for clinical use.

  15. Progress and development of analytical methods for gibberellins.

    PubMed

    Pan, Chaozhi; Tan, Swee Ngin; Yong, Jean Wan Hong; Ge, Liya

    2017-01-01

    Gibberellins, as a group of phytohormones, exhibit a wide variety of bio-functions within plant growth and development and have been used to increase crop yields. Many analytical procedures, therefore, have been developed for the determination of the types and levels of endogenous and exogenous gibberellins. As plant tissues contain gibberellins in trace amounts (usually at the level of nanograms per gram fresh weight or even lower), the sample pre-treatment steps (extraction, pre-concentration, and purification) for gibberellins are reviewed in detail. The primary focus of this comprehensive review is on the various analytical methods designed to meet the requirements for gibberellin analyses in complex matrices, with particular emphasis on high-throughput analytical methods, such as gas chromatography, liquid chromatography, and capillary electrophoresis, mostly combined with mass spectrometry. The advantages and drawbacks of each described analytical method are discussed. The overall aim of this review is to provide a comprehensive and critical view on the different analytical methods nowadays employed to analyze gibberellins in complex sample matrices and their foreseeable trends. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Integrating Water Quality and River Rehabilitation Management - A Decision-Analytical Perspective

    NASA Astrophysics Data System (ADS)

    Reichert, P.; Langhans, S.; Lienert, J.; Schuwirth, N.

    2009-04-01

    Integrative river management involves difficult decisions about alternative measures to improve the ecological state of rivers. For this reason, it seems useful to apply knowledge from the decision sciences to support river management. We discuss how decision-analytical elements can be employed for designing an integrated river management procedure. An important aspect of this procedure is to clearly separate scientific predictions of the consequences of alternatives from the objectives to be achieved by river management. The key elements of the suggested procedure are (i) the quantitative elicitation of the objectives from different stakeholder groups, (ii) the compilation of the current scientific knowledge about the consequences of the effects resulting from suggested measures in the form of a probabilistic mathematical model, and (iii) the use of these predictions and valuations to prioritize alternatives, to uncover conflicting objectives, to support the design of better alternatives, and to improve the transparency of communication about the chosen management strategy. The development of this procedure led to insights regarding the steps necessary for rational decision-making in river management, to guidelines about the use of decision-analytical techniques for performing these steps, and also to new insights about the application of decision-analytical techniques in general. In particular, the consideration of the spatial distribution of the effects of measures and the potential added value of connected rehabilitated river reaches leads to favoring measures that have a positive effect beyond a single river reach. As these effects only propagate within the river network, this results in a river basin oriented management concept as a consequence of a rational decision support procedure, rather than as an a priori management paradigm. There are also limitations to the support that can be expected from the decision-analytical perspective. It will not provide the

  17. Risk factors for hospital morbidity and mortality after the Norwood procedure: A report from the Pediatric Heart Network Single Ventricle Reconstruction trial.

    PubMed

    Tabbutt, Sarah; Ghanayem, Nancy; Ravishankar, Chitra; Sleeper, Lynn A; Cooper, David S; Frank, Deborah U; Lu, Minmin; Pizarro, Christian; Frommelt, Peter; Goldberg, Caren S; Graham, Eric M; Krawczeski, Catherine Dent; Lai, Wyman W; Lewis, Alan; Kirsh, Joel A; Mahony, Lynn; Ohye, Richard G; Simsic, Janet; Lodge, Andrew J; Spurrier, Ellen; Stylianou, Mario; Laussen, Peter

    2012-10-01

    We sought to identify risk factors for mortality and morbidity during the Norwood hospitalization in newborn infants with hypoplastic left heart syndrome and other single right ventricle anomalies enrolled in the Single Ventricle Reconstruction trial. Potential predictors of outcome included patient- and procedure-related variables, center volume, and surgeon volume. Outcome variables occurring during the Norwood procedure and before hospital discharge or the stage II procedure included mortality, end-organ complications, length of ventilation, and hospital length of stay. Univariate and multivariable Cox regression analyses were performed with bootstrapping to estimate reliability for mortality. The analysis included 549 subjects prospectively enrolled from 15 centers; 30-day and hospital mortality were 11.5% (63/549) and 16.0% (88/549), respectively. Independent risk factors for both 30-day and hospital mortality included lower birth weight, genetic abnormality, extracorporeal membrane oxygenation (ECMO), and open sternum on the day of the Norwood procedure. In addition, longer duration of deep hypothermic circulatory arrest was a risk factor for 30-day mortality. Shunt type at the end of the Norwood procedure was not a significant risk factor for 30-day or hospital mortality. Independent risk factors for postoperative renal failure (n = 46), sepsis (n = 93), increased length of ventilation, and hospital length of stay among survivors included genetic abnormality, lower center/surgeon volume, open sternum, and post-Norwood operations. Innate patient factors, ECMO, open sternum, and lower center/surgeon volume are important risk factors for postoperative mortality and/or morbidity during the Norwood hospitalization. Copyright © 2012 The American Association for Thoracic Surgery. Published by Mosby, Inc. All rights reserved.
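
    As a rough illustration of the modeling step described above (multivariable Cox regression with bootstrap resampling to gauge the reliability of candidate predictors), the following is a minimal sketch using the lifelines library. The column names (time_to_event, died) and the covariate handling are hypothetical; it shows the general technique only, not the trial's actual analysis.

    ```python
    # Minimal sketch: bootstrapped multivariable Cox regression.
    # Assumes a tidy DataFrame with one row per subject and hypothetical
    # columns 'time_to_event', 'died', plus candidate covariates.
    import numpy as np
    from lifelines import CoxPHFitter

    def bootstrap_cox(df, n_boot=200, seed=0):
        """Refit the Cox model on bootstrap resamples and report how often
        each covariate reaches p < 0.05 (a crude reliability measure)."""
        rng = np.random.default_rng(seed)
        covariates = [c for c in df.columns if c not in ("time_to_event", "died")]
        counts = {c: 0 for c in covariates}
        for _ in range(n_boot):
            resample = df.sample(len(df), replace=True,
                                 random_state=int(rng.integers(0, 2**31 - 1)))
            cph = CoxPHFitter()
            cph.fit(resample, duration_col="time_to_event", event_col="died")
            for c in covariates:
                if cph.summary.loc[c, "p"] < 0.05:
                    counts[c] += 1
        return {c: counts[c] / n_boot for c in covariates}
    ```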

  18. Factors Affecting Utilization of Information Output of Computer-Based Modeling Procedures in Local Government Organizations.

    ERIC Educational Resources Information Center

    Komsky, Susan

    Fiscal Impact Budgeting Systems (FIBS) are sophisticated computer based modeling procedures used in local government organizations, whose results, however, are often overlooked or ignored by decision makers. A study attempted to discover the reasons for this situation by focusing on four factors: potential usefulness, faith in computers,…

  19. Updating QR factorization procedure for solution of linear least squares problem with equality constraints.

    PubMed

    Zeb, Salman; Yousaf, Muhammad

    2017-01-01

    In this article, we present a QR updating procedure as a solution approach for linear least squares problem with equality constraints. We reduce the constrained problem to unconstrained linear least squares and partition it into a small subproblem. The QR factorization of the subproblem is calculated and then we apply updating techniques to its upper triangular factor R to obtain its solution. We carry out the error analysis of the proposed algorithm to show that it is backward stable. We also illustrate the implementation and accuracy of the proposed algorithm by providing some numerical experiments with particular emphasis on dense problems.
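
    To make the reduction step concrete, the sketch below solves an equality-constrained linear least squares problem by a QR-based null-space reduction to an unconstrained subproblem. This illustrates the general idea of reducing the constrained problem before applying least squares machinery; it is not the authors' updating algorithm, and the example data are invented.

    ```python
    # Sketch: null-space (QR-based) reduction of
    #   min ||A x - b||  subject to  B x = d.
    import numpy as np

    def lse_nullspace(A, b, B, d):
        p, n = B.shape
        Q, R = np.linalg.qr(B.T, mode="complete")   # B^T = Q [R1; 0]
        Q1, Q2 = Q[:, :p], Q[:, p:]
        R1 = R[:p, :]
        y1 = np.linalg.solve(R1.T, d)               # component that satisfies B x = d
        # Unconstrained least squares in the null space of B
        y2, *_ = np.linalg.lstsq(A @ Q2, b - A @ Q1 @ y1, rcond=None)
        return Q1 @ y1 + Q2 @ y2

    # Example: fit y ~ c0 + c1*t subject to the constraint c0 + c1 = 1
    t = np.linspace(0, 1, 20)
    A = np.column_stack([np.ones_like(t), t])
    b = 0.3 + 0.7 * t + 0.01 * np.random.default_rng(0).standard_normal(20)
    x = lse_nullspace(A, b, np.array([[1.0, 1.0]]), np.array([1.0]))
    ```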

  20. An Analysis of Machine- and Human-Analytics in Classification.

    PubMed

    Tam, Gary K L; Kothari, Vivek; Chen, Min

    2017-01-01

    In this work, we present a study that traces the technical and cognitive processes in two visual analytics applications to a common theoretic model of soft knowledge that may be added into a visual analytics process for constructing a decision-tree model. Both case studies involved the development of classification models based on the "bag of features" approach. Both compared a visual analytics approach using parallel coordinates with a machine-learning approach using information theory. Both found that the visual analytics approach had some advantages over the machine learning approach, especially when sparse datasets were used as the ground truth. We examine various possible factors that may have contributed to such advantages, and collect empirical evidence for supporting the observation and reasoning of these factors. We propose an information-theoretic model as a common theoretic basis to explain the phenomena exhibited in these two case studies. Together we provide interconnected empirical and theoretical evidence to support the usefulness of visual analytics.

  1. Nationwide Multicenter Reference Interval Study for 28 Common Biochemical Analytes in China.

    PubMed

    Xia, Liangyu; Chen, Ming; Liu, Min; Tao, Zhihua; Li, Shijun; Wang, Liang; Cheng, Xinqi; Qin, Xuzhen; Han, Jianhua; Li, Pengchang; Hou, Li'an; Yu, Songlin; Ichihara, Kiyoshi; Qiu, Ling

    2016-03-01

    A nationwide multicenter study was conducted in China to explore sources of variation of reference values and establish reference intervals for 28 common biochemical analytes, as a part of the International Federation of Clinical Chemistry and Laboratory Medicine, Committee on Reference Intervals and Decision Limits (IFCC/C-RIDL) global study on reference values. A total of 3148 apparently healthy volunteers were recruited in 6 cities covering a wide area in China. Blood samples were tested in 2 central laboratories using Beckman Coulter AU5800 chemistry analyzers. Certified reference materials and a value-assigned serum panel were used for standardization of test results. Multiple regression analysis was performed to explore sources of variation. The need for partition of reference intervals was evaluated based on 3-level nested ANOVA. After secondary exclusion using the latent abnormal values exclusion method, reference intervals were derived by a parametric method using the modified Box-Cox formula. Test results of 20 analytes were made traceable to reference measurement procedures. By the ANOVA, significant sex-related and age-related differences were observed in 12 and 12 analytes, respectively. A small regional difference was observed in the results for albumin, glucose, and sodium. Multiple regression analysis revealed BMI-related changes in the results of 9 analytes for men and 6 for women. Reference intervals of 28 analytes were computed, with 17 analytes partitioned by sex and/or age. In conclusion, reference intervals of 28 common chemistry analytes applicable to the Chinese Han population were established by use of the latest methodology. Reference intervals of 20 analytes traceable to reference measurement procedures can be used as common reference intervals, whereas others can be used as the assay system-specific reference intervals in China.
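
    For illustration, the sketch below derives a parametric reference interval by the standard Box-Cox route (transform toward normality, take mean ± 1.96 SD, back-transform). The modified Box-Cox formula and the LAVE secondary-exclusion step used in the study are not reproduced; this is only a minimal sketch with synthetic input.

    ```python
    # Sketch: parametric 95% reference interval via a Box-Cox transformation.
    import numpy as np
    from scipy import stats
    from scipy.special import inv_boxcox

    def parametric_reference_interval(values):
        """Central 95% reference interval on the back-transformed scale."""
        x = np.asarray(values, dtype=float)
        x = x[x > 0]                         # Box-Cox requires strictly positive data
        y, lam = stats.boxcox(x)             # maximum-likelihood lambda
        m, s = y.mean(), y.std(ddof=1)
        return inv_boxcox(m - 1.96 * s, lam), inv_boxcox(m + 1.96 * s, lam)

    # e.g. on synthetic, right-skewed "analyte" results:
    ri = parametric_reference_interval(
        np.random.default_rng(0).lognormal(1.0, 0.3, 500))
    ```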

  2. Nationwide Multicenter Reference Interval Study for 28 Common Biochemical Analytes in China

    PubMed Central

    Xia, Liangyu; Chen, Ming; Liu, Min; Tao, Zhihua; Li, Shijun; Wang, Liang; Cheng, Xinqi; Qin, Xuzhen; Han, Jianhua; Li, Pengchang; Hou, Li’an; Yu, Songlin; Ichihara, Kiyoshi; Qiu, Ling

    2016-01-01

    A nationwide multicenter study was conducted in China to explore sources of variation of reference values and establish reference intervals for 28 common biochemical analytes, as a part of the International Federation of Clinical Chemistry and Laboratory Medicine, Committee on Reference Intervals and Decision Limits (IFCC/C-RIDL) global study on reference values. A total of 3148 apparently healthy volunteers were recruited in 6 cities covering a wide area in China. Blood samples were tested in 2 central laboratories using Beckman Coulter AU5800 chemistry analyzers. Certified reference materials and a value-assigned serum panel were used for standardization of test results. Multiple regression analysis was performed to explore sources of variation. The need for partition of reference intervals was evaluated based on 3-level nested ANOVA. After secondary exclusion using the latent abnormal values exclusion method, reference intervals were derived by a parametric method using the modified Box–Cox formula. Test results of 20 analytes were made traceable to reference measurement procedures. By the ANOVA, significant sex-related and age-related differences were observed in 12 and 12 analytes, respectively. A small regional difference was observed in the results for albumin, glucose, and sodium. Multiple regression analysis revealed BMI-related changes in the results of 9 analytes for men and 6 for women. Reference intervals of 28 analytes were computed, with 17 analytes partitioned by sex and/or age. In conclusion, reference intervals of 28 common chemistry analytes applicable to the Chinese Han population were established by use of the latest methodology. Reference intervals of 20 analytes traceable to reference measurement procedures can be used as common reference intervals, whereas others can be used as the assay system-specific reference intervals in China. PMID:26945390

  3. 42 CFR 493.1256 - Standard: Control procedures.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... for having control procedures that monitor the accuracy and precision of the complete analytic process..., include two control materials, including one that is capable of detecting errors in the extraction process... control materials having previously determined statistical parameters. (e) For reagent, media, and supply...

  4. Determining organisation-specific factors for developing health interventions in companies by a Delphi procedure: Organisational Mapping.

    PubMed

    van Scheppingen, Arjella R; ten Have, Kristin C J M; Zwetsloot, Gerard J I M; Kok, Gerjo; van Mechelen, Willem

    2015-12-01

    Companies, seen as social communities, are major health promotion contexts. However, health promotion in the work setting is often less successful than intended. An optimal adjustment to the organisational context is required. Knowledge of which organisation-specific factors are relevant to health promotion is scarce. A Delphi procedure is used to identify these factors. The aim is to contribute to more effective workplace health promotion. The identified factors are described and embedded into a practical methodology (Intervention Mapping). A systematic use of these factors (called 'Organisational Mapping') is likely to contribute to more effective health promotion in the work setting. © The Author(s) 2014.

  5. Interconnections between various analytic approaches applicable to third-order nonlinear differential equations

    PubMed Central

    Mohanasubha, R.; Chandrasekar, V. K.; Senthilvelan, M.; Lakshmanan, M.

    2015-01-01

    We unearth the interconnection between various analytical methods which are widely used in the current literature to identify integrable nonlinear dynamical systems described by third-order nonlinear ODEs. We establish an important interconnection between the extended Prelle–Singer procedure and λ-symmetries approach applicable to third-order ODEs to bring out the various linkages associated with these different techniques. By establishing this interconnection we demonstrate that given any one of the quantities as a starting point in the family consisting of Jacobi last multipliers, Darboux polynomials, Lie point symmetries, adjoint-symmetries, λ-symmetries, integrating factors and null forms one can derive the rest of the quantities in this family in a straightforward and unambiguous manner. We also illustrate our findings with three specific examples. PMID:27547076

  6. Interconnections between various analytic approaches applicable to third-order nonlinear differential equations.

    PubMed

    Mohanasubha, R; Chandrasekar, V K; Senthilvelan, M; Lakshmanan, M

    2015-04-08

    We unearth the interconnection between various analytical methods which are widely used in the current literature to identify integrable nonlinear dynamical systems described by third-order nonlinear ODEs. We establish an important interconnection between the extended Prelle-Singer procedure and λ-symmetries approach applicable to third-order ODEs to bring out the various linkages associated with these different techniques. By establishing this interconnection we demonstrate that given any one of the quantities as a starting point in the family consisting of Jacobi last multipliers, Darboux polynomials, Lie point symmetries, adjoint-symmetries, λ-symmetries, integrating factors and null forms one can derive the rest of the quantities in this family in a straightforward and unambiguous manner. We also illustrate our findings with three specific examples.

  7. Relative frequencies of constrained events in stochastic processes: An analytical approach.

    PubMed

    Rusconi, S; Akhmatskaya, E; Sokolovski, D; Ballard, N; de la Cal, J C

    2015-10-01

    The stochastic simulation algorithm (SSA) and the corresponding Monte Carlo (MC) method are among the most common approaches for studying stochastic processes. They rely on knowledge of interevent probability density functions (PDFs) and on information about dependencies between all possible events. In many real-life applications, analytical representations of a PDF are difficult to specify in advance. If the shapes of the PDFs are known, different optimization schemes can be applied to experimental data in order to evaluate the probability density functions and, therefore, the properties of the studied system. Such methods, however, are computationally demanding and often not feasible. We show that, in the case where experimentally accessed properties are directly related to the frequencies of the events involved, it may be possible to replace the heavy Monte Carlo core of optimization schemes with an analytical solution. Such a replacement not only provides a more accurate estimation of the properties of the process, but also reduces the simulation time by a factor of the order of the sample size (at least ≈10^4). The proposed analytical approach is valid for any choice of PDF. The accuracy, computational efficiency, and advantages of the method over MC procedures are demonstrated in the exactly solvable case and in the evaluation of branching fractions in controlled radical polymerization (CRP) of acrylic monomers. This polymerization can be modeled by a constrained stochastic process. Constrained systems are quite common, and this makes the method useful for various applications.

  8. Analytical, anthropometric and dietary factors associated with the development of fibrosis in patients with nonalcoholic fatty liver disease.

    PubMed

    Gómez de la Cuesta, Sara; Aller de la Fuente, Rocío; Tafur Sánchez, Carla; Izaola, Olatz; García Sánchez, Concepción; Mora, Natalia; González Hernández, Jose Manuel; de Luis Román, Daniel

    2018-05-01

    A prolonged non-alcoholic steatohepatitis (NASH) condition can lead to advanced stages of liver disease and the development of hepatocellular carcinoma. The aim was to evaluate analytical, anthropometric and dietary factors associated with the presence of fibrosis, as this is the factor that most influences survival and evolution. Seventy-six patients with liver biopsy-diagnosed non-alcoholic fatty liver disease (NAFLD) were included. Biopsies were scored considering the NASH criteria of Kleiner. Analytical, anthropometric and dietary (survey) parameters were obtained. NAFLD-FS, a non-invasive fibrosis index, was assessed for each patient. Leptin, adiponectin, resistin and TNF-alpha serum levels were determined. Fifty-six patients were male (73.7%) and the mean age was 44.5 ± 11.3 years (range 19-68). Thirty-nine patients (51.3%) (F1-F2: 84.6%; F3-4: 15.4%) had fibrosis in the liver biopsy. Seventeen females (85%) had fibrosis versus 22 males (39%), which was statistically significant by univariate analysis (p < 0.01). Patients with advanced fibrosis were older, with lower platelet counts, lower serum albumin, greater homeostatic model assessment insulin resistance (HOMA-IR), a lower dietary lipid percentage, higher serum leptin levels and higher NAFLD Fibrosis Score (NAFLD-FS) values. This index had a negative predictive value of 98% and a positive predictive value of 60% for the detection of fibrosis. Variables independently associated with fibrosis (logistic regression) included male gender (protective factor) (0.09, 95% CI 0.01-0.7; p < 0.05) and HOMA-IR (1.7, 95% CI 1.03-2.79; p < 0.05). Gender and HOMA-IR were the only independent factors associated with fibrosis. NAFLD-FS could be considered as an accurate scoring system to rule out advanced fibrosis.

  9. An analytical hierarchy process-based study on the factors affecting legislation on plastic bags in the USA.

    PubMed

    Li, Zhongguo; Zhao, Fu

    2017-08-01

    Annually, a large number of used plastic shopping bags are released into the environment, posing significant threats to public health and wildlife. Owing to these concerns, many local, regional, and national governments around the world have passed legislation to ban or restrict the use of plastic shopping bags. However, in the USA only 18 states have approved plastic bag bans or fees, and even within these states the regulations do not cover all cities or counties. Many factors could affect the development and implementation of these regulations. This article employs an analytical hierarchy process to analyse the factors that could impact the enactment of plastic bag regulations. Five impact factors are identified based on statistical data: geographical location, interest of industry, cost of living, level of economic development, and educational level of the population. The weights of the five impact factors are determined, and it is found that the possibility of banning or restricting plastic bags generally follows a certain pattern among all states.
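
    As a minimal illustration of the analytical hierarchy process used above, the sketch below derives factor weights from a pairwise comparison matrix via its principal eigenvector and checks Saaty's consistency ratio. The comparison matrix is invented for illustration and does not reproduce the study's judgments.

    ```python
    # Sketch: AHP priority weights from a pairwise comparison matrix.
    import numpy as np

    RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}   # Saaty's random indices

    def ahp_weights(pairwise):
        A = np.asarray(pairwise, dtype=float)
        n = A.shape[0]
        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w = w / w.sum()                        # normalized priority weights
        lam_max = eigvals[k].real
        ci = (lam_max - n) / (n - 1)           # consistency index
        cr = ci / RI[n]                        # consistency ratio (should be < 0.10)
        return w, cr

    # Five illustrative factors compared pairwise on Saaty's 1-9 scale
    A = np.array([[1,   3,   5,   4,   2  ],
                  [1/3, 1,   3,   2,   1/2],
                  [1/5, 1/3, 1,   1/2, 1/4],
                  [1/4, 1/2, 2,   1,   1/3],
                  [1/2, 2,   4,   3,   1  ]])
    weights, cr = ahp_weights(A)
    ```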

  10. Factors Influencing Beliefs for Adoption of a Learning Analytics Tool: An Empirical Study

    ERIC Educational Resources Information Center

    Ali, Liaqat; Asadi, Mohsen; Gasevic, Dragan; Jovanovic, Jelena; Hatala, Marek

    2013-01-01

    Present research and development offer various learning analytics tools providing insights into different aspects of learning processes. Adoption of a specific tool for practice is based on how its learning analytics are perceived by educators to support their pedagogical and organizational goals. In this paper, we propose and empirically validate…

  11. Piezoresistive Cantilever Performance—Part I: Analytical Model for Sensitivity

    PubMed Central

    Park, Sung-Jin; Doll, Joseph C.; Pruitt, Beth L.

    2010-01-01

    An accurate analytical model for the change in resistance of a piezoresistor is necessary for the design of silicon piezoresistive transducers. Ion implantation requires a high-temperature oxidation or annealing process to activate the dopant atoms, and this treatment results in a distorted dopant profile due to diffusion. Existing analytical models do not account for the concentration dependence of piezoresistance and are not accurate for nonuniform dopant profiles. We extend previous analytical work by introducing two nondimensional factors, namely, the efficiency and geometry factors. A practical benefit of this efficiency factor is that it separates the process parameters from the design parameters; thus, designers may address requirements for cantilever geometry and fabrication process independently. To facilitate the design process, we provide a lookup table for the efficiency factor over an extensive range of process conditions. The model was validated by comparing simulation results with the experimentally determined sensitivities of piezoresistive cantilevers. We performed 9200 TSUPREM4 simulations and fabricated 50 devices from six unique process flows; we systematically explored the design space relating process parameters and cantilever sensitivity. Our treatment focuses on piezoresistive cantilevers, but the analytical sensitivity model is extensible to other piezoresistive transducers such as membrane pressure sensors. PMID:20336183

  12. Piezoresistive Cantilever Performance-Part I: Analytical Model for Sensitivity.

    PubMed

    Park, Sung-Jin; Doll, Joseph C; Pruitt, Beth L

    2010-02-01

    An accurate analytical model for the change in resistance of a piezoresistor is necessary for the design of silicon piezoresistive transducers. Ion implantation requires a high-temperature oxidation or annealing process to activate the dopant atoms, and this treatment results in a distorted dopant profile due to diffusion. Existing analytical models do not account for the concentration dependence of piezoresistance and are not accurate for nonuniform dopant profiles. We extend previous analytical work by introducing two nondimensional factors, namely, the efficiency and geometry factors. A practical benefit of this efficiency factor is that it separates the process parameters from the design parameters; thus, designers may address requirements for cantilever geometry and fabrication process independently. To facilitate the design process, we provide a lookup table for the efficiency factor over an extensive range of process conditions. The model was validated by comparing simulation results with the experimentally determined sensitivities of piezoresistive cantilevers. We performed 9200 TSUPREM4 simulations and fabricated 50 devices from six unique process flows; we systematically explored the design space relating process parameters and cantilever sensitivity. Our treatment focuses on piezoresistive cantilevers, but the analytical sensitivity model is extensible to other piezoresistive transducers such as membrane pressure sensors.

  13. Multielemental analysis of 18 essential and toxic elements in amniotic fluid samples by ICP-MS: Full procedure validation and estimation of measurement uncertainty.

    PubMed

    Markiewicz, B; Sajnóg, A; Lorenc, W; Hanć, A; Komorowicz, I; Suliburska, J; Kocyłowski, R; Barałkiewicz, D

    2017-11-01

    Amniotic fluid is a substantial factor in the development of the embryo and fetus, given that water and solutes contained in it penetrate the fetal membranes hydrostatically and osmotically and are also swallowed by the fetus. The elemental composition of amniotic fluid influences the growth and health of the fetus; therefore, analysis of amniotic fluid is important because the results would indicate abnormal levels of minerals or toxic elements. Inductively coupled plasma mass spectrometry (ICP-MS) is often used for the determination of trace- and ultra-trace-level elements in a wide range of matrices, including biological samples, because of its unique analytical capabilities. In the case of trace- and ultra-trace-level analysis, detailed characteristics of the analytical procedure as well as the properties of the analytical result are particularly important. The purpose of this study was to develop a new analytical procedure for multielemental analysis of 18 elements (Al, As, Ba, Ca, Cd, Co, Cr, Cu, Mg, Mn, Ni, Pb, Sb, Se, Sr, U, V and Zn) in amniotic fluid samples using ICP-MS. A dynamic reaction cell (DRC) with two reaction gases, ammonia and oxygen, was used in the experiment to eliminate spectral interferences. Detailed validation was conducted using 3 certified reference materials (CRMs) and real amniotic fluid samples collected from patients. Repeatability for all analyzed analytes was found to range from 0.70% to 8.0%, and intermediate precision results varied from 1.3% to 15%. Trueness expressed as recovery ranged from 80% to 125%. Traceability was assured through the analyses of CRMs. Uncertainty of the results was also evaluated using a single-laboratory validation approach. The obtained expanded uncertainty (U) results for CRMs, expressed as a percentage of the concentration of an analyte, were found to be between 8.3% for V and 45% for Cd. Standard uncertainty of the precision was found to have a greater influence on the combined standard uncertainty

  14. Improvement of analytical dynamic models using modal test data

    NASA Technical Reports Server (NTRS)

    Berman, A.; Wei, F. S.; Rao, K. V.

    1980-01-01

    A method developed to determine the minimum changes in analytical mass and stiffness matrices required to make them consistent with a set of measured normal modes and natural frequencies is presented. The corrected model will be an improved base for studies of physical changes, boundary condition changes, and for prediction of forced responses. The method features efficient procedures that do not require solution of the eigenvalue problem, and the ability to handle analytical models with more degrees of freedom than the test data. In addition, modal displacements are obtained for all analytical degrees of freedom, and the frequency dependence of the coordinate transformations is properly treated.
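
    For orientation, the sketch below shows one well-known minimum-change correction of this kind: a mass-matrix update that enforces the orthogonality condition Phi^T M Phi = I for a set of measured, approximately mass-normalized modes. It is offered as an assumed illustration of the family of updates the abstract refers to, not as the paper's exact algorithm, which also corrects the stiffness matrix and treats incomplete measurements.

    ```python
    # Sketch: minimum-change mass-matrix correction from measured modes.
    import numpy as np

    def correct_mass_matrix(Ma, Phi):
        """Ma: analytical mass matrix (n x n); Phi: measured modes (n x m).
        Returns an updated mass matrix satisfying Phi^T M Phi = I."""
        ma = Phi.T @ Ma @ Phi                       # analytical modal mass matrix
        ma_inv = np.linalg.inv(ma)
        dM = Ma @ Phi @ ma_inv @ (np.eye(ma.shape[0]) - ma) @ ma_inv @ Phi.T @ Ma
        return Ma + dM
    ```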

  15. Penetrating the Fog: Analytics in Learning and Education

    ERIC Educational Resources Information Center

    Siemens, George; Long, Phil

    2011-01-01

    Attempts to imagine the future of education often emphasize new technologies--ubiquitous computing devices, flexible classroom designs, and innovative visual displays. But the most dramatic factor shaping the future of higher education is something that people cannot actually touch or see: "big data and analytics." Learning analytics is still in…

  16. The analytical calibration in (bio)imaging/mapping of the metallic elements in biological samples--definitions, nomenclature and strategies: state of the art.

    PubMed

    Jurowski, Kamil; Buszewski, Bogusław; Piekoszewski, Wojciech

    2015-01-01

    Nowadays, studies related to the distribution of metallic elements in biological samples are among the most important issues. There are many articles dedicated to specific analytical atomic spectrometry techniques used for mapping/(bio)imaging the metallic elements in various kinds of biological samples. However, this literature lacks articles dedicated to reviewing calibration strategies and their problems, nomenclature, definitions, and the ways and methods used to obtain quantitative distribution maps. The aim of this article was to characterize the analytical calibration in the (bio)imaging/mapping of the metallic elements in biological samples, including (1) nomenclature; (2) definitions; and (3) selected, sophisticated examples of calibration strategies with analytical calibration procedures applied in the different analytical methods currently used to study an element's distribution in biological samples/materials, such as LA ICP-MS, SIMS, EDS, XRF and others. The main emphasis was placed on the procedures and methodology of the analytical calibration strategy. Additionally, the aim of this work is to systematize the nomenclature for the calibration terms: analytical calibration, analytical calibration method, analytical calibration procedure and analytical calibration strategy. The authors also want to popularize a division of calibration methods different from that hitherto used. This article is the first work in the literature that refers to and emphasizes the many different and complex aspects of analytical calibration problems in studies related to (bio)imaging/mapping metallic elements in different kinds of biological samples. Copyright © 2014 Elsevier B.V. All rights reserved.

  17. Safety and Procedural Success of Left Atrial Appendage Exclusion With the Lariat Device: A Systematic Review of Published Reports and Analytic Review of the FDA MAUDE Database.

    PubMed

    Chatterjee, Saurav; Herrmann, Howard C; Wilensky, Robert L; Hirshfeld, John; McCormick, Daniel; Frankel, David S; Yeh, Robert W; Armstrong, Ehrin J; Kumbhani, Dharam J; Giri, Jay

    2015-07-01

    The Lariat device has received US Food and Drug Administration (FDA) 510(k) clearance for soft-tissue approximation and is being widely used off-label for left atrial appendage (LAA) exclusion. A comprehensive analysis of safety and effectiveness has not been reported. To perform a systematic review of published literature to assess safety and procedural success, defined as successful closure of the LAA during the index procedure, of the Lariat device. We performed a formal analytic review of the FDA MAUDE (Manufacturer and User Facility Device Experience) database to compile adverse event reports from real-world practice with the Lariat. For the systematic review, PubMed, EMBASE, CINAHL, and the Cochrane Library were searched from January 2007 through August 2014 to identify all studies reporting use of the Lariat device in 3 or more patients. The FDA MAUDE database was queried for adverse events reports related to Lariat use. Data were abstracted in duplicate by 2 physician reviewers. Events from published literature were pooled using a generic inverse variance weighting with a random effects model. Cumulative and individual adverse events were also reported using the FDA MAUDE data set. Procedural adverse events and procedural success. In the systematic review, 5 reports of Lariat device use in 309 participants were identified. Specific complications weighted for inverse of variance of individual studies were urgent need for cardiac surgery (2.3%; 7 of 309 procedures) and death (0.3%; 1 of 309 procedures). Procedural success was 90.3% (279 of 309 procedures). In the FDA MAUDE database, there were 35 unique reports of adverse events with use of the Lariat device. Among these, we identified 5 adverse event reports that noted pericardial effusion and death and an additional 23 reported urgent cardiac surgery without mention of death. This review of published reports and case reports identified risks of adverse events with off-label use of the Lariat device for LAA
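
    As context for the pooling step mentioned above (generic inverse-variance weighting with a random-effects model), the following is a minimal sketch of a standard DerSimonian-Laird pooled estimate computed from per-study effect estimates and their variances. The inputs are placeholders; this illustrates the weighting scheme generically and is not a re-analysis of the cited studies.

    ```python
    # Sketch: DerSimonian-Laird random-effects inverse-variance pooling.
    import numpy as np

    def dersimonian_laird(y, v):
        """y: per-study effect estimates; v: their within-study variances."""
        y, v = np.asarray(y, float), np.asarray(v, float)
        w = 1.0 / v                                   # fixed-effect weights
        y_fe = np.sum(w * y) / np.sum(w)
        q = np.sum(w * (y - y_fe) ** 2)               # Cochran's Q
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (q - (len(y) - 1)) / c)       # between-study variance
        w_re = 1.0 / (v + tau2)                       # random-effects weights
        pooled = np.sum(w_re * y) / np.sum(w_re)
        se = np.sqrt(1.0 / np.sum(w_re))
        return pooled, se

    # Placeholder inputs: five hypothetical study proportions and variances
    pooled, se = dersimonian_laird([0.02, 0.03, 0.01, 0.04, 0.02],
                                   [4e-4, 6e-4, 3e-4, 8e-4, 5e-4])
    ```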

  18. Exploring the Efficacy of Behavioral Skills Training to Teach Basic Behavior Analytic Techniques to Oral Care Providers

    ERIC Educational Resources Information Center

    Graudins, Maija M.; Rehfeldt, Ruth Anne; DeMattei, Ronda; Baker, Jonathan C.; Scaglia, Fiorella

    2012-01-01

    Performing oral care procedures with children with autism who exhibit noncompliance can be challenging for oral care professionals. Previous research has elucidated a number of effective behavior analytic procedures for increasing compliance, but some procedures are likely to be too time consuming and expensive for community-based oral care…

  19. Risk factors for early cytologic abnormalities after loop electrosurgical excision procedure.

    PubMed

    Dietrich, Charles S; Yancey, Michael K; Miyazawa, Kunio; Williams, David L; Farley, John

    2002-02-01

    To evaluate risk factors for early cytologic abnormalities and recurrent cervical dysplasia after the loop electrosurgical excision procedure (LEEP), a retrospective analysis was performed of all pathology records for LEEPs performed at our institution from January 1996 through July 1998. Follow-up cytology from 2 through 12 months after LEEP was reviewed. Patients with abnormal cytology were referred for further colposcopic evaluation. Statistical analyses using the chi-square test for trend, the proportional hazards model, Fisher exact tests, and life table analysis were performed to identify risk factors for early cytologic abnormalities after LEEP and to determine the relative risk of recurrent dysplasia. A total of 298 women underwent LEEP during the study period, and 29% of these had cytologic abnormalities after LEEP. Grade of dysplasia, ectocervical marginal status, endocervical marginal status, and glandular involvement with dysplasia were not found to be independent risk factors for early cytologic abnormalities. However, when risk factors were analyzed cumulatively, the abnormal cytology rate increased from 24% with no risk factors to 67% with three risk factors present (P =.037). Of patients with abnormal cytology after LEEP, 40% developed subsequent dysplasia, and the mean time to diagnosis was approximately 6 months. The relative risk of subsequent dysplasia ranged from a 20% increase to twice the risk if post-LEEP cytology was a low-grade or high-grade squamous intraepithelial lesion, respectively. Based on these results, consideration should be given to early colposcopic examination of patients who have evidence of marginal involvement or endocervical glandular involvement with dysplasia. These patients are at increased risk for abnormal cytology and recurrent dysplasia. This initial visit should occur at 6 months, as the mean time to recurrence of dysplasia was 6.5 months.

  20. A new procedure for investigating three-dimensional stress fields in a thin plate with a through-the-thickness crack

    NASA Astrophysics Data System (ADS)

    Yi, Dake; Wang, TzuChiang

    2018-06-01

    In the paper, a new procedure is proposed to investigate three-dimensional fracture problems of a thin elastic plate with a long through-the-thickness crack under remote uniform tensile loading. The new procedure includes a new analytical method and highly accurate finite element simulations. In the theoretical analysis, three-dimensional Maxwell stress functions are employed in order to derive the three-dimensional crack tip fields. Based on the theoretical analysis, an equation is first derived that describes the relationship among the three-dimensional J-integral J(z), the stress intensity factor K(z) and the tri-axial stress constraint level Tz(z). In the finite element simulations, a fine mesh comprising 153360 elements is constructed to compute the stress field near the crack front, J(z) and Tz(z). Numerical results show that in the plane very close to the free surface, the K field solution is still valid for in-plane stresses. Comparison with the numerical results shows that the analytical results are valid.

  1. ASVCP quality assurance guidelines: control of preanalytical, analytical, and postanalytical factors for urinalysis, cytology, and clinical chemistry in veterinary laboratories.

    PubMed

    Gunn-Christie, Rebekah G; Flatland, Bente; Friedrichs, Kristen R; Szladovits, Balazs; Harr, Kendal E; Ruotsalo, Kristiina; Knoll, Joyce S; Wamsley, Heather L; Freeman, Kathy P

    2012-03-01

    In December 2009, the American Society for Veterinary Clinical Pathology (ASVCP) Quality Assurance and Laboratory Standards committee published the updated and peer-reviewed ASVCP Quality Assurance Guidelines on the Society's website. These guidelines are intended for use by veterinary diagnostic laboratories and veterinary research laboratories that are not covered by the US Food and Drug Administration Good Laboratory Practice standards (Code of Federal Regulations Title 21, Chapter 58). The guidelines have been divided into 3 reports: (1) general analytical factors for veterinary laboratory performance and comparisons; (2) hematology, hemostasis, and crossmatching; and (3) clinical chemistry, cytology, and urinalysis. This particular report is one of 3 reports and documents recommendations for control of preanalytical, analytical, and postanalytical factors related to urinalysis, cytology, and clinical chemistry in veterinary laboratories and is adapted from sections 1.1 and 2.2 (clinical chemistry), 1.3 and 2.5 (urinalysis), 1.4 and 2.6 (cytology), and 3 (postanalytical factors important in veterinary clinical pathology) of these guidelines. These guidelines are not intended to be all-inclusive; rather, they provide minimal guidelines for quality assurance and quality control for veterinary laboratory testing and a basis for laboratories to assess their current practices, determine areas for improvement, and guide continuing professional development and education efforts. © 2012 American Society for Veterinary Clinical Pathology.

  2. Analytical control test plan and microbiological methods for the water recovery test

    NASA Technical Reports Server (NTRS)

    Traweek, M. S. (Editor); Tatara, J. D. (Editor)

    1994-01-01

    Qualitative and quantitative laboratory results are important to the decision-making process. In some cases, they may represent the only basis for deciding between two or more given options or processes. Therefore, it is essential that handling of laboratory samples and analytical operations employed are performed at a deliberate level of conscientious effort. Reporting erroneous results can lead to faulty interpretations and result in misinformed decisions. This document provides analytical control specifications which will govern future test procedures related to all Water Recovery Test (WRT) Phase 3 activities to be conducted at the National Aeronautics and Space Administration/Marshall Space Flight Center (NASA/MSFC). This document addresses the process which will be used to verify analytical data generated throughout the test period, and to identify responsibilities of key personnel and participating laboratories, the chains of communication to be followed, and ensure that approved methodology and procedures are used during WRT activities. This document does not outline specifics, but provides a minimum guideline by which sampling protocols, analysis methodologies, test site operations, and laboratory operations should be developed.

  3. Fluence correction factors for graphite calorimetry in a low-energy clinical proton beam: I. Analytical and Monte Carlo simulations.

    PubMed

    Palmans, H; Al-Sulaiti, L; Andreo, P; Shipley, D; Lühr, A; Bassler, N; Martinkovič, J; Dobrovodský, J; Rossomme, S; Thomas, R A S; Kacperek, A

    2013-05-21

    The conversion of absorbed dose-to-graphite in a graphite phantom to absorbed dose-to-water in a water phantom is performed by water to graphite stopping power ratios. If, however, the charged particle fluence is not equal at equivalent depths in graphite and water, a fluence correction factor, k_fl, is required as well. This is particularly relevant to the derivation of absorbed dose-to-water, the quantity of interest in radiotherapy, from a measurement of absorbed dose-to-graphite obtained with a graphite calorimeter. In this work, fluence correction factors for the conversion from dose-to-graphite in a graphite phantom to dose-to-water in a water phantom for 60 MeV mono-energetic protons were calculated using an analytical model and five different Monte Carlo codes (Geant4, FLUKA, MCNPX, SHIELD-HIT and McPTRAN.MEDIA). In general the fluence correction factors are found to be close to unity and the analytical and Monte Carlo codes give consistent values when considering the differences in secondary particle transport. When considering only protons the fluence correction factors are unity at the surface and increase with depth by 0.5% to 1.5% depending on the code. When the fluence of all charged particles is considered, the fluence correction factor is about 0.5% lower than unity at shallow depths, predominantly due to the contributions from alpha particles, and increases to values above unity near the Bragg peak. Fluence correction factors directly derived from the fluence distributions differential in energy at equivalent depths in water and graphite can be described by k_fl = 0.9964 + 0.0024·z_w-eq with a relative standard uncertainty of 0.2%. Fluence correction factors derived from a ratio of calculated doses at equivalent depths in water and graphite can be described by k_fl = 0.9947 + 0.0024·z_w-eq with a relative standard uncertainty of 0.3%. These results are of direct relevance to graphite calorimetry in low-energy protons but given that the fluence
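
    The two fitted expressions above are simple linear functions of the water-equivalent depth z_w-eq. As a small worked example (with the depth argument assumed to be in the same unit as the published fits), they can be evaluated as follows.

    ```python
    # Sketch: evaluating the quoted linear fits for the fluence correction factor.
    def k_fl_from_fluence(z_w_eq):
        return 0.9964 + 0.0024 * z_w_eq      # fit from fluence distributions

    def k_fl_from_dose_ratio(z_w_eq):
        return 0.9947 + 0.0024 * z_w_eq      # fit from calculated dose ratios

    # e.g. at z_w-eq = 2: about 1.0012 (fluence-based) and 0.9995 (dose-ratio-based)
    ```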

  4. Factors Associated with Increased Rates of Post-procedural Stroke or Death following Carotid Artery Stent Placement: A Systematic Review.

    PubMed Central

    Khan, Muhib; Qureshi, Adnan I

    2014-01-01

    Background and Purpose We provide an assessment of clinical, angiographic, and procedure-related risk factors associated with stroke and/or death in patients undergoing carotid artery stent placement, which will assist in patient stratification and identification of high stent-risk patients. Methods A comprehensive search of Medline from January 1st 1996 to December 31st 2011 was performed using the key words "carotid artery stenosis", "carotid artery stenting", "carotid artery stent placement", "death", "mortality", "stroke", "outcome", "clinical predictors", and "angiographic predictors" in various combinations. We independently abstracted data and assessed the quality of the studies. This analysis led to the selection of 71 articles for review. Results Clinical factors including age ≥80 years, symptomatic status, procedure within 2 weeks of symptoms, chronic renal failure, diabetes mellitus, and hemispheric TIA were associated with stroke (ischemic or hemorrhagic) and death within 1 month after carotid artery stent placement. Angiographic factors including left carotid artery intervention, stenosis >90%, ulcerated and calcified plaques, lesion length >10 mm, thrombus at the site, ostial involvement, predilatation without EPD, ICA-CCA angulation >60%, aortic arch type III, and aortic arch calcification were also associated with 1-month stroke and/or death. Intra-procedural platelet GP IIb/IIIa inhibitors, protamine use, multiple stents, and predilatation prior to stent placement were associated with stroke (ischemic or hemorrhagic) and death after carotid artery stent placement. Intraprocedural use of embolic protection devices and stent design (open versus closed cell design) did not demonstrate a consistent relationship with 1-month stroke and/or death. Procedural statin use, and operator and center experience of more than 50 procedures per year, were protective for 1-month stroke and/or death. Conclusions Our review

  5. Fitting Meta-Analytic Structural Equation Models with Complex Datasets

    ERIC Educational Resources Information Center

    Wilson, Sandra Jo; Polanin, Joshua R.; Lipsey, Mark W.

    2016-01-01

    A modification of the first stage of the standard procedure for two-stage meta-analytic structural equation modeling for use with large complex datasets is presented. This modification addresses two common problems that arise in such meta-analyses: (a) primary studies that provide multiple measures of the same construct and (b) the correlation…

  6. System identification of analytical models of damped structures

    NASA Technical Reports Server (NTRS)

    Fuh, J.-S.; Chen, S.-Y.; Berman, A.

    1984-01-01

    A procedure is presented for identifying linear nonproportionally damped systems. The system damping is assumed to be representable by a real symmetric matrix. Analytical mass, stiffness and damping matrices which constitute an approximate representation of the system are assumed to be available. Also given is an incomplete set of measured natural frequencies, damping ratios and complex mode shapes of the structure, normally obtained from test data. A method is developed to find the smallest changes in the analytical model so that the improved model exactly predicts the measured modal parameters. The present method uses the orthogonality relationship to improve the mass and damping matrices and the dynamic equation to find the improved stiffness matrix.

  7. Use of evidence in a categorization task: analytic and holistic processing modes.

    PubMed

    Greco, Alberto; Moretti, Stefania

    2017-11-01

    Category learning performance can be influenced by many contextual factors, but the effects of these factors are not the same for all learners. The present study suggests that these differences can be due to the different ways evidence is used, according to two main basic modalities of processing information, analytically or holistically. In order to test the impact of the information provided, an inductive rule-based task was designed, in which feature salience and comparison informativeness between examples of two categories were manipulated during the learning phases, by introducing and progressively reducing some perceptual biases. To gather data on processing modalities, we devised the Active Feature Composition task, a production task that does not require classifying new items but reproducing them by combining features. At the end, an explicit rating task was performed, which entailed assessing the accuracy of a set of possible categorization rules. A combined analysis of the data collected with these two different tests enabled profiling participants in regard to the kind of processing modality, the structure of representations and the quality of categorial judgments. Results showed that despite the fact that the information provided was the same for all participants, those who adopted analytic processing better exploited evidence and performed more accurately, whereas with holistic processing categorization is perfectly possible but inaccurate. Finally, the cognitive implications of the proposed procedure, with regard to involved processes and representations, are discussed.

  8. Enabling Big Geoscience Data Analytics with a Cloud-Based, MapReduce-Enabled and Service-Oriented Workflow Framework

    PubMed Central

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

    Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing and data intensive, in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed that leverage cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers. A MapReduce-based algorithm framework is developed to support parallel processing of geoscience data. A service-oriented workflow architecture is built to support on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists. PMID:25742012
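
    To show the map/reduce programming pattern that underlies this kind of framework, the following is a toy parallel reduction over gridded data using Python's multiprocessing. The actual framework uses Hadoop MapReduce with HBase storage; this is only an illustration of the pattern, with a hypothetical chunking of a synthetic 3-D field.

    ```python
    # Sketch: map/reduce pattern as a toy parallel aggregation over gridded data.
    import numpy as np
    from multiprocessing import Pool

    def map_chunk(chunk):
        # per-chunk partial statistics (running sum and count of one variable)
        return chunk.sum(), chunk.size

    def reduce_partials(partials):
        total = sum(s for s, _ in partials)
        count = sum(n for _, n in partials)
        return total / count                 # global mean assembled from partials

    if __name__ == "__main__":
        grid = np.random.default_rng(0).random((4, 1000, 1000))   # fake 3-D field
        with Pool(4) as pool:
            partials = pool.map(map_chunk, [grid[i] for i in range(grid.shape[0])])
        mean_value = reduce_partials(partials)
    ```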

  9. Enabling big geoscience data analytics with a cloud-based, MapReduce-enabled and service-oriented workflow framework.

    PubMed

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

    Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing and data intensive, in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed that leverage cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers. A MapReduce-based algorithm framework is developed to support parallel processing of geoscience data. A service-oriented workflow architecture is built to support on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists.

  10. Sensitive analytical method for simultaneous analysis of some vasoconstrictors with highly overlapped analytical signals

    NASA Astrophysics Data System (ADS)

    Nikolić, G. S.; Žerajić, S.; Cakić, M.

    2011-10-01

    Multivariate calibration is a powerful mathematical tool that can be applied in analytical chemistry when the analytical signals are highly overlapped. A method based on partial least squares regression is proposed for the simultaneous spectrophotometric determination of adrenergic vasoconstrictors in a decongestive solution containing two active components: phenylephrine hydrochloride and trimazoline hydrochloride. These sympathomimetic agents are frequently associated in pharmaceutical formulations against the common cold. The proposed method, which is simple and rapid, offers the advantages of sensitivity and a wide range of determinations without the need for extraction of the vasoconstrictors. In order to minimize the number of factors necessary to obtain the calibration matrix by multivariate calibration, different parameters were evaluated. The adequate selection of the spectral regions proved to have an important effect on the number of factors. In order to simultaneously quantify both hydrochlorides among the excipients, the spectral region between 250 and 290 nm was selected. Recoveries for the vasoconstrictors were 98-101%. The developed method was applied to the assay of two decongestive pharmaceutical preparations.
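
    The sketch below illustrates the general idea of multivariate calibration by partial least squares for two analytes with overlapping spectra. The spectra, concentrations and number of latent factors are synthetic placeholders; real work would use measured calibration standards over the selected 250-290 nm region and choose the number of factors by validation.

    ```python
    # Sketch: PLS multivariate calibration on synthetic overlapped spectra.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(1)
    wl = np.linspace(250, 290, 81)                        # wavelength grid, nm
    band = lambda center, width: np.exp(-0.5 * ((wl - center) / width) ** 2)
    pure = np.vstack([band(265, 8), band(272, 10)])       # two overlapped "pure" spectra
    C = rng.uniform(0.1, 1.0, size=(30, 2))               # calibration concentrations
    X = C @ pure + 0.002 * rng.standard_normal((30, wl.size))

    pls = PLSRegression(n_components=4)                   # latent factors (placeholder)
    pls.fit(X, C)
    C_pred = cross_val_predict(pls, X, C, cv=5)
    # crude check of average predicted-to-true ratio per analyte, in percent
    recovery = 100 * C_pred.mean(axis=0) / C.mean(axis=0)
    ```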

  11. Hierarchical Factor Structure of the Cognitive Assessment System: Variance Partitions from the Schmid-Leiman (1957) Procedure

    ERIC Educational Resources Information Center

    Canivez, Gary L.

    2011-01-01

    Orthogonal higher-order factor structure of the Cognitive Assessment System (CAS; Naglieri & Das, 1997a) for the 5-7 and 8-17 age groups in the CAS standardization sample is reported. Following the same procedure as recent studies of other prominent intelligence tests (Dombrowski, Watkins, & Brogan, 2009; Canivez, 2008; Canivez &…
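
    For readers unfamiliar with the Schmid-Leiman (1957) procedure referenced above, the sketch below applies the standard transformation to a higher-order model with first-order loadings L1 (tests by group factors) and second-order loadings L2 (group factors on a general factor). The loading values are illustrative only and are not CAS estimates.

    ```python
    # Sketch: Schmid-Leiman transformation of a higher-order factor solution.
    import numpy as np

    def schmid_leiman(L1, L2):
        """Return loadings on the general factor and on residualized group factors."""
        L2 = np.asarray(L2, float).reshape(-1, 1)
        general = L1 @ L2                                  # direct general-factor loadings
        u = np.sqrt(np.clip(1.0 - (L2 ** 2).ravel(), 0.0, None))
        group = L1 * u                                     # residualized group-factor loadings
        return general, group

    # Illustrative loadings: four tests on two group factors, which load on g
    L1 = np.array([[0.7, 0.0], [0.6, 0.0], [0.0, 0.8], [0.0, 0.5]])
    L2 = np.array([0.6, 0.7])
    g_loadings, group_loadings = schmid_leiman(L1, L2)
    ```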

  12. New Analytical Monographs on TCM Herbal Drugs for Quality Proof.

    PubMed

    Wagner, Hildebert; Bauer, Rudolf; Melchart, Dieter

    2016-01-01

    Regardless of specific national drug regulations, there is an international consensus that all TCM drugs must meet stipulated high quality standards focusing on authentication, identification and chemical composition. In addition, the safety of all TCM drugs prescribed by physicians has to be guaranteed. During the 25-year history of the TCM hospital Bad Kötzting, 171 TCM drugs underwent an analytical quality proof including thin-layer as well as high-pressure liquid chromatography. From now on, mass spectrometry will also be available as an analytical tool. The findings are compiled and have already been published in three volumes of analytical monographs. One more volume will be published shortly, and a fifth volume is in preparation. The main issues of the analytical procedure for TCM drugs, such as authenticity, botanical nomenclature, variability of plant species and parts, as well as processing, are pointed out and possible ways to overcome them are sketched. © 2016 S. Karger GmbH, Freiburg.

  13. Finite element and analytical solutions for van der Pauw and four-point probe correction factors when multiple non-ideal measurement conditions coexist

    NASA Astrophysics Data System (ADS)

    Reveil, Mardochee; Sorg, Victoria C.; Cheng, Emily R.; Ezzyat, Taha; Clancy, Paulette; Thompson, Michael O.

    2017-09-01

    This paper presents an extensive collection of calculated correction factors that account for the combined effects of a wide range of non-ideal conditions often encountered in realistic four-point probe and van der Pauw experiments. In this context, "non-ideal conditions" refer to conditions that deviate from the assumptions on sample and probe characteristics made in the development of these two techniques. We examine the combined effects of contact size and sample thickness on van der Pauw measurements. In the four-point probe configuration, we examine the combined effects of varying the sample's lateral dimensions, probe placement, and sample thickness. We derive an analytical expression to calculate correction factors that account, simultaneously, for finite sample size and asymmetric probe placement in four-point probe experiments. We provide experimental validation of the analytical solution via four-point probe measurements on a thin film rectangular sample with arbitrary probe placement. The finite sample size effect is very significant in four-point probe measurements (especially for a narrow sample) and asymmetric probe placement only worsens such effects. The contribution of conduction in multilayer samples is also studied and found to be substantial; hence, we provide a map of the necessary correction factors. This library of correction factors will enable the design of resistivity measurements with improved accuracy and reproducibility over a wide range of experimental conditions.
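
    To show where such correction factors enter in practice, the sketch below uses the textbook in-line four-point-probe relation for an infinite thin sheet, with finite sample size, probe placement and thickness folded into a single multiplicative correction factor. The correction factor itself is exactly what the work above tabulates and is not reproduced here; the functions and default value are placeholders.

    ```python
    # Sketch: role of a geometric correction factor in four-point probe analysis.
    import math

    def sheet_resistance(voltage, current, correction_factor=1.0):
        # Ideal infinite thin sheet with equally spaced in-line probes:
        # Rs = (pi / ln 2) * (V / I), scaled by a geometry correction factor.
        return (math.pi / math.log(2)) * (voltage / current) * correction_factor

    def resistivity(voltage, current, thickness_cm, correction_factor=1.0):
        # Valid when the thickness is much smaller than the probe spacing.
        return sheet_resistance(voltage, current, correction_factor) * thickness_cm
    ```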

  14. Finite element and analytical solutions for van der Pauw and four-point probe correction factors when multiple non-ideal measurement conditions coexist.

    PubMed

    Reveil, Mardochee; Sorg, Victoria C; Cheng, Emily R; Ezzyat, Taha; Clancy, Paulette; Thompson, Michael O

    2017-09-01

    This paper presents an extensive collection of calculated correction factors that account for the combined effects of a wide range of non-ideal conditions often encountered in realistic four-point probe and van der Pauw experiments. In this context, "non-ideal conditions" refer to conditions that deviate from the assumptions on sample and probe characteristics made in the development of these two techniques. We examine the combined effects of contact size and sample thickness on van der Pauw measurements. In the four-point probe configuration, we examine the combined effects of varying the sample's lateral dimensions, probe placement, and sample thickness. We derive an analytical expression to calculate correction factors that account, simultaneously, for finite sample size and asymmetric probe placement in four-point probe experiments. We provide experimental validation of the analytical solution via four-point probe measurements on a thin film rectangular sample with arbitrary probe placement. The finite sample size effect is very significant in four-point probe measurements (especially for a narrow sample) and asymmetric probe placement only worsens such effects. The contribution of conduction in multilayer samples is also studied and found to be substantial; hence, we provide a map of the necessary correction factors. This library of correction factors will enable the design of resistivity measurements with improved accuracy and reproducibility over a wide range of experimental conditions.
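
    As a complement to the abstracts above, the ideal-case relations that such correction factors generalize can be sketched in a few lines. The snippet below is a minimal Python sketch using standard textbook formulas (the infinite-sheet four-point probe result and the classical finite-thickness correction) with hypothetical measurement values; it is not the paper's library of combined finite-size and asymmetric-placement factors.

        import math

        def sheet_resistance_ideal(voltage, current):
            """Ideal four-point probe on an infinite thin sheet:
            R_s = (pi / ln 2) * V / I  (ohms per square)."""
            return (math.pi / math.log(2.0)) * voltage / current

        def thickness_correction(t, s):
            """Classical finite-thickness correction factor F(t/s) for thickness t and
            equal probe spacing s (textbook result; assumes infinite lateral extent)."""
            x = t / s
            return x / (2.0 * math.log(math.sinh(x) / math.sinh(x / 2.0)))

        # Hypothetical measurement: 1 mA forced, 12.3 mV measured, t = 0.5 mm, s = 1.0 mm
        V, I, t, s = 12.3e-3, 1.0e-3, 0.5e-3, 1.0e-3
        rho = 2.0 * math.pi * s * (V / I) * thickness_correction(t, s)   # bulk resistivity, ohm*m
        print(f"R_s (thin-sheet limit): {sheet_resistance_ideal(V, I):.2f} ohm/sq")
        print(f"rho with thickness correction: {rho:.4e} ohm*m")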

  15. 40 CFR 63.786 - Test methods and procedures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... level of sample dilution must be factored in. (2) Repeatability. First, at the 0.1-5 percent analyte... percent analyte range the results would be suspect if duplicates vary by more than 5 percent relative and...) Reproducibility. First, at the 0.1-5 percent analyte range the results would be suspect if lab to lab variation...

  16. 40 CFR 63.786 - Test methods and procedures.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... level of sample dilution must be factored in. (2) Repeatability. First, at the 0.1-5 percent analyte... percent analyte range the results would be suspect if duplicates vary by more than 5 percent relative and...) Reproducibility. First, at the 0.1-5 percent analyte range the results would be suspect if lab to lab variation...

  17. Postoperative central nervous system infection: incidence and associated factors in 2111 neurosurgical procedures.

    PubMed

    McClelland, Shearwood; Hall, Walter A

    2007-07-01

    Postoperative central nervous system infection (PCNSI) in patients undergoing neurosurgical procedures represents a serious problem that requires immediate attention. PCNSI most commonly manifests as meningitis, subdural empyema, and/or brain abscess. Recent studies (which have included a minimum of 1000 operations) have reported that the incidence of PCNSI after neurosurgical procedures is 5%-7%, and many physicians believe that the true incidence is even higher. To address this issue, we examined the incidence of PCNSI in a sizeable patient population. The medical records and postoperative courses for patients involved in 2111 neurosurgical procedures at our institution during 1991-2005 were reviewed retrospectively to determine the incidence of PCNSI, the identity of offending organisms, and the factors associated with infection. The median age of patients at the time of surgery was 45 years. Of the 1587 cranial operations, 14 (0.8%) were complicated by PCNSI, whereas none of the 32 peripheral nerve operations resulted in PCNSI. The remaining 492 operative cases involved spinal surgery, of which 2 (0.4%) were complicated by PCNSI. The overall incidence of PCNSI was 0.8% (occurring after 16 of 2111 operations); the incidence of bacterial meningitis was 0.3% (occurring after 4 of 1587 operations), and the incidence of brain abscess was 0.2% (occurring after 3 of 1587 operations). The most common offending organism was Staphylococcus aureus (8 cases; 50% of infections), followed by Propionibacterium acnes (4 cases; 25% of infections). Cerebrospinal fluid leakage, diabetes mellitus, and male sex were not associated with PCNSI (P>.05). In one of the largest neurosurgical studies to have investigated PCNSI, the incidence of infection after neurosurgical procedures was <1%--more than 6 times lower than that reported in recent series of comparable numerical size. Cerebrospinal fluid leak, diabetes mellitus, and male sex were not associated with an increased incidence of

  18. Role of microextraction sampling procedures in forensic toxicology.

    PubMed

    Barroso, Mário; Moreno, Ivo; da Fonseca, Beatriz; Queiroz, João António; Gallardo, Eugenia

    2012-07-01

    The last two decades have provided analysts with more sensitive technology, enabling scientists from all analytical fields to see what they could not see just a few years ago. This increased sensitivity has allowed drug detection at very low concentrations and testing in unconventional samples (e.g., hair, oral fluid and sweat), which, despite their low analyte concentrations, has also led to a reduction in sample size. Along with this reduction, and as a result of the use of excessive amounts of potentially toxic organic solvents (with the subsequent environmental pollution and the costs associated with their proper disposal), there has been a growing tendency to use miniaturized sampling techniques. These sampling procedures reduce organic solvent consumption to a minimum and at the same time provide a rapid, simple and cost-effective approach. In addition, at least some degree of automation is possible when using these techniques, which enhances sample throughput. These miniaturized sample preparation techniques may be roughly categorized into solid-phase and liquid-phase microextraction, depending on the nature of the analyte. This paper reviews recently published literature on the use of microextraction sampling procedures, with a special focus on the field of forensic toxicology.

  19. Prioritizing pesticide compounds for analytical methods development

    USGS Publications Warehouse

    Norman, Julia E.; Kuivila, Kathryn; Nowell, Lisa H.

    2012-01-01

    The U.S. Geological Survey (USGS) has a periodic need to re-evaluate pesticide compounds in terms of priorities for inclusion in monitoring and studies and, thus, must also assess the current analytical capabilities for pesticide detection. To meet this need, a strategy has been developed to prioritize pesticides and degradates for analytical methods development. Screening procedures were developed to separately prioritize pesticide compounds in water and sediment. The procedures evaluate pesticide compounds in existing USGS analytical methods for water and sediment and compounds for which recent agricultural-use information was available. Measured occurrence (detection frequency and concentrations) in water and sediment, predicted concentrations in water and predicted likelihood of occurrence in sediment, potential toxicity to aquatic life or humans, and priorities of other agencies or organizations, regulatory or otherwise, were considered. Several existing strategies for prioritizing chemicals for various purposes were reviewed, including those that identify and prioritize persistent, bioaccumulative, and toxic compounds, and those that determine candidates for future regulation of drinking-water contaminants. The systematic procedures developed and used in this study rely on concepts common to many previously established strategies. The evaluation of pesticide compounds resulted in the classification of compounds into three groups: Tier 1 for high priority compounds, Tier 2 for moderate priority compounds, and Tier 3 for low priority compounds. For water, a total of 247 pesticide compounds were classified as Tier 1 and, thus, are high priority for inclusion in analytical methods for monitoring and studies. Of these, about three-quarters are included in some USGS analytical method; however, many of these compounds are included in research methods that are expensive and for which there are few data on environmental samples. The remaining quarter of Tier 1

  20. Analytical solution for the advection-dispersion transport equation in layered media

    USDA-ARS?s Scientific Manuscript database

    The advection-dispersion transport equation with first-order decay was solved analytically for multi-layered media using the classic integral transform technique (CITT). The solution procedure used an associated non-self-adjoint advection-diffusion eigenvalue problem that had the same form and coef...

  1. MASTER ANALYTICAL SCHEME FOR ORGANIC COMPOUNDS IN WATER: PART 1. PROTOCOLS

    EPA Science Inventory

    A Master Analytical Scheme (MAS) has been developed for the analysis of volatile (gas chromatographable) organic compounds in water. In developing the MAS, it was necessary to evaluate and modify existing analysis procedures and develop new techniques to produce protocols that pr...

  2. A novel second-order standard addition analytical method based on data processing with multidimensional partial least-squares and residual bilinearization.

    PubMed

    Lozano, Valeria A; Ibañez, Gabriela A; Olivieri, Alejandro C

    2009-10-05

    In the presence of analyte-background interactions and a significant background signal, both second-order multivariate calibration and standard addition are required for successful analyte quantitation achieving the second-order advantage. This report discusses a modified second-order standard addition method, in which the test data matrix is subtracted from the standard addition matrices, and quantitation proceeds via the classical external calibration procedure. It is shown that this novel data processing method allows one to apply not only parallel factor analysis (PARAFAC) and multivariate curve resolution-alternating least-squares (MCR-ALS), but also the recently introduced and more flexible partial least-squares (PLS) models coupled to residual bilinearization (RBL). In particular, the multidimensional variant N-PLS/RBL is shown to produce the best analytical results. The comparison is carried out with the aid of a set of simulated data, as well as two experimental data sets: one aimed at the determination of salicylate in human serum in the presence of naproxen as an additional interferent, and the second one devoted to the analysis of danofloxacin in human serum in the presence of salicylate.

  3. Pre-analytical and analytical variation of drug determination in segmented hair using ultra-performance liquid chromatography-tandem mass spectrometry.

    PubMed

    Nielsen, Marie Katrine Klose; Johansen, Sys Stybe; Linnet, Kristian

    2014-01-01

    Assessment of the total uncertainty of analytical methods for the measurement of drugs in human hair has mainly been derived from the analytical variation. However, in hair analysis several other sources of uncertainty will contribute to the total uncertainty. Particularly in segmental hair analysis, pre-analytical variations associated with the sampling and segmentation may be significant factors in the assessment of the total uncertainty budget. The aim of this study was to develop and validate a method for the analysis of 31 common drugs in hair using ultra-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) with focus on the assessment of both the analytical and pre-analytical sampling variations. The validated method was specific, accurate (80-120%), and precise (CV≤20%) across a wide linear concentration range of 0.025-25 ng/mg for most compounds. The analytical variation was estimated to be less than 15% for almost all compounds. The method was successfully applied to 25 segmented hair specimens from deceased drug addicts showing a broad pattern of poly-drug use. The pre-analytical sampling variation was estimated from genuine duplicate measurements of two bundles of hair collected from each subject after subtraction of the analytical component. For the most frequently detected analytes, the pre-analytical variation was estimated to be 26-69%. Thus, the pre-analytical variation was 3-7 times larger than the analytical variation (7-13%) and hence the dominant component in the total variation (29-70%). The present study demonstrated the importance of including the pre-analytical variation in the assessment of the total uncertainty budget and in the setting of the 95%-uncertainty interval (±2CVT). Excluding the pre-analytical sampling variation could significantly affect the interpretation of results from segmental hair analysis. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
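
    A minimal sketch of the variance bookkeeping described above, assuming the components combine in quadrature: the duplicate-bundle CV is estimated from paired measurements, the analytical CV is subtracted to give the pre-analytical component, and the two are recombined for the total uncertainty. All numbers below are hypothetical, not the study's data.

        import math

        def cv_from_duplicates(pairs):
            """CV estimated from duplicate measurements (x1, x2) on the same subject,
            using the root-mean-square of paired relative differences."""
            n = len(pairs)
            ss = sum(((x1 - x2) / ((x1 + x2) / 2.0)) ** 2 for x1, x2 in pairs)
            return math.sqrt(ss / (2.0 * n))

        def cv_preanalytical(cv_duplicates, cv_analytical):
            """Subtract the analytical component in quadrature."""
            return math.sqrt(max(cv_duplicates ** 2 - cv_analytical ** 2, 0.0))

        # Hypothetical duplicate hair-bundle results (ng/mg) for one analyte
        duplicates = [(0.52, 0.31), (1.10, 0.85), (2.40, 3.10), (0.19, 0.12)]
        cv_anal = 0.10                                  # assumed 10% analytical CV
        cv_dup = cv_from_duplicates(duplicates)
        cv_pre = cv_preanalytical(cv_dup, cv_anal)
        cv_total = math.sqrt(cv_pre ** 2 + cv_anal ** 2)
        print(f"CV_dup={cv_dup:.1%}  CV_pre={cv_pre:.1%}  CV_total={cv_total:.1%}  U(95%)≈±{2*cv_total:.1%}")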

  4. 40 CFR 63.145 - Process wastewater provisions-test methods and procedures to determine compliance.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 9 2011-07-01 2011-07-01 false Process wastewater provisions-test... Operations, and Wastewater § 63.145 Process wastewater provisions—test methods and procedures to determine... analytical method for wastewater which has that compound as a target analyte. (7) Treatment using a series of...

  5. 40 CFR 63.145 - Process wastewater provisions-test methods and procedures to determine compliance.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 10 2013-07-01 2013-07-01 false Process wastewater provisions-test... Operations, and Wastewater § 63.145 Process wastewater provisions—test methods and procedures to determine... analytical method for wastewater which has that compound as a target analyte. (7) Treatment using a series of...

  6. Different results on tetrachorical correlations in Mplus and Stata--Stata announces modified procedure.

    PubMed

    Günther, Agnes; Höfler, Michael

    2006-01-01

    To identify the structure of mental disorders in large-scale epidemiological data sets, investigators frequently use tetrachoric correlations as a first step for subsequent application of latent class and factor analytic methods. It has been possible to do this with Stata since 2005, whereas the corresponding Mplus routine has been on the market for some years. Using an identical data set we observed considerable differences between the results of the packages. This paper illustrates the differences with several examples from the Early Developmental Stages of Psychopathology Study data set, which consists of 3021 subjects, with diagnostic information assessed by the CIDI. Results reveal that tetrachoric correlations resulting from Mplus were often considerably smaller than those computed with Stata. The results were dramatically different, especially where there were few observations per cell or even empty cells. These findings were put to Mplus and Stata, whose responses clarified the discrepancies by describing the different mathematical assumptions and procedures used. Stata announced that it intended to launch a modified procedure.
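
    For readers unfamiliar with the quantity under dispute, a two-step tetrachoric estimator can be sketched as follows: thresholds are taken from the marginal proportions and the correlation is chosen so that the bivariate-normal cell probability matches the observed one. This is a generic Python sketch with hypothetical counts, not the Mplus or Stata implementation; with empty or near-empty cells the root-finding step becomes unstable, which is exactly the situation the abstract highlights.

        from scipy.stats import norm, multivariate_normal
        from scipy.optimize import brentq

        def tetrachoric(n00, n01, n10, n11):
            """Two-step tetrachoric correlation for a 2x2 table of counts."""
            n = n00 + n01 + n10 + n11
            t1 = norm.ppf((n00 + n01) / n)          # threshold for variable 1 (P(X1 = 0))
            t2 = norm.ppf((n00 + n10) / n)          # threshold for variable 2 (P(X2 = 0))
            p00_obs = n00 / n

            def gap(rho):
                bvn = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, rho], [rho, 1.0]])
                return bvn.cdf([t1, t2]) - p00_obs  # model P(X1=0, X2=0) minus observed

            return brentq(gap, -0.999, 0.999)

        # Hypothetical diagnostic cross-table (counts for cells 0/0, 0/1, 1/0, 1/1)
        print(round(tetrachoric(2400, 300, 250, 71), 3))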

  7. Women's Career Success: A Factor Analytic Study of Contributing Factors.

    ERIC Educational Resources Information Center

    Gaskill, LuAnn Ricketts

    1991-01-01

    A survey of 466 women employed in retailing received 205 responses identifying (1) factors influencing the success and advancement of women in retailing and (2) how those factors differ for women in upper versus middle positions. Upper-level executives placed more importance on ambition and abilities; midlevel executives credited opportunity and…

  8. Machine learning for Big Data analytics in plants.

    PubMed

    Ma, Chuang; Zhang, Hao Helen; Wang, Xiangfeng

    2014-12-01

    Rapid advances in high-throughput genomic technology have enabled biology to enter the era of 'Big Data' (large datasets). The plant science community not only needs to build its own Big-Data-compatible parallel computing and data management infrastructures, but also to seek novel analytical paradigms to extract information from the overwhelming amounts of data. Machine learning offers promising computational and analytical solutions for the integrative analysis of large, heterogeneous and unstructured datasets on the Big-Data scale, and is gradually gaining popularity in biology. This review introduces the basic concepts and procedures of machine-learning applications and envisages how machine learning could interface with Big Data technology to facilitate basic research and biotechnology in the plant sciences. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. The Prioritization of Clinical Risk Factors of Obstructive Sleep Apnea Severity Using Fuzzy Analytic Hierarchy Process

    PubMed Central

    Maranate, Thaya; Pongpullponsak, Adisak; Ruttanaumpawan, Pimon

    2015-01-01

    Recently, there has been a problem of shortage of sleep laboratories that can accommodate patients in a timely manner. Delayed diagnosis and treatment may lead to worse outcomes, particularly in patients with severe obstructive sleep apnea (OSA). For this reason, prioritization in polysomnography (PSG) queueing should be based on disease severity. To date, there have been conflicting data on whether clinical information can predict OSA severity. A total of 1,042 suspected OSA patients underwent diagnostic PSG study at Siriraj Sleep Center during 2010-2011. A total of 113 variables were obtained from sleep questionnaires and anthropometric measurements. Nineteen groups of clinical risk factors, consisting of 42 variables, were categorized for each OSA severity level. This study aimed to rank these factors by employing a Fuzzy Analytic Hierarchy Process approach based on a normalized weight vector. The results revealed that the first-ranked clinical risk factor group in Severe, Moderate, Mild, and No OSA was nighttime symptoms. The overall sensitivity/specificity of the approach for these groups was 92.32%/91.76%, 89.52%/88.18%, 91.08%/84.58%, and 96.49%/81.23%, respectively. We propose that urgent PSG appointment criteria should include the clinical risk factors of the Severe OSA group. In addition, screening Mild from No OSA patients in the sleep center setting using symptoms during sleep is also recommended (sensitivity = 87.12% and specificity = 72.22%). PMID:26221183
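
    A crisp (non-fuzzy) AHP weighting step can be sketched to show where a normalized weight vector comes from; the fuzzy variant used in the study replaces the crisp judgments with fuzzy numbers. The comparison matrix and factor-group names below are hypothetical.

        import numpy as np

        RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}   # Saaty random consistency indices

        def ahp_weights(A):
            """Priority weights from a reciprocal pairwise-comparison matrix:
            the principal right eigenvector, normalized to sum to 1."""
            eigvals, eigvecs = np.linalg.eig(A)
            k = int(np.argmax(eigvals.real))
            w = np.abs(eigvecs[:, k].real)
            return w / w.sum(), eigvals[k].real

        def consistency_ratio(lam_max, n):
            """Saaty consistency ratio CR = CI / RI; CR < 0.10 is conventionally acceptable."""
            return ((lam_max - n) / (n - 1)) / RI[n]

        # Hypothetical 3 x 3 comparison of clinical risk-factor groups
        # (e.g. nighttime symptoms vs. anthropometrics vs. daytime sleepiness)
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])
        w, lam_max = ahp_weights(A)
        print("weights:", np.round(w, 3), " CR:", round(consistency_ratio(lam_max, 3), 3))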

  10. Factor structure of the Wechsler Intelligence Scale for Children-Fifth Edition: Exploratory factor analyses with the 16 primary and secondary subtests.

    PubMed

    Canivez, Gary L; Watkins, Marley W; Dombrowski, Stefan C

    2016-08-01

    The factor structure of the 16 Primary and Secondary subtests of the Wechsler Intelligence Scale for Children-Fifth Edition (WISC-V; Wechsler, 2014a) standardization sample was examined with exploratory factor analytic methods (EFA) not included in the WISC-V Technical and Interpretive Manual (Wechsler, 2014b). Factor extraction criteria suggested 1 to 4 factors and results favored 4 first-order factors. When this structure was transformed with the Schmid and Leiman (1957) orthogonalization procedure, the hierarchical g-factor accounted for large portions of total and common variance while the 4 first-order factors accounted for small portions of total and common variance; rendering interpretation at the factor index level less appropriate. Although the publisher favored a 5-factor model where the Perceptual Reasoning factor was split into separate Visual Spatial and Fluid Reasoning dimensions, no evidence for 5 factors was found. It was concluded that the WISC-V provides strong measurement of general intelligence and clinical interpretation should be primarily, if not exclusively, at that level. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
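
    The Schmid-Leiman step referred to above is a short matrix transformation; a minimal sketch with hypothetical loadings (not WISC-V values) is shown below.

        import numpy as np

        def schmid_leiman(first_order, second_order):
            """Schmid-Leiman orthogonalization.
            first_order : (subtests x factors) first-order pattern loadings
            second_order: (factors,) loadings of the first-order factors on g
            Returns general-factor loadings and residualized group-factor loadings."""
            g_loadings = first_order @ second_order
            group_loadings = first_order * np.sqrt(1.0 - second_order ** 2)
            return g_loadings, group_loadings

        # Hypothetical loadings for 6 subtests on 2 first-order factors
        F = np.array([[0.80, 0.05], [0.75, 0.10], [0.70, 0.00],
                      [0.05, 0.78], [0.10, 0.72], [0.00, 0.69]])
        g2 = np.array([0.85, 0.80])              # first-order factors' loadings on g
        g, grp = schmid_leiman(F, g2)
        var_g, var_grp = np.sum(g ** 2), np.sum(grp ** 2)
        print("proportion of common variance from g:", round(var_g / (var_g + var_grp), 3))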

  11. Optimization of analytical and pre-analytical conditions for MALDI-TOF-MS human urine protein profiles.

    PubMed

    Calvano, C D; Aresta, A; Iacovone, M; De Benedetto, G E; Zambonin, C G; Battaglia, M; Ditonno, P; Rutigliano, M; Bettocchi, C

    2010-03-11

    Protein analysis in biological fluids, such as urine, by means of mass spectrometry (MS) still suffers from insufficient standardization in the protocols for sample collection, storage and preparation. In this work, the influence of these variables on the protein profiling of healthy donors' human urine performed by matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI-TOF-MS) was studied. A screening of various urine sample pre-treatment procedures and different sample deposition approaches on the MALDI target was performed. The influence of urine sample storage time and temperature on spectral profiles was evaluated by means of principal component analysis (PCA). The whole optimized procedure was eventually applied to the MALDI-TOF-MS analysis of human urine samples taken from prostate cancer patients. The best results in terms of the number and abundance of detected ions in the MS spectra were obtained by using home-made microcolumns packed with hydrophilic-lipophilic balance (HLB) resin as the sample pre-treatment method; this procedure was also less expensive and suitable for high-throughput analyses. Afterwards, the spin-coating approach for sample deposition on the MALDI target plate was optimized, obtaining homogeneous and reproducible spots. PCA then indicated that low storage temperatures of acidified and centrifuged samples, together with short handling times, allowed reproducible profiles to be obtained without artifact contributions from the experimental conditions. Finally, interesting differences were found by comparing the MALDI-TOF-MS protein profiles of pooled urine samples from healthy donors and prostate cancer patients. The results showed that analytical and pre-analytical variables are crucial for the success of urine analysis and for obtaining meaningful and reproducible data, even if intra-patient variability is very difficult to avoid. It has been shown that pooled urine samples can be a practical way to simplify the comparison between
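
    The PCA comparison of storage conditions can be sketched with a generic peak-intensity matrix; the data below are random placeholders, not the study's spectra.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        # Hypothetical peak-intensity matrix: 24 urine spectra x 150 m/z bins,
        # e.g. 12 spectra stored at -80 C and 12 at +4 C
        X = rng.lognormal(mean=0.0, sigma=1.0, size=(24, 150))
        condition = np.array(["-80C"] * 12 + ["+4C"] * 12)

        scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
        for cond in np.unique(condition):
            centroid = scores[condition == cond].mean(axis=0)
            print(cond, "centroid on PC1/PC2:", np.round(centroid, 2))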

  12. Some Factor Analytic Approximations to Latent Class Structure.

    ERIC Educational Resources Information Center

    Dziuban, Charles D.; Denton, William T.

    Three procedures, alpha, image, and uniqueness rescaling, were applied to a joint occurrence probability matrix. That matrix was the basis of a well-known latent class structure. The values of the recurring subscript elements were varied as follows: Case 1 - The known elements were input; Case 2 - The upper bounds to the recurring subscript…

  13. Sequential Multiplex Analyte Capturing for Phosphoprotein Profiling*

    PubMed Central

    Poetz, Oliver; Henzler, Tanja; Hartmann, Michael; Kazmaier, Cornelia; Templin, Markus F.; Herget, Thomas; Joos, Thomas O.

    2010-01-01

    Microarray-based sandwich immunoassays can simultaneously detect dozens of proteins. However, their use in quantifying large numbers of proteins is hampered by cross-reactivity and incompatibilities caused by the immunoassays themselves. Sequential multiplex analyte capturing addresses these problems by repeatedly probing the same sample with different sets of antibody-coated, magnetic suspension bead arrays. As a miniaturized immunoassay format, suspension bead array-based assays fulfill the criteria of the ambient analyte theory, and our experiments reveal that the analyte concentrations are not significantly changed. The value of sequential multiplex analyte capturing was demonstrated by probing tumor cell line lysates for the abundance of seven different receptor tyrosine kinases and their degree of phosphorylation and by measuring the complex phosphorylation pattern of the epidermal growth factor receptor in the same sample from the same cavity. PMID:20682761

  14. Analytic integrable systems: Analytic normalization and embedding flows

    NASA Astrophysics Data System (ADS)

    Zhang, Xiang

    In this paper we mainly study the existence of analytic normalization and the normal form of finite-dimensional complete analytic integrable dynamical systems. In more detail, we prove that any complete analytic integrable diffeomorphism F(x) = Bx + f(x) in (C^n, 0), with B having eigenvalues not of modulus 1 and f(x) = O(|x|^2), is locally analytically conjugate to its normal form. Meanwhile, we also prove that any complete analytic integrable differential system ẋ = Ax + f(x) in (C^n, 0), with A having nonzero eigenvalues and f(x) = O(|x|^2), is locally analytically conjugate to its normal form. Furthermore, we prove that any complete analytic integrable diffeomorphism defined on an analytic manifold can be embedded in a complete analytic integrable flow. We note that parts of our results improve those of Moser in J. Moser, The analytic invariants of an area-preserving mapping near a hyperbolic fixed point, Comm. Pure Appl. Math. 9 (1956) 673-692, and of Poincaré in H. Poincaré, Sur l'intégration des équations différentielles du premier ordre et du premier degré, II, Rend. Circ. Mat. Palermo 11 (1897) 193-239. These results also improve the ones in Xiang Zhang, Analytic normalization of analytic integrable systems and the embedding flows, J. Differential Equations 244 (2008) 1080-1092, in the sense that the linear part of the systems can be nonhyperbolic, and the one in N.T. Zung, Convergence versus integrability in Poincaré-Dulac normal form, Math. Res. Lett. 9 (2002) 217-228, in that our paper presents the concrete expression of the normal form in a restricted case.

  15. Risk factors for impaired respiratory function during nurse-administered procedural sedation and analgesia in the cardiac catheterisation laboratory: a matched case-control study.

    PubMed

    Conway, Aaron; Page, Karen; Rolley, John; Fulbrook, Paul

    2013-08-01

    Side effects of the medications used for procedural sedation and analgesia in the cardiac catheterisation laboratory are known to cause impaired respiratory function. Impaired respiratory function poses considerable risk to patient safety as it can lead to inadequate oxygenation. Having knowledge about the conditions that predict impaired respiratory function prior to the procedure would enable nurses to identify at-risk patients and selectively implement intensive respiratory monitoring. This would reduce the possibility of inadequate oxygenation occurring. To identify pre-procedure risk factors for impaired respiratory function during nurse-administered procedural sedation and analgesia in the cardiac catheterisation laboratory. Retrospective matched case-control. 21 cases of impaired respiratory function were identified and matched to 113 controls from a consecutive cohort of patients over 18 years of age. Conditional logistic regression was used to identify risk factors for impaired respiratory function. With each additional indicator of acute illness, case patients were nearly two times more likely than their controls to experience impaired respiratory function (OR 1.78; 95% CI 1.19-2.67; p = 0.005). Indicators of acute illness included emergency admission, being transferred from a critical care unit for the procedure or requiring respiratory or haemodynamic support in the lead up to the procedure. Several factors that predict the likelihood of impaired respiratory function were identified. The results from this study could be used to inform prospective studies investigating the effectiveness of interventions for impaired respiratory function during nurse-administered procedural sedation and analgesia in the cardiac catheterisation laboratory.
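
    The reported effect size can be checked from the abstract's own numbers, assuming a Wald-type confidence interval for the conditional logistic regression coefficient:

        import math

        # Reported: OR 1.78 per additional indicator of acute illness, 95% CI 1.19-2.67
        beta = math.log(1.78)                                  # log-odds per indicator
        se = (math.log(2.67) - math.log(1.19)) / (2 * 1.96)    # back out the standard error
        lo, hi = math.exp(beta - 1.96 * se), math.exp(beta + 1.96 * se)
        print(f"beta={beta:.3f}  SE={se:.3f}  OR={math.exp(beta):.2f}  95% CI {lo:.2f}-{hi:.2f}")
        # Two additional indicators multiply the odds by exp(2 * beta):
        print(f"OR for +2 indicators: {math.exp(2 * beta):.2f}")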

  16. Analytical learning and term-rewriting systems

    NASA Technical Reports Server (NTRS)

    Laird, Philip; Gamble, Evan

    1990-01-01

    Analytical learning is a set of machine learning techniques for revising the representation of a theory based on a small set of examples of that theory. When the representation of the theory is correct and complete but perhaps inefficient, an important objective of such analysis is to improve the computational efficiency of the representation. Several algorithms with this purpose have been suggested, most of which are closely tied to a first-order logical language and are variants of goal regression, such as the familiar explanation-based generalization (EBG) procedure. But because predicate calculus is a poor representation for some domains, these learning algorithms are extended to apply to other computational models. It is shown that the goal regression technique applies to a large family of programming languages, all based on a kind of term rewriting system. Included in this family are three language families of importance to artificial intelligence: logic programming, such as Prolog; lambda calculus, such as LISP; and combinator-based languages, such as FP. A new analytical learning algorithm, AL-2, is exhibited that learns from success but is otherwise quite different from EBG. These results suggest that term rewriting systems are a good framework for analytical learning research in general, and that further research should be directed toward developing new techniques.

  17. 40 CFR 86.1207-96 - Sampling and analytical systems; evaporative emissions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... (CONTINUED) Evaporative Emission Test Procedures for New Gasoline-Fueled, Natural Gas-Fueled, Liquefied Petroleum Gas-Fueled and Methanol-Fueled Heavy-Duty Vehicles § 86.1207-96 Sampling and analytical systems..., the enclosure shall be gas tight in accordance with § 86.1217-96. Interior surfaces must be...

  18. 40 CFR 86.1207-96 - Sampling and analytical systems; evaporative emissions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (CONTINUED) Evaporative Emission Test Procedures for New Gasoline-Fueled, Natural Gas-Fueled, Liquefied Petroleum Gas-Fueled and Methanol-Fueled Heavy-Duty Vehicles § 86.1207-96 Sampling and analytical systems..., the enclosure shall be gas tight in accordance with § 86.1217-96. Interior surfaces must be...

  19. Analytical Energy Gradients for Excited-State Coupled-Cluster Methods

    NASA Astrophysics Data System (ADS)

    Wladyslawski, Mark; Nooijen, Marcel

    equations for the wavefunction amplitudes, the Lagrange multipliers, and the analytical gradient via the perturbation-independent generalized Hellmann-Feynman effective density matrix. This systematic automated derivation procedure is applied to obtain the detailed gradient equations for the excitation energy (EE-), double ionization potential (DIP-), and double electron affinity (DEA-) similarity transformed equation-of-motion coupled-cluster singles-and-doubles (STEOM-CCSD) methods. In addition, the derivatives of the closed-shell-reference excitation energy (EE-), ionization potential (IP-), and electron affinity (EA-) equation-of-motion coupled-cluster singles-and-doubles (EOM-CCSD) methods are derived. Furthermore, the perturbative EOM-PT and STEOM-PT gradients are obtained. The algebraic derivative expressions for these dozen methods are all derived here uniformly through the automated Lagrange multiplier process and are expressed compactly in a chain-rule/intermediate-density formulation, which facilitates a unified modular implementation of analytic energy gradients for CCSD/PT-based electronic methods. The working equations for these analytical gradients are presented in full detail, and their factorization and implementation into an efficient computer code are discussed.
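
    Schematically, the Lagrange-multiplier route to analytic gradients described above has the following structure (a compact LaTeX sketch under the usual assumptions; orbital-relaxation and overlap-derivative terms, and the full STEOM-CCSD working equations, are omitted):

        g(\chi, t) = 0                                          % amplitude equations
        L(\chi, t, \lambda) = E(\chi, t) + \lambda^{\mathsf T} g(\chi, t)
        \partial L / \partial t = 0 \;\Rightarrow\; \lambda     % perturbation-independent multipliers
        \frac{dE}{d\chi} = \frac{\partial L}{\partial \chi}
          = \sum_{pq} D^{\mathrm{eff}}_{pq}\,\frac{\partial h_{pq}}{\partial \chi}
          + \sum_{pqrs} \Gamma^{\mathrm{eff}}_{pqrs}\,\frac{\partial \langle pq \Vert rs \rangle}{\partial \chi} + \cdots

    Here D^eff and Gamma^eff are the effective one- and two-particle density matrices, which are contracted once per geometry so that each perturbation only requires derivative integrals.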

  20. Quantitative and Qualitative Relations between Motivation and Critical-Analytic Thinking

    ERIC Educational Resources Information Center

    Miele, David B.; Wigfield, Allan

    2014-01-01

    The authors examine two kinds of factors that affect students' motivation to engage in critical-analytic thinking. The first, which includes ability beliefs, achievement values, and achievement goal orientations, influences the "quantitative" relation between motivation and critical-analytic thinking; that is, whether students are…

  1. 14 CFR 34.82 - Sampling and analytical procedures for measuring smoke exhaust emissions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ..., DEPARTMENT OF TRANSPORTATION AIRCRAFT FUEL VENTING AND EXHAUST EMISSION REQUIREMENTS FOR TURBINE ENGINE POWERED AIRPLANES Test Procedures for Engine Smoke Emissions (Aircraft Gas Turbine Engines) § 34.82...

  2. Training the next generation analyst using red cell analytics

    NASA Astrophysics Data System (ADS)

    Graham, Meghan N.; Graham, Jacob L.

    2016-05-01

    We have seen significant change in the study and practice of human reasoning in recent years from both a theoretical and methodological perspective. Ubiquitous communication coupled with advances in computing and a plethora of analytic support tools have created a push for instantaneous reporting and analysis. This notion is particularly prevalent in law enforcement, emergency services and the intelligence community (IC), where commanders (and their civilian leadership) expect not only a bird's-eye view of operations as they occur, but a play-by-play analysis of operational effectiveness. This paper explores the use of Red Cell Analytics (RCA) as pedagogy to train the next-gen analyst. A group of Penn State students in the College of Information Sciences and Technology at the University Park campus of The Pennsylvania State University have been practicing Red Team Analysis since 2008. RCA draws heavily from the military application of the same concept, except that student RCA problems are typically non-military in nature. RCA students utilize a suite of analytic tools and methods to explore and develop red-cell tactics, techniques and procedures (TTPs), and apply their tradecraft across a broad threat spectrum, from student-life issues to threats to national security. The strength of RCA is not always realized by the solution but by the exploration of the analytic pathway. This paper describes the concept and use of red cell analytics to teach and promote the use of structured analytic techniques, analytic writing and critical thinking in the area of security and risk and intelligence training.

  3. Toxicologic evaluation of analytes from Tank 241-C-103

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahlum, D.D.; Young, J.Y.; Weller, R.E.

    1994-11-01

    Westinghouse Hanford Company requested PNL to assemble a toxicology review panel (TRP) to evaluate analytical data compiled by WHC and provide advice concerning potential health effects associated with exposure to tank-vapor constituents. The team's objectives would be to (1) review procedures used for sampling vapors from tanks, (2) identify constituents in tank-vapor samples that could be related to symptoms reported by workers, (3) evaluate the toxicological implications of those constituents by comparison to established toxicological databases, (4) provide advice for additional analytical efforts, and (5) support other activities as requested by WHC. The TRP represents a wide range of expertise, including toxicology, industrial hygiene, and occupational medicine. The TRP prepared a list of target analytes that chemists at the Oregon Graduate Institute/Sandia (OGI), Oak Ridge National Laboratory (ORNL), and PNL used to establish validated methods for quantitative analysis of head-space vapors from Tank 241-C-103. This list was used by the analytical laboratories to develop appropriate analytical methods for samples from Tank 241-C-103. Target compounds on the list included acetone, acetonitrile, ammonia, benzene, 1,3-butadiene, butanal, n-butanol, hexane, 2-hexanone, methylene chloride, nitric oxide, nitrogen dioxide, nitrous oxide, dodecane, tridecane, propanenitrile, sulfur oxide, tributyl phosphate, and vinylidene chloride. The TRP considered constituent concentrations, current exposure limits, reliability of data relative to toxicity, consistency of the analytical data, and whether the material was carcinogenic or teratogenic. A final consideration in the analyte selection process was to include representative chemicals for each class of compounds found.

  4. Analytic study of orbiter landing profiles

    NASA Technical Reports Server (NTRS)

    Walker, H. J.

    1981-01-01

    A broad survey of possible orbiter landing configurations was made with the specific goal of defining boundaries for the landing task. The results suggest that the center of the corridors between marginal and routine represents a more or less optimal preflare condition for regular operations. The various constraints used to define the boundaries are based largely on qualitative judgements from earlier flight experience with the X-15 and lifting-body research aircraft. The results should serve as useful background for expanding and validating landing simulation programs. The analytic approach offers a particular advantage in identifying trends due to the systematic variation of factors such as vehicle weight, load factor, approach speed, and aim point. Limitations, such as a constant load factor during the flare and a fixed gear-deployment time interval, can be removed by increasing the flexibility of the computer program. This analytic definition of the orbiter's landing profiles may suggest additional studies, including more configurations or more comparisons of landing profiles within and beyond the corridor boundaries.

  5. NHEXAS PHASE I REGION 5 STUDY--STANDARD OPERATING PROCEDURE--NHEXAS FILTER HANDLING, WEIGHING AND ARCHIVING PROCEDURES FOR AEROSOL SAMPLES (RTI/ACS-AP-209-011)

    EPA Science Inventory

    This protocol describes the procedures for weighing, handling, and archiving aerosol filters and for managing the associated analytical and quality assurance data. Filter samples were weighed for aerosol mass at RTI laboratory, with only the automated field sampling data transfer...

  6. Impact of Anatomical, Procedural, and Operator Skill Factors on the Success and Duration of Fluoroscopy-Guided Transjugular Intrahepatic Portosystemic Shunt

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marquardt, Steffen, E-mail: marquardt.steffen@mh-hannover.de; Rodt, Thomas, E-mail: rodt.thomas@mh-hannover.de; Rosenthal, Herbert, E-mail: rosenthal.herbert@mh-hannover.de

    Purpose: To assess the impact of anatomical, procedural, and operator skill factors on the success and duration of fluoroscopy-guided transjugular intrahepatic portosystemic shunt creation following a standard operating procedure (SOP). Materials and Methods: During a 32-month period, 102 patients underwent transjugular intrahepatic portosystemic shunt creation (TIPS) by two interventional radiologists (IR) following our institutional SOP based on fluoroscopy guidance. Both demographic and procedural data were assessed. The duration of the intervention (D_Int) and of the portal vein puncture (D_Punct) was analyzed depending on the skill level of the IR as well as anatomic and procedural factors. Results: In 99 of the 102 patients, successful TIPS without peri-procedural complications was performed. The mean D_Int (IR1: 77 min; IR2: 51 min) and the mean D_Punct (IR1: 19 min; IR2: 13 min) were significantly higher for TIPS performed by IR1 (2 years of clinical experience performing TIPS, n = 38) than by IR2 (>10 years of clinical experience performing TIPS, n = 61) (P < 0.005 for both, Mann-Whitney U test). D_Int showed a higher correlation with D_Punct for IR2 (R^2 = 0.63) than for IR1 (R^2 = 0.13). There was no significant difference in D_Punct for either IR with regard to the success of the wedged portography (P = 0.90), diameter of the portal vein (P = 0.60), central right portal vein length (P = 0.49), or liver function (MELD score before the TIPS procedure; P = 0.14). Conclusion: TIPS following an SOP is safe, fast, and reliable. The only significant factor for shorter D_Punct and D_Int was the clinical experience of the IR. Anatomic variability, successful portography, and liver function did not alter the duration or technical success of TIPS.

  7. Human Factors Process Task Analysis Liquid Oxygen Pump Acceptance Test Procedure for the Advanced Technology Development Center

    NASA Technical Reports Server (NTRS)

    Diorio, Kimberly A.

    2002-01-01

    A process task analysis effort was undertaken by Dynacs Inc. commencing in June 2002 under contract from NASA YA-D6. Funding was provided through NASA's Ames Research Center (ARC), Code M/HQ, and Industrial Engineering and Safety (IES). The John F. Kennedy Space Center (KSC) Engineering Development Contract (EDC) Task Order was 5SMA768. The scope of the effort was to conduct a Human Factors Process Failure Modes and Effects Analysis (HF PFMEA) of a hazardous activity and provide recommendations to eliminate or reduce the effects of errors caused by human factors. The Liquid Oxygen (LOX) Pump Acceptance Test Procedure (ATP) was selected for this analysis. The HF PFMEA table (see appendix A) provides an analysis of six major categories evaluated for this study. These categories include Personnel Certification, Test Procedure Format, Test Procedure Safety Controls, Test Article Data, Instrumentation, and Voice Communication. For each specific requirement listed in appendix A, the following topics were addressed: Requirement, Potential Human Error, Performance-Shaping Factors, Potential Effects of the Error, Barriers and Controls, Risk Priority Numbers, and Recommended Actions. This report summarizes findings and gives recommendations as determined by the data contained in appendix A. It also includes a discussion of technology barriers and challenges to performing task analyses, as well as lessons learned. The HF PFMEA table in appendix A recommends the use of accepted and required safety criteria in order to reduce the risk of human error. The items with the highest risk priority numbers should receive the greatest amount of consideration. Implementation of the recommendations will result in a safer operation for all personnel.
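
    For context, a Risk Priority Number in an FMEA-style analysis is conventionally the product of severity, occurrence, and detection ratings; the sketch below uses hypothetical ratings and item names, not those of appendix A.

        # Conventional FMEA risk priority number: RPN = severity x occurrence x detection,
        # each rated on a 1-10 scale (hypothetical ratings and items, not appendix A).
        items = {
            "Voice communication readback missed": (7, 4, 6),
            "Test article data transcription error": (5, 3, 4),
            "Instrumentation mis-ranged": (8, 2, 5),
        }
        rpn = {name: s * o * d for name, (s, o, d) in items.items()}
        for name, value in sorted(rpn.items(), key=lambda kv: kv[1], reverse=True):
            print(f"{value:4d}  {name}")       # highest RPN gets the greatest consideration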

  8. Glyoxal and methylglyoxal as urinary markers of diabetes. Determination using a dispersive liquid-liquid microextraction procedure combined with gas chromatography-mass spectrometry.

    PubMed

    Pastor-Belda, M; Fernández-García, A J; Campillo, N; Pérez-Cárceles, M D; Motas, M; Hernández-Córdoba, M; Viñas, P

    2017-08-04

    Glyoxal (GO) and methylglyoxal (MGO) are α-oxoaldehydes that can be used as urinary diabetes markers. In this study, their levels were measured using a sample preparation procedure based on salting-out assisted liquid-liquid extraction (SALLE) and dispersive liquid-liquid microextraction (DLLME) combined with gas chromatography-mass spectrometry (GC-MS). The derivatization reaction with 2,3-diaminonaphthalene, the addition of acetonitrile and sodium chloride to urine, and the DLLME step using the acetonitrile extract as dispersant solvent and carbon tetrachloride as extractant solvent were carefully optimized. Quantification was performed by the internal standard method, using 5-bromo-2-chloroanisole. The intraday and interday precisions were lower than 6%. Limits of detection were 0.12 and 0.06 ng/mL, and enrichment factors were 140 and 130 for GO and MGO, respectively. The concentrations of these α-oxoaldehydes in urine were between 0.9 and 35.8 ng/g (creatinine adjusted). A statistical comparison of the analyte contents of urine samples from non-diabetic and diabetic patients pointed to significant differences (P=0.046, 24 subjects investigated), particularly for MGO, which was higher in diabetic patients. The novelty of this study compared with previous procedures lies in the treatment of the urine sample by SALLE based on the addition of acetonitrile and sodium chloride to the urine. The DLLME procedure is performed with a sedimented drop of the extractant solvent, without a surfactant reagent, and using acetonitrile as dispersant solvent. Separation was performed by GC-MS, with the analytes unequivocally identified. The proposed procedure is the first microextraction method applied to the analysis of urine samples from diabetic and non-diabetic patients that allows a clear differentiation between both groups using a simple analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
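
    The quantification chain described above (internal-standard calibration followed by an enrichment-factor correction) can be sketched as follows; all calibration values, volumes, and the recovery are hypothetical, chosen only so that the enrichment factor lands near the reported 130-140 range.

        import numpy as np

        # Internal-standard calibration: analyte/IS peak-area ratio vs. standard concentration
        conc_std = np.array([0.1, 0.5, 1.0, 5.0, 10.0])        # ng/mL (hypothetical)
        area_ratio = np.array([0.021, 0.104, 0.198, 1.02, 1.99])
        slope, intercept = np.polyfit(conc_std, area_ratio, 1)

        sample_ratio = 0.75                                     # measured in a urine extract
        conc_in_extract = (sample_ratio - intercept) / slope    # ng/mL in the sedimented drop

        # Enrichment factor: analyte concentration in the DLLME extract relative to the
        # original urine, here approximated from the phase volumes and an assumed recovery
        v_urine, v_extract = 5.0, 0.035                         # mL (hypothetical)
        recovery = 0.95
        ef = recovery * v_urine / v_extract
        conc_in_urine = conc_in_extract / ef
        print(f"extract: {conc_in_extract:.2f} ng/mL  EF≈{ef:.0f}  urine: {conc_in_urine:.3f} ng/mL")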

  9. A behavior-analytic critique of Bandura's self-efficacy theory

    PubMed Central

    Biglan, Anthony

    1987-01-01

    A behavior-analytic critique of self-efficacy theory is presented. Self-efficacy theory asserts that efficacy expectations determine approach behavior and physiological arousal of phobics as well as numerous other clinically important behaviors. Evidence which is purported to support this assertion is reviewed. The evidence consists of correlations between self-efficacy ratings and other behaviors. Such response-response relationships do not unequivocally establish that one response causes another. A behavior-analytic alternative to self-efficacy theory explains these relationships in terms of environmental events. Correlations between self-efficacy rating behavior and other behavior may be due to the contingencies of reinforcement that establish a correspondence between such verbal predictions and the behavior to which they refer. Such a behavior-analytic account does not deny any of the empirical relationships presented in support of self-efficacy theory, but it points to environmental variables that could account for those relationships and that could be manipulated in the interest of developing more effective treatment procedures. PMID:22477956

  10. Analytical display design for flight tasks conducted under instrument meteorological conditions. [human factors engineering of pilot performance for display device design in instrument landing systems

    NASA Technical Reports Server (NTRS)

    Hess, R. A.

    1976-01-01

    Paramount to proper utilization of electronic displays is a method for determining pilot-centered display requirements. Display design should be viewed fundamentally as a guidance and control problem which has interactions with the designer's knowledge of human psychomotor activity. From this standpoint, reliable analytical models of human pilots as information processors and controllers can provide valuable insight into the display design process. A relatively straightforward, nearly algorithmic procedure for deriving model-based, pilot-centered display requirements was developed and is presented. The optimal or control theoretic pilot model serves as the backbone of the design methodology, which is specifically directed toward the synthesis of head-down, electronic, cockpit display formats. Some novel applications of the optimal pilot model are discussed. An analytical design example is offered which defines a format for the electronic display to be used in a UH-1H helicopter in a landing approach task involving longitudinal and lateral degrees of freedom.

  11. Nine-analyte detection using an array-based biosensor

    NASA Technical Reports Server (NTRS)

    Taitt, Chris Rowe; Anderson, George P.; Lingerfelt, Brian M.; Feldstein, Mark J.; Ligler, Frances S.

    2002-01-01

    A fluorescence-based multianalyte immunosensor has been developed for simultaneous analysis of multiple samples. While the standard 6 x 6 format of the array sensor has been used to analyze six samples for six different analytes, this same format has the potential to allow a single sample to be tested for 36 different agents. The method described herein demonstrates proof of principle that the number of analytes detectable using a single array can be increased simply by using complementary mixtures of capture and tracer antibodies. Mixtures were optimized to allow detection of closely related analytes without significant cross-reactivity. Following this facile modification of patterning and assay procedures, the following nine targets could be detected in a single 3 x 3 array: Staphylococcal enterotoxin B, ricin, cholera toxin, Bacillus anthracis Sterne, Bacillus globigii, Francisella tularensis LVS, Yersinia pestis F1 antigen, MS2 coliphage, and Salmonella typhimurium. This work maximizes the efficiency and utility of the described array technology, increasing only reagent usage and cost; production and fabrication costs are not affected.
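
    The combinatorial addressing idea, in which complementary capture and tracer mixtures let a 3 x 3 array resolve nine targets, can be illustrated with a toy assignment (the actual antibody groupings used in the study are not reproduced here):

        # Toy illustration: 9 targets addressed by 3 capture mixes (rows) x 3 tracer mixes (cols).
        targets = ["SEB", "ricin", "cholera toxin",
                   "B. anthracis Sterne", "B. globigii", "F. tularensis LVS",
                   "Y. pestis F1", "MS2 coliphage", "S. typhimurium"]

        # Hypothetical assignment: analyte k lives at (row k // 3, col k % 3)
        capture_mix = {r: [t for k, t in enumerate(targets) if k // 3 == r] for r in range(3)}
        tracer_mix = {c: [t for k, t in enumerate(targets) if k % 3 == c] for c in range(3)}

        def spot_identity(row, col):
            """A signal at (row, col) implicates the single analyte present in both mixes."""
            (common,) = set(capture_mix[row]) & set(tracer_mix[col])
            return common

        print(spot_identity(2, 1))   # -> MS2 coliphage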

  12. Analytical procedures for environmental quality control. Volume 2. Technical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, L.K.; Wang, M.H.S.

    1989-01-15

    This report covers sixteen important documents. Some examples are: The determination of the maximum total trihalomethane potential; Nationwide approval of alternative test procedure for analysis of trihalomethanes; Volatile organic compounds in water by purge and trap capillary column gas chromatography with photoionization and electrolytic conductivity detectors in series; Analysis of organohalide pesticides and Aroclors in drinking water by microextraction and gas chromatography; Testing for lead in school drinking water; Simplified methods for food and feed testing; Determination of nitroaromatic compounds and isophorone in industrial and municipal wastewaters; Sampling for Giardia and/or Cryptosporidium; Determination of TCDD in industrial and municipal wastewaters; Determination of volatile organics in industrial and municipal wastewaters; Determination of polynuclear aromatic hydrocarbons in industrial and municipal wastewaters.

  13. Developing automated analytical methods for scientific environments using LabVIEW.

    PubMed

    Wagner, Christoph; Armenta, Sergio; Lendl, Bernhard

    2010-01-15

    The development of new analytical techniques often requires the building of specially designed devices, each requiring its own dedicated control software. Especially in the research and development phase, LabVIEW has proven to be one highly useful tool for developing this software. Yet, it is still common practice to develop individual solutions for different instruments. In contrast to this, we present here a single LabVIEW-based program that can be directly applied to various analytical tasks without having to change the program code. Driven by a set of simple script commands, it can control a whole range of instruments, from valves and pumps to full-scale spectrometers. Fluid sample (pre-)treatment and separation procedures can thus be flexibly coupled to a wide range of analytical detection methods. Here, the capabilities of the program have been demonstrated by using it for the control of both a sequential injection analysis - capillary electrophoresis (SIA-CE) system with UV detection, and an analytical setup for studying the inhibition of enzymatic reactions using a SIA system with FTIR detection.
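
    The script-driven design can be mimicked outside LabVIEW; the toy Python dispatcher below (hypothetical commands and driver functions) only illustrates the idea of keeping the analytical procedure as data while the control program stays fixed.

        # Toy analogue of a script-driven instrument controller: the procedure is data,
        # the program is a fixed interpreter. Commands and drivers are hypothetical.
        def set_valve(position): print(f"valve -> {position}")
        def pump(volume_ul, rate_ul_s): print(f"aspirate {volume_ul} uL at {rate_ul_s} uL/s")
        def acquire(detector, seconds): print(f"acquire {detector} for {seconds} s")

        COMMANDS = {"VALVE": set_valve, "PUMP": pump, "ACQUIRE": acquire}

        script = [
            ("VALVE", "sample"),
            ("PUMP", 250, 10),
            ("VALVE", "detector"),
            ("ACQUIRE", "UV", 30),
        ]

        for name, *args in script:
            COMMANDS[name](*args)       # same interpreter, any analytical task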

  14. Developing Guidelines for Assessing Visual Analytics Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean

    2011-07-01

    In this paper, we develop guidelines for evaluating visual analytic environments based on a synthesis of reviews of the entries to the 2009 Visual Analytics Science and Technology (VAST) Symposium Challenge and on a user study with professional intelligence analysts. By analyzing the 2009 VAST Challenge reviews we gained a better understanding of what is important to our reviewers, both visualization researchers and professional analysts. We also report on a small user study with professional analysts to determine the important factors that they use in evaluating visual analysis systems. We then looked at guidelines developed by researchers in various domains and synthesized these into an initial set for use by others in the community. In a second part of the user study, we looked at guidelines for a new aspect of visual analytic systems: the generation of reports. Future visual analytic systems have been challenged to help analysts generate their reports. In our study we worked with analysts to understand the criteria they used to evaluate the quality of analytic reports. We propose that this knowledge will be useful as researchers look at systems to automate some of the report generation. Based on these efforts, we produced some initial guidelines for evaluating visual analytic environments and for evaluating analytic reports. It is important to understand that these guidelines are initial drafts and are limited in scope because of the type of tasks for which the visual analytic systems used in the studies in this paper were designed. More research and refinement is needed by the Visual Analytics Community to provide additional evaluation guidelines for different types of visual analytic environments.

  15. General Analytical Procedure for Determination of Acidity Parameters of Weak Acids and Bases

    PubMed Central

    Pilarski, Bogusław; Kaliszan, Roman; Wyrzykowski, Dariusz; Młodzianowski, Janusz; Balińska, Agata

    2015-01-01

    The paper presents a new convenient, inexpensive, and reagent-saving general methodology for the determination of pKa values for components of mixtures of weak organic acids and bases from diverse chemical classes in aqueous solution, without the need to separate the individual analytes. The data obtained from simple pH-metric microtitrations are numerically processed into reliable pKa values for each component of the mixture. Excellent agreement has been obtained between the determined pKa values and the reference literature data for the compounds studied. PMID:25692072

  16. General analytical procedure for determination of acidity parameters of weak acids and bases.

    PubMed

    Pilarski, Bogusław; Kaliszan, Roman; Wyrzykowski, Dariusz; Młodzianowski, Janusz; Balińska, Agata

    2015-01-01

    The paper presents a new convenient, inexpensive, and reagent-saving general methodology for the determination of pKa values for components of mixtures of weak organic acids and bases from diverse chemical classes in aqueous solution, without the need to separate the individual analytes. The data obtained from simple pH-metric microtitrations are numerically processed into reliable pKa values for each component of the mixture. Excellent agreement has been obtained between the determined pKa values and the reference literature data for the compounds studied.
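
    For a single monoprotic acid the underlying relation reduces to the Henderson-Hasselbalch equation, and a pKa can be fitted directly from titration points; a minimal sketch with hypothetical data is given below. The methodology in the paper treats multi-component mixtures numerically, which this sketch does not attempt.

        import numpy as np
        from scipy.optimize import curve_fit

        def hh_ph(frac_neutralized, pka):
            """Henderson-Hasselbalch: pH = pKa + log10(alpha / (1 - alpha))."""
            a = np.clip(frac_neutralized, 1e-6, 1 - 1e-6)
            return pka + np.log10(a / (1.0 - a))

        # Hypothetical microtitration of a single weak acid (equivalence volume 1.00 mL)
        v_titrant = np.array([0.10, 0.25, 0.40, 0.50, 0.60, 0.75, 0.90])   # mL
        ph_meas = np.array([3.82, 4.29, 4.60, 4.78, 4.95, 5.26, 5.72])
        alpha = v_titrant / 1.00                        # fraction neutralized

        (pka_fit,), _ = curve_fit(hh_ph, alpha, ph_meas, p0=[5.0])
        print(f"fitted pKa ≈ {pka_fit:.2f}")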

  17. Violent Video Game Effects on Aggression, Empathy, and Prosocial Behavior in Eastern and Western Countries: A Meta-Analytic Review

    ERIC Educational Resources Information Center

    Anderson, Craig A.; Shibuya, Akiko; Ihori, Nobuko; Swing, Edward L.; Bushman, Brad J.; Sakamoto, Akira; Rothstein, Hannah R.; Saleem, Muniba

    2010-01-01

    Meta-analytic procedures were used to test the effects of violent video games on aggressive behavior, aggressive cognition, aggressive affect, physiological arousal, empathy/desensitization, and prosocial behavior. Unique features of this meta-analytic review include (a) more restrictive methodological quality inclusion criteria than in past…

  18. Analytical and experimental investigation of a 1/8-scale dynamic model of the shuttle orbiter. Volume 2: Technical report

    NASA Technical Reports Server (NTRS)

    Mason, P. W.; Harris, H. G.; Zalesak, J.; Bernstein, M.

    1974-01-01

    The methods and procedures used in the analysis and testing of the scale model are reported together with the correlation of the analytical and experimental results. The model, the NASTRAN finite element analysis, and results are discussed. Tests and analytical investigations are also reported.

  19. Analytical studies of the Space Shuttle orbiter nose-gear tire

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Tanner, John A.; Peters, Jeanne M.; Robinson, Martha P.

    1991-01-01

    A computational procedure is presented for evaluating the analytic sensitivity derivatives of the tire response with respect to material and geometrical properties of the tire. The tire is modeled by using a two-dimensional laminated anisotropic shell theory with the effects of variation in material and geometric parameters included. The computational procedure is applied to the case of the Space Shuttle orbiter nose-gear tire subjected to uniform inflation pressure. Numerical results are presented which show the sensitivity of the different tire response quantities to variations in the material characteristics of both the cord and rubber.

  20. Challenges in Modern Anti-Doping Analytical Science.

    PubMed

    Ayotte, Christiane; Miller, John; Thevis, Mario

    2017-01-01

    The challenges facing modern anti-doping analytical science are increasingly complex given the expansion of target drug substances, as the pharmaceutical industry introduces more novel therapeutic compounds and the internet offers designer drugs to improve performance. The technical challenges are manifold, including, for example, the need for advanced instrumentation for greater speed of analyses and increased sensitivity, specific techniques capable of distinguishing between endogenous and exogenous metabolites, or biological assays for the detection of peptide hormones or their markers, all of which require an important investment from the laboratories and recruitment of highly specialized scientific personnel. The introduction of sophisticated and complex analytical procedures may in the future result in a change in the strategy applied by the World Anti-Doping Agency in relation to the introduction and performance of new techniques by the network of accredited anti-doping laboratories. © 2017 S. Karger AG, Basel.

  1. Classifying Correlation Matrices into Relatively Homogeneous Subgroups: A Cluster Analytic Approach

    ERIC Educational Resources Information Center

    Cheung, Mike W.-L.; Chan, Wai

    2005-01-01

    Researchers are becoming interested in combining meta-analytic techniques and structural equation modeling to test theoretical models from a pool of studies. Most existing procedures are based on the assumption that all correlation matrices are homogeneous. Few studies have addressed what the next step should be when studies being analyzed are…

  2. An Analytic Hierarchy Process-based Method to Rank the Critical Success Factors of Implementing a Pharmacy Barcode System.

    PubMed

    Alharthi, Hana; Sultana, Nahid; Al-Amoudi, Amjaad; Basudan, Afrah

    2015-01-01

    Pharmacy barcode scanning is used to reduce errors during the medication dispensing process. However, this technology has rarely been used in hospital pharmacies in Saudi Arabia. This article describes the barriers to successful implementation of a barcode scanning system in Saudi Arabia. A literature review was conducted to identify the relevant critical success factors (CSFs) for a successful dispensing barcode system implementation. Twenty-eight pharmacists from a local hospital in Saudi Arabia were interviewed to obtain their perception of these CSFs. In this study, planning (process flow issues and training requirements), resistance (fear of change, communication issues, and negative perceptions about technology), and technology (software, hardware, and vendor support) were identified as the main barriers. The analytic hierarchy process (AHP), one of the most widely used tools for decision making in the presence of multiple criteria, was used to compare and rank these identified CSFs. The results of this study suggest that resistance barriers have a greater impact than planning and technology barriers. In particular, fear of change is the most critical factor, and training is the least critical factor.
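
    As an illustration of the AHP step described above, the following Python sketch derives priority weights for three hypothetical barrier groups from a pairwise-comparison matrix via its principal eigenvector and checks Saaty's consistency ratio. The judgments in the matrix are invented and do not reproduce the study's data.

        # Core AHP step: priority weights from a pairwise-comparison matrix.
        # The comparison judgments below are invented for illustration.
        import numpy as np

        # A[i, j] = how much more important criterion i is than criterion j (Saaty 1-9 scale)
        A = np.array([
            [1.0, 3.0, 5.0],     # resistance vs. planning, technology
            [1/3., 1.0, 2.0],    # planning
            [1/5., 1/2., 1.0],   # technology
        ])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)                 # index of the principal eigenvalue
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                                # normalized priority weights

        # Consistency ratio: CR = ((lambda_max - n)/(n - 1)) / RI, with RI = 0.58 for n = 3
        lambda_max, n = eigvals.real[k], A.shape[0]
        cr = ((lambda_max - n) / (n - 1)) / 0.58
        print("weights:", np.round(w, 3), "CR:", round(cr, 3))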

  3. Determination of immersion factors for radiance sensors in marine and inland waters: a semi-analytical approach using refractive index approximation

    NASA Astrophysics Data System (ADS)

    Dev, Pravin J.; Shanmugam, P.

    2016-05-01

    Underwater radiometers are generally calibrated in air using a standard source. Immersion factors are required for these radiometers to account for the change in in-water measurements relative to in-air measurements caused by the different refractive index of the medium. The immersion factors previously determined for the RAMSES series of commercial radiometers manufactured by TriOS are applicable to clear oceanic waters. In typical inland and turbid productive coastal waters, these experimentally determined immersion factors yield significantly large errors in water-leaving radiances (Lw) and hence remote sensing reflectances (Rrs). To overcome this limitation, a semi-analytical method based on the refractive index approximation is proposed in this study, with the aim of obtaining reliable Lw and Rrs from RAMSES radiometers for turbid and productive waters in coastal and inland environments. We also briefly show the effect of pure-water immersion factors (Ifw) and the newly derived If on Lw and Rrs for clear and turbid waters. Remaining issues beyond the immersion factor coefficients, such as transmission and the air-water and water-air Fresnel reflectances, are also discussed.
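
    The abstract refers to the Fresnel reflectances at the air-water interface. As a point of reference only (this is not the authors' semi-analytical method), the short Python sketch below evaluates the standard normal-incidence Fresnel reflectance and the corresponding transmittance for an assumed water refractive index.

        # Normal-incidence Fresnel reflectance for a water-air interface.
        # The refractive indices are typical assumed values, not study parameters.
        n_air, n_water = 1.0, 1.34

        def fresnel_reflectance(n1, n2):
            """Normal-incidence Fresnel reflectance between media of index n1 and n2."""
            return ((n1 - n2) / (n1 + n2)) ** 2

        rho = fresnel_reflectance(n_water, n_air)   # water-to-air reflectance
        t = 1.0 - rho                               # transmittance at the interface
        print(f"reflectance = {rho:.4f}, transmittance = {t:.4f}")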

  4. Factors that influence length of stay for in-patient gynaecology surgery: is the Case Mix Group (CMG) or type of procedure more important?

    PubMed

    Carey, Mark S; Victory, Rahi; Stitt, Larry; Tsang, Nicole

    2006-02-01

    To compare the association between the Case Mix Group (CMG) code and length of stay (LOS) with the association between the type of procedure and LOS in patients admitted for gynaecology surgery. We examined the records of women admitted for surgery in CMG 579 (major uterine/adnexal procedure, no malignancy) or 577 (major surgery ovary/adnexa with malignancy) between April 1997 and March 1999. Factors thought to influence LOS included age, weight, American Society of Anesthesiologists (ASA) score, physician, day of the week on which surgery was performed, and procedure type. Procedures were divided into six categories, four for CMG 579 and two for CMG 577. Data were abstracted from the hospital information costing system (T2 system) and by retrospective chart review. Multivariable analysis was performed using linear regression with backwards elimination. There were 606 patients in CMG 579 and 101 patients in CMG 577, and the corresponding median LOS was four days (range 1-19) for CMG 579 and nine days (range 3-30) for CMG 577. Combined analysis of both CMGs 577 and 579 revealed the following factors as highly significant determinants of LOS: procedure, age, physician, and ASA score. Although confounded by procedure type, the CMG did not significantly account for differences in LOS in the model if procedure was considered. Pairwise comparisons of procedure categories were all found to be statistically significant, even when controlled for other important variables. The type of procedure better accounts for differences in LOS by describing six statistically distinct procedure groups rather than the traditional two CMGs. It is reasonable therefore to consider changing the current CMG codes for gynaecology to a classification based on the type of procedure.

  5. Rapid analytical procedure for determination of mineral oils in edible oil by GC-FID.

    PubMed

    Wrona, Magdalena; Pezo, Davinson; Nerin, Cristina

    2013-12-15

    A procedure for the determination of mineral oils in edible oil has been fully developed. The procedure consists of using a sulphuric acid-impregnated silica gel (SAISG) glass column to eliminate the fat matter. Chemical combustion of the fatty acids takes place, while the mineral oils are not affected by the sulphuric acid. The column is eluted with hexane using a vacuum pump, and the final extract is concentrated and analysed by gas chromatography (GC) with flame ionisation detection (FID). The detection limit (LOD) and the quantification limit (LOQ) in hexane were 0.07 and 0.21 μg g(-1), respectively, and the LOQ in vegetable oil was 1 μg g(-1). Only a few minutes of sample treatment were necessary to obtain a clean extract. The efficiency of the process, measured through the recoveries from spiked samples of edible oil, was higher than 95%. The procedure has been applied to determine mineral oil in olive oil from the retail market. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. Validating Analytical Protocols to Determine Selected Pesticides and PCBs Using Routine Samples.

    PubMed

    Pindado Jiménez, Oscar; García Alonso, Susana; Pérez Pastor, Rosa María

    2017-01-01

    This study aims to provide recommendations concerning the validation of analytical protocols by using routine samples, and is intended as a case study on how to validate analytical methods in different environmental matrices. In order to analyze the selected compounds (pesticides and polychlorinated biphenyls) in two different environmental matrices, two analytical procedures based on GC-MS were developed and validated. The validation of the two protocols is described through the analysis of more than 30 samples of water and sediments collected over nine months. The present work also assesses the uncertainty associated with both analytical protocols. The uncertainty for the water samples was estimated through a conventional approach, whereas for the sediment matrix the estimation of proportional/constant bias was also included because of its inhomogeneity. Results for the sediment matrix are reliable, showing an analytical variability of 25-35% under intermediate conditions. The analytical methodology for the water matrix determines the selected compounds with acceptable recoveries, and the combined uncertainty ranges between 20 and 30%. The analysis of routine samples is rarely used to assess the trueness of novel analytical methods, and until now this approach had not been applied to organochlorine compounds in environmental matrices.

  7. Analytical sensor redundancy assessment

    NASA Technical Reports Server (NTRS)

    Mulcare, D. B.; Downing, L. E.; Smith, M. K.

    1988-01-01

    The rationale and mechanization of sensor fault tolerance based on analytical redundancy principles are described. The concept involves the substitution of software procedures, such as an observer algorithm, to supplant additional hardware components. The observer synthesizes values of sensor states in lieu of their direct measurement. Such information can then be used, for example, to determine which of two disagreeing sensors is more correct, thus enhancing sensor fault survivability. Here a stability augmentation system is used as an example application, with required modifications being made to a quadruplex digital flight control system. The impact on software structure and the resultant revalidation effort are illustrated as well. Also, the use of an observer algorithm for wind gust filtering of the angle-of-attack sensor signal is presented.
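
    The observer-based analytical redundancy idea can be sketched compactly. The Python example below implements a generic discrete-time Luenberger observer that reconstructs an unmeasured state from a model and a single sensor; the system matrices and observer gain are toy values and are not taken from the flight control system discussed in the report.

        # Generic discrete-time Luenberger observer (toy system, assumed gain).
        import numpy as np

        A = np.array([[1.0, 0.1], [0.0, 0.98]])   # toy plant dynamics
        B = np.array([[0.0], [0.1]])
        C = np.array([[1.0, 0.0]])                # only the first state is measured
        L = np.array([[0.5], [0.8]])              # observer gain (assumed pre-designed)

        x = np.array([[1.0], [0.0]])              # true state (unknown to the observer)
        x_hat = np.zeros((2, 1))                  # observer estimate

        for k in range(50):
            u = np.array([[0.1]])                 # known control input
            y = C @ x                             # sensor measurement
            # Correct the model prediction with the measurement residual
            x_hat = A @ x_hat + B @ u + L @ (y - C @ x_hat)
            x = A @ x + B @ u                     # propagate the true plant

        print("estimation error:", np.round((x - x_hat).ravel(), 4))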

  8. [DEONTOLOGICAL QUESTIONS IN PROPHYLACTIC OF ENDOSCOPIC COMPLICATIONS: THE SIGNIFICANCE OF RATIONAL AND PSYCHOLOGICAL FACTORS (analytical overview)].

    PubMed

    Vernik, N V; Ivantsova, M A; Yashin, D I

    2015-01-01

    To evaluate ways of reducing complications during endoscopic procedures based on the principles of professional ethics and on improving the quality of the working environment. Data were drawn from the fundamental literature, evidence-based medicine, scientific publications, and internet portals. Deontology is a fundamental principle of medical practice and one of the main factors of professional effectiveness. Complications in endoscopy are often the consequence of deviations from deontological principles. A whole range of psychological factors influences the professional activity of endoscopists, among which the emotional "burn-out" syndrome (EBS) occupies one of the main places. Prevention and timely relief of EBS improve the quality of practical work. Creating a favorable working environment is a strategically important task in the prevention of endoscopic complications. The practical realization of deontological principles in endoscopy remains a subject for further discussion.

  9. Development of electrical test procedures for qualification of spacecraft against EID. Volume 2: Review and specification of test procedures

    NASA Technical Reports Server (NTRS)

    Wilkenfeld, J. M.; Harlacher, B. L.; Mathews, D.

    1982-01-01

    A combined experimental and analytical program to develop system electrical test procedures for the qualification of spacecraft against damage produced by space-electron-induced discharges (EID) occurring on spacecraft dielectric outer surfaces is described. A review and critical evaluation of possible approaches to qualify spacecraft against EID is presented. A variety of possible schemes to simulate EID electromagnetic effects produced in spacecraft was studied. These techniques form the principal element of a provisional, recommended set of test procedures for the EID qualification of spacecraft. Significant gaps in our knowledge about EID that impact the final specification of an electrical test to qualify spacecraft against EID are also identified.

  10. Do the Critical Success Factors from Learning Analytics Predict Student Outcomes?

    ERIC Educational Resources Information Center

    Strang, Kenneth David

    2016-01-01

    This article starts with a detailed literature review of recent studies that focused on using learning analytics software or learning management system data to determine the nature of any relationships between online student activity and their academic outcomes within university-level business courses. The article then describes how data was…

  11. Evaluation of Factors Influencing Accuracy of Principal Procedure Coding Based on ICD-9-CM: An Iranian Study

    PubMed Central

    Farzandipour, Mehrdad; Sheikhtaheri, Abbas

    2009-01-01

    To evaluate the accuracy of procedural coding and the factors that influence it, 246 records were randomly selected from four teaching hospitals in Kashan, Iran. “Recodes” were assigned blindly and then compared to the original codes. Furthermore, the coders' professional behaviors were carefully observed during the coding process. Coding errors were classified as major or minor. The relationships between coding accuracy and potentially influential factors were analyzed by χ2 or Fisher exact tests, as well as the odds ratio (OR) and the 95 percent confidence interval for the OR. The results showed that using a tabular index for rechecking codes reduces errors (83 percent vs. 72 percent accuracy). Further, more thorough documentation by the clinician positively affected coding accuracy, though this relation was not significant. Readability of records decreased errors overall (p = .003), including major ones (p = .012). Moreover, records with no abbreviations had fewer major errors (p = .021). In conclusion, not using abbreviations, ensuring more readable documentation, and paying more attention to available information increased coding accuracy and the quality of procedure databases. PMID:19471647
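
    For readers who want the arithmetic behind the reported odds ratios, the Python sketch below computes an OR and its 95 percent confidence interval for a hypothetical 2x2 table (readable vs. non-readable records against presence of a major coding error). The counts are illustrative and are not the study's data.

        # Odds ratio with a Wald 95% confidence interval for a hypothetical 2x2 table.
        import math

        a, b = 20, 80     # group 1: with error / without error (invented counts)
        c, d = 45, 101    # group 2: with error / without error (invented counts)

        or_ = (a * d) / (b * c)
        se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
        lo = math.exp(math.log(or_) - 1.96 * se_log_or)
        hi = math.exp(math.log(or_) + 1.96 * se_log_or)
        print(f"OR = {or_:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")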

  12. Impact of Pre-analytic Blood Sample Collection Factors on Metabolomics.

    PubMed

    Townsend, Mary K; Bao, Ying; Poole, Elizabeth M; Bertrand, Kimberly A; Kraft, Peter; Wolpin, Brian M; Clish, Clary B; Tworoger, Shelley S

    2016-05-01

    Many epidemiologic studies are using metabolomics to discover markers of carcinogenesis. However, limited data are available on the influence of pre-analytic blood collection factors on metabolite measurement. We quantified 166 metabolites in archived plasma from 423 Health Professionals Follow-up Study and Nurses' Health Study participants using liquid chromatography-tandem mass spectrometry (LC-MS). We compared multivariable-adjusted geometric mean metabolite LC-MS peak areas across fasting time, season of blood collection, and time of day of blood collection categories. The majority of metabolites (160 of 166 metabolites) had geometric mean peak areas that were within 15% comparing samples donated after fasting 9 to 12 versus ≥13 hours; greater differences were observed in samples donated after fasting ≤4 hours. Metabolite peak areas generally were similar across season of blood collection, although levels of certain metabolites (e.g., bile acids and purines/pyrimidines) tended to be different in the summer versus winter months. After adjusting for fasting status, geometric mean peak areas for bile acids and vitamins, but not other metabolites, differed by time of day of blood collection. Fasting, season of blood collection, and time of day of blood collection were not important sources of variability in measurements of most metabolites in our study. However, considering blood collection variables in the design or analysis of studies may be important for certain specific metabolites, particularly bile acids, purines/pyrimidines, and vitamins. These results may be useful for investigators formulating analysis plans for epidemiologic metabolomics studies, including determining which metabolites to a priori exclude from analyses. Cancer Epidemiol Biomarkers Prev; 25(5); 823-9. ©2016 AACR. ©2016 American Association for Cancer Research.

  13. Learning procedures from interactive natural language instructions

    NASA Technical Reports Server (NTRS)

    Huffman, Scott B.; Laird, John E.

    1994-01-01

    Despite its ubiquity in human learning, very little work has been done in artificial intelligence on agents that learn from interactive natural language instructions. In this paper, the problem of learning procedures from interactive, situated instruction is examined in which the student is attempting to perform tasks within the instructional domain, and asks for instruction when it is needed. Presented is Instructo-Soar, a system that behaves and learns in response to interactive natural language instructions. Instructo-Soar learns completely new procedures from sequences of instruction, and also learns how to extend its knowledge of previously known procedures to new situations. These learning tasks require both inductive and analytic learning. Instructo-Soar exhibits a multiple execution learning process in which initial learning has a rote, episodic flavor, and later executions allow the initially learned knowledge to be generalized properly.

  14. Seamless Digital Environment – Plan for Data Analytics Use Case Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oxstrand, Johanna Helene; Bly, Aaron Douglas

    The U.S. Department of Energy Light Water Reactor Sustainability (LWRS) Program initiated research into what is needed in order to provide a roadmap or model for nuclear power plants to reference when building an architecture that can support the growing data supply and demand flowing through their networks. The Digital Architecture project's published report, Digital Architecture Planning Model (Oxstrand et al., 2016), discusses things to consider when building an architecture to support the increasing needs and demands for data throughout the plant. Once the plant is able to support the data demands, it still needs to be able to provide the data in an easy, quick, and reliable manner. A common method is to create a "one stop shop" application that a user can go to for all the data they need. This leads to the need to create a Seamless Digital Environment (SDE) to integrate all the "siloed" data. An SDE is the desired perception that should be presented to users by gathering the data from any data source (e.g., legacy applications and work management systems) without effort by the user. The goal for FY16 was to complete a feasibility study for data mining and analytics, employing information from computer-based-procedure-enabled technologies for use in developing improved business analytics. The research team collaborated with multiple organizations to identify use cases or scenarios which could be beneficial to investigate in a feasibility study. Many interesting potential use cases were identified throughout the FY16 activity. Unfortunately, due to factors outside the research team's control, none of the studies were initiated this year. However, the insights gained and the relationships built with both PVNGS and NextAxiom will be valuable when moving forward with future research. During the 2016 annual Nuclear Information Technology Strategic Leadership (NITSL) group meeting it was identified that it would be very beneficial to the

  15. Holistic rubric vs. analytic rubric for measuring clinical performance levels in medical students.

    PubMed

    Yune, So Jung; Lee, Sang Yeoup; Im, Sun Ju; Kam, Bee Sung; Baek, Sun Yong

    2018-06-05

    Task-specific checklists, holistic rubrics, and analytic rubrics are often used for performance assessments. We examined what factors evaluators consider important in holistic scoring of clinical performance assessment, and compared the usefulness of applying holistic and analytic rubrics respectively, and analytic rubrics in addition to task-specific checklists based on traditional standards. We compared the usefulness of a holistic rubric versus an analytic rubric in effectively measuring the clinical skill performances of 126 third-year medical students who participated in a clinical performance assessment conducted by Pusan National University School of Medicine. We conducted a questionnaire survey of 37 evaluators who used all three evaluation methods-holistic rubric, analytic rubric, and task-specific checklist-for each student. The relationship between the scores on the three evaluation methods was analyzed using Pearson's correlation. Inter-rater agreement was analyzed by Kappa index. The effect of holistic and analytic rubric scores on the task-specific checklist score was analyzed using multiple regression analysis. Evaluators perceived accuracy and proficiency to be major factors in objective structured clinical examinations evaluation, and history taking and physical examination to be major factors in clinical performance examinations evaluation. Holistic rubric scores were highly related to the scores of the task-specific checklist and analytic rubric. Relatively low agreement was found in clinical performance examinations compared to objective structured clinical examinations. Meanwhile, the holistic and analytic rubric scores explained 59.1% of the task-specific checklist score in objective structured clinical examinations and 51.6% in clinical performance examinations. The results show the usefulness of holistic and analytic rubrics in clinical performance assessment, which can be used in conjunction with task-specific checklists for more efficient

  16. Fused Deposition Modeling 3D Printing for (Bio)analytical Device Fabrication: Procedures, Materials, and Applications

    PubMed Central

    2017-01-01

    In this work, the use of fused deposition modeling (FDM) in a (bio)analytical/lab-on-a-chip research laboratory is described. First, the specifications of this 3D printing method that are important for the fabrication of (micro)devices were characterized for a benchtop FDM 3D printer. These include resolution, surface roughness, leakage, transparency, material deformation, and the possibilities for integration of other materials. Next, the autofluorescence, solvent compatibility, and biocompatibility of 12 representative FDM materials were tested and evaluated. Finally, we demonstrate the feasibility of FDM in a number of important applications. In particular, we consider the fabrication of fluidic channels, masters for polymer replication, and tools for the production of paper microfluidic devices. This work thus provides a guideline for (i) the use of FDM technology by addressing its possibilities and current limitations, (ii) material selection for FDM, based on solvent compatibility and biocompatibility, and (iii) application of FDM technology to (bio)analytical research by demonstrating a broad range of illustrative examples. PMID:28628294

  17. Analyzing the factors that influencing the success of post graduates in achieving graduate on time (GOT) using analytic hierarchy process (AHP)

    NASA Astrophysics Data System (ADS)

    Chin, Wan Yung; Ch'ng, Chee Keong; Jamil, Jastini Mohd.; Shaharanee, Izwan Nizal Mohd.

    2017-11-01

    In the globalization era, education plays an important role in educating and preparing individuals to face the demands and challenges of the 21st century. This has contributed to an increase in the number of individuals pursuing studies in Doctor of Philosophy (Ph.D.) programs. However, the ability of Ph.D. students to meet the four-year Graduate on Time (GOT) target stipulated by the university has become a major concern of students, institutions, and government. Therefore, the main objective of this study is to investigate the factors that influence Ph.D. students at Universiti Utara Malaysia (UUM) in achieving GOT. Through a review of previous research, six factors (student, financial, supervisor, skills, project, and institution factors) were identified as the domain factors that influence Ph.D. students in achieving GOT. The level of importance of each factor is ranked by experts from three graduate schools using the Analytic Hierarchy Process (AHP) technique. This study makes a significant contribution to the understanding of the factors affecting Ph.D. students at UUM in achieving GOT. In addition, this study can also assist the university in planning and supporting Ph.D. students to accomplish GOT in the future.

  18. [Perioperative complications after total hip revision surgery and their predictive factors. A series of 181 consecutive procedures].

    PubMed

    de Thomasson, E; Guingand, O; Terracher, R; Mazel, C

    2001-09-01

    We conducted a retrospective study to assess morbidity and mortality in patients undergoing revision total hip arthroplasty (THA). Perioperative complications were recorded in 181 revision procedures (162 patients) performed between January 1995 and March 1999 (117 bipolar revisions and 64 isolated acetabular revisions). There were 86 complications (68 patients) leading to 21 new revisions. About half (50/86) were related to the surgical procedure (dislocation, femoral fracture, infection). Life-threatening complications (3.6%) ended in patient death in 1.6% of the cases. Complications were more frequent in patients with an ASA score of 3 (p<0.01) or aged over 75 years (p<0.05). Age was also predictive of femoral misalignment and fracture (p<0.05). Dislocations (8.8%) were observed more frequently in patients who had undergone more than 2 procedures prior to the revision (p<0.05) (4.8% of the dislocations in patients undergoing a first revision procedure and 14.3% in the others). In addition, peroperative blood loss and duration of the procedure were significantly greater for bipolar replacement than for isolated acetabular replacement (sigma > 1.96). Our experience and data in the literature point to the importance of the age factor in the development of complications. Preservation of a well-fixed femoral component does not appear to worsen prognosis and leads to fewer complications than bipolar revision. The decision to revise a THA must take into consideration the functional impairment but also the risks inherent in revision procedures, particularly in old patients who have undergone several procedures. Revising the acetabular component alone can be an interesting option if the femoral component remains well fixed, although our follow-up is insufficient to determine whether this attitude provides better long-term outcome than complete bipolar revision. Better patient selection and improved operative technique, in particular in femur preparation, should

  19. 14 CFR 34.82 - Sampling and analytical procedures for measuring smoke exhaust emissions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., DEPARTMENT OF TRANSPORTATION AIRCRAFT FUEL VENTING AND EXHAUST EMISSION REQUIREMENTS FOR TURBINE ENGINE POWERED AIRPLANES Test Procedures for Engine Smoke Emissions (Aircraft Gas Turbine Engines) § 34.82..., Environmental Protection, Volume II, Aircraft Engine Emissions, Second Edition, July 1993, effective July 26...

  20. 14 CFR 34.82 - Sampling and analytical procedures for measuring smoke exhaust emissions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ..., DEPARTMENT OF TRANSPORTATION AIRCRAFT FUEL VENTING AND EXHAUST EMISSION REQUIREMENTS FOR TURBINE ENGINE POWERED AIRPLANES Test Procedures for Engine Smoke Emissions (Aircraft Gas Turbine Engines) § 34.82..., Environmental Protection, Volume II, Aircraft Engine Emissions, Second Edition, July 1993, effective July 26...

  1. Bias and precision of selected analytes reported by the National Atmospheric Deposition Program and National Trends Network, 1984

    USGS Publications Warehouse

    Brooks, M.H.; Schroder, L.J.; Willoughby, T.C.

    1987-01-01

    The U.S. Geological Survey operated a blind audit sample program during 1984 to test the effects of the sample handling and shipping procedures used by the National Atmospheric Deposition Program and National Trends Network on the quality of wet deposition data produced by the combined networks. Blind audit samples, which were dilutions of standard reference water samples, were submitted by network site operators to the central analytical laboratory disguised as actual wet deposition samples. Results from the analyses of blind audit samples were used to calculate estimates of analyte bias associated with all network wet deposition samples analyzed in 1984 and to estimate analyte precision. Concentration differences between double-blind samples that were submitted to the central analytical laboratory and separate analyses of aliquots of those blind audit samples that had not undergone network sample handling and shipping were used to calculate the analyte masses that apparently were added to each blind audit sample by routine network handling and shipping procedures. These calculated masses indicated statistically significant biases for magnesium, sodium, potassium, chloride, and sulfate. Median calculated masses were 41.4 micrograms (ug) for calcium, 14.9 ug for magnesium, 23.3 ug for sodium, 0.7 ug for potassium, 16.5 ug for chloride, and 55.3 ug for sulfate. Analyte precision was estimated using two different sets of replicate measures performed by the central analytical laboratory. Estimated standard deviations were similar to those previously reported. (Author's abstract)

  2. MASTER ANALYTICAL SCHEME FOR ORGANIC COMPOUNDS IN WATER. PART 2. APPENDICES TO PROTOCOLS

    EPA Science Inventory

    A Master Analytical Scheme (MAS) has been developed for the analysis of volatile (gas chromatographable) organic compounds in water. In developing the MAS, it was necessary to evaluate and modify existing analysis procedures and develop new techniques to produce protocols that pr...

  3. Quantitative analysis of crystalline pharmaceuticals in powders and tablets by a pattern-fitting procedure using X-ray powder diffraction data.

    PubMed

    Yamamura, S; Momose, Y

    2001-01-16

    A pattern-fitting procedure for quantitative analysis of crystalline pharmaceuticals in solid dosage forms using X-ray powder diffraction data is described. This method is based on a procedure for pattern-fitting in crystal structure refinement, and observed X-ray scattering intensities were fitted to analytical expressions including some fitting parameters, i.e. scale factor, peak positions, peak widths and degree of preferred orientation of the crystallites. All fitting parameters were optimized by the non-linear least-squares procedure. Then the weight fraction of each component was determined from the optimized scale factors. In the present study, well-crystallized binary systems, zinc oxide-zinc sulfide (ZnO-ZnS) and salicylic acid-benzoic acid (SA-BA), were used as the samples. In analysis of the ZnO-ZnS system, the weight fraction of ZnO or ZnS could be determined quantitatively in the range of 5-95% in the case of both powders and tablets. In analysis of the SA-BA systems, the weight fraction of SA or BA could be determined quantitatively in the range of 20-80% in the case of both powders and tablets. Quantitative analysis applying this pattern-fitting procedure showed better reproducibility than other X-ray methods based on the linear or integral intensities of particular diffraction peaks. Analysis using this pattern-fitting procedure also has the advantage that the preferred orientation of the crystallites in solid dosage forms can be also determined in the course of quantitative analysis.
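
    A much-simplified sketch of the quantification idea is given below in Python: the observed pattern is fitted as a linear combination of pure-phase reference patterns, and weight fractions are taken from the fitted scale factors. This omits the peak-shape, peak-position and preferred-orientation parameters refined in the paper and assumes the reference patterns were measured for pure phases under identical conditions, so that the scale factors are proportional to the weight fractions.

        # Two-phase quantification from fitted scale factors (synthetic patterns).
        import numpy as np

        two_theta = np.linspace(5, 60, 1101)

        def gaussian_peaks(centers, heights, width=0.3):
            """Build a synthetic powder pattern from Gaussian peaks."""
            y = np.zeros_like(two_theta)
            for c, h in zip(centers, heights):
                y += h * np.exp(-0.5 * ((two_theta - c) / width) ** 2)
            return y

        ref_a = gaussian_peaks([11.0, 17.2, 28.1], [1.0, 0.6, 0.8])   # pure phase A
        ref_b = gaussian_peaks([13.5, 24.0, 31.7], [0.9, 1.0, 0.5])   # pure phase B

        observed = 0.3 * ref_a + 0.7 * ref_b + np.random.normal(0, 0.005, two_theta.size)

        # Linear least squares for the two scale factors
        design = np.column_stack([ref_a, ref_b])
        scales, *_ = np.linalg.lstsq(design, observed, rcond=None)
        fractions = scales / scales.sum()
        print("estimated weight fractions:", np.round(fractions, 3))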

  4. A factor analytical study of tinnitus complaint behaviour.

    PubMed

    Jakes, S C; Hallam, R S; Chambers, C; Hinchcliffe, R

    1985-01-01

    Two separate factor analyses were conducted on various self-rated complaints about tinnitus and related neuro-otological symptoms, together with audiometric measurements of tinnitus 'intensity' (masking level and loudness matching levels). Two general tinnitus complaint factors were identified, i.e. 'intrusiveness of tinnitus' and 'distress due to tinnitus'. 3 specific tinnitus complaint factors were also found, i.e. 'sleep disturbance', 'medication use' and 'interference with passive auditory entertainments'. Other neuro-otological symptoms and the audiometric measures did not load on these factors. An exception was provided by loudness matches at 1 kHz, which had a small loading on the 'intrusiveness of tinnitus' factor. Self-rated loudness had a high loading on this factor. Otherwise, the loudness (either self-rated or determined by loudness matching) was unrelated to complaint dimensions. The clinical implications of the multifactorial nature of tinnitus complaint behaviour are considered.

  5. The ethical dimension of analytical psychology.

    PubMed

    Barreto, Marco Heleno

    2018-04-01

    The centrality of the ethical dimension in Carl Gustav Jung's analytical psychology is demonstrated through careful reference to fundamental moments in the Jungian text. Tracking Jung's statements about the primacy of the 'moral function' (or 'moral factor') in the cure of neurosis as well as in the process of individuation, the ethical nature of the psychotherapeutic praxis proposed by Jung is highlighted. This allows us to see the ethical aspect of psychological conflicts, and thus to understand better why individuation can be seen as a 'moral achievement'. Finally, the intelligible ethical structure of Jungian psychotherapeutic praxis is exposed. © 2018, The Society of Analytical Psychology.

  6. Metabolomics and Diabetes: Analytical and Computational Approaches

    PubMed Central

    Sas, Kelli M.; Karnovsky, Alla; Michailidis, George

    2015-01-01

    Diabetes is characterized by altered metabolism of key molecules and regulatory pathways. The phenotypic expression of diabetes and associated complications encompasses complex interactions between genetic, environmental, and tissue-specific factors that require an integrated understanding of perturbations in the network of genes, proteins, and metabolites. Metabolomics attempts to systematically identify and quantitate small molecule metabolites from biological systems. The recent rapid development of a variety of analytical platforms based on mass spectrometry and nuclear magnetic resonance have enabled identification of complex metabolic phenotypes. Continued development of bioinformatics and analytical strategies has facilitated the discovery of causal links in understanding the pathophysiology of diabetes and its complications. Here, we summarize the metabolomics workflow, including analytical, statistical, and computational tools, highlight recent applications of metabolomics in diabetes research, and discuss the challenges in the field. PMID:25713200

  7. 48 CFR 45.202 - Evaluation procedures.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Evaluation procedures. 45... MANAGEMENT GOVERNMENT PROPERTY Solicitation and Evaluation Procedures 45.202 Evaluation procedures. (a) The... evaluation purposes only, a rental equivalent evaluation factor. (b) The contracting officer shall ensure the...

  8. 40 CFR 91.414 - Raw gaseous exhaust sampling and analytical system description.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Raw gaseous exhaust sampling and... Gaseous Exhaust Test Procedures § 91.414 Raw gaseous exhaust sampling and analytical system description... the component systems. (g) The following requirements must be incorporated in each system used for raw...

  9. 40 CFR 91.414 - Raw gaseous exhaust sampling and analytical system description.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Raw gaseous exhaust sampling and... Gaseous Exhaust Test Procedures § 91.414 Raw gaseous exhaust sampling and analytical system description... the component systems. (g) The following requirements must be incorporated in each system used for raw...

  10. 40 CFR 89.412 - Raw gaseous exhaust sampling and analytical system description.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Raw gaseous exhaust sampling and...-IGNITION ENGINES Exhaust Emission Test Procedures § 89.412 Raw gaseous exhaust sampling and analytical... must be incorporated in each system used for raw testing under this subpart. (1) [Reserved] (2) The...

  11. 40 CFR 89.412 - Raw gaseous exhaust sampling and analytical system description.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Raw gaseous exhaust sampling and...-IGNITION ENGINES Exhaust Emission Test Procedures § 89.412 Raw gaseous exhaust sampling and analytical... must be incorporated in each system used for raw testing under this subpart. (1) [Reserved] (2) The...

  12. Smooth Pursuit in Schizophrenia: A Meta-Analytic Review of Research since 1993

    ERIC Educational Resources Information Center

    O'Driscoll, Gillian A.; Callahan, Brandy L.

    2008-01-01

    Abnormal smooth pursuit eye-tracking is one of the most replicated deficits in the psychophysiological literature in schizophrenia [Levy, D. L., Holzman, P. S., Matthysse, S., & Mendell, N. R. (1993). "Eye tracking dysfunction and schizophrenia: A critical perspective." "Schizophrenia Bulletin, 19", 461-505]. We used meta-analytic procedures to…

  13. Risk analysis by FMEA as an element of analytical validation.

    PubMed

    van Leeuwen, J F; Nauta, M J; de Kaste, D; Odekerken-Rombouts, Y M C F; Oldenhof, M T; Vredenbregt, M J; Barends, D M

    2009-12-05

    We subjected a Near-Infrared (NIR) analytical procedure used for screening drugs on authenticity to a Failure Mode and Effects Analysis (FMEA), including technical risks as well as risks related to human failure. An FMEA team broke down the NIR analytical method into process steps and identified possible failure modes for each step. Each failure mode was ranked on estimated frequency of occurrence (O), probability that the failure would remain undetected later in the process (D) and severity (S), each on a scale of 1-10. Human errors turned out to be the most common cause of failure modes. Failure risks were calculated as Risk Priority Numbers (RPN) = O × D × S. Failure modes with the highest RPN scores were subjected to corrective actions and the FMEA was repeated, showing reductions in RPN scores and resulting in improvement indices up to 5.0. We recommend risk analysis as an addition to the usual analytical validation, as the FMEA enabled us to detect previously unidentified risks.
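
    The ranking step is simple enough to show directly. The Python sketch below computes RPN = O × D × S for a few invented failure modes and sorts them so that the highest-risk modes are addressed first; the failure modes and scores are illustrative, not the paper's data.

        # RPN ranking of hypothetical failure modes (O, D, S each scored 1-10).
        failure_modes = {
            "spectrum acquired on wrong sample": (4, 6, 8),
            "library entry outdated":            (3, 7, 7),
            "operator skips background scan":    (5, 4, 6),
        }

        rpn = {name: o * d * s for name, (o, d, s) in failure_modes.items()}
        for name, score in sorted(rpn.items(), key=lambda kv: kv[1], reverse=True):
            print(f"RPN {score:>3}  {name}")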

  14. Flexible pavement overlay design procedures. Volume 1: Evaluation and modification of the design methods

    NASA Astrophysics Data System (ADS)

    Majidzadeh, K.; Ilves, G. J.

    1981-08-01

    A ready reference to design procedures for asphaltic concrete overlay of flexible pavements based on elastic layer theory is provided. The design procedures and the analytical techniques presented were formulated to predict the structural fatigue response of asphaltic concrete overlays for various design conditions, including geometrical and material properties, loading conditions and environmental variables.

  15. Web Analytics

    EPA Pesticide Factsheets

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  16. Negations in syllogistic reasoning: evidence for a heuristic-analytic conflict.

    PubMed

    Stupple, Edward J N; Waterhouse, Eleanor F

    2009-08-01

    An experiment utilizing response time measures was conducted to test dominant processing strategies in syllogistic reasoning with the expanded quantifier set proposed by Roberts (2005). Through adding negations to existing quantifiers it is possible to change problem surface features without altering logical validity. Biases based on surface features such as atmosphere, matching, and the probability heuristics model (PHM; Chater & Oaksford, 1999; Wetherick & Gilhooly, 1995) would not be expected to show variance in response latencies, but participant responses should be highly sensitive to changes in the surface features of the quantifiers. In contrast, according to analytic accounts such as mental models theory and mental logic (e.g., Johnson-Laird & Byrne, 1991; Rips, 1994) participants should exhibit increased response times for negated premises, but not be overly impacted upon by the surface features of the conclusion. Data indicated that the dominant response strategy was based on a matching heuristic, but also provided evidence of a resource-demanding analytic procedure for dealing with double negatives. The authors propose that dual-process theories offer a stronger account of these data whereby participants employ competing heuristic and analytic strategies and fall back on a heuristic response when analytic processing fails.

  17. Dosimetric factors for diagnostic nuclear medicine procedures in a non-reference pregnant phantom.

    PubMed

    Rafat-Motavalli, Laleh; Miri Hakimabad, Hashem; Hoseinian Azghadi, Elie

    2018-05-01

    This study evaluated the impact of using non-reference fetal models on the fetal radiation dose from diagnostic radionuclide administration. Six-month pregnant phantoms, including fetal models at the 10th and 90th growth percentiles, were constructed at either end of the normal range around the 50th percentile and implemented in the Monte Carlo N-Particle code, version MCNPX 2.6. The code was then used to evaluate the 99mTc S factors of the target organs of interest, 99mTc being the radionuclide most commonly used in nuclear medicine procedures. Substantial variations in the S factors were observed between the 10th/90th percentile phantoms and the 50th percentile phantom, with the greatest difference being 38.6%. When the source organs were in close proximity to, or inside, the fetal body, the 99mTc S factors presented strong statistical correlations with fetal body habitus. The trends observed in the S factors, and the differences between the percentiles, were explained by the source organs' masses and chord length distributions (CLDs). The results of this study showed that fetal body habitus had a considerable effect on fetal dose (on average up to 8.4%) if constant fetal biokinetic data were assumed for all fetal weight percentiles. However, a somewhat smaller variation in fetal dose (up to 5.3%) was obtained if the available biokinetic data for the reference fetus were scaled by fetal mass. © 2018 IOP Publishing Ltd.

  18. Analytical practice: do the new technologies have an impact?

    PubMed

    Favero, Davide; Candellieri, Stefano

    2017-06-01

    Through commentary on four clinical vignettes, this article focuses on the anthropological transformations taking place in contemporary society, underlining their differences from the anthropologies of reference of the founding fathers of psychoanalysis. Hybridization between man and machine and the speeding up and alteration of communications which the new technologies promote are now crucial issues facing psychoanalysis. Social media and a 24/7 internet connection have produced deep changes in the way people live and perceive relationships. Analytical practice is not exempt from such issues, which can be particularly insidious, often subtle and difficult to recognize, or even underestimated or ignored by psychoanalysts outright, in order to preserve the illusion of a complete understanding of what unfolds in the analytical space. The authors suggest that such transformations, by (partially) rendering inadequate the theoretical and technical corpus on which the various depth psychologies are founded, require personal engagement on the part of psychoanalysts in the search for new strategies to treat their patients, with the consequent abandonment of the 'certainties' offered by sclerotic models of clinical procedure. © 2017, The Society of Analytical Psychology.

  19. Advanced Noise Abatement Procedures for a Supersonic Business Jet

    NASA Technical Reports Server (NTRS)

    Berton, Jeffrey J.; Jones, Scott M.; Seidel, Jonathan A.; Huff, Dennis L.

    2017-01-01

    Supersonic civil aircraft present a unique noise certification challenge. High specific thrust required for supersonic cruise results in high engine exhaust velocity and high levels of jet noise during takeoff. Aerodynamics of thin, low-aspect-ratio wings equipped with relatively simple flap systems deepen the challenge. Advanced noise abatement procedures have been proposed for supersonic aircraft. These procedures promise to reduce airport noise, but they may require departures from normal reference procedures defined in noise regulations. The subject of this report is a takeoff performance and noise assessment of a notional supersonic business jet. Analytical models of an airframe and a supersonic engine derived from a contemporary subsonic turbofan core are developed. These models are used to predict takeoff trajectories and noise. Results indicate advanced noise abatement takeoff procedures are helpful in reducing noise along lateral sidelines.

  20. Automated Deployment of Advanced Controls and Analytics in Buildings

    NASA Astrophysics Data System (ADS)

    Pritoni, Marco

    Buildings use 40% of primary energy in the US. Recent studies show that developing energy analytics and enhancing control strategies can significantly improve their energy performance. However, the deployment of advanced control software applications has been mostly limited to academic studies. Larger-scale implementations are prevented by the significant engineering time and customization required, due to significant differences among buildings. This study demonstrates how physics-inspired data-driven models can be used to develop portable analytics and control applications for buildings. Specifically, I demonstrate application of these models in all phases of the deployment of advanced controls and analytics in buildings: in the first phase, "Site Preparation and Interface with Legacy Systems" I used models to discover or map relationships among building components, automatically gathering metadata (information about data points) necessary to run the applications. During the second phase: "Application Deployment and Commissioning", models automatically learn system parameters, used for advanced controls and analytics. In the third phase: "Continuous Monitoring and Verification" I utilized models to automatically measure the energy performance of a building that has implemented advanced control strategies. In the conclusions, I discuss future challenges and suggest potential strategies for these innovative control systems to be widely deployed in the market. This dissertation provides useful new tools in terms of procedures, algorithms, and models to facilitate the automation of deployment of advanced controls and analytics and accelerate their wide adoption in buildings.

  1. 40 CFR Appendix B to Part 136 - Definition and Procedure for the Determination of the Method Detection Limit-Revision 1.11

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... that the analyte concentration is greater than zero and is determined from analysis of a sample in a given matrix containing the analyte. Scope and Application This procedure is designed for applicability to a wide variety of sample types ranging from reagent (blank) water containing analyte to wastewater...

  2. 40 CFR Appendix B to Part 136 - Definition and Procedure for the Determination of the Method Detection Limit-Revision 1.11

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... that the analyte concentration is greater than zero and is determined from analysis of a sample in a given matrix containing the analyte. Scope and Application This procedure is designed for applicability to a wide variety of sample types ranging from reagent (blank) water containing analyte to wastewater...

  3. 40 CFR Appendix B to Part 136 - Definition and Procedure for the Determination of the Method Detection Limit-Revision 1.11

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... that the analyte concentration is greater than zero and is determined from analysis of a sample in a given matrix containing the analyte. Scope and Application This procedure is designed for applicability to a wide variety of sample types ranging from reagent (blank) water containing analyte to wastewater...
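
    The calculation defined by this appendix reduces to multiplying the standard deviation of at least seven replicate low-level spikes by the one-sided Student's t value at the 99 percent confidence level. A minimal Python sketch, using hypothetical replicate results, is given below.

        # MDL per 40 CFR Part 136, Appendix B (Rev. 1.11): MDL = t(n-1, 0.99) * s.
        # Replicate results are hypothetical.
        import statistics
        from scipy.stats import t

        replicates = [0.48, 0.52, 0.44, 0.50, 0.47, 0.53, 0.46]   # ug/L, n = 7 spikes

        s = statistics.stdev(replicates)                # sample standard deviation
        t99 = t.ppf(0.99, df=len(replicates) - 1)       # ~3.143 for 6 degrees of freedom
        mdl = t99 * s
        print(f"MDL = {mdl:.3f} ug/L (t = {t99:.3f}, s = {s:.3f})")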

  4. A singularity free analytical solution of artificial satellite motion with drag

    NASA Technical Reports Server (NTRS)

    Scheifele, G.; Mueller, A. C.; Starke, S. E.

    1977-01-01

    The connection between the existing Delaunay-Similar and Poincare-Similar satellite theories in the true anomaly version is outlined for the J(2) perturbation and the new drag approach. An overall description of the concept of the approach is given, while the necessary expansions and the procedure for arriving at the computer program for the canonical forces are delineated. The procedure for the analytical integration of the developed equations is described, and some numerical results are given. The computer program for the algebraic multiplication of the Fourier series, which creates the FORTRAN coding in an automatic manner, is described and documented.

  5. Is enzymatic hydrolysis a reliable analytical strategy to quantify glucuronidated and sulfated polyphenol metabolites in human fluids?

    PubMed

    Quifer-Rada, Paola; Martínez-Huélamo, Miriam; Lamuela-Raventos, Rosa M

    2017-07-19

    Phenolic compounds are present in human fluids (plasma and urine) mainly as glucuronidated and sulfated metabolites. Up to now, due to the unavailability of standards, enzymatic hydrolysis has been the method of choice in analytical chemistry to quantify these phase II phenolic metabolites. Enzymatic hydrolysis procedures vary in enzyme concentration, pH and temperature; however, there is a lack of knowledge about the stability of polyphenols in their free form during the process. In this study, we evaluated the stability of 7 phenolic acids, 2 flavonoids and 3 prenylflavanoids in urine during enzymatic hydrolysis to assess the suitability of this analytical procedure, using three different concentrations of β-glucuronidase/sulfatase enzymes from Helix pomatia. The results indicate that enzymatic hydrolysis negatively affected the recovery of the precursor and free-form polyphenols present in the sample. Thus, enzymatic hydrolysis does not seem an ideal analytical strategy to quantify glucuronidated and sulfated polyphenol metabolites.

  6. Rational quality assessment procedure for less-investigated herbal medicines: Case of a Congolese antimalarial drug with an analytical report.

    PubMed

    Tshitenge, Dieudonné Tshitenge; Ioset, Karine Ndjoko; Lami, José Nzunzu; Ndelo-di-Phanzu, Josaphat; Mufusama, Jean-Pierre Koy Sita; Bringmann, Gerhard

    2016-04-01

    Herbal medicines are the most globally used type of medical drugs. Their high cultural acceptability is due to the experienced safety and efficiency over centuries of use. Many of them are still phytochemically less-investigated, and are used without standardization or quality control. Choosing SIROP KILMA, an authorized Congolese antimalarial phytomedicine, as a model case, our study describes an interdisciplinary approach for a rational quality assessment of herbal drugs in general. It combines an authentication step of the herbal remedy prior to any fingerprinting, the isolation of the major constituents, the development and validation of an HPLC-DAD analytical method with internal markers, and the application of the method to several batches of the herbal medicine (here KILMA) thus permitting the establishment of a quantitative fingerprint. From the constitutive plants of KILMA, acteoside, isoacteoside, stachannin A, and pectolinarigenin-7-O-glucoside were isolated, and acteoside was used as the prime marker for the validation of an analytical method. This study contributes to the efforts of the WHO for the establishment of standards enabling the analytical evaluation of herbal materials. Moreover, the paper describes the first phytochemical and analytical report on a marketed Congolese phytomedicine. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. Applicability of bioanalysis of multiple analytes in drug discovery and development: review of select case studies including assay development considerations.

    PubMed

    Srinivas, Nuggehally R

    2006-05-01

    The development of sound bioanalytical method(s) is of paramount importance during the process of drug discovery and development culminating in a marketing approval. Although the bioanalytical procedure(s) originally developed during the discovery stage may not necessarily be fit to support the drug development scenario, they may be suitably modified and validated, as deemed necessary. Several reviews have appeared over the years describing analytical approaches including various techniques, detection systems, automation tools that are available for an effective separation, enhanced selectivity and sensitivity for quantitation of many analytes. The intention of this review is to cover various key areas where analytical method development becomes necessary during different stages of drug discovery research and development process. The key areas covered in this article with relevant case studies include: (a) simultaneous assay for parent compound and metabolites that are purported to display pharmacological activity; (b) bioanalytical procedures for determination of multiple drugs in combating a disease; (c) analytical measurement of chirality aspects in the pharmacokinetics, metabolism and biotransformation investigations; (d) drug monitoring for therapeutic benefits and/or occupational hazard; (e) analysis of drugs from complex and/or less frequently used matrices; (f) analytical determination during in vitro experiments (metabolism and permeability related) and in situ intestinal perfusion experiments; (g) determination of a major metabolite as a surrogate for the parent molecule; (h) analytical approaches for universal determination of CYP450 probe substrates and metabolites; (i) analytical applicability to prodrug evaluations-simultaneous determination of prodrug, parent and metabolites; (j) quantitative determination of parent compound and/or phase II metabolite(s) via direct or indirect approaches; (k) applicability in analysis of multiple compounds in select

  8. Implementation and application of moving average as continuous analytical quality control instrument demonstrated for 24 routine chemistry assays.

    PubMed

    Rossum, Huub H van; Kemperman, Hans

    2017-07-26

    General application of a moving average (MA) as continuous analytical quality control (QC) for routine chemistry assays has failed due to the lack of a simple method that allows optimization of MAs. A new method was applied to optimize the MA for routine chemistry and was evaluated in daily practice as a continuous analytical QC instrument. MA procedures were optimized using an MA bias detection simulation procedure. Optimization was graphically supported by bias detection curves. Next, all optimal MA procedures that contributed to the quality assurance were run for 100 consecutive days, and MA alarms generated during working hours were investigated. Optimized MA procedures were applied for 24 chemistry assays. During this evaluation, 303,871 MA values and 76 MA alarms were generated. Of all alarms, 54 (71%) were generated during office hours. Of these, 41 were further investigated; the causes were ion-selective electrode (ISE) failure (1), calibration failure not detected by QC due to improper QC settings (1), possible bias (significant difference with the other analyzer) (10), non-human materials analyzed (2), extreme result(s) of a single patient (2), pre-analytical error (1), no cause identified (20), and no conclusion possible (4). MA was implemented in daily practice as a continuous QC instrument for 24 routine chemistry assays. In our setup, the MA procedures generated a manageable number of alarms requiring follow-up, and these alarms proved valuable. For the management of MA alarms, several applications/requirements in the MA management software will simplify the use of MA procedures.
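
    A minimal sketch of such an MA procedure is shown below in Python: a moving average of consecutive patient results is computed and an alarm is raised when it drifts outside control limits. The window size, limits, and simulated results are arbitrary illustrations; in the study they are optimized per assay through bias detection simulations.

        # Simple moving-average QC with fixed control limits (illustrative values).
        from collections import deque

        def moving_average_qc(results, window=20, lower=138.0, upper=142.0):
            """Yield (index, MA value, alarm flag) for a stream of results (e.g. sodium, mmol/L)."""
            buf = deque(maxlen=window)
            for i, value in enumerate(results):
                buf.append(value)
                if len(buf) == window:
                    ma = sum(buf) / window
                    yield i, ma, not (lower <= ma <= upper)

        # Simulated series: a 4 mmol/L assay bias appears after the 100th result
        stream = [140 + (4 if i > 100 else 0) + ((-1) ** i) * 1.5 for i in range(200)]
        alarms = [i for i, ma, alarm in moving_average_qc(stream) if alarm]
        print("first MA alarm at result index:", alarms[0] if alarms else None)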

  9. Analytic materials

    PubMed Central

    2016-01-01

    The theory of inhomogeneous analytic materials is developed. These are materials where the coefficients entering the equations involve analytic functions. Three types of analytic materials are identified. The first two types involve an integer p. If p takes its maximum value, then we have a complete analytic material. Otherwise, it is an incomplete analytic material of rank p. For two-dimensional materials, further progress can be made in the identification of analytic materials by using the well-known fact that a 90° rotation applied to a divergence-free field in a simply connected domain yields a curl-free field, and this can then be expressed as the gradient of a potential. Other exact results for the fields in inhomogeneous media are reviewed. Also reviewed is the subject of metamaterials, as these materials provide a way of realizing desirable coefficients in the equations. PMID:27956882

  10. Toward a definition of intolerance of uncertainty: a review of factor analytical studies of the Intolerance of Uncertainty Scale.

    PubMed

    Birrell, Jane; Meares, Kevin; Wilkinson, Andrew; Freeston, Mark

    2011-11-01

    Since its emergence in the early 1990s, a narrow but concentrated body of research has developed examining the role of intolerance of uncertainty (IU) in worry, and yet we still know little about its phenomenology. In an attempt to clarify our understanding of this construct, this paper traces the way in which our understanding and definition of IU have evolved throughout the literature. This paper also aims to further our understanding of IU by exploring the latent variables measured by the Intolerance of Uncertainty Scale (IUS; Freeston, Rheaume, Letarte, Dugas & Ladouceur, 1994). A review of the literature surrounding IU confirmed that the current definitions are categorical and lack specificity. A critical review of existing factor analytic studies was carried out in order to determine the underlying factors measured by the IUS. Systematic searches yielded 9 papers for review. Two factors with 12 consistent items emerged throughout the exploratory studies, and the stability of models containing these two factors was demonstrated in subsequent confirmatory studies. It is proposed that these factors represent (i) desire for predictability and an active engagement in seeking certainty, and (ii) paralysis of cognition and action in the face of uncertainty. It is suggested that these factors may represent approach and avoidance responses to uncertainty. Further research is required to confirm the construct validity of these factors and to determine the stability of this structure within clinical samples. Copyright © 2011 Elsevier Ltd. All rights reserved.

  11. Final report on mid-polarity analytes in food matrix: mid-polarity pesticides in tea

    NASA Astrophysics Data System (ADS)

    Sin, Della W. M.; Li, Hongmei; Wong, S. K.; Lo, M. F.; Wong, Y. L.; Wong, Y. C.; Mok, C. S.

    2015-01-01

    At the Paris meeting in April 2011, the CCQM Working Group on Organic Analysis (OAWG) agreed on a suite of Track A studies meant to support the assessment of measurement capabilities needed for the delivery of measurement services within the scope of the OAWG Terms of Reference. One of the studies discussed and agreed upon for the suite of ten Track A studies that support the 5-year plan of the CCQM Core Competence assessment was CCQM-K95 'Mid-Polarity Analytes in Food Matrix: Mid-Polarity Pesticides in Tea'. This key comparison was co-organized by the Government Laboratory of Hong Kong Special Administrative Region (GL) and the National Institute of Metrology, China (NIM). To allow wider participation, a pilot study, CCQM-P136, was run in parallel. Participants' capabilities in measuring mid-polarity analytes in food matrix were demonstrated through this key comparison. Most of the participating NMIs/DIs successfully measured beta-endosulfan and endosulfan sulphate in the sample; however, there is room for further improvement for some participants. This key comparison involved not only extraction, clean-up, analytical separation and selective detection of the analytes in a complex food matrix, but also the pre-treatment procedures of the material before the extraction process. The problem of incomplete extraction of the incurred analytes from the sample matrix may not be observed simply by using spike recovery. The relative standard deviations for the data included in the KCRV calculation in this key comparison were less than 7%, which was acceptable given the complexity of the matrix, the level of the analytes and the complexity of the analytical procedure. Note that this text is that which appears in Appendix B of the BIPM key comparison database kcdb.bipm.org/. The final report has been peer-reviewed and approved for publication by CCQM, according to the provisions of the CIPM Mutual

  12. SFCHECK: a unified set of procedures for evaluating the quality of macromolecular structure-factor data and their agreement with the atomic model.

    PubMed

    Vaguine, A A; Richelle, J; Wodak, S J

    1999-01-01

    In this paper we present SFCHECK, a stand-alone software package that features a unified set of procedures for evaluating the structure-factor data obtained from X-ray diffraction experiments and for assessing the agreement of the atomic coordinates with these data. The evaluation is performed completely automatically, and produces a concise PostScript pictorial output similar to that of PROCHECK [Laskowski, MacArthur, Moss & Thornton (1993). J. Appl. Cryst. 26, 283-291], greatly facilitating visual inspection of the results. The required inputs are the structure-factor amplitudes and the atomic coordinates. Having those, the program summarizes relevant information on the deposited structure factors and evaluates their quality using criteria such as data completeness, structure-factor uncertainty and the optical resolution computed from the Patterson origin peak. The dependence of various parameters on the nominal resolution (d spacing) is also given. To evaluate the global agreement of the atomic model with the experimental data, the program recomputes the R factor, the correlation coefficient between observed and calculated structure-factor amplitudes and Rfree (when appropriate). In addition, it gives several estimates of the average error in the atomic coordinates. The local agreement between the model and the electron-density map is evaluated on a per-residue basis, considering separately the macromolecule backbone and side-chain atoms, as well as solvent atoms and heterogroups. Among the criteria are the normalized average atomic displacement, the local density correlation coefficient and the polymer chain connectivity. The possibility of computing these criteria using the omit-map procedure is also provided. The described software should be a valuable tool in monitoring the refinement procedure and in assessing structures deposited in databases.
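
    For orientation, the two global agreement statistics mentioned are conventionally defined as follows; this is the generic crystallographic formulation, not necessarily SFCHECK's exact weighting or scaling:

    ```latex
    R = \frac{\sum_{hkl}\bigl|\,|F_{\mathrm{obs}}(hkl)| - k\,|F_{\mathrm{calc}}(hkl)|\,\bigr|}
             {\sum_{hkl}|F_{\mathrm{obs}}(hkl)|},
    \qquad
    CC = \frac{\sum_{hkl}\bigl(|F_{\mathrm{obs}}| - \overline{|F_{\mathrm{obs}}|}\bigr)
               \bigl(|F_{\mathrm{calc}}| - \overline{|F_{\mathrm{calc}}|}\bigr)}
              {\sqrt{\sum_{hkl}\bigl(|F_{\mathrm{obs}}| - \overline{|F_{\mathrm{obs}}|}\bigr)^{2}
                     \sum_{hkl}\bigl(|F_{\mathrm{calc}}| - \overline{|F_{\mathrm{calc}}|}\bigr)^{2}}}
    ```

    where k is a scale factor between observed and calculated amplitudes, and R_free is the same R statistic evaluated only over the reflections excluded from refinement.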

  13. Review: visual analytics of climate networks

    NASA Astrophysics Data System (ADS)

    Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.

    2015-09-01

    Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing numbers of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis relating the multiple visualisation challenges to a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.

  14. Review: visual analytics of climate networks

    NASA Astrophysics Data System (ADS)

    Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.

    2015-04-01

    Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing amounts of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis, relating the multiple visualisation challenges with a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.

  15. Career Decision Statuses among Portuguese Secondary School Students: A Cluster Analytical Approach

    ERIC Educational Resources Information Center

    Santos, Paulo Jorge; Ferreira, Joaquim Armando

    2012-01-01

    Career indecision is a complex phenomenon and an increasing number of authors have proposed that undecided individuals do not form a group with homogeneous characteristics. This study examines career decision statuses among a sample of 362 12th-grade Portuguese students. A cluster-analytical procedure, based on a battery of instruments designed to…

  16. CTEPP STANDARD OPERATING PROCEDURE FOR PREPARATION OF SURROGATE RECOVERY STANDARD AND INTERNAL STANDARD SOLUTIONS FOR POLAR TARGET ANALYTES (SOP-5.26)

    EPA Science Inventory

    This SOP describes the method used for preparing surrogate recovery standard and internal standard solutions for the analysis of polar target analytes. It also describes the method for preparing calibration standard solutions for polar analytes used for gas chromatography/mass sp...

  17. From observational to analytical morphology of the stratum corneum: progress avoiding hazardous animal and human testings

    PubMed Central

    Piérard, Gérald E; Courtois, Justine; Ritacco, Caroline; Humbert, Philippe; Fanian, Ferial; Piérard-Franchimont, Claudine

    2015-01-01

    Background In cosmetic science, noninvasive sampling of the upper part of the stratum corneum is conveniently performed using strippings with adhesive-coated discs (SACD) and cyanoacrylate skin surface strippings (CSSSs). Methods Under controlled conditions, it is possible to scrutinize SACD and CSSS with objectivity using appropriate methods of analytical morphology. These procedures apply to a series of clinical conditions including xerosis grading, comedometry, corneodynamics, corneomelametry, corneosurfametry, corneoxenometry, and dandruff assessment. Results With any of the analytical evaluations, SACD and CSSS provide specific salient information that is useful in the field of cosmetology. In particular, both methods appear valuable and complementary in assessing the human skin compatibility of personal skincare products. Conclusion A set of quantitative analytical methods applicable to the minimally invasive and low-cost SACD and CSSS procedures allow for a sound assessment of cosmetic effects on the stratum corneum. Under regular conditions, both methods are painless and do not induce adverse events. Globally, CSSS appears more precise and informative than the regular SACD stripping. PMID:25767402

  18. Multimedia Analysis plus Visual Analytics = Multimedia Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chinchor, Nancy; Thomas, James J.; Wong, Pak C.

    2010-10-01

    Multimedia analysis has focused on images, video, and to some extent audio and has made progress in single channels excluding text. Visual analytics has focused on the user interaction with data during the analytic process plus the fundamental mathematics and has continued to treat text as did its precursor, information visualization. The general problem we address in this tutorial is the combining of multimedia analysis and visual analytics to deal with multimedia information gathered from different sources, with different goals or objectives, and containing all media types and combinations in common usage.

  19. Automatic-heuristic and executive-analytic processing during reasoning: Chronometric and dual-task considerations.

    PubMed

    De Neys, Wim

    2006-06-01

    Human reasoning has been shown to overly rely on intuitive, heuristic processing instead of a more demanding analytic inference process. Four experiments tested the central claim of current dual-process theories that analytic operations involve time-consuming executive processing whereas the heuristic system would operate automatically. Participants solved conjunction fallacy problems and indicative and deontic selection tasks. Experiment 1 established that making correct analytic inferences demanded more processing time than did making heuristic inferences. Experiment 2 showed that burdening the executive resources with an attention-demanding secondary task decreased correct, analytic responding and boosted the rate of conjunction fallacies and indicative matching card selections. Results were replicated in Experiments 3 and 4 with a different secondary-task procedure. Involvement of executive resources for the deontic selection task was less clear. Findings validate basic processing assumptions of the dual-process framework and complete the correlational research programme of K. E. Stanovich and R. F. West (2000).

  20. An analytical approach to Sr isotope ratio determination in Lambrusco wines for geographical traceability purposes.

    PubMed

    Durante, Caterina; Baschieri, Carlo; Bertacchini, Lucia; Bertelli, Davide; Cocchi, Marina; Marchetti, Andrea; Manzini, Daniela; Papotti, Giulia; Sighinolfi, Simona

    2015-04-15

    Geographical origin and authenticity of food are topics of interest for both consumers and producers. Among the different indicators used for traceability studies, (87)Sr/(86)Sr isotopic ratio has provided excellent results. In this study, two analytical approaches for wine sample pre-treatment, microwave and low temperature mineralisation, were investigated to develop an accurate and precise analytical method for (87)Sr/(86)Sr determination. The two procedures led to comparable results (paired t-test). The precision of the analytical procedure was evaluated by using a control sample (wine sample) processed during each sample batch (calculated relative standard deviation, RSD%, equal to 0.002%). Lambrusco PDO (Protected Designation of Origin) wines coming from four different vintages (2009, 2010, 2011 and 2012) were pre-treated according to the best procedure and their isotopic values were compared with isotopic data coming from (i) soils of their territory of origin and (ii) wines obtained from the same grape varieties cultivated in different districts. The obtained results have shown no significant variability among the different vintages of wines and a perfect agreement between the isotopic range of the soils and wines has been observed. Nevertheless, the investigated indicator was not powerful enough to discriminate between similar products. In this regard, it is worth noting that more soil samples as well as wines coming from different districts will be considered to obtain more trustworthy results. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. Incorporating Students' Self-Designed, Research-Based Analytical Chemistry Projects into the Instrumentation Curriculum

    ERIC Educational Resources Information Center

    Gao, Ruomei

    2015-01-01

    In a typical chemistry instrumentation laboratory, students learn analytical techniques through a well-developed procedure. Such an approach, however, does not engage students in a creative endeavor. To foster the intrinsic motivation of students' desire to learn, improve their confidence in self-directed learning activities and enhance their…

  2. Methodological Research on Knowledge Use and School Improvement. Volume III. Measuring Knowledge Use: A Procedural Inventory.

    ERIC Educational Resources Information Center

    Dunn, William N.; And Others

    This volume presents in one collection a systematic inventory of research and analytic procedures appropriate for generating information on knowledge production, diffusion, and utilization, gathered by the University of Pittsburgh Program for the Study of Knowledge Use. The main concern is with those procedures that focus on the utilization of…

  3. Preliminary measurement of gas concentrations of perfluoropropane using an analytical weighing balance.

    PubMed

    Clarkson, Douglas McG; Manna, Avinish; Hero, Mark

    2014-02-01

    We describe the use of an analytical weighing balance of measurement accuracy 0.00001 g for determination of concentrations of perfluoropropane (C3F8) gas used in ophthalmic surgical vitrectomy procedures. A range of test eyes corresponding to an eye volume of 6.1 ml were constructed using 27 gauge needle exit ducts and separately 20 gauge (straight) and 23 gauge (angled) entrance ports. This method allowed determination of concentration levels in the sample preparation syringe and also levels in test eyes. It was determined that a key factor influencing gas concentration accuracy related to the method of gas fill and the dead space of the gas preparation/delivery system, with a significant contribution arising from the use of the particle filter. The weighing balance technique was identified as an appropriate technique for estimation of gas concentrations. Copyright © 2013 IPEM. Published by Elsevier Ltd. All rights reserved.
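
    The principle can be illustrated with a back-of-the-envelope calculation, a sketch under ideal-gas assumptions rather than the authors' exact protocol; the 6.1 ml volume is taken from the abstract, while temperature, pressure, and the dead-space and filter effects the paper identifies as key error sources are assumed or ignored here:

    ```python
    # Sketch: estimate the C3F8 volume fraction in an air/C3F8 mixture filling a
    # fixed test-eye volume, from the mass gained relative to the air-filled state.
    R = 8.314          # J mol-1 K-1
    T = 296.15         # K (23 degC, assumed)
    P = 101325.0       # Pa (assumed ambient pressure)
    M_C3F8 = 0.18802   # kg/mol
    M_AIR = 0.02897    # kg/mol
    V = 6.1e-6         # m^3, test-eye volume from the abstract

    rho_c3f8 = P * M_C3F8 / (R * T)   # ~7.7 kg/m^3
    rho_air = P * M_AIR / (R * T)     # ~1.2 kg/m^3

    def c3f8_fraction(delta_mass_g):
        """Volume fraction of C3F8 from the mass gain (grams) over pure air."""
        return (delta_mass_g * 1e-3) / (V * (rho_c3f8 - rho_air))

    # A 20% fill adds roughly 8 mg relative to air, well within the resolution
    # of a 0.01 mg balance.
    print(round(c3f8_fraction(0.0080), 3))
    ```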

  4. Progressive Visual Analytics: User-Driven Visual Exploration of In-Progress Analytics.

    PubMed

    Stolper, Charles D; Perer, Adam; Gotz, David

    2014-12-01

    As datasets grow and analytic algorithms become more complex, the typical workflow of analysts launching an analytic, waiting for it to complete, inspecting the results, and then re-launching the computation with adjusted parameters is not realistic for many real-world tasks. This paper presents an alternative workflow, progressive visual analytics, which enables an analyst to inspect partial results of an algorithm as they become available and interact with the algorithm to prioritize subspaces of interest. Progressive visual analytics depends on adapting analytical algorithms to produce meaningful partial results and enable analyst intervention without sacrificing computational speed. The paradigm also depends on adapting information visualization techniques to incorporate the constantly refining results without overwhelming analysts and provide interactions to support an analyst directing the analytic. The contributions of this paper include: a description of the progressive visual analytics paradigm; design goals for both the algorithms and visualizations in progressive visual analytics systems; an example progressive visual analytics system (Progressive Insights) for analyzing common patterns in a collection of event sequences; and an evaluation of Progressive Insights and the progressive visual analytics paradigm by clinical researchers analyzing electronic medical records.
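
    As a toy illustration of the paradigm (not the Progressive Insights system itself), an analytic can be written to emit meaningful partial results that a visualization layer polls and redraws, with a hook standing in for analyst steering:

    ```python
    # Sketch: a progressive analytic as a generator that yields partial results.
    # A UI would consume these increments; `priority_filter` stands in for analyst
    # steering, e.g. restricting computation to a subspace of interest.
    import random

    def progressive_mean(stream, chunk_size=1000, priority_filter=None):
        """Yield (n_seen, running_mean) after each processed chunk."""
        total, n, chunk = 0.0, 0, []
        for x in stream:
            if priority_filter is not None and not priority_filter(x):
                continue
            chunk.append(x)
            if len(chunk) == chunk_size:
                total += sum(chunk)
                n += len(chunk)
                chunk.clear()
                yield n, total / n
        if chunk:
            total += sum(chunk)
            n += len(chunk)
            yield n, total / n

    data = (random.gauss(10, 2) for _ in range(10_000))
    for n_seen, estimate in progressive_mean(data, chunk_size=2500):
        print(f"after {n_seen} records: mean ~ {estimate:.3f}")
    ```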

  5. MS-based analytical methodologies to characterize genetically modified crops.

    PubMed

    García-Cañas, Virginia; Simó, Carolina; León, Carlos; Ibáñez, Elena; Cifuentes, Alejandro

    2011-01-01

    The development of genetically modified crops has had a great impact on the agriculture and food industries. However, the development of any genetically modified organism (GMO) requires the application of analytical procedures to confirm the equivalence of the GMO compared to its isogenic non-transgenic counterpart. Moreover, the use of GMOs in foods and agriculture faces numerous criticisms from consumers and ecological organizations that have led some countries to regulate their production, growth, and commercialization. These regulations have brought about the need of new and more powerful analytical methods to face the complexity of this topic. In this regard, MS-based technologies are increasingly used for GMOs analysis to provide very useful information on GMO composition (e.g., metabolites, proteins). This review focuses on the MS-based analytical methodologies used to characterize genetically modified crops (also called transgenic crops). First, an overview on genetically modified crops development is provided, together with the main difficulties of their analysis. Next, the different MS-based analytical approaches applied to characterize GM crops are critically discussed, and include "-omics" approaches and target-based approaches. These methodologies allow the study of intended and unintended effects that result from the genetic transformation. This information is considered to be essential to corroborate (or not) the equivalence of the GM crop with its isogenic non-transgenic counterpart. Copyright © 2010 Wiley Periodicals, Inc.

  6. Multi-analyte validation in heterogeneous solution by ELISA.

    PubMed

    Lakshmipriya, Thangavel; Gopinath, Subash C B; Hashim, Uda; Murugaiyah, Vikneswaran

    2017-12-01

    Enzyme Linked Immunosorbent Assay (ELISA) is a standard assay that has been used widely to validate the presence of an analyte in solution. With the advancement of ELISA, different strategies have been demonstrated, making it a suitable immunoassay for a wide range of analytes. Herein, we attempted to provide additional evidence with ELISA to show its suitability for multi-analyte detection. To demonstrate this, three clinically relevant targets were chosen: the 16 kDa protein from Mycobacterium tuberculosis, human blood clotting Factor IXa, and the tumour marker Squamous Cell Carcinoma (SCC) antigen. We adapted the routine steps of conventional ELISA to validate the occurrence of the analytes in both homogeneous and heterogeneous solutions. With the homogeneous and heterogeneous solutions, we could attain sensitivities of 2, 8 and 1 nM for the 16 kDa protein, FIXa and SCC antigen, respectively. Further, the specific multi-analyte validations were evidenced with similar sensitivities in the presence of human serum. The ELISA assay in this study has proven its applicability for genuine multiple-target validation in heterogeneous solution and can be followed for other target validations. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Microemulsification: an approach for analytical determinations.

    PubMed

    Lima, Renato S; Shiroma, Leandro Y; Teixeira, Alvaro V N C; de Toledo, José R; do Couto, Bruno C; de Carvalho, Rogério M; Carrilho, Emanuel; Kubota, Lauro T; Gobbi, Angelo L

    2014-09-16

    We address a novel method for analytical determinations that combines simplicity, rapidity, low consumption of chemicals, and portability with high analytical performance taking into account parameters such as precision, linearity, robustness, and accuracy. This approach relies on the effect of the analyte content over the Gibbs free energy of dispersions, affecting the thermodynamic stabilization of emulsions or Winsor systems to form microemulsions (MEs). Such phenomenon was expressed by the minimum volume fraction of amphiphile required to form microemulsion (Φ(ME)), which was the analytical signal of the method. Thus, the measurements can be taken by visually monitoring the transition of the dispersions from cloudy to transparent during the microemulsification, like a titration. It bypasses the employment of electric energy. The performed studies were: phase behavior, droplet dimension by dynamic light scattering, analytical curve, and robustness tests. The reliability of the method was evaluated by determining water in ethanol fuels and monoethylene glycol in complex samples of liquefied natural gas. The dispersions were composed of water-chlorobenzene (water analysis) and water-oleic acid (monoethylene glycol analysis) with ethanol as the hydrotrope phase. The mean hydrodynamic diameter values for the nanostructures in the droplet-based water-chlorobenzene MEs were in the range of 1 to 11 nm. The procedures of microemulsification were conducted by adding ethanol to water-oleic acid (W-O) mixtures with the aid of micropipette and shaking. The Φ(ME) measurements were performed in a thermostatic water bath at 23 °C by direct observation that is based on the visual analyses of the media. The experiments to determine water demonstrated that the analytical performance depends on the composition of ME. It shows flexibility in the developed method. The linear range was fairly broad with limits of linearity up to 70.00% water in ethanol. For monoethylene glycol in
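
    A sketch of how such a titration-like signal could be turned into a determination: fit a calibration of the minimum amphiphile volume fraction against analyte content and invert it for an unknown. The calibration values below are invented for illustration; the paper's actual curves and working ranges differ:

    ```python
    # Sketch: calibration and inversion of Phi_ME (the method's analytical signal).
    import numpy as np

    water_pct = np.array([0.0, 10.0, 20.0, 40.0, 60.0, 70.0])  # % water in ethanol (hypothetical)
    phi_me    = np.array([0.20, 0.26, 0.32, 0.44, 0.56, 0.62])  # observed Phi_ME (hypothetical)

    slope, intercept = np.polyfit(water_pct, phi_me, 1)          # linear calibration model
    print(f"Phi_ME = {slope:.4f} * water% + {intercept:.4f}")

    phi_unknown = 0.38                                           # measured for an unknown sample
    water_unknown = (phi_unknown - intercept) / slope
    print(f"estimated water content: {water_unknown:.1f}%")
    ```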

  8. Incidence and Risk Factors for Perioperative Cardiovascular and Respiratory Adverse Events in Pediatric Patients With Congenital Heart Disease Undergoing Noncardiac Procedures.

    PubMed

    Lee, Sandra; Reddington, Elise; Koutsogiannaki, Sophia; Hernandez, Michael R; Odegard, Kirsten C; DiNardo, James A; Yuki, Koichi

    2018-04-27

    While mortality and adverse perioperative events after noncardiac surgery in children with a broad range of congenital cardiac lesions have been investigated using large multiinstitutional databases, to date single-center studies addressing adverse outcomes in children with congenital heart disease (CHD) undergoing noncardiac surgery have only included small numbers of patients with significant heart disease. The primary objective of this study was to determine the incidences of perioperative cardiovascular and respiratory events in a large cohort of patients from a single institution with a broad range of congenital cardiac lesions undergoing noncardiac procedures and to determine risk factors for these events. We identified 3010 CHD patients presenting for noncardiac procedures in our institution over a 5-year period. We collected demographic information, including procedure performed, cardiac diagnosis, ventricular function as assessed by echocardiogram within 6 months of the procedure, and classification of CHD into 3 groups (minor, major, or severe CHD) based on residual lesion burden and cardiovascular functional status. Characteristics related to conduct of anesthesia care were also collected. The primary outcome variables for our analysis were the incidences of intraoperative cardiovascular and respiratory events. Univariable and multivariable logistic regressions were used to determine risk factors for these 2 outcomes. The incidence of cardiovascular events was 11.5% and of respiratory events was 4.7%. Univariate analysis and multivariable analysis demonstrated that American Society of Anesthesiologists (≥3), emergency cases, major and severe CHD, single-ventricle physiology, ventricular dysfunction, orthopedic surgery, general surgery, neurosurgery, and pulmonary procedures were associated with perioperative cardiovascular events. Respiratory events were associated with American Society of Anesthesiologists (≥4) and otolaryngology, gastrointestinal

  9. Loneliness and social isolation as risk factors for mortality: a meta-analytic review.

    PubMed

    Holt-Lunstad, Julianne; Smith, Timothy B; Baker, Mark; Harris, Tyler; Stephenson, David

    2015-03-01

    Actual and perceived social isolation are both associated with increased risk for early mortality. In this meta-analytic review, our objective is to establish the overall and relative magnitude of social isolation and loneliness and to examine possible moderators. We conducted a literature search of studies (January 1980 to February 2014) using MEDLINE, CINAHL, PsycINFO, Social Work Abstracts, and Google Scholar. The included studies provided quantitative data on mortality as affected by loneliness, social isolation, or living alone. Across studies in which several possible confounds were statistically controlled for, the weighted average effect sizes were as follows: social isolation odds ratio (OR) = 1.29, loneliness OR = 1.26, and living alone OR = 1.32, corresponding to an average of 29%, 26%, and 32% increased likelihood of mortality, respectively. We found no differences between measures of objective and subjective social isolation. Results remain consistent across gender, length of follow-up, and world region, but initial health status has an influence on the findings. Results also differ across participant age, with social deficits being more predictive of death in samples with an average age younger than 65 years. Overall, the influence of both objective and subjective social isolation on risk for mortality is comparable with well-established risk factors for mortality. © The Author(s) 2015.

  10. Loss Factor Estimation Using the Impulse Response Decay Method on a Stiffened Structure

    NASA Technical Reports Server (NTRS)

    Cabell, Randolph; Schiller, Noah; Allen, Albert; Moeller, Mark

    2009-01-01

    High-frequency vibroacoustic modeling is typically performed using energy-based techniques such as Statistical Energy Analysis (SEA). Energy models require an estimate of the internal damping loss factor. Unfortunately, the loss factor is difficult to estimate analytically, and experimental methods such as the power injection method can require extensive measurements over the structure of interest. This paper discusses the implications of estimating damping loss factors using the impulse response decay method (IRDM) from a limited set of response measurements. An automated procedure for implementing IRDM is described and then evaluated using data from a finite element model of a stiffened, curved panel. Estimated loss factors are compared with loss factors computed using a power injection method and a manual curve fit. The paper discusses the sensitivity of the IRDM loss factor estimates to damping of connected subsystems and the number and location of points in the measurement ensemble.
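
    A minimal sketch of the decay-rate idea behind IRDM, assuming a band-limited response whose level decays linearly in dB; the actual method works on measured impulse responses with band filtering and an automated fit:

    ```python
    # Sketch: estimate a damping loss factor from the decay rate of an impulse
    # response envelope, using the standard relation eta = DR / (27.3 * f), where
    # DR is the level decay rate in dB/s and f is the band centre frequency.
    import numpy as np

    fs = 10_000.0                  # sample rate, Hz
    f_band = 500.0                 # band centre frequency, Hz (assumed)
    eta_true = 0.02                # loss factor used to synthesize test data
    t = np.arange(0.0, 1.0, 1.0 / fs)

    # Amplitude envelope of a decaying band-limited response: exp(-pi * f * eta * t)
    envelope = np.exp(-np.pi * f_band * eta_true * t)
    level_db = 20.0 * np.log10(envelope)

    # Fit the decay rate DR (dB/s) and convert it to a loss factor estimate.
    slope, _ = np.polyfit(t, level_db, 1)
    eta_est = -slope / (27.3 * f_band)
    print(f"estimated loss factor: {eta_est:.4f}")   # ~0.02
    ```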

  11. [Analytical chemistry in works of Maria Skłodowska-Curie].

    PubMed

    Hulanicki, Adam

    2012-01-01

    Maria Skłodowska-Curie, a Nobel Prize winner in chemistry, gained the elements of her chemical training through more than ten months of arduous work in Warsaw at the Museum of Industry and Agriculture. The Nobel Prize recognized her contribution to the progress of chemistry through the discovery of radium and polonium, the separation of radium, and the study of the properties of this remarkable element. It was awarded for extremely arduous work, during which the chemical reactions that underlie analytical chemistry were carried out. Unlike a typical analytical procedure, the starting material here was thousands of kilograms of uranium ore: pitchblende. The final result was small amounts of the new elements polonium and radium. Both the knowledge and the intuition of the researcher led to her triumph. The difficulties arose because the properties of the sought elements could only be inferred from knowledge of other chemical elements. A significant achievement was the characterization of samples by means of radioactivity measurement, which gave rise to radiochemical analytical methods. Extreme analytical precision was demanded in the multiple processes of fractional crystallization and precipitation, which finally led to the determination of the atomic mass of radium.

  12. Analytic and heuristic processes in the detection and resolution of conflict.

    PubMed

    Ferreira, Mário B; Mata, André; Donkin, Christopher; Sherman, Steven J; Ihmels, Max

    2016-10-01

    Previous research with the ratio-bias task found larger response latencies for conflict trials where the heuristic- and analytic-based responses are assumed to be in opposition (e.g., choosing between 1/10 and 9/100 ratios of success) when compared to no-conflict trials where both processes converge on the same response (e.g., choosing between 1/10 and 11/100). This pattern is consistent with parallel dual-process models, which assume that there is effective, rather than lax, monitoring of the output of heuristic processing. It is, however, unclear why conflict resolution sometimes fails. Ratio-biased choices may increase because of a decline in analytical reasoning (leaving heuristic-based responses unopposed) or because of a rise in heuristic processing (making it more difficult for analytic processes to override the heuristic preferences). Using the process-dissociation procedure, we found that instructions to respond logically and response speed affected analytic (controlled) processing (C), leaving heuristic processing (H) unchanged, whereas the intuitive preference for large numerators (as assessed by responses to equal ratio trials) affected H but not C. These findings create new challenges to the debate between dual-process and single-process accounts, which are discussed.
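
    For readers unfamiliar with the process-dissociation logic, the generic Jacoby-style estimating equations take roughly the following form; the paper's exact operationalization for conflict and no-conflict ratio trials may differ:

    ```latex
    P(\text{heuristic response} \mid \text{no-conflict}) = C + H(1 - C), \qquad
    P(\text{heuristic response} \mid \text{conflict}) = H(1 - C)
    ```

    so that C is estimated as the difference between the two observed proportions and H as P(conflict) / (1 - C).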

  13. Analytical Expressions for the Mixed-Order Kinetics Parameters of TL Glow Peaks Based on the two Heating Rates Method.

    PubMed

    Maghrabi, Mufeed; Al-Abdullah, Tariq; Khattari, Ziad

    2018-03-24

    The two heating rates method (originally developed for first-order glow peaks) was used for the first time to evaluate the activation energy (E) from glow peaks obeying mixed-order (MO) kinetics. The derived expression for E has an insignificant additional term (on the scale of a few meV) when compared with the first-order case. Hence, the original expression for E using the two heating rates method can be used with excellent accuracy in the case of MO glow peaks. In addition, we derived a simple analytical expression for the MO parameter. The present procedure has the advantage that the MO parameter can now be evaluated using an analytical expression instead of the graphical representation between the geometrical factor and the MO parameter given by the existing peak shape methods. The applicability of the derived expressions for real samples was demonstrated for the glow curve of a Li2B4O7:Mn single crystal. The obtained parameters compare very well with those obtained by glow curve fitting and with the available published data.
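
    For reference, the classical two-heating-rates expression for first-order peaks, which the paper reports carries over to the mixed-order case with only a meV-scale correction, is:

    ```latex
    E = k \,\frac{T_{m1}\,T_{m2}}{T_{m1} - T_{m2}}\,
        \ln\!\left[\frac{\beta_1}{\beta_2}\left(\frac{T_{m2}}{T_{m1}}\right)^{2}\right]
    ```

    where beta_1 and beta_2 are the two heating rates, T_m1 and T_m2 the corresponding glow-peak maximum temperatures, and k the Boltzmann constant.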

  14. Analytical approaches to determination of carnitine in biological materials, foods and dietary supplements.

    PubMed

    Dąbrowska, Monika; Starek, Małgorzata

    2014-01-01

    l-Carnitine is a vitamin-like amino acid derivative, which is an essential factor in fatty acid metabolism as an acyltransferase cofactor and in energy production processes, such as interconversion in the mechanisms of regulation of ketogenesis and thermogenesis, and it is also used in the therapy of primary and secondary deficiency, and in other diseases. The determination of carnitine and acyl-carnitines can provide important information about inherited or acquired metabolic disorders, and for monitoring the biochemical effect of carnitine therapy. The endogenous carnitine pool in humans is maintained by biosynthesis and absorption of carnitine from the diet. Carnitine has one asymmetric carbon giving two stereoisomers d and l, but only the l form has a positive biological effect, thus chiral recognition of l-carnitine enantiomers is extremely important in biological, chemical and pharmaceutical sciences. In order to get more insight into carnitine metabolism and synthesis, a sensitive analysis for the determination of the concentration of free carnitine, carnitine esters and the carnitine precursors is required. Carnitine has been investigated in many biochemical, pharmacokinetic, metabolic and toxicokinetic studies and thus many analytical methods have been developed and published for the determination of carnitine in foods, dietary supplements, pharmaceutical formulations, biological tissues and body fluids. The analytical procedures presented in this review have been validated in terms of basic parameters (linearity, limit of detection, limit of quantitation, sensitivity, accuracy, and precision). This article presents the impact of different analytical techniques and provides an overview of applications that address a diverse array of pharmaceutical and biological questions and samples. Copyright © 2013 Elsevier Ltd. All rights reserved.

  15. Systematically reviewing and synthesizing evidence from conversation analytic and related discursive research to inform healthcare communication practice and policy: an illustrated guide

    PubMed Central

    2013-01-01

    Background Healthcare delivery is largely accomplished in and through conversations between people, and healthcare quality and effectiveness depend enormously upon the communication practices employed within these conversations. An important body of evidence about these practices has been generated by conversation analysis and related discourse analytic approaches, but there has been very little systematic reviewing of this evidence. Methods We developed an approach to reviewing evidence from conversation analytic and related discursive research through the following procedures: • reviewing existing systematic review methods and our own prior experience of applying these • clarifying distinctive features of conversation analytic and related discursive work which must be taken into account when reviewing • holding discussions within a review advisory team that included members with expertise in healthcare research, conversation analytic research, and systematic reviewing • attempting and then refining procedures through conducting an actual review which examined evidence about how people talk about difficult future issues including illness progression and dying Results We produced a step-by-step guide which we describe here in terms of eight stages, and which we illustrate from our ‘Review of Future Talk’. The guide incorporates both established procedures for systematic reviewing, and new techniques designed for working with conversation analytic evidence. Conclusions The guide is designed to inform systematic reviews of conversation analytic and related discursive evidence on specific domains and topics. Whilst we designed it for reviews that aim at informing healthcare practice and policy, it is flexible and could be used for reviews with other aims, for instance those aiming to underpin research programmes and projects. We advocate systematically reviewing conversation analytic and related discursive findings using this approach in order to translate

  16. Predictors of Bullying and Victimization in Childhood and Adolescence: A Meta-Analytic Investigation

    ERIC Educational Resources Information Center

    Cook, Clayton R.; Williams, Kirk R.; Guerra, Nancy G.; Kim, Tia E.; Sadek, Shelly

    2010-01-01

    Research on the predictors of 3 bully status groups (bullies, victims, and bully victims) for school-age children and adolescents was synthesized using meta-analytic procedures. The primary purpose was to determine the relative strength of individual and contextual predictors to identify targets for prevention and intervention. Age and how…

  17. Analytic variance estimates of Swank and Fano factors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gutierrez, Benjamin; Badano, Aldo; Samuelson, Frank, E-mail: frank.samuelson@fda.hhs.gov

    Purpose: Variance estimates for detector energy resolution metrics can be used as stopping criteria in Monte Carlo simulations for the purpose of ensuring a small uncertainty of those metrics and for the design of variance reduction techniques. Methods: The authors derive an estimate for the variance of two energy resolution metrics, the Swank factor and the Fano factor, in terms of statistical moments that can be accumulated without significant computational overhead. The authors examine the accuracy of these two estimators and demonstrate how the estimates of the coefficient of variation of the Swank and Fano factors behave with data from a Monte Carlo simulation of an indirect x-ray imaging detector. Results: The authors' analyses suggest that the accuracy of their variance estimators is appropriate for estimating the actual variances of the Swank and Fano factors for a variety of distributions of detector outputs. Conclusions: The variance estimators derived in this work provide a computationally convenient way to estimate the error or coefficient of variation of the Swank and Fano factors during Monte Carlo simulations of radiation imaging systems.
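
    As context for the two metrics, they can be computed from a sample of per-event detector outputs roughly as follows; this is a sketch of the standard definitions, not the authors' variance estimators:

    ```python
    # Sketch: sample-based Swank and Fano factors for per-event detector outputs x
    # (e.g., number of optical quanta produced per absorbed x ray).
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.poisson(lam=1000, size=100_000).astype(float)  # toy output distribution

    mean = x.mean()
    fano = x.var() / mean             # Fano factor: variance-to-mean ratio
    swank = mean**2 / np.mean(x**2)   # Swank factor: M1^2 / (M0 * M2), with M0 = 1

    print(f"Fano  ~ {fano:.3f}")      # ~1 for a Poisson distribution
    print(f"Swank ~ {swank:.4f}")     # approaches 1 as the relative spread shrinks
    ```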

  18. Utility perspective on USEPA analytical methods program redirection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koch, B.; Davis, M.K.; Krasner, S.W.

    1996-11-01

    The Metropolitan Water District of Southern California (Metropolitan) is a public, municipal corporation, created by the State of California, which wholesales supplemental water through 27 member agencies (cities and water districts). Metropolitan serves nearly 16 million people in an area along the coastal plain of Southern California that covers approximately 5200 square miles. Water deliveries have averaged up to 2.5 million acre-feet per year. Metropolitan's Water Quality Laboratory (WQL) conducts compliance monitoring of its source and finished drinking waters for chemical and microbial constituents. The laboratory maintains certification of a large number and variety of analytical procedures. The WQL operates in a 17,000-square-foot facility. The equipment is state-of-the-art analytical instrumentation. The staff consists of 40 professional chemists and microbiologists whose experience and expertise are extensive and often highly specialized. The staff turnover is very low, and the laboratory is consistently, efficiently, and expertly run.

  19. Let's Talk... Analytics

    ERIC Educational Resources Information Center

    Oblinger, Diana G.

    2012-01-01

    Talk about analytics seems to be everywhere. Everyone is talking about analytics. Yet even with all the talk, many in higher education have questions about--and objections to--using analytics in colleges and universities. In this article, the author explores the use of analytics in, and all around, higher education. (Contains 1 note.)

  20. Risk Factors for 30-Day Hospital Re-Admission with an Infectious Complication after Lower-Extremity Vascular Procedures.

    PubMed

    Melvin, Joseph C; Smith, Jamie B; Kruse, Robin L; Vogel, Todd R

    2017-04-01

    Lowering the 30-d re-admission rate after vascular surgery offers the potential to improve healthcare quality. This study evaluated re-admission associated with infections after open and endovascular lower extremity (LE) procedures for peripheral artery disease (PAD). Patients admitted for elective LE procedures for PAD were selected from the Cerner Health Facts® database. Chi-square analysis evaluated the characteristics of the index admission associated with infection at 30-d re-admission. Multivariable logistic models were created to examine the association of patient and procedural characteristics with infections at re-admission. The microbiology data available at the time of re-admission were evaluated also. A total of 7,089 patients underwent elective LE procedures, of whom 770 (10.9%) were re-admitted within 30 d. A total of 289 (37.5%) had a diagnosis of infection during the re-admission. These infections included surgical site (14.8%), cellulitis (13.6%), sepsis (8.8%), urinary tract (4.9%), and pneumonia (4.9%). Index stay factors associated with infection at re-admission were fluid and electrolyte disorders, kidney disease, diabetes, previous infection, and chronic anemia. Laboratory results associated with an infection during re-admission were post-operative hemoglobin <8 g/dL, blood urea nitrogen >20 mg/dL, platelet counts >400 × 10³/mcL, glucose >180 mg/dL, and white blood cell count >11.0 × 10³/mcL. Adjusted models demonstrated longer stay, chronic anemia, previous infection, treatment at a teaching hospital, and hemoglobin <8 g/dL to be risk factors for re-admission with infection. Infective organisms isolated during the re-admission stay included Staphylococcus, Enterococcus, Escherichia, Pseudomonas, Proteus, and Klebsiella. Infectious complications were associated with more than one-third of all re-admissions after LE procedures. Predictors of re-admission within 30 d with an infectious complication were longer stay

  1. Analytics for Education

    ERIC Educational Resources Information Center

    MacNeill, Sheila; Campbell, Lorna M.; Hawksey, Martin

    2014-01-01

    This article presents an overview of the development and use of analytics in the context of education. Using Buckingham Shum's three levels of analytics, the authors present a critical analysis of current developments in the domain of learning analytics, and contrast the potential value of analytics research and development with real world…

  2. 21 CFR 320.29 - Analytical methods for an in vivo bioavailability or bioequivalence study.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 5 2010-04-01 Analytical methods for an in vivo bioavailability..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) DRUGS FOR HUMAN USE BIOAVAILABILITY AND BIOEQUIVALENCE REQUIREMENTS Procedures for Determining the Bioavailability or Bioequivalence of Drug Products § 320.29...

  3. The "Journal of Learning Analytics": Supporting and Promoting Learning Analytics Research

    ERIC Educational Resources Information Center

    Siemens, George

    2014-01-01

    The paper gives a brief overview of the main activities for the development of the emerging field of learning analytics led by the Society for Learning Analytics Research (SoLAR). The place of the "Journal of Learning Analytics" is identified. Analytics is the most significant new initiative of SoLAR.

  4. 3-D discrete analytical ridgelet transform.

    PubMed

    Helbert, David; Carré, Philippe; Andres, Eric

    2006-12-01

    In this paper, we propose an implementation of the 3-D Ridgelet transform: the 3-D discrete analytical Ridgelet transform (3-D DART). This transform uses the Fourier strategy for the computation of the associated 3-D discrete Radon transform. The innovative step is the definition of a discrete 3-D transform with the discrete analytical geometry theory by the construction of 3-D discrete analytical lines in the Fourier domain. We propose two types of 3-D discrete lines: 3-D discrete radial lines going through the origin defined from their orthogonal projections and 3-D planes covered with 2-D discrete line segments. These discrete analytical lines have a parameter called arithmetical thickness, allowing us to define a 3-D DART adapted to a specific application. Indeed, the 3-D DART representation is not orthogonal; it is associated with a flexible redundancy factor. The 3-D DART has a very simple forward/inverse algorithm that provides an exact reconstruction without any iterative method. In order to illustrate the potential of this new discrete transform, we apply the 3-D DART and its extension to the Local-DART (with smooth windowing) to the denoising of 3-D images and color video. These experimental results show that the simple thresholding of the 3-D DART coefficients is efficient.

  5. Analytical simulation of weld effects in creep range

    NASA Technical Reports Server (NTRS)

    Dhalla, A. K.

    1985-01-01

    The inelastic analysis procedure used to investigate the effect of welding on the creep rupture strength of a typical Liquid Metal Fast Breeder Reactor (LMFBR) nozzle is discussed. The current study is part of an overall experimental and analytical investigation to verify the inelastic analysis procedure now being used to design LMFBR structural components operating at elevated temperatures. Two important weld effects included in the numerical analysis are: (1) the residual stress introduced in the fabrication process; and (2) the time-independent and the time-dependent material property variations. Finite element inelastic analysis was performed on a CRAY-1S computer using the ABAQUS program with the constitutive equations developed for the design of LMFBR structural components. The predicted peak weld residual stresses relax by as much as 40% during elevated temperature operation, and their effect on creep-rupture cracking of the nozzle is considered of secondary importance.

  6. Targeted analyte deconvolution and identification by four-way parallel factor analysis using three-dimensional gas chromatography with mass spectrometry data.

    PubMed

    Watson, Nathanial E; Prebihalo, Sarah E; Synovec, Robert E

    2017-08-29

    Comprehensive three-dimensional gas chromatography with time-of-flight mass spectrometry (GC³-TOFMS) creates an opportunity to explore a new paradigm in chemometric analysis. Using this newly described instrument and the well understood Parallel Factor Analysis (PARAFAC) model we present one option for utilization of the novel GC³-TOFMS data structure. We present a method which builds upon previous work in both GC³ and targeted analysis using PARAFAC to simplify some of the implementation challenges previously discovered. Conceptualizing the GC³-TOFMS instead as a one-dimensional gas chromatograph with GC × GC-TOFMS detection we allow the instrument to create the PARAFAC target window natively. Each first dimension modulation thus creates a full GC × GC-TOFMS chromatogram fully amenable to PARAFAC. A simple mixture of 115 compounds and a diesel sample are interrogated through this methodology. All test analyte targets are successfully identified in both mixtures. In addition, mass spectral matching of the PARAFAC loadings to library spectra yielded results greater than 900 in 40 of 42 test analyte cases. Twenty-nine of these cases produced match values greater than 950. Copyright © 2017 Elsevier B.V. All rights reserved.
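
    For orientation, a four-way PARAFAC model of the GC³-TOFMS data cube (stated here generically, not as the paper's exact notation) decomposes each data point as a sum of rank-one terms:

    ```latex
    x_{ijkl} \;\approx\; \sum_{r=1}^{R} a_{ir}\, b_{jr}\, c_{kr}\, d_{lr} + e_{ijkl}
    ```

    where the four modes index the three chromatographic dimensions and m/z, R is the number of components, and each component's loading vector in the mass-spectral mode is the deconvolved spectrum that is matched against the library.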

  7. Four points function fitted and first derivative procedure for determining the end points in potentiometric titration curves: statistical analysis and method comparison.

    PubMed

    Kholeif, S A

    2001-06-01

    A new method that belongs to the differential category for determining the end points from potentiometric titration curves is presented. It uses a preprocess to find first derivative values by fitting four data points in and around the region of inflection to a non-linear function, and then locates the end point, usually as a maximum or minimum, using an inverse parabolic interpolation procedure that has an analytical solution. The behavior and accuracy of the sigmoid and cumulative non-linear functions used are investigated against three factors. A statistical evaluation of the new method using linear least-squares method validation and multifactor data analysis is covered. The new method is generally applied to symmetrical and unsymmetrical potentiometric titration curves, and the end point is calculated using numerical procedures only. It outperforms the "parent" regular differential method in almost all factor levels and gives accurate results comparable to the true or estimated true end points. Calculated end points from selected experimental titration curves compatible with the equivalence point category of methods, such as Gran or Fortuin, are also compared with the new method.
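
    The end-point location step has a closed form: given three derivative estimates bracketing the extremum, the vertex of the interpolating parabola can be computed directly. This is a generic sketch of inverse parabolic interpolation, not the paper's four-point fitting function, and the data values are invented:

    ```python
    # Sketch: locate the extremum of a peaked first-derivative curve by inverse
    # parabolic interpolation through three points (x1, y1), (x2, y2), (x3, y3)
    # bracketing the maximum, e.g. dE/dV values around a titration end point.
    def parabolic_vertex(x1, y1, x2, y2, x3, y3):
        num = (x2 - x1) ** 2 * (y2 - y3) - (x2 - x3) ** 2 * (y2 - y1)
        den = (x2 - x1) * (y2 - y3) - (x2 - x3) * (y2 - y1)
        return x2 - 0.5 * num / den

    # First-derivative values (mV/mL) around a hypothetical end point at 12.50 mL:
    print(parabolic_vertex(12.3, 180.0, 12.5, 260.0, 12.7, 180.0))  # -> 12.5
    ```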

  8. Integrated Analytical Evaluation and Optimization of Model Parameters against Preprocessed Measurement Data

    DTIC Science & Technology

    1989-06-23

    The most recent changes are: a) development of the VSTS (velocity space topology search) algorithm for calculating particle densities b) extension...with simple analytic models. The largest modification of the MACH code was the implementation of the VSTS procedure, which constituted a complete

  9. A Meta-Analytic Review of Work-Family Conflict and Its Antecedents

    ERIC Educational Resources Information Center

    Byron, Kristin

    2005-01-01

    This meta-analytic review combines the results of more than 60 studies to help determine the relative effects of work, nonwork, and demographic and individual factors on work interference with family (WIF) and family interference with work (FIW). As expected, work factors related more strongly to WIF, and some nonwork factors were more strongly…

  10. Comparison of procedures for correction of matrix interferences in the analysis of soils by ICP-OES with CCD detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sadler, D.A.; Sun, F.; Littlejohn, D.

    1995-12-31

    ICP-OES is a useful technique for multi-element analysis of soils. However, as a number of elements are present in relatively high concentrations, matrix interferences can occur and examples have been widely reported. The availability of CCD detectors has increased the opportunities for rapid multi-element, multi-wave-length determination of elemental concentrations in soils and other environmental samples. As the composition of soils from industrial sites can vary considerably, especially when taken from different pit horizons, procedures are required to assess the extent of interferences and correct the effects, on a simultaneous multi-element basis. In single element analysis, plasma operating conditions can sometimes be varied to minimize or even remove multiplicative interferences. In simultaneous multi-element analysis, the scope for this approach may be limited, depending on the spectrochemical characteristics of the emitting analyte species. Matrix matching, by addition of major sample components to the analyte calibrant solutions, can be used to minimize inaccuracies. However, there are also limitations to this procedure, when the sample composition varies significantly. Multiplicative interference effects can also be assessed by a "single standard addition" of each analyte to the sample solution and the information obtained may be used to correct the analyte concentrations determined directly. Each of these approaches has been evaluated to ascertain the best procedure for multi-element analysis of industrial soils by ICP-OES with CCD detection at multiple wavelengths. Standard reference materials and field samples have been analyzed to illustrate the efficacy of each procedure.
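
    The "single standard addition" check can be sketched as generic spike-recovery arithmetic, assuming a linear response and negligible dilution on spiking; all concentrations and signals below are invented:

    ```python
    # Sketch: assess a multiplicative matrix interference from a single standard
    # addition and correct the directly determined concentration accordingly.
    def recovery_corrected(conc_direct, signal_sample, signal_spiked, conc_added):
        """Correct an externally calibrated result using the spike recovery.

        conc_direct   -- concentration from the external (matrix-free) calibration
        signal_sample -- emission intensity of the unspiked sample solution
        signal_spiked -- emission intensity after adding `conc_added` of analyte
        """
        sensitivity_in_matrix = (signal_spiked - signal_sample) / conc_added
        sensitivity_calibration = signal_sample / conc_direct
        recovery = sensitivity_in_matrix / sensitivity_calibration  # 1.0 = no interference
        return conc_direct / recovery, recovery

    conc, rec = recovery_corrected(conc_direct=8.0, signal_sample=4000.0,
                                   signal_spiked=8500.0, conc_added=10.0)
    print(f"spike recovery: {rec:.2f}, corrected concentration: {conc:.1f} mg/L")
    ```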

  11. Risk factors for surgical site infection following nonshunt pediatric neurosurgery: a review of 9296 procedures from a national database and comparison with a single-center experience

    PubMed Central

    Sherrod, Brandon A.; Arynchyna, Anastasia A.; Johnston, James M.; Rozzelle, Curtis J.; Blount, Jeffrey P.; Oakes, W. Jerry; Rocque, Brandon G.

    2017-01-01

    Objective Surgical site infection (SSI) following CSF shunt operations has been well studied, yet risk factors for nonshunt pediatric neurosurgery are less well understood. The purpose of this study was to determine SSI rates and risk factors following nonshunt pediatric neurosurgery using a nationwide patient cohort and an institutional dataset designed specifically for understanding SSI. Methods The authors reviewed the American College of Surgeons National Surgical Quality Improvement Program Pediatric (ACS NSQIP-P) database for the years 2012–2014, including all neurosurgical procedures performed on pediatric patients except CSF shunts and hematoma evacuations. SSI included deep (intracranial abscesses, meningitis, osteomyelitis, and ventriculitis) and superficial wound infections. The authors performed univariate analyses of SSI association with procedure, demographic, comorbidity, operative, and hospital variables, with subsequent multivariate logistic regression analysis to determine independent risk factors for SSI within 30 days of the index procedure. A similar analysis was performed using a detailed institutional infection database from Children’s Hospital of Alabama (COA). Results A total of 9296 nonshunt procedures were identified in NSQIP-P with an overall 30-day SSI rate of 2.7%. The 30-day SSI rate in the COA institutional database was similar (3.3% of 1103 procedures, p = 0.325). Postoperative time to SSI in NSQIP-P and COA was 14.6 ± 6.8 days and 14.8 ± 7.3 days, respectively (mean ± SD). Myelomeningocele (4.3% in NSQIP-P, 6.3% in COA), spine (3.5%, 4.9%), and epilepsy (3.4%, 3.1%) procedure categories had the highest SSI rates in both NSQIP-P and COA. Independent SSI risk factors in NSQIP-P included postoperative pneumonia (OR 4.761, 95% CI 1.269–17.857, p = 0.021), immune disease/immunosuppressant use (OR 3.671, 95% CI 1.371–9.827, p = 0.010), cerebral palsy (OR 2.835, 95% CI 1.463–5.494, p = 0.002), emergency

  12. 40 CFR Appendix B to Part 136 - Definition and Procedure for the Determination of the Method Detection Limit-Revision 1.11

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES... to a wide variety of sample types ranging from reagent (blank) water containing analyte to wastewater... times the standard deviation of replicate instrumental measurements of the analyte in reagent water. (c...

  13. Recently published analytical methods for determining alcohol in body materials : alcohol countermeasures literature review

    DOT National Transportation Integrated Search

    1974-10-01

    The author has brought the review of published analytical methods for determining alcohol in body materials up-to- date. The review deals with analytical methods for alcohol in blood and other body fluids and tissues; breath alcohol methods; factors ...

  14. Settlements around pumping wells: Analysis of influential factors and a simple calculation procedure

    NASA Astrophysics Data System (ADS)

    Pujades, Estanislao; De Simone, Silvia; Carrera, Jesus; Vázquez-Suñé, Enric; Jurado, Anna

    2017-05-01

    Estimated and measured settlements caused by pumping rarely agree. Several reasons could explain this mismatch, including the influence of layering, the mechanical parameters used in the predictions, or the relationship between settlements and drawdown. We analyze the influence of the above issues by investigating the mechanical response of pumped elastic porous media under different conditions. A radially symmetric conceptual model is considered and several hydro-mechanical simulations are performed varying the boundary conditions, the size of the modeled domain and the presence or absence of an overlying layer. The simplicity of the considered problem allows us to compare our results with existing analytical solutions, to identify the role of each variable in pumping settlements and to generalize the results. The most relevant results are as follows: (1) settlements are proportional to drawdown only outside a circle of radius equal to 0.7 times the thickness of the pumped porous medium; inside, they are virtually constant, which leads to two simple procedures for computing pumping settlements. (2) Poorly conductive layers located above (or below) a pumped porous medium (with higher hydraulic conductivity) reduce and smooth settlements. (3) Boundary constraints affect the local specific storage coefficient and the resulting displacements. (4) The specific storage coefficient evaluated by interpreting pumping tests with the Cooper and Jacob method (1946) leads to overestimation of the actual Young's Modulus of the soil. The main conclusion is that settlements are less differential than expected near pumping wells. Still, they must always be evaluated with due attention to layering and boundary constraints, and with careful selection of the soil's mechanical parameters.
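    A rough numerical illustration of the kind of simple procedure implied by result (1) can be sketched by combining a standard Cooper-Jacob drawdown estimate with one-dimensional elastic compaction. This is an assumption-laden sketch, not the authors' procedure; parameter names are generic and units must be chosen consistently.

```python
import numpy as np

def pumping_settlement(r, Q, T, S, Ss, b, t):
    """Rough settlement estimate around a pumping well.

    Assumptions: (i) Cooper-Jacob drawdown
        s = (2.3*Q / (4*pi*T)) * log10(2.25*T*t / (r^2 * S)),
    valid only at late time / small radius where the approximation holds;
    (ii) one-dimensional elastic compaction, settlement = Ss * b * drawdown;
    (iii) per the abstract, settlement tracks drawdown only for r > 0.7*b and
    is roughly constant closer to the well, so r is floored at 0.7*b.
    """
    r_eff = np.maximum(np.asarray(r, float), 0.7 * b)
    drawdown = (2.3 * Q / (4.0 * np.pi * T)) * np.log10(2.25 * T * t / (r_eff**2 * S))
    return Ss * b * drawdown
```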

  15. Analytical methods of the U.S. Geological Survey's New York District Water-Analysis Laboratory

    USGS Publications Warehouse

    Lawrence, Gregory B.; Lincoln, Tricia A.; Horan-Ross, Debra A.; Olson, Mark L.; Waldron, Laura A.

    1995-01-01

    The New York District of the U.S. Geological Survey (USGS) in Troy, N.Y., operates a water-analysis laboratory for USGS watershed-research projects in the Northeast that require analyses of precipitation and of dilute surface water and soil water for major ions; it also provides analyses of certain chemical constituents in soils and soil gas samples. This report presents the methods for chemical analyses of water samples, soil-water samples, and soil-gas samples collected in watershed-research projects. The introduction describes the general materials and techniques for each method and explains the USGS quality-assurance program and data-management procedures; it also explains the use of cross-references to the three most commonly used methods manuals for analysis of dilute waters. The body of the report describes the analytical procedures for (1) solution analysis, (2) soil analysis, and (3) soil-gas analysis. The methods are presented in alphabetical order by constituent. The method for each constituent is preceded by (1) reference codes for pertinent sections of the three manuals mentioned above, (2) a list of the method's applications, and (3) a summary of the procedure. The methods section for each constituent contains the following categories: instrumentation and equipment, sample preservation and storage, reagents and standards, analytical procedures, quality control, maintenance, interferences, safety considerations, and references. Sufficient information is presented for each method to allow the resulting data to be appropriately used in environmental investigations.

  16. Generalized Subset Designs in Analytical Chemistry.

    PubMed

    Surowiec, Izabella; Vikström, Ludvig; Hector, Gustaf; Johansson, Erik; Vikström, Conny; Trygg, Johan

    2017-06-20

    Design of experiments (DOE) is an established methodology in research, development, manufacturing, and production for screening, optimization, and robustness testing. Two-level fractional factorial designs remain the preferred approach due to high information content while keeping the number of experiments low. These types of designs, however, have never been extended to a generalized multilevel reduced design type capable of including both qualitative and quantitative factors. In this Article we describe a novel generalized fractional factorial design. It also provides complementary and balanced subdesigns analogous to a fold-over in two-level reduced factorial designs. We demonstrate how this design type can be applied with good results in three different applications in analytical chemistry including (a) multivariate calibration using microwave resonance spectroscopy for the determination of water in tablets, (b) stability study in drug product development, and (c) representative sample selection in clinical studies. This demonstrates the potential of generalized fractional factorial designs to be applied in many other areas of analytical chemistry where representative, balanced, and complementary subsets are required, especially when a combination of quantitative and qualitative factors at multiple levels exists.
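    For context, the two-level reduced designs that the article generalizes can be written down in a few lines. The sketch below shows only a conventional 2^(3-1) half-fraction with defining relation I = ABC; it is not the generalized multilevel subset construction itself.

```python
from itertools import product

def half_fraction_three_factors():
    """Conventional 2^(3-1) half-fraction (defining relation I = ABC, i.e. C = A*B).

    Shown only to illustrate the two-level reduced designs that the article
    extends to multiple levels and mixed qualitative/quantitative factors.
    """
    runs = [(a, b, a * b) for a, b in product((-1, 1), repeat=2)]  # C = A*B
    return runs  # four of the eight full-factorial runs

print(half_fraction_three_factors())
# [(-1, -1, 1), (-1, 1, -1), (1, -1, -1), (1, 1, 1)]
```

    The complementary fold-over half is obtained by setting C = -A*B; together the two halves recover the full factorial, which is the balance property the generalized subsets aim to preserve at multiple levels.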

  17. Separating method factors and higher order traits of the Big Five: a meta-analytic multitrait-multimethod approach.

    PubMed

    Chang, Luye; Connelly, Brian S; Geeza, Alexis A

    2012-02-01

    Though most personality researchers now recognize that ratings of the Big Five are not orthogonal, the field has been divided about whether these trait intercorrelations are substantive (i.e., driven by higher order factors) or artifactual (i.e., driven by correlated measurement error). We used a meta-analytic multitrait-multirater study to estimate trait correlations after common method variance was controlled. Our results indicated that common method variance substantially inflates trait correlations, and, once controlled, correlations among the Big Five became relatively modest. We then evaluated whether two different theories of higher order factors could account for the pattern of Big Five trait correlations. Our results did not support Rushton and colleagues' (Rushton & Irwing, 2008; Rushton et al., 2009) proposed general factor of personality, but Digman's (1997) α and β metatraits (relabeled by DeYoung, Peterson, and Higgins (2002) as Stability and Plasticity, respectively) produced viable fit. However, our models showed considerable overlap between Stability and Emotional Stability and between Plasticity and Extraversion, raising the question of whether these metatraits are redundant with their dominant Big Five traits. This pattern of findings was robust when we included only studies whose observers were intimately acquainted with targets. Our results underscore the importance of using a multirater approach to studying personality and the need to separate the causes and outcomes of higher order metatraits from those of the Big Five. We discussed the implications of these findings for the array of research fields in which personality is studied.

  18. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    PubMed

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for a quality by design (QbD) based analytical approach. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results owing to the robustness of the method within that region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists on the implementation of AQbD in the pharmaceutical quality system and also relates it to product quality by design and pharmaceutical analytical technology (PAT).

  19. Analytical Quality by Design: A Tool for Regulatory Flexibility and Robust Analytics

    PubMed Central

    Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for a quality by design (QbD) based analytical approach. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results owing to the robustness of the method within that region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists on the implementation of AQbD in the pharmaceutical quality system and also relates it to product quality by design and pharmaceutical analytical technology (PAT). PMID:25722723

  20. Analytical method for the effects of the asteroid belt on planetary orbits

    NASA Technical Reports Server (NTRS)

    Mayo, A. P.

    1979-01-01

    Analytic expressions are derived for the perturbation of planetary orbits due to a thick constant-density asteroid belt. The derivations include extensions and adaptations of Plakhov's (1968) analytic expressions for the perturbations in five of the orbital elements for closed orbits around Saturn's rings. The equations of Plakhov are modified to include the effect of ring thickness, and additional equations are derived for the perturbations in the sixth orbital element, the mean anomaly. The gravitational potential and orbital perturbations are derived for the asteroid belt with and without thickness, and for a hoop approximation to the belt. The procedures are also applicable to Saturn's rings and the newly discovered rings of Uranus. The effects of the asteroid belt thickness on the gravitational potential coefficients and the orbital motions are demonstrated. Comparisons between the Mars orbital perturbations obtained by using the analytic expressions and those obtained by numerical integration are discussed. The effects of the asteroid belt on earth-based ranging to Mars are also demonstrated.

  1. Standard operating procedures, water immersion facility, revision B

    NASA Technical Reports Server (NTRS)

    1979-01-01

    General guideline procedures to identify those factors that are common to all spacecraft design laboratory support group emergency procedures and to establish the basic rescue plan are presented. This eliminates needless repetition of the fundamentals from the other, more specific procedures.

  2. Energy-dispersive X-ray fluorescence systems as analytical tool for assessment of contaminated soils.

    PubMed

    Vanhoof, Chris; Corthouts, Valère; Tirez, Kristof

    2004-04-01

    To determine the heavy metal content in soil samples at contaminated locations, a static and time-consuming procedure is used in most cases. Soil samples are collected and analyzed in the laboratory at high quality and high analytical costs. Demand is growing from government and consultants for a more dynamic approach, and from customers who require analyses performed in the field with immediate feedback of the analytical results. Especially during the follow-up of remediation projects or during the determination of the sampling strategy, field analyses are advisable. For this purpose four types of ED-XRF systems, ranging from portable up to high performance laboratory systems, have been evaluated. The evaluation criteria are based on the performance characteristics of the ED-XRF systems, such as limit of detection, accuracy, and measurement uncertainty on the one hand, and the influence of sample pretreatment on the obtained results on the other. The study proved that the field portable system and the bench top system, placed in a mobile van, can be applied as field techniques, resulting in semi-quantitative analytical results. A limited homogenization of the analyzed sample significantly increases the representativeness of the soil sample. The ED-XRF systems can be differentiated by their limits of detection, which are a factor of 10 to 20 higher for the portable system. The accuracy of the results and the measurement uncertainty also improved using the bench top system. Therefore, the selection criteria for applicability of both field systems are based on the required detection level and also the required accuracy of the results.

  3. Human factors research plan for instrument procedures : FY12 version 1.1

    DOT National Transportation Integrated Search

    2012-06-19

    This research will support the development of instrument procedures for performance-based navigation (PBN) operations. These procedures include, but are not limited to, area navigation (RNAV) and required navigation performance (RNP) operations. The ...

  4. Analytic thinking reduces belief in conspiracy theories.

    PubMed

    Swami, Viren; Voracek, Martin; Stieger, Stefan; Tran, Ulrich S; Furnham, Adrian

    2014-12-01

    Belief in conspiracy theories has been associated with a range of negative health, civic, and social outcomes, requiring reliable methods of reducing such belief. Thinking dispositions have been highlighted as one possible factor associated with belief in conspiracy theories, but actual relationships have only been infrequently studied. In Study 1, we examined associations between belief in conspiracy theories and a range of measures of thinking dispositions in a British sample (N=990). Results indicated that a stronger belief in conspiracy theories was significantly associated with lower analytic thinking and open-mindedness and greater intuitive thinking. In Studies 2-4, we examined the causal role played by analytic thinking in relation to conspiracist ideation. In Study 2 (N=112), we showed that a verbal fluency task that elicited analytic thinking reduced belief in conspiracy theories. In Study 3 (N=189), we found that an alternative method of eliciting analytic thinking, which related to cognitive disfluency, was effective at reducing conspiracist ideation in a student sample. In Study 4, we replicated the results of Study 3 among a general population sample (N=140) in relation to generic conspiracist ideation and belief in conspiracy theories about the July 7, 2005, bombings in London. Our results highlight the potential utility of supporting attempts to promote analytic thinking as a means of countering the widespread acceptance of conspiracy theories. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. Differences in Coping Styles among Persons with Spinal Cord Injury: A Cluster-Analytic Approach.

    ERIC Educational Resources Information Center

    Frank, Robert G.; And Others

    1987-01-01

    Identified and validated two subgroups in group of 53 persons with spinal cord injury by applying cluster-analytic procedures to subjects' self-reported coping and health locus of control belief scores. Cluster 1 coped less effectively and tended to be psychologically distressed; Cluster 2 subjects emphasized internal health attributions and…

  6. Analytic solution for quasi-Lambertian radiation transfer.

    PubMed

    Braun, Avi; Gordon, Jeffrey M

    2010-02-10

    An analytic solution is derived for radiation transfer between flat quasi-Lambertian surfaces of arbitrary orientation, i.e., surfaces that radiate in a Lambertian fashion but within a numerical aperture smaller than unity. These formulas obviate the need for ray trace simulations and provide exact, physically transparent results. Illustrative examples that capture the salient features of the flux maps and the efficiency of flux transfer are presented for a few configurations of practical interest. There is also a fundamental reciprocity relation for quasi-Lambertian exchange, akin to the reciprocity theorem for fully Lambertian surfaces. Applications include optical fiber coupling, fiber-optic biomedical procedures, and solar concentrators.

  7. The case for visual analytics of arsenic concentrations in foods.

    PubMed

    Johnson, Matilda O; Cohly, Hari H P; Isokpehi, Raphael D; Awofolu, Omotayo R

    2010-05-01

    Arsenic is a naturally occurring toxic metal and its presence in food could be a potential risk to the health of both humans and animals. Prolonged ingestion of arsenic contaminated water may result in manifestations of toxicity in all systems of the body. Visual Analytics is a multidisciplinary field that is defined as the science of analytical reasoning facilitated by interactive visual interfaces. The concentrations of arsenic vary in foods, making it impractical and impossible to provide a regulatory limit for each food. This review article presents a case for the use of visual analytics approaches to provide comparative assessment of arsenic in various foods. The topics covered include (i) metabolism of arsenic in the human body; (ii) arsenic concentrations in various foods; (iii) factors affecting arsenic uptake in plants; (iv) introduction to visual analytics; and (v) benefits of visual analytics for comparative assessment of arsenic concentration in foods. Visual analytics can provide an information superstructure of arsenic in various foods to permit insightful comparative risk assessment of the diverse and continually expanding data on arsenic in food groups in the context of country of study or origin, year of study, method of analysis and arsenic species.

  8. The Case for Visual Analytics of Arsenic Concentrations in Foods

    PubMed Central

    Johnson, Matilda O.; Cohly, Hari H.P.; Isokpehi, Raphael D.; Awofolu, Omotayo R.

    2010-01-01

    Arsenic is a naturally occurring toxic metal and its presence in food could be a potential risk to the health of both humans and animals. Prolonged ingestion of arsenic contaminated water may result in manifestations of toxicity in all systems of the body. Visual Analytics is a multidisciplinary field that is defined as the science of analytical reasoning facilitated by interactive visual interfaces. The concentrations of arsenic vary in foods, making it impractical and impossible to provide a regulatory limit for each food. This review article presents a case for the use of visual analytics approaches to provide comparative assessment of arsenic in various foods. The topics covered include (i) metabolism of arsenic in the human body; (ii) arsenic concentrations in various foods; (iii) factors affecting arsenic uptake in plants; (iv) introduction to visual analytics; and (v) benefits of visual analytics for comparative assessment of arsenic concentration in foods. Visual analytics can provide an information superstructure of arsenic in various foods to permit insightful comparative risk assessment of the diverse and continually expanding data on arsenic in food groups in the context of country of study or origin, year of study, method of analysis and arsenic species. PMID:20623005

  9. Enhancement in the sensitivity of microfluidic enzyme-linked immunosorbent assays through analyte preconcentration.

    PubMed

    Yanagisawa, Naoki; Dutta, Debashis

    2012-08-21

    In this Article, we describe a microfluidic enzyme-linked immunosorbent assay (ELISA) method whose sensitivity can be substantially enhanced through preconcentration of the target analyte around a semipermeable membrane. The reported preconcentration has been accomplished in our current work via electrokinetic means allowing a significant increase in the amount of captured analyte relative to nonspecific binding in the trapping/detection zone. Upon introduction of an enzyme substrate into this region, the rate of generation of the ELISA reaction product (resorufin) was observed to increase by over a factor of 200 for the sample and 2 for the corresponding blank compared to similar assays without analyte trapping. Interestingly, in spite of nonuniformities in the amount of captured analyte along the surface of our analysis channel, the measured fluorescence signal in the preconcentration zone increased linearly with time over an enzyme reaction period of 30 min and at a rate that was proportional to the analyte concentration in the bulk sample. In our current study, the reported technique has been shown to reduce the smallest detectable concentration of the tumor marker CA 19-9 and Blue Tongue Viral antibody by over 2 orders of magnitude compared to immunoassays without analyte preconcentration. When compared to microwell based ELISAs, the reported microfluidic approach not only yielded a similar improvement in the smallest detectable analyte concentration but also reduced the sample consumption in the assay by a factor of 20 (5 μL versus 100 μL).

  10. Analytical approximations to the dynamics of an array of coupled DC SQUIDs

    NASA Astrophysics Data System (ADS)

    Berggren, Susan; Palacios, Antonio

    2014-04-01

    Coupled dynamical systems that operate near the onset of a bifurcation can lead, under certain conditions, to strong signal amplification effects. Over the past years we have studied this generic feature on a wide range of systems, including: magnetic and electric fields sensors, gyroscopic devices, and arrays of loops of superconducting quantum interference devices, also known as SQUIDs. In this work, we consider an array of SQUID loops connected in series as a case study to derive asymptotic analytical approximations to the exact solutions through perturbation analysis. Two approaches are considered. First, a straightforward expansion in which the non-linear parameter related to the inductance of the DC SQUID is treated as the small perturbation parameter. Second, a more accurate procedure that considers the SQUID phase dynamics as non-uniform motion on a circle. This second procedure is readily extended to the series array and it could serve as a mathematical framework to find approximate solutions to related complex systems with high-dimensionality. To the best of our knowledge, approximate analytical solutions to an array of SQUIDs have not yet been reported in the literature.

  11. Improving the analyte ion signal in matrix-assisted laser desorption/ionization imaging mass spectrometry via electrospray deposition by enhancing incorporation of the analyte in the matrix.

    PubMed

    Malys, Brian J; Owens, Kevin G

    2017-05-15

    Matrix-assisted laser desorption/ionization (MALDI) is widely used as the ionization method in high-resolution chemical imaging studies that seek to visualize the distribution of analytes within sectioned biological tissues. This work extends the use of electrospray deposition (ESD) to apply matrix with an additional solvent spray to incorporate and homogenize analyte within the matrix overlayer. Analytes and matrix are sequentially and independently applied by ESD to create a sample from which spectra are collected, mimicking a MALDI imaging mass spectrometry (IMS) experiment. Subsequently, an incorporation spray consisting of methanol is applied by ESD to the sample and another set of spectra are collected. The spectra prior to and after the incorporation spray are compared to evaluate the improvement in the analyte signal. Prior to the incorporation spray, samples prepared using α-cyano-4-hydroxycinnamic acid (CHCA) and 2,5-dihydroxybenzoic acid (DHB) as the matrix showed low signal while the sample using sinapinic acid (SA) initially exhibited good signal. Following the incorporation spray, the sample using SA did not show an increase in signal; the sample using DHB showed moderate gain factors of 2-5 (full ablation spectra) and 12-336 (raster spectra), while CHCA samples saw large increases in signal, with gain factors of 14-172 (full ablation spectra) and 148-1139 (raster spectra). The use of an incorporation spray to apply solvent by ESD to a matrix layer already deposited by ESD provides an increase in signal by both promoting incorporation of the analyte within and homogenizing the distribution of the incorporated analyte throughout the matrix layer. Copyright © 2017 John Wiley & Sons, Ltd. Copyright © 2017 John Wiley & Sons, Ltd.

  12. Determination of Caffeine in Beverages by Capillary Zone Electrophoresis: An Experiment for the Undergraduate Analytical Laboratory

    NASA Astrophysics Data System (ADS)

    Conte, Eric D.; Barry, Eugene F.; Rubinstein, Harry

    1996-12-01

    Certain individuals may be sensitive to specific compounds in consumer products. It is important to quantify these analytes in food products in order to monitor their intake. Caffeine is one such compound. Determination of caffeine in beverages by spectrophotometric procedures requires an extraction procedure, which can prove time-consuming. Although the corresponding determination by HPLC allows for a direct injection, capillary zone electrophoresis provides several advantages such as extremely low solvent consumption, smaller sample volume requirements, and improved sensitivity.

  13. Correcting for Indirect Range Restriction in Meta-Analysis: Testing a New Meta-Analytic Procedure

    ERIC Educational Resources Information Center

    Le, Huy; Schmidt, Frank L.

    2006-01-01

    Using computer simulation, the authors assessed the accuracy of J. E. Hunter, F. L. Schmidt, and H. Le's (2006) procedure for correcting for indirect range restriction, the most common type of range restriction, in comparison with the conventional practice of applying the Thorndike Case II correction for direct range restriction. Hunter et…

  14. Factors affecting successful colonoscopy procedures: Single-center experience.

    PubMed

    Kozan, Ramazan; Yılmaz, Tonguç Utku; Baştuğral, Uygar; Kerimoğlu, Umut; Yavuz, Yücel

    2018-01-01

    Colonoscopy is the gold standard procedure for several colon pathologies. Successful colonoscopy means demonstration of the ileocecal valve and detection of colon polyps. Here we aimed to evaluate our colonoscopy success and results. This retrospective descriptive study was performed in the İstanbul Eren hospital endoscopy unit between 2012 and 2015. Colonoscopy results and patient demographics were obtained from the hospital database. All colonoscopy procedures were performed under general anesthesia and after full bowel preparation. In all, 870 patients were included in the study. We reached the cecum in 850 (97.8%) patients. We were unable to reach the cecum in patients who were old and obese and in those with previous lower abdominal operations. Angulation, inability to move forward, and tortuous colon were the reasons for inability to reach the cecum. A total of 203 polyp samplings were performed in 139 patients. We performed 1, 2, and 3 polypectomies in 97, 28, and 10 patients, respectively. There were 29 (3.3%) colorectal cancers in our series. There was no mortality or morbidity in our study. General anesthesia and full bowel preparation may be the reason for the increased success of colonoscopy. Increased experience and patient-endoscopist cooperation increased the rate of cecum access and polyp resection and decreased the complication rate.

  15. Annual banned-substance review: analytical approaches in human sports drug testing.

    PubMed

    Thevis, Mario; Kuuranne, Tiia; Walpurgis, Katja; Geyer, Hans; Schänzer, Wilhelm

    2016-01-01

    The aim of improving anti-doping efforts is predicated on several different pillars, including, amongst others, optimized analytical methods. These commonly result from exploiting most recent developments in analytical instrumentation as well as research data on elite athletes' physiology in general, and pharmacology, metabolism, elimination, and downstream effects of prohibited substances and methods of doping, in particular. The need for frequent and adequate adaptations of sports drug testing procedures has been incessant, largely due to the uninterrupted emergence of new chemical entities but also due to the apparent use of established or even obsolete drugs for reasons other than therapeutic means, such as assumed beneficial effects on endurance, strength, and regeneration capacities. Continuing the series of annual banned-substance reviews, literature concerning human sports drug testing published between October 2014 and September 2015 is summarized and reviewed in reference to the content of the 2015 Prohibited List as issued by the World Anti-Doping Agency (WADA), with particular emphasis on analytical approaches and their contribution to enhanced doping controls. Copyright © 2016 John Wiley & Sons, Ltd.

  16. Proportion of general factor variance in a hierarchical multiple-component measuring instrument: a note on a confidence interval estimation procedure.

    PubMed

    Raykov, Tenko; Zinbarg, Richard E

    2011-05-01

    A confidence interval construction procedure for the proportion of explained variance by a hierarchical, general factor in a multi-component measuring instrument is outlined. The method provides point and interval estimates for the proportion of total scale score variance that is accounted for by the general factor, which could be viewed as common to all components. The approach may also be used for testing composite (one-tailed) or simple hypotheses about this proportion, and is illustrated with a pair of examples. ©2010 The British Psychological Society.
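    A hedged sketch of the point estimate underlying such a proportion (an omega-hierarchical-style coefficient) is shown below. The loadings and total-score variance are assumed to come from an already-fitted hierarchical or bifactor model; the article's latent-variable interval estimation procedure is not reproduced here, and the bootstrap remark in the comments is only one generic alternative for obtaining an interval.

```python
import numpy as np

def general_factor_proportion(general_loadings, total_score_variance):
    """Proportion of total (unit-weighted) scale-score variance explained by
    the general factor: (sum of general-factor loadings)^2 / Var(total score).

    Sketch only; a confidence interval could be obtained, e.g., by refitting
    the factor model on bootstrap resamples, which is not the procedure
    proposed in the article.
    """
    g = np.asarray(general_loadings, dtype=float)
    return float(g.sum() ** 2 / total_score_variance)

# Example: component loadings (0.6, 0.5, 0.7, 0.4) and total-score variance 10.0
print(round(general_factor_proportion([0.6, 0.5, 0.7, 0.4], 10.0), 3))  # 0.484
```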

  17. Semi-analytical Model for Estimating Absorption Coefficients of Optically Active Constituents in Coastal Waters

    NASA Astrophysics Data System (ADS)

    Wang, D.; Cui, Y.

    2015-12-01

    The objectives of this paper are to validate the applicability of a multi-band quasi-analytical algorithm (QAA) for retrieving absorption coefficients of optically active constituents in turbid coastal waters, and to further improve the model using a proposed semi-analytical model (SAA). In the SAA model, ap(531) and ag(531) are derived semi-analytically, whereas in the QAA retrieval procedure they are derived from the empirically retrieved estimates of a(531) and a(551). The two models are calibrated and evaluated against datasets taken from 19 independent cruises on the West Florida Shelf in 1999-2003, provided by SeaBASS. The results indicate that the SAA model outperforms the QAA model in absorption retrieval. Using the SAA model to retrieve absorption coefficients of optically active constituents on the West Florida Shelf decreases the random uncertainty of the estimates by >23.05% relative to the QAA model. This study demonstrates the potential of the SAA model for estimating absorption coefficients of optically active constituents even in turbid coastal waters. Keywords: Remote sensing; Coastal Water; Absorption Coefficient; Semi-analytical Model

  18. Post-procedure bleeding in interventional radiology.

    PubMed

    Mayer, J; Tacher, V; Novelli, L; Djabbari, M; You, K; Chiaradia, M; Deux, J-F; Kobeiter, H

    2015-01-01

    Following interventional radiology procedures, bleeding occurs in 0.5 to 4% of cases. Risk factors are related to the patient, to the procedure, and to the end organ. Bleeding is usually treated by interventional radiologists, mainly by embolization. Bleeding complications are preventable: before the procedure by checking hemostasis, during the procedure by ensuring an accurate puncture site (with ultrasound or fluoroscopy guidance) or by treating the puncture path using gelatin sponge, curaspon(®), biological glue or thermocoagulation, and after the procedure by carefully monitoring the patient. Copyright © 2015 Éditions françaises de radiologie. Published by Elsevier Masson SAS. All rights reserved.

  19. Nanoscaled aptasensors for multi-analyte sensing

    PubMed Central

    Saberian-Borujeni, Mehdi; Johari-Ahar, Mohammad; Hamzeiy, Hossein; Barar, Jaleh; Omidi, Yadollah

    2014-01-01

    Introduction: Nanoscaled aptamers (Aps), as short single-stranded DNA or RNA oligonucleotides, are able to bind to their specific targets with high affinity, which makes them powerful diagnostic and analytical sensing tools (the so-called "aptasensors"). Aptamers are selected from a random pool of oligonucleotides through a procedure known as "systematic evolution of ligands by exponential enrichment". Methods: In this work, the most recent studies in the field of aptasensors are reviewed and discussed with a main focus on the potential of aptasensors for multi-analyte detection. Results: Due to the specific folding capability of aptamers in the presence of analyte, aptasensors have been successfully exploited for the detection of a wide range of small and large molecules (e.g., drugs and their metabolites, toxins, and associated biomarkers in various diseases) at very low concentrations in biological fluids/samples, even in the presence of interfering species. Conclusion: Biological samples are generally complex matrices. Hence, the development of aptasensors with the capability to determine various targets simultaneously within a biological matrix remains the main challenge. To this end, integration of key scientific domains such as bioengineering and systems biology with biomedical research is inevitable. PMID:25671177

  20. Measurement of very low amounts of arsenic in soils and waters: is ICP-MS the indispensable analytical tool?

    NASA Astrophysics Data System (ADS)

    López-García, Ignacio; Marín-Hernández, Juan Jose; Perez-Sirvent, Carmen; Hernandez-Cordoba, Manuel

    2017-04-01

    The toxicity of arsenic and its wide distribution in nature need not be emphasized nowadays, and the value of reliable analytical tools for arsenic determination at very low levels is clear. Leaving aside atomic fluorescence spectrometers specifically designed for this purpose, the task is currently carried out by using inductively coupled plasma mass spectrometry (ICP-MS), a powerful but expensive technique that is not available in all laboratories. However, as the recent literature clearly shows, a similar or even better analytical performance for the determination of several elements can be achieved by replacing the ICP-MS instrument by an AAS spectrometer (which is commonly present in any laboratory and involves low acquisition and maintenance costs) provided that a simple microextraction step is used to preconcentrate the sample. This communication reports the optimization and results obtained with a new analytical procedure based on this idea and focused on the determination of very low concentrations of arsenic in waters and extracts from soils and sediments. The procedure is based on a micro-solid phase extraction process for the separation and preconcentration of arsenic that uses magnetic particles covered with silver nanoparticles functionalized with the sodium salt of 2-mercaptoethane-sulphonate (MESNa). This composite is easily prepared in the laboratory. After the sample is treated with a small amount (only a few milligrams) of the magnetic material, the solid phase is separated by means of a magnetic field, and then introduced into an electrothermal atomizer (ETAAS) for arsenic determination. The preconcentration factor is close to 200 with a detection limit below 0.1 µg L-1 arsenic. Speciation of As(III) and As(V) can be achieved by means of two extractions carried out at different acidity. The results for total arsenic are verified using certified reference materials. The authors are grateful to the Comunidad Autonóma de la

  1. Understanding Business Analytics

    DTIC Science & Technology

    2015-01-05

    Analytics have been used in organizations for a variety of reasons for quite some time, ranging from the simple (generating and understanding business analytics ... process). How well these two components are orchestrated will determine the level of success an organization has in

  2. ASVCP quality assurance guidelines: control of preanalytical and analytical factors for hematology for mammalian and nonmammalian species, hemostasis, and crossmatching in veterinary laboratories.

    PubMed

    Vap, Linda M; Harr, Kendal E; Arnold, Jill E; Freeman, Kathleen P; Getzy, Karen; Lester, Sally; Friedrichs, Kristen R

    2012-03-01

    In December 2009, the American Society for Veterinary Clinical Pathology (ASVCP) Quality Assurance and Laboratory Standards committee published the updated and peer-reviewed ASVCP Quality Assurance Guidelines on the Society's website. These guidelines are intended for use by veterinary diagnostic laboratories and veterinary research laboratories that are not covered by the US Food and Drug Administration Good Laboratory Practice standards (Code of Federal Regulations Title 21, Chapter 58). The guidelines have been divided into 3 reports: (1) general analytical factors for veterinary laboratory performance and comparisons; (2) hematology, hemostasis, and crossmatching; and (3) clinical chemistry, cytology, and urinalysis. This particular report is one of 3 reports and provides recommendations for control of preanalytical and analytical factors related to hematology for mammalian and nonmammalian species, hemostasis testing, and crossmatching and is adapted from sections 1.1 and 2.3 (mammalian hematology), 1.2 and 2.4 (nonmammalian hematology), 1.5 and 2.7 (hemostasis testing), and 1.6 and 2.8 (crossmatching) of the complete guidelines. These guidelines are not intended to be all-inclusive; rather, they provide minimal guidelines for quality assurance and quality control for veterinary laboratory testing and a basis for laboratories to assess their current practices, determine areas for improvement, and guide continuing professional development and education efforts. © 2012 American Society for Veterinary Clinical Pathology.

  3. Analytical and experimental study of vibrations in a gear transmission

    NASA Technical Reports Server (NTRS)

    Choy, F. K.; Ruan, Y. F.; Zakrajsek, J. J.; Oswald, Fred B.; Coy, J. J.

    1991-01-01

    An analytical simulation of the dynamics of a gear transmission system is presented and compared to experimental results from a gear noise test rig at the NASA Lewis Research Center. The analytical procedure developed couples the dynamic behaviors of the rotor-bearing-gear system with the response of the gearbox structure. The modal synthesis method is used in solving the overall dynamics of the system. Locally, each rotor-gear stage is modeled as an individual rotor-bearing system using the matrix transfer technique. The dynamics of each individual rotor are coupled with other rotor stages through the nonlinear gear mesh forces and with the gearbox structure through bearing support systems. The modal characteristics of the gearbox structure are evaluated using the finite element procedure. A variable time-stepping integration routine is used to calculate the overall time transient behavior of the system in modal coordinates. The global dynamic behavior of the system is expressed in a generalized coordinate system. Transient and steady state vibrations of the gearbox system are presented in the time and frequency domains. The vibration characteristics of a simple single-mesh gear noise test rig are modeled. The numerical simulations are compared to experimental data measured under typical operating conditions. System natural frequencies, peak vibration amplitudes, and gear mesh frequencies are generally in good agreement.
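    To make the time-integration step concrete, the toy model below integrates a single-mesh torsional gear pair with a time-varying mesh stiffness using an adaptive (variable-step) solver. It is a stand-in sketch with made-up parameter values, not the coupled rotor-bearing-gearbox model of the study.

```python
import numpy as np
from scipy.integrate import solve_ivp

def gear_mesh_response(J_eq=1e-3, c=5.0, k_mean=2e8, k_var=0.3, f_mesh=500.0,
                       torque=50.0, r_base=0.05, t_end=0.1):
    """Toy 1-DOF torsional gear-mesh model integrated with an adaptive-step solver.

    The dynamic transmission error x along the line of action obeys
        m_eq * x'' + c * x' + k(t) * x = torque / r_base,
    with m_eq = J_eq / r_base^2 and a parametric mesh stiffness
        k(t) = k_mean * (1 + k_var * cos(2*pi*f_mesh*t)).
    All parameter values are hypothetical.
    """
    def rhs(t, y):
        x, v = y
        k = k_mean * (1.0 + k_var * np.cos(2.0 * np.pi * f_mesh * t))
        a = (torque / r_base - c * v - k * x) / (J_eq / r_base**2)
        return [v, a]

    return solve_ivp(rhs, (0.0, t_end), [0.0, 0.0],
                     method="RK45", rtol=1e-8, atol=1e-12)
```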

  4. A Finite Element Procedure for Calculating Fluid-Structure Interaction Using MSC/NASTRAN

    NASA Technical Reports Server (NTRS)

    Chargin, Mladen; Gartmeier, Otto

    1990-01-01

    This report is intended to serve two purposes. The first is to present a survey of the theoretical background of the dynamic interaction between an inviscid, compressible fluid and an elastic structure. Section one presents a short survey of the application of the finite element method (FEM) to the area of fluid-structure-interaction (FSI). Section two describes the mathematical foundation of the structure and fluid with special emphasis on the fluid. The main steps in establishing the finite element (FE) equations for the fluid structure coupling are discussed in section three. The second purpose is to demonstrate the application of MSC/NASTRAN to the solution of FSI problems. Some specific topics, such as fluid structure analogy, acoustic absorption, and acoustic contribution analysis, are described in section four. Section five deals with the organization of the acoustic procedure flowchart. Section six includes the most important information that a user needs for applying the acoustic procedure to practical FSI problems. Beginning with some rules concerning the FE modeling of the coupled system, the NASTRAN USER DECKs for the different steps are described. The goal of section seven is to demonstrate the use of the acoustic procedure with some examples. This demonstration includes an analytic verification of selected FE results. The analytical description considers only some aspects of FSI and is not intended to be mathematically complete. Finally, section eight presents an application of the acoustic procedure to vehicle interior acoustic analysis with selected results.

  5. Experimental design and multiple response optimization. Using the desirability function in analytical methods development.

    PubMed

    Candioti, Luciana Vera; De Zan, María M; Cámara, María S; Goicoechea, Héctor C

    2014-06-01

    A review about the application of response surface methodology (RSM) when several responses have to be simultaneously optimized in the field of analytical methods development is presented. Several critical issues like response transformation, multiple response optimization and modeling with least squares and artificial neural networks are discussed. Most recent analytical applications are presented in the context of analytical methods development, especially in multiple response optimization procedures using the desirability function. Copyright © 2014 Elsevier B.V. All rights reserved.
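    As a minimal illustration of the desirability approach discussed here, the sketch below implements a Derringer-Suich-style one-sided transform and the geometric-mean combination; the thresholds, weights, and example response values are hypothetical.

```python
import numpy as np

def desirability_larger_is_better(y, low, target, weight=1.0):
    """Derringer-Suich one-sided desirability: 0 at/below `low`, 1 at/above `target`."""
    d = np.clip((np.asarray(y, float) - low) / (target - low), 0.0, 1.0)
    return d ** weight

def overall_desirability(desirabilities):
    """Geometric mean of the individual desirabilities (any zero drives D to zero)."""
    d = np.asarray(desirabilities, dtype=float)
    return float(np.prod(d) ** (1.0 / d.size))

# e.g. individual desirabilities for resolution, sensitivity, and run time
print(round(overall_desirability([0.9, 0.7, 0.8]), 2))  # ~0.80
```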

  6. Analytical validation of soluble fms-like tyrosine and placental growth factor assays on B·R·A·H·M·S KRYPTOR Compact Plus automated immunoassay platform.

    PubMed

    Chan, Siaw Li; Rana, Sarosh; Chinthala, Sireesha; Salahuddin, Saira; Yeo, Kiang-Teck J

    2018-01-01

    Preeclampsia is one of the leading hypertensive disorders of pregnancy. Angiogenic biomarkers such as the anti-angiogenic factor soluble fms-like tyrosine kinase 1 (sFlt1) and the pro-angiogenic factor placental growth factor (PlGF) are involved in the pathophysiology of preeclampsia. The aim of this study is to validate the analytical performance of sFlt1 and PlGF on the B·R·A·H·M·S KRYPTOR Compact Plus (ThermoFisher Scientific). We examined K2-EDTA plasma samples from 50 patients on the B·R·A·H·M·S KRYPTOR Compact Plus, an automated immunoassay platform. QC materials were used to assess intra- and inter-assay precision. Lower limit of quantitation and interference studies were determined using pooled patient plasma. The sFlt1 and PlGF assays demonstrated analytical measuring ranges of 90-69,000 pg/mL and 11-7000 pg/mL, respectively (r2 > 0.99). The lower limit of quantitation (20% CV) was interpolated to be 35 pg/mL for sFlt1 and 10 pg/mL for PlGF. Total precision for both assays displayed CVs of <10%. Interference studies showed that both assays were not significantly affected by hemolysis up to an H-index of 1100 for sFlt1 and 300 for PlGF, or by L- and I-indices of up to 800 and 80, respectively, for both assays. The Passing-Bablok regression analysis for sFlt1/PlGF yielded an equation of y = 1.05x + 0.02, and the Bland-Altman analysis showed an average bias of 0.84. Plasma levels of sFlt1 and PlGF measured on the B·R·A·H·M·S KRYPTOR Compact Plus platform demonstrate excellent analytical performance and are acceptable as clinical grade assays. Copyright © 2018 International Society for the Study of Hypertension in Pregnancy. Published by Elsevier B.V. All rights reserved.
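    A generic sketch of the Bland-Altman agreement statistics quoted above follows (plain differences and normal-theory 95% limits; any transformation actually used in the study is not reproduced here).

```python
import numpy as np

def bland_altman(x, y):
    """Bland-Altman average bias and 95% limits of agreement between two methods.

    Generic method-comparison sketch: differences are taken as y - x, the bias
    is their mean, and the limits are bias +/- 1.96 * SD of the differences.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    diff = y - x
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```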

  7. Analytical one-dimensional model for laser-induced ultrasound in planar optically absorbing layer.

    PubMed

    Svanström, Erika; Linder, Tomas; Löfqvist, Torbjörn

    2014-03-01

    Ultrasound generated by means of laser-based photoacoustic principles is in common use today, with applications in biomedical diagnostics, non-destructive testing, and materials characterisation. For certain measurement applications it could be beneficial to shape the spectral properties and temporal profile of the generated ultrasound. To address this, we studied the generation and propagation of laser-induced ultrasound in a planar, layered structure. We derived an analytical expression for the induced pressure wave, including different physical and optical properties of each layer. A Laplace transform approach was employed in analytically solving the resulting set of photoacoustic wave equations. The results agree with simulations and were compared to experimental results. To enable the comparison between recorded voltage from the experiments and the calculated pressure we employed a system identification procedure based on physical properties of the ultrasonic transducer to convert the calculated acoustic pressure to voltages. We found reasonable agreement between experimentally obtained voltages and the voltages determined from the calculated acoustic pressure, for the samples studied. The system identification procedure was found to be unstable, however, possibly from violations of material isotropy assumptions by film adhesives and coatings in the experiment. The presented analytical model can serve as a basis when addressing the inverse problem of shaping an acoustic pulse from absorption of a laser pulse in a planar layered structure of elastic materials. Copyright © 2013 Elsevier B.V. All rights reserved.
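    For orientation, the simplest single-layer version of such a model can be written down directly. The sketch below gives only the Beer-Lambert initial pressure and its loss-free 1D d'Alembert propagation, assuming stress and thermal confinement and no acoustic interfaces; it is far simpler than the layered, transducer-coupled model of the article.

```python
import numpy as np

def initial_pressure(z, mu_a, fluence, grueneisen):
    """Beer-Lambert initial photoacoustic pressure in an absorber occupying z >= 0:
    p0(z) = Gamma * mu_a * F * exp(-mu_a * z), and zero outside the absorber."""
    z = np.asarray(z, dtype=float)
    p = grueneisen * mu_a * fluence * np.exp(-mu_a * np.clip(z, 0.0, None))
    return np.where(z >= 0.0, p, 0.0)

def free_field_pressure(z, t, c, mu_a, fluence, grueneisen):
    """Loss-free 1D propagation with zero initial particle velocity:
    p(z, t) = 0.5 * [p0(z - c*t) + p0(z + c*t)] (no interfaces, no attenuation)."""
    return 0.5 * (initial_pressure(z - c * t, mu_a, fluence, grueneisen)
                  + initial_pressure(z + c * t, mu_a, fluence, grueneisen))
```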

  8. Building analytical three-field cosmological models

    NASA Astrophysics Data System (ADS)

    Santos, J. R. L.; Moraes, P. H. R. S.; Ferreira, D. A.; Neta, D. C. Vilar

    2018-02-01

    A difficult task to deal with is the analytical treatment of models composed of three real scalar fields, as their equations of motion are in general coupled and hard to integrate. In order to overcome this problem we introduce a methodology to construct three-field models based on the so-called "extension method". The fundamental idea of the procedure is to combine three one-field systems in a non-trivial way, to construct an effective three scalar field model. An interesting scenario where the method can be implemented is with inflationary models, where the Einstein-Hilbert Lagrangian is coupled with the scalar field Lagrangian. We exemplify how a new model constructed from our method can lead to non-trivial behaviors for cosmological parameters.

  9. Evaluation of the availability of bound analyte for passive sampling in the presence of mobile binding matrix.

    PubMed

    Xu, Jianqiao; Huang, Shuyao; Jiang, Ruifen; Cui, Shufen; Luan, Tiangang; Chen, Guosheng; Qiu, Junlang; Cao, Chenyang; Zhu, Fang; Ouyang, Gangfeng

    2016-04-21

    Elucidating the availability of bound analytes for mass transfer through the diffusion boundary layers (DBLs) adjacent to passive samplers is important for understanding passive sampling kinetics in complex samples, in which the lability factor of the bound analyte in the DBL is an important parameter. In this study, the mathematical expression for the lability factor was derived by assuming a pseudo-steady state during passive sampling, and the equation indicated that the lability factor is equal to the ratio of the normalized concentration gradients of the bound and free analytes. Through the introduction of this expression, the modified effective average diffusion coefficient was shown to be more suitable for describing passive sampling kinetics in the presence of mobile binding matrixes. Thereafter, the lability factors of bound polycyclic aromatic hydrocarbons (PAHs), with sodium dodecylsulphate (SDS) micelles as the binding matrixes, were determined according to the improved theory. The lability factors were observed to decrease with larger binding ratios and smaller micelle sizes, and were successfully used to predict the mass transfer efficiencies of PAHs through DBLs. This study advances the understanding of the availability of bound analytes for passive sampling based on the theoretical improvements and experimental assessments. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Analyticity without Differentiability

    ERIC Educational Resources Information Center

    Kirillova, Evgenia; Spindler, Karlheinz

    2008-01-01

    In this article we derive all salient properties of analytic functions, including the analytic version of the inverse function theorem, using only the most elementary convergence properties of series. Not even the notion of differentiability is required to do so. Instead, analytical arguments are replaced by combinatorial arguments exhibiting…

  11. Exploring phlebotomy technique as a pre-analytical factor in proteomic analyses by mass spectrometry.

    PubMed

    Penn, Andrew M; Lu, Linghong; Chambers, Andrew G; Balshaw, Robert F; Morrison, Jaclyn L; Votova, Kristine; Wood, Eileen; Smith, Derek S; Lesperance, Maria; del Zoppo, Gregory J; Borchers, Christoph H

    2015-12-01

    Multiple reaction monitoring mass spectrometry (MRM-MS) is an emerging technology for blood biomarker verification and validation; however, the results may be influenced by pre-analytical factors. This exploratory study was designed to determine if differences in phlebotomy techniques would significantly affect the abundance of plasma proteins in an upcoming biomarker development study. Blood was drawn from 10 healthy participants using four techniques: (1) a 20-gauge IV with vacutainer, (2) a 21-gauge direct vacutainer, (3) an 18-gauge butterfly with vacutainer, and (4) an 18-gauge butterfly with syringe draw. The abundances of a panel of 122 proteins (117 proteins, plus 5 matrix metalloproteinase (MMP) proteins) were targeted by LC/MRM-MS. In addition, complete blood count (CBC) data were also compared across the four techniques. Phlebotomy technique significantly affected 2 of the 11 CBC parameters (red blood cell count, p = 0.010; hemoglobin concentration, p = 0.035) and only 12 of the targeted 117 proteins (p < 0.05). Of the five MMP proteins, only MMP7 was detectable and its concentration was not significantly affected by different techniques. Overall, most proteins in this exploratory study were not significantly influenced by phlebotomy technique; however, a larger study with additional patients will be required for confirmation.

  12. Assessment of Matrix Multiplication Learning with a Rule-Based Analytical Model--"A Bayesian Network Representation"

    ERIC Educational Resources Information Center

    Zhang, Zhidong

    2016-01-01

    This study explored an alternative assessment procedure to examine learning trajectories of matrix multiplication. It took rule-based analytical and cognitive task analysis methods specifically to break down operation rules for a given matrix multiplication. Based on the analysis results, a hierarchical Bayesian network, an assessment model,…

  13. Culturally Sensitive Interventions and Substance Use: A Meta-Analytic Review of Outcomes among Minority Youths

    ERIC Educational Resources Information Center

    Hodge, David R.; Jackson, Kelly F.; Vaughn, Michael G.

    2012-01-01

    This study assessed the effectiveness of culturally sensitive interventions (CSIs) ("N" = 10) designed to address substance use among minority youths. Study methods consisted of systematic search procedures, quality of study ratings, and meta-analytic techniques to gauge effects and evaluate publication bias. The results, across all measures and…

  14. Usefulness of Analytical Research: Rethinking Analytical R&D&T Strategies.

    PubMed

    Valcárcel, Miguel

    2017-11-07

    This Perspective is intended to help foster true innovation in Research & Development & Transfer (R&D&T) in Analytical Chemistry in the form of advances that are primarily useful for analytical purposes rather than solely for publishing. Devising effective means to strengthen the crucial contribution of Analytical Chemistry to progress in Chemistry, Science & Technology, and Society requires carefully examining the present status of our discipline and also identifying internal and external driving forces with a potential adverse impact on its development. The diagnostic process should be followed by administration of an effective therapy and supported by adoption of a theragnostic strategy if Analytical Chemistry is to enjoy a better future.

  15. Seamless Digital Environment – Data Analytics Use Case Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oxstrand, Johanna

    Multiple research efforts in the U.S. Department of Energy Light Water Reactor Sustainability (LWRS) Program study the need for, and design of, an underlying architecture to support the increased amount and use of data in nuclear power plants. More specifically, the LWRS research efforts on Digital Architecture for an Automated Plant, Automated Work Packages, Computer-Based Procedures for Field Workers, and Online Monitoring have all identified the need for a digital architecture and, more importantly, for a Seamless Digital Environment (SDE). An SDE provides a means to access multiple applications, gather the data points needed, conduct the requested analysis, and present the result with minimal or no effort by the user. During the 2016 annual Nuclear Information Technology Strategic Leadership (NITSL) group meeting, the nuclear utilities identified the need for research focused on data analytics: to develop and evaluate use cases for data mining and analytics that employ information from plant sensors and databases to improve business analytics. The goal of the study is to research potential approaches to building an analytics solution for equipment reliability on a small scale, focusing on either a single piece of equipment or a single system. The analytics solution will likely consist of a data integration layer, a predictive and machine learning layer, and a user interface layer that displays the output of the analysis in a straightforward, easy-to-consume manner. This report describes the use case study initiated by NITSL and conducted in collaboration between Idaho National Laboratory, Arizona Public Service – Palo Verde Nuclear Generating Station, and NextAxiom Inc.

  16. Rates and risk factors of unplanned 30-day readmission following general and thoracic pediatric surgical procedures.

    PubMed

    Polites, Stephanie F; Potter, Donald D; Glasgow, Amy E; Klinkner, Denise B; Moir, Christopher R; Ishitani, Michael B; Habermann, Elizabeth B

    2017-08-01

    Postoperative unplanned readmissions are costly and decrease patient satisfaction; however, little is known about this complication in pediatric surgery. The purpose of this study was to determine rates and predictors of unplanned readmission in a multi-institutional cohort of pediatric surgical patients. Unplanned 30-day readmissions following general and thoracic surgical procedures in children <18 years of age were identified from the 2012-2014 National Surgical Quality Improvement Program-Pediatric. Time-dependent rates of readmission per 30 person-days were determined to account for varied postoperative length of stay (pLOS). Patients were randomly divided into 70% derivation and 30% validation cohorts, which were used for creation and validation of a risk model for readmission. Readmission occurred in 1948 (3.6%) of 54,870 children, for a rate of 4.3% per 30 person-days. Adjusted predictors of readmission included hepatobiliary procedures, increased wound class, operative duration, complications, and pLOS. The predictive model discriminated well in the derivation and validation cohorts (AUROC 0.710 and 0.701) with good calibration between observed and expected readmission events in both cohorts (p>.05). Unplanned readmission occurs less frequently in pediatric surgery than what is described in adults, calling into question its use as a quality indicator in this population. Factors that predict readmission, including type of procedure, complications, and pLOS, can be used to identify at-risk children and develop prevention strategies. Level of evidence: III. Copyright © 2017 Elsevier Inc. All rights reserved.
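
    A minimal sketch of the derivation/validation workflow described above (70/30 split, logistic risk model, AUROC), written with scikit-learn on invented data; the predictor columns and coefficients are assumptions, not the study's model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical cohort: operative duration (min), postoperative length of
# stay (days), wound class (1-4), any complication (0/1) -> readmission.
n = 5000
X = np.column_stack([
    rng.normal(90, 30, n),          # operative duration
    rng.poisson(3, n),              # postoperative length of stay
    rng.integers(1, 5, n),          # wound class
    rng.binomial(1, 0.15, n),       # complication flag
])
logit = -4.0 + 0.004 * X[:, 0] + 0.15 * X[:, 1] + 0.3 * X[:, 2] + 1.0 * X[:, 3]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# 70% derivation / 30% validation split, as in the abstract.
X_dev, X_val, y_dev, y_val = train_test_split(X, y, test_size=0.3, random_state=1)

model = LogisticRegression(max_iter=1000).fit(X_dev, y_dev)
print("derivation AUROC:", roc_auc_score(y_dev, model.predict_proba(X_dev)[:, 1]))
print("validation AUROC:", roc_auc_score(y_val, model.predict_proba(X_val)[:, 1]))
```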

  17. Enhanced analytical sensitivity of a quantitative PCR for CMV using a modified nucleic-acid extraction procedure.

    PubMed

    Ferreira-Gonzalez, A; Yanovich, S; Langley, M R; Weymouth, L A; Wilkinson, D S; Garrett, C T

    2000-01-01

    Accurate and rapid diagnosis of CMV disease in immunocompromised individuals remains a challenge. Quantitative polymerase chain reaction (QPCR) methods for detection of CMV in peripheral blood mononuclear cells (PBMC) have improved the positive and negative predictive value of PCR for diagnosis of CMV disease. However, detection of CMV in plasma has demonstrated a lower negative predictive value for plasma as compared with PBMC. To enhance the sensitivity of the QPCR assay for plasma specimens, plasma samples were centrifuged before nucleic-acid extraction and the extracted DNA was resolubilized in a reduced volume. Optimization of the nucleic-acid extraction focused on decreasing or eliminating the presence of inhibitors in the pelleted plasma. Quantitation was achieved by co-amplifying an internal quantitative standard (IS) with the same primer sequences as CMV. PCR products were detected by hybridization in a 96-well microtiter plate coated with a CMV- or IS-specific probe. The precision of the QPCR assay for samples prepared from untreated and from pelleted plasma was then assessed. The coefficients of variation for both types of samples were almost identical, and their magnitude was reduced by a factor of ten if the data were log transformed. Linearity of the QPCR assay extended over a 3.3-log range for both types of samples, but the range of linearity for pelleted plasma was 20 to 40,000 viral copies/ml (vc/ml) in contrast to 300 to 400,000 vc/ml for plasma. Thus, centrifugation of plasma before nucleic-acid extraction and resuspension of extracted CMV DNA in a reduced volume enhanced the analytical sensitivity approximately tenfold over the dynamic range of the assay. Copyright 2000 Wiley-Liss, Inc.
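
    A small numeric illustration of the observation above that log-transforming replicate viral-load data reduces the coefficient of variation roughly tenfold; the replicate values below are invented.

```python
import numpy as np

# Hypothetical replicate CMV quantitations (viral copies/mL) for one sample.
replicates = np.array([12000.0, 18000.0, 9500.0, 15000.0, 21000.0, 11000.0])

def cv(values):
    """Coefficient of variation in percent."""
    return 100.0 * np.std(values, ddof=1) / np.mean(values)

print(f"CV of raw copies/mL:    {cv(replicates):.1f}%")
print(f"CV of log10(copies/mL): {cv(np.log10(replicates)):.1f}%")
```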

  18. 40 CFR 53.33 - Test Procedure for Methods for Lead (Pb).

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... analytical procedure than specified in 40 CFR Appendix G, may be tested by analyzing pairs of filter strips... Appendix Q, requires the use of two PM 10 reference samplers because a single 46.2-mm filter from a reference sampler may not be divided prior to analysis. It is possible to analyze a 46.2-mm filter first...

  19. 40 CFR 53.33 - Test Procedure for Methods for Lead (Pb).

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... analytical procedure than specified in 40 CFR Appendix G, may be tested by analyzing pairs of filter strips... Appendix Q, requires the use of two PM 10 reference samplers because a single 46.2-mm filter from a reference sampler may not be divided prior to analysis. It is possible to analyze a 46.2-mm filter first...

  20. "In situ" extraction of essential oils by use of Dean-Stark glassware and a Vigreux column inside a microwave oven: a procedure for teaching green analytical chemistry.

    PubMed

    Chemat, Farid; Perino-Issartier, Sandrine; Petitcolas, Emmanuel; Fernandez, Xavier

    2012-08-01

    One of the principal objectives of sustainable and green processing development remains the dissemination and teaching of green chemistry in colleges, high schools, and academic laboratories. This paper describes simple glassware that illustrates the phenomenon of extraction using a conventional microwave oven as the energy source, as a process for green analytical chemistry. Simple glassware comprising a Dean-Stark apparatus (for extraction of aromatic plant material and recovery of essential oils and distilled water) and a Vigreux column (as an air-cooled condenser inside the microwave oven) was designed as an in-situ extraction vessel inside a microwave oven. The efficiency of this experiment was validated for extraction of essential oils from 30 g of fresh orange peel, a by-product in the production of orange juice. Every laboratory throughout the world can use this equipment. The microwave power is 100 W and the irradiation time 15 min. The method is performed at atmospheric pressure without added solvent or water and furnishes essential oils similar to those obtained by conventional hydro- or steam distillation. By use of GC-MS, 22 compounds in orange peel were separated and identified; the main compounds were limonene (72.1%), β-pinene (8.4%), and γ-terpinene (6.9%). This procedure is appropriate for the teaching laboratory, does not require any special microwave equipment, and enables the students to learn the skills of extraction and of chromatographic and spectroscopic analysis. They are also exposed to a dramatic visual example of rapid, sustainable, and green extraction of an essential oil, and are introduced to successful sustainable and green analytical chemistry.

  1. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA) for Environmental Risk Management

    PubMed Central

    Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason

    2016-01-01

    With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM³ ontology to capture ERM objectives and to infer analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs. PMID:27983713

  2. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA) for Environmental Risk Management.

    PubMed

    Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason

    2016-12-15

    With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM³ ontology to capture ERM objectives and to infer analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs.

  3. Halogenated hydrocarbon pesticides and other volatile organic contaminants provide analytical challenges in global trading.

    PubMed

    Budnik, Lygia T; Fahrenholtz, Svea; Kloth, Stefan; Baur, Xaver

    2010-04-01

    Protection against infestation of a container cargo by alien species is achieved by mandatory fumigation with pesticides. Most of the effective fumigants are methyl and ethyl halide gases that are highly toxic and are a risk to both human health and the environment. There is a worldwide need for a reliable and robust analytical screening procedure for these volatile chemicals in a multitude of health and environmental scenarios. We have established a highly sensitive broad spectrum mass spectrometry method combined with thermal desorption gas chromatography to detect, identify and quantify volatile pesticide residues. Using this method, 1201 random ambient air samples taken from freight containers arriving at the biggest European ports of Hamburg and Rotterdam were analyzed over a period of two and a half years. This analytical procedure is a valuable strategy to measure air pollution from these hazardous chemicals, to help in the identification of pesticides in the new mixtures/formulations that are being adopted globally and to analyze expired breath samples after suspected intoxication in biomonitoring.

  4. Influence of a strong sample solvent on analyte dispersion in chromatographic columns.

    PubMed

    Mishra, Manoranjan; Rana, Chinar; De Wit, A; Martin, Michel

    2013-07-05

    In chromatographic columns, when the eluting strength of the sample solvent is larger than that of the carrier liquid, a deformation of the analyte zone occurs because its frontal part moves at a relatively high velocity due to a low retention factor in the sample solvent while the rear part of the analyte zone is more retained in the carrier liquid and hence moves at a lower velocity. The influence of this solvent strength effect on the separation of analytes is studied here theoretically using a mass balance model describing the spatio-temporal evolution of the eluent, the sample solvent and the analyte. The viscosity of the sample solvent and carrier fluid is supposed to be the same (i.e. no viscous fingering effects are taken into account). A linear isotherm adsorption with a retention factor depending upon the local concentration of the liquid phase is considered. The governing equations are numerically solved by using a Fourier spectral method and parametric studies are performed to analyze the effect of various governing parameters on the dispersion and skewness of the analyte zone. The distortion of this zone is found to depend strongly on the difference in eluting strength between the mobile phase and the sample solvent as well as on the sample volume. Copyright © 2013 Elsevier B.V. All rights reserved.

  5. Violent video game effects on aggression, empathy, and prosocial behavior in eastern and western countries: a meta-analytic review.

    PubMed

    Anderson, Craig A; Shibuya, Akiko; Ihori, Nobuko; Swing, Edward L; Bushman, Brad J; Sakamoto, Akira; Rothstein, Hannah R; Saleem, Muniba

    2010-03-01

    Meta-analytic procedures were used to test the effects of violent video games on aggressive behavior, aggressive cognition, aggressive affect, physiological arousal, empathy/desensitization, and prosocial behavior. Unique features of this meta-analytic review include (a) more restrictive methodological quality inclusion criteria than in past meta-analyses; (b) cross-cultural comparisons; (c) longitudinal studies for all outcomes except physiological arousal; (d) conservative statistical controls; (e) multiple moderator analyses; and (f) sensitivity analyses. Social-cognitive models and cultural differences between Japan and Western countries were used to generate theory-based predictions. Meta-analyses yielded significant effects for all 6 outcome variables. The pattern of results for different outcomes and research designs (experimental, cross-sectional, longitudinal) fit theoretical predictions well. The evidence strongly suggests that exposure to violent video games is a causal risk factor for increased aggressive behavior, aggressive cognition, and aggressive affect and for decreased empathy and prosocial behavior. Moderator analyses revealed significant research design effects, weak evidence of cultural differences in susceptibility and type of measurement effects, and no evidence of sex differences in susceptibility. Results of various sensitivity analyses revealed these effects to be robust, with little evidence of selection (publication) bias.
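
    For readers unfamiliar with the machinery behind such meta-analytic procedures, the sketch below shows the basic inverse-variance pooling step with a DerSimonian-Laird random-effects adjustment; the effect sizes and variances are invented, and the published analyses used far more elaborate moderator and sensitivity models.

```python
import numpy as np

# Hypothetical study-level effect sizes (e.g., correlations transformed to
# Fisher z) and their sampling variances.
effects = np.array([0.21, 0.15, 0.30, 0.10, 0.25])
variances = np.array([0.010, 0.020, 0.015, 0.008, 0.030])

# Fixed-effect (inverse-variance) pooling.
w = 1.0 / variances
fixed = np.sum(w * effects) / np.sum(w)

# DerSimonian-Laird estimate of the between-study variance tau^2.
k = len(effects)
Q = np.sum(w * (effects - fixed) ** 2)
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))

# Random-effects pooling with tau^2 added to each study's variance.
w_re = 1.0 / (variances + tau2)
random_eff = np.sum(w_re * effects) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))

print(f"fixed-effect estimate:   {fixed:.3f}")
print(f"tau^2:                   {tau2:.4f}")
print(f"random-effects estimate: {random_eff:.3f} (95% CI "
      f"{random_eff - 1.96 * se_re:.3f} to {random_eff + 1.96 * se_re:.3f})")
```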

  6. Standardization, evaluation and early-phase method validation of an analytical scheme for batch-consistency N-glycosylation analysis of recombinant produced glycoproteins.

    PubMed

    Zietze, Stefan; Müller, Rainer H; Brecht, René

    2008-03-01

    In order to set up a batch-to-batch-consistency analytical scheme for N-glycosylation analysis, several sample preparation steps, including enzyme digestions and fluorophore labelling, and two HPLC methods were established. The whole method scheme was standardized, evaluated and validated according to the requirements on analytical testing in early clinical drug development by use of a recombinantly produced reference glycoprotein (RGP). The standardization of the methods was performed by clearly defined standard operation procedures. During evaluation of the methods, the major interest was the determination of oligosaccharide losses within the analytical scheme. Validation of the methods was performed with respect to specificity, linearity, repeatability, LOD and LOQ. Because reference N-glycan standards were not available, a statistical approach was chosen to derive accuracy from the linearity data. After finishing the validation procedure, defined limits for method variability could be calculated, and differences observed in consistency analysis could be separated into significant and incidental ones.

  7. An analytical procedure for the determination of aluminum used in antiperspirants on human skin in Franz™ diffusion cell.

    PubMed

    Guillard, Olivier; Fauconneau, Bernard; Favreau, Frédéric; Marrauld, Annie; Pineau, Alain

    2012-04-01

    A local case report of hyperaluminemia (aluminum concentration: 3.88 µmol/L) in a woman using an aluminum-containing antiperspirant for 4 years raises the question of possible transdermal uptake of aluminum salt as a future public health problem. Prior to studying the transdermal uptake of three commercialized cosmetic formulas, an analytical assay of aluminum (Al) in chlorohydrate form (ACH) by Zeeman electrothermal atomic absorption spectrophotometry (ZEAAS) in a clean room was optimized and validated. The analysis was performed with different media on human skin using a Franz™ diffusion cell. The detection and quantification limits were set at ≤ 3 µg/L. Precision analyses, within-run (n = 12) and between-run (n = 15-68 days), yielded CVs ≤ 6%. The high analytical sensitivity (2-3 µg/L) and low variability should allow an in vitro study of the transdermal uptake of ACH.

  8. An analytical model for scanning electron microscope Type I magnetic contrast with energy filtering

    NASA Astrophysics Data System (ADS)

    Chim, W. K.

    1994-02-01

    In this article, a theoretical model for type I magnetic contrast calculations in the scanning electron microscope with energy filtering is presented. This model uses an approximate form of the secondary electron (SE) energy distribution by Chung and Everhart [M. S. Chung and T. E. Everhart, J. Appl. Phys. 45, 707 (1974)]. Closed-form analytical expressions for the contrast and quality factors, which take into consideration the work function and field-distance integral of the material being studied, are obtained. This analytical model is compared with that of a more accurate numerical model. Results showed that the contrast and quality factors for the analytical model differed by not more than 20% from the numerical model, with the actual difference depending on the range of filtered SE energies considered. This model has also been extended to the situation of a two-detector (i.e., detector A and B) configuration, in which enhanced magnetic contrast and quality factor can be obtained by operating in the "A-B" mode.
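
    The Chung-Everhart distribution referenced above, dN/dE proportional to E/(E + Φ)⁴, is simple enough to evaluate directly. The sketch below (illustrative only, not the article's full contrast model) estimates the fraction of secondary electrons passing an energy filter for an assumed work function.

```python
import numpy as np

def chung_everhart(E, phi):
    """Unnormalized Chung-Everhart secondary-electron energy distribution."""
    return E / (E + phi) ** 4

def filtered_fraction(e_low, e_high, phi, e_max=50.0, n=20001):
    """Approximate fraction of emitted SEs with energies in [e_low, e_high] eV."""
    E = np.linspace(0.0, e_max, n)
    dist = chung_everhart(E, phi)
    window = (E >= e_low) & (E <= e_high)
    # Simple rectangle-rule integration on the uniform energy grid.
    return dist[window].sum() / dist.sum()

phi = 4.5  # assumed work function in eV
print("fraction passing a 0-5 eV filter :", round(filtered_fraction(0.0, 5.0, phi), 3))
print("fraction passing a 2-10 eV filter:", round(filtered_fraction(2.0, 10.0, phi), 3))
```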

  9. Determining an Effective Intervention within a Brief Experimental Analysis for Reading: A Meta-Analytic Review

    ERIC Educational Resources Information Center

    Burns, Matthew K.; Wagner, Dana

    2008-01-01

    The current study applied meta-analytic procedures to brief experimental analysis research of reading fluency interventions to better inform practice and suggest areas for future research. Thirteen studies were examined to determine what magnitude of effect was needed to identify an intervention as the most effective within a brief experimental…

  10. Magnetic scavengers as carriers of analytes for flowing atmospheric pressure afterglow mass spectrometry (FAPA-MS).

    PubMed

    Cegłowski, Michał; Kurczewska, Joanna; Smoluch, Marek; Reszke, Edward; Silberring, Jerzy; Schroeder, Grzegorz

    2015-09-07

    In this paper, a procedure for the preconcentration and transport of mixtures of acids, bases, and drug components to a mass spectrometer using magnetic scavengers is presented. Flowing atmospheric pressure afterglow mass spectrometry (FAPA-MS) was used as an analytical method for identification of the compounds by thermal desorption from the scavengers. The proposed procedure is fast and cheap, and does not involve time-consuming purification steps. The developed methodology can be applied for trapping harmful substances in minute quantities, to transport them to specialized, remotely located laboratories.

  11. [Quality Management and Quality Specifications of Laboratory Tests in Clinical Studies--Challenges in Pre-Analytical Processes in Clinical Laboratories].

    PubMed

    Ishibashi, Midori

    2015-01-01

    Cost, speed, and quality are the three important factors recently indicated by the Ministry of Health, Labour and Welfare (MHLW) for the purpose of accelerating clinical studies. Against this background, the importance of laboratory tests is increasing, especially in the evaluation of clinical study participants' entry and safety, and of drug efficacy. To assure the quality of laboratory tests, providing high-quality laboratory testing is mandatory. For adequate quality assurance in laboratory tests, quality control in the three fields of pre-analytical, analytical, and post-analytical processes is extremely important. There are, however, no detailed written requirements concerning specimen collection, handling, preparation, storage, and shipping. Most laboratory tests for clinical studies are performed onsite in a local laboratory; however, a part of laboratory tests is done in offsite central laboratories after specimen shipping. Individual and inter-individual variations are well-known factors affecting laboratory tests. Beyond these, standardizing specimen collection, handling, preparation, storage, and shipping may improve and maintain the high quality of clinical studies in general. Furthermore, the analytical method, units, and reference intervals are also important factors. It is concluded that, to overcome the problems derived from pre-analytical processes, it is necessary to standardize specimen handling in a broad sense.

  12. Semi-Analytic Reconstruction of Flux in Finite Volume Formulations

    NASA Technical Reports Server (NTRS)

    Gnoffo, Peter A.

    2006-01-01

    Semi-analytic reconstruction uses the analytic solution to a second-order, steady, ordinary differential equation (ODE) to simultaneously evaluate the convective and diffusive flux at all interfaces of a finite volume formulation. The second-order ODE is itself a linearized approximation to the governing first- and second-order partial differential equation conservation laws. Thus, semi-analytic reconstruction defines a family of formulations for finite volume interface fluxes using analytic solutions to approximating equations. Limiters are not applied in a conventional sense; rather, diffusivity is adjusted in the vicinity of changes in sign of eigenvalues in order to achieve a sufficiently small cell Reynolds number in the analytic formulation across critical points. Several approaches for application of semi-analytic reconstruction for the solution of one-dimensional scalar equations are introduced. Results are compared with exact analytic solutions to Burgers' equation as well as a conventional, upwind discretization using Roe's method. One approach, the end-point wave speed (EPWS) approximation, is further developed for more complex applications. One-dimensional vector equations are tested on a quasi one-dimensional nozzle application. The EPWS algorithm has a more compact difference stencil than Roe's algorithm but reconstruction time is approximately a factor of four larger than for Roe. Though both are second-order accurate schemes, Roe's method approaches a grid converged solution with fewer grid points. Reconstruction of flux in the context of multi-dimensional, vector conservation laws including effects of thermochemical nonequilibrium in the Navier-Stokes equations is developed.
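
    A much simpler relative of this idea is the classical exponential scheme, which also takes the interface flux from the exact solution of a steady linear advection-diffusion ODE between two cell centers. The sketch below implements that textbook building block, not Gnoffo's EPWS formulation.

```python
import numpy as np

def exponential_flux(c_left, c_right, u, D, h):
    """Interface flux from the exact solution of  u dc/dx = D d2c/dx2
    on a cell of width h with c(0) = c_left and c(h) = c_right.
    The total flux F = u*c - D*dc/dx is constant across the cell."""
    pe = u * h / D                     # cell Peclet number
    if abs(pe) < 1e-8:                 # diffusion-dominated limit
        return u * 0.5 * (c_left + c_right) + D * (c_left - c_right) / h
    return u * (c_left * np.exp(pe) - c_right) / (np.exp(pe) - 1.0)

# The flux blends smoothly between central differencing (low Peclet)
# and first-order upwinding (high Peclet).
for pe_target in (0.01, 1.0, 100.0):
    u, D, h = 1.0, 1.0 / pe_target, 1.0
    print(f"Pe={pe_target:>6}: flux = {exponential_flux(1.0, 0.0, u, D, h):.4f}")
```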

  13. Analytical methods manual for the Mineral Resource Surveys Program, U.S. Geological Survey

    USGS Publications Warehouse

    Arbogast, Belinda F.

    1996-01-01

    The analytical methods validated by the Mineral Resource Surveys Program, Geologic Division, are the subject of this manual. This edition replaces the methods portion of Open-File Report 90-668, published in 1990. Newer methods may be used if they have been approved by the quality assurance (QA) project and are on file with the QA coordinator. This manual is intended primarily for use by laboratory scientists; it can also assist laboratory users in evaluating the data they receive. The analytical methods are written in a step-by-step approach so that they may be used as a training tool and provide detailed documentation of the procedures for quality assurance. A "Catalog of Services" is available for customer (submitter) use, with brief listings of the element(s)/species determined, method of determination, reference to cite, contact person, summary of the technique, and analyte concentration range. For a copy, please contact the Branch office at (303) 236-1800 or fax (303) 236-3200.

  14. Analytical performance of 17 general chemistry analytes across countries and across manufacturers in the INPUtS project of EQA organizers in Italy, the Netherlands, Portugal, United Kingdom and Spain.

    PubMed

    Weykamp, Cas; Secchiero, Sandra; Plebani, Mario; Thelen, Marc; Cobbaert, Christa; Thomas, Annette; Jassam, Nuthar; Barth, Julian H; Perich, Carmen; Ricós, Carmen; Faria, Ana Paula

    2017-02-01

    Optimum patient care in relation to laboratory medicine is achieved when results of laboratory tests are equivalent, irrespective of the analytical platform used or the country where the laboratory is located. Standardization and harmonization minimize differences, and the success of efforts to achieve this can be monitored with international category 1 external quality assessment (EQA) programs. An EQA project with commutable samples, targeted with reference measurement procedures (RMPs), was organized by EQA institutes in Italy, the Netherlands, Portugal, UK, and Spain. Results of 17 general chemistry analytes were evaluated across countries and across manufacturers according to performance specifications derived from biological variation (BV). For K, uric acid, glucose, cholesterol and high-density lipoprotein (HDL) cholesterol, the minimum performance specification was met in all countries and by all manufacturers. For Na, Cl, and Ca, the minimum performance specifications were met by none of the countries and manufacturers. For enzymes, the situation was complicated, as standardization of results of enzymes toward RMPs was still not achieved in 20% of the laboratories and questionable in the remaining 80%. The overall performance of the measurement of 17 general chemistry analytes in European medical laboratories met the minimum performance specifications. In this general picture, there were no significant differences per country and no significant differences per manufacturer. There were major differences between the analytes. There were six analytes for which the minimum quality specifications were not met, and manufacturers should improve their performance for these analytes. Standardization of results of enzymes requires ongoing efforts.
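
    For orientation, performance specifications "derived from biological variation" are conventionally computed from the within-subject (CVw) and between-subject (CVg) biological variation using Fraser-style multipliers; a sketch of that standard calculation follows (the project's exact limits and the sodium variation data shown are assumptions).

```python
def bv_specs(cv_within, cv_between, level="minimum"):
    """Allowable imprecision, bias and total error (all in %) from
    biological variation, using the conventional Fraser multipliers."""
    multipliers = {            # (imprecision factor, bias factor)
        "optimum":   (0.25, 0.125),
        "desirable": (0.50, 0.250),
        "minimum":   (0.75, 0.375),
    }
    k_cv, k_bias = multipliers[level]
    allowable_cv = k_cv * cv_within
    allowable_bias = k_bias * (cv_within ** 2 + cv_between ** 2) ** 0.5
    allowable_te = 1.65 * allowable_cv + allowable_bias
    return allowable_cv, allowable_bias, allowable_te

# Example with approximate biological-variation data for serum sodium,
# one of the analytes for which even minimum specifications are hard to meet.
cv_a, bias, tea = bv_specs(cv_within=0.6, cv_between=0.7, level="minimum")
print(f"Na: allowable CV {cv_a:.2f}%, bias {bias:.2f}%, TEa {tea:.2f}%")
```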

  15. The procedures manual of the Environmental Measurements Laboratory. Volume 2, 28. edition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chieco, N.A.

    1997-02-01

    This report contains environmental sampling and analytical chemistry procedures that are performed by the Environmental Measurements Laboratory. The purpose of environmental sampling and analysis is to obtain data that describe a particular site at a specific point in time from which an evaluation can be made as a basis for possible action.

  16. The procedures manual of the Environmental Measurements Laboratory. Volume 1, 28. edition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chieco, N.A.

    1997-02-01

    This manual covers procedures and technology currently in use at the Environmental Measurements Laboratory. An attempt is made to be sure that all work carried out will be of the highest quality. Attention is focused on the following areas: quality assurance; sampling; radiation measurements; analytical chemistry; radionuclide data; special facilities; and specifications.

  17. Development of analytical methodologies to assess recalcitrant pesticide bioremediation in biobeds at laboratory scale.

    PubMed

    Rivero, Anisleidy; Niell, Silvina; Cerdeiras, M Pía; Heinzen, Horacio; Cesio, María Verónica

    2016-06-01

    To assess recalcitrant pesticide bioremediation it is necessary to gradually increase the complexity of the biological system used in order to design an effective biobed assembly. Each step towards this effective biobed design needs a suitable, validated analytical methodology that allows a correct evaluation of the dissipation and bioconversion. Low-recovery-yielding methods could give a false idea of a successful biodegradation process. To address this situation, different methods were developed and validated for the simultaneous determination of endosulfan, its main three metabolites, and chlorpyrifos in increasingly complex matrices where the bioconvertor basidiomycete Abortiporus biennis could grow. The matrices were culture media, bran, and finally a laboratory biomix composed of bran, peat and soil. The methodology for the analysis of the first evaluated matrix has already been reported. The methodologies developed for the other two systems are presented in this work. The targeted analytes were extracted from fungi growing over bran in semisolid YNB (Yeast Nitrogen Base) medium with acetonitrile using shaker-assisted extraction. The salting-out step was performed with MgSO4 and NaCl, and the extracts were analyzed by GC-ECD. The best methodology was fully validated for all the evaluated analytes at 1 and 25 mg kg⁻¹, yielding recoveries between 72% and 109% and RSDs <11% in all cases. The application of this methodology proved that A. biennis is able to dissipate 94% of endosulfan and 87% of chlorpyrifos after 90 days. Having assessed that A. biennis growing over bran can metabolize the studied pesticides, the next step was the development and validation of an analytical procedure to evaluate the analytes in a laboratory-scale biobed composed of 50% bran, 25% peat and 25% soil together with fungal mycelium. Of the different procedures assayed, only ultrasound-assisted extraction with ethyl acetate allowed recoveries between 80% and 110% with RSDs

  18. A simple, analytical, axisymmetric microburst model for downdraft estimation

    NASA Technical Reports Server (NTRS)

    Vicroy, Dan D.

    1991-01-01

    A simple analytical microburst model was developed for use in estimating vertical winds from horizontal wind measurements. It is an axisymmetric, steady-state model that uses shaping functions to satisfy the mass continuity equation and simulate boundary layer effects. The model is defined through four model variables: the radius and altitude of the maximum horizontal wind, a shaping function variable, and a scale factor. The model closely agrees with a high-fidelity analytical model and measured data, particularly in the radial direction and at lower altitudes. At higher altitudes, the model tends to overestimate the wind magnitude relative to the measured data.
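
    To make the statement that the shaping functions "satisfy the mass continuity equation" concrete, one generic construction (not necessarily the specific functions chosen in the model) is to prescribe a vertical downdraft profile W(z) and a radial shaping function P(r), and let axisymmetric incompressible continuity determine the radial wind:

```latex
\frac{1}{r}\frac{\partial (r u)}{\partial r} + \frac{\partial w}{\partial z} = 0,
\qquad
w(r,z) = W(z)\,P(r)
\;\;\Longrightarrow\;\;
u(r,z) = -\frac{W'(z)}{r}\int_0^{r} r'\,P(r')\,\mathrm{d}r'.
```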

  19. Design Evaluation of Wind Turbine Spline Couplings Using an Analytical Model: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Y.; Keller, J.; Wallen, R.

    2015-02-01

    Articulated splines are commonly used in the planetary stage of wind turbine gearboxes for transmitting the driving torque and improving load sharing. Direct measurement of spline loads and performance is extremely challenging because of limited accessibility. This paper presents an analytical model for the analysis of articulated spline coupling designs. For a given torque and shaft misalignment, this analytical model quickly yields insights into relationships between the spline design parameters and resulting loads; bending, contact, and shear stresses; and safety factors considering various heat treatment methods. Comparisons of this analytical model against previously published computational approaches are also presented.

  20. Fluorescence in the system Eu(III) - oxytetracycline - co-ligand -sodium dodecylbenzene sulphonate micelles and its analytical application

    NASA Astrophysics Data System (ADS)

    Shtykov, Sergei N.; Smirnova, Tatyana D.; Kalashnikova, Natalja V.; Bylinkin, Yurii G.; Zhemerichkin, Dmitry A.

    2006-07-01

    Fluorescence enhancement of the Eu³⁺-oxytetracycline (OTC) chelate by addition of phenanthroline (Phen) and trioctylphosphine oxide (TOPO), as well as micelles of anionic, cationic and nonionic surfactants, has been studied. In the presence of Phen as co-ligand and micelles of the anionic surfactant dodecylbenzene sulfonate, the analytical signal increased by a factor of 8.5 and reached its maximum value at pH 8.0 ± 0.5. The dynamic concentration range of OTC determination was found to be 8.0 × 10⁻⁸ to 4.0 × 10⁻⁵ M (R² = 0.991) and the detection limit 5.3 × 10⁻⁸ M (3σ criterion). A procedure based on europium-sensitized fluorescence has been developed for the determination of OTC in chicken meat with a recovery of 98.0-103.3%.

  1. A paper-based analytical device for the determination of hydrogen sulfide in fuel oils based on headspace liquid-phase microextraction and cyclic voltammetry.

    PubMed

    Nechaeva, Daria; Shishov, Andrey; Ermakov, Sergey; Bulatov, Andrey

    2018-06-01

    An easily performed, miniaturized, cheap, selective and sensitive procedure for the determination of H₂S in fuel oil samples, based on headspace liquid-phase microextraction followed by cyclic voltammetric detection using a paper-based analytical device (PAD), was developed. A modified wax-dipping method was applied to fabricate the PAD. The PAD included hydrophobic zones for the sample and the supporting electrolyte, connected by a hydrophilic channel. The sample and supporting-electrolyte zones were connected with nickel working, platinum auxiliary and Ag/AgCl reference electrodes. The analytical procedure included separation of H₂S from the fuel oil sample by headspace liquid-phase microextraction in alkaline solution. The sulfide ion solution obtained and the supporting electrolyte were then dropped onto their zones, followed by analyte detection at +0.45 V. Under the optimized conditions, H₂S concentration in the range from 2 to 20 mg kg⁻¹ had a good linear relation with the peak current. The limit of detection (3σ) was 0.6 mg kg⁻¹. The procedure was successfully applied to the analysis of fuel oil samples. Copyright © 2018 Elsevier B.V. All rights reserved.
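
    A generic sketch of the calibration arithmetic implied above, i.e. a linear fit of peak current versus concentration and a 3σ detection limit; the calibration numbers are invented, and using the residual standard deviation divided by the slope as the σ estimate is an assumption, not necessarily the authors' exact procedure.

```python
import numpy as np

# Hypothetical calibration: H2S concentration (mg/kg) vs. peak current (uA).
conc = np.array([2.0, 5.0, 10.0, 15.0, 20.0])
current = np.array([0.41, 1.02, 2.05, 3.01, 4.08])

slope, intercept = np.polyfit(conc, current, 1)
residuals = current - (slope * conc + intercept)
s_y = np.sqrt(np.sum(residuals ** 2) / (len(conc) - 2))  # residual std. deviation

lod = 3.0 * s_y / slope   # 3-sigma detection limit from the regression
print(f"slope = {slope:.3f} uA per mg/kg, LOD ~ {lod:.2f} mg/kg")

# Quantify an unknown from its measured peak current.
unknown_current = 1.55
print(f"unknown: {(unknown_current - intercept) / slope:.1f} mg/kg H2S")
```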

  2. Test of a potential link between analytic and nonanalytic category learning and automatic, effortful processing.

    PubMed

    Tracy, J I; Pinsk, M; Helverson, J; Urban, G; Dietz, T; Smith, D J

    2001-08-01

    The link between automatic and effortful processing and nonanalytic and analytic category learning was evaluated in a sample of 29 college undergraduates using declarative memory, semantic category search, and pseudoword categorization tasks. Automatic and effortful processing measures were hypothesized to be associated with nonanalytic and analytic categorization, respectively. Results suggested that contrary to prediction strong criterion-attribute (analytic) responding on the pseudoword categorization task was associated with strong automatic, implicit memory encoding of frequency-of-occurrence information. Data are discussed in terms of the possibility that criterion-attribute category knowledge, once established, may be expressed with few attentional resources. The data indicate that attention resource requirements, even for the same stimuli and task, vary depending on the category rule system utilized. Also, the automaticity emerging from familiarity with analytic category exemplars is very different from the automaticity arising from extensive practice on a semantic category search task. The data do not support any simple mapping of analytic and nonanalytic forms of category learning onto the automatic and effortful processing dichotomy and challenge simple models of brain asymmetries for such procedures. Copyright 2001 Academic Press.

  3. Cleaning procedure for improved photothermal background of toroidal optical microresonators

    NASA Astrophysics Data System (ADS)

    Horak, Erik H.; Knapper, Kassandra A.; Heylman, Kevin D.; Goldsmith, Randall H.

    2016-09-01

    High Q-factors and small mode volumes have made toroidal optical microresonators exquisite sensors of small shifts in the effective refractive index of the whispering-gallery modes (WGMs). Eliminating contaminants and improving quality factors is key for many different sensing techniques, and is particularly important for photothermal imaging, as contaminants add photothermal background that obscures objects of interest. Several different cleaning procedures, including wet- and dry-chemical procedures, were tested for their effect on Q-factors and photothermal background. RCA cleaning was shown to be successful, in contrast to previously described acid cleaning procedures, most likely due to the different surface reactivity of the acid reagents used. UV-ozone cleaning was shown to be vastly superior to O2 plasma cleaning procedures, significantly reducing the photothermal background of the resonator.

  4. Identification and quantification of carbamate pesticides in dried lime tree flowers by means of excitation-emission molecular fluorescence and parallel factor analysis when quenching effect exists.

    PubMed

    Rubio, L; Ortiz, M C; Sarabia, L A

    2014-04-11

    A non-separative, fast and inexpensive spectrofluorimetric method based on the second order calibration of excitation-emission fluorescence matrices (EEMs) was proposed for the determination of carbaryl, carbendazim and 1-naphthol in dried lime tree flowers. The trilinearity property of three-way data was used to handle the intrinsic fluorescence of lime flowers and the difference in the fluorescence intensity of each analyte. It also made it possible to unequivocally identify each analyte. Trilinearity of the data tensor guarantees the uniqueness of the solution obtained through parallel factor analysis (PARAFAC), so the factors of the decomposition match up with the analytes. In addition, an experimental procedure was proposed to identify, with three-way data, the quenching effect produced by the fluorophores of the lime flowers. This procedure also enabled the selection of the adequate dilution of the lime flowers extract to minimize the quenching effect so the three analytes can be quantified. Finally, the analytes were determined using the standard addition method for a calibration whose standards were chosen with a D-optimal design. The three analytes were unequivocally identified by the correlation between the pure spectra and the PARAFAC excitation and emission spectral loadings. The trueness was established by the accuracy line "calculated concentration versus added concentration" in all cases. Better decision limit values (CCα), at x₀ = 0 with the probability of false positive fixed at 0.05, were obtained for the calibration performed in pure solvent: 2.97 μg L⁻¹ for 1-naphthol, 3.74 μg L⁻¹ for carbaryl and 23.25 μg L⁻¹ for carbendazim. The CCα values for the second calibration carried out in matrix were 1.61, 4.34 and 51.75 μg L⁻¹, respectively, while the values obtained considering only the pure samples as calibration set were 2.65, 8.61 and 28.7 μg L⁻¹, respectively. Copyright © 2014 Elsevier B.V. All rights reserved.
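
    As an illustration of the trilinear decomposition step only (not of the full standard-addition and D-optimal design workflow), a PARAFAC fit of a stack of EEMs can be run with, for example, the tensorly library; the array sizes, rank and synthetic data below are assumptions.

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

rng = np.random.default_rng(3)

# Hypothetical EEM cube: samples x excitation wavelengths x emission wavelengths.
n_samples, n_ex, n_em = 12, 30, 60

# Build a synthetic trilinear cube with 3 "fluorophores" plus noise.
scores = rng.uniform(0.1, 1.0, (n_samples, 3))
ex_profiles = np.abs(rng.normal(size=(n_ex, 3)))
em_profiles = np.abs(rng.normal(size=(n_em, 3)))
cube = np.einsum('ik,jk,lk->ijl', scores, ex_profiles, em_profiles)
cube += 0.01 * rng.normal(size=cube.shape)

# Three-factor PARAFAC decomposition; ideally each factor matches one analyte.
weights, factors = parafac(tl.tensor(cube), rank=3, n_iter_max=500, tol=1e-8)
sample_scores, ex_loadings, em_loadings = factors
print("score matrix shape:       ", sample_scores.shape)  # (12, 3)
print("excitation loadings shape:", ex_loadings.shape)     # (30, 3)
print("emission loadings shape:  ", em_loadings.shape)     # (60, 3)
```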

  5. Green procedure using limonene in the Dean-Stark apparatus for moisture determination in food products.

    PubMed

    Veillet, Sébastien; Tomao, Valérie; Ruiz, Karine; Chemat, Farid

    2010-07-26

    In the past 10 years, trends in analytical chemistry have turned toward green chemistry, which endeavours to develop new techniques that reduce the influence of chemicals on the environment. The challenge of green analytical chemistry is to develop techniques that meet the request for information output while reducing the environmental impact of the analyses. For this purpose petroleum-based solvents have to be avoided. Therefore, increasing interest has been given to new green solvents such as limonene and their potential as alternative solvents in analytical chemistry. In this work limonene was used instead of toluene in the Dean-Stark procedure. Moisture determination on a wide range of food matrices was performed using either toluene or limonene. Both solvents gave similar water percentages in food materials, i.e. 89.3 ± 0.5 and 89.5 ± 0.7 for carrot, 68.0 ± 0.7 and 68.6 ± 1.9 for garlic, and 64.1 ± 0.5 and 64.0 ± 0.3 for minced meat with toluene and limonene, respectively. Consequently limonene could be used as a good alternative solvent in the Dean-Stark procedure. Copyright 2010 Elsevier B.V. All rights reserved.

  6. Nanomaterials in consumer products: a challenging analytical problem.

    PubMed

    Contado, Catia

    2015-01-01

    Many products used in everyday life are made with the assistance of nanotechnologies. Cosmetics, pharmaceuticals, sunscreen, and powdered food are only a few examples of end products containing nano-sized particles (NPs), generally added to improve the product quality. To correctly evaluate benefits vs. risks of engineered nanomaterials and consequently to legislate in favor of consumer's protection, it is necessary to know the hazards connected with the exposure levels. This information implies transversal studies and a number of different competences. From an analytical point of view, the identification, quantification and characterization of NPs in food matrices and in cosmetic or personal care products pose significant challenges, because NPs are usually present at low concentration levels and the matrices in which they are dispersed are complex and often incompatible with the analytical instruments that would be required for their detection and characterization. This paper focuses on some analytical techniques suitable for the detection, characterization and quantification of NPs in food and cosmetics products and reports their recent application in characterizing specific metal and metal-oxide NPs in these two important industrial and market sectors. The need for a characterization of the NPs that is as complete as possible, matching complementary information about different metrics and ideally achieved through validated procedures, is what clearly emerges from this research. More work should be done to produce standardized materials and to set up methodologies to determine number-based size distributions and to obtain quantitative data about the NPs in such complex matrices.

  7. Nanomaterials in consumer products: a challenging analytical problem

    NASA Astrophysics Data System (ADS)

    Contado, Catia

    2015-08-01

    Many products used in everyday life are made with the assistance of nanotechnologies. Cosmetics, pharmaceuticals, sunscreen, and powdered food are only a few examples of end products containing nano-sized particles (NPs), generally added to improve the product quality. To correctly evaluate benefits versus risks of engineered nanomaterials and consequently to legislate in favor of consumer's protection, it is necessary to know the hazards connected with the exposure levels. This information implies transversal studies and a number of different competences. From an analytical point of view, the identification, quantification and characterization of NPs in food matrices and in cosmetic or personal care products pose significant challenges, because NPs are usually present at low concentration levels and the matrices in which they are dispersed are complex and often incompatible with the analytical instruments that would be required for their detection and characterization. This paper focuses on some analytical techniques suitable for the detection, characterization and quantification of NPs in food and cosmetics products and reports their recent application in characterizing specific metal and metal-oxide NPs in these two important industrial and market sectors. The need for a characterization of the NPs that is as complete as possible, matching complementary information about different metrics and ideally achieved through validated procedures, is what clearly emerges from this research. More work should be done to produce standardized materials and to set up methodologies to determine number-based size distributions and to obtain quantitative data about the NPs in such complex matrices.

  8. Nanomaterials in consumer products: a challenging analytical problem

    PubMed Central

    Contado, Catia

    2015-01-01

    Many products used in everyday life are made with the assistance of nanotechnologies. Cosmetics, pharmaceuticals, sunscreen, and powdered food are only a few examples of end products containing nano-sized particles (NPs), generally added to improve the product quality. To correctly evaluate benefits vs. risks of engineered nanomaterials and consequently to legislate in favor of consumer's protection, it is necessary to know the hazards connected with the exposure levels. This information implies transversal studies and a number of different competences. From an analytical point of view, the identification, quantification and characterization of NPs in food matrices and in cosmetic or personal care products pose significant challenges, because NPs are usually present at low concentration levels and the matrices in which they are dispersed are complex and often incompatible with the analytical instruments that would be required for their detection and characterization. This paper focuses on some analytical techniques suitable for the detection, characterization and quantification of NPs in food and cosmetics products and reports their recent application in characterizing specific metal and metal-oxide NPs in these two important industrial and market sectors. The need for a characterization of the NPs that is as complete as possible, matching complementary information about different metrics and ideally achieved through validated procedures, is what clearly emerges from this research. More work should be done to produce standardized materials and to set up methodologies to determine number-based size distributions and to obtain quantitative data about the NPs in such complex matrices. PMID:26301216

  9. Estimating effects of limiting factors with regression quantiles

    USGS Publications Warehouse

    Cade, B.S.; Terrell, J.W.; Schroeder, R.L.

    1999-01-01

    In a recent Concepts paper in Ecology, Thomson et al. emphasized that assumptions of conventional correlation and regression analyses fundamentally conflict with the ecological concept of limiting factors, and they called for new statistical procedures to address this problem. The analytical issue is that unmeasured factors may be the active limiting constraint and may induce a pattern of unequal variation in the biological response variable through an interaction with the measured factors. Consequently, changes near the maxima, rather than at the center of response distributions, are better estimates of the effects expected when the observed factor is the active limiting constraint. Regression quantiles provide estimates for linear models fit to any part of a response distribution, including near the upper bounds, and require minimal assumptions about the form of the error distribution. Regression quantiles extend the concept of one-sample quantiles to the linear model by solving an optimization problem of minimizing an asymmetric function of absolute errors. Rank-score tests for regression quantiles provide tests of hypotheses and confidence intervals for parameters in linear models with heteroscedastic errors, conditions likely to occur in models of limiting ecological relations. We used selected regression quantiles (e.g., 5th, 10th, ..., 95th) and confidence intervals to test hypotheses that parameters equal zero for estimated changes in average annual acorn biomass due to forest canopy cover of oak (Quercus spp.) and oak species diversity. Regression quantiles also were used to estimate changes in glacier lily (Erythronium grandiflorum) seedling numbers as a function of lily flower numbers, rockiness, and pocket gopher (Thomomys talpoides fossor) activity, data that motivated the query by Thomson et al. for new statistical procedures. Both example applications showed that effects of limiting factors estimated by changes in some upper regression quantile (e
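
    A brief sketch of fitting an upper regression quantile with statsmodels on invented limiting-factor data (the paper's applications used acorn biomass and lily seedling counts); it shows how the upper-quantile slope tracks the constraint while ordinary least squares tracks the center of the response distribution.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)

# Hypothetical limiting-factor data: the response is bounded above by the
# measured factor, but unmeasured constraints add one-sided scatter below.
x = rng.uniform(0, 10, 300)
ceiling = 5.0 + 2.0 * x
y = ceiling * rng.uniform(0.2, 1.0, 300)

X = sm.add_constant(x)
ols_fit = sm.OLS(y, X).fit()
q90_fit = sm.QuantReg(y, X).fit(q=0.90)

print("OLS slope (center of distribution):   %.2f" % ols_fit.params[1])
print("90th-quantile slope (near the bound): %.2f" % q90_fit.params[1])
```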

  10. Contemporary sample stacking in analytical electrophoresis.

    PubMed

    Malá, Zdena; Gebauer, Petr; Boček, Petr

    2011-01-01

    Sample stacking is of vital importance for analytical CE since it may bring the required sensitivity of analyses. A lot of new relevant papers are published every year and regular surveys seem to be very helpful for experts and practitioners. The contribution presented here is a continuation of a series of regularly published reviews on the topic and covers the last two years. It brings a survey of related literature organized, in accord with the main principle used in the procedure published, in the following mainstream sections: Kohlrausch adjustment of concentrations, pH step, micellar systems and combined techniques. Each part covers literature sorted according to the field of application as, e.g. clinical, pharmaceutical, food, environmental, etc. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Short communication: Principal components and factor analytic models for test-day milk yield in Brazilian Holstein cattle.

    PubMed

    Bignardi, A B; El Faro, L; Rosa, G J M; Cardoso, V L; Machado, P F; Albuquerque, L G

    2012-04-01

    A total of 46,089 individual monthly test-day (TD) milk yields (10 test-days), from 7,331 complete first lactations of Holstein cattle, were analyzed. A standard multivariate analysis (MV), reduced rank analyses fitting the first 2, 3, and 4 genetic principal components (PC2, PC3, PC4), and analyses that fitted a factor analytic structure considering 2, 3, and 4 factors (FAS2, FAS3, FAS4) were carried out. The models included the random animal genetic effect and fixed effects of the contemporary groups (herd-year-month of test-day), age of cow (linear and quadratic effects), and days in milk (linear effect). The residual covariance matrix was assumed to have full rank. Moreover, 2 random regression models were applied. Variance components were estimated by the restricted maximum likelihood method. The heritability estimates ranged from 0.11 to 0.24. The genetic correlation estimates between TD obtained with the PC2 model were higher than those obtained with the MV model, especially for adjacent test-days at the end of lactation, where they were close to unity. The results indicate that, for the data considered in this study, only 2 principal components are required to summarize the bulk of genetic variation among the 10 traits. Copyright © 2012 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
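
    To show the idea that "only 2 principal components summarize the bulk of genetic variation" in miniature, the sketch below eigen-decomposes a synthetic 10 x 10 covariance matrix and builds its rank-2 approximation; the matrix is invented, not the estimated genetic covariance matrix from the study.

```python
import numpy as np

# Synthetic 10x10 covariance matrix with a dominant lactation-curve-like
# pattern plus a weaker second source of variation.
t = np.linspace(0, 1, 10)
v1 = np.exp(-t)                 # persistent, early-lactation-weighted factor
v2 = t - 0.5                    # early-vs-late contrast
G = 1.0 * np.outer(v1, v1) + 0.2 * np.outer(v2, v2) + 0.01 * np.eye(10)

eigval, eigvec = np.linalg.eigh(G)          # eigenvalues in ascending order
eigval, eigvec = eigval[::-1], eigvec[:, ::-1]

explained = eigval / eigval.sum()
print("variance explained by PC1, PC2:", np.round(explained[:2], 3))

# Rank-2 (PC2-style) approximation of the covariance matrix.
G2 = eigvec[:, :2] @ np.diag(eigval[:2]) @ eigvec[:, :2].T
print("max abs error of rank-2 fit:", float(np.abs(G - G2).max()))
```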

  12. Uncertainty of relative sensitivity factors in glow discharge mass spectrometry

    NASA Astrophysics Data System (ADS)

    Meija, Juris; Methven, Brad; Sturgeon, Ralph E.

    2017-10-01

    The concept of the relative sensitivity factors required for the correction of the measured ion beam ratios in pin-cell glow discharge mass spectrometry is examined in detail. We propose a data-driven model for predicting the relative response factors, which relies on a non-linear least squares adjustment and analyte/matrix interchangeability phenomena. The model provides a self-consistent set of response factors for any analyte/matrix combination of any element that appears as either an analyte or matrix in at least one known response factor.
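
    One way to read the "self-consistent set of response factors": if each relative sensitivity factor behaves approximately as a ratio of element-specific sensitivities, RSF(analyte, matrix) ≈ s_analyte / s_matrix, then all sensitivities can be adjusted at once from the known factors and unmeasured combinations predicted. The sketch below implements that generic ratio model with SciPy; the element list, measured values and log-ratio parameterization are assumptions, not the authors' actual model.

```python
import numpy as np
from scipy.optimize import least_squares

elements = ["Fe", "Cu", "Ni", "Zn"]
idx = {el: i for i, el in enumerate(elements)}

# Hypothetical measured RSFs for some analyte/matrix pairs.
measured = {("Cu", "Fe"): 1.8, ("Ni", "Fe"): 1.2,
            ("Zn", "Cu"): 0.9, ("Ni", "Cu"): 0.7}

def residuals(log_s):
    # log RSF(a, m) = log s_a - log s_m ; fix Fe as the reference (log s = 0).
    full = np.concatenate(([0.0], log_s))
    return [full[idx[a]] - full[idx[m]] - np.log(r)
            for (a, m), r in measured.items()]

fit = least_squares(residuals, x0=np.zeros(len(elements) - 1))
log_s = np.concatenate(([0.0], fit.x))

def rsf(analyte, matrix):
    return float(np.exp(log_s[idx[analyte]] - log_s[idx[matrix]]))

# Predict a combination that was never measured directly.
print("predicted RSF(Zn, Fe):", round(rsf("Zn", "Fe"), 2))
```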

  13. Post-analytical stability of 23 common chemistry and immunochemistry analytes in incurred samples.

    PubMed

    Nielsen, Betina Klint; Frederiksen, Tina; Friis-Hansen, Lennart; Larsen, Pia Bükmann

    2017-12-01

    Storage of blood samples after centrifugation, decapping and initial sampling allows ordering of additional blood tests. The pre-analytical stability of biochemistry and immunochemistry analytes has been studied in detail, but little is known about the post-analytical stability in incurred samples. We examined the stability of 23 routine analytes on the Dimension Vista® (Siemens Healthineers, Denmark): 42-60 routine samples in lithium-heparin gel tubes (Vacutainer, BD, USA) were centrifuged at 3000 × g for 10 min. Immediately after centrifugation, the initial concentrations of the analytes were measured in duplicate (t = 0). The tubes were stored decapped at room temperature and re-analyzed after 2, 4, 6, 8 and 10 h in singletons. The concentrations from reanalysis were normalized to the initial concentration (t = 0). Internal acceptance criteria for bias and total error were used to determine the stability of each analyte. Additionally, evaporation from the decapped blood collection tubes and the residual platelet count in the plasma after centrifugation were quantified. We report a post-analytical stability of most routine analytes of ≥ 8 h and therefore, with few exceptions, suggest a standard 8-hour time limit for reordering and reanalysis of analytes in incurred samples. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
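
    A minimal sketch of the evaluation described above: re-measured concentrations are normalized to the t = 0 result and compared against an acceptance limit; the limit and example values here are invented, not the laboratory's internal criteria.

```python
import numpy as np

# Hypothetical repeated measurements of one analyte in an incurred sample.
hours = np.array([0, 2, 4, 6, 8, 10])
conc = np.array([4.20, 4.18, 4.23, 4.15, 4.05, 3.88])   # e.g., mmol/L

bias_limit_pct = 5.0                       # assumed acceptance criterion
deviation_pct = 100.0 * (conc - conc[0]) / conc[0]

for t, dev in zip(hours, deviation_pct):
    status = "ok" if abs(dev) <= bias_limit_pct else "unstable"
    print(f"t = {t:>2} h: deviation {dev:+5.1f}%  -> {status}")

# Stability claim: the longest time point at which the deviation is still ok.
stable_until = hours[np.abs(deviation_pct) <= bias_limit_pct].max()
print(f"post-analytical stability under this criterion: {stable_until} h")
```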

  14. A design procedure for a tension-wire stiffened truss-column

    NASA Technical Reports Server (NTRS)

    Greene, W. H.

    1980-01-01

A deployable, tension-wire-stiffened truss-column configuration was considered for space structure applications. An analytical procedure, developed for the design of the truss column and exercised in numerical studies, was based on using equivalent beam stiffness coefficients in the classical analysis of an initially imperfect beam column. Failure constraints were formulated for use in an automated design procedure, combining weight/strength and nonlinear mathematical programming, to determine the minimum-mass column for a particular combination of design load and length. Numerical studies gave the mass characteristics of the truss column for broad ranges of load and length. Comparisons of the truss column with a baseline tubular column used a structural efficiency parameter specific to this class of columns.
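
    A minimal sketch of the weight/strength, nonlinear-programming idea: find the minimum-mass thin-walled tubular column (the baseline configuration) that carries a design load P over length L without Euler or local shell buckling. The material properties, load case, and simple failure expressions are assumptions for illustration; the report's truss-column model and constraints are considerably richer.

      import numpy as np
      from scipy.optimize import minimize

      E, rho = 70e9, 2700.0      # aluminium-like modulus (Pa) and density (kg/m^3)
      P, L = 5e3, 10.0           # design load (N) and column length (m)

      def mass(x):
          r, t = x               # mean radius and wall thickness (m)
          return rho * 2.0 * np.pi * r * t * L

      def euler_margin(x):       # >= 0 when the Euler buckling load exceeds P
          r, t = x
          I = np.pi * r**3 * t   # thin-wall second moment of area
          return (np.pi**2 * E * I / L**2) / P - 1.0

      def local_margin(x):       # >= 0 when local shell buckling stress exceeds axial stress
          r, t = x
          sigma = P / (2.0 * np.pi * r * t)
          sigma_crit = 0.3 * E * t / r   # knocked-down cylinder buckling estimate
          return sigma_crit / sigma - 1.0

      cons = [{"type": "ineq", "fun": euler_margin},
              {"type": "ineq", "fun": local_margin}]
      res = minimize(mass, x0=[0.1, 0.002], bounds=[(0.01, 0.5), (5e-4, 0.02)],
                     constraints=cons, method="SLSQP")
      r_opt, t_opt = res.x
      print(f"minimum mass ~ {mass(res.x):.2f} kg at r = {r_opt:.3f} m, t = {t_opt * 1e3:.2f} mm")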

  15. Factors Associated with Anxiety About Colonoscopy: The Preparation, the Procedure, and the Anticipated Findings.

    PubMed

    Shafer, L A; Walker, J R; Waldman, C; Yang, C; Michaud, V; Bernstein, C N; Hathout, L; Park, J; Sisler, J; Restall, G; Wittmeier, K; Singh, H

    2018-03-01

Previous research has assessed anxiety around colonoscopy procedures but has not considered anxiety related to the different aspects of the colonoscopy process. Before colonoscopy, we assessed anxiety about three aspects: the bowel preparation, the procedure itself, and the anticipated results. We evaluated associations between patient characteristics and anxiety in each area. An anonymous survey was distributed to patients immediately prior to their outpatient colonoscopy in six hospitals and two ambulatory care centers in Winnipeg, Canada. Anxiety was assessed using a visual analog scale. For each aspect, logistic regression models were used to explore associations between patient characteristics and high anxiety. A total of 1316 respondents completed the questions about anxiety (52% female, median age 56 years). Anxiety scores > 70 (high anxiety) were reported by 18% about bowel preparation, 29% about the procedure, and 28% about the procedure results. High anxiety about bowel preparation was associated with female sex, perceived unclear instructions, unfinished laxative, and no previous colonoscopies. High anxiety about the procedure was associated with female sex, no previous colonoscopies, and confusing instructions. High anxiety about the results was associated with symptoms as the indication for colonoscopy and instructions perceived as confusing. Fewer people had high anxiety about the preparation than about the procedure and its findings. There are unique predictors of anxiety about each aspect of colonoscopy. Understanding these nuanced differences may help in designing strategies to reduce anxiety, leading to improved acceptance of the procedure, better compliance with preparation instructions, and less discomfort during the procedure.
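
    A minimal sketch of the kind of logistic-regression model used to relate patient characteristics to high anxiety (visual analog score > 70). The data are randomly generated for illustration, and the predictors only approximate the survey variables; the coefficients below are not those of the study.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(0)
      n = 500
      df = pd.DataFrame({
          "female": rng.integers(0, 2, n),
          "first_colonoscopy": rng.integers(0, 2, n),
          "unclear_instructions": rng.integers(0, 2, n),
      })
      # Hypothetical outcome: probability of high anxiety rises with each predictor.
      lin = -2.0 + 0.8 * df.female + 0.6 * df.first_colonoscopy + 0.9 * df.unclear_instructions
      df["high_anxiety_prep"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-lin)))

      model = smf.logit("high_anxiety_prep ~ female + first_colonoscopy + unclear_instructions",
                        data=df).fit(disp=False)
      print(np.exp(model.params))   # odds ratios for each predictor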

  16. Interpretation of analytical toxicology results in life and at postmortem.

    PubMed

    Flanagan, Robert J; Connally, Geraldine

    2005-01-01

Interpretation of analytical toxicology results from live patients is sometimes difficult. Relevant factors may relate to: (i) the nature of the poison(s) present; (ii) sample collection, transport, and storage; (iii) the analytical methodology used; (iv) the circumstances of exposure; (v) mechanical factors such as trauma or inhalation of stomach contents; and (vi) pharmacological factors such as tolerance or synergy. In some circumstances, detection of a drug or other poison may suffice to prove exposure. At the other extreme, the interpretation of individual measurements may be simplified by regulation; examples include whole blood alcohol (ethanol) in regard to driving a motor vehicle and blood lead assays performed to assess occupational exposure. With pharmaceuticals, the plasma or serum concentrations of drugs and metabolites attained during treatment often provide a basis for the interpretation of quantitative measurements. With illicit drugs, comparative information from casework may be all that is available. Postmortem toxicology is an especially complex area, since changes in the composition of fluids such as blood, which depend on the site of collection from the body and the time elapsed since death, amongst other factors, may influence the result obtained. This review presents information to assist in the interpretation of analytical results, especially in postmortem toxicology. Collection and analysis of not only peripheral blood but also other fluids and tissues is usually important in postmortem work. Alcohol, for example, can be either lost from, or produced in, blood, especially if there has been significant trauma; hence measurements in urine or vitreous humour are needed to confirm the reliability of a blood result. Measurement of metabolites may also be valuable in individual cases.

  17. Clustering in analytical chemistry.

    PubMed

    Drab, Klaudia; Daszykowski, Michal

    2014-01-01

Data clustering plays an important role in the exploratory analysis of analytical data, and the use of clustering methods has been acknowledged in different fields of science. In this paper, the principles of data clustering are presented with a direct focus on the clustering of analytical data. The role of the clustering process in the analytical workflow, and its potential impact on that workflow, are underlined.
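
    A minimal sketch of hierarchical clustering applied to analytical data, in the spirit of the exploratory workflow discussed above. The samples-by-variables matrix is randomly generated; with real analytical data the variables would typically be autoscaled before clustering, as done here.

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster

      rng = np.random.default_rng(1)
      # 30 hypothetical samples described by 8 analytical variables, in 3 loose groups.
      X = np.vstack([rng.normal(loc=mu, scale=1.0, size=(10, 8)) for mu in (0.0, 3.0, 6.0)])

      # Autoscale (column-wise standardization), then Ward linkage on Euclidean distances.
      Xs = (X - X.mean(axis=0)) / X.std(axis=0)
      Z = linkage(Xs, method="ward")
      labels = fcluster(Z, t=3, criterion="maxclust")
      print("cluster sizes:", np.bincount(labels)[1:])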

  18. Analytical Ferrography Standardization.

    DTIC Science & Technology

    1982-01-01

Analytical Ferrography Standardization. Final report, January 1982. P. B. Senholzi and A. S. Maciejewski, Applications Engineering, Mechanical Technology Incorporated, Research and Development Division, Latham, NY. MTI Technical Report No. 82TRS6; DTIC accession AD-A116 508.

  19. Fundamentals of Sports Analytics.

    PubMed

    Wasserman, Erin B; Herzog, Mackenzie M; Collins, Christy L; Morris, Sarah N; Marshall, Stephen W

    2018-07-01

Recently, the importance of statistics and analytics in sports has increased. This review describes measures of sports injury and the fundamentals of sports injury research, with a brief overview of some emerging measures of sports performance. We describe research study designs that can be used to identify risk factors for injury, as well as injury surveillance programs and common measures of injury risk and association. Finally, we describe measures of physical performance and training and considerations for using these measures. This review provides sports medicine clinicians with an understanding of current research measures and considerations for designing sports injury research studies. Copyright © 2018 Elsevier Inc. All rights reserved.
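
    A minimal sketch of two of the basic injury measures referred to above: an injury rate per 1000 athlete-exposures and a rate ratio comparing two settings. The counts are invented for illustration.

      # Hypothetical injury counts and athlete-exposures (AEs) in two settings.
      injuries_a, exposures_a = 24, 12_000     # e.g. competition
      injuries_b, exposures_b = 30, 45_000     # e.g. practice

      rate_a = 1000 * injuries_a / exposures_a
      rate_b = 1000 * injuries_b / exposures_b

      print(f"rate A: {rate_a:.2f} injuries per 1000 AEs")
      print(f"rate B: {rate_b:.2f} injuries per 1000 AEs")
      print(f"rate ratio (A vs B): {rate_a / rate_b:.2f}")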

  20. Human Factors in Field Experimentation Design and Analysis of Analytical Suppression Model

    DTIC Science & Technology

    1978-09-01

Master's thesis, September 1978. The study of men in man-machine systems supports the development of new doctrines, the design of weapon systems, and training programs for troops. Techniques are examined for including the suppressive effects of weapon systems in Lanchester-type combat models.
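
    A minimal sketch of a Lanchester square-law engagement in which attrition is degraded by suppression, the kind of modification examined in the thesis. The coefficients and the simple fractional suppression terms are assumptions for illustration only.

      import numpy as np

      def lanchester_with_suppression(x0, y0, a, b, s_x, s_y, dt=0.01, t_end=10.0):
          """Euler integration of dx/dt = -(1 - s_x) * b * y and dy/dt = -(1 - s_y) * a * x,
          where s_x (s_y) is the fraction of fire directed at X (Y) that is suppressed."""
          x, y = float(x0), float(y0)
          for _ in np.arange(0.0, t_end, dt):
              x_new = max(x - (1.0 - s_x) * b * y * dt, 0.0)
              y_new = max(y - (1.0 - s_y) * a * x * dt, 0.0)
              x, y = x_new, y_new
              if x == 0.0 or y == 0.0:
                  break
          return x, y

      # Baseline engagement vs. one in which 40 % of the fire directed at X is suppressed.
      print(lanchester_with_suppression(100, 120, a=0.05, b=0.05, s_x=0.0, s_y=0.0))
      print(lanchester_with_suppression(100, 120, a=0.05, b=0.05, s_x=0.4, s_y=0.0))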