[Automated analyzer of enzyme immunoassay].
Osawa, S
1995-09-01
Automated analyzers for enzyme immunoassay can be classified from several points of view: the kind of labeled antibodies or enzymes, the detection method, the number of tests per unit time, and the analytical time and speed per run. In practice, it is important to consider several factors such as detection limits, the number of tests per unit time, analytical range, and precision. Most automated analyzers on the market can randomly access and measure samples. I describe recent advances in automated analyzers, reviewing their labeling antibodies and enzymes, their detection methods, the number of tests per unit time, and the analytical time and speed per test.
Assessment of Automated Measurement and Verification (M&V) Methods. Granderson, J. et al., Lawrence Berkeley National Laboratory. Companion report: Performance Metrics and Objective Testing Methods for Energy Baseline Modeling Software.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wood, Jordana R.; Gill, Gary A.; Kuo, Li-Jung
2016-04-20
Trace element determinations in seawater by inductively coupled plasma mass spectrometry are analytically challenging due to the typically very low concentrations of the trace elements and the potential interference of the salt matrix. In this study, we compared uranium analysis by inductively coupled plasma mass spectrometry (ICP-MS) of Sequim Bay seawater samples and three seawater certified reference materials (SLEW-3, CASS-5 and NASS-6) using seven different analytical approaches. The methods evaluated include: direct analysis, Fe/Pd reductive precipitation, standard addition calibration, online automated dilution using an external calibration with and without matrix matching, and online automated pre-concentration. The method which produced the most accurate results was standard addition calibration, recovering uranium from a Sequim Bay seawater sample at 101 ± 1.2%. The online pre-concentration method and the automated dilution with matrix-matched calibration method also performed well. The two least effective methods were direct analysis and Fe/Pd reductive precipitation using sodium borohydride.
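For readers unfamiliar with standard addition calibration, the best-performing approach above, the sketch below illustrates the arithmetic: known increments of analyte are spiked into aliquots of the sample, the signal is regressed against the added concentration, and the unspiked concentration is read off as the magnitude of the x-intercept. All numbers here are hypothetical and are not taken from the study.

```python
# Minimal sketch of standard-addition calibration (illustrative values only).
import numpy as np

added = np.array([0.0, 1.0, 2.0, 4.0])       # ng/L of U spiked into aliquots
signal = np.array([2.10, 2.85, 3.61, 5.08])  # ICP-MS response (arbitrary units)

slope, intercept = np.polyfit(added, signal, 1)
c_sample = intercept / slope                 # magnitude of the x-intercept

recovery = 100 * c_sample / 2.75             # vs. a hypothetical reference value
print(f"sample concentration: {c_sample:.2f} ng/L, recovery: {recovery:.0f}%")
```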
DOE Office of Scientific and Technical Information (OSTI.GOV)
Torres, P.; Luque de Castro, M.D.
1996-12-31
A fully automated method for the determination of organochlorine pesticides in vegetables is proposed. The overall system acts as an "analytical black box" because a robotic station performs the preliminary operations, from weighing to capping the leached analytes and placing them in the autosampler of an automated gas chromatograph with electron capture detection. The method has been applied to the determination of lindane, heptachlor, captan, chlordane and methoxychlor in tea, marjoram, cinnamon, pennyroyal, and mint with good results in most cases. A gas chromatograph has been interfaced to a robotic station for the determination of pesticides in vegetables. 15 refs., 4 figs., 2 tabs.
Xu, Wei
2007-12-01
This study adopts J. Rasmussen's (1985) abstraction hierarchy (AH) framework as an analytical tool to identify problems and pinpoint opportunities to enhance complex systems. The process of identifying problems and generating recommendations for complex systems using conventional methods is usually conducted based on incompletely defined work requirements. As the complexity of systems rises, the sheer mass of data generated from these methods becomes unwieldy to manage in a coherent, systematic form for analysis. There is little known work on adopting a broader perspective to fill these gaps. AH was used to analyze an aircraft-automation system in order to further identify breakdowns in pilot-automation interactions. Four steps follow: developing an AH model for the system, mapping the data generated by various methods onto the AH, identifying problems based on the mapped data, and presenting recommendations. The breakdowns lay primarily with automation operations that were more goal directed. Identified root causes include incomplete knowledge content and ineffective knowledge structure in pilots' mental models, lack of effective higher-order functional domain information displayed in the interface, and lack of sufficient automation procedures for pilots to effectively cope with unfamiliar situations. The AH is a valuable analytical tool to systematically identify problems and suggest opportunities for enhancing complex systems. It helps further examine the automation awareness problems and identify improvement areas from a work domain perspective. Applications include the identification of problems and generation of recommendations for complex systems as well as specific recommendations regarding pilot training, flight deck interfaces, and automation procedures.
Janiszewski, J; Schneider, P; Hoffmaster, K; Swyden, M; Wells, D; Fouda, H
1997-01-01
The development and application of membrane solid phase extraction (SPE) in 96-well microtiter plate format is described for the automated analysis of drugs in biological fluids. The small bed volume of the membrane allows elution of the analyte in a very small solvent volume, permitting direct HPLC injection and negating the need for the time consuming solvent evaporation step. A programmable liquid handling station (Quadra 96) was modified to automate all SPE steps. To avoid drying of the SPE bed and to enhance the analytical precision a novel protocol for performing the condition, load and wash steps in rapid succession was utilized. A block of 96 samples can now be extracted in 10 min., about 30 times faster than manual solvent extraction or single cartridge SPE methods. This processing speed complements the high-throughput speed of contemporary high performance liquid chromatography mass spectrometry (HPLC/MS) analysis. The quantitative analysis of a test analyte (Ziprasidone) in plasma demonstrates the utility and throughput of membrane SPE in combination with HPLC/MS. The results obtained with the current automated procedure compare favorably with those obtained using solvent and traditional solid phase extraction methods. The method has been used for the analysis of numerous drug prototypes in biological fluids to support drug discovery efforts.
Automated image quality assessment for chest CT scans.
Reeves, Anthony P; Xie, Yiting; Liu, Shuang
2018-02-01
Medical image quality needs to be maintained at standards sufficient for effective clinical reading. Automated computer analytic methods may be applied to medical images for quality assessment. For chest CT scans in a lung cancer screening context, an automated quality assessment method is presented that characterizes image noise and image intensity calibration. This is achieved by image measurements in three automatically segmented homogeneous regions of the scan: external air, trachea lumen air, and descending aorta blood. Profiles of CT scanner behavior are also computed. The method has been evaluated on both phantom and real low-dose chest CT scans and results show that repeatable noise and calibration measures may be realized by automated computer algorithms. Noise and calibration profiles show relevant differences between different scanners and protocols. Automated image quality assessment may be useful for quality control for lung cancer screening and may enable performance improvements to automated computer analysis methods. © 2017 American Association of Physicists in Medicine.
Automated UHPLC separation of 10 pharmaceutical compounds using software-modeling.
Zöldhegyi, A; Rieger, H-J; Molnár, I; Fekhretdinova, L
2018-03-20
Human mistakes are still one of the main underlying causes of regulatory findings and, for compliance with FDA's Data Integrity guidance and Analytical Quality by Design (AQbD), must be eliminated. To develop smooth, fast and robust methods that are free of human failures, a state-of-the-art automation approach is presented. For the scope of this study, a commercial software package (DryLab) and a model mixture of 10 drugs were subjected to testing. Following AQbD principles, the best available working point was selected, and confirmatory experimental runs, i.e. the six worst cases of the conducted robustness calculation, were performed. Simulated results were found to be in excellent agreement with the experimental ones, proving the usefulness and effectiveness of automated, software-assisted analytical method development. Copyright © 2018. Published by Elsevier B.V.
Automated dynamic analytical model improvement for damped structures
NASA Technical Reports Server (NTRS)
Fuh, J. S.; Berman, A.
1985-01-01
A method is described to improve a linear nonproportionally damped analytical model of a structure. The procedure finds the smallest changes in the analytical model such that the improved model matches the measured modal parameters. Features of the method are: (1) the ability to properly treat complex-valued modal parameters of a damped system; (2) applicability to realistically large structural models; and (3) computational efficiency, achieved without eigensolutions or the inversion of a large matrix.
Ates, Ebru; Mittendorf, Klaus; Senyuva, Hamide
2013-01-01
An automated sample preparation technique involving cleanup and analytical separation in a single operation using an online coupled TurboFlow (RP-LC system) is reported. This method eliminates time-consuming sample preparation steps that can be potential sources for cross-contamination in the analysis of plasticizers. Using TurboFlow chromatography, liquid samples were injected directly into the automated system without previous extraction or cleanup. Special cleanup columns enabled specific binding of target compounds; higher MW compounds, i.e., fats and proteins, and other matrix interferences with different chemical properties were removed to waste, prior to LC/MS/MS. Systematic stepwise method development using this new technology in the food safety area is described. Selection of optimum columns and mobile phases for loading onto the cleanup column followed by transfer onto the analytical column and MS detection are critical method parameters. The method was optimized for the assay of 10 phthalates (dimethyl, diethyl, dipropyl, butyl benzyl, diisobutyl, dicyclohexyl, dihexyl, diethylhexyl, diisononyl, and diisododecyl) and one adipate (diethylhexyl) in beverages and milk.
Method of multi-dimensional moment analysis for the characterization of signal peaks
Pfeifer, Kent B; Yelton, William G; Kerr, Dayle R; Bouchier, Francis A
2012-10-23
A method of multi-dimensional moment analysis for the characterization of signal peaks can be used to optimize the operation of an analytical system. With a two-dimensional Peclet analysis, the quality and signal fidelity of peaks in a two-dimensional experimental space can be analyzed and scored. This method is particularly useful in determining optimum operational parameters for an analytical system which requires the automated analysis of large numbers of analyte data peaks. For example, the method can be used to optimize analytical systems including an ion mobility spectrometer that uses a temperature stepped desorption technique for the detection of explosive mixtures.
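As a rough illustration of the moment analysis described above, the sketch below computes the zeroth, first, and second central moments of a synthetic peak and a Peclet-style sharpness score. The convention Pe ~ mean²/variance (up to a model-dependent prefactor) is an assumption of mine, not a detail taken from the patent.

```python
# Hedged sketch of one-dimensional moment analysis of a signal peak.
import numpy as np

t = np.linspace(0.0, 10.0, 501)                   # time axis, s
dt = t[1] - t[0]
y = np.exp(-0.5 * ((t - 4.0) / 0.5) ** 2)         # synthetic Gaussian peak

m0 = np.sum(y) * dt                               # zeroth moment: peak area
m1 = np.sum(t * y) * dt / m0                      # first moment: mean time
m2 = np.sum((t - m1) ** 2 * y) * dt / m0          # second central moment

# Peclet-style figure of merit; exact prefactor is model-dependent (assumed).
peclet = m1 ** 2 / m2
print(f"area={m0:.3f}, mean={m1:.3f} s, var={m2:.3f} s^2, Pe~{peclet:.0f}")
```

A narrow, symmetric peak scores high (here Pe ≈ 64 for a Gaussian with mean 4 s and sigma 0.5 s); broadened or distorted peaks score lower, which is what makes such a score usable for automated ranking of large numbers of analyte peaks.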
Raterink, Robert-Jan; Witkam, Yoeri; Vreeken, Rob J; Ramautar, Rawi; Hankemeier, Thomas
2014-10-21
In the field of bioanalysis, there is an increasing demand for miniaturized, automated, robust sample pretreatment procedures that can be easily connected to direct-infusion mass spectrometry (DI-MS) in order to allow the high-throughput screening of drugs and/or their metabolites in complex body fluids like plasma. Liquid-liquid extraction (LLE) is a common sample pretreatment technique often used for complex aqueous samples in bioanalysis. Despite significant developments in automated and miniaturized LLE procedures, fully automated LLE techniques allowing high-throughput bioanalytical studies on small-volume samples using direct-infusion mass spectrometry have not yet matured. Here, we introduce a new fully automated micro-LLE technique based on gas-pressure-assisted mixing followed by passive phase separation, coupled online to nanoelectrospray-DI-MS. Our method was characterized by varying the gas flow and its duration through the solvent mixture. For evaluation of the analytical performance, four drugs were spiked into human plasma, resulting in highly acceptable precision (RSD down to 9%) and linearity (R(2) ranging from 0.990 to 0.998). We demonstrate that our new method not only allows the reliable extraction of analytes from small sample volumes of a few microliters in an automated and high-throughput manner, but also performs comparably to or better than conventional offline LLE, in which the handling of small volumes remains challenging. Finally, we demonstrate the applicability of our method for drug screening on dried blood spots, showing excellent linearity (R(2) of 0.998) and precision (RSD of 9%). In conclusion, we present the proof of principle of a new high-throughput screening platform for bioanalysis based on a new automated micro-LLE method, coupled online to a commercially available nano-ESI-DI-MS.
Armbruster, David A; Overcash, David R; Reyes, Jaime
2014-01-01
The era of automation arrived with the introduction of the AutoAnalyzer using continuous flow analysis and the Robot Chemist that automated the traditional manual analytical steps. Successive generations of stand-alone analysers increased analytical speed, offered the ability to test high volumes of patient specimens, and provided large assay menus. A dichotomy developed, with a group of analysers devoted to performing routine clinical chemistry tests and another group dedicated to performing immunoassays using a variety of methodologies. Development of integrated systems greatly improved the analytical phase of clinical laboratory testing and further automation was developed for pre-analytical procedures, such as sample identification, sorting, and centrifugation, and post-analytical procedures, such as specimen storage and archiving. All phases of testing were ultimately combined in total laboratory automation (TLA) through which all modules involved are physically linked by some kind of track system, moving samples through the process from beginning-to-end. A newer and very powerful, analytical methodology is liquid chromatography-mass spectrometry/mass spectrometry (LC-MS/MS). LC-MS/MS has been automated but a future automation challenge will be to incorporate LC-MS/MS into TLA configurations. Another important facet of automation is informatics, including middleware, which interfaces the analyser software to a laboratory information systems (LIS) and/or hospital information systems (HIS). This software includes control of the overall operation of a TLA configuration and combines analytical results with patient demographic information to provide additional clinically useful information. This review describes automation relevant to clinical chemistry, but it must be recognised that automation applies to other specialties in the laboratory, e.g. haematology, urinalysis, microbiology. It is a given that automation will continue to evolve in the clinical laboratory, limited only by the imagination and ingenuity of laboratory scientists. PMID:25336760
Modular workcells: modern methods for laboratory automation.
Felder, R A
1998-12-01
Laboratory automation is beginning to become an indispensable survival tool for laboratories facing difficult market competition. However, estimates suggest that only 8% of laboratories will be able to afford total laboratory automation systems. Therefore, automation vendors have developed alternative hardware configurations called 'modular automation' to fit the smaller laboratory. Modular automation consists of consolidated analyzers, integrated analyzers, modular workcells, and pre- and post-analytical automation. These terms will be defined in this paper. Using a modular automation model, the automated core laboratory will become a site where laboratory data is evaluated by trained professionals to provide diagnostic information to practising physicians. Modern software information management and process control tools will complement modular hardware. Proper standardization that will allow vendor-independent modular configurations will assure the success of this revolutionary new technology.
Campone, Luca; Piccinelli, Anna Lisa; Celano, Rita; Russo, Mariateresa; Valdés, Alberto; Ibáñez, Clara; Rastrelli, Luca
2015-04-01
According to current demands and future perspectives in food safety, this study reports a fast and fully automated analytical method for the simultaneous analysis of the mycotoxins with high toxicity and wide spread, aflatoxins (AFs) and ochratoxin A (OTA) in dried fruits, a high-risk foodstuff. The method is based on pressurized liquid extraction (PLE), with aqueous methanol (30%) at 110 °C, of the slurried dried fruit and online solid-phase extraction (online SPE) cleanup of the PLE extracts with a C18 cartridge. The purified sample was directly analysed by ultra-high-pressure liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) for sensitive and selective determination of AFs and OTA. The proposed analytical procedure was validated for different dried fruits (vine fruit, fig and apricot), providing method detection and quantification limits much lower than the AFs and OTA maximum levels imposed by EU regulation in dried fruit for direct human consumption. Also, recoveries (83-103%) and repeatability (RSD < 8, n = 3) meet the performance criteria required by EU regulation for the determination of the levels of mycotoxins in foodstuffs. The main advantage of the proposed method is full automation of the whole analytical procedure that reduces the time and cost of the analysis, sample manipulation and solvent consumption, enabling high-throughput analysis and highly accurate and precise results.
Analytical Energy Gradients for Excited-State Coupled-Cluster Methods
NASA Astrophysics Data System (ADS)
Wladyslawski, Mark; Nooijen, Marcel
The equation-of-motion coupled-cluster (EOM-CC) and similarity transformed equation-of-motion coupled-cluster (STEOM-CC) methods have been firmly established as accurate and routinely applicable extensions of single-reference coupled-cluster theory to describe electronically excited states. An overview of these methods is provided, with emphasis on the many-body similarity transform concept that is the key to a rationalization of their accuracy. The main topic of the paper is the derivation of analytical energy gradients for such non-variational electronic structure approaches, with an ultimate focus on obtaining their detailed algebraic working equations. A general theoretical framework using Lagrange's method of undetermined multipliers is presented, and the method is applied to formulate the EOM-CC and STEOM-CC gradients in abstract operator terms, following the previous work in [P.G. Szalay, Int. J. Quantum Chem. 55 (1995) 151] and [S.R. Gwaltney, R.J. Bartlett, M. Nooijen, J. Chem. Phys. 111 (1999) 58]. Moreover, the systematics of the Lagrange multiplier approach is suitable for automation by computer, enabling the derivation of the detailed derivative equations through a standardized and direct procedure. To this end, we have developed the SMART (Symbolic Manipulation and Regrouping of Tensors) package of automated symbolic algebra routines, written in the Mathematica programming language. The SMART toolkit provides the means to expand, differentiate, and simplify equations by manipulation of the detailed algebraic tensor expressions directly. The Lagrangian multiplier formulation establishes a uniform strategy to perform the automated derivation in a standardized manner: A Lagrange multiplier functional is constructed from the explicit algebraic equations that define the energy in the electronic method; the energy functional is then made fully variational with respect to all of its parameters, and the symbolic differentiations directly yield the explicit equations for the wavefunction amplitudes, the Lagrange multipliers, and the analytical gradient via the perturbation-independent generalized Hellmann-Feynman effective density matrix. This systematic automated derivation procedure is applied to obtain the detailed gradient equations for the excitation energy (EE-), double ionization potential (DIP-), and double electron affinity (DEA-) similarity transformed equation-of-motion coupled-cluster singles-and-doubles (STEOM-CCSD) methods. In addition, the derivatives of the closed-shell-reference excitation energy (EE-), ionization potential (IP-), and electron affinity (EA-) equation-of-motion coupled-cluster singles-and-doubles (EOM-CCSD) methods are derived. Furthermore, the perturbative EOM-PT and STEOM-PT gradients are obtained. The algebraic derivative expressions for these dozen methods are all derived here uniformly through the automated Lagrange multiplier process and are expressed compactly in a chain-rule/intermediate-density formulation, which facilitates a unified modular implementation of analytic energy gradients for CCSD/PT-based electronic methods. The working equations for these analytical gradients are presented in full detail, and their factorization and implementation into an efficient computer code are discussed.
Davis, Mark D; Wade, Erin L; Restrepo, Paula R; Roman-Esteva, William; Bravo, Roberto; Kuklenyik, Peter; Calafat, Antonia M
2013-06-15
Organophosphate and pyrethroid insecticides and phenoxyacetic acid herbicides represent important classes of pesticides applied in commercial and residential settings. Interest in assessing the extent of human exposure to these pesticides exists because of their widespread use and their potential adverse health effects. An analytical method for measuring 12 biomarkers of several of these pesticides in urine has been developed. The target analytes were extracted from one milliliter of urine by a semi-automated solid phase extraction technique, separated from each other and from other urinary biomolecules by reversed-phase high performance liquid chromatography, and detected using tandem mass spectrometry with isotope dilution quantitation. This method can be used to measure all the target analytes in one injection with similar repeatability and detection limits of previous methods which required more than one injection. Each step of the procedure was optimized to produce a robust, reproducible, accurate, precise and efficient method. The required selectivity and sensitivity for trace-level analysis (e.g., limits of detection below 0.5ng/mL) was achieved using a narrow diameter analytical column, higher than unit mass resolution for certain analytes, and stable isotope labeled internal standards. The method was applied to the analysis of 55 samples collected from adult anonymous donors with no known exposure to the target pesticides. This efficient and cost-effective method is adequate to handle the large number of samples required for national biomonitoring surveys. Published by Elsevier B.V.
Néri-Quiroz, José; Canto, Fabrice; Guillerme, Laurent; Couston, Laurent; Magnaldo, Alastair; Dugas, Vincent
2016-10-01
A miniaturized and automated approach for the determination of free acidity in solutions containing uranium (VI) is presented. The measurement technique is based on the concept of sequential injection analysis with on-line spectroscopic detection. The proposed methodology relies on the complexation and alkalimetric titration of nitric acid using a pH 5.6 sodium oxalate solution. The titration process is followed by UV/VIS detection at 650 nm thanks to the addition of Congo red as a universal pH indicator. The mixing sequence as well as method validity was investigated by numerical simulation. This new analytical design allows fast (2.3 min), reliable and accurate free acidity determination of low-volume samples (10 µL) containing a uranium/[H(+)] mole ratio of 1:3, with a relative standard deviation of <7.0% (n=11). The linearity of the free nitric acid measurement is excellent up to 2.77 mol L(-1), with a correlation coefficient (R(2)) of 0.995. The method is specific: the presence of actinide ions up to 0.54 mol L(-1) does not interfere with the determination of free nitric acid. In addition to automation, the developed sequential injection analysis method greatly improves on the standard off-line oxalate complexation and alkalimetric titration method by reducing the required sample volume a thousandfold, the nuclear waste per analysis forty-fold, and the analysis time eight-fold. These analytical parameters are important, especially in nuclear-related applications, to improve laboratory safety, reduce personnel exposure to radioactive samples, and drastically reduce environmental impacts and analytical radioactive waste. Copyright © 2016 Elsevier B.V. All rights reserved.
Human versus automation in responding to failures: an expected-value analysis
NASA Technical Reports Server (NTRS)
Sheridan, T. B.; Parasuraman, R.
2000-01-01
A simple analytical criterion is provided for deciding whether a human or automation is best for a failure detection task. The method is based on expected-value decision theory in much the same way as is signal detection. It requires specification of the probabilities of misses (false negatives) and false alarms (false positives) for both human and automation being considered, as well as factors independent of the choice--namely, costs and benefits of incorrect and correct decisions as well as the prior probability of failure. The method can also serve as a basis for comparing different modes of automation. Some limiting cases of application are discussed, as are some decision criteria other than expected value. Actual or potential applications include the design and evaluation of any system in which either humans or automation are being considered.
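A hedged reconstruction of the expected-value criterion sketched above; the notation is mine, not the authors'. With a prior failure probability p and agent-specific miss and false-alarm probabilities, the rule is to assign the detection task to whichever agent (human or automation) has the larger expected value:

```latex
% Notation (assumed for illustration):
%   p          prior probability of failure
%   P_M, P_F   the agent's miss and false-alarm probabilities
%   V_TP, V_TN benefits of a correct detection / correct rejection
%   C_M, C_F   costs of a miss / a false alarm
\[
EV = p\left[(1-P_M)\,V_{TP} - P_M\,C_M\right]
   + (1-p)\left[(1-P_F)\,V_{TN} - P_F\,C_F\right]
\]
% Choose the human whenever EV_{human} > EV_{automation}; p and the payoffs
% are common to both agents, so only P_M and P_F drive the choice.
```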
Perspectives on bioanalytical mass spectrometry and automation in drug discovery.
Janiszewski, John S; Liston, Theodore E; Cole, Mark J
2008-11-01
The use of high speed synthesis technologies has resulted in a steady increase in the number of new chemical entities active in the drug discovery research stream. Large organizations can have thousands of chemical entities in various stages of testing and evaluation across numerous projects on a weekly basis. Qualitative and quantitative measurements made using LC/MS are integrated throughout this process from early stage lead generation through candidate nomination. Nearly all analytical processes and procedures in modern research organizations are automated to some degree. This includes both hardware and software automation. In this review we discuss bioanalytical mass spectrometry and automation as components of the analytical chemistry infrastructure in pharma. Analytical chemists are presented as members of distinct groups with similar skillsets that build automated systems, manage test compounds, assays and reagents, and deliver data to project teams. The ADME-screening process in drug discovery is used as a model to highlight the relationships between analytical tasks in drug discovery. Emerging software and process automation tools are described that can potentially address gaps and link analytical chemistry related tasks. The role of analytical chemists and groups in modern 'industrialized' drug discovery is also discussed.
Generic and Automated Data Evaluation in Analytical Measurement.
Adam, Martin; Fleischer, Heidi; Thurow, Kerstin
2017-04-01
In recent years, automation has become more and more important in the field of elemental and structural chemical analysis, to reduce the high degree of manual operation and processing time as well as human error. A high number of data points are thus generated, which requires fast and automated data evaluation. To handle the preprocessed export data coming from different analytical devices and from software of various vendors, a standardized solution requiring no programming knowledge is preferable. In modern laboratories, multiple users will use this software on multiple personal computers with different operating systems (e.g., Windows, Macintosh, Linux). Mobile devices such as smartphones and tablets have also gained growing importance. The developed software, Project Analytical Data Evaluation (ADE), is implemented as a web application. To transmit the preevaluated data from the device software to the Project ADE, the exported XML report files are detected and the included data are imported into the entities database using the Data Upload software. Different calculation types of a sample within one measurement series (e.g., method validation) are identified using information tags inside the sample name. The results are presented in tables and diagrams on different information levels (general, detailed for one analyte or sample).
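The upload step described above (detect exported XML reports, read out sample-name tags, import results) might look roughly like the sketch below. All tag and attribute names are hypothetical; real vendor report schemas differ, and this is only the shape of the idea.

```python
# Hedged sketch of an XML-report import step. Tag/attribute names are made up.
import xml.etree.ElementTree as ET
from pathlib import Path

def import_reports(folder: str):
    rows = []
    for report in Path(folder).glob("*.xml"):
        root = ET.parse(report).getroot()
        for sample in root.iter("Sample"):          # hypothetical tag
            name = sample.get("name", "")
            # calculation type encoded as a tag inside the sample name,
            # e.g. "blank_...", "cal_...", as the abstract describes
            kind = name.split("_", 1)[0] if "_" in name else "unknown"
            for res in sample.iter("Result"):       # hypothetical tag
                rows.append((name, kind, res.get("analyte"),
                             float(res.get("value", "nan"))))
    return rows

print(import_reports("exports"))  # -> list of (sample, type, analyte, value)
```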
MODULAR ANALYTICS: A New Approach to Automation in the Clinical Laboratory.
Horowitz, Gary L; Zaman, Zahur; Blanckaert, Norbert J C; Chan, Daniel W; Dubois, Jeffrey A; Golaz, Olivier; Mensi, Noury; Keller, Franz; Stolz, Herbert; Klingler, Karl; Marocchi, Alessandro; Prencipe, Lorenzo; McLawhon, Ronald W; Nilsen, Olaug L; Oellerich, Michael; Luthe, Hilmar; Orsonneau, Jean-Luc; Richeux, Gérard; Recio, Fernando; Roldan, Esther; Rymo, Lars; Wicktorsson, Anne-Charlotte; Welch, Shirley L; Wieland, Heinrich; Grawitz, Andrea Busse; Mitsumaki, Hiroshi; McGovern, Margaret; Ng, Katherine; Stockmann, Wolfgang
2005-01-01
MODULAR ANALYTICS (Roche Diagnostics) (MODULAR ANALYTICS, Elecsys and Cobas Integra are trademarks of a member of the Roche Group) represents a new approach to automation for the clinical chemistry laboratory. It consists of a control unit, a core unit with a bidirectional multitrack rack transportation system, and three distinct kinds of analytical modules: an ISE module, a P800 module (44 photometric tests, throughput of up to 800 tests/h), and a D2400 module (16 photometric tests, throughput up to 2400 tests/h). MODULAR ANALYTICS allows customised configurations for various laboratory workloads. The performance and practicability of MODULAR ANALYTICS were evaluated in an international multicentre study at 16 sites. Studies included precision, accuracy, analytical range, carry-over, and workflow assessment. More than 700 000 results were obtained during the course of the study. Median between-day CVs were typically less than 3% for clinical chemistries and less than 6% for homogeneous immunoassays. Median recoveries for nearly all standardised reference materials were within 5% of assigned values. Method comparisons versus current existing routine instrumentation were clinically acceptable in all cases. During the workflow studies, the work from three to four single workstations was transferred to MODULAR ANALYTICS, which offered over 100 possible methods, with reduction in sample splitting, handling errors, and turnaround time. Typical sample processing time on MODULAR ANALYTICS was less than 30 minutes, an improvement from the current laboratory systems. By combining multiple analytic units in flexible ways, MODULAR ANALYTICS met diverse laboratory needs and offered improvement in workflow over current laboratory situations. It increased overall efficiency while maintaining (or improving) quality.
Chung, Kun Ho; Choi, Sang Do; Choi, Geun Sik; Kang, Mun Ja
2013-11-01
A modular automated radionuclide separator for (99)Tc (MARS Tc-99) has been developed for the rapid and reproducible separation of technetium in groundwater samples. The control software of MARS Tc-99 was developed in the LabView programming language. An automated radiochemical method for separating (99)Tc was developed and validated by the purification of (99m)Tc tracer solution eluted from a commercial (99)Mo/(99m)Tc generator. The chemical recovery and analytical time for this radiochemical method were found to be 96 ± 2% and 81 min, respectively. Copyright © 2013 Elsevier Ltd. All rights reserved.
The report documents the technical approach and results achieved while developing a grab sampling method and an automated, on-line gas chromatography method suitable to characterize nitrous oxide (N2O) emissions from fossil fuel combustion sources. The two methods developed have...
Automated indirect immunofluorescence evaluation of antinuclear autoantibodies on HEp-2 cells.
Voigt, Jörn; Krause, Christopher; Rohwäder, Edda; Saschenbrecker, Sandra; Hahn, Melanie; Danckwardt, Maick; Feirer, Christian; Ens, Konstantin; Fechner, Kai; Barth, Erhardt; Martinetz, Thomas; Stöcker, Winfried
2012-01-01
Indirect immunofluorescence (IIF) on human epithelial (HEp-2) cells is considered the gold standard screening method for the detection of antinuclear autoantibodies (ANA). However, in terms of automation and standardization, it has not been able to keep pace with most other analytical techniques used in diagnostic laboratories. Although some automation solutions for IIF incubation are already on the market, the automation of result evaluation is still in its infancy. Therefore, the EUROPattern Suite has been developed as a comprehensive automated processing and interpretation system for standardized and efficient ANA detection by HEp-2 cell-based IIF. In this study, the automated pattern recognition was compared to conventional visual interpretation in a total of 351 sera. In the discrimination of positive from negative samples, concordant results between visual and automated evaluation were obtained for 349 sera (99.4%, kappa = 0.984). The system missed none of the 272 antibody-positive samples and identified 77 out of 79 visually negative samples (analytical sensitivity/specificity: 100%/97.5%). Moreover, 94.0% of all main antibody patterns were recognized correctly by the software. Owing to its performance characteristics, EUROPattern enables fast, objective, and economical IIF ANA analysis and has the potential to reduce intra- and interlaboratory variability.
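The agreement statistics quoted above can be reproduced from the reported counts. The short check below rebuilds the 2×2 table implied by the abstract (272 true positives, 0 false negatives, 2 false positives, 77 true negatives) and recomputes the observed agreement and Cohen's kappa.

```python
# Recomputing Cohen's kappa from the counts reported in the abstract.
tp, fn, fp, tn = 272, 0, 2, 77
n = tp + fn + fp + tn                                        # 351 sera

po = (tp + tn) / n                                           # observed agreement
pe = ((tp + fn) * (tp + fp) + (fp + tn) * (fn + tn)) / n**2  # chance agreement
kappa = (po - pe) / (1 - pe)

print(f"agreement={po:.3f}, kappa={kappa:.3f}")              # 0.994, 0.984
```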
ERIC Educational Resources Information Center
Mendiburo, Maria; Williams, Laura; Segedy, James; Hasselbring, Ted
2013-01-01
In this paper, the authors explore the use of learning analytics as a method for easing the cognitive demands on teachers implementing the HALF instructional model. Learning analytics has been defined as "the measurement, collection, analysis and reporting of data about learners and their contexts for the purposes of understanding and…
Lehotay, Steven J; Han, Lijun; Sapozhnikova, Yelena
2016-01-01
This study demonstrated the application of an automated high-throughput mini-cartridge solid-phase extraction (mini-SPE) cleanup for the rapid low-pressure gas chromatography-tandem mass spectrometry (LPGC-MS/MS) analysis of pesticides and environmental contaminants in QuEChERS extracts of foods. Cleanup efficiencies and breakthrough volumes using different mini-SPE sorbents were compared using avocado, salmon, pork loin, and kale as representative matrices. Optimum extract load volume was 300 µL for the 45 mg mini-cartridges containing 20/12/12/1 (w/w/w/w) anh. MgSO4/PSA (primary secondary amine)/C18/CarbonX sorbents used in the final method. In method validation to demonstrate high-throughput capabilities and performance results, 230 spiked extracts of 10 different foods (apple, kiwi, carrot, kale, orange, black olive, wheat grain, dried basil, pork, and salmon) underwent automated mini-SPE cleanup and analysis over the course of 5 days. In all, 325 analyses for 54 pesticides and 43 environmental contaminants (3 analyzed together) were conducted using the 10 min LPGC-MS/MS method without changing the liner or retuning the instrument. Merely 1 mg equivalent of sample injected achieved <5 ng/g limits of quantification. With the use of internal standards, method validation results showed that 91 of the 94 analytes, including pairs, achieved satisfactory results (70-120% recovery and RSD ≤ 25%) in the 10 tested food matrices (n = 160). Matrix effects were typically less than ±20%, mainly due to the use of analyte protectants, and minimal human review of software data processing was needed due to summation-function integration of analyte peaks. This study demonstrated that the automated mini-SPE + LPGC-MS/MS method yielded accurate results in rugged, high-throughput operations with minimal labor and data review.
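The acceptance criteria quoted above (70-120% mean recovery, RSD ≤ 25%) translate into a simple per-analyte check; a minimal sketch follows, with illustrative replicate recoveries that are not the study's data.

```python
# Minimal sketch of a recovery/RSD acceptance check (illustrative numbers).
import statistics

def passes(recoveries_pct):
    mean = statistics.mean(recoveries_pct)
    rsd = 100 * statistics.stdev(recoveries_pct) / mean  # relative std. dev.
    return 70 <= mean <= 120 and rsd <= 25, mean, rsd

ok, mean, rsd = passes([92.0, 101.5, 88.7, 97.3])
print(f"pass={ok}, mean recovery={mean:.1f}%, RSD={rsd:.1f}%")
```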
Rapid methods for the isolation of actinides Sr, Tc and Po from raw urine.
McAlister, Daniel R; Horwitz, E Philip; Harvey, James T
2011-08-01
Rapid methods for the isolation and analysis of individual actinides (Th, U, Pu, Am/Cm) and Sr, Tc and Po from small volumes of raw urine have been developed. The methods involve acidification of the sample and the addition of aluminum nitrate or aluminum chloride salting-out agent prior to isolation of the desired analyte using a tandem combination of prefilter material and extraction chromatographic resin. The method has been applied to the separation of individual analytes from spiked urine samples. Analytes were recovered in high yield and radionuclide purity with separation times as low as 30 min. The chemistry employed is compatible with automation on the ARSIIe instrument.
ASPECTS: an automation-assisted SPE method development system.
Li, Ming; Chou, Judy; King, Kristopher W; Yang, Liyu
2013-07-01
A typical conventional SPE method development (MD) process usually involves deciding the chemistry of the sorbent and eluent based on information about the analyte; experimentally preparing and trying out various combinations of adsorption chemistry and elution conditions; quantitatively evaluating the various conditions; and comparing quantitative results from all combination of conditions to select the best condition for method qualification. The second and fourth steps have mostly been performed manually until now. We developed an automation-assisted system that expedites the conventional SPE MD process by automating 99% of the second step, and expedites the fourth step by automatically processing the results data and presenting it to the analyst in a user-friendly format. The automation-assisted SPE MD system greatly saves the manual labor in SPE MD work, prevents analyst errors from causing misinterpretation of quantitative results, and shortens data analysis and interpretation time.
NASA Astrophysics Data System (ADS)
Rossetti, Cecilia; Świtnicka-Plak, Magdalena A.; Grønhaug Halvorsen, Trine; Cormack, Peter A. G.; Sellergren, Börje; Reubsaet, Léon
2017-03-01
Robust biomarker quantification is essential for the accurate diagnosis of diseases and is of great value in cancer management. In this paper, an innovative diagnostic platform is presented which provides automated molecularly imprinted solid-phase extraction (MISPE) followed by liquid chromatography-mass spectrometry (LC-MS) for biomarker determination using ProGastrin Releasing Peptide (ProGRP), a highly sensitive biomarker for Small Cell Lung Cancer, as a model. Molecularly imprinted polymer microspheres were synthesized by precipitation polymerization and analytical optimization of the most promising material led to the development of an automated quantification method for ProGRP. The method enabled analysis of patient serum samples with elevated ProGRP levels. Particularly low sample volumes were permitted using the automated extraction within a method which was time-efficient, thereby demonstrating the potential of such a strategy in a clinical setting.
NASA Technical Reports Server (NTRS)
Ling, A. C.; Macpherson, L. H.; Rey, M.
1981-01-01
The potential use of isotopically excited energy-dispersive X-ray fluorescence (XRF) spectrometry for automated on-line, fast, real-time (5 to 15 minutes), simultaneous multicomponent (up to 20 elements) trace (1 to 10 parts per billion) analysis of inorganic pollutants in reclaimed water was examined. Three anionic elements (chromium(VI), arsenic and selenium) were studied. The inherent lack of sensitivity of XRF spectrometry for these elements mandates use of a preconcentration technique, and various methods were examined, including: several direct and indirect evaporation methods; ion exchange membranes; selective and nonselective precipitation; and complexation processes. It is shown that XRF spectrometry itself is well suited for automated on-line quality assurance, and can provide a nondestructive (and thus sample storage and repeat analysis capabilities) and particularly convenient analytical method. Further, the use of an isotopically excited energy-dispersive unit (50 mCi Cd-109 source) coupled with a suitable preconcentration process can provide sufficient sensitivity to achieve the currently mandated minimum levels of detection without the need for high-power X-ray generating tubes.
Tak For Yu, Zeta; Guan, Huijiao; Ki Cheung, Mei; McHugh, Walker M.; Cornell, Timothy T.; Shanley, Thomas P.; Kurabayashi, Katsuo; Fu, Jianping
2015-01-01
Immunoassays represent one of the most popular analytical methods for detection and quantification of biomolecules. However, conventional immunoassays such as ELISA and flow cytometry, even though providing high sensitivity and specificity and multiplexing capability, can be labor-intensive and prone to human error, making them unsuitable for standardized clinical diagnoses. Using a commercialized no-wash, homogeneous immunoassay technology (‘AlphaLISA’) in conjunction with integrated microfluidics, herein we developed a microfluidic immunoassay chip capable of rapid, automated, parallel immunoassays of microliter quantities of samples. Operation of the microfluidic immunoassay chip entailed rapid mixing and conjugation of AlphaLISA components with target analytes before quantitative imaging for analyte detections in up to eight samples simultaneously. Aspects such as fluid handling and operation, surface passivation, imaging uniformity, and detection sensitivity of the microfluidic immunoassay chip using AlphaLISA were investigated. The microfluidic immunoassay chip could detect one target analyte simultaneously for up to eight samples in 45 min with a limit of detection down to 10 pg mL−1. The microfluidic immunoassay chip was further utilized for functional immunophenotyping to examine cytokine secretion from human immune cells stimulated ex vivo. Together, the microfluidic immunoassay chip provides a promising high-throughput, high-content platform for rapid, automated, parallel quantitative immunosensing applications. PMID:26074253
Lerch, Oliver; Temme, Oliver; Daldrup, Thomas
2014-07-01
The analysis of opioids, cocaine, and metabolites from blood serum is a routine task in forensic laboratories. Commonly, the employed methods include many manual or partly automated steps like protein precipitation, dilution, solid phase extraction, evaporation, and derivatization preceding a gas chromatography (GC)/mass spectrometry (MS) or liquid chromatography (LC)/MS analysis. In this study, a comprehensively automated method was developed from a validated, partly automated routine method. This was possible by replicating method parameters on the automated system. Only marginal optimization of parameters was necessary. The automation relying on an x-y-z robot after manual protein precipitation includes the solid phase extraction, evaporation of the eluate, derivatization (silylation with N-methyl-N-trimethylsilyltrifluoroacetamide, MSTFA), and injection into a GC/MS. A quantitative analysis of almost 170 authentic serum samples and more than 50 authentic samples of other matrices like urine, different tissues, and heart blood on cocaine, benzoylecgonine, methadone, morphine, codeine, 6-monoacetylmorphine, dihydrocodeine, and 7-aminoflunitrazepam was conducted with both methods proving that the analytical results are equivalent even near the limits of quantification (low ng/ml range). To our best knowledge, this application is the first one reported in the literature employing this sample preparation system.
Christiaens, B; Chiap, P; Rbeida, O; Cello, D; Crommen, J; Hubert, Ph
2003-09-25
A new fully automated method for the quantitative analysis of an antiandrogenic substance, cyproterone acetate (CPA), in plasma samples has been developed using on-line solid-phase extraction (SPE) prior to determination by reversed-phase liquid chromatography (LC). The automated method was based on the use of a precolumn packed with an internal-surface reversed-phase packing material (LiChrospher RP-4 ADS) for sample clean-up, coupled to LC analysis on an octadecyl stationary phase using a column-switching system. A 200-microL volume of plasma sample was injected directly onto the precolumn packed with restricted access material, using a mixture of water-acetonitrile (90:10, v/v) as washing liquid. The analyte was then eluted in the back-flush mode with the LC mobile phase, which consisted of a mixture of phosphate buffer, pH 7.0-acetonitrile (54:46, v/v). The elution profiles of CPA and blank plasma samples on the precolumn and the time needed for analyte transfer from the precolumn to the analytical column were determined. Different compositions of washing liquid and mobile phase were tested to reduce the interference of endogenous plasma components. UV detection was performed at 280 nm. Finally, the developed method was validated using a new approach, namely the application of the accuracy profile based on the 90% confidence interval of the total measurement error (bias + standard deviation). The limit of quantification of cyproterone acetate in plasma was determined to be 15 ng mL(-1). The validated method should be applicable to the determination of CPA in patients treated with at least 50 mg day(-1).
NASA Astrophysics Data System (ADS)
Wang, Ke; Guo, Ping; Luo, A.-Li
2017-03-01
Spectral feature extraction is a crucial procedure in automated spectral analysis. This procedure starts from the spectral data and produces informative and non-redundant features, facilitating the subsequent automated processing and analysis with machine-learning and data-mining techniques. In this paper, we present a new automated feature extraction method for astronomical spectra, with application in spectral classification and defective spectra recovery. The basic idea of our approach is to train a deep neural network to extract features of spectra with different levels of abstraction in different layers. The deep neural network is trained with a fast layer-wise learning algorithm in an analytical way without any iterative optimization procedure. We evaluate the performance of the proposed scheme on real-world spectral data. The results demonstrate that our method is superior regarding its comprehensive performance, and the computational cost is significantly lower than that for other methods. The proposed method can be regarded as a new valid alternative general-purpose feature extraction method for various tasks in spectral data analysis.
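The abstract above describes layer-wise training with a closed-form (analytical) solution rather than iterative optimization. A minimal sketch in that spirit follows, using ELM-style autoencoders whose output weights come from a single regularized least-squares solve per layer; the paper's actual architecture and data may differ, and all sizes here are made up.

```python
# Hedged sketch of analytical, non-iterative layer-wise feature learning.
import numpy as np

rng = np.random.default_rng(0)

def train_layer(X, n_hidden, reg=1e-3):
    """Learn one layer's feature map in closed form (ELM-autoencoder style)."""
    W = rng.standard_normal((X.shape[1], n_hidden))   # random input projection
    H = np.tanh(X @ W)                                # hidden activations
    # ridge solution of H @ B = X: output weights of the autoencoder
    B = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ X)
    return lambda Z: np.tanh(Z @ B.T)                 # reuse B.T as the encoder

X = rng.standard_normal((200, 64))                    # 200 spectra, 64 bins
enc1 = train_layer(X, 32)
F1 = enc1(X)                                          # level-1 features
enc2 = train_layer(F1, 16)
F2 = enc2(F1)                                         # level-2, more abstract
print(F2.shape)                                       # (200, 16)
```

Because each layer is a single linear solve, training cost grows with matrix size rather than epoch count, which is consistent with the low computational cost the abstract claims for such schemes.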
Automation, consolidation, and integration in autoimmune diagnostics.
Tozzoli, Renato; D'Aurizio, Federica; Villalta, Danilo; Bizzaro, Nicola
2015-08-01
Over the past two decades, we have witnessed an extraordinary change in autoimmune diagnostics, characterized by the progressive evolution of analytical technologies, the availability of new tests, and the explosive growth of molecular biology and proteomics. Aside from these huge improvements, organizational changes have also occurred which brought about a more modern vision of the autoimmune laboratory. The introduction of automation (for harmonization of testing, reduction of human error, reduction of handling steps, increase of productivity, decrease of turnaround time, improvement of safety), consolidation (combining different analytical technologies or strategies on one instrument or on one group of connected instruments) and integration (linking analytical instruments or group of instruments with pre- and post-analytical devices) opened a new era in immunodiagnostics. In this article, we review the most important changes that have occurred in autoimmune diagnostics and present some models related to the introduction of automation in the autoimmunology laboratory, such as automated indirect immunofluorescence and changes in the two-step strategy for detection of autoantibodies; automated monoplex immunoassays and reduction of turnaround time; and automated multiplex immunoassays for autoantibody profiling.
Vorberg, Ellen; Fleischer, Heidi; Junginger, Steffen; Liu, Hui; Stoll, Norbert; Thurow, Kerstin
2016-10-01
Life science areas require specific sample pretreatment to increase the concentration of the analytes and/or to convert the analytes into an appropriate form for the detection and separation systems. Various workstations are commercially available, allowing for automated biological sample pretreatment. Nevertheless, due to the required temperature, pressure, and volume conditions in typical element- and structure-specific measurements, these automated platforms are not suitable for analytical processes. Thus, the purpose of the presented investigation was the design, realization, and evaluation of an automated system ensuring high-precision sample preparation for a variety of analytical measurements. The developed system has to enable system adaptation and high performance flexibility. Furthermore, the system has to be capable of dealing with the wide range of required vessels simultaneously, allowing for less costly and time-consuming process steps. The system's functionality has been confirmed in various validation sequences. Using element-specific measurements, the automated system was up to 25% more precise compared to the manual procedure, and as precise as the manual procedure using structure-specific measurements. © 2015 Society for Laboratory Automation and Screening.
An automated protocol for performance benchmarking a widefield fluorescence microscope.
Halter, Michael; Bier, Elianna; DeRose, Paul C; Cooksey, Gregory A; Choquette, Steven J; Plant, Anne L; Elliott, John T
2014-11-01
Widefield fluorescence microscopy is a highly used tool for visually assessing biological samples and for quantifying cell responses. Despite its widespread use in high content analysis and other imaging applications, few published methods exist for evaluating and benchmarking the analytical performance of a microscope. Easy-to-use benchmarking methods would facilitate the use of fluorescence imaging as a quantitative analytical tool in research applications, and would aid the determination of instrumental method validation for commercial product development applications. We describe and evaluate an automated method to characterize a fluorescence imaging system's performance by benchmarking the detection threshold, saturation, and linear dynamic range to a reference material. The benchmarking procedure is demonstrated using two different materials as the reference material, uranyl-ion-doped glass and Schott 475 GG filter glass. Both are suitable candidate reference materials that are homogeneously fluorescent and highly photostable, and the Schott 475 GG filter glass is currently commercially available. In addition to benchmarking the analytical performance, we also demonstrate that the reference materials provide for accurate day-to-day intensity calibration. Published 2014 Wiley Periodicals Inc. This article is a US government work and, as such, is in the public domain in the United States of America.
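One plausible way to compute the three benchmark quantities named above from a photostable reference is sketched below: detection threshold from dark-frame statistics, and saturation onset as the departure from a linear intensity-versus-exposure fit. All numbers and the 5% linearity tolerance are illustrative assumptions, not the paper's procedure in detail.

```python
# Hedged sketch of detection threshold / saturation / linear-range estimation.
import numpy as np

exposure = np.array([1, 2, 5, 10, 20, 50, 100, 200, 400])        # ms
intensity = np.array([102, 104, 110, 120, 140, 200, 300, 490, 620])
dark_mean, dark_sd = 100.0, 1.5                  # dark-frame statistics

threshold = dark_mean + 3 * dark_sd              # detection threshold, counts
slope, icept = np.polyfit(exposure[:7], intensity[:7], 1)   # low-end fit
deviation = np.abs(intensity - (slope * exposure + icept)) / intensity
saturated = exposure[deviation > 0.05]           # >5% departure from linearity

print(f"detection threshold: {threshold:.1f} counts")
print(f"saturation onset: ~{saturated[0] if saturated.size else None} ms")
```

The linear dynamic range then spans from the detection threshold to the intensity at the saturation onset, and rerunning the same script day to day gives the intensity-calibration trend the abstract mentions.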
Patton, Charles J.; Kryskalla, Jennifer R.
2013-01-01
A multiyear research effort at the U.S. Geological Survey (USGS) National Water Quality Laboratory (NWQL) evaluated several commercially available nitrate reductase (NaR) enzymes as replacements for toxic cadmium in longstanding automated colorimetric air-segmented continuous-flow analyzer (CFA) methods for determining nitrate plus nitrite (NOx) in water. This research culminated in USGS-approved standard- and low-level enzymatic-reduction, colorimetric, automated discrete analyzer NOx methods that have been in routine operation at the NWQL since October 2011. The enzyme used in these methods (AtNaR2) is a product of recombinant expression of NaR from Arabidopsis thaliana (L.) Heynh. (mouse-ear cress) in the yeast Pichia pastoris. Because the scope of the validation report for these new automated discrete analyzer methods, published as U.S. Geological Survey Techniques and Methods 5–B8, was limited to performance benchmarks and operational details, extensive foundational research with different enzymes—primarily YNaR1, a product of recombinant expression of NaR from Pichia angusta in the yeast Pichia pastoris—remained unpublished until now. This report documents research and development at the NWQL that was foundational to the development and validation of the discrete analyzer methods. It includes: (1) details of the instrumentation used to acquire kinetics data for several NaR enzymes in the presence and absence of known or suspected inhibitors in relation to reaction temperature and reaction pH; and (2) validation results—method detection limits, precision and bias estimates, spike recoveries, and interference studies—for standard- and low-level automated colorimetric CFA-YNaR1 reduction NOx methods in relation to corresponding USGS-approved CFA cadmium-reduction (CdR) NOx methods. The cornerstone of this validation is paired-sample statistical and graphical analysis of NOx concentrations from more than 3,800 geographically and seasonally diverse surface-water and groundwater samples that were analyzed in parallel by CFA-CdR and CFA enzyme-reduction methods. Finally, the report includes (3) a demonstration of a semiautomated batch procedure in which 2-milliliter analyzer cups or disposable spectrophotometer cuvettes serve as reaction vessels for enzymatic reduction of nitrate to nitrite prior to analytical determinations. After the reduction step, analyzer cups are loaded onto CFA, flow injection, or discrete analyzers for simple, rapid, automatic nitrite determinations. In the case of manual determinations, analysts dispense colorimetric reagents into cuvettes containing post-reduction samples, allow time for color to develop, insert cuvettes individually into a spectrophotometer, and record percent transmittance or absorbance in relation to a reagent blank. Data presented here demonstrate that the analytical performance of the enzymatic-reduction NOx methods in these various formats is equivalent to that of the benchmark CFA-CdR NOx methods.
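The report's paired-sample statistics are not reproduced in the abstract; one common form of paired agreement analysis (mean bias with 95% limits of agreement, in the style of Bland-Altman) can be sketched in Python as follows, with variable names purely illustrative:

    import numpy as np

    def paired_agreement(cdr, ynar):
        """Mean bias and 95% limits of agreement between paired
        cadmium-reduction (cdr) and enzyme-reduction (ynar) NOx results."""
        d = np.asarray(ynar, float) - np.asarray(cdr, float)
        bias = d.mean()
        half_width = 1.96 * d.std(ddof=1)
        return bias, (bias - half_width, bias + half_width)

    # e.g. bias, loa = paired_agreement(cdr_results, ynar_results)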
NASA Astrophysics Data System (ADS)
Kuzmiakova, Adele; Dillner, Ann M.; Takahama, Satoshi
2016-06-01
A growing body of research on statistical applications for characterization of atmospheric aerosol Fourier transform infrared (FT-IR) samples collected on polytetrafluoroethylene (PTFE) filters (e.g., Russell et al., 2011; Ruthenburg et al., 2014) and a rising interest in analyzing FT-IR samples collected by air quality monitoring networks call for an automated PTFE baseline correction solution. The existing polynomial technique (Takahama et al., 2013) is not scalable to a project with a large number of aerosol samples because it contains many parameters and requires expert intervention. The question of how to develop an automated method for baseline correcting hundreds to thousands of ambient aerosol spectra, given the variability in both environmental mixture composition and PTFE baselines, therefore remains. This study approaches the question by detailing a statistical protocol that allows for the precise definition of analyte and background subregions, applies nonparametric smoothing splines to reproduce sample-specific PTFE variations, and integrates performance metrics from atmospheric aerosol and blank samples alike in the smoothing parameter selection. Referencing 794 atmospheric aerosol samples from seven Interagency Monitoring of PROtected Visual Environments (IMPROVE) sites collected during 2011, we start by identifying key FT-IR signal characteristics, such as non-negative absorbance or analyte segment transformation, to capture sample-specific transitions between background and analyte. While referring to qualitative properties of the PTFE background, the goal of smoothing-spline interpolation is to learn the baseline structure in the background region in order to predict the baseline structure in the analyte region. We then validate the model by comparing smoothing-spline baseline-corrected spectra with uncorrected and polynomial baseline (PB)-corrected equivalents via three statistical applications: (1) clustering analysis, (2) functional group quantification, and (3) thermal optical reflectance (TOR) organic carbon (OC) and elemental carbon (EC) predictions. The discrepancy rate for a four-cluster solution is 10 %. For all functional groups but carboxylic COH the discrepancy is ≤ 10 %. Performance metrics obtained from TOR OC and EC predictions (R² ≥ 0.94, bias ≤ 0.01 µg m⁻³, and error ≤ 0.04 µg m⁻³) are on a par with those obtained from uncorrected and PB-corrected spectra. The proposed protocol leads to visually and analytically similar estimates as those generated by the polynomial method. More importantly, the automated solution allows us and future users to evaluate its analytical reproducibility while minimizing reducible user bias. We anticipate the protocol will enable FT-IR researchers and data analysts to quickly and reliably analyze large amounts of data and connect them to a variety of available statistical learning methods to be applied to analyte absorbances isolated in atmospheric aerosol samples.
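The abstract gives no code; a minimal sketch of the core idea—fitting a smoothing spline only to the background subregions and predicting the baseline across the analyte region—might look as follows, with scipy's UnivariateSpline standing in for the paper's spline machinery and all wavenumber bounds hypothetical:

    import numpy as np
    from scipy.interpolate import UnivariateSpline

    def baseline_correct(wavenumbers, absorbance, analyte_regions, s=1.0):
        """Fit a smoothing spline to background subregions only, then
        subtract the predicted baseline over the full spectrum.
        analyte_regions: list of (lo, hi) wavenumber bounds to exclude.
        s: smoothing parameter (selected in the paper via performance
           metrics from aerosol and blank samples; fixed here)."""
        mask = np.ones_like(wavenumbers, dtype=bool)
        for lo, hi in analyte_regions:
            mask &= ~((wavenumbers >= lo) & (wavenumbers <= hi))
        spline = UnivariateSpline(wavenumbers[mask], absorbance[mask], s=s)
        return absorbance - spline(wavenumbers)

    # Example with a hypothetical C-H stretch analyte window:
    wn = np.linspace(1500, 4000, 2000)
    spectrum = 0.05 * np.sin(wn / 300.0) + np.exp(-((wn - 2900) / 60.0) ** 2)
    corrected = baseline_correct(wn, spectrum, analyte_regions=[(2750, 3050)])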
Vogeser, Michael; Spöhrer, Ute
2006-01-01
Liquid chromatography tandem-mass spectrometry (LC-MS/MS) is an efficient technology for routine determination of immunosuppressants in whole blood; however, time-consuming manual sample preparation remains a significant limitation of this technique. Using a commercially available robotic pipetting system (Tecan Freedom EVO), we developed an automated sample-preparation protocol for quantification of tacrolimus in whole blood by LC-MS/MS. Barcode reading, sample resuspension, transfer of whole blood aliquots into a deep-well plate, addition of internal standard solution, mixing, and protein precipitation by addition of an organic solvent are performed by the robotic system. After centrifugation of the plate, the deproteinized supernatants are submitted to on-line solid phase extraction, using column switching prior to LC-MS/MS analysis. The only manual actions within the entire process are decapping of the tubes and transfer of the deep-well plate from the robotic system to a centrifuge and finally to the HPLC autosampler. Whole blood pools were used to assess the reproducibility of the entire analytical system for measuring tacrolimus concentrations. A total coefficient of variation of 1.7% was found for the entire automated analytical process (n=40; mean tacrolimus concentration, 5.3 microg/L). Close agreement between tacrolimus results obtained after manual and automated sample preparation was observed. The analytical system described here, comprising automated protein precipitation, on-line solid phase extraction and LC-MS/MS analysis, is convenient and precise, and minimizes hands-on time and the risk of mistakes in the quantification of whole blood immunosuppressant concentrations compared to conventional methods.
Sédille-Mostafaie, Nazanin; Engler, Hanna; Lutz, Susanne; Korte, Wolfgang
2013-06-01
Laboratories today face increasing pressure to automate operations due to growing workloads and the need to reduce expenditure. Few studies to date have focused on the laboratory automation of preanalytical coagulation specimen processing. In the present study, we examined whether a clinical chemistry automation protocol meets the preanalytical requirements for coagulation analyses. During the implementation of laboratory automation, we began to operate a pre- and postanalytical automation system whose preanalytical unit processes blood specimens for chemistry, immunology and coagulation by automated specimen processing. As the production of platelet-poor plasma is highly dependent on optimal centrifugation, we examined specimen handling under different centrifugation conditions in order to produce optimal platelet-poor plasma specimens. To this end, manually processed models centrifuged at 1500 g for 5 and 20 min were compared to an automated centrifugation model at 3000 g for 7 min. For analytical assays that are performed frequently enough to be targets for full automation, Passing-Bablok regression analysis showed close agreement between the different centrifugation methods, with correlation coefficients between 0.98 and 0.99 and a bias between -5% and +6%. For seldom-performed assays that do not mandate full automation, Passing-Bablok regression analysis showed acceptable to poor agreement between the different centrifugation methods. A full automation solution is suitable and can be recommended for frequent haemostasis testing.
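Passing-Bablok regression, used above for the method comparison, is a nonparametric fit based on the shifted median of all pairwise slopes. A simplified sketch (confidence intervals and tie handling from the original 1983 procedure omitted; variable names illustrative):

    import numpy as np

    def passing_bablok(x, y):
        """Simplified Passing-Bablok regression for method comparison:
        all pairwise slopes, slopes of exactly -1 discarded, and a median
        shifted by the number of slopes < -1."""
        x, y = np.asarray(x, float), np.asarray(y, float)
        i, j = np.triu_indices(len(x), k=1)
        dx, dy = x[j] - x[i], y[j] - y[i]
        slopes = dy[dx != 0] / dx[dx != 0]
        slopes = np.sort(slopes[slopes != -1.0])
        n = len(slopes)
        k = int(np.sum(slopes < -1.0))          # offset for negative slopes
        if n % 2:
            b = slopes[(n + 1) // 2 + k - 1]    # shifted median (odd n)
        else:
            b = 0.5 * (slopes[n // 2 + k - 1] + slopes[n // 2 + k])
        a = np.median(y - b * x)                # intercept
        return b, a

    # e.g. comparing manual (x) vs automated (y) centrifugation results:
    # b, a = passing_bablok(manual_values, automated_values)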
Kinematic synthesis of adjustable robotic mechanisms
NASA Astrophysics Data System (ADS)
Chuenchom, Thatchai
1993-01-01
Conventional hard automation, such as a linkage-based or cam-driven system, provides high-speed capability and repeatability but not the flexibility required in many industrial applications. Conventional mechanisms, which are typically single-degree-of-freedom systems, are increasingly being replaced by multi-degree-of-freedom, multi-actuator systems driven by logic controllers. Although this new trend in sophistication provides greatly enhanced flexibility, there are many instances where the flexibility needs are exaggerated and the associated complexity is unnecessary. Traditional mechanism-based hard automation, on the other hand, can neither fulfill multi-task requirements nor be cost-effective, mainly due to a lack of methods and tools to design in flexibility. This dissertation attempts to bridge this technological gap by developing Adjustable Robotic Mechanisms (ARMs), or 'programmable mechanisms', as a middle ground between high-speed hard automation and expensive serial jointed-arm robots. This research introduces the concept of adjustable robotic mechanisms for cost-effective manufacturing automation. A generalized analytical synthesis technique has been developed to support the computational design of ARMs, laying the theoretical foundation for the synthesis of adjustable mechanisms. The synthesis method developed in this dissertation, called generalized adjustable dyad and triad synthesis, advances the well-known Burmester theory in kinematics to a new level. While this method provides planar solutions, a novel patented scheme is utilized for converting prescribed three-dimensional motion specifications into sets of planar projections. This provides an analytical and computational tool for designing adjustable mechanisms that satisfy multiple sets of three-dimensional motion specifications. Several design issues were addressed, including adjustable parameter identification, branching defect, and mechanical errors. An efficient mathematical scheme for identification of the adjustable member was also developed. The analytical synthesis techniques developed in this dissertation were successfully implemented in a graphics-intensive, user-friendly computer program. A physical prototype of a general-purpose adjustable robotic mechanism has been constructed to serve as a proof-of-concept model.
David, Frank; Tienpont, Bart; Devos, Christophe; Lerch, Oliver; Sandra, Pat
2013-10-25
Laboratories focusing on residue analysis in food are continuously seeking to increase sample throughput by minimizing sample preparation. Generic sample extraction methods such as QuEChERS lack selectivity, and consequently extracts are not free from non-volatile material that contaminates the analytical system. Co-extracted matrix constituents interfere with target analytes, even if highly sensitive and selective GC-MS/MS is used. A number of GC approaches are described that can be used to increase laboratory productivity. These techniques include automated inlet liner exchange and column backflushing for preserving the performance of the analytical system, and heart-cutting two-dimensional GC for increasing sensitivity and selectivity. The application of these tools is illustrated by the analysis of pesticides in vegetables and fruits, PCBs in milk powder and coplanar PCBs in fish. It is demonstrated that a considerable increase in productivity can be achieved by decreasing instrument down-time, while analytical performance is equal to or better than that of conventional trace contaminant analysis. Copyright © 2013 Elsevier B.V. All rights reserved.
Overview of open resources to support automated structure verification and elucidation
Cheminformatics methods form an essential basis for providing analytical scientists with access to data, algorithms and workflows. There are an increasing number of free online databases (compound databases, spectral libraries, data repositories) and a rich collection of software...
Automating the process for locating no-passing zones using georeferencing data.
DOT National Transportation Integrated Search
2012-08-01
This research created a method of using global positioning system (GPS) coordinates to identify the location of no-passing zones in two-lane highways. Analytical algorithms were developed for analyzing the availability of sight distance along the ali...
Eichhold, Thomas H; McCauley-Myers, David L; Khambe, Deepa A; Thompson, Gary A; Hoke, Steven H
2007-01-17
A method for the simultaneous determination of dextromethorphan (DEX), dextrorphan (DET), and guaifenesin (GG) in human plasma was developed, validated, and applied to determine plasma concentrations of these compounds in samples from six clinical pharmacokinetic (PK) studies. Semi-automated liquid handling systems were used to perform the majority of the sample manipulation including liquid/liquid extraction (LLE) of the analytes from human plasma. Stable-isotope-labeled analogues were utilized as internal standards (ISTDs) for each analyte to facilitate accurate and precise quantification. Extracts were analyzed using gradient liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS). Use of semi-automated LLE with LC-MS/MS proved to be a very rugged and reliable approach for analysis of more than 6200 clinical study samples. The lower limit of quantification was validated at 0.010, 0.010, and 1.0 ng/mL of plasma for DEX, DET, and GG, respectively. Accuracy and precision of quality control (QC) samples for all three analytes met FDA Guidance criteria of +/-15% for average QC accuracy with coefficients of variation less than 15%. Data from the thorough evaluation of the method during development, validation, and application are presented to characterize selectivity, linearity, over-range sample analysis, accuracy, precision, autosampler carry-over, ruggedness, extraction efficiency, ionization suppression, and stability. Pharmacokinetic data are also provided to illustrate improvements in systemic drug and metabolite concentration-time profiles that were achieved by formulation optimization.
Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.
Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y
2017-01-01
Protein glycosylation is one of the most important posttranslational modifications, and numerous biological functions are related to it. However, analytical challenges remain in glycoprotein analysis. To overcome these challenges, many analytical techniques have been developed in recent years. Enrichment methods are used to improve the sensitivity of detection, while HPLC and mass spectrometry methods have been developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools have started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis are discussed; multiple analytical techniques are compared, and the advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.
JPRS Report, Science & Technology, Japan
1988-10-05
collagen, we are conducting research on the immobilization, through chemical bond rather than physical absorption, of collagen on synthetic material... of a large number of samples are conducted by using automated apparatus and enzymatic reagents, it is natural to devise a method to use natural... improvement of enzymatic analytical methods; 3) development of reaction system and instrumentation system; 4) research on sample treatment methods; and
A New Automated Method and Sample Data Flow for Analysis of Volatile Nitrosamines in Human Urine*
Hodgson, James A.; Seyler, Tiffany H.; McGahee, Ernest; Arnstein, Stephen; Wang, Lanqing
2016-01-01
Volatile nitrosamines (VNAs) are a group of compounds classified as probable (group 2A) and possible (group 2B) carcinogens in humans. Along with certain foods and contaminated drinking water, VNAs are detected at high levels in tobacco products and in both mainstream and sidestream smoke. Our laboratory monitors six urinary VNAs—N-nitrosodimethylamine (NDMA), N-nitrosomethylethylamine (NMEA), N-nitrosodiethylamine (NDEA), N-nitrosopiperidine (NPIP), N-nitrosopyrrolidine (NPYR), and N-nitrosomorpholine (NMOR)—using isotope dilution GC-MS/MS (QQQ) for large population studies such as the National Health and Nutrition Examination Survey (NHANES). In this paper, we report for the first time a new automated sample preparation method to quantitate these VNAs more efficiently. Automation is done using Hamilton STAR™ and Caliper Staccato™ workstations. This new automated method reduces sample preparation time from 4 hours to 2.5 hours while maintaining precision (inter-run CV < 10%) and accuracy (85% - 111%). More importantly, this method increases sample throughput while maintaining a low limit of detection (<10 pg/mL) for all analytes. A streamlined sample data flow was created in parallel to the automated method, in which samples can be tracked from receiving to final LIMS output with minimal human intervention, further minimizing human error in the sample preparation process. This new automated method and sample data flow are currently applied in biomonitoring of VNAs in the US non-institutionalized population in the NHANES 2013-2014 cycle. PMID:26949569
NASA Astrophysics Data System (ADS)
Ivanova, V.; Surleva, A.; Koleva, B.
2018-06-01
An ion chromatographic method for the determination of fluoride, chloride, nitrate and sulphate in untreated and treated drinking waters is described. An automated Metrohm 850 IC Professional system equipped with a conductivity detector and a Metrosep A Supp 7-250 (250 × 4 mm) column was used. Validation was performed for the simultaneous determination of all studied analytes, and the results showed that the validated method meets the requirements of current water legislation. The main analytical characteristics were estimated for each analyte: limits of detection, limits of quantification, working and linear ranges, repeatability and intermediate precision, and recovery. The trueness of the method was estimated by analysis of a certified reference material for soft drinking water, and a recovery test was performed on spiked drinking water samples. Measurement uncertainty was also estimated. The method was applied to the analysis of drinking waters before and after chlorination.
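The abstract does not give the formulas used; a common calibration-based way to estimate such characteristics (ICH-style LOD = 3.3s/slope and LOQ = 10s/slope, with spike recovery as found/added) can be sketched as follows, the example numbers being purely illustrative:

    import numpy as np

    def calibration_limits(conc, signal):
        """Estimate LOD/LOQ from a linear calibration using the common
        ICH-style formulas LOD = 3.3*s/slope and LOQ = 10*s/slope,
        where s is the residual standard deviation of the fit."""
        slope, intercept = np.polyfit(conc, signal, 1)
        resid = signal - (slope * conc + intercept)
        s = np.std(resid, ddof=2)            # two fitted parameters
        return 3.3 * s / slope, 10 * s / slope

    def recovery(spiked, unspiked, added):
        """Spike recovery in percent: found/added * 100."""
        return (spiked - unspiked) / added * 100.0

    # e.g. hypothetical chloride standards (mg/L) vs conductivity peak areas:
    conc = np.array([0.5, 1, 2, 5, 10, 25])
    area = np.array([0.52, 1.03, 2.1, 5.2, 10.4, 25.9])
    lod, loq = calibration_limits(conc, area)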
Automated methods for multiplexed pathogen detection.
Straub, Timothy M; Dockendorff, Brian P; Quiñonez-Díaz, Maria D; Valdez, Catherine O; Shutthanandan, Janani I; Tarasevich, Barbara J; Grate, Jay W; Bruckner-Lea, Cynthia J
2005-09-01
Detection of pathogenic microorganisms in environmental samples is a difficult process. Concentration of the organisms of interest also co-concentrates inhibitors of many end-point detection methods, notably nucleic acid methods. In addition, sensitive, highly multiplexed pathogen detection continues to be problematic. The primary function of the BEADS (Biodetection Enabling Analyte Delivery System) platform is the automated concentration and purification of target analytes from interfering substances, often present in these samples, via a renewable surface column. In one version of BEADS, automated immunomagnetic separation (IMS) is used to separate cells from their samples. Captured cells are transferred to a flow-through thermal cycler where PCR, using labeled primers, is performed. PCR products are then detected by hybridization to a DNA suspension array. In another version of BEADS, cell lysis is performed, and community RNA is purified and directly labeled. Multiplexed detection is accomplished by direct hybridization of the RNA to a planar microarray. The integrated IMS/PCR version of BEADS can successfully purify and amplify 10 E. coli O157:H7 cells from river water samples. Multiplexed PCR assays for the simultaneous detection of E. coli O157:H7, Salmonella, and Shigella on bead suspension arrays were demonstrated, detecting as few as 100 cells of each organism. The RNA version of BEADS is also showing promising results: automation yields highly purified RNA, suitable for multiplexed detection on microarrays, with microarray detection specificity equivalent to PCR. Both versions of the BEADS platform show great promise for automated pathogen detection from environmental samples. Highly multiplexed pathogen detection using PCR continues to be problematic, but may be required for trace detection in large-volume samples. The RNA approach solves the issues of highly multiplexed PCR and provides "live vs. dead" capabilities. However, the sensitivity of the method will need to be improved for RNA analysis to replace PCR.
NASA Astrophysics Data System (ADS)
Nabavi, N.
2018-07-01
The author investigates monitoring methods for fine adjustment of the previously proposed on-chip architecture for frequency multiplication and translation of harmonics by design. Digital signal processing (DSP) algorithms are utilized to optimize the functionality of a microwave photonic integrated circuit toward automated frequency multiplication. The implemented DSP algorithms are based on the discrete Fourier transform and on optimization algorithms (greedy and gradient-based), which are analytically derived and numerically compared with respect to accuracy and speed-of-convergence criteria.
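No implementation details are given in the abstract; purely as an illustration of the DFT-plus-greedy-search idea, a toy sketch might read the power of a target harmonic off an FFT and tune a single control parameter by coordinate search. The device model and all parameters below are hypothetical:

    import numpy as np

    def harmonic_power(signal, fs, f0, n_harm):
        """Power at the n_harm-th harmonic of f0, read off the DFT."""
        spec = np.abs(np.fft.rfft(signal)) / len(signal)
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
        idx = np.argmin(np.abs(freqs - n_harm * f0))
        return spec[idx] ** 2

    def device_response(bias, t, f0=1e3):
        # Hypothetical nonlinear element whose harmonic content depends on bias.
        return np.sin(2 * np.pi * f0 * t + bias * np.sin(2 * np.pi * f0 * t))

    fs, f0 = 1.0e6, 1.0e3
    t = np.arange(0, 0.01, 1.0 / fs)
    bias, step = 0.5, 0.1
    best = harmonic_power(device_response(bias, t), fs, f0, 3)
    for _ in range(100):                        # greedy coordinate search
        for cand in (bias + step, bias - step):
            p = harmonic_power(device_response(cand, t), fs, f0, 3)
            if p > best:
                bias, best = cand, p
                break
        else:
            step *= 0.5                         # no improvement: shrink step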
Applications of Machine Learning and Rule Induction,
1995-02-15
An important area of application for machine learning is in automating the acquisition of knowledge bases required for expert systems. In this paper... we review the major paradigms for machine learning, including neural networks, instance-based methods, genetic learning, rule induction, and analytic
[Morphometry of pulmonary tissue: From manual to high throughput automation].
Sallon, C; Soulet, D; Tremblay, Y
2017-12-01
Weibel's research showed that any alteration of pulmonary structure has effects on function. Demonstrating this required quantitative analysis of lung structures, called morphometry, which is possible thanks to stereology, a set of methods based on principles of geometry and statistics. His work helped toward a better understanding of the morphological harmony of the lung, which is essential for its proper functioning. An imbalance leads to pathophysiology such as chronic obstructive pulmonary disease in adults and bronchopulmonary dysplasia in neonates. It is by studying this imbalance that new therapeutic approaches can be developed. These advances are achievable only through morphometric analytical methods, which are increasingly precise and focused, in particular thanks to their high-throughput automation. This review compares an automated method that we developed in the laboratory with semi-manual methods of morphometric analysis. The automation of morphometric measurements is a fundamental asset in the study of pulmonary pathophysiology because it ensures robustness, reproducibility and speed. This tool will thus contribute significantly to accelerating the race to develop new drugs. Copyright © 2017 SPLF. Published by Elsevier Masson SAS. All rights reserved.
van Delft, Sanne; Goedhart, Annelijn; Spigt, Mark; van Pinxteren, Bart; de Wit, Niek; Hopstaken, Rogier
2016-01-01
Objective: Point-of-care testing (POCT) urinalysis might reduce errors in (subjective) reading, registration and communication of test results, and might also improve diagnostic outcome and optimise patient management. Evidence is lacking. In the present study, we have studied the analytical performance of automated urinalysis and visual urinalysis compared with a reference standard in routine general practice. Setting: The study was performed in six general practitioner (GP) group practices in the Netherlands. Automated urinalysis was compared with visual urinalysis in these practices. Reference testing was performed in a primary care laboratory (Saltro, Utrecht, The Netherlands). Primary and secondary outcome measures: Analytical performance of automated and visual urinalysis compared with the reference laboratory method was the primary outcome measure, analysed by calculating sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV) and Cohen's κ coefficient for agreement. The secondary outcome measure was the user-friendliness of the POCT analyser. Results: Automated urinalysis by experienced and routinely trained practice assistants in general practice performs as well as visual urinalysis for nitrite, leucocytes and erythrocytes. Agreement for nitrite is high for automated and visual urinalysis: κ's are 0.824 and 0.803 (ranked as very good and good, respectively). Agreement with the central laboratory reference standard for leucocytes is rather poor (0.256 for automated and 0.197 for visual, ranked as fair and poor, respectively). κ's for erythrocytes are higher: 0.517 (automated) and 0.416 (visual), both ranked as moderate. The Urisys 1100 analyser was easy to use and considered not prone to flaws. Conclusions: Automated urinalysis performed as well as traditional visual urinalysis on reading of nitrite, leucocytes and erythrocytes in routine general practice. Implementation of automated urinalysis in general practice is justified, as automation is expected to reduce human errors in patient identification and transcribing of results. PMID:27503860
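The outcome measures above (sensitivity, specificity, PPV, NPV and Cohen's κ) all reduce to simple functions of a 2×2 table against the reference method. A minimal sketch, assuming binary positive/negative readings and hypothetical counts:

    def diagnostic_metrics(tp, fp, fn, tn):
        """Sensitivity, specificity, PPV, NPV and Cohen's kappa
        from a 2x2 table against the reference method."""
        n = tp + fp + fn + tn
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        ppv = tp / (tp + fp)
        npv = tn / (tn + fn)
        po = (tp + tn) / n                        # observed agreement
        pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
        kappa = (po - pe) / (1 - pe)              # chance-corrected agreement
        return sens, spec, ppv, npv, kappa

    # hypothetical nitrite counts: POCT result vs central laboratory reference
    print(diagnostic_metrics(tp=45, fp=5, fn=3, tn=547))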
Sarkozi, Laszlo; Simson, Elkin; Ramanathan, Lakshmi
2003-03-01
Thirty-six years of data and the history of laboratory practice at our institution have enabled us to follow the effects of analytical automation, and more recently of pre-analytical and post-analytical automation, on productivity, cost reduction and enhanced quality of service. In 1998, we began the operation of a pre- and post-analytical automation system (robotics), together with an advanced laboratory information system, to process specimens prior to analysis and deliver them to various automated analytical instruments, specimen outlet racks and finally to refrigerated stockyards. After 3 years of continuous operation, we compared the chemistry part of the system with the prior 33 years and quantitated the financial impact of the various stages of automation. Between 1965 and 2000, the Consumer Price Index increased by a factor of 5.5 in the United States. During the same 36 years, productivity at our institution's Chemistry Department (the number of reported test results per employee per year) increased from 10,600 to 104,558 (9.3-fold). When expressed in constant 1965 dollars, the total cost per test decreased from $0.79 to $0.15. Turnaround time for availability of results on patient units decreased to the extent that stat specimens requiring a turnaround time of <1 h do not need to be separately prepared or prioritized on the system. Our experience shows that the introduction of a robotics system for perianalytical automation has brought a large improvement in productivity together with decreased operational cost. It enabled us to significantly increase our workload together with a reduction in personnel. In addition, stats are handled easily, and there are benefits such as safer working conditions and improved sample identification that are difficult to quantify at this stage.
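The constant-dollar figures follow from a simple deflation by the CPI factor quoted in the abstract; a toy illustration (the implied nominal year-2000 cost is derived from the abstract's own numbers, not additional data):

    CPI_FACTOR = 5.5                      # US CPI growth, 1965 -> 2000 (per the abstract)

    def to_1965_dollars(nominal_2000_cost):
        """Deflate a year-2000 cost to constant 1965 dollars."""
        return nominal_2000_cost / CPI_FACTOR

    # The reported $0.15 per test in constant 1965 dollars implies a
    # nominal year-2000 cost of about 0.15 * 5.5 = $0.83 per test.
    print(to_1965_dollars(0.825))         # -> 0.15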
Vakh, Christina; Evdokimova, Ekaterina; Pochivalov, Aleksei; Moskvin, Leonid; Bulatov, Andrey
2017-12-15
An easily performed, fully automated and miniaturized flow injection chemiluminescence (CL) method for the determination of phenols in smoked food samples has been proposed. The method includes ultrasound-assisted solid-liquid extraction coupled with gas-diffusion separation of phenols from the smoked food sample and absorption of the analytes into a NaOH solution in a specially designed gas-diffusion cell. The flow system was designed with a focus on automation and miniaturization, with minimal sample and reagent consumption and inexpensive instrumentation. The luminol-N-bromosuccinimide system in an alkaline medium was used for the CL determination of phenols. The limit of detection of the proposed procedure was 3 × 10⁻⁸ mol L⁻¹ (0.01 mg kg⁻¹) in terms of phenol. The presented method proved to be a good tool for easy, rapid and cost-effective point-of-need screening of phenols in smoked food samples. Copyright © 2017 Elsevier Ltd. All rights reserved.
Mirski, Tomasz; Bartoszcze, Michał; Bielawska-Drózd, Agata; Cieślik, Piotr; Michalski, Aleksander J; Niemcewicz, Marcin; Kocik, Janusz; Chomiczewski, Krzysztof
2014-01-01
Modern threats of bioterrorism drive the need to develop methods for rapid and accurate identification of dangerous biological agents. Currently, there are many types of methods used in this field that are based on immunological or genetic techniques, or constitute a combination of both (immuno-genetic methods). There are also methods that have been developed on the basis of the physical and chemical properties of the analytes. Each group of these analytical assays can be further divided into conventional methods (e.g. simple antigen-antibody reactions, classical PCR, real-time PCR) and modern technologies (e.g. microarray technology, aptamers, phosphors). Nanodiagnostics constitute another group of methods, utilizing objects at the nanoscale (below 100 nm). There are also integrated and automated diagnostic systems, which combine different methods and allow simultaneous sampling, extraction of genetic material, and detection and identification of the analyte using genetic as well as immunological techniques.
Zhang, Jie; Bai, Ruoshi; Yi, Xiaoli; Yang, Zhendong; Liu, Xingyu; Zhou, Jun; Liang, Wei
2016-01-01
A fully automated method for the detection of four tobacco-specific nitrosamines (TSNAs) in mainstream cigarette smoke (MSS) has been developed. The newly developed method is based on two-dimensional online solid-phase extraction-liquid chromatography-tandem mass spectrometry (SPE/LC-MS/MS). The two-dimensional SPE utilizes two cartridges with different extraction mechanisms to clean up interferences of different polarity and thereby minimize sample matrix effects on each analyte. Chromatographic separation was achieved using a UPLC C18 reversed-phase analytical column. Under the optimum online SPE/LC-MS/MS conditions, N'-nitrosonornicotine (NNN), N'-nitrosoanatabine (NAT), N'-nitrosoanabasine (NAB), and 4-(methylnitrosamino)-1-(3-pyridyl)-1-butanone (NNK) were baseline separated with good peak shapes. This appears to be the most sensitive method yet reported for the determination of TSNAs in mainstream cigarette smoke. The limits of quantification for NNN, NNK, NAT and NAB reached 6.0, 1.0, 3.0 and 0.6 pg/cig, respectively, well below the lowest levels of TSNAs in the MSS of current commercial cigarettes. The accuracy of the measurement of the four TSNAs ranged from 92.8 to 107.3%. The relative standard deviations of intra- and inter-day analysis were less than 5.4% and 7.5%, respectively. The main advantages of the developed method are fairly high sensitivity, selectivity and accuracy, minimal sample pre-treatment, full automation, and high throughput. As part of the validation procedure, the developed method was applied to evaluate TSNA yields for 27 top-selling commercial cigarettes in China. Copyright © 2015 Elsevier B.V. All rights reserved.
Genetics-based methods for detection of Salmonella spp. in foods.
Mozola, Mark A
2006-01-01
Genetic methods are now at the forefront of foodborne pathogen testing. The sensitivity, specificity, and inclusivity advantages offered by deoxyribonucleic acid (DNA) probe technology have driven an intense methods-development effort over the past 20 years. DNA probe-based methods for Salmonella spp. and other pathogens have progressed from time-consuming procedures involving the use of radioisotopes to simple, high-throughput, automated assays. The analytical sensitivity of nucleic acid amplification technology has facilitated a reduction in analysis time by allowing enriched samples to be tested for previously undetectable quantities of analyte. This article traces the evolution of genetic methods for the detection of Salmonella in foods, reviews the basic assay formats and their advantages and limitations, and discusses method performance characteristics and considerations for selection of methods.
Bjørk, Marie Kjærgaard; Simonsen, Kirsten Wiese; Andersen, David Wederkinck; Dalsgaard, Petur Weihe; Sigurðardóttir, Stella Rögn; Linnet, Kristian; Rasmussen, Brian Schou
2013-03-01
An efficient method for analyzing illegal and medicinal drugs in whole blood using fully automated sample preparation and a short ultra-high-performance liquid chromatography-tandem mass spectrometry (MS/MS) run time is presented. A selection of 31 drugs, including amphetamines, cocaine, opioids, and benzodiazepines, was used. In order to increase the efficiency of routine analysis, a robotic system based on automated liquid handling and capable of handling all unit operations for sample preparation was built on a Freedom Evo 200 platform with several add-ons from Tecan and third-party vendors. Solid-phase extraction was performed using Strata X-C plates. Extraction time for 96 samples was less than 3 h. Chromatography was performed using an ACQUITY UPLC system (Waters Corporation, Milford, USA). Analytes were separated on a 100 mm × 2.1 mm, 1.7 μm Acquity UPLC CSH C(18) column using a 6.5 min 0.1% ammonia (25%) in water/0.1% ammonia (25%) in methanol gradient and quantified by MS/MS (Waters Quattro Premier XE) in multiple-reaction monitoring mode. Full validation, including linearity, precision and trueness, matrix effect, ion suppression/enhancement of co-eluting analytes, recovery, and specificity, was performed. The method was employed successfully in the laboratory and used for routine analysis of forensic material. In combination with tetrahydrocannabinol analysis, the method covered 96% of cases involving driving under the influence of drugs. The manual labor involved in preparing blood samples, solvents, etc., was reduced to half an hour per batch. The automated sample preparation setup also minimized human exposure to hazardous materials, provided highly improved ergonomics, and eliminated manual pipetting.
Estelles-Lopez, Lucia; Ropodi, Athina; Pavlidis, Dimitris; Fotopoulou, Jenny; Gkousari, Christina; Peyrodie, Audrey; Panagou, Efstathios; Nychas, George-John; Mohareb, Fady
2017-09-01
Over the past decade, analytical approaches based on vibrational spectroscopy, hyperspectral/multispectral imaging and biomimetic sensors have been gaining popularity as rapid and efficient methods for assessing food quality, safety and authentication, and as a sensible alternative to the expensive and time-consuming conventional microbiological techniques. Due to the multi-dimensional nature of the data generated by such analyses, the output needs to be coupled with a suitable statistical approach or machine-learning algorithm before the results can be interpreted. Choosing the optimum pattern recognition or machine-learning approach for a given analytical platform is often challenging and involves a comparative analysis between various algorithms in order to achieve the best possible prediction accuracy. In this work, "MeatReg", a web-based application, is presented that automates the procedure of identifying the best machine-learning method for comparing data from several analytical techniques, to predict the counts of microorganisms responsible for meat spoilage regardless of the packaging system applied. In particular, up to seven regression methods were applied: ordinary least squares regression, stepwise linear regression, partial least squares regression, principal component regression, support vector regression, random forest and k-nearest neighbours. "MeatReg" was tested with minced beef samples stored under aerobic and modified atmosphere packaging and analysed with an electronic nose and HPLC, FT-IR, GC-MS and multispectral imaging instruments. Populations of total viable counts, lactic acid bacteria, pseudomonads, Enterobacteriaceae and B. thermosphacta were predicted. As a result, recommendations were obtained as to which analytical platforms are suitable to predict each type of bacteria and which machine-learning methods to use in each case. The developed system is accessible via the link: www.sorfml.com. Copyright © 2017 Elsevier Ltd. All rights reserved.
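A minimal sketch of this kind of regressor benchmarking with scikit-learn, using cross-validated RMSE as the selection criterion (model choices and parameters are illustrative, and stepwise linear regression is omitted because scikit-learn has no direct equivalent):

    from sklearn.model_selection import cross_val_score
    from sklearn.linear_model import LinearRegression
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.decomposition import PCA
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import SVR
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.neighbors import KNeighborsRegressor

    models = {
        "OLS": LinearRegression(),
        "PLSR": PLSRegression(n_components=10),
        "PCR": make_pipeline(PCA(n_components=10), LinearRegression()),
        "SVR": SVR(C=10.0),
        "RF": RandomForestRegressor(n_estimators=200, random_state=0),
        "kNN": KNeighborsRegressor(n_neighbors=5),
    }

    def rank_models(X, y, cv=5):
        """Cross-validated RMSE for each candidate regressor; lowest wins."""
        scores = {name: -cross_val_score(m, X, y, cv=cv,
                      scoring="neg_root_mean_squared_error").mean()
                  for name, m in models.items()}
        return sorted(scores.items(), key=lambda kv: kv[1])

    # X: e.g. FT-IR spectra (samples x wavenumbers); y: log10 total viable counts
    # best_name, best_rmse = rank_models(X, y)[0]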
2014-01-01
Background: Adverse drug reactions and adverse drug events (ADEs) are major public health issues. Many different prospective tools for the automated detection of ADEs in hospital databases have been developed and evaluated. The objective of the present study was to evaluate an automated method for the retrospective detection of ADEs with hyperkalaemia during inpatient stays. Methods: We used a set of complex detection rules to take account of the patient's clinical and biological context and the chronological relationship between the causes and the expected outcome. The dataset consisted of 3,444 inpatient stays in a French general hospital. An automated review was performed for all data, and the results were compared with those of an expert chart review. The analytical quality of the complex detection rules was evaluated for ADEs. Results: In terms of recall, 89.5% of ADEs with hyperkalaemia "with or without an abnormal symptom" were automatically identified (including all three serious ADEs). In terms of precision, 63.7% of the automatically identified ADEs with hyperkalaemia were true ADEs. Conclusions: The use of context-sensitive rules appears to improve the automated detection of ADEs with hyperkalaemia. This type of tool may have an important role in pharmacoepidemiology via the routine analysis of large inter-hospital databases. PMID:25212108
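The paper's actual rule set is not reproduced in the abstract; the sketch below only illustrates the general shape of a context-sensitive, chronology-aware detection rule. The threshold, time window, record layout and drug list are all hypothetical:

    from datetime import timedelta

    K_THRESHOLD = 5.3                 # mmol/L, hypothetical hyperkalaemia cut-off
    WINDOW = timedelta(days=3)        # cause must precede outcome within window
    K_RAISING = {"spironolactone", "potassium chloride", "enalapril"}

    def detect_hyperkalaemia_ade(lab_results, administrations):
        """Flag (drug, lab) pairs where a potassium-raising drug was given
        before a high potassium value, within the chronological window.
        lab_results: [(datetime, analyte, value)]; administrations: [(datetime, drug)]."""
        events = []
        for t_lab, analyte, value in lab_results:
            if analyte != "potassium" or value <= K_THRESHOLD:
                continue
            for t_drug, drug in administrations:
                if drug in K_RAISING and t_drug <= t_lab <= t_drug + WINDOW:
                    events.append((drug, t_drug, value, t_lab))
        return events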
Improved compliance by BPM-driven workflow automation.
Holzmüller-Laue, Silke; Göde, Bernd; Fleischer, Heidi; Thurow, Kerstin
2014-12-01
Using the methods and technologies of business process management (BPM) for laboratory automation has important benefits (i.e., the agility of high-level automation processes, rapid interdisciplinary prototyping and implementation of laboratory tasks and procedures, and efficient real-time process documentation). A principal goal of model-driven development is improved transparency of processes and the alignment of process diagrams and technical code. First experiences of using the business process model and notation (BPMN) show that easy-to-read graphical process models can provide standardization of laboratory workflows. Model-based development allows processes to be changed quickly and adapted easily to changing requirements. The process models are able to host work procedures and their scheduling in compliance with predefined guidelines and policies. Finally, the process-controlled documentation of complex workflow results addresses modern laboratory needs for quality assurance. BPMN 2.0, as an automation language able to control every kind of activity or subprocess, is directed at complete workflows in end-to-end relationships. BPMN is applicable as a system-independent and cross-disciplinary graphical language to document all methods in laboratories (i.e., screening procedures or analytical processes). With the BPMN standard, a communication method for sharing process knowledge between laboratories is thus also available. © 2014 Society for Laboratory Automation and Screening.
The science of visual analysis at extreme scale
NASA Astrophysics Data System (ADS)
Nowell, Lucy T.
2011-01-01
Driven by market forces and spanning the full spectrum of computational devices, computer architectures are changing in ways that present tremendous opportunities and challenges for data analysis and visual analytic technologies. Leadership-class high-performance computing systems will have as many as a million cores by 2020 and support 10-billion-way concurrency, while laptop computers are expected to have as many as 1,000 cores by 2015. At the same time, data of all types are increasing exponentially, and automated analytic methods are essential for all disciplines. Many existing analytic technologies do not scale to make full use of current platforms, and fewer still are likely to scale to the systems that will be operational by the end of this decade. Furthermore, on the new architectures and for data at extreme scales, validating the accuracy and effectiveness of analytic methods, including visual analysis, will be increasingly important.
Advanced, Analytic, Automated (AAA) Measurement of Engagement during Learning
ERIC Educational Resources Information Center
D'Mello, Sidney; Dieterle, Ed; Duckworth, Angela
2017-01-01
It is generally acknowledged that engagement plays a critical role in learning. Unfortunately, the study of engagement has been stymied by a lack of valid and efficient measures. We introduce the advanced, analytic, and automated (AAA) approach to measure engagement at fine-grained temporal resolutions. The AAA measurement approach is grounded in…
Design, development, test, and evaluation of an automated analytical electrophoresis apparatus
NASA Technical Reports Server (NTRS)
Bartels, P. A.; Bier, M.
1977-01-01
An Automated Analytical Electrophoresis Apparatus (AAEA) was designed, developed, assembled, and preliminarily tested. The AAEA was demonstrated to be a feasible apparatus for automatically acquiring, displaying, and storing (and eventually analyzing) electrophoresis mobility data from living blood cells. The apparatus and the operation of its major assemblies are described in detail.
USDA-ARS?s Scientific Manuscript database
The QuEChERS (quick, easy, cheap, effective, rugged, and safe) sample preparation method was modified to accommodate various cereal grain matrices (corn, oat, rice and wheat) and provide good analytical results (recoveries in the range of 70-120% and RSDs <20%) for the majority of the target pestici...
Ma, Junlong; Wang, Chengbin; Yue, Jiaxin; Li, Mianyang; Zhang, Hongrui; Ma, Xiaojing; Li, Xincui; Xue, Dandan; Qing, Xiaoyan; Wang, Shengjiang; Xiang, Daijun; Cong, Yulong
2013-01-01
Several automated urine sediment analyzers have been introduced to clinical laboratories. Automated microscopic pattern recognition is a new technique for urine particle analysis. We evaluated the analytical and diagnostic performance of the UriSed automated microscopic analyzer and compared it with manual microscopy for urine sediment analysis. Precision, linearity, carry-over, and method comparison studies were carried out. A total of 600 urine samples sent for urinalysis were assessed using the UriSed automated microscopic analyzer and manual microscopy. Within-run and between-run imprecision of the UriSed for red blood cells (RBC) and white blood cells (WBC) was acceptable at all levels (CV < 20%), and for casts, squamous epithelial cells (EPI), and bacteria (BAC) it was good at the middle and high levels (CV < 20%). The linearity analysis revealed substantial agreement between the measured value and the theoretical value of the UriSed for RBC, WBC, casts, EPI, and BAC (r > 0.95). There was no carry-over. Sensitivities and specificities for RBC, WBC, and squamous epithelial cells were more than 80% in this study. There is substantial agreement between the UriSed automated microscopic analyzer and manual microscopy. The UriSed also provides a rapid turnaround time.
Huppertz, Laura M; Kneisel, Stefan; Auwärter, Volker; Kempf, Jürgen
2014-02-01
Considering the vast variety of synthetic cannabinoids and herbal mixtures - commonly known as 'Spice' or 'K2' - on the market and the resulting increase of severe intoxications related to their consumption, there is a need in clinical and forensic toxicology for comprehensive, up-to-date screening methods. This project focused on developing and implementing an automated screening procedure for the detection of synthetic cannabinoids in serum using a liquid chromatography-ion trap-MS (LC-MS(n)) system and a spectra library-based approach, currently including 46 synthetic cannabinoids and 8 isotope-labelled analogues. In the process of method development, a high-temperature ESI source (IonBooster(TM), Bruker Daltonik) and its effects on the ionization efficiency of the investigated synthetic cannabinoids were evaluated and compared to a conventional ESI source. Despite their structural diversity, all investigated synthetic cannabinoids benefitted from high-temperature ionization, showing remarkably higher MS intensities compared to conventional ESI. The employed search algorithm matches retention time, MS and MS(2)/MS(3) spectra. With the utilization of the IonBooster source, limits for automated detection comparable to the cut-off values of routine MRM methods were achieved for the majority of analytes. Even compounds not identified when using a conventional ESI source were detected using the IonBooster source. LODs in serum range from 0.1 ng/ml to 0.5 ng/ml. The use of parent compounds as analytical targets offers the possibility of instantly adding new emerging compounds to the library and immediately applying the updated method to serum samples, allowing rapid adaptation of the screening method to ongoing forensic or clinical requirements. The presented approach can also be applied to other specimens, such as oral fluid or hair, and to herbal mixtures, and was successfully applied to authentic serum samples. Quantitative MRM results of samples with analyte concentrations above the determined LOD were confirmed as positive findings by the presented method. Copyright © 2014 John Wiley & Sons, Ltd.
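The search algorithm itself is not detailed in the abstract; library searches of this kind commonly combine a retention-time window with a spectral-similarity score. A minimal sketch, assuming centroided spectra as lists of (m/z, intensity) pairs and purely illustrative tolerances:

    import numpy as np

    def spectral_match(query, reference, tol=0.3):
        """Dot-product similarity between two centroided MS2 spectra,
        each given as a list of (mz, intensity); peaks within tol Da match."""
        num, q_norm = 0.0, 0.0
        for mz, inten in query:
            cands = [ri for rmz, ri in reference if abs(rmz - mz) <= tol]
            num += inten * max(cands, default=0.0)   # best matching ref peak
            q_norm += inten ** 2
        r_norm = sum(ri ** 2 for _, ri in reference)
        return num / np.sqrt(q_norm * r_norm)

    def library_hit(rt, spectrum, entry, rt_tol=0.2, min_score=0.8):
        """Accept a hit only if retention time and MS2 spectrum both match."""
        return (abs(rt - entry["rt"]) <= rt_tol and
                spectral_match(spectrum, entry["ms2"]) >= min_score)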
Rodriguez-Mozaz, Sara; de Alda, Maria J López; Barceló, Damià
2006-04-15
This work describes the application of an optical biosensor (RIver ANALyser, RIANA) to the simultaneous analysis of three relevant environmental organic pollutants, namely the pesticides atrazine and isoproturon and the estrogen estrone, in real water samples. This biosensor is based on an indirect inhibition immunoassay which takes place at a chemically modified optical transducer chip. The spatially resolved modification of the transducer surface allows the simultaneous determination of selected target analytes by means of total internal reflection fluorescence (TIRF). The performance of the immunosensor method developed was evaluated against a well-accepted traditional method based on solid-phase extraction followed by liquid chromatography-mass spectrometry (LC-MS). The chromatographic method was superior in terms of linearity, sensitivity and accuracy, and the biosensor method in terms of repeatability, speed, cost and automation. The application of both methods in parallel to determine the occurrence and removal of atrazine, isoproturon and estrone throughout the treatment process (sand filtration, ozonation, activated carbon filtration and chlorination) in a waterworks showed an overestimation of results in the case of the biosensor, which was partially attributed to matrix and cross-reactivity effects, in spite of the addition of ovalbumin to the sample to minimize matrix interferences. Based on the comparative performance of both techniques, the biosensor emerges as a suitable tool for fast, simple and automated screening of water pollutants without sample pretreatment. To the authors' knowledge, this is the first description of the application of the RIANA biosensor in the multi-analyte configuration to the regular monitoring of pollutants in a waterworks.
IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics
2016-01-01
Background: We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. Objective: To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. Methods: The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Results: Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix. Conclusions: IBMWA is a new alternative for data analytics software that automates descriptive, predictive, and visual analytics. This program is very user-friendly but requires data preprocessing, statistical conceptual understanding, and domain expertise. PMID:27729304
The current role of on-line extraction approaches in clinical and forensic toxicology.
Mueller, Daniel M
2014-08-01
In today's clinical and forensic toxicological laboratories, automation is of interest because of its ability to optimize processes, to reduce manual workload and handling errors, and to minimize exposure to potentially infectious samples. Extraction is usually the most time-consuming step; therefore, automation of this step is reasonable. Currently, from the field of clinical and forensic toxicology, methods using the following on-line extraction techniques have been published: on-line solid-phase extraction, turbulent flow chromatography, solid-phase microextraction, microextraction by packed sorbent, single-drop microextraction and on-line desorption of dried blood spots. Most of these published methods are either single-analyte or multicomponent procedures; methods intended for systematic toxicological analysis are relatively scarce. However, the use of on-line extraction will certainly increase in the near future.
Ohura, Hiroki; Imato, Toshihiko
2011-01-01
Two analytical methods, which prove the utility of a potentiometric flow injection technique for determining various redox species, based on the use of some redox potential buffers, are reviewed. The first is a potentiometric flow injection method in which a redox couple such as Fe(III)-Fe(II), Fe(CN)6 3−-Fe(CN)6 4−, or bromide-bromine and a redox electrode or a combined platinum-bromide ion selective electrode are used. The analytical principle and advantages of the method are discussed, and several examples of its application are reported. Another example is a highly sensitive potentiometric flow injection method, in which a large transient potential change due to bromine or chlorine as an intermediate, generated during the reaction of the oxidative species with an Fe(III)-Fe(II) potential buffer containing bromide or chloride, is utilized. The analytical principle and details of the proposed method are described, and several examples of its applications are given. The determination of trace amounts of hydrazine, based on the detection of a transient change in potential caused by the reaction with a Ce(IV)-Ce(III) potential buffer, is also described. PMID:21584280
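The measured signal in such methods is governed by the Nernst equation for the potential buffer couple. A minimal sketch (Python) of how a small oxidant-induced change in the Fe(III)/Fe(II) ratio maps to a transient potential shift; the buffer composition and the 1% conversion are illustrative assumptions, not values from the review:

import math

R, T, F = 8.314, 298.15, 96485.0  # gas constant J/(mol K), temperature K, Faraday C/mol

def nernst_potential(e0, c_ox, c_red, n=1):
    # electrode potential (V) of a redox couple at the given concentrations
    return e0 + (R * T / (n * F)) * math.log(c_ox / c_red)

# Fe(III)-Fe(II) buffer, 10 mM each (standard potential ~0.77 V vs. SHE);
# an injected oxidant converting 1% of the Fe(II) to Fe(III) gives a
# small but well-defined transient shift:
e1 = nernst_potential(0.77, 0.010, 0.010)
e2 = nernst_potential(0.77, 0.0101, 0.0099)
print("transient shift: %.2f mV" % (1000 * (e2 - e1)))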
Howat, William J; Daley, Frances; Zabaglo, Lila; McDuffus, Leigh‐Anne; Blows, Fiona; Coulson, Penny; Raza Ali, H; Benitez, Javier; Milne, Roger; Brenner, Herman; Stegmaier, Christa; Mannermaa, Arto; Chang‐Claude, Jenny; Rudolph, Anja; Sinn, Peter; Couch, Fergus J; Tollenaar, Rob A.E.M.; Devilee, Peter; Figueroa, Jonine; Sherman, Mark E; Lissowska, Jolanta; Hewitt, Stephen; Eccles, Diana; Hooning, Maartje J; Hollestelle, Antoinette; WM Martens, John; HM van Deurzen, Carolien; Investigators, kConFab; Bolla, Manjeet K; Wang, Qin; Jones, Michael; Schoemaker, Minouk; Broeks, Annegien; van Leeuwen, Flora E; Van't Veer, Laura; Swerdlow, Anthony J; Orr, Nick; Dowsett, Mitch; Easton, Douglas; Schmidt, Marjanka K; Pharoah, Paul D; Garcia‐Closas, Montserrat
2016-01-01
Automated methods are needed to facilitate high-throughput and reproducible scoring of Ki67 and other markers in breast cancer tissue microarrays (TMAs) in large-scale studies. To address this need, we developed an automated protocol for Ki67 scoring and evaluated its performance in studies from the Breast Cancer Association Consortium. We utilized 166 TMAs containing 16,953 tumour cores representing 9,059 breast cancer cases, from 13 studies, with information on other clinical and pathological characteristics. TMAs were stained for Ki67 using standard immunohistochemical procedures, and scanned and digitized using the Ariol system. An automated algorithm was developed for the scoring of Ki67, and scores were compared to computer-assisted visual (CAV) scores in a subset of 15 TMAs in a training set. We also assessed the correlation between automated Ki67 scores and other clinical and pathological characteristics. Overall, we observed good discriminatory accuracy (AUC = 85%) and good agreement (kappa = 0.64) between the automated and CAV scoring methods in the training set. The performance of the automated method varied by TMA (kappa range = 0.37-0.87) and study (kappa range = 0.39-0.69). The automated method performed better in satisfactory cores (kappa = 0.68) than suboptimal (kappa = 0.51) cores (p-value for comparison = 0.005), and among cores with higher total nuclei counted by the machine (4,000-4,500 cells: kappa = 0.78) than those with lower counts (50-500 cells: kappa = 0.41; p-value = 0.010). Among the 9,059 cases in this study, the correlations between automated Ki67 and clinical and pathological characteristics were found to be in the expected directions. Our findings indicate that automated scoring of Ki67 can be an efficient method to obtain good-quality data across large numbers of TMAs from multicentre studies. However, robust algorithm development and rigorous pre- and post-analytical quality control procedures are necessary in order to ensure satisfactory performance. PMID:27499923
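The agreement statistic used throughout the study is Cohen's kappa. As a reference for how such values are computed, here is a minimal sketch (Python) with a hypothetical 2x2 confusion matrix of dichotomized automated versus CAV scores; the counts are illustrative, not the study's data:

import numpy as np

def cohens_kappa(cm):
    # cm: square confusion matrix, rows = automated, cols = visual scores
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    p_o = np.trace(cm) / n                                  # observed agreement
    p_e = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n ** 2  # chance agreement
    return (p_o - p_e) / (1.0 - p_e)

# hypothetical table: cores dichotomized at a Ki67 positivity cut-off
print(round(cohens_kappa([[70, 12], [9, 45]]), 2))  # ~0.68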
Sampling probe for microarray read out using electrospray mass spectrometry
Van Berkel, Gary J.
2004-10-12
An automated electrospray-based sampling system and method for analysis obtains samples from surface array spots containing analytes. The system includes at least one probe, the probe including an inlet for flowing at least one eluting solvent to respective ones of a plurality of spots and an outlet for directing the analyte away from the spots. An automatic positioning system is provided for translating the probe relative to the spots to permit sampling of any spot. An electrospray ion source having an input fluidically connected to the probe receives the analyte and generates ions from the analyte. The ion source delivers the generated ions to a structure for analysis, preferably a mass spectrometer, to identify the analyte. The probe can be a surface contact probe, where the probe forms an enclosing seal along the periphery of the array spot surface.
Means and method of detection in chemical separation procedures
Yeung, Edward S.; Koutny, Lance B.; Hogan, Barry L.; Cheung, Chan K.; Ma, Yinfa
1993-03-09
A means and method for indirect detection of constituent components of a mixture separated in a chemical separation process. Fluorescing ions are distributed across the area in which separation of the mixture will occur to provide a generally uniform background fluorescence intensity. For example, the mixture comprises one or more charged analytes that displace fluorescing ions at the locations to which its constituent components separate. Fluorescing ions of the same charge as the charged analyte components cause a displacement. The displacement results in the locations of the separated components having a reduced fluorescence intensity relative to the remainder of the background. Detection of the lower fluorescence intensity areas can be performed visually, by photographic means, or by automated laser scanning.
Chang, Ying-Chia; Chen, Wen-Ling; Bai, Fang-Yu; Chen, Pau-Chung; Wang, Gen-Shuh; Chen, Chia-Yang
2012-01-01
For this study, we developed methods for determining ten perfluorinated chemicals in drinking water, milk, fish, beef, and pig liver using high-flow automated solid-phase extraction (SPE) and ultra-high performance liquid chromatography/tandem mass spectrometry. The analytes were separated on a core-shell Kinetex C18 column. The mobile phase was composed of methanol and 10 mM N-methylmorpholine. Milk was digested with 0.5 N potassium hydroxide in Milli-Q water and extracted with an Atlantic HLB disk by automated SPE at a flow rate ranging from 70 to 86 mL/min. Drinking water was directly extracted by the SPE. Solid food samples were digested in alkaline methanol, and their supernatants were diluted and also processed by SPE. The disks were washed with 40% methanol/60% water and then eluted with 0.1% ammonium hydroxide in methanol. Matrix suppression of the signal intensity of most analytes was lower than 50%; it was generally lower in fish and drinking water but higher in liver. Most quantitative biases and relative standard deviations were lower than 15%. The limits of detection for most analytes were sub-nanogram per liter for drinking water and sub-nanogram per gram for solid food samples. This method greatly shortened the time and labor needed for digestion, SPE, and liquid chromatography. The method has been applied to analyze 14 types of food samples. Perfluorooctanoic acid was found at the highest concentrations among the analytes (medians of 3.2-64 ng/g wet weight), followed by perfluorodecanoic acid (0.7-25 ng/g) and perfluorododecanoic acid (0.6-15 ng/g).
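Matrix suppression figures such as those quoted above are conventionally obtained by comparing the analyte response in a matrix extract against the response in pure solvent. A minimal sketch with illustrative peak areas (not data from this study):

def matrix_suppression_percent(area_in_matrix, area_in_solvent):
    # percent loss of analyte signal caused by co-extracted matrix
    return 100.0 * (1.0 - area_in_matrix / area_in_solvent)

# illustrative areas: post-extraction spiked liver vs. pure solvent
print(matrix_suppression_percent(4.2e5, 7.0e5))  # -> 40.0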
Integrated multiplexed capillary electrophoresis system
Yeung, Edward S.; Tan, Hongdong
2002-05-14
The present invention provides an integrated multiplexed capillary electrophoresis system for the analysis of sample analytes. The system integrates and automates multiple components, such as chromatographic columns and separation capillaries, and further provides a detector for the detection of analytes eluting from the separation capillaries. The system employs multiplexed freeze/thaw valves to manage fluid flow and sample movement. The system is computer controlled and is capable of processing samples through reaction, purification, denaturation, pre-concentration, injection, separation and detection in parallel fashion. Methods employing the system of the invention are also provided.
[Developments in preparation and experimental method of solid phase microextraction fibers].
Yi, Xu; Fu, Yujie
2004-09-01
Solid phase microextraction (SPME) is a simple and effective adsorption and desorption technique, which concentrates volatile or nonvolatile compounds from liquid samples or the headspace of samples. SPME is compatible with analyte separation and detection by gas chromatography, high performance liquid chromatography, and other instrumental methods. It provides many advantages, such as a wide linear range, low solvent and sample consumption, short analytical times, low detection limits, and simple apparatus. The theory of SPME is introduced, covering both equilibrium and non-equilibrium theory. Recent developments in fiber preparation methods and related experimental techniques are discussed. In addition to commercial fiber preparation, newly developed fabrication techniques, such as sol-gel chemistry, electrodeposition, carbon-based adsorption, and high-temperature epoxy immobilization, are presented. The effects of extraction modes, selection of fiber coating, optimization of operating conditions, method sensitivity and precision, and systematic automation are taken into consideration in the analytical process of SPME. A brief outlook on SPME is given at the end.
Quality specification in haematology: the automated blood cell count.
Buttarello, Mauro
2004-08-02
Quality specifications for automated blood cell counts include topics that go beyond the traditional analytical stage (imprecision, inaccuracy, quality control) and extend to the pre- and post-analytical phases. In this review, pre-analytical aspects concerning the choice of anticoagulants, maximum storage times and differences between storage at room temperature or at 4 degrees C are considered. For the analytical phase, goals for imprecision and bias obtained with various approaches (ratio to biologic variation, state of the art, specific clinical situations) are evaluated. For the post-analytical phase, medical review criteria (algorithm, decision limit and delta check) and the structure of the report (general part and comments), which constitutes the formal act through which a laboratory communicates with clinicians, are considered. K2EDTA is considered the anticoagulant of choice for automated cell counts. Regarding storage, specimens should be analyzed as soon as possible. Storage at 4 degrees C may stabilize specimens for 24 to 72 h when a complete blood count (CBC) and differential leucocyte count (DLC) are performed. For precision, analytical goals based on the state of the art are acceptable, while for bias the state of the art is satisfactory only for some parameters. In haematology, quality specifications for the pre-analytical and analytical phases are important, but the review criteria and the quality of the report play a central role in assuring a definite clinical value.
Device and method for automated separation of a sample of whole blood into aliquots
Burtis, Carl A.; Johnson, Wayne F.
1989-01-01
A device and a method for automated processing and separation of an unmeasured sample of whole blood into multiple aliquots of plasma. Capillaries are radially oriented on a rotor, with the rotor defining a sample chamber, transfer channels, overflow chamber, overflow channel, vent channel, cell chambers, and processing chambers. A sample of whole blood is placed in the sample chamber, and when the rotor is rotated, the blood moves outward through the transfer channels to the processing chambers where the blood is centrifugally separated into a solid cellular component and a liquid plasma component. When the rotor speed is decreased, the plasma component backfills the capillaries resulting in uniform aliquots of plasma which may be used for subsequent analytical procedures.
NASA Technical Reports Server (NTRS)
Boyle, W. G.; Barton, G. W.
1979-01-01
The feasibility of computerized automation of the Analytical Laboratories Section at NASA's Lewis Research Center was considered. Since that laboratory's duties are not routine, the automation goals were set with that in mind. Four instruments were selected as the most likely automation candidates: an atomic absorption spectrophotometer, an emission spectrometer, an X-ray fluorescence spectrometer, and an X-ray diffraction unit. Two options for computer automation were described: a time-shared central computer and a system with microcomputers for each instrument connected to a central computer. A third option, presented for future planning, expands the microcomputer version. Costs and benefits for each option were considered. It was concluded that the microcomputer version best fits the goals and duties of the laboratory and that such an automated system is needed to meet the laboratory's future requirements.
Peters, Sonja; Kaal, Erwin; Horsting, Iwan; Janssen, Hans-Gerd
2012-02-24
A new method is presented for the analysis of phenolic acids in plasma based on ion-pairing 'micro-extraction in packed sorbent' (MEPS) coupled on-line to in-liner derivatisation-gas chromatography-mass spectrometry (GC-MS). The ion-pairing reagent served a dual purpose: it was used both to improve the extraction yields of the more polar analytes and as the methyl donor in the automated in-liner derivatisation method. In this way, a fully automated procedure for the extraction, derivatisation and injection of a wide range of phenolic acids in plasma samples has been obtained. An extensive optimisation of the extraction and derivatisation procedure was performed. The entire method showed excellent repeatability (under 10%) and linearity (correlation coefficients of 0.99 or better) for all phenolic acids. The limits of detection of the optimised method were 10 ng/mL or lower for the majority of phenolic acids, with three phenolic acids having less-favourable detection limits of around 100 ng/mL. Finally, the newly developed method was applied in a human intervention trial in which the bioavailability of polyphenols from wine and tea was studied. Forty plasma samples could be analysed within 24 h in a fully automated manner, including sample extraction, derivatisation and gas chromatographic analysis. Copyright © 2011 Elsevier B.V. All rights reserved.
Quintana, Leonardo; Arias, Claudia; Cordoba, Jorge; Moroy, Magda; Pulido, Jean; Ramirez, Angela
2012-01-01
The aim of this study was to combine three different analytical methods from three different disciplines to diagnose the ergonomic conditions, manufacturing and supply chain operation of a baking company. The study presents a comprehensive working method that combines ergonomics, automation and logistics study methods in the diagnosis of working conditions and productivity. The participatory approach of this type of study, which draws on the feelings and first-hand knowledge that workers have of the operation, is a determining factor in defining points of action and ergonomic interventions, as well as in identifying opportunities for the automation of manufacturing and logistics to cope with the needs of the company. The study identified an ergonomic problem (a high prevalence of wrist-hand pain), and the combination of interdisciplinary techniques applied made it possible to improve this condition in the company. This type of study provides a primary basis for the opportunities offered by combining specialized methods from different disciplines to define comprehensive action plans for the company. Additionally, it outlines opportunities for improvement and recommendations to mitigate the burden associated with occupational diseases and, as an end result, improve the quality of life and productivity of workers.
An Automated Baseline Correction Method Based on Iterative Morphological Operations.
Chen, Yunliang; Dai, Liankui
2018-05-01
Raman spectra usually suffer from baseline drift caused by fluorescence or other factors. Therefore, baseline correction is a necessary and crucial step that must be performed before subsequent processing and analysis of Raman spectra. An automated baseline correction method based on iterative morphological operations is proposed in this work. The method first adaptively determines the structuring element and then gradually removes the spectral peaks during iteration to obtain an estimated baseline. Experiments on simulated data and real-world Raman data show that the proposed method is accurate, fast, and flexible enough to handle different kinds of baselines in various practical situations. The comparison of the proposed method with some state-of-the-art baseline correction methods demonstrates its advantages over the existing methods in terms of accuracy, adaptability, and flexibility. Although only Raman spectra are investigated in this paper, the proposed method can hopefully be applied to the baseline correction of other analytical instrument signals, such as IR spectra and chromatograms.
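The paper's adaptive structuring-element selection is not reproduced here; still, the core idea of gradual peak removal by greyscale opening can be sketched as follows (Python with SciPy; the fixed maximum window width is an assumption of this sketch):

import numpy as np
from scipy.ndimage import grey_opening

def morphological_baseline(y, max_half_width=50):
    # Successive greyscale openings with a growing flat structuring
    # element: narrow peaks are removed first, broader ones later,
    # leaving a smooth lower envelope as the baseline estimate.
    baseline = np.asarray(y, dtype=float)
    for half_width in range(1, max_half_width + 1):
        baseline = grey_opening(baseline, size=2 * half_width + 1)
    return baseline

# usage: corrected = spectrum - morphological_baseline(spectrum)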
USDA-ARS?s Scientific Manuscript database
Introduction – The diversity of structure and, particularly, stereochemical variation of the dehydropyrrolizidine alkaloids can present challenges for analysis and the isolation of pure compounds for the preparation of analytical standards and for toxicology studies. Objective – To investigate method...
World Wide Web Indexes and Hierarchical Lists: Finding Tools for the Internet.
ERIC Educational Resources Information Center
Munson, Kurt I.
1996-01-01
In World Wide Web indexing: (1) the creation process is automated; (2) the indexes are merely descriptive, not analytical of document content; (3) results may be sorted differently depending on the search engine; and (4) indexes link directly to the resources. This article compares the indexing methods and querying options of the search engines…
Fast automated online xylanase activity assay using HPAEC-PAD.
Cürten, Christin; Anders, Nico; Juchem, Niels; Ihling, Nina; Volkenborn, Kristina; Knapp, Andreas; Jaeger, Karl-Erich; Büchs, Jochen; Spiess, Antje C
2018-01-01
In contrast to biochemical reactions, which are often carried out under automatic control and maintained overnight, the automation of chemical analysis is usually neglected. Samples are either analyzed in a rudimentary fashion using in situ techniques, or aliquots are withdrawn and stored to facilitate more precise offline measurements, which can result in sampling and storage errors. Therefore, in this study, we implemented automated reaction control, sampling, and analysis. As an example, the activities of xylanases on xylotetraose and soluble xylan were examined using high-performance anion exchange chromatography with pulsed amperometric detection (HPAEC-PAD). The reaction was performed in HPLC vials inside a temperature-controlled Dionex™ AS-AP autosampler. It was started automatically when the autosampler pipetted substrate and enzyme solution into the reaction vial. Afterwards, samples from the reaction vial were injected repeatedly for 60 min onto a CarboPac™ PA100 column for analysis. Due to the rapidity of the reaction, the analytical method and the gradient elution of 200 mM sodium hydroxide solution and 100 mM sodium hydroxide with 500 mM sodium acetate were adapted to allow for an overall separation time of 13 min and a detection limit of 0.35-1.83 mg/L (depending on the xylooligomer). This analytical method was applied to measure the soluble short-chain products (xylose, xylobiose, xylotriose, xylotetraose, xylopentaose, and longer xylooligomers) that arise during enzymatic hydrolysis. Based on that, the activities of three endoxylanases (EX) were determined as 294 U/mg for EX from Aspergillus niger, 1.69 U/mg for EX from Bacillus stearothermophilus, and 0.36 U/mg for EX from Bacillus subtilis.
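Activity values such as those above follow from the product time course measured by the repeated injections: one unit (U) is 1 μmol of product released per minute. A minimal sketch with hypothetical xylose-equivalent data (not the study's measurements):

import numpy as np

def specific_activity(t_min, product_umol_per_ml, volume_ml, enzyme_mg):
    # 1 U = 1 umol product per minute; the rate is the slope of a
    # linear fit to the initial, linear part of the time course
    rate = np.polyfit(t_min, product_umol_per_ml, 1)[0]
    return rate * volume_ml / enzyme_mg

# hypothetical xylose-equivalent concentrations from repeated injections
t = np.array([0.0, 10.0, 20.0, 30.0, 40.0])   # min
p = np.array([0.0, 0.8, 1.7, 2.4, 3.3])       # umol/mL
print(round(specific_activity(t, p, volume_ml=1.0, enzyme_mg=0.01), 1))  # ~8.2 U/mg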
Déglon, Julien; Thomas, Aurélien; Daali, Youssef; Lauer, Estelle; Samer, Caroline; Desmeules, Jules; Dayer, Pierre; Mangin, Patrice; Staub, Christian
2011-01-25
This paper illustrates the development of an automated system for the on-line bioanalysis of dried blood spots (on-line DBS). To this end, a prototype was designed for integration into a conventional LC/MS/MS system, allowing the successive extraction of 30 DBS into the analytical system without any sample pretreatment. The developed method was assessed for the DBS analysis of flurbiprofen (FLB) and its metabolite 4-hydroxyflurbiprofen (OH-FLB) in human whole blood (i.e. 5 μL). The automated procedure was fully validated based on international criteria and showed good precision, trueness, and linearity over the expected concentration ranges (from 10 to 1000 ng/mL and 100 to 10,000 ng/mL for OH-FLB and FLB, respectively). Furthermore, the prototype showed good results in terms of recovery and carry-over. The stability of both analytes on filter paper was also investigated, and the results suggested that DBS could be stored at ambient temperature for over 1 month. The on-line DBS automated system was then successfully applied to a pharmacokinetic study performed on healthy male volunteers after oral administration of a single 50-mg dose of FLB. Additionally, a comparison between finger capillary DBS and classic venous plasma concentrations was investigated. A good correlation was observed, demonstrating the complementarity of both sampling forms. The automated system described in this article represents an efficient tool for the LC/MS/MS analysis of DBS samples in many bioanalytical applications. Copyright © 2010 Elsevier B.V. All rights reserved.
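For the pharmacokinetic evaluation mentioned above, a basic non-compartmental metric is the area under the concentration-time curve. A minimal sketch using the linear trapezoidal rule on a hypothetical (not the study's) capillary-DBS flurbiprofen profile:

import numpy as np

def auc_trapezoidal(t, c):
    # AUC(0 - t_last) by the linear trapezoidal rule
    t, c = np.asarray(t, float), np.asarray(c, float)
    return float(np.sum(np.diff(t) * (c[:-1] + c[1:]) / 2.0))

t_h = [0, 0.5, 1, 2, 4, 8, 12]                    # h
c_ng_ml = [0, 5200, 7800, 6100, 2900, 800, 150]   # ng/mL, illustrative
print("AUC(0-12 h) = %.0f ng*h/mL" % auc_trapezoidal(t_h, c_ng_ml))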
TIPS: A system for automated image-based phenotyping of maize tassels
Gage, Joseph L.; Miller, Nathan D.; Spalding, Edgar P.; ...
2017-03-31
Here, the maize male inflorescence (tassel) produces pollen necessary for reproduction and commercial grain production of maize. The size of the tassel has been linked to factors affecting grain yield, so understanding the genetic control of tassel architecture is an important goal. Tassels are fragile and deform easily after removal from the plant, necessitating rapid measurement of any shape characteristics that cannot be retained during storage. Some morphological characteristics of tassels such as curvature and compactness are difficult to quantify using traditional methods, but can be quantified by image-based phenotyping tools. Lastly, these constraints necessitate the development of an efficient method for capturing natural-state tassel morphology and complementary automated analytical methods that can quickly and reproducibly quantify traits of interest such as height, spread, and branch number.
Knepper, Andreas; Heiser, Michael; Glauche, Florian; Neubauer, Peter
2014-12-01
The enormous number of possible variations in bioprocesses challenges process development to fix a commercial process with respect to costs and time. Although some cultivation systems and some devices for unit operations combine the latest technology in miniaturization, parallelization, and sensing, the degree of automation in upstream and downstream bioprocess development is still limited to single steps. We aim to face this challenge with an interdisciplinary approach to significantly shorten development times and costs. As a first step, we scaled down analytical assays to the microliter scale and created automated procedures for starting the cultivation and monitoring the optical density (OD), pH, concentrations of glucose and acetate in the culture medium, and product formation in fed-batch cultures in the 96-well format. Then, the separate measurements of pH, OD, and concentrations of acetate and glucose were combined into one method. This method enables automated process monitoring at dedicated intervals (e.g., also during the night). By this approach, we managed to increase the information content of cultivations in 96-microwell plates, thus turning them into a suitable tool for high-throughput bioprocess development. Here, we present the flowcharts as well as cultivation data of our automation approach. © 2014 Society for Laboratory Automation and Screening.
Ficheur, Grégoire; Chazard, Emmanuel; Beuscart, Jean-Baptiste; Merlin, Béatrice; Luyckx, Michel; Beuscart, Régis
2014-09-12
Adverse drug reactions and adverse drug events (ADEs) are major public health issues. Many different prospective tools for the automated detection of ADEs in hospital databases have been developed and evaluated. The objective of the present study was to evaluate an automated method for the retrospective detection of ADEs with hyperkalaemia during inpatient stays. We used a set of complex detection rules to take account of the patient's clinical and biological context and the chronological relationship between the causes and the expected outcome. The dataset consisted of 3,444 inpatient stays in a French general hospital. An automated review was performed for all data and the results were compared with those of an expert chart review. The analytical quality of the complex detection rules was evaluated for ADEs. In terms of recall, 89.5% of ADEs with hyperkalaemia "with or without an abnormal symptom" were automatically identified (including all three serious ADEs). In terms of precision, 63.7% of the automatically identified ADEs with hyperkalaemia were true ADEs. The use of context-sensitive rules appears to improve the automated detection of ADEs with hyperkalaemia. This type of tool may have an important role in pharmacoepidemiology via the routine analysis of large inter-hospital databases.
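A context-sensitive rule of the kind described combines a laboratory threshold with a chronological window after administration of a suspect drug. A minimal sketch (Python); the drug list, the potassium threshold, and the 72-h window are illustrative assumptions, not the study's actual rule set:

from datetime import datetime, timedelta

K_THRESHOLD = 5.5                     # mmol/L, hyperkalaemia cut-off (illustrative)
WINDOW = timedelta(hours=72)          # required chronological relationship
SUSPECT_DRUGS = {"spironolactone", "potassium chloride", "enalapril"}

def flag_hyperkalaemia_ades(drug_events, lab_events):
    # drug_events: (time, drug name); lab_events: (time, serum K+ in mmol/L)
    flags = []
    for t_lab, k in lab_events:
        if k <= K_THRESHOLD:
            continue
        for t_drug, drug in drug_events:
            if drug in SUSPECT_DRUGS and timedelta(0) <= t_lab - t_drug <= WINDOW:
                flags.append((t_lab, k, drug))
    return flags

drugs = [(datetime(2014, 3, 1, 8), "spironolactone")]
labs = [(datetime(2014, 3, 2, 7), 6.1), (datetime(2014, 3, 10, 7), 4.4)]
print(flag_hyperkalaemia_ades(drugs, labs))  # flags only the first lab value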
Analytical, Characterization, and Stability Studies of Organic Chemicals, Drugs, and Drug Formulations
2014-05-21
stability studies was maintained over the entire contract period to ensure the continued integrity of the drug in its clinical use. Because our ... facile automation. We demonstrated the method in principle, but were unable to remove the residual t-butanol to <0.5%. With additional research using ... to its use of ethylene oxide for sterilization, which is done in small batches. The generally recognized method of choice to produce a parenteral
Developing automated analytical methods for scientific environments using LabVIEW.
Wagner, Christoph; Armenta, Sergio; Lendl, Bernhard
2010-01-15
The development of new analytical techniques often requires the building of specially designed devices, each requiring its own dedicated control software. Especially in the research and development phase, LabVIEW has proven to be a highly useful tool for developing this software. Yet, it is still common practice to develop individual solutions for different instruments. In contrast to this, we present here a single LabVIEW-based program that can be directly applied to various analytical tasks without having to change the program code. Driven by a set of simple script commands, it can control a whole range of instruments, from valves and pumps to full-scale spectrometers. Fluid sample (pre-)treatment and separation procedures can thus be flexibly coupled to a wide range of analytical detection methods. Here, the capabilities of the program have been demonstrated by using it to control both a sequential injection analysis-capillary electrophoresis (SIA-CE) system with UV detection, and an analytical setup for studying the inhibition of enzymatic reactions using a SIA system with FTIR detection.
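The original program is written in LabVIEW, but the script-command idea it describes can be sketched language-agnostically: a generic interpreter maps each command token to a device driver. A minimal Python sketch with hypothetical command names and stub device functions (none of these are from the paper):

# hypothetical stub functions standing in for real instrument drivers
def set_valve(pos):     print("valve ->", pos)
def pump(ml, rate):     print("pump %.2f mL at %.2f mL/min" % (ml, rate))
def acquire(seconds):   print("acquire %.0f s of detector data" % seconds)

DISPATCH = {
    "VALVE": lambda a: set_valve(int(a[0])),
    "PUMP":  lambda a: pump(float(a[0]), float(a[1])),
    "ACQ":   lambda a: acquire(float(a[0])),
}

SCRIPT = """VALVE 2
PUMP 0.50 1.00
ACQ 30"""

for line in SCRIPT.splitlines():
    cmd, *args = line.split()
    DISPATCH[cmd](args)   # one generic interpreter, many instruments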
Alexovič, Michal; Horstkotte, Burkhard; Solich, Petr; Sabo, Ján
2016-02-04
Simplicity, effectiveness, swiftness, and environmental friendliness - these are the typical requirements for state-of-the-art development of green analytical techniques. Liquid phase microextraction (LPME) stands for a family of elegant sample pretreatment and analyte preconcentration techniques preserving these principles in numerous applications. By using only fractions of the solvent and sample volumes of classical liquid-liquid extraction, the extraction kinetics, the preconcentration factor, and the cost efficiency can be increased. Moreover, significant improvements can be made by automation, which is still a hot topic in analytical chemistry. This review surveys comprehensively, and in two parts, the developments in automation of non-dispersive LPME methodologies performed in static and dynamic modes. Their advantages and limitations and the reported analytical performances are discussed and put into perspective with the corresponding manual procedures. The automation strategies, techniques, and their operational advantages as well as their potentials are further described and discussed. In this first part, an introduction to LPME, its static and dynamic operation modes, and its automation methodologies is given. The LPME techniques are classified according to the different approaches to protecting the extraction solvent, using either a tip-like (needle/tube/rod) support (drop-based approaches), a wall support (film-based approaches), or microfluidic devices. In the second part, the LPME techniques based on porous supports for the extraction solvent, such as membranes and porous media, are overviewed. An outlook on future demands and perspectives in this promising area of analytical chemistry is finally given. Copyright © 2015 Elsevier B.V. All rights reserved.
Dinov, Ivo D
2016-01-01
Managing, processing and understanding big healthcare data is challenging, costly and demanding. Without a robust fundamental theory for representation, analysis and inference, a roadmap for uniform handling and analyzing of such complex data remains elusive. In this article, we outline various big data challenges, opportunities, modeling methods and software techniques for blending complex healthcare data, advanced analytic tools, and distributed scientific computing. Using imaging, genetic and healthcare data we provide examples of processing heterogeneous datasets using distributed cloud services, automated and semi-automated classification techniques, and open-science protocols. Despite substantial advances, new innovative technologies need to be developed that enhance, scale and optimize the management and processing of large, complex and heterogeneous data. Stakeholder investments in data acquisition, research and development, computational infrastructure and education will be critical to realize the huge potential of big data, to reap the expected information benefits and to build lasting knowledge assets. Multi-faceted proprietary, open-source, and community developments will be essential to enable broad, reliable, sustainable and efficient data-driven discovery and analytics. Big data will affect every sector of the economy and their hallmark will be 'team science'.
NASA Astrophysics Data System (ADS)
Jabbari, Ali
2018-01-01
Surface inset permanent magnet DC machines can be used as an alternative in automation systems due to their high efficiency and robustness. Magnet segmentation is a common technique for mitigating pulsating torque components in permanent magnet machines. An accurate computation of the air-gap magnetic field distribution is necessary in order to calculate machine performance. An exact analytical method for magnetic vector potential calculation in surface inset permanent magnet machines considering magnet segmentation is proposed in this paper. The analytical method is based on the resolution of the Laplace and Poisson equations, as well as Maxwell's equations, in polar coordinates using the sub-domain method. One of the main contributions of the paper is the derivation of an expression for the magnetic vector potential in the segmented PM region using hyperbolic functions. The developed method is applied to the performance computation of two prototype surface inset segmented-magnet motors under open-circuit and on-load conditions. The results of these models are validated against the finite element method (FEM).
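The sub-domain technique builds on the classical separable solution of Laplace's equation in polar coordinates. As textbook background (not the paper's final expression, whose constants are fixed by the boundary and interface conditions of each subdomain), the general form of the air-gap vector potential reads:

% general solution of Laplace's equation for A_z(r, \theta) in an annular subdomain
A_z(r,\theta) = A_0 + B_0 \ln r
  + \sum_{n=1}^{\infty} \left( a_n r^{n} + b_n r^{-n} \right) \cos n\theta
  + \sum_{n=1}^{\infty} \left( c_n r^{n} + d_n r^{-n} \right) \sin n\theta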
The laboratory of the 1990s—Planning for total automation
Brunner, Linda A.
1992-01-01
The analytical laboratory of the 1990s must be able to meet and accommodate the rapid evolution of modern-day technology. One such area is laboratory automation. Total automation may be seen as the coupling of computerized sample tracking, electronic documentation and data reduction with automated sample handling, preparation and analysis, resulting in a complete analytical procedure with minimal human involvement. Requirements may vary from one laboratory or facility to another, so the automation has to be flexible enough to cover a wide range of applications, and yet fit into specific niches depending on individual needs. Total automation must be planned for, well in advance, if the endeavour is to be a success. Space, laboratory layout, proper equipment, and the availability of and access to necessary utilities must be taken into account. Adequate training and experience of the personnel working with the technology must also be ensured. In addition, responsibilities for installation, programming, maintenance and operation have to be addressed. Proper time management and the efficient implementation and use of total automation are also crucial to successful operations. This paper provides insights into laboratory organization and requirements, as well as discussing the management issues that must be faced when automating laboratory procedures. PMID:18924925
Evangelopoulos, Angelos A; Dalamaga, Maria; Panoutsopoulos, Konstantinos; Dima, Kleanthi
2013-01-01
In the early 80s, the word automation was used in the clinical laboratory setting to refer only to analyzers. But from the late 80s onwards, automation found its way into all aspects of the diagnostic process, embracing not only the analytical but also the pre- and post-analytical phases. While laboratories in the eastern world, mainly Japan, paved the way for laboratory automation, US and European laboratories soon realized the benefits and were quick to follow. Clearly, automation and robotics will be a key survival tool in a very competitive and cost-conscious healthcare market. What sets automation technology apart from so many other efficiency solutions is the dramatic savings that it brings to the clinical laboratory. Further standardization will assure the success of this revolutionary new technology. One of the main difficulties laboratory managers and personnel must deal with when studying solutions to reengineer a laboratory is familiarizing themselves with the multidisciplinary and technical terminology of this new and exciting field. The present review/glossary aims to give an overview of the most frequently used terms within the scope of laboratory automation and to put laboratory automation on a sounder linguistic basis.
Real-time simulations for automated rendezvous and capture
NASA Technical Reports Server (NTRS)
Cuseo, John A.
1991-01-01
Although the individual technologies for automated rendezvous and capture (AR&C) exist, they have not yet been integrated to produce a working system in the United States. Thus, real-time integrated systems simulations are critical to the development and pre-flight demonstration of an AR&C capability. Real-time simulations require a level of development more typical of a flight system compared to purely analytical methods, thus providing confidence in derived design concepts. This presentation will describe Martin Marietta's Space Operations Simulation (SOS) Laboratory, a state-of-the-art real-time simulation facility for AR&C, along with an implementation for the Satellite Servicer System (SSS) Program.
Huber, Robert; Ritter, Daniel; Hering, Till; Hillmer, Anne-Kathrin; Kensy, Frank; Müller, Carsten; Wang, Le; Büchs, Jochen
2009-08-01
In industry and academic research, there is an increasing demand for flexible automated microfermentation platforms with advanced sensing technology. However, up to now, conventional platforms cannot generate continuous data in high-throughput cultivations, in particular for monitoring biomass and fluorescent proteins. Furthermore, microfermentation platforms are needed that can easily combine cost-effective, disposable microbioreactors with downstream processing and analytical assays. To meet this demand, a novel automated microfermentation platform consisting of a BioLector and a liquid-handling robot (Robo-Lector) was successfully built and tested. The BioLector provides a cultivation system that is able to permanently monitor microbial growth and the fluorescence of reporter proteins under defined conditions in microtiter plates. Three exemplary methods were programmed on the Robo-Lector platform to study high-throughput cultivation processes in detail, especially recombinant protein expression. The host/vector system E. coli BL21(DE3) pRhotHi-2-EcFbFP, expressing the fluorescence protein EcFbFP, was investigated. With the method 'induction profiling' it was possible to conduct 96 different induction experiments (varying inducer concentrations from 0 to 1.5 mM IPTG at 8 different induction times) simultaneously in an automated way. The method 'biomass-specific induction' made it possible to automatically induce cultures with different growth kinetics in a microtiter plate at the same biomass concentration, which resulted in a relative standard deviation of EcFbFP production of only +/- 7%. The third method, 'biomass-specific replication', enabled the generation of equal initial biomass concentrations in main cultures from precultures with different growth kinetics. This was realized by automatically transferring an appropriate inoculum volume from the different preculture microtiter wells to the respective wells of the main culture plate, where subsequently similar growth kinetics could be obtained. The Robo-Lector generates extensive kinetic data in high-throughput cultivations, particularly for biomass and fluorescence protein formation. Based on the non-invasive on-line monitoring signals, actions of the liquid-handling robot can easily be triggered. This interaction between the robot and the BioLector (Robo-Lector) combines high-content data generation with systematic high-throughput experimentation in an automated fashion, offering new possibilities to study biological production systems. The presented platform uses a standard liquid-handling workstation with widespread automation possibilities. Thus, high-throughput cultivations can now be combined with small-scale downstream processing techniques and analytical assays. Ultimately, this novel and versatile platform can accelerate and intensify research and development in the field of systems biology as well as modelling and bioprocess optimization.
MARS: bringing the automation of small-molecule bioanalytical sample preparations to a new frontier.
Li, Ming; Chou, Judy; Jing, Jing; Xu, Hui; Costa, Aldo; Caputo, Robin; Mikkilineni, Rajesh; Flannelly-King, Shane; Rohde, Ellen; Gan, Lawrence; Klunk, Lewis; Yang, Liyu
2012-06-01
In recent years, there has been a growing interest in automating small-molecule bioanalytical sample preparations specifically using the Hamilton MicroLab® STAR liquid-handling platform. In the most extensive work reported thus far, multiple small-molecule sample preparation assay types (protein precipitation extraction, SPE and liquid-liquid extraction) have been integrated into a suite that is composed of graphical user interfaces and Hamilton scripts. Using that suite, bioanalytical scientists have been able to automate various sample preparation methods to a great extent. However, there are still areas that could benefit from further automation, specifically, the full integration of analytical standard and QC sample preparation with study sample extraction in one continuous run, real-time 2D barcode scanning on the Hamilton deck and direct Laboratory Information Management System database connectivity. We developed a new small-molecule sample-preparation automation system that improves in all of the aforementioned areas. The improved system presented herein further streamlines the bioanalytical workflow, simplifies batch run design, reduces analyst intervention and eliminates sample-handling error.
Development of an automated asbestos counting software based on fluorescence microscopy.
Alexandrov, Maxym; Ichida, Etsuko; Nishimura, Tomoki; Aoki, Kousuke; Ishida, Takenori; Hirota, Ryuichi; Ikeda, Takeshi; Kawasaki, Tetsuo; Kuroda, Akio
2015-01-01
An emerging alternative to the commonly used analytical methods for asbestos analysis is fluorescence microscopy (FM), which relies on highly specific asbestos-binding probes to distinguish asbestos from interfering non-asbestos fibers. However, all types of microscopic asbestos analysis require laborious examination of a large number of fields of view and are prone to subjective errors and large variability between asbestos counts by different analysts and laboratories. A possible solution to these problems is automated counting of asbestos fibers by image analysis software, which would lower the cost and increase the reliability of asbestos testing. This study seeks to develop fiber recognition and counting software for FM-based asbestos analysis. We discuss the main features of the developed software and the results of its testing. Software testing showed good correlation between automated and manual counts for samples with medium and high fiber concentrations. At low fiber concentrations, the automated counts were less accurate, leading us to implement a correction mode for automated counts. While full automation of asbestos analysis would require further improvements in the accuracy of fiber identification, the developed software can already assist professional asbestos analysts and record detailed fiber dimensions for use in epidemiological research.
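A typical building block of such software is connected-component analysis with shape filtering, since fibers are distinguished from non-fibrous particles by their elongation. A minimal sketch (Python with scikit-image); the length and aspect-ratio cut-offs are illustrative assumptions, not the published algorithm:

import numpy as np
from skimage import measure

def count_fibers(binary_image, min_length_px=10.0, min_aspect=3.0):
    # label connected components in a thresholded fluorescence image
    # and keep only sufficiently long, elongated objects
    labels = measure.label(binary_image)
    n_fibers = 0
    for region in measure.regionprops(labels):
        length = region.major_axis_length
        width = max(region.minor_axis_length, 1e-6)
        if length >= min_length_px and length / width >= min_aspect:
            n_fibers += 1
    return n_fibers

print(count_fibers(np.zeros((64, 64), dtype=bool)))  # empty field -> 0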
Multi-Intelligence Analytics for Next Generation Analysts (MIAGA)
NASA Astrophysics Data System (ADS)
Blasch, Erik; Waltz, Ed
2016-05-01
Current analysts are inundated with large volumes of data from which extraction, exploitation, and indexing are required. A future need for next-generation analysts is an appropriate balance between machine analytics on raw data and the ability of the user to interact with information through automation. Many quantitative intelligence tools and techniques have been developed, and they are examined here with a view to matching analyst opportunities with recent technical trends such as big data, access to information, and visualization. The concepts and techniques summarized are derived from discussions with real analysts, documented trends of technical developments, and methods to engage future analysts with multi-intelligence services. For example, qualitative techniques should be matched against physical, cognitive, and contextual quantitative analytics for intelligence reporting. Future trends include enabling knowledge search, collaborative situational sharing, and agile support for empirical decision-making and analytical reasoning.
Mathieson, William; Guljar, Nafia; Sanchez, Ignacio; Sroya, Manveer; Thomas, Gerry A
2018-05-03
DNA extracted from formalin-fixed, paraffin-embedded (FFPE) tissue blocks is amenable to analytical techniques, including sequencing. DNA extraction protocols are typically long and complex, often involving an overnight proteinase K digest. Automated platforms that shorten and simplify the process are therefore an attractive proposition for users wanting a faster turn-around or to process large numbers of biospecimens. It is, however, unclear whether automated extraction systems return poorer DNA yields or quality than manual extractions performed by experienced technicians. We extracted DNA from 42 FFPE clinical tissue biospecimens using the QiaCube (Qiagen) and ExScale (ExScale Biospecimen Solutions) automated platforms, comparing DNA yields and integrities with those from manual extractions. The QIAamp DNA FFPE Spin Column Kit was used for manual and QiaCube DNA extractions and the ExScale extractions were performed using two of the manufacturer's magnetic bead kits: one extracting DNA only and the other simultaneously extracting DNA and RNA. In all automated extraction methods, DNA yields and integrities (assayed using DNA Integrity Numbers from a 4200 TapeStation and the qPCR-based Illumina FFPE QC Assay) were poorer than in the manual method, with the QiaCube system performing better than the ExScale system. However, ExScale was fastest, offered the highest reproducibility when extracting DNA only, and required the least intervention or technician experience. Thus, the extraction methods have different strengths and weaknesses and would appeal to different users with different requirements; therefore, we cannot recommend one method over another.
A new automated colorimetric method for measuring total oxidant status.
Erel, Ozcan
2005-12-01
To develop a new, colorimetric and automated method for measuring total oxidant status (TOS). The assay is based on the oxidation of ferrous ion to ferric ion in the presence of various oxidant species in acidic medium and the measurement of the ferric ion by xylenol orange. The oxidation reaction of the assay was enhanced and precipitation of proteins was prevented. In addition, autoxidation of the ferrous ion present in the reagent during storage was prevented. The method was applied to an automated analyzer, which was calibrated with hydrogen peroxide, and the analytical performance characteristics of the assay were determined. There were strong correlations with hydrogen peroxide, tert-butyl hydroperoxide and cumene hydroperoxide solutions (r=0.99, P<0.001 for all). In addition, the new assay presented a typical sigmoidal reaction pattern in copper-induced lipoprotein autoxidation. The novel assay is linear up to 200 micromol H2O2 Equiv./L and its precision value is lower than 3%. The lower detection limit is 1.13 micromol H2O2 Equiv./L. The reagents are stable for at least 6 months on the automated analyzer. Serum TOS was significantly higher in patients with osteoarthritis (21.23+/-3.11 micromol H2O2 Equiv./L) than in healthy subjects (14.19+/-3.16 micromol H2O2 Equiv./L, P<0.001), and the results showed a significant negative correlation with total antioxidant capacity (TAC) (r=-0.66, P<0.01). This easy, stable, reliable, sensitive, inexpensive and fully automated method can be used to measure total oxidant status.
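Calibration against hydrogen peroxide standards, as described, amounts to a linear fit that is then inverted for samples. A minimal sketch with hypothetical absorbance readings (not the paper's data):

import numpy as np

# hypothetical H2O2 calibration line for the colorimetric assay
std  = np.array([0, 25, 50, 100, 150, 200])           # umol H2O2 Equiv./L
absb = np.array([0.02, 0.11, 0.20, 0.39, 0.57, 0.76]) # measured absorbances

slope, intercept = np.polyfit(std, absb, 1)

def tos(sample_absorbance):
    # map a sample absorbance back to umol H2O2 Equiv./L
    return (sample_absorbance - intercept) / slope

print(round(tos(0.31), 1))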
Qin, Yuhong; Zhang, Jingru; Zhang, Yuan; Li, Fangbing; Han, Yongtao; Zou, Nan; Xu, Haowei; Qian, Meiyuan; Pan, Canping
2016-09-02
An automated multi-plug filtration cleanup (m-PFC) method for modified QuEChERS (quick, easy, cheap, effective, rugged, and safe) extracts was developed. The automated device was aimed at reducing the labor-intensive manual workload of the cleanup steps. It could accurately control the volume and speed of the pulling and pushing cycles. In this work, m-PFC was based on multi-walled carbon nanotubes (MWCNTs) mixed with other sorbents and anhydrous magnesium sulfate (MgSO4) in a packed tip for the analysis of pesticide multi-residues in crop commodities, followed by liquid chromatography with tandem mass spectrometric (LC-MS/MS) detection. It was validated by analyzing 25 pesticides in six representative matrices spiked at two concentration levels of 10 and 100 μg/kg. Salts, sorbents, the m-PFC procedure, the automated pulling and pushing volume, and the automated pulling and pushing speeds were optimized for each matrix. After optimization, two general automated m-PFC methods were introduced for relatively simple (apple, citrus fruit, peanut) and relatively complex (spinach, leek, green tea) matrices. Spike recoveries were between 83% and 108% with 1-14% RSD for most analytes in the tested matrices. Matrix-matched calibrations were performed with coefficients of determination >0.997 between concentration levels of 10 and 1000 μg/kg. The developed method was successfully applied to the determination of pesticide residues in market samples. Copyright © 2016 Elsevier B.V. All rights reserved.
Liu, Lei; Liu, Kang-Ning; Wen, Ya-Bin; Zhang, Han-Wen; Lu, Ya-Xin; Yin, Zheng
2012-04-15
A fully automated on-line solid-phase extraction (SPE) and high-performance liquid chromatography (HPLC) with diode array detection (DAD) method was developed for the determination of bavachinin in mouse plasma. The analytical process was performed on two reversed-phase columns (an SPE cartridge and an analytical column) connected via a Valco 6-port switching valve. Plasma samples (10 μL) were injected directly onto a C18 SPE cartridge (MF Ph-1 C18, 10 mm × 4 mm, 5 μm) and the biological matrix was washed out for 2 min with the loading solvent (5 mM NaH2PO4 buffer, pH 3.5) at a flow rate of 1 mL/min. By rotation of the switching valve, bavachinin was eluted from the SPE cartridge in the back-flush mode and transferred to the analytical column (Venusil MP C18, 4.6 mm × 150 mm, 5 μm) by the chromatographic mobile phase, consisting of acetonitrile/5 mM NaH2PO4 buffer, 65/35 (v/v, pH 3.5), at a flow rate of 1 mL/min. The complete cycle of on-line SPE purification and chromatographic separation of the analyte took 13 min, with UV detection performed at 236 nm. A calibration curve with good linearity (r=0.9997) was obtained in the range of 20-4000 ng/mL in mouse plasma. The intra-day and inter-day precisions (RSD) of bavachinin were in the range of 0.20-2.32% and the accuracies were between 98.47% and 102.95%. The lower limit of quantification (LLOQ) of the assay was 20 ng/mL. In conclusion, the established automated on-line SPE-HPLC-DAD method demonstrated good performance in terms of linearity, specificity, detection and quantification limits, precision and accuracy, and was successfully utilized to quantify bavachinin in mouse plasma to support pharmacokinetic (PK) studies. The PK properties of bavachinin were characterized by rapid oral absorption, high clearance, and poor absolute bioavailability. Copyright © 2012. Published by Elsevier B.V.
Automated determination of arterial input function for DCE-MRI of the prostate
NASA Astrophysics Data System (ADS)
Zhu, Yingxuan; Chang, Ming-Ching; Gupta, Sandeep
2011-03-01
Prostate cancer is one of the commonest cancers in the world. Dynamic contrast enhanced MRI (DCE-MRI) provides an opportunity for non-invasive diagnosis, staging, and treatment monitoring. Quantitative analysis of DCE-MRI relies on the determination of an accurate arterial input function (AIF). Although several methods for automated AIF detection have been proposed in the literature, none are optimized for use in prostate DCE-MRI, which is particularly challenging due to large spatial signal inhomogeneity. In this paper, we propose a fully automated method for determining the AIF from prostate DCE-MRI. Our method is based on modeling pixel uptake curves as gamma variate functions (GVFs). First, we analytically compute bounds on the GVF parameters for more robust fitting. Next, we approximate a GVF for each pixel based on local time-domain information, and eliminate pixels with falsely estimated AIFs using the deduced upper and lower bounds. This makes the algorithm robust to signal inhomogeneity. After that, using spatial information such as the similarity and distance between pixels, we formulate global AIF selection as an energy minimization problem and solve it using a message passing algorithm to further rule out weak pixels and optimize the detected AIF. Our method is fully automated, without training or a priori setting of parameters. Experimental results on clinical data have shown that our method obtained promising detection accuracy (all detected pixels inside major arteries) and a very good match with expert-traced manual AIFs.
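A gamma variate fit of a single pixel's uptake curve can be sketched as follows (Python with SciPy); the synthetic data and the specific parameter bounds are illustrative, standing in for the analytic bounds derived in the paper:

import numpy as np
from scipy.optimize import curve_fit

def gamma_variate(t, A, t0, alpha, beta):
    # classic bolus first-pass model: zero before the arrival time t0
    dt = np.clip(t - t0, 0.0, None)
    return A * dt ** alpha * np.exp(-dt / beta)

# synthetic uptake curve for one candidate pixel (illustrative only)
t = np.linspace(0.0, 60.0, 61)                     # s
rng = np.random.default_rng(0)
signal = gamma_variate(t, 10.0, 5.0, 2.0, 6.0) + rng.normal(0.0, 2.0, t.size)

# bounded fit; the bounds reject implausible (non-arterial) parameter values
popt, _ = curve_fit(gamma_variate, t, signal, p0=[5.0, 3.0, 1.5, 5.0],
                    bounds=([0.0, 0.0, 0.1, 0.1], [1e3, 20.0, 10.0, 30.0]))
print(np.round(popt, 2))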
Uy, Raymonde Charles Y.; Kury, Fabricio P.; Fontelo, Paul A.
2015-01-01
The standard of safe medication practice requires strict observance of the five rights of medication administration: the right patient, drug, time, dose, and route. Despite adherence to these guidelines, medication errors remain a public health concern that has generated health policies and hospital processes that leverage automation and computerization to reduce these errors. Bar code, RFID, biometrics and pharmacy automation technologies have been demonstrated in the literature to decrease the incidence of medication errors by minimizing the human factors involved in the process. Despite evidence suggesting the effectiveness of these technologies, adoption rates and trends vary across hospital systems. The objective of this study is to examine the state and adoption trends of automatic identification and data capture (AIDC) methods and pharmacy automation technologies in U.S. hospitals. A retrospective descriptive analysis of survey data from the HIMSS Analytics® Database was done, demonstrating an optimistic growth in the adoption of these patient safety solutions. PMID:26958264
Automated solid-phase extraction and liquid chromatography for assay of cyclosporine in whole blood.
Kabra, P M; Wall, J H; Dimson, P
1987-12-01
In this rapid, precise, accurate, cost-effective, automated liquid-chromatographic procedure for determining cyclosporine in whole blood, the cyclosporine is extracted from 0.5 mL of whole blood, together with 300 micrograms of cyclosporin D per liter added as internal standard, by using an Advanced Automated Sample Processing unit. The on-line solid-phase extraction is performed on an octasilane sorbent cartridge, which is interfaced with an RP-8 guard column and an octyl analytical column packed with 5-micron packing material. Both columns are eluted with a mobile phase containing acetonitrile/methanol/water (53/20/27 by vol) at a flow rate of 1.5 mL/min and a column temperature of 70 degrees C. Absolute recovery of cyclosporine exceeded 85%, and the standard curve was linear to 5000 micrograms/L. Within-run and day-to-day CVs were less than 8%. Correlation between the automated and manual Bond-Elut extraction methods was excellent (r = 0.987). None of 18 drugs and four steroids tested interfered.
Mills, M.S.; Thurman, E.M.
1992-01-01
Reversed-phase isolation and ion-exchange purification were combined in the automated solid-phase extraction of two polar s-triazine metabolites, 2-amino-4-chloro-6-(isopropylamino)-s-triazine (deethylatrazine) and 2-amino-4-chloro-6-(ethylamino)-s-triazine (deisopropylatrazine), from clay-loam and silt-loam soils and sandy aquifer sediments. First, methanol/water (4/1, v/v) soil extracts were transferred to an automated workstation, following evaporation of the methanol phase, for the rapid reversed-phase isolation of the metabolites on an octadecyl resin (C18). The retention of the triazine metabolites on C18 decreased substantially when trace methanol concentrations (1%) remained. Furthermore, the retention on C18 increased with decreasing aqueous solubility and increasing alkyl-chain length of the metabolites and parent herbicides, indicating a reversed-phase interaction. The analytes were eluted with ethyl acetate, which left much of the soil organic-matter impurities on the resin. Second, the small-volume organic eluate was purified on an anion-exchange resin (0.5 mL/min) to extract the remaining soil pigments that could foul the ion source of the GC/MS system. Recoveries of the analytes were 75%, using deuterated atrazine as a surrogate, and were comparable to recoveries by Soxhlet extraction. The detection limit was 0.1 μg/kg with a coefficient of variation of 15%. The ease and efficiency of this automated method make it a viable, practical technique for studying triazine metabolites in the environment.
Platform for Automated Real-Time High Performance Analytics on Medical Image Data.
Allen, William J; Gabr, Refaat E; Tefera, Getaneh B; Pednekar, Amol S; Vaughn, Matthew W; Narayana, Ponnada A
2018-03-01
Biomedical data are quickly growing in volume and in variety, providing clinicians an opportunity for better clinical decision support. Here, we demonstrate a robust platform that uses software automation and high performance computing (HPC) resources to achieve real-time analytics of clinical data, specifically magnetic resonance imaging (MRI) data. We used the Agave application programming interface to facilitate communication, data transfer, and job control between an MRI scanner and an off-site HPC resource. In this use case, Agave executed the graphical pipeline tool GRAphical Pipeline Environment (GRAPE) to perform automated, real-time, quantitative analysis of MRI scans. Same-session image processing will open the door for adaptive scanning and real-time quality control, potentially accelerating the discovery of pathologies and minimizing patient callbacks. We envision this platform can be adapted to other medical instruments, HPC resources, and analytics tools.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grambow, Colin A.; Jamal, Adeel; Li, Yi-Pei
2017-12-22
Ketohydroperoxides are important in liquid-phase autoxidation and in gas-phase partial oxidation and pre-ignition chemistry, but because of their low concentration, instability, and various analytical chemistry limitations, it has been challenging to experimentally determine their reactivity, and only a few pathways are known. In the present work, 75 elementary-step unimolecular reactions of the simplest γ-ketohydroperoxide, 3-hydroperoxypropanal, were discovered by a combination of density functional theory with several automated transition-state search algorithms: the Berny algorithm coupled with the freezing string method, single- and double-ended growing string methods, the heuristic KinBot algorithm, and the single-component artificial force induced reaction method (SC-AFIR). The present joint approach significantly outperforms previous manual and automated transition-state searches – 68 of the reactions of γ-ketohydroperoxide discovered here were previously unknown and completely unexpected. All of the methods found the lowest-energy transition state, which corresponds to the first step of the Korcek mechanism, but each algorithm except for SC-AFIR detected several reactions not found by any of the other methods. We show that the low-barrier chemical reactions involve promising new chemistry that may be relevant in atmospheric and combustion systems. Our study highlights the complexity of chemical space exploration and the advantage of combined application of several approaches. Altogether, the present work demonstrates both the power and the weaknesses of existing fully automated approaches for reaction discovery which suggest possible directions for further method development and assessment in order to enable reliable discovery of all important reactions of any specified reactant(s).
Automated Clean Chemistry for Bulk Analysis of Environmental Swipe Samples - FY17 Year End Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ticknor, Brian W.; Metzger, Shalina C.; McBay, Eddy H.
Sample preparation methods for mass spectrometry are being automated using commercial-off-the-shelf (COTS) equipment to shorten lengthy and costly manual chemical purification procedures. This development addresses a serious need in the International Atomic Energy Agency’s Network of Analytical Laboratories (IAEA NWAL) to increase efficiency in the Bulk Analysis of Environmental Samples for Safeguards program with a method that allows unattended, overnight operation. In collaboration with Elemental Scientific Inc., the prepFAST-MC2 was designed based on COTS equipment. It was modified for uranium/plutonium separations using renewable columns packed with Eichrom TEVA and UTEVA resins, with a chemical separation method based on the Oak Ridge National Laboratory (ORNL) NWAL chemical procedure. The newly designed prepFAST-SR has had several upgrades compared with the original prepFAST-MC2. Both systems are currently installed in the Ultra-Trace Forensics Science Center at ORNL.
Pleil, Joachim D; Angrish, Michelle M; Madden, Michael C
2015-12-11
Immunochemistry is an important clinical tool for indicating biological pathways leading towards disease. Standard enzyme-linked immunosorbent assays (ELISA) are labor intensive and lack sensitivity at low-level concentrations. Here we report on emerging technology implementing fully automated ELISA capable of molecular-level detection and describe its application to exhaled breath condensate (EBC) samples. The Quanterix SIMOA HD-1 analyzer was evaluated for analytical performance for inflammatory cytokines (IL-6, TNF-α, IL-1β and IL-8). The system was challenged with human EBC, the most dilute and analytically difficult of the biological media. Calibrations from synthetic samples and spiked EBC showed excellent linearity at trace levels (r(2) > 0.99). Sensitivities varied by analyte but were robust, from ~0.006 (IL-6) to ~0.01 (TNF-α) pg ml(-1). All analytes demonstrated response suppression when diluted with deionized water, so assay buffer was found to be a better diluent. Analytical runs required ~45 min setup time for loading samples, reagents, calibrants, etc., after which the instrument performs without further intervention for up to 288 separate samples. Currently available kits are limited to single-plex analyses, so sample volumes require adjustments. Sample dilutions should be made with assay diluent to avoid response suppression. Automation performs seamlessly, and data are automatically analyzed and reported in spreadsheet format. The internal 5-parameter logistic (5PL) calibration model should be supplemented with a linear regression spline at the very lowest analyte levels (<1.3 pg ml(-1)). The implementation of the automated Quanterix platform was successfully demonstrated using EBC, which poses the greatest challenge to ELISA due to limited sample volumes and low protein levels.
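For reference, here is a minimal sketch of fitting a 5PL calibration curve of the kind mentioned above; the functional form is the textbook 5PL, and the calibrant concentrations and responses are invented example values, not Quanterix data.

```python
# Minimal 5-parameter logistic (5PL) calibration sketch. Data points are
# illustrative placeholders; a = low asymptote, b = slope, c = mid-point
# concentration, d = high asymptote, g = asymmetry.
import numpy as np
from scipy.optimize import curve_fit

def five_pl(x, a, b, c, d, g):
    """5PL response as a function of concentration x."""
    return d + (a - d) / (1.0 + (x / c) ** b) ** g

conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0])   # pg/mL
resp = np.array([0.06, 0.10, 0.22, 0.55, 1.60, 3.40, 4.60, 5.00])  # AU

popt, _ = curve_fit(five_pl, conc, resp,
                    p0=[resp.min(), 1.0, 1.0, resp.max(), 1.0], maxfev=20000)
print(popt)  # fitted parameters; invert five_pl numerically to quantify unknowns
```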
Dasgupta, Aritra; Lee, Joon-Yong; Wilson, Ryan; Lafrance, Robert A; Cramer, Nick; Cook, Kristin; Payne, Samuel
2017-01-01
Combining interactive visualization with automated analytical methods like statistics and data mining facilitates data-driven discovery. These visual analytic methods are beginning to be instantiated within mixed-initiative systems, where humans and machines collaboratively influence evidence-gathering and decision-making. But an open research question is whether, when domain experts analyze their data, they can completely trust the outputs and operations on the machine side. Visualization potentially leads to a transparent analysis process, but do domain experts always trust what they see? To address these questions, we present results from the design and evaluation of a mixed-initiative, visual analytics system for biologists, focusing on the relationship between familiarity with an analysis medium and domain experts' trust. We propose a trust-augmented design of the visual analytics system that explicitly takes into account domain-specific tasks, conventions, and preferences. For evaluating the system, we present the results of a controlled user study with 34 biologists, in which we compare the variation in the level of trust across conventional and visual analytic mediums and explore the influence of familiarity and task complexity on trust. We find that, despite being unfamiliar with a visual analytic medium, scientists seem to have an average level of trust comparable with that in a conventional analysis medium. In fact, for complex sense-making tasks, we find that the visual analytic system is able to inspire greater trust than other mediums. We summarize the implications of our findings with directions for future research on the trustworthiness of visual analytic systems.
Performance modeling of automated manufacturing systems
NASA Astrophysics Data System (ADS)
Viswanadham, N.; Narahari, Y.
A unified and systematic treatment is presented of modeling methodologies and analysis techniques for performance evaluation of automated manufacturing systems. The book is the first treatment of the mathematical modeling of manufacturing systems. Automated manufacturing systems are surveyed and three principal analytical modeling paradigms are discussed: Markov chains, queues and queueing networks, and Petri nets.
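As a flavor of the queueing paradigm the book covers, the sketch below computes the classic steady-state measures of a single machine modeled as an M/M/1 queue; the arrival and service rates are arbitrary example values.

```python
# Steady-state M/M/1 metrics for a machine fed by a Poisson arrival stream
# (rate lam) with exponential service (rate mu); stable only when lam < mu.

def mm1_metrics(lam: float, mu: float) -> dict:
    if lam >= mu:
        raise ValueError("unstable queue: requires lam < mu")
    rho = lam / mu                  # machine utilization
    L = rho / (1.0 - rho)           # mean number of parts in the system
    W = 1.0 / (mu - lam)            # mean time a part spends in the system
    return {"utilization": rho, "wip": L, "flow_time": W}

print(mm1_metrics(lam=4.0, mu=5.0))  # e.g. 4 parts/h arriving, 5 parts/h capacity
```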
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haugen, G.R.; Bystroff, R.I.; Downey, R.M.
1975-09-01
In the area of automation and instrumentation, progress in the following studies is reported: computer automation of the Cary model 17I spectrophotometer; a new concept for monitoring the concentration of water in gases; on-line gas analysis for a gas circulation experiment; and a count-rate-discriminator technique for measuring grain-boundary composition. In the area of analytical methodology and measurements, progress is reported in the following studies: separation of molecular species by radiation pressure; study of the vaporization of U(thd)4 (thd = 2,2,6,6-tetramethylheptane-3,5-dione); study of the vaporization of U(C8H8)2; determination of ethylenic unsaturation in polyimide resins; and semimicrodetermination of hydroxyl and amino groups with pyromellitic dianhydride (PMDA). (JGB)
Impact of automation on mass spectrometry.
Zhang, Yan Victoria; Rockwood, Alan
2015-10-23
Mass spectrometry coupled to liquid chromatography (LC-MS and LC-MS/MS) is an analytical technique that has rapidly grown in popularity in clinical practice. In contrast to traditional technology, mass spectrometry is superior in many respects, including resolution, specificity, and multiplex capability, and it can measure analytes in various matrices. Despite these advantages, LC-MS/MS remains costly and labor intensive, and it has limited throughput. This specialized technology requires highly trained personnel and has therefore largely been limited to large institutions, academic organizations, and reference laboratories. Advances in automation will be paramount to break through this bottleneck and increase its appeal for routine use. This article reviews these challenges, shares perspectives on essential features for LC-MS/MS total automation, and proposes a step-wise, incremental approach to achieve total automation by reducing human intervention, increasing throughput, and eventually integrating the LC-MS/MS system into automated clinical laboratory operations. Copyright © 2015 Elsevier B.V. All rights reserved.
Ito, Sana; Morita, Masaki
2016-01-01
Quantitative analysis of nitrilotriacetate (NTA) in detergents by titration with Cu2+ solution using a copper ion selective electrode was achieved. This method tolerates a wide range of pH values and detergent ingredients. In addition to NTA, other chelating agents having relatively lower stability constants toward Cu2+ were also quantified with sufficient accuracy by this analytical method for model detergent formulations. The titration process was automated using commercially available automatic titration systems.
Polymeric assay film for direct colorimetric detection
Charych, Deborah; Nagy, Jon; Spevak, Wayne
2002-01-01
A lipid bilayer with affinity to an analyte, which directly signals binding by a change in its light absorption spectrum. This novel assay means and method has special applications in the drug development and medical testing fields. Using a spectrometer, the system is easily automated, and a multiple-well embodiment allows inexpensive screening and sequential testing. This invention also has applications in industry for feedstock and effluent monitoring.
Automated Predictive Big Data Analytics Using Ontology Based Semantics.
Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A
2015-10-01
Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm), and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise that might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as in documenting the rationale for the techniques and models selected. To formally describe the modeling techniques, models, and results, we developed the Analytics Ontology, which supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology.
Drzymala, Sarah S; Weiz, Stefan; Heinze, Julia; Marten, Silvia; Prinz, Carsten; Zimathies, Annett; Garbe, Leif-Alexander; Koch, Matthias
2015-05-01
Established maximum levels for the mycotoxin zearalenone (ZEN) in edible oil require monitoring by reliable analytical methods. Therefore, an automated SPE-HPLC online system based on dynamic covalent hydrazine chemistry has been developed. The SPE step comprises reversible hydrazone formation between ZEN and a hydrazine moiety covalently attached to a solid phase. Seven hydrazine materials with different properties regarding the resin backbone, pore size, particle size, specific surface area, and loading were evaluated. As a result, a hydrazine-functionalized silica gel was chosen. The final automated online method was validated and applied to the analysis of three maize germ oil samples, including a provisionally certified reference material. Important performance criteria for recovery (70-120%) and precision (RSDr <25%) as set by Commission Regulation (EC) No 401/2006 were fulfilled: the mean recovery was 78% and the RSDr did not exceed 8%. The results of the SPE-HPLC online method were further compared to results obtained by liquid-liquid extraction with stable isotope dilution analysis LC-MS/MS and found to be in good agreement. The developed SPE-HPLC online system with fluorescence detection allows reliable, accurate, and sensitive quantification (limit of quantification, 30 μg/kg) of ZEN in edible oils while significantly reducing the workload. To our knowledge, this is the first report of an automated SPE-HPLC method based on a covalent SPE approach.
An atmosphere protection subsystem in the thermal power station automated process control system
NASA Astrophysics Data System (ADS)
Parchevskii, V. M.; Kislov, E. A.
2014-03-01
The development of methodological and mathematical support for an atmosphere protection subsystem within a thermal power station's automated process control system is considered, taking as an example the problem of controlling nitrogen oxide emissions at a gas- and oil-fired thermal power station. The combined environmental-and-economic characteristics of the boilers, which relate the cost of suppressing emissions to the boiler steam load and the mass discharge of nitrogen oxides in analytic form, are used as the main tool for optimal control. A procedure for constructing and applying these environmental-and-economic characteristics on the basis of the technical facilities available in modern instrumentation and control systems is presented.
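To make the optimal-control idea concrete, here is a minimal sketch that allocates a station's total steam load across boilers to minimize cost while keeping total NOx discharge under a cap; the quadratic cost curves and emission factors are invented placeholders standing in for the paper's environmental-and-economic characteristics.

```python
# Allocate total steam load across three boilers, minimizing operating cost
# subject to a station-wide NOx cap. All coefficients are illustrative.
import numpy as np
from scipy.optimize import minimize

A = np.array([1.0, 1.2, 0.9])        # linear cost coefficients per boiler
B = np.array([0.020, 0.015, 0.030])  # quadratic cost coefficients
E = np.array([0.8, 1.1, 0.7])        # NOx emitted per unit load (placeholder)
DEMAND, NOX_CAP = 300.0, 280.0       # required total load and emission cap

cost = lambda x: float(np.sum(A * x + B * x**2))
res = minimize(cost, x0=np.full(3, DEMAND / 3), bounds=[(50.0, 150.0)] * 3,
               constraints=[{"type": "eq", "fun": lambda x: np.sum(x) - DEMAND},
                            {"type": "ineq", "fun": lambda x: NOX_CAP - np.sum(E * x)}])
print(res.x)  # cost-optimal load distribution that respects the NOx cap
```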
General Staining and Segmentation Procedures for High Content Imaging and Analysis.
Chambers, Kevin M; Mandavilli, Bhaskar S; Dolman, Nick J; Janes, Michael S
2018-01-01
Automated quantitative fluorescence microscopy, also known as high content imaging (HCI), is a rapidly growing analytical approach in cell biology. Because automated image analysis relies heavily on robust demarcation of cells and subcellular regions, reliable methods for labeling cells are a critical component of the HCI workflow. Labeling of cells for image segmentation is typically performed with fluorescent probes that bind DNA, for nuclear-based cell demarcation, or with probes that react with proteins, for image analysis based on whole-cell staining. These reagents, along with instrument and software settings, play an important role in the successful segmentation of cells in a population for automated, quantitative image analysis. In this chapter, we describe standard procedures for labeling and image segmentation in both live and fixed cell samples. The chapter also provides troubleshooting guidelines for some of the common problems associated with these aspects of HCI.
A Task Analytic Process to Define Future Concepts in Aviation
NASA Technical Reports Server (NTRS)
Gore, Brian Francis; Wolter, Cynthia A.
2014-01-01
A necessary step when developing next-generation systems is to understand the tasks that operators will perform. One NextGen concept under evaluation, termed Single Pilot Operations (SPO), is designed to improve the efficiency of airline operations. One SPO concept includes a Pilot on Board (PoB), a Ground Station Operator (GSO), and automation. A number of procedural changes are likely to result when such changes in roles and responsibilities are undertaken. Automation is expected to relieve the PoB and GSO of some tasks (e.g., radio frequency changes, loading expected arrival information). A major difference in the SPO environment is the shift to communication-cued crosschecks (verbal/automated) rather than the movement-cued crosschecks that occur in a shared cockpit. The current article highlights a task analytic process for defining the roles and responsibilities of a PoB, an approach-phase GSO, and automation.
Automation effects in a multiloop manual control system
NASA Technical Reports Server (NTRS)
Hess, R. A.; Mcnally, B. D.
1986-01-01
An experimental and analytical study was undertaken to investigate human interaction with a simple multiloop manual control system in which the human's activity was systematically varied by changing the level of automation. The system simulated was the longitudinal dynamics of a hovering helicopter. The automation systems stabilized vehicle responses from attitude to velocity to position and also provided display automation in the form of a flight director. The control-loop structure resulting from the task definition can be considered a simple stereotype of a hierarchical control system. The experimental study was complemented by an analytical modeling effort which utilized simple crossover models of the human operator. It was shown that such models can be extended to the description of multiloop tasks involving preview and precognitive human operator behavior. The existence of time-optimal manual control behavior was established for these tasks, and the role which internal models may play in establishing human-machine performance was discussed.
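For readers unfamiliar with the crossover model mentioned above, its textbook form (not reproduced from this paper) states that near the crossover frequency the combined human-plus-vehicle open-loop dynamics behave as an integrator with an effective time delay:

```latex
% McRuer crossover model (textbook form): Y_p is the human operator,
% Y_c the controlled element, \omega_c the crossover frequency, and
% \tau_e the effective time delay.
Y_p(j\omega)\,Y_c(j\omega) \approx \frac{\omega_c\, e^{-j\omega\tau_e}}{j\omega},
\qquad \omega \approx \omega_c
```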
Harnessing Scientific Literature Reports for Pharmacovigilance
Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier
2017-01-01
Objectives: We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers’ capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. Methods: A prototype, open source, web-based, software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. Results: All usability test participants cited the tool’s ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool’s automated literature search relative to a manual ‘all fields’ PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Conclusions: Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction. PMID:28326432
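As an illustration of the disproportionality scoring such tools compute, below is a toy sketch of the proportional reporting ratio (PRR) derived from drug/event co-occurrence counts; whether this prototype uses PRR specifically, and the counts shown, are assumptions.

```python
# Proportional reporting ratio (PRR) from a 2x2 co-occurrence table:
#   a = citations indexed with drug AND event   b = drug without event
#   c = event without drug                      d = neither
def prr(a: int, b: int, c: int, d: int) -> float:
    drug_rate = a / (a + b)    # event rate among drug-indexed citations
    other_rate = c / (c + d)   # event rate among all other citations
    return drug_rate / other_rate

# Example: 40 co-mentions, 960 drug-only, 500 event-only, 98500 background.
print(prr(40, 960, 500, 98500))  # ~7.9; PRR > 2 is a common screening threshold
```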
Scheven, U M
2013-12-01
This paper describes a new variant of established stimulated echo pulse sequences, and an analytical method for determining diffusion or dispersion coefficients for Gaussian or non-Gaussian displacement distributions. The unipolar displacement encoding PFGSTE sequence uses trapezoidal gradient pulses of equal amplitude g and equal ramp rates throughout while sampling positive and negative halves of q-space. Usefully, the equal gradient amplitudes and gradient ramp rates help to reduce the impact of experimental artefacts caused by residual amplifier transients, eddy currents, or ferromagnetic hysteresis in components of the NMR magnet. The pulse sequence was validated with measurements of diffusion in water and of dispersion in flow through a packing of spheres. The analytical method introduced here permits the robust determination of the variance of non-Gaussian, dispersive displacement distributions. The noise sensitivity of the analytical method is shown to be negligible, using a demonstration experiment with a non-Gaussian longitudinal displacement distribution, measured on flow through a packing of mono-sized spheres. Copyright © 2013 Elsevier Inc. All rights reserved.
Huber, Robert; Ritter, Daniel; Hering, Till; Hillmer, Anne-Kathrin; Kensy, Frank; Müller, Carsten; Wang, Le; Büchs, Jochen
2009-01-01
Background: In industry and academic research, there is an increasing demand for flexible automated microfermentation platforms with advanced sensing technology. However, up to now, conventional platforms cannot generate continuous data in high-throughput cultivations, in particular for monitoring biomass and fluorescent proteins. Furthermore, microfermentation platforms are needed that can easily combine cost-effective, disposable microbioreactors with downstream processing and analytical assays. Results: To meet this demand, a novel automated microfermentation platform consisting of a BioLector and a liquid-handling robot (Robo-Lector) was successfully built and tested. The BioLector provides a cultivation system that is able to permanently monitor microbial growth and the fluorescence of reporter proteins under defined conditions in microtiter plates. Three exemplary methods were programmed on the Robo-Lector platform to study high-throughput cultivation processes in detail, especially recombinant protein expression. The host/vector system E. coli BL21(DE3) pRhotHi-2-EcFbFP, expressing the fluorescence protein EcFbFP, was investigated. With the method 'induction profiling' it was possible to conduct 96 different induction experiments (varying inducer concentrations from 0 to 1.5 mM IPTG at 8 different induction times) simultaneously in an automated way. The method 'biomass-specific induction' made it possible to automatically induce cultures with different growth kinetics in a microtiter plate at the same biomass concentration, which resulted in a relative standard deviation of EcFbFP production of only ± 7%. The third method, 'biomass-specific replication', enabled the generation of equal initial biomass concentrations in main cultures from precultures with different growth kinetics. This was realized by automatically transferring an appropriate inoculum volume from the different preculture microtiter wells to the respective wells of the main culture plate, where subsequently similar growth kinetics could be obtained. Conclusion: The Robo-Lector generates extensive kinetic data in high-throughput cultivations, particularly for biomass and fluorescence protein formation. Based on the non-invasive on-line monitoring signals, actions of the liquid-handling robot can easily be triggered. This interaction between the robot and the BioLector (Robo-Lector) combines high-content data generation with systematic high-throughput experimentation in an automated fashion, offering new possibilities to study biological production systems. The presented platform uses a standard liquid-handling workstation with widespread automation possibilities. Thus, high-throughput cultivations can now be combined with small-scale downstream processing techniques and analytical assays. Ultimately, this novel versatile platform can accelerate and intensify research and development in the field of systems biology as well as modelling and bioprocess optimization. PMID:19646274
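As a concrete illustration of the 'induction profiling' design described above, the sketch below enumerates 96 unique conditions as an 8 x 12 grid of induction times and IPTG concentrations; the specific concentration and time values are assumptions within the stated 0-1.5 mM range.

```python
# Enumerate a 96-well induction-profiling layout: 8 induction times (rows)
# x 12 IPTG concentrations (columns). Values are illustrative placeholders.
from itertools import product
import string

iptg_mM = [0, 0.01, 0.02, 0.05, 0.1, 0.15, 0.2, 0.3, 0.5, 0.75, 1.0, 1.5]
times_h = [1, 2, 3, 4, 5, 6, 7, 8]   # assumed induction time points (hours)

wells = [f"{r}{c}" for r in string.ascii_uppercase[:8] for c in range(1, 13)]
layout = dict(zip(wells, product(times_h, iptg_mM)))  # 96 unique conditions

print(layout["A1"], layout["H12"])  # (1, 0) ... (8, 1.5)
```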
Santos, R S; Malheiros, S M F; Cavalheiro, S; de Oliveira, J M Parente
2013-03-01
Cancer is the leading cause of death in economically developed countries and the second leading cause of death in developing countries. Malignant brain neoplasms are among the most devastating and incurable forms of cancer, and their treatment may be excessively complex and costly. Public health decision makers require significant amounts of analytical information to manage public treatment programs for these patients. Data mining, a technology that is used to produce analytically useful information, has been employed successfully with medical data. However, the large-scale adoption of this technique has been limited thus far because it is difficult to use, especially for non-expert users. One way to facilitate data mining by non-expert users is to automate the process. Our aim is to present an automated data mining system that allows public health decision makers to access analytical information regarding brain tumors. The emphasis in this study is the use of ontology in an automated data mining process. The non-experts who tried the system obtained useful information about the treatment of brain tumors. These results suggest that future work should be conducted in this area. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
López-Serna, Rebeca; Marín-de-Jesús, David; Irusta-Mata, Rubén; García-Encina, Pedro Antonio; Lebrero, Raquel; Fdez-Polanco, María; Muñoz, Raúl
2018-08-15
The work presented here aimed to develop an analytical method for the simultaneous determination of 22 pharmaceuticals and personal care products, including 3 transformation products, in sewage and sludge. A meticulous method optimization, involving an experimental design, was carried out. The developed method was fully automated and consisted of the online extraction of 17 mL of water sample by Direct Immersion Solid Phase MicroExtraction followed by On-fiber Derivatization coupled to Gas Chromatography - Mass Spectrometry (DI-SPME - On-fiber Derivatization - GC - MS). This methodology was validated for 12 of the initial compounds as a reliable (relative recoveries above 90% for sewage and 70% for sludge; repeatability as %RSD below 10% in all cases), sensitive (LODs below 20 ng L-1 in sewage and 10 ng g-1 in sludge), versatile (sewage and sewage-sludge samples up to 15,000 ng L-1 and 900 ng g-1, respectively) and green analytical alternative for many medium-tech routine laboratories around the world to keep up with both current and forecast environmental regulatory requirements. The remaining 10 analytes initially considered showed insufficient suitability to be included in the final method. The methodology was successfully applied to real samples generated in a pilot-scale sewage treatment reactor. Copyright © 2018 Elsevier B.V. All rights reserved.
The "hospital central laboratory": automation, integration and clinical usefulness.
Zaninotto, Martina; Plebani, Mario
2010-07-01
Recent technological developments in laboratory medicine have led to a major challenge: maintaining a close connection between the search for efficiency, through automation and consolidation, and the assurance of effectiveness. The adoption of systems that automate most of the manual tasks characterizing routine activities has significantly improved the quality of laboratory performance, total laboratory automation being the paradigm of the idea that "human-less" robotic laboratories may allow for better operation and ensure fewer human errors. Furthermore, even if ongoing technological developments have considerably improved the productivity of clinical laboratories as well as reducing the turnaround time of the entire process, the value of qualified personnel remains a significant issue. Recent evidence confirms that automation allows clinical laboratories to improve analytical performance only if trained staff operate in accordance with well-defined standard operating procedures, thus assuring continuous monitoring of analytical quality. In addition, laboratory automation may improve the appropriateness of test requests through the use of algorithms and reflex testing, which should facilitate the adoption of clinical and biochemical guidelines. In conclusion, in laboratory medicine, technology represents a tool for improving clinical effectiveness and patient outcomes, but it has to be managed by qualified laboratory professionals.
Microchannel gel electrophoretic separation systems and methods for preparing and using
Herr, Amy E; Singh, Anup K; Throckmorton, Daniel J
2015-02-24
A micro-analytical platform for performing electrophoresis-based immunoassays was developed by integrating photopolymerized cross-linked polyacrylamide gels within a microfluidic device. The microfluidic immunoassays are performed by gel electrophoretic separation, with analyte concentration quantified by conventional polyacrylamide gel electrophoresis (PAGE). To retain the biological activity of proteins and maintain intact immune complexes, native PAGE conditions were employed. Both direct (non-competitive) and competitive immunoassay formats are demonstrated in microchips for detecting toxins and biomarkers (cytokines, C-reactive protein) in bodily fluids (serum, saliva, oral fluids). Further, a description of gradient gel fabrication is included, describing methods we have developed for further optimization of on-chip PAGE immunoassays. The described chip-based PAGE immunoassay method enables immunoassays that are fast (minutes) and require very small amounts of sample (less than a few microliters). Use of microfabricated chips as a platform enables integration, parallel assays, automation, and the development of portable devices.
Tretzel, Laura; Thomas, Andreas; Piper, Thomas; Hedeland, Mikael; Geyer, Hans; Schänzer, Wilhelm; Thevis, Mario
2016-05-10
Dried blood spots (DBS) represent a sample matrix collected under minimally invasive, straightforward, and robust conditions. DBS specimens have been shown to provide appropriate test material for different analytical disciplines, e.g., preclinical drug development, therapeutic drug monitoring, forensic toxicology, and diagnostic analysis of metabolic disorders in newborns. However, the sample preparation has occasionally been reported as laborious and time consuming. In order to minimize the manual workload and to substantiate the suitability of DBS for high sample throughput, the automation of sample preparation processes is of paramount interest. In the current study, the development and validation of a fully automated DBS extraction method coupled to online solid-phase extraction is presented, using the example of nicotine, its major metabolites nornicotine, cotinine, and trans-3'-hydroxycotinine, and the tobacco alkaloids anabasine and anatabine, based on the rationale that the use of nicotine-containing products for performance-enhancing purposes has been monitored by the World Anti-Doping Agency (WADA) for several years. Automation-derived DBS sample extracts were directed online to liquid chromatography high resolution/high mass accuracy tandem mass spectrometry, and target analytes were determined with the support of four deuterated internal standards. Validation of the method yielded precise (CV <7.5% for intraday and <12.3% for interday measurements) and linear (r(2)>0.998) results. The limit of detection was established at 5 ng mL(-1) for all studied compounds, the extraction recovery ranged from 25 to 44%, and no matrix effects were observed. To exemplify the applicability of the DBS online-SPE LC-MS/MS approach for sports drug testing purposes, the method was applied to authentic DBS samples obtained from smokers, snus users, and e-cigarette users. Statistical evaluation of the obtained results indicated differences in metabolic behavior depending on the route of administration (inhalative versus buccal absorption) in terms of the ratio of nicotine and nornicotine. Copyright © 2016 Elsevier B.V. All rights reserved.
Li, Jie; Fang, Xiangming
2010-01-01
Automated geocoding of patient addresses is an important data assimilation component of many spatial epidemiologic studies. Inevitably, the geocoding process results in positional errors. Positional errors incurred by automated geocoding tend to reduce the power of tests for disease clustering and otherwise affect spatial analytic methods. However, there are reasons to believe that the errors may often be positively spatially correlated and that this may mitigate their deleterious effects on spatial analyses. In this article, we demonstrate explicitly that the positional errors associated with automated geocoding of a dataset of more than 6000 addresses in Carroll County, Iowa are spatially autocorrelated. Furthermore, through two simulation studies of disease processes, including one in which the disease process is overlain upon the Carroll County addresses, we show that spatial autocorrelation among geocoding errors maintains the power of two tests for disease clustering at a level higher than that which would occur if the errors were independent. Implications of these results for cluster detection, privacy protection, and measurement-error modeling of geographic health data are discussed. PMID:20087879
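To illustrate the kind of spatially correlated positional error discussed above, here is a toy sketch that perturbs address locations with errors drawn from an exponential-covariance Gaussian field; the distances, error scale, and correlation range are invented for illustration.

```python
# Simulate geocoding errors that are positively spatially autocorrelated:
# nearby addresses receive similar error vectors. All scales are placeholders.
import numpy as np

rng = np.random.default_rng(0)
pts = rng.uniform(0.0, 10.0, size=(50, 2))             # true locations (km)
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
cov = 0.05**2 * np.exp(-d / 1.0)    # 50 m error sd, 1 km correlation range
err = np.stack([rng.multivariate_normal(np.zeros(len(pts)), cov)
                for _ in range(2)], axis=1)            # correlated x/y errors
geocoded = pts + err                # automated-geocoding-like positions
```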
Ahene, Ago; Calonder, Claudio; Davis, Scott; Kowalchick, Joseph; Nakamura, Takahiro; Nouri, Parya; Vostiar, Igor; Wang, Yang; Wang, Jin
2014-01-01
In recent years, the use of automated sample handling instrumentation has come to the forefront of bioanalytical analysis in order to ensure greater assay consistency and throughput. Since robotic systems are becoming part of everyday analytical procedures, the need for consistent guidance across the pharmaceutical industry has become increasingly important. Pre-existing regulations do not go into sufficient detail regarding the use of robotic systems with analytical methods, especially large molecule bioanalysis. As a result, Global Bioanalytical Consortium (GBC) Group L5 has put forth, in the present white paper, specific recommendations for the validation, qualification, and use of robotic systems as part of large molecule bioanalytical analyses. The guidelines presented can be followed to ensure a consistent, transparent methodology for effectively using and documenting robotic systems in a regulated bioanalytical laboratory setting. This will allow for the consistent use of robotic sample handling instrumentation as part of large molecule bioanalysis across the globe.
Automatic analysis of quantitative NMR data of pharmaceutical compound libraries.
Liu, Xuejun; Kolpak, Michael X; Wu, Jiejun; Leo, Gregory C
2012-08-07
In drug discovery, chemical library compounds are usually dissolved in DMSO at a nominal concentration and then distributed to biologists for target screening. Quantitative (1)H NMR (qNMR) is the preferred method for determining the actual concentrations of compounds because the relative single-proton peak areas of two chemical species represent the relative molar concentrations of the two compounds, that is, the compound of interest and a calibrant. Thus, an analyte concentration can be determined using a calibration compound at a known concentration. One particularly time-consuming step in the qNMR analysis of compound libraries is the manual integration of peaks. This report presents an automated method for performing this task without prior knowledge of compound structures, using an external calibration spectrum. The script for automated integration is fast and adaptable to large-scale data sets, eliminating the need for manual integration in ~80% of the cases.
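A worked sketch of the underlying quantitation relation: the analyte concentration follows from the ratio of per-proton peak areas against an external calibrant of known concentration (ignoring corrections for differing acquisition conditions); all numbers are illustrative.

```python
# qNMR quantitation: normalize each integral by its proton count, then scale
# by the calibrant concentration. Values below are made-up examples.
def qnmr_conc(area_analyte: float, n_h_analyte: int,
              area_cal: float, n_h_cal: int, conc_cal_mM: float) -> float:
    per_proton_analyte = area_analyte / n_h_analyte
    per_proton_cal = area_cal / n_h_cal
    return conc_cal_mM * per_proton_analyte / per_proton_cal

# A 2-proton analyte signal of area 15.2 vs. a 3-proton calibrant signal of
# area 25.0 from a 10 mM external standard:
print(qnmr_conc(15.2, 2, 25.0, 3, 10.0))  # ~9.1 mM
```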
Olivero, Sergio J Pérez; Trujillo, Juan P Pérez
2011-06-24
A new analytical method for the determination of nine short-chain fatty acids (acetic, propionic, isobutyric, butyric, isovaleric, 2-methylbutyric, hexanoic, octanoic, and decanoic acids) in wines using the automated HS/SPME-GC-ITMS technique was developed and optimised. Five different SPME fibers were tested, and the influence of factors such as extraction temperature and time, desorption temperature and time, pH, ionic strength, and the content of tannins, anthocyanins, SO(2), sugar, and ethanol was studied and optimised using model solutions. Some analytes showed a matrix effect, so a study of recoveries was performed. The proposed HS/SPME-GC-ITMS method, which covers the concentration range of the different analytes in wines, showed wide linear ranges, repeatability and reproducibility values below 4.0% RSD, and detection limits between 3 and 257 μgL(-1), lower than the olfactory thresholds. The optimised method is a suitable technique for the quantitative analysis of short-chain fatty acids of the aliphatic series in real samples of white, rosé, and red wines. Copyright © 2011 Elsevier B.V. All rights reserved.
van Soest, Johan; Sun, Chang; Mussmann, Ole; Puts, Marco; van den Berg, Bob; Malic, Alexander; van Oppen, Claudia; Towend, David; Dekker, Andre; Dumontier, Michel
2018-01-01
Conventional data mining algorithms are unable to satisfy the current requirements for analyzing big data in fields such as medicine, policy making, judicial records, and tax records. Combining diverse datasets from different institutes (both healthcare and non-healthcare related) can, however, enrich information and insights. To our knowledge, no approach yet exists for analyzing these data in an automated, privacy-preserving manner. In this work, we propose an infrastructure, and a proof-of-concept, for privacy-preserving analytics on vertically partitioned data.
Piva, Elisa; Tosato, Francesca; Plebani, Mario
2015-12-07
Most errors in laboratory medicine occur in the pre-analytical phase of the total testing process. Phlebotomy, a crucial step in the pre-analytical phase that influences laboratory results and patient outcome, calls for quality assurance procedures and automation in order to prevent errors and ensure patient safety. We compared the performance of a new small, automated device designed for use in phlebotomy with complete traceability of the process, the ProTube Inpeco, with a centralized automated system, BC ROBO. ProTube was used for 15,010 patients undergoing phlebotomy, with 48,776 tubes being labeled. The mean time and standard deviation (SD) for blood sampling was 3:03 (min:sec; SD ± 1:24) when using ProTube, against 5:40 (min:sec; SD ± 1:57) when using BC ROBO. The mean number of patients per hour managed at each phlebotomy point was 16 ± 3 with ProTube, and 10 ± 2 with BC ROBO. No tubes were labeled erroneously or incorrectly, even though process failures occurred in 2.8% of cases when ProTube was used. Thanks to its cutting-edge technology, the ProTube has many advantages over BC ROBO, above all in verifying patient identity and in reducing both identification errors and tube mislabeling.
A new automated turbidimetric immunoassay for the measurement of canine C-reactive protein.
Piñeiro, Matilde; Pato, Raquel; Soler, Lourdes; Peña, Raquel; García, Natalia; Torrente, Carlos; Saco, Yolanda; Lampreave, Fermín; Bassols, Anna; Canalias, Francesca
2018-03-01
In dogs, as in humans, C-reactive protein (CRP) is a major acute phase protein that increases rapidly and prominently after exposure to inflammatory stimuli. CRP measurements are used in the diagnosis and monitoring of infectious and inflammatory diseases. The study aim was to develop and validate a turbidimetric immunoassay for the quantification of canine CRP (cCRP), using canine-specific reagents and standards. A particle-enhanced turbidimetric immunoassay was developed. The assay was set up in a fully automated analyzer, and studies of imprecision, limits of linearity, limits of detection, prozone effects, and interferences were carried out. The new method was compared with 2 other commercially available automated immunoassays for cCRP: one turbidimetric immunoassay (Gentian CRP) and one point-of-care assay based on magnetic permeability (Life Assays CRP). The within-run and between-day imprecision were <1.7% and 4.2%, respectively. The assay quantified CRP proportionally over an analytical range up to 150 mg/L, with a prozone effect appearing at cCRP concentrations >320 mg/L. No interference from hemoglobin (20 g/L), triglycerides (10 g/L), or bilirubin (150 mg/L) was detected. Good agreement was observed between the results obtained with the new method and the Gentian cCRP turbidimetric immunoassay. The new turbidimetric immunoassay (Turbovet canine CRP, Acuvet Biotech) is a rapid, robust, precise, and accurate method for the quantification of cCRP. The method can be easily set up in automated analyzers, providing a suitable tool for routine clinical use. © 2018 American Society for Veterinary Clinical Pathology.
NASA Astrophysics Data System (ADS)
Brassard, D.; Clime, L.; Daoud, J.; Geissler, M.; Malic, L.; Charlebois, D.; Buckley, N.; Veres, T.
2018-02-01
We present an innovative centrifugal microfluidic universal platform for the remote automation of bio-analytical assays required in life-sciences research and medical applications, including the purification and analysis of cellular and circulating markers from body fluids.
Biopharmaceutical production: Applications of surface plasmon resonance biosensors.
Thillaivinayagalingam, Pranavan; Gommeaux, Julien; McLoughlin, Michael; Collins, David; Newcombe, Anthony R
2010-01-15
Surface plasmon resonance (SPR) permits the quantitative analysis of therapeutic antibody concentrations and of impurities including bacteria, Protein A, Protein G, and small-molecule ligands leached from chromatography media. SPR has gained popularity within the biopharmaceutical industry due to the automated, label-free, real-time interaction analysis it provides. Application areas in assessing protein interactions and developing analytical methods for biopharmaceutical downstream process development, quality control, and in-process monitoring are reviewed. Copyright © 2009 Elsevier B.V. All rights reserved.
[Point of Care 2.0: Coagulation Monitoring Using Rotem® Sigma and Teg® 6S].
Weber, Christian Friedrich; Zacharowski, Kai
2018-06-01
New-generation methods for point-of-care coagulation monitoring enable fully automated viscoelastic analyses for the assessment of particular parts of hemostasis. In contrast to the measuring techniques of earlier models, the viscoelastic ROTEM® sigma and TEG® 6s analyses are performed in single-use test cartridges without time- and personnel-intensive pre-analytical procedures. This review highlights the methodical strengths and limitations of the devices and addresses concerns associated with their integration into routine clinical practice. Georg Thieme Verlag KG Stuttgart · New York.
An automated real-time free phenytoin assay to replace the obsolete Abbott TDx method.
Williams, Christopher; Jones, Richard; Akl, Pascale; Blick, Kenneth
2014-01-01
Phenytoin is a commonly used anticonvulsant that is highly protein bound with a narrow therapeutic range. The unbound fraction, free phenytoin (FP), is responsible for pharmacologic effects; therefore, it is essential to measure both FP and total serum phenytoin levels. Historically, the Abbott TDx method has been widely used for the measurement of FP and was the method used in our laboratory. However, the FP TDx assay was recently discontinued by the manufacturer, so we had to develop an alternative methodology. We evaluated the Beckman-Coulter DxC800-based FP method for linearity, analytical sensitivity, and precision. The analytical measurement range of the method was 0.41 to 5.30 microg/mL. Within-run and between-run precision studies yielded CVs of 3.8% and 5.5%, respectively. The method compared favorably with the TDx method, yielding the following regression equation: DxC800 = 0.9 × TDx + 0.10; r2 = 0.97 (n = 97). The new FP assay appears to be an acceptable alternative to the TDx method.
NASA Astrophysics Data System (ADS)
Mednova, Olga; Kirsanov, Dmitry; Rudnitskaya, Alisa; Kilmartin, Paul; Legin, Andrey
2009-05-01
The present study deals with a potentiometric electronic tongue (ET) multisensor system applied for the simultaneous determination of several chemical parameters for white wines produced in New Zealand. Methods in use for wine quality control are often expensive and require considerable time and skilled operation. The ET approach usually offers a simple and fast measurement protocol and allows automation for on-line analysis under industrial conditions. The ET device developed in this research is capable of quantifying the free and total SO2 content, total acids and some polyphenolic compounds in white wines with acceptable analytical errors.
Microfluidic-Based Robotic Sampling System for Radioactive Solutions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jack D. Law; Julia L. Tripp; Tara E. Smith
A novel microfluidic based robotic sampling system has been developed for sampling and analysis of liquid solutions in nuclear processes. This system couples the use of a microfluidic sample chip with a robotic system designed to allow remote, automated sampling of process solutions in-cell and facilitates direct coupling of the microfluidic sample chip with analytical instrumentation. This system provides the capability for near real time analysis, reduces analytical waste, and minimizes the potential for personnel exposure associated with traditional sampling methods. A prototype sampling system was designed, built and tested. System testing demonstrated operability of the microfluidic based sample system and identified system modifications to optimize performance.
Yeung, Joanne Chung Yan; de Lannoy, Inés; Gien, Brad; Vuckovic, Dajana; Yang, Yingbo; Bojko, Barbara; Pawliszyn, Janusz
2012-09-12
In vivo solid-phase microextraction (SPME) can be used to sample the circulating blood of animals without the need to withdraw a representative blood sample. In this study, in vivo SPME in combination with liquid chromatography-tandem mass spectrometry (LC-MS/MS) was used to determine the pharmacokinetics of two drug analytes, R,R-fenoterol and R,R-methoxyfenoterol, administered as 5 mg kg(-1) i.v. bolus doses to groups of 5 rats. This research illustrates, for the first time, the feasibility of the diffusion-based calibration interface model for in vivo SPME studies. To provide the constant sampling rate required by the diffusion-based interface model, partial automation of the SPME sampling of the analytes from the circulating blood was accomplished using an automated blood sampling system. The use of the blood sampling system allowed automation of all SPME sampling steps in vivo, except for the insertion and removal of the SPME probe from the sampling interface. The results from in vivo SPME were compared to those of the conventional method based on blood withdrawal and sample clean-up by plasma protein precipitation. Both whole blood and plasma concentrations were determined by the conventional method. The concentrations of methoxyfenoterol and fenoterol obtained by SPME generally concur with the whole blood concentrations determined by the conventional method, indicating the utility of the proposed method. The proposed diffusion-based interface model has several advantages over other kinetic calibration models for in vivo SPME sampling: (i) it does not require the addition of a standard into the sample matrix during in vivo studies, (ii) it is simple and rapid and eliminates the need to pre-load an appropriate standard onto the SPME extraction phase, and (iii) the calibration constant for SPME can be calculated from the diffusion coefficient, extraction time, fiber length and radius, and size of the boundary layer. In the current study, the experimental calibration constants of 338.9±30 mm(-3) and 298.5±25 mm(-3) are in excellent agreement with the theoretical calibration constants of 307.9 mm(-3) and 316.0 mm(-3) for fenoterol and methoxyfenoterol, respectively. Copyright © 2012 Elsevier B.V. All rights reserved.
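A sketch of the diffusion-based calibration constant referenced above, assuming the standard steady-state boundary-layer form C = nB with B = ln((b + δ)/b) / (2πDLt); the symbols (fiber radius b, boundary-layer thickness δ, coating length L, diffusion coefficient D, extraction time t) follow the common SPME formulation, and the example values are placeholders rather than this study's parameters.

```python
# Diffusion-based SPME calibration constant B (units mm^-3 when lengths are in
# mm, D in mm^2/s, and t in s), so that C = n * B converts the extracted
# amount n into a concentration. Example values are placeholders.
import math

def calibration_constant(b_mm: float, delta_mm: float, L_mm: float,
                         D_mm2_s: float, t_s: float) -> float:
    return math.log((b_mm + delta_mm) / b_mm) / (2 * math.pi * D_mm2_s * L_mm * t_s)

print(calibration_constant(b_mm=0.1, delta_mm=0.05, L_mm=15.0,
                           D_mm2_s=5e-4, t_s=120))  # illustrative B in mm^-3
```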
Quantification of six herbicide metabolites in human urine.
Norrgran, Jessica; Bravo, Roberto; Bishop, Amanda M; Restrepo, Paula; Whitehead, Ralph D; Needham, Larry L; Barr, Dana B
2006-01-18
We developed a sensitive, selective and precise method for measuring herbicide metabolites in human urine. Our method uses automated liquid delivery of internal standards and acetate buffer, followed by mixed-polarity polymeric solid-phase extraction of a 2 mL urine sample. The concentrated eluate is analyzed using high-performance liquid chromatography-tandem mass spectrometry. Isotope dilution calibration is used for quantification of all analytes. The limits of detection of our method range from 0.036 to 0.075 ng/mL. The within- and between-day variation in pooled quality control samples ranges from 2.5 to 9.0% and from 3.2 to 16%, respectively, for all analytes at concentrations ranging from 0.6 to 12 ng/mL. Precision was similar for samples fortified with 0.1 and 0.25 ng/mL that were analyzed in each run. We validated our selective method against a less selective method used previously in our laboratory by analyzing human specimens using both methods. The methods produced results that were in agreement, with no significant bias observed.
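Isotope dilution calibration of the kind described reduces to regressing analyte/internal-standard peak-area ratios on calibrator concentrations and reading unknowns off the fitted line. A minimal Python sketch, with all names and values invented for illustration:

```python
import numpy as np

# Hedged sketch of isotope-dilution quantification: the analyte/internal-standard
# peak-area ratio is regressed against calibrator concentrations, and unknowns
# are interpolated from the fitted line. All numbers below are illustrative.
cal_conc = np.array([0.1, 0.5, 1.0, 5.0, 12.0])           # calibrators, ng/mL
cal_ratio = np.array([0.021, 0.102, 0.198, 1.01, 2.43])   # analyte/IS area ratios

slope, intercept = np.polyfit(cal_conc, cal_ratio, 1)     # linear calibration fit

def quantify(area_analyte: float, area_is: float) -> float:
    """Return concentration (ng/mL) from analyte and internal-standard peak areas."""
    ratio = area_analyte / area_is
    return (ratio - intercept) / slope

print(f"{quantify(5.1e4, 2.6e5):.3f} ng/mL")
```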
Hellmuth, Christian; Weber, Martina; Koletzko, Berthold; Peissner, Wolfgang
2012-02-07
Despite their central importance for lipid metabolism, straightforward quantitative methods for determination of nonesterified fatty acid (NEFA) species are still missing. The protocol presented here provides unbiased quantitation of plasma NEFA species by liquid chromatography-tandem mass spectrometry (LC-MS/MS). Simple deproteination of plasma in organic solvent solution yields high accuracy, including both the unbound and initially protein-bound fractions, while avoiding interferences from hydrolysis of esterified fatty acids from other lipid classes. Sample preparation is fast and inexpensive, hence well suited for automation and high-throughput applications. Separation of isotopologic NEFA is achieved using ultrahigh-performance liquid chromatography (UPLC) coupled to triple quadrupole LC-MS/MS detection. In combination with automated liquid handling, total assay time per sample is less than 15 min. The analytical spectrum extends beyond readily available NEFA standard compounds by a regression model predicting all the relevant analytical parameters (retention time, ion path settings, and response factor) of NEFA species based on chain length and number of double bonds. Detection of 50 NEFA species and accurate quantification of 36 NEFA species in human plasma are described, the highest numbers ever reported for an LC-MS application. Accuracy and precision are within widely accepted limits. The use of qualifier ions supports unequivocal analyte verification. © 2012 American Chemical Society
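The regression model mentioned above predicts analytical parameters from chain length and double-bond count. A minimal sketch of that idea, assuming a simple linear form with invented training values (the published model's actual form and coefficients are not reproduced here):

```python
import numpy as np

# Hedged sketch: predict an analytical parameter (here, retention time) of a
# NEFA species from its carbon chain length and number of double bonds.
# The linear form and the training values are illustrative assumptions.
X = np.array([[14, 0], [16, 0], [16, 1], [18, 0], [18, 1], [18, 2], [20, 4]], float)
rt = np.array([5.1, 6.3, 5.8, 7.5, 7.0, 6.5, 5.9])  # retention times, min (invented)

# Design matrix with intercept column; solve ordinary least squares.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, rt, rcond=None)

def predict_rt(chain_length: int, double_bonds: int) -> float:
    """Predicted retention time (min) for a NEFA species."""
    return coef[0] + coef[1] * chain_length + coef[2] * double_bonds

print(f"Predicted RT for C22:6 ~ {predict_rt(22, 6):.2f} min")
```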
Tarumi, Toshiyasu; Small, Gary W; Combs, Roger J; Kroutil, Robert T
2004-04-01
Finite impulse response (FIR) filters and finite impulse response matrix (FIRM) filters are evaluated for use in the detection of volatile organic compounds with wide spectral bands by direct analysis of interferogram data obtained from passive Fourier transform infrared (FT-IR) measurements. Short segments of filtered interferogram points are classified by support vector machines (SVMs) to implement the automated detection of heated plumes of the target analyte, ethanol. The interferograms employed in this study were acquired with a downward-looking passive FT-IR spectrometer mounted on a fixed-wing aircraft. Classifiers are trained with data collected on the ground and subsequently used for the airborne detection. The success of the automated detection depends on the effective removal of background contributions from the interferogram segments. Removing the background signature is complicated when the analyte spectral bands are broad because there is significant overlap between the interferogram representations of the analyte and background. Methods to implement the FIR and FIRM filters while excluding background contributions are explored in this work. When properly optimized, both filtering procedures provide satisfactory classification results for the airborne data. Missed detection rates of 8% or smaller for ethanol and false positive rates of at most 0.8% are realized. The optimization of filter design parameters, the starting interferogram point for filtering, and the length of the interferogram segments used in the pattern recognition is discussed.
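A minimal sketch of the filter-then-classify pipeline described above, assuming a generic bandpass FIR design, a fixed segment window, and synthetic interferograms (the paper's actual filter parameters and training data are not reproduced):

```python
import numpy as np
from scipy.signal import firwin, lfilter
from sklearn.svm import SVC

# Hedged sketch: a bandpass FIR filter is applied to raw interferograms, a
# short segment of the filtered points is taken as the feature vector, and an
# SVM labels it analyte vs. background. Filter design, segment location, and
# the synthetic signals are illustrative assumptions.
rng = np.random.default_rng(0)
taps = firwin(numtaps=31, cutoff=[0.05, 0.10], pass_zero=False)  # bandpass FIR

def interferogram(has_analyte: bool, n: int = 512) -> np.ndarray:
    x = rng.normal(0.0, 1.0, n)                                  # background
    if has_analyte:                                              # analyte signature
        x += 0.6 * np.sin(2 * np.pi * 0.035 * np.arange(n))      # in the passband
    return x

def segment_features(x: np.ndarray) -> np.ndarray:
    filtered = lfilter(taps, 1.0, x)
    return filtered[100:140]                                     # short filtered segment

X = np.array([segment_features(interferogram(i % 2 == 1)) for i in range(200)])
y = np.array([i % 2 for i in range(200)])

clf = SVC(kernel="rbf").fit(X[:150], y[:150])
print("held-out accuracy:", clf.score(X[150:], y[150:]))
```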
Ates, E; Mittendorf, K; Stroka, J; Senyuva, H
2013-01-01
An automated method involving on-line clean-up and analytical separation in a single run, using TurboFlow™ reversed-phase liquid chromatography coupled to a high-resolution mass spectrometer, has been developed for the simultaneous determination of deoxynivalenol, T2 toxin, HT2 toxin, zearalenone and fumonisins B1 and B2 in maize, wheat and animal feed. Detection was performed in full-scan mode at a resolution of R = 100,000 full width at half maximum, with high-energy collision cell dissociation for the determination of fragment ions with a mass accuracy below 5 ppm. The extract from homogenised samples, obtained after blending with a mixture of 0.1% aqueous formic acid/acetonitrile (43:57) for 45 min, was injected directly onto the TurboFlow™ (TLX) column for automated on-line clean-up, followed by analytical separation and accurate mass detection. The TurboFlow™ column enabled specific binding of target mycotoxins, whereas higher molecular weight compounds, such as fats, proteins and other interferences with different chemical properties, were diverted to waste. Single-laboratory method validation was performed by spiking blank materials with mycotoxin standards. Recovery and repeatability were determined by spiking at three concentration levels (50, 100 and 200% of legislative limits) with six replicates. Average recovery, relative standard deviation and intermediate precision values were 71 to 120%, 1 to 19% and 4 to 19%, respectively. Method accuracy was confirmed with certified reference materials and participation in proficiency testing.
Lee, Wonmok; Ha, Jung-Sook; Ryoo, Nam-Hee
2016-09-01
The cobas u 701, a new automated image-based urine sediment analyzer, was introduced recently. In this study, we compared its performance with that of UF-1000i flow cytometry and manual microscopy in the examination of urine sediments. Precision, linearity, and carry-over were determined for the two urine sediment analyzers. For the method comparison, 300 urine samples were examined by the automated analyzers and by manual microscopy using a KOVA chamber. Within-run coefficients of variation (CVs) for the control materials were 7.0-8.8% and 1.7-5.7% for the cobas u 701 and UF-1000i systems, respectively. Between-run CVs were 8.5-9.8% and 2.7-5.4%, respectively. Both instruments showed good linearity and negligible carry-over. For red blood cells (RBC), white blood cells (WBC), and epithelial cells (EPI), the overall concordance rates within one grade of difference among the three methods were good (78.6-86.0%, 88.7-93.8%, and 81.3-90.7%, respectively). The concordance rate for casts was poor (66.5-68.9%). Compared with manual microscopy, the two automated sediment analyzers tested in this study showed satisfactory analytical performance for RBC, WBC, and EPI. However, for other urine sediment particles, confirmation by visual microscopy is still required. © 2016 Wiley Periodicals, Inc.
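The "concordance within one grade" statistic above counts method pairs whose ordinal sediment grades differ by at most one step. A small sketch with invented grades:

```python
import numpy as np

# Hedged sketch of the concordance-within-one-grade statistic: two methods
# report ordinal grades, and a result is concordant when the grades differ by
# at most one step. The grades below are invented for illustration.
grades_a = np.array([0, 1, 2, 3, 1, 0, 2, 4, 3, 1])   # e.g. automated analyzer
grades_b = np.array([0, 2, 2, 1, 1, 0, 3, 4, 3, 2])   # e.g. manual microscopy

concordant = np.abs(grades_a - grades_b) <= 1
print(f"concordance within one grade: {100 * concordant.mean():.1f}%")
```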
Merlos Rodrigo, Miguel Angel; Krejcova, Ludmila; Kudr, Jiri; Cernei, Natalia; Kopel, Pavel; Richtera, Lukas; Moulick, Amitava; Hynek, David; Adam, Vojtech; Stiborova, Marie; Eckschlager, Tomas; Heger, Zbynek; Zitka, Ondrej
2016-12-15
Metallothioneins (MTs) are involved in heavy metal detoxification in a wide range of living organisms. It is now well established that MTs play a substantial role in many pathophysiological processes, including carcinogenesis, and that they can serve as diagnostic biomarkers. In order to increase the applicability of MT in cancer diagnostics, an easy-to-use and rapid method for its detection is required. Hence, the aim of this study was to develop a fully automated and high-throughput assay for the estimation of MT levels. Here, we report the optimal conditions for the isolation of MTs from rabbit liver and their characterization using MALDI-TOF MS. In addition, we describe a two-step assay that starts with isolation of the protein using functionalized paramagnetic particles and finishes with its electrochemical analysis. The designed easy-to-use, cost-effective, error-free and fully automated procedure for the isolation of MT, coupled with a simple analytical detection method, can provide a prototype for the construction of a diagnostic instrument appropriate for monitoring carcinogenesis or MT-related chemoresistance of tumors. Copyright © 2016 Elsevier B.V. All rights reserved.
Principles of Automation for Patient Safety in Intensive Care: Learning From Aviation.
Dominiczak, Jason; Khansa, Lara
2018-06-01
The transition away from written documentation and analog methods has opened up the possibility of leveraging data science and analytic techniques to improve health care. High-acuity patients in the ICU stand to benefit particularly from the implementation of data science techniques and methodologies. The Principles of Automation for Patient Safety in Intensive Care (PAPSIC) framework draws on Billings's principles of human-centered aviation (HCA) automation and helps in identifying the advantages, pitfalls, and unintended consequences of automation in health care. Billings's HCA principles are based on the premise that human operators must remain "in command," so that they are continuously informed and actively involved in all aspects of system operations. In addition, automated systems need to be predictable; simple to train on, learn, and operate; and able to monitor the human operators, and every intelligent system element must know the intent of the other intelligent system elements. In applying Billings's HCA principles to the ICU setting, PAPSIC has three key characteristics: (1) integration and better interoperability, (2) multidimensional analysis, and (3) enhanced situation awareness. PAPSIC suggests that health care professionals reduce overreliance on automation and implement "cooperative automation," and that vendors reduce mode errors and embrace interoperability. Much can be learned from the aviation industry in automating the ICU. Because it combines "smart" technology with the necessary controls to withstand unintended consequences, PAPSIC could help ensure more informed decision making in the ICU and better patient care. Copyright © 2018 The Joint Commission. Published by Elsevier Inc. All rights reserved.
Pereira, Jorge; Câmara, José S; Colmsjö, Anders; Abdel-Rehim, Mohamed
2014-06-01
Sample preparation is an important analytical step involving the isolation and concentration of desired components from complex matrices, and it greatly influences their reliable and accurate analysis and data quality. It is the most labor-intensive and error-prone process in analytical methodology and may therefore limit the analytical performance of target-analyte quantification. Many conventional sample preparation methods are relatively complicated, involving time-consuming procedures and requiring large volumes of organic solvents. Recent trends in sample preparation include miniaturization, automation, high-throughput performance, on-line coupling with analytical instruments, and low-cost operation through extremely low or no solvent consumption. Micro-extraction techniques, such as micro-extraction by packed sorbent (MEPS), have these advantages over the traditional techniques. This paper gives an overview of the MEPS technique, including the role of sample preparation in bioanalysis, a description of MEPS (its on- and off-line formats, sorbents, and experimental protocols), factors that affect MEPS performance, and the major advantages and limitations of MEPS compared with other sample preparation techniques. We also summarize recent MEPS applications in bioanalysis. Copyright © 2014 John Wiley & Sons, Ltd.
Human Papillomavirus (HPV) Genotyping: Automation and Application in Routine Laboratory Testing
Torres, M; Fraile, L; Echevarria, JM; Hernandez Novoa, B; Ortiz, M
2012-01-01
A large number of assays designed for genotyping human papillomaviruses (HPV) have been developed in recent years. They perform within a wide range of analytical sensitivity and specificity values for the different viral types and are used for diagnosis, epidemiological studies, vaccine evaluation, and the implementation and monitoring of vaccination programs. Methods for specific genotyping of HPV-16 and HPV-18 are also useful for the prevention of cervical cancer in screening programs. Some commercial tests are, in addition, fully or partially automated. Automation of HPV genotyping presents advantages such as the simplicity of the testing procedure for the operator, the ability to process a large number of samples in a short time, and the reduction of human errors from manual operations, allowing better quality assurance and reduced costs. The present review collects information about current HPV genotyping tests, with special attention to practical aspects influencing their use in clinical laboratories. PMID:23248734
Unmanned Mine of the 21st Century
NASA Astrophysics Data System (ADS)
Semykina, Irina; Grigoryev, Aleksandr; Gargayev, Andrey; Zavyalov, Valeriy
2017-11-01
This analytical article considers the design principles of an automation system architecture that realizes the concept of the «unmanned mine». These principles address problems caused by the continuously increasing complexity of mining and geological conditions at coal mines, such as labor safety and health protection, the weak integration of different mining automation subsystems, and the lack of an optimal balance between the resources and energy consumed by mining machines and their throughput. The authors describe the main problems and bottleneck stages in making mining machines autonomous and in automating their subsystems. The article surveys applied «unmanned technologies» in mining, such as remotely operated autonomous complexes and underground positioning systems for mining machines using infrared radiation in mine workings. The concept of the «unmanned mine» is illustrated with the example of a robotic road-heading machine. Finally, the authors analyze techniques and methods that could accomplish underground mining without human labor.
Saleh, Lanja; Mueller, Daniel; von Eckardstein, Arnold
2016-04-01
We evaluated the analytical and clinical performance of the new Lumipulse® G 25-OH vitamin D assay from Fujirebio and compared it to a liquid chromatography-tandem mass spectrometry (LC-MS/MS) method and three other commercial automated assays. Total 25-hydroxyvitamin D (25(OH)D) levels were measured in 100 selected serum samples from our routine analysis with the Fujirebio 25(OH)D assay. The results were compared with those obtained with LC-MS/MS and three other automated 25(OH)D assays (Abbott, Beckman, and Roche). The accuracy of each assay tested was evaluated against a Labquality reference serum panel for 25(OH)D (Ref!25OHD; University of Ghent). Intra- and inter-day imprecision of the Fujirebio 25(OH)D assay was <5%. The Fujirebio 25(OH)D assay showed the highest correlation with the LC-MS/MS method among the assays tested (R=0.986). The mean relative bias versus LC-MS/MS was -15.6% (Fujirebio), -12.7% (Beckman), -2.1% (Abbott) and 9.7% (Roche). Comparison with the Labquality certified reference serum panel yielded a mean bias of -11.8% (Fujirebio), -14.1% (Beckman), 4.4% (Abbott) and 3.2% (Roche), respectively. Compared to LC-MS/MS, the sensitivity of the different methods in detecting vitamin D insufficiency (<50 nmol/L) varied from 100% for the Fujirebio assay to 72.7% for Roche, and specificity ranged from 94.4% for Roche to 87.6% for Beckman. The Lumipulse G 25-OH vitamin D assay from Fujirebio demonstrated good correlation with LC-MS/MS and some immunoassays. The performance of the assay is well suited for routine 25(OH)D measurement in clinical serum samples. A correction for the observed negative bias vs. LC-MS/MS could be considered.
Lehmann, Sabrina; Kieliba, Tobias; Beike, Justus; Thevis, Mario; Mercer-Chalmers-Bender, Katja
2017-10-01
A detailed description is given of the development and validation of a fully automated in-line solid-phase extraction-liquid chromatography-tandem mass spectrometry (SPE-LC-MS/MS) method capable of detecting 90 central-stimulating new psychoactive substances (NPS) and 5 conventional amphetamine-type stimulants (amphetamine, 3,4-methylenedioxy-methamphetamine (MDMA), 3,4-methylenedioxy-amphetamine (MDA), 3,4-methylenedioxy-N-ethyl-amphetamine (MDEA), methamphetamine) in serum. The aim was to apply the validated method to forensic samples. The preparation of 150 μL of serum was performed by an Instrument Top Sample Preparation (ITSP)-SPE with mixed mode cation exchanger cartridges. The extracts were directly injected into an LC-MS/MS system, using a biphenyl column and gradient elution with 2 mM ammonium formate/0.1% formic acid and acetonitrile/0.1% formic acid as mobile phases. The chromatographic run time amounts to 9.3 min (including re-equilibration). The total cycle time is 11 min, due to the interlacing between sample preparation and analysis. The method was fully validated using 69 NPS and five conventional amphetamine-type stimulants, according to the guidelines of the Society of Toxicological and Forensic Chemistry (GTFCh). The guidelines were fully achieved for 62 analytes (with a limit of detection (LOD) between 0.2 and 4 μg/L), whilst full validation was not feasible for the remaining 12 analytes. For the fully validated analytes, the method achieved linearity in the 5 μg/L (lower limit of quantification, LLOQ) to 250 μg/L range (coefficients of determination > 0.99). Recoveries for 69 of these compounds were greater than 50%, with relative standard deviations ≤ 15%. The validated method was then tested for its capability in detecting a further 21 NPS, thus totalling 95 tested substances. An LOD between 0.4 and 1.6 μg/L was obtained for these 21 additional qualitatively-measured substances. The method was subsequently successfully applied to 28 specimens from routine forensic case work, of which 7 samples were determined to be positive for NPS consumption. Copyright © 2017 Elsevier B.V. All rights reserved.
Advances in aptamer screening and small molecule aptasensors.
Kim, Yeon Seok; Gu, Man Bock
2014-01-01
It has been 20 years since aptamers and SELEX (systematic evolution of ligands by exponential enrichment) were described independently by Andrew Ellington and Larry Gold. Owing to their great advantages, numerous aptamers have been isolated for various targets and have been actively applied as therapeutic and analytical tools. Over 2,000 papers related to aptamers or SELEX have been published, attesting to the wide usefulness and applicability of aptamers. SELEX methods have been modified or re-created over the years to enable aptamer isolation with higher affinity and selectivity in more labor- and time-efficient manners, including automation. Initially, most aptamer studies focused on protein targets with physiological functions in the body and on their applications as therapeutic agents or receptors for diagnostics. However, aptamers for small molecules such as organic or inorganic compounds, drugs, antibiotics, or metabolites have not been studied sufficiently, despite the ever-increasing need for rapid and simple analytical methods for various chemical targets in medical diagnostics, environmental monitoring, food safety, and national defense, including against chemical warfare agents. This review focuses not only on recent advances in aptamer screening methods but also on their analytical application to small molecules.
Muto, Satoru; Sugiura, Syo-Ichiro; Nakajima, Akiko; Horiuchi, Akira; Inoue, Masahiro; Saito, Keisuke; Isotani, Shuji; Yamaguchi, Raizo; Ide, Hisamitsu; Horie, Shigeo
2014-10-01
We aimed to identify patients with a chief complaint of hematuria who could safely avoid unnecessary radiation and instrumentation in the diagnosis of bladder cancer (BC), using automated urine flow cytometry to detect isomorphic red blood cells (RBCs) in urine. We acquired urine samples from 134 patients over the age of 35 years with a chief complaint of hematuria and a positive urine occult blood test or microhematuria. The data were analyzed using the UF-1000i® (Sysmex Co., Ltd., Kobe, Japan) automated urine flow cytometer to determine RBC morphology, which was classified as isomorphic or dysmorphic. The patients were divided into two groups (BC versus non-BC) for statistical analysis. Multivariate logistic regression analysis was used to determine the predictive value of flow cytometry versus urine cytology, the bladder tumor antigen test, occult blood in urine test, and microhematuria test. BC was confirmed in 26 of 134 patients (19.4 %). The area under the curve for RBC count using the automated urine flow cytometer was 0.94, representing the highest reference value obtained in this study. Isomorphic RBCs were detected in all patients in the BC group. On multivariate logistic regression analysis, only isomorphic RBC morphology was significantly predictive for BC (p < 0.001). Analytical parameters such as sensitivity, specificity, positive predictive value, and negative predictive value of isomorphic RBCs in urine were 100.0, 91.7, 74.3, and 100.0%, respectively. Detection of urinary isomorphic RBCs using automated urine flow cytometry is a reliable method in the diagnosis of BC with hematuria.
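A sketch of the analysis pattern described above (multivariate logistic regression with the ROC area under the curve as figure of merit), using synthetic data rather than the study's patients:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Hedged sketch: logistic regression relating candidate urine markers to
# bladder-cancer status, scored by ROC AUC. All data are synthetic and the
# coefficients are invented so that isomorphic RBCs dominate, as reported.
rng = np.random.default_rng(1)
n = 134
isomorphic = rng.integers(0, 2, n)              # isomorphic RBCs present (0/1)
cytology = rng.integers(0, 2, n)                # positive urine cytology (0/1)
p = 1 / (1 + np.exp(-(-3.0 + 3.5 * isomorphic + 0.3 * cytology)))
bc = rng.binomial(1, p)                         # simulated bladder-cancer status

X = np.column_stack([isomorphic, cytology])
model = LogisticRegression().fit(X, bc)
print("coefficients:", model.coef_)
print("AUC:", roc_auc_score(bc, model.predict_proba(X)[:, 1]))
```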
1998 Technology Showcase. JOAP International Condition Monitoring Conference.
1998-04-01
Contents include "Systems using Automated SEM/EDX and New Diagnostic Routines" (N. W. Farrant & T. Luckhurst) and "Model-Based Diagnostics of Gas..." among the advanced diagnostic systems papers; topics cover Scanning Electron Microscopy with Energy Dispersive X-Ray (SEM/EDX) microanalysis packages, Energy Dispersive X-Ray Fluorescence (EDXRF) analytical equipment, and wear particles separated by the ferrogram method.
Fast targeted analysis of 132 acidic and neutral drugs and poisons in whole blood using LC-MS/MS.
Di Rago, Matthew; Saar, Eva; Rodda, Luke N; Turfus, Sophie; Kotsos, Alex; Gerostamoulos, Dimitri; Drummer, Olaf H
2014-10-01
The aim of this study was to develop an LC-MS/MS based screening technique that covers a broad range of acidic and neutral drugs and poisons by combining a small sample volume and efficient extraction technique with simple automated data processing. After protein precipitation of 100 μL of whole blood, 132 common acidic and neutral drugs and poisons including non-steroidal anti-inflammatory drugs, barbiturates, anticonvulsants, antidiabetics, muscle relaxants, diuretics and superwarfarin rodenticides (47 quantitated, 85 reported as detected) were separated using a Shimadzu Prominence HPLC system with a C18 separation column (Kinetex XB-C18, 4.6 mm × 150 mm, 5 μm), using gradient elution with a mobile phase of 25 mM ammonium acetate buffer (pH 7.5)/acetonitrile. The drugs were detected using an AB Sciex® API 2000 LC-MS/MS system (ESI+ and -, MRM mode, two transitions per analyte). The method was fully validated in accordance with international guidelines. Quantification data obtained using one-point calibration compared favorably to that using multiple calibrants. The presented LC-MS/MS assay has proven to be applicable for determination of the analytes in blood. The fast and reliable extraction method combined with automated processing gives the opportunity for high throughput and fast turnaround times for forensic and clinical toxicology. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Bladergroen, Marco R.; van der Burgt, Yuri E. M.
2015-01-01
For large-scale and standardized applications in mass spectrometry- (MS-) based proteomics, automation of each step is essential. Here we present high-throughput sample preparation solutions for balancing the speed of current MS acquisitions and the time needed for analytical workup of body fluids. The discussed workflows reduce body fluid sample complexity and apply to both bottom-up proteomics experiments and top-down protein characterization approaches. Various sample preparation methods that involve solid-phase extraction (SPE), including affinity enrichment strategies, have been automated. Obtained peptide and protein fractions can be mass analyzed by direct infusion into an electrospray ionization (ESI) source or by means of matrix-assisted laser desorption ionization (MALDI), without further need of time-consuming liquid chromatography (LC) separations. PMID:25692071
Sampson, Maureen M.; Chambers, David M.; Pazo, Daniel Y.; Moliere, Fallon; Blount, Benjamin C.; Watson, Clifford H.
2015-01-01
Quantifying volatile organic compounds (VOCs) in cigarette smoke is necessary to establish smoke-related exposure estimates and evaluate emerging products and potential reduced-exposure products. In response to this need, we developed an automated, multi-VOC quantification method for machine-generated, mainstream cigarette smoke using solid-phase microextraction gas chromatography–mass spectrometry (SPME-GC–MS). This method was developed to simultaneously quantify a broad range of smoke VOCs (i.e., carbonyls and volatiles, which historically have been measured by separate assays) for large exposure assessment studies. Our approach collects and maintains vapor-phase smoke in a gas sampling bag, where it is homogenized with isotopically labeled analogue internal standards and sampled using gas-phase SPME. High throughput is achieved by SPME automation using a CTC Analytics platform and custom bag tray. This method has successfully quantified 22 structurally diverse VOCs (e.g., benzene and associated monoaromatics, aldehydes and ketones, furans, acrylonitrile, 1,3-butadiene, vinyl chloride, and nitromethane) in the microgram range in mainstream smoke from 1R5F and 3R4F research cigarettes smoked under ISO (Cambridge Filter or FTC) and Intense (Health Canada or Canadian Intense) conditions. Our results are comparable to previous studies with few exceptions. Method accuracy was evaluated with third-party reference samples (≤15% error). Short-term diffusion losses from the gas sampling bag were minimal, with a 10% decrease in absolute response after 24 h. For most analytes, research cigarette inter- and intrarun precisions were ≤20% relative standard deviation (RSD). This method provides an accurate and robust means to quantify VOCs in cigarette smoke spanning a range of yields that is sufficient to characterize smoke exposure estimates. PMID:24933649
Hunsaker, Joshua J H; Wyness, Sara P; Snow, Taylor M; Genzen, Jonathan R
2016-12-01
Refractometric methods to measure total protein (TP) in serum and plasma specimens have been replaced by automated biuret methods in virtually all routine clinical testing. A subset of laboratories, however, still report using refractometry to measure TP in conjunction with serum protein electrophoresis. The objective of this study was therefore to conduct a modern performance evaluation of a digital refractometer for TP measurement. Performance evaluation of a MISCO Palm Abbe™ digital refractometer was conducted through device familiarization, carryover, precision, accuracy, linearity, analytical sensitivity, analytical specificity, and reference interval verification. Comparison assays included a manual refractometer and an automated biuret assay. Carryover risk was eliminated using a demineralized distilled water (ddH2O) wash step. Precision studies demonstrated overall imprecision of 2.2% CV (low TP pool) and 0.5% CV (high TP pool). Accuracy studies demonstrated correlation to both manual refractometry and the biuret method. An overall positive bias (+5.0%) was observed versus the biuret method. On average, outlier specimens had an increased triglyceride concentration. Linearity was verified using mixed dilutions of: a) low and high concentration patient pools, or b) albumin-spiked ddH2O and high concentration patient pool. Decreased recovery was observed using ddH2O dilutions at low TP concentrations. Significant interference was detected at high concentrations of glucose (>267 mg/dL) and triglycerides (>580 mg/dL). Current laboratory reference intervals for TP were verified. Performance characteristics of this digital refractometer were validated in a clinical laboratory setting. Biuret method remains the preferred assay for TP measurement in routine clinical analyses.
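Imprecision figures like the pool CVs above are coefficients of variation of replicate measurements; a small sketch with invented replicates:

```python
import numpy as np

# Hedged sketch of the precision arithmetic behind such evaluations: the
# coefficient of variation (CV%) of replicate measurements of pooled samples.
# Replicate values below are invented for illustration.
low_pool = np.array([4.1, 4.0, 4.2, 3.9, 4.1, 4.0, 4.2, 4.1])    # g/dL TP, low pool
high_pool = np.array([9.0, 9.1, 8.9, 9.0, 9.1, 9.0, 8.9, 9.1])   # g/dL TP, high pool

for name, pool in (("low", low_pool), ("high", high_pool)):
    cv = 100 * pool.std(ddof=1) / pool.mean()                     # sample SD / mean
    print(f"{name} pool CV = {cv:.1f}%")
```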
Campestrini, J; Lecaillon, J B; Godbillon, J
1997-12-19
An automated high-performance liquid chromatography (HPLC) method for the determination of formoterol in human plasma with improved sensitivity has been developed and validated. Formoterol and CGP 47086, the internal standard, were extracted from plasma (1 ml) using a cation-exchange solid-phase extraction (SPE) cartridge. The compounds were eluted with pH 6 buffer solution-methanol (70:30, v/v) and the eluate was further diluted with water. An aliquot of the extract solution was injected and analyzed by HPLC. The extraction, dilution, injection and chromatographic analysis were combined and automated using the ASPEC system. The chromatographic separations were achieved on a 5 μm Hypersil ODS analytical column (200 mm x 3 mm I.D.), using (pH 6 phosphate buffer, 0.035 M + 20 mg/l EDTA)-MeOH-CH3CN (70:25:5, v/v/v) as the mobile phase at a flow-rate of 0.4 ml/min. The analytes were detected by electrochemical detection at an operating potential of +0.63 V. Intra-day accuracy and precision were assessed from the relative recoveries of calibration/quality control plasma samples in the concentration range of 7.14 to 238 pmol/l of formoterol base. The accuracy over the entire concentration range varied from 81 to 105%, and the precision (C.V.) ranged from 3 to 14%. Inter-day accuracy and precision were assessed in the concentration range of 11.9 to 238 pmol/l of formoterol base in plasma. The accuracy over the entire concentration range varied from 98 to 109%, and precision ranged from 8 to 19%. At the limit of quantitation (LOQ) of 11.9 pmol/l for inter-day measurements, the recovery value was 109% and the C.V. was 19%. As shown by the intra-day accuracy and precision results, favorable conditions (a newly used column, a newly washed detector cell and a moderate residual cell current) allowed a LOQ of 7.14 pmol/l of formoterol base (3 pg/ml of formoterol fumarate dihydrate) to be reached. The limit of detection was improved by a factor of about 10 compared with previously described methods. The method has been applied to quantify formoterol in plasma after inhalation of 120 μg of drug by volunteers. Formoterol was still measurable at 24 h post-dosing in most subjects, and a slow elimination of formoterol from plasma beyond 6-8 h after inhalation was demonstrated for the first time thanks to the sensitivity of the method.
Universal electronics for miniature and automated chemical assays.
Urban, Pawel L
2015-02-21
This minireview discusses universal electronic modules (generic programmable units) and their use by analytical chemists to construct inexpensive, miniature or automated devices. Recently, open-source platforms have gained considerable popularity among tech-savvy chemists because their implementation often does not require expert knowledge and investment of funds. Thus, chemistry students and researchers can easily start implementing them after a few hours of reading tutorials and trial-and-error. Single-board microcontrollers and micro-computers such as Arduino, Teensy, Raspberry Pi or BeagleBone enable collecting experimental data with high precision as well as efficient control of electric potentials and actuation of mechanical systems. They are readily programmed using high-level languages, such as C, C++, JavaScript or Python. They can also be coupled with mobile consumer electronics, including smartphones as well as teleinformatic networks. More demanding analytical tasks require fast signal processing. Field-programmable gate arrays enable efficient and inexpensive prototyping of high-performance analytical platforms, thus becoming increasingly popular among analytical chemists. This minireview discusses the advantages and drawbacks of universal electronic modules, considering their application in prototyping and manufacture of intelligent analytical instrumentation.
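As one concrete illustration of the data-collection role the minireview describes for single-board devices, the sketch below logs serial readings from a hypothetical Arduino-style board using the pyserial library; the port name, baud rate, and line format are assumptions, not taken from the article:

```python
import csv
import time

import serial  # pyserial; assumes a board streaming one reading per line over USB

# Hedged sketch: a microcontroller prints sensor readings over USB serial and
# the host PC logs them with timestamps. Port, baud rate, sample count, and
# the line format are illustrative assumptions.
PORT, BAUD = "/dev/ttyACM0", 9600

with serial.Serial(PORT, BAUD, timeout=2) as ser, \
        open("readings.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["t_unix", "value"])
    for _ in range(100):                                  # collect 100 samples
        line = ser.readline().decode(errors="ignore").strip()
        if line:                                          # e.g. the board prints "512"
            writer.writerow([time.time(), line])
```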
Trust in automation: designing for appropriate reliance.
Lee, John D; See, Katrina A
2004-01-01
Automation is often problematic because people fail to rely upon it appropriately. Because people respond to technology socially, trust influences reliance on automation. In particular, trust guides reliance when complexity and unanticipated situations make a complete understanding of the automation impractical. This review considers trust from the organizational, sociological, interpersonal, psychological, and neurological perspectives. It considers how the context, automation characteristics, and cognitive processes affect the appropriateness of trust. The context in which the automation is used influences automation performance and provides a goal-oriented perspective to assess automation characteristics along a dimension of attributional abstraction. These characteristics can influence trust through analytic, analogical, and affective processes. The challenges of extrapolating the concept of trust in people to trust in automation are discussed. A conceptual model integrates research regarding trust in automation and describes the dynamics of trust, the role of context, and the influence of display characteristics. Actual or potential applications of this research include improved designs of systems that require people to manage imperfect automation.
Delora, Adam; Gonzales, Aaron; Medina, Christopher S; Mitchell, Adam; Mohed, Abdul Faheem; Jacobs, Russell E; Bearer, Elaine L
2016-01-15
Magnetic resonance imaging (MRI) is a well-developed technique in neuroscience. Limitations in applying MRI to rodent models of neuropsychiatric disorders include the large number of animals required to achieve statistical significance, and the paucity of automation tools for the critical early step in processing, brain extraction, which prepares brain images for alignment and voxel-wise statistics. This novel timesaving automation of template-based brain extraction ("skull-stripping") is capable of quickly and reliably extracting the brain from large numbers of whole head images in a single step. The method is simple to install and requires minimal user interaction. This method is equally applicable to different types of MR images. Results were evaluated with Dice and Jacquard similarity indices and compared in 3D surface projections with other stripping approaches. Statistical comparisons demonstrate that individual variation of brain volumes are preserved. A downloadable software package not otherwise available for extraction of brains from whole head images is included here. This software tool increases speed, can be used with an atlas or a template from within the dataset, and produces masks that need little further refinement. Our new automation can be applied to any MR dataset, since the starting point is a template mask generated specifically for that dataset. The method reliably and rapidly extracts brain images from whole head images, rendering them useable for subsequent analytical processing. This software tool will accelerate the exploitation of mouse models for the investigation of human brain disorders by MRI. Copyright © 2015 Elsevier B.V. All rights reserved.
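At its core, template-based extraction multiplies a co-registered binary brain mask into the head image. The sketch below illustrates only that step, with arrays standing in for NIfTI volumes; registration and the authors' actual tool are omitted:

```python
import numpy as np

# Hedged sketch of the masking step in template-based brain extraction
# ("skull-stripping"): a binary brain mask, aligned to the head image, zeroes
# out non-brain voxels. The volumes below are synthetic stand-ins.
def skull_strip(head: np.ndarray, template_mask: np.ndarray) -> np.ndarray:
    """Zero out non-brain voxels using a co-registered binary template mask."""
    assert head.shape == template_mask.shape, "mask must be aligned to the head image"
    return head * (template_mask > 0)

head_img = np.random.rand(64, 64, 32)                      # stand-in whole-head volume
mask = np.zeros_like(head_img)
mask[16:48, 16:48, 8:24] = 1                               # stand-in brain mask
brain_only = skull_strip(head_img, mask)
print("brain voxels retained:", int((brain_only > 0).sum()))
```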
Analysis of the neurotoxin anisatin in star anise by LC-MS/MS.
Mathon, Caroline; Bongard, Benjamin; Duret, Monique; Ortelli, Didier; Christen, Philippe; Bieri, Stefan
2013-01-01
The aim of this work was to develop an analytical method capable of determining the presence of anisatin in star anise. This neurotoxin may induce severe side effects such as epileptic convulsions. It is therefore of prime importance to have rapid and accurate analytical methods able to detect and quantify anisatin in samples that are purportedly edible star anise. The sample preparation combined an automated accelerated solvent extraction with a solid-supported liquid-liquid purification step on EXtrelut®. Samples were analysed on a porous graphitic carbon HPLC column and quantified by tandem mass spectrometry operating in the negative ionisation mode. The quantification range of anisatin was between 0.2 and 8 mg kg⁻¹. The applicability of this validated method was demonstrated by the analysis of several Illicium species and star anise samples purchased on the Swiss market. High levels of anisatin were measured in Illicium lanceolatum, I. majus and I. anisatum, which may cause health concerns if they are misidentified or mixed with edible Illicium verum.
Sampling and sample processing in pesticide residue analysis.
Lehotay, Steven J; Cook, Jo Marie
2015-05-13
Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came and provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest and encourages further studies on sampling and sample mass reduction to produce a test portion.
Micro-optics for microfluidic analytical applications.
Yang, Hui; Gijs, Martin A M
2018-02-19
This critical review summarizes the developments in the integration of micro-optical elements with microfluidic platforms for facilitating detection and automation of bio-analytical applications. Micro-optical elements, made by a variety of microfabrication techniques, advantageously contribute to the performance of an analytical system, especially when the latter has microfluidic features. Indeed the easy integration of optical control and detection modules with microfluidic technology helps to bridge the gap between the macroscopic world and chip-based analysis, paving the way for automated and high-throughput applications. In our review, we start the discussion with an introduction of microfluidic systems and micro-optical components, as well as aspects of their integration. We continue with a detailed description of different microfluidic and micro-optics technologies and their applications, with an emphasis on the realization of optical waveguides and microlenses. The review continues with specific sections highlighting the advantages of integrated micro-optical components in microfluidic systems for tackling a variety of analytical problems, like cytometry, nucleic acid and protein detection, cell biology, and chemical analysis applications.
Agopian, A J; Evans, Jane A; Lupo, Philip J
2018-01-15
It is estimated that 20 to 30% of infants with birth defects have two or more birth defects. Among these infants with multiple congenital anomalies (MCA), co-occurring anomalies may represent either chance (i.e., unrelated etiologies) or pathogenically associated patterns of anomalies. While some MCA patterns have been recognized and described (e.g., known syndromes), others have not been identified or characterized. Elucidating these patterns may result in a better understanding of the etiologies of these MCAs. This article reviews the literature with regard to analytic methods that have been used to evaluate patterns of MCAs, in particular those using birth defect registry data. A popular method for MCA assessment involves a comparison of the observed to expected ratio for a given combination of MCAs, or one of several modified versions of this comparison. Other methods include use of numerical taxonomy or other clustering techniques, multiple regression analysis, and log-linear analysis. Advantages and disadvantages of these approaches, as well as specific applications, were outlined. Despite the availability of multiple analytic approaches, relatively few MCA combinations have been assessed. The availability of large birth defects registries and computing resources that allow for automated, big data strategies for prioritizing MCA patterns may provide for new avenues for better understanding co-occurrence of birth defects. Thus, the selection of an analytic approach may depend on several considerations. Birth Defects Research 110:5-11, 2018. © 2017 Wiley Periodicals, Inc.
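The observed-to-expected comparison mentioned above works as follows: under independence, the expected count of a two-defect combination is N·P(defect A)·P(defect B), and a ratio well above 1 flags a candidate MCA pattern. A worked sketch with invented counts:

```python
# Hedged sketch of the observed/expected (O/E) ratio for a defect pair.
# All counts below are invented, not registry data.
n_infants = 500_000
count_a, count_b = 250, 400          # infants with defect A, with defect B
observed_ab = 12                     # infants with both defects

# Under independence: E = N * P(A) * P(B)
expected_ab = n_infants * (count_a / n_infants) * (count_b / n_infants)
oe_ratio = observed_ab / expected_ab
print(f"expected = {expected_ab:.2f}, O/E = {oe_ratio:.1f}")   # 0.20, O/E = 60.0
```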
[Automation and organization of technological process of urinalysis].
Kolenkin, S M; Kishkun, A A; Kol'chenko, O L
2000-12-01
The results of introducing into practice a working model of an industrial technology for laboratory studies, together with the KONE Specific Supra and Miditron M analyzers, are presented using clinical urinalysis as an example. This technology helps standardize all stages and operations, improves the efficiency of quality control of laboratory studies, rationally organizes the work at all stages of the process, and creates a system for continuous improvement of the efficiency of investigations at the preanalytical, analytical, and postanalytical stages of the laboratory process. As a result of introducing this technology into laboratory practice, violations of the quality criteria for clinical urinalysis decreased from 15 to 8% at the preanalytical stage and from 6 to 3% at the analytical stage. Automation of the analysis reduced reagent consumption 3-fold and increased productivity at the analytical stage 4-fold.
Lomnitz, Jason G.; Savageau, Michael A.
2016-01-01
Mathematical models of biochemical systems provide a means to elucidate the link between the genotype, environment, and phenotype. A subclass of mathematical models, known as mechanistic models, quantitatively describe the complex non-linear mechanisms that capture the intricate interactions between biochemical components. However, the study of mechanistic models is challenging because most are analytically intractable and involve large numbers of system parameters. Conventional methods to analyze them rely on local analyses about a nominal parameter set and they do not reveal the vast majority of potential phenotypes possible for a given system design. We have recently developed a new modeling approach that does not require estimated values for the parameters initially and inverts the typical steps of the conventional modeling strategy. Instead, this approach relies on architectural features of the model to identify the phenotypic repertoire and then predict values for the parameters that yield specific instances of the system that realize desired phenotypic characteristics. Here, we present a collection of software tools, the Design Space Toolbox V2 based on the System Design Space method, that automates (1) enumeration of the repertoire of model phenotypes, (2) prediction of values for the parameters for any model phenotype, and (3) analysis of model phenotypes through analytical and numerical methods. The result is an enabling technology that facilitates this radically new, phenotype-centric, modeling approach. We illustrate the power of these new tools by applying them to a synthetic gene circuit that can exhibit multi-stability. We then predict values for the system parameters such that the design exhibits 2, 3, and 4 stable steady states. In one example, inspection of the basins of attraction reveals that the circuit can count between three stable states by transient stimulation through one of two input channels: a positive channel that increases the count, and a negative channel that decreases the count. This example shows the power of these new automated methods to rapidly identify behaviors of interest and efficiently predict parameter values for their realization. These tools may be applied to understand complex natural circuitry and to aid in the rational design of synthetic circuits. PMID:27462346
Quality of Big Data in health care.
Sukumar, Sreenivas R; Natarajan, Ramachandran; Ferrell, Regina K
2015-01-01
The current trend in Big Data analytics and in particular health information technology is toward building sophisticated models, methods and tools for business, operational and clinical intelligence. However, the critical issue of data quality required for these models is not getting the attention it deserves. The purpose of this paper is to highlight the issues of data quality in the context of Big Data health care analytics. The insights presented in this paper are the results of analytics work that was done in different organizations on a variety of health data sets. The data sets include Medicare and Medicaid claims, provider enrollment data sets from both public and private sources, and electronic health records from regional health centers accessed through partnerships with health care claims processing entities under health privacy protected guidelines. Assessment of data quality in health care has to consider: first, the entire lifecycle of health data; second, problems arising from errors and inaccuracies in the data itself; third, the source(s) and the pedigree of the data; and fourth, how the underlying purpose of data collection impacts the analytic processing and the knowledge expected to be derived. Automation in the form of data handling, storage, entry and processing technologies is to be viewed as a double-edged sword. At one level, automation can be a good solution, while at another level it can create a different set of data quality issues. Implementation of health care analytics with Big Data is enabled by a road map that addresses the organizational and technological aspects of data quality assurance. The value derived from the use of analytics should be the primary determinant of data quality. Based on this premise, health care enterprises embracing Big Data should have a road map for a systematic approach to data quality. Health care data quality problems can be so specific that organizations might have to build their own custom software or data quality rule engines. Today, data quality issues are diagnosed and addressed in a piecemeal fashion. The authors recommend a data lifecycle approach and provide a road map that is better matched to the dimensions of Big Data and fits different stages in the analytical workflow.
Cray, Carolyn; Dickey, Meranda; Brewer, Leah Brinson; Arheart, Kristopher L
2013-12-01
The acute phase protein serum amyloid A (SAA) has been previously shown to have value as a biomarker of inflammation and infection in many species, including manatees (Trichechus manatus latirostris). In the current study, results from an automated assay for SAA were used in a rehabilitation setting. Reference intervals were established from clinically normal manatees using the robust method: 0-46 mg/L. More than 30-fold higher mean SAA levels were observed in manatees suffering from cold stress and boat-related trauma. Poor correlations were observed between SAA and total white blood count, percentage of neutrophils, albumin, and albumin/globulin ratio. A moderate correlation was observed between SAA and the presence of nucleated red blood cells. The sensitivity of SAA testing was 93% and the specificity was 98%, representing the highest combined values of all the analytes. The results indicate that the automated method for SAA quantitation can provide important clinical data for manatees in a rehabilitation setting.
Pistón, Mariela; Mollo, Alicia; Knochen, Moisés
2011-01-01
A fast and efficient automated method using a sequential injection analysis (SIA) system, based on the Griess reaction, was developed for the determination of nitrate and nitrite in infant formulas and milk powder. The system mixes a measured amount of sample (previously reconstituted in liquid form and deproteinized) with the chromogenic reagent to produce a colored substance whose absorbance is recorded. For nitrate determination, an on-line prereduction step was added by passing the sample through a Cd minicolumn. The system was controlled from a PC by means of a user-friendly program. Figures of merit include linearity (r2 > 0.999 for both analytes), limits of detection (0.32 mg kg−1 NO3-N and 0.05 mg kg−1 NO2-N), and precision (sr%) of 0.8–3.0. Results were statistically in good agreement with those obtained with the reference ISO-IDF method. The sampling frequency was 30 hour−1 (nitrate) and 80 hour−1 (nitrite) when performed separately. PMID:21960750
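Figures of merit like the linearity and detection limits above follow from straightforward calibration arithmetic. A sketch with invented absorbances, using the common LOD = 3·SD(blank)/slope convention (not necessarily the convention used in the paper):

```python
import numpy as np

# Hedged sketch: fit a straight line to Griess-reaction absorbances, report
# the coefficient of determination, and estimate the detection limit as
# 3 * SD(blank) / slope. All values below are invented.
conc = np.array([0.0, 0.2, 0.5, 1.0, 2.0, 4.0])           # mg/L nitrite-N calibrators
absorbance = np.array([0.002, 0.051, 0.124, 0.247, 0.498, 0.995])

slope, intercept = np.polyfit(conc, absorbance, 1)
r2 = np.corrcoef(conc, absorbance)[0, 1] ** 2

sd_blank = 0.0015                                         # SD of blank readings (assumed)
lod = 3 * sd_blank / slope
print(f"slope={slope:.4f}, r2={r2:.5f}, LOD={lod:.3f} mg/L")
```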
2013-11-27
CUBRC has developed an in-line, multi-analyte isolation technology that utilizes solid phase extraction chemistries to purify...goals. Specifically, CUBRC will design and manufacture a prototype cartridge(s) and test the prototype cartridge for its ability to isolate each... CUBRC, Inc., P.O. Box 400, Buffalo, NY 14225-1955
Titaley, Ivan A; Ogba, O Maduka; Chibwe, Leah; Hoh, Eunha; Cheong, Paul H-Y; Simonich, Staci L Massey
2018-03-16
Non-targeted analysis of environmental samples, using comprehensive two-dimensional gas chromatography coupled with time-of-flight mass spectrometry (GC × GC/ToF-MS), poses significant data analysis challenges due to the large number of possible analytes. Non-targeted data analysis of complex mixtures is prone to human bias and is laborious, particularly for comparative environmental samples such as contaminated soil pre- and post-bioremediation. To address this research bottleneck, we developed OCTpy, a Python™ script that acts as a data reduction filter to automate GC × GC/ToF-MS data analysis from LECO ® ChromaTOF ® software and facilitates selection of analytes of interest based on peak area comparison between comparative samples. We used data from polycyclic aromatic hydrocarbon (PAH) contaminated soil, pre- and post-bioremediation, to assess the effectiveness of OCTpy in facilitating the selection of analytes that have formed or degraded following treatment. Using datasets from the soil extracts pre- and post-bioremediation, OCTpy selected, on average, 18% of the initial suggested analytes generated by the LECO ® ChromaTOF ® software Statistical Compare feature. Based on this list, 63-100% of the candidate analytes identified by a highly trained individual were also selected by OCTpy. This process was accomplished in several minutes per sample, whereas manual data analysis took several hours per sample. OCTpy automates the analysis of complex mixtures of comparative samples, reduces the potential for human error during heavy data handling and decreases data analysis time by at least tenfold. Copyright © 2018 Elsevier B.V. All rights reserved.
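The core of such a data-reduction filter is a fold-change comparison of peak areas between comparative samples. A hedged pandas sketch of that idea (not OCTpy's actual code or API; column names, threshold, and values are invented):

```python
import pandas as pd

# Hedged sketch of peak-area comparison between comparative samples (e.g. soil
# pre vs. post bioremediation): analytes whose areas change strongly between
# the two states are kept for manual review. All values are illustrative.
peaks = pd.DataFrame({
    "analyte": ["PAH-1", "PAH-2", "PAH-3", "PAH-4"],
    "area_pre": [1.2e6, 8.0e5, 5.0e4, 3.0e5],
    "area_post": [1.1e6, 9.0e4, 6.1e5, 2.9e5],
})

FOLD = 5.0  # keep analytes that formed or degraded by at least 5-fold (assumed cutoff)
ratio = peaks["area_post"] / peaks["area_pre"]
candidates = peaks[(ratio >= FOLD) | (ratio <= 1 / FOLD)]
print(candidates)   # PAH-2 (degraded) and PAH-3 (formed) are retained
```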
Definition of performance specifications for automated Analytical Electrophoresis Facility (AAEF)
NASA Technical Reports Server (NTRS)
Brooks, D. E.
1976-01-01
In order to provide specifications for the automated Analytical Electrophoresis Facility (AAEF) that would satisfy the broadest variety of demands of a future user community, a survey was carried out of all investigators identified as having published papers on cell electrophoresis in the past four years. A computer search of the relevant literature yielded a list of 87 investigators, which was defined as the user community for purposes of the mailing. A questionnaire covering the areas of performance that required definition was developed and subsequently circulated to the user community. Based on the responses to this survey, performance specifications were assembled.
Shao, Limin; Griffiths, Peter R; Leytem, April B
2010-10-01
The automated quantification of three greenhouse gases, ammonia, methane, and nitrous oxide, in the vicinity of a large dairy farm by open-path Fourier transform infrared (OP/FT-IR) spectrometry at intervals of 5 min is demonstrated. Spectral pretreatment, including the automated detection and correction of the effect of interruption of the infrared beam by a moving object and the automated correction for the nonlinear detector response, is applied to the measured interferograms. Two ways of obtaining quantitative data from OP/FT-IR data are described. The first, which is installed in a recently acquired commercial OP/FT-IR spectrometer, is based on classical least-squares (CLS) regression, and the second is based on partial least-squares (PLS) regression. It is shown that CLS regression only gives accurate results if the absorption features of the analytes are located in very short spectral intervals where lines due to atmospheric water vapor are absent or very weak; of the three analytes examined, only ammonia fell into this category. On the other hand, PLS regression allowed what appeared to be accurate results to be obtained for all three analytes.
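A sketch of PLS calibration on spectra with an overlapping interferent, which is the situation where the abstract reports CLS fails; the spectra and band shapes are synthetic:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Hedged sketch: spectra containing overlapping analyte and water-vapor bands
# are regressed against known concentrations; latent variables absorb the
# interfering background. The band shapes and data are illustrative only.
rng = np.random.default_rng(2)
wav = np.linspace(0, 1, 200)
analyte_band = np.exp(-((wav - 0.40) / 0.08) ** 2)        # broad analyte band
water_band = np.exp(-((wav - 0.45) / 0.03) ** 2)          # overlapping interferent

conc = rng.uniform(0, 10, 60)                             # analyte concentrations
interf = rng.uniform(0, 5, 60)                            # interferent strengths
X = np.outer(conc, analyte_band) + np.outer(interf, water_band)
X += rng.normal(0, 0.01, X.shape)                         # measurement noise

pls = PLSRegression(n_components=3).fit(X[:40], conc[:40])
pred = pls.predict(X[40:]).ravel()
print("prediction RMSE:", np.sqrt(np.mean((pred - conc[40:]) ** 2)))
```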
Application of automated measurement and verification to utility energy efficiency program data
Granderson, Jessica; Touzani, Samir; Fernandes, Samuel; ...
2017-02-17
Trustworthy savings calculations are critical to convincing regulators of both the cost-effectiveness of energy efficiency program investments and their ability to defer supply-side capital investments. Today’s methods for measurement and verification (M&V) of energy savings constitute a significant portion of the total costs of energy efficiency programs. They also require time-consuming data acquisition. A spectrum of savings calculation approaches is used, with some relying more heavily on measured data and others relying more heavily on estimated, modeled, or stipulated data. The increasing availability of “smart” meters and devices that report near-real time data, combined with new analytical approaches to quantify savings, offers the potential to conduct M&V more quickly and at lower cost, with comparable or improved accuracy. Commercial energy management and information systems (EMIS) technologies are beginning to offer these ‘M&V 2.0’ capabilities, and program administrators want to understand how they might assist programs in quickly and accurately measuring energy savings. This paper presents the results of recent testing of the ability to use automation to streamline the M&V process. In this paper, we apply an automated whole-building M&V tool to historic data sets from energy efficiency programs to begin to explore the accuracy, cost, and time trade-offs between more traditional M&V, and these emerging streamlined methods that use high-resolution energy data and automated computational intelligence. For the data sets studied we evaluate the fraction of buildings that are well suited to automated baseline characterization, the uncertainty in gross savings that is due to M&V 2.0 tools’ model error, and indications of labor time savings, and how the automated savings results compare to prior, traditionally determined savings results. The results show that 70% of the buildings were well suited to the automated approach. In a majority of the cases (80%) savings and uncertainties for each individual building were quantified to levels above the criteria in ASHRAE Guideline 14. In addition the findings suggest that M&V 2.0 methods may also offer time-savings relative to traditional approaches. Lastly, we discuss the implications of these findings relative to the potential evolution of M&V, and pilots currently being launched to test how M&V automation can be integrated into ratepayer-funded programs and professional implementation and evaluation practice.
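A minimal sketch of the whole-building baseline idea underlying automated M&V: fit a weather-dependent model on the pre-retrofit period, project it over the post-retrofit period, and difference against metered use. The model form, balance-point temperature, and all numbers are invented; production M&V 2.0 tools use richer models:

```python
import numpy as np

# Hedged sketch of baseline-model M&V: savings = baseline projection - metered use.
rng = np.random.default_rng(3)

# Pre-retrofit year: daily energy driven by a cooling-degree signal (invented).
temp_pre = rng.uniform(5, 30, 365)                         # daily mean temperature, C
cdd_pre = np.maximum(temp_pre - 18, 0)                     # assumed 18 C balance point
energy_pre = 120 + 6.0 * cdd_pre + rng.normal(0, 5, 365)   # kWh/day, simulated

# Baseline model: energy ~ a + b * CDD, fit on the pre period.
b, a = np.polyfit(cdd_pre, energy_pre, 1)

# Post-retrofit year: lower base load and cooling slope (simulated retrofit).
temp_post = rng.uniform(5, 30, 365)
cdd_post = np.maximum(temp_post - 18, 0)
energy_post = 110 + 4.5 * cdd_post + rng.normal(0, 5, 365)

baseline_projection = a + b * cdd_post                     # counterfactual use
savings = (baseline_projection - energy_post).sum()
print(f"estimated annual savings: {savings:.0f} kWh")
```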
Markert, Sven; Joeris, Klaus
2017-01-01
We developed an automated microtiter plate (MTP)-based system for suspension cell culture to meet the increased demands for miniaturized high-throughput applications in biopharmaceutical process development. The generic system is based on off-the-shelf commercial laboratory automation equipment and is able to utilize MTPs of different configurations (6-24 wells per plate) in orbital shaken mode. The shaking conditions were optimized by Computational Fluid Dynamics simulations. The fully automated system handles plate transport, seeding and feeding of cells, daily sampling, and preparation of analytical assays. The integration of all required analytical instrumentation into the system enables hands-off operation, which prevents bottlenecks in sample processing. The modular set-up makes the system flexible and adaptable for a continuous extension of analytical parameters and add-on components. The system proved suitable as a screening tool for process development by verifying the comparability of results between the MTP-based system and bioreactors regarding profiles of viable cell density, lactate, and product concentration of CHO cell lines. These studies confirmed that 6-well MTPs as well as 24-deepwell MTPs were predictive for a scale-up to a 1000 L stirred tank reactor (scale factor 1:200,000). Applying the established cell culture system to automated media blend screening in late-stage development, a 22% increase in product yield was achieved in comparison to the reference process. The predicted product increase was subsequently confirmed in 2 L bioreactors. Thus, we demonstrated the feasibility of the automated MTP-based cell culture system for enhanced screening and optimization applications in process development and identified further application areas such as process robustness. The system offers great potential to accelerate time-to-market for new biopharmaceuticals. Biotechnol. Bioeng. 2017;114:113-121. © 2016 Wiley Periodicals, Inc.
Pedersen, Anders Just; Dalsgaard, Petur Weihe; Rode, Andrej Jaroslav; Rasmussen, Brian Schou; Müller, Irene Breum; Johansen, Sys Stybe; Linnet, Kristian
2013-07-01
A broad forensic screening method for 256 analytes in whole blood, based on fully automated SPE robotic extraction and ultra-high-performance liquid chromatography (UHPLC) coupled to TOF-MS with data-independent acquisition, has been developed. The limit of identification was evaluated for all 256 compounds, and 95 of these compounds were validated with regard to matrix effects, extraction recovery, and process efficiency. The limit of identification ranged from 0.001 to 0.1 mg/kg, and the process efficiency exceeded 50% for 73 of the 95 analytes. As an example of application, 1335 forensic traffic cases were analyzed with the presented screening method. Of these, 992 cases (74%) were positive for one or more traffic-relevant drugs above the Danish legal limits. Commonly abused drugs such as amphetamine, cocaine, and frequent types of benzodiazepines were the major findings. Nineteen less frequently encountered drugs were detected, e.g. buprenorphine, butylone, cathine, fentanyl, lysergic acid diethylamide, m-chlorophenylpiperazine, 3,4-methylenedioxypyrovalerone, mephedrone, 4-methylamphetamine, p-fluoroamphetamine, and p-methoxy-N-methylamphetamine. In conclusion, UHPLC-TOF-MS screening with data-independent acquisition resulted in the detection of common drugs of abuse as well as new designer drugs and more rarely occurring drugs. Thus, TOF-MS screening of blood samples constitutes a practical way of screening traffic cases, with the exception of δ-9-tetrahydrocannabinol, which should be handled in a separate method. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Schuurman, Tim; de Boer, Richard; Patty, Rachèl; Kooistra-Smid, Mirjam; van Zwet, Anton
2007-12-01
In the present study, three methods (NucliSens miniMAG [bioMérieux], MagNA Pure DNA Isolation Kit III Bacteria/Fungi [Roche], and a silica-guanidinium thiocyanate [Si-GuSCN-F] procedure) for extracting DNA from stool specimens were compared with regard to analytical performance (relative DNA recovery and downstream real-time PCR amplification of Salmonella enterica DNA), stability of the extracted DNA, hands-on time (HOT), total processing time (TPT), and costs. The Si-GuSCN-F procedure showed the highest analytical performance (relative recovery of 99%, S. enterica real-time PCR sensitivity of 91%) at the lowest associated costs per extraction (euro 4.28). However, this method did require the longest HOT (144 min) and subsequent TPT (176 min) when processing 24 extractions. Both miniMAG and MagNA Pure extraction showed similar performance at first (relative recoveries of 57% and 52%, S. enterica real-time PCR sensitivity of 85%). However, when differences in the observed Ct values after real-time PCR were taken into account, MagNA Pure resulted in a significant increase in Ct value compared to both miniMAG and Si-GuSCN-F (on average +1.26 and +1.43 cycles). With regard to inhibition, all methods showed relatively low inhibition rates (<4%), with miniMAG providing the lowest rate (0.7%). Extracted DNA was stable for at least 1 year for all methods. HOT was lowest for MagNA Pure (60 min) and TPT was shortest for miniMAG (121 min). Costs, finally, were euro 4.28 for Si-GuSCN-F, euro 6.69 for MagNA Pure, and euro 9.57 for miniMAG.
NASA Astrophysics Data System (ADS)
Field, M. Paul; Romaniello, Stephen; Gordon, Gwyneth W.; Anbar, Ariel D.; Herrmann, Achim; Martinez-Boti, Miguel A.; Anagnostou, Eleni; Foster, Gavin L.
2014-05-01
MC-ICP-MS has dramatically improved the analytical throughput for high-precision radiogenic and non-traditional isotope ratio measurements compared to TIMS. The generation of large data sets, however, remains hampered by the tedious manual drip chromatography required for sample purification. A new, automated chromatography system reduces this laboratory bottleneck and expands the utility of high-precision isotope analyses in applications where large data sets are required: geochemistry, forensic anthropology, nuclear forensics, medical research, and food authentication. We have developed protocols to automate ion exchange purification for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U) using the new prepFAST-MC™ (ESI, Omaha, Nebraska). The system is not only inert (all-fluoropolymer flow paths) but also very flexible, easily accommodating different resins, samples, and reagent types. Once programmed, the system implements precise, accurate, user-defined volumes and flow rates to automatically load samples, wash the column, condition the column, and elute fractions. Unattended, the automated, low-pressure ion exchange chromatography system can process up to 60 samples overnight. Excellent reproducibility, reliability, and recovery, with low blanks and carry-over for samples in a variety of different matrices, have been demonstrated to give accurate and precise isotopic ratios within analytical error for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U). This illustrates the potential of the new prepFAST-MC™ as a powerful tool in radiogenic and non-traditional isotope research.
Perez, Manuel
2017-01-01
The automatic analysis of NMR data has been a much-desired endeavour for the last six decades, as is the case with any other analytical technique. This need for automation has only grown as advances in hardware, pulse sequences, and automation have opened new research areas to NMR and increased data throughput. Fully automatic analysis is a worthy, albeit hard, challenge, but in a world of artificial intelligence, instant communication, and big data, it seems that this particular fight is being waged one technique at a time (be it NMR, MS, IR, UV or any other), when the reality of most laboratories is that several types of analytical instrumentation are present. Data aggregation, verification, and elucidation using complementary techniques (e.g. MS and NMR) is a desirable outcome to pursue, although a time-consuming one if performed manually; hence, automation must perform the heavy lifting for users in order to make the approach attractive for scientists. Many of the decisions and workflows that could be implemented under automation will depend on two-way communication with databases that understand analytical data, because it is desirable not only to query these databases but also to grow them in as automatic a manner as possible. How these databases are designed and set up, and how the data inside them are classified, will determine which workflows can be implemented. Copyright © 2016 John Wiley & Sons, Ltd.
Chan, Adrian C H; Adachi, Jonathan D; Papaioannou, Alexandra; Wong, Andy Kin On
Lower peripheral quantitative computed tomography (pQCT)-derived leg muscle density has been associated with fragility fractures in postmenopausal women. Limb movement during image acquisition may result in motion streaks in muscle that could dilute this relationship. This cross-sectional study examined a subset of women from the Canadian Multicentre Osteoporosis Study. pQCT leg scans were qualitatively graded (1-5) for motion severity. Muscle and motion streak were segmented using semi-automated (watershed) and fully automated (threshold-based) methods, computing area and density. Binary logistic regression evaluated odds ratios (ORs) for fragility or all-cause fractures related to each of these measures with covariate adjustment. Among the 223 women examined (mean age: 72.7 ± 7.1 years, body mass index: 26.30 ± 4.97 kg/m²), muscle density was significantly lower after removing motion (p < 0.001) for both methods. Motion streak areas segmented using the semi-automated method correlated better with visual motion grades (rho = 0.90, p < 0.01) than those from the fully automated method (rho = 0.65, p < 0.01). Although the analysis-reanalysis precision of motion streak area segmentation using the semi-automated method was above 5% error (6.44%), motion-corrected muscle density measures remained well within 2% analytical error. The effect of motion correction on strengthening the association between muscle density and fragility fractures was significant when motion grade was ≥3 (p-interaction < 0.05). This observation was most dramatic for the semi-automated algorithm (OR: 1.62 [0.82, 3.17] before versus 2.19 [1.05, 4.59] after correction). Although muscle density showed an overall association with all-cause fractures (OR: 1.49 [1.05, 2.12]), the effect of motion correction was, again, most impactful for individuals whose scans showed grade 3 or above motion. Correcting for motion in pQCT leg scans strengthened the relationship between muscle density and fragility fractures, particularly in scans with motion grades of 3 or above. Motion streaks are not confounders of the relationship between pQCT-derived leg muscle density and fractures, but they may introduce heterogeneity in muscle density measurements, weakening associations with fractures. Copyright © 2016. Published by Elsevier Inc.
Echols, Kathy R.; Gale, Robert W.; Tillitt, Donald E.; Schwartz, Ted R.; O'Laughlin, Jerome
1997-01-01
The Ah (aryl-hydrocarbon) hydroxylase-receptor active polychlorinated biphenyls (PCBs), polychlorinated dibenzo-p-dioxins (PCDDs) and polychlorinated dibenzofurans (PCDFs) were fractionated by an automated high-performance liquid chromatography (HPLC) system using the Hypercarb™ porous graphitic carbon (PGC) column. This commercially available column was used to fractionate the di-, mono-, and non-ortho PCBs into three fractions for gas chromatography (GC)/electron capture detection analysis, and a fourth fraction containing the PCDDs/PCDFs for GC/mass spectrometry analysis. The recoveries of the PCBs ranged from 68 to 96%, and recoveries of the PCDDs/PCDFs ranged from 74 to 123%. The PGC column has the advantage of faster separations (110 min versus 446 min) and less solvent use (275 ml versus 1,100 ml) compared with automated fractionation of these compounds on activated carbon (PX-21), while still affording good separation of the classes. The PGC column may have an advantage over the pyrenyl-based HPLC method because it has a greater loading capacity (400 μg total PCBs versus 250 μg). Overall, the PGC is a standard column that provides reproducible fractionation of PCDD/PCDFs and PCBs for analytical measurement in environmental samples.
An Automated Directed Spectral Search Methodology for Small Target Detection
NASA Astrophysics Data System (ADS)
Grossman, Stanley I.
Much of the current effort in remote sensing tackles macro-level problems such as determining the extent of wheat in a field, the general health of vegetation, or the extent of mineral deposits in an area. However, for many of the remaining remote sensing challenges being studied currently, such as border protection, drug smuggling, treaty verification, and the war on terror, most targets are very small: a vehicle or even a person. While in typical macro-level problems the material of interest, such as vegetation, is known to be in the scene, in small target detection problems it is usually not known whether the desired small target even exists in the scene, never mind finding it in abundance. The ability to find specific small targets, such as vehicles, typifies this problem. Complicating the analyst's task, the growing number of available sensors is generating mountains of imagery, outstripping analysts' ability to visually peruse them. This work presents the important factors influencing spectral exploitation using multispectral data and suggests a different approach to small target detection. The methodology of directed search is presented, including the use of scene-modeled spectral libraries, various search algorithms, and traditional statistical and ROC curve analysis. The work suggests a new metric to calibrate analysis, labeled the analytic sweet spot, as well as an estimation method for identifying the sweet spot threshold for an image. It also suggests a new visualization aid, called nearest neighbor inflation (NNI), for highlighting the target in its entirety. It brings all of these together to propose that these additions to the target detection arena allow for the construction of a fully automated target detection scheme. This dissertation next details experiments to support the hypothesis that the optimum detection threshold is the analytic sweet spot and that the estimation method adequately predicts it. Experimental results and analysis are presented for the proposed directed search techniques of spectral-image-based small target detection. It offers evidence of the functionality of the NNI visualization and also provides evidence that the increased spectral dimensionality of the 8-band WorldView-2 datasets provides noteworthy improvement in results over traditional 4-band multispectral datasets. The final experiment presents the results from a prototype fully automated target detection scheme in support of the overarching premise. This work establishes the analytic sweet spot as the optimum threshold, defined as the point where the error rate curves (false detections versus missed detections) cross. At this point the errors are minimized while the detection rate is maximized. It then demonstrates that taking the first moment statistic of the histogram of calculated target detection values, from a detection search with the test threshold set arbitrarily high, estimates the analytic sweet spot for that image. It also demonstrates that directed search techniques, when utilized with appropriate scene-specific modeled signatures and atmospheric compensations, perform at least as well as in-scene search techniques 88% of the time and grossly under-perform only 11% of the time; the in-scene approach performs as well or better only 50% of the time. It further demonstrates the clear advantage that increased multispectral dimensionality brings to detection searches, improving performance in 50% of the cases while performing at least as well 72% of the time. Lastly, it presents evidence that a fully automated prototype performs as anticipated, laying the groundwork for further research into fully automated processes for small target detection.
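A minimal sketch of the two ideas, on synthetic detection scores rather than real imagery: the sweet spot is located where the false-detection and missed-detection rate curves cross, and a first-moment statistic over the returned detection values serves as the estimator. Whether the moment tracks the crossing depends on the score distributions; that dependence is exactly what the dissertation's experiments probe.

import numpy as np

rng = np.random.default_rng(2)
scores_bg = rng.normal(0.30, 0.10, 5000)    # background pixel scores
scores_tgt = rng.normal(0.65, 0.10, 50)     # target pixel scores

thresholds = np.linspace(0, 1, 501)
false_det = np.array([(scores_bg >= t).mean() for t in thresholds])
missed = np.array([(scores_tgt < t).mean() for t in thresholds])

# Sweet spot: threshold at which the two error-rate curves cross.
i = np.argmin(np.abs(false_det - missed))
sweet_spot = thresholds[i]

# First-moment estimate over all detection values returned by a
# deliberately permissive detection pass.
estimate = np.concatenate([scores_bg, scores_tgt]).mean()
print(f"crossing threshold = {sweet_spot:.3f}, first moment = {estimate:.3f}")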
Magnusson, R; Nordlander, T; Östin, A
2016-01-15
Sampling teams performing work at sea in areas where chemical munitions may have been dumped require rapid and reliable analytical methods for verifying sulfur mustard leakage from suspected objects. Here we present such an on-site analysis method based on dynamic headspace GC-MS for the analysis of five cyclic sulfur mustard degradation products that have previously been detected in sediments from chemical weapon dumping sites: 1,4-oxathiane, 1,3-dithiolane, 1,4-dithiane, 1,4,5-oxadithiephane, and 1,2,5-trithiephane. An experimental design involving authentic Baltic Sea sediments spiked with the target analytes was used to develop an optimized protocol for sample preparation, headspace extraction, and analysis that afforded recoveries of up to 60-90%. The optimized method needs no organic solvents, uses only two grams of sediment on a dry weight basis, and involves a unique sample presentation whereby sediment is spread uniformly as a thin layer on the inside walls of a glass headspace vial. The method showed good linearity for analyte concentrations of 5-200 ng/g dw, good repeatability, and acceptable carry-over. The method's limits of detection for spiked sediment samples ranged from 2.5 to 11 μg/kg dw, with matrix interference being the main limiting factor. The instrumental detection limits were one to two orders of magnitude lower. Full-scan GC-MS analysis enabled the use of automated mass spectral deconvolution for rapid identification of target analytes. Using this approach, analytes could be identified in spiked sediment samples at concentrations down to 13-65 μg/kg dw. On-site validation experiments conducted aboard the research vessel R/V Oceania demonstrated the method's practical applicability, enabling the successful identification of four cyclic sulfur mustard degradation products at concentrations of 15-308 μg/kg in sediments immediately after collection near a wreck at the Bornholm Deep dumpsite in the Baltic Sea. Copyright © 2015 Elsevier B.V. All rights reserved.
USDA-ARS?s Scientific Manuscript database
As sample preparation and analytical techniques have improved, data handling has become the main limitation in automated high-throughput analysis of targeted chemicals in many applications. Conventional chromatographic peak integration functions rely on complex software and settings, but untrustwor...
Signal Enhancement in HPLC/Micro-Coil NMR Using Automated Column Trapping
Djukovic, Danijel; Liu, Shuhui; Henry, Ian; Tobias, Brian; Raftery, Daniel
2008-01-01
A new HPLC-NMR system is described that performs analytical separation, pre-concentration, and NMR spectroscopy in rapid succession. The central component of our method is the online pre-concentration sequence that improves the match between the post-column analyte peak volume and the micro-coil NMR detection volume. Separated samples are collected onto a C18 guard column with a mobile phase composed of 90% D2O/10% acetonitrile-D3, and back-flushed to the NMR micro-coil probe with 90% acetonitrile-D3/10% D2O. In order to assess the performance of our unit, we separated a standard mixture of 1 mM ibuprofen, naproxen, and phenylbutazone using a commercially available C18 analytical column. The S/N measurements from the NMR acquisitions indicated that we achieved signal enhancement factors of up to 10.4 (±1.2)-fold. Furthermore, we observed that pre-concentration factors increased as the injected amount of analyte decreased. The highest concentration enrichment, 14.7 (±2.2)-fold, was attained by injecting a 100 μL solution of 0.2 mM (~4 μg) ibuprofen. PMID:17037915
Prediction of psychosis across protocols and risk cohorts using automated language analysis
Corcoran, Cheryl M.; Carrillo, Facundo; Fernández‐Slezak, Diego; Bedi, Gillinder; Klim, Casimir; Javitt, Daniel C.; Bearden, Carrie E.; Cecchi, Guillermo A.
2018-01-01
Language and speech are the primary source of data for psychiatrists to diagnose and treat mental disorders. In psychosis, the very structure of language can be disturbed, including semantic coherence (e.g., derailment and tangentiality) and syntactic complexity (e.g., concreteness). Subtle disturbances in language are evident in schizophrenia even prior to first psychosis onset, during prodromal stages. Using computer‐based natural language processing analyses, we previously showed that, among English‐speaking clinical (e.g., ultra) high‐risk youths, baseline reduction in semantic coherence (the flow of meaning in speech) and in syntactic complexity could predict subsequent psychosis onset with high accuracy. Herein, we aimed to cross‐validate these automated linguistic analytic methods in a second larger risk cohort, also English‐speaking, and to discriminate speech in psychosis from normal speech. We identified an automated machine‐learning speech classifier – comprising decreased semantic coherence, greater variance in that coherence, and reduced usage of possessive pronouns – that had an 83% accuracy in predicting psychosis onset (intra‐protocol), a cross‐validated accuracy of 79% of psychosis onset prediction in the original risk cohort (cross‐protocol), and a 72% accuracy in discriminating the speech of recent‐onset psychosis patients from that of healthy individuals. The classifier was highly correlated with previously identified manual linguistic predictors. Our findings support the utility and validity of automated natural language processing methods to characterize disturbances in semantics and syntax across stages of psychotic disorder. The next steps will be to apply these methods in larger risk cohorts to further test reproducibility, also in languages other than English, and identify sources of variability. This technology has the potential to improve prediction of psychosis outcome among at‐risk youths and identify linguistic targets for remediation and preventive intervention. More broadly, automated linguistic analysis can be a powerful tool for diagnosis and treatment across neuropsychiatry. PMID:29352548
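A toy version of the coherence feature, under the assumption that sentences are embedded as averaged word vectors (here random vectors over a tiny hypothetical vocabulary; real systems would use trained embeddings such as LSA or word2vec over normalized transcripts):

import numpy as np

rng = np.random.default_rng(3)
# Hypothetical 50-dimensional word vectors for a toy vocabulary.
word_vec = {w: rng.normal(size=50)
            for w in "the cat sat on a mat dogs bark loudly it then ran".split()}

def embed(sentence):
    # Sentence embedding as the mean of its known word vectors.
    vecs = [word_vec[w] for w in sentence.lower().split() if w in word_vec]
    return np.mean(vecs, axis=0)

def coherence_features(sentences):
    # Semantic coherence: cosine similarity between adjacent sentences.
    embs = [embed(s) for s in sentences]
    sims = [float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
            for a, b in zip(embs, embs[1:])]
    return np.mean(sims), np.var(sims)

mean_c, var_c = coherence_features(
    ["The cat sat on a mat", "It then ran", "Dogs bark loudly"])
print(f"mean coherence = {mean_c:.3f}, variance = {var_c:.3f}")

The published classifier combines this mean and variance with syntactic features (e.g. possessive-pronoun usage) inside a machine-learning model; the sketch covers only the coherence computation itself.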
3D printed fluidics with embedded analytic functionality for automated reaction optimisation
Capel, Andrew J; Wright, Andrew; Harding, Matthew J; Weaver, George W; Li, Yuqi; Harris, Russell A; Edmondson, Steve; Goodridge, Ruth D
2017-01-01
Additive manufacturing or ‘3D printing’ is being developed as a novel manufacturing process for the production of bespoke micro- and milliscale fluidic devices. When coupled with online monitoring and optimisation software, this offers an advanced, customised method for performing automated chemical synthesis. This paper reports the use of two additive manufacturing processes, stereolithography and selective laser melting, to create multifunctional fluidic devices with embedded reaction monitoring capability. The selectively laser melted parts are the first published examples of multifunctional 3D printed metal fluidic devices. These devices allow high temperature and pressure chemistry to be performed in solvent systems destructive to the majority of devices manufactured via stereolithography, polymer jetting and fused deposition modelling processes previously utilised for this application. These devices were integrated with commercially available flow chemistry, chromatographic and spectroscopic analysis equipment, allowing automated online and inline optimisation of the reaction medium. This set-up allowed the optimisation of two reactions, a ketone functional group interconversion and a fused polycyclic heterocycle formation, via spectroscopic and chromatographic analysis. PMID:28228852
Marinova, Mariela; Artusi, Carlo; Brugnolo, Laura; Antonelli, Giorgia; Zaninotto, Martina; Plebani, Mario
2013-11-01
Although, due to its high specificity and sensitivity, LC-MS/MS is an efficient technique for the routine determination of immunosuppressants in whole blood, it involves time-consuming manual sample preparation. The aim of the present study was therefore to develop an automated sample-preparation protocol for the quantification of sirolimus, everolimus, and tacrolimus by LC-MS/MS using a liquid handling platform. Six-level commercially available blood calibrators were used for assay development, while four quality control materials and three blood samples from patients under immunosuppressant treatment were employed for the evaluation of imprecision. Barcode reading, sample re-suspension, transfer of whole blood samples into 96-well plates, addition of internal standard solution, mixing, and protein precipitation were performed with a liquid handling platform. After plate filtration, the deproteinised supernatants were subjected to on-line SPE. The only manual steps in the entire process were de-capping of the tubes and transfer of the well plates to the HPLC autosampler. Calibration curves were linear throughout the selected ranges. The imprecision and accuracy data for all analytes were highly satisfactory. The agreement between the results obtained with manual and automated sample preparation was optimal (n=390, r=0.96). In daily routine (100 patient samples) the typical overall turnaround time was less than 6 h. Our findings indicate that the proposed analytical system is suitable for routine analysis, since it is straightforward and precise. Furthermore, it incurs less manual workload and less risk of error in the quantification of whole blood immunosuppressant concentrations than conventional methods. © 2013.
Ba, B B; Corniot, A G; Ducint, D; Breilh, D; Grellet, J; Saux, M C
1999-03-05
An isocratic high-performance liquid chromatographic method with automated solid-phase extraction has been developed to determine foscarnet in calf and human serum. Extraction was performed with an anion exchanger, SAX, from which the analyte was eluted with a 50 mM potassium pyrophosphate buffer, pH 8.4. The mobile phase consisted of methanol-40 mM disodium hydrogenphosphate, pH 7.6, containing 0.25 mM tetrahexylammonium hydrogensulphate (25:75, v/v). The analyte was separated on a polyether ether ketone (PEEK) column, 150x4.6 mm I.D., packed with Kromasil 100 C18, 5 μm. Amperometric detection allowed a quantification limit of 15 μM. The assay was linear from 15 to 240 μM. The recovery of foscarnet from calf serum ranged from 60.65±1.89% at 15 μM to 67.45±1.24% at 200 μM. The coefficient of variation was ≤3.73% for intra-assay precision and ≤7.24% for inter-assay precision for calf serum concentrations ranging from 15 to 800 μM. For the same samples, the deviation from the nominal value ranged from -8.97% to +5.40% for same-day accuracy and from -4.50% to +2.77% for day-to-day accuracy. Selectivity was satisfactory towards potential co-medications. Replacement of human serum by calf serum for calibration standards and quality control samples was validated. Automation brought increased protection against biohazards and improved productivity for routine monitoring and pharmacokinetic studies.
Simon, L
2007-10-01
The integral transform technique was implemented to solve a mathematical model developed for percutaneous drug absorption. The model included repeated application and removal of a patch from the skin. Fick's second law of diffusion was used to study the transport of a medicinal agent through the vehicle and subsequent penetration into the stratum corneum. Eigenmodes and eigenvalues were computed and introduced into an inversion formula to estimate the delivery rate and the amount of drug in the vehicle and the skin. A dynamic programming algorithm calculated the optimal doses necessary to achieve a desired transdermal flux. The analytical method predicted profiles that were in close agreement with published numerical solutions and provided an automated strategy to perform therapeutic drug monitoring and control.
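For readers unfamiliar with the technique, the generic structure of such an eigenfunction solution is sketched below. The boundary conditions and eigenfunctions specific to the vehicle and stratum corneum bilayer are given in the paper; these are only the standard separable forms for Fick's second law on a finite domain.

\frac{\partial c}{\partial t} = D\,\frac{\partial^{2} c}{\partial x^{2}}, \qquad 0 < x < L,

c(x,t) = \sum_{n=1}^{\infty} A_{n}\, e^{-D\lambda_{n} t}\, \phi_{n}(x), \qquad A_{n} = \frac{\int_{0}^{L} c(x,0)\,\phi_{n}(x)\,dx}{\int_{0}^{L} \phi_{n}^{2}(x)\,dx}, \qquad J(t) = -D\,\frac{\partial c}{\partial x}\Big|_{x=L},

where \phi_n and \lambda_n are the eigenfunctions and eigenvalues of the associated Sturm-Liouville problem and J(t) is the instantaneous delivery rate into the underlying tissue.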
NASA Technical Reports Server (NTRS)
Rinehart, S. A.; Armstrong, T.; Frey, Bradley J.; Jung, J.; Kirk, J.; Leisawitz, David T.; Leviton, Douglas B.; Lyon, R.; Maher, Stephen; Martino, Anthony J.;
2007-01-01
The Wide-Field Imaging Interferometry Testbed (WIIT) was designed to develop techniques for wide-field-of-view imaging interferometry using "double-Fourier" methods. These techniques will be important for a wide range of future space-based interferometry missions. We have already provided simple demonstrations of the methodology, and continuing development of the testbed will lead to higher data rates, improved data quality, and refined algorithms for image reconstruction. At present, the testbed effort includes five lines of development: automation of the testbed, operation in an improved environment, acquisition of large high-quality datasets, development of image reconstruction algorithms, and analytical modeling of the testbed. We discuss the progress made towards the first four of these goals; the analytical modeling is discussed in a separate paper within this conference.
Pérez, D; Martínez-Flores, J A; Serrano, M; Lora, D; Paz-Artal, E; Morales, J M; Serrano, A
2016-10-01
In recent years, we have been witnessing increased clinical interest in the determination of IgA anti-beta 2-glycoprotein I (aB2GPI) antibodies, as well as increased demand for this test. Some ELISA-based diagnostic systems for the detection of IgA aB2GPI antibodies are suboptimal. The aim of our study was to determine whether the diagnostic yield of modern detection systems based on automatic platforms to measure IgA aB2GPI is equivalent to that of well-optimized ELISA-based assays. In total, 130 patients were analyzed for IgA aB2GPI by three fully automated immunoassays using an ELISA-based assay as reference. The three systems were also analyzed for IgG aB2GPI with 58 patients. System 1 was able to detect IgA aB2GPI with good sensitivity and kappa index (99% and 0.72, respectively). The other two systems had poor sensitivity (20% and 15%) and kappa index (0.10 and 0.07), respectively. On the other hand, the kappa index for IgG aB2GPI was >0.89 in all three systems. Some analytical methods to detect IgA aB2GPI are suboptimal, as are some ELISA-based diagnostic systems. It is important that the scientific community work to standardize analytical methods to determine IgA aB2GPI antibodies. © 2016 John Wiley & Sons Ltd.
Force-Free Magnetic Fields Calculated from Automated Tracing of Coronal Loops with AIA/SDO
NASA Astrophysics Data System (ADS)
Aschwanden, M. J.
2013-12-01
One of the most realistic magnetic field models of the solar corona is a nonlinear force-free field (NLFFF) solution. About a dozen numeric codes exist that compute NLFFF solutions based on extrapolations of photospheric vector magnetograph data. However, since the photosphere and lower chromosphere are not force-free, a suitable correction has to be applied to the lower boundary condition. Despite such "pre-processing" corrections, the resulting theoretical magnetic field lines deviate substantially from observed coronal loop geometries. Here we develop an alternative method that fits an analytical NLFFF approximation to the observed geometry of coronal loops. The 2D coordinates of coronal loop structures observed with AIA/SDO are traced with the "Oriented Coronal CUrved Loop Tracing" (OCCULT-2) code, an automated pattern recognition algorithm whose loop-tracing fidelity has been demonstrated to match visual perception. A potential magnetic field solution is then derived from a line-of-sight magnetogram observed with HMI/SDO, and an analytical NLFFF approximation is forward-fitted to the twisted geometry of the coronal loops. We demonstrate the performance of this magnetic field modeling method for a number of solar active regions, before and after major flares observed with SDO. The difference between the NLFFF and potential field energies then allows us to compute the free magnetic energy, which is an upper limit on the energy that can be released during a solar flare.
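For context (standard definitions, not taken from this abstract): a force-free field is one in which the Lorentz force vanishes, so the electric current is everywhere parallel to the field, and the free energy quoted is the difference of the volume-integrated field energies:

\nabla \times \mathbf{B} = \alpha(\mathbf{r})\,\mathbf{B}, \qquad \nabla \cdot \mathbf{B} = 0, \qquad \mathbf{B} \cdot \nabla \alpha = 0,

E_{\mathrm{free}} = \frac{1}{8\pi} \int_{V} \left( B_{\mathrm{NLFFF}}^{2} - B_{\mathrm{pot}}^{2} \right) dV \quad \text{(Gaussian units)}.

Here \alpha = 0 recovers the potential field, constant \alpha the linear force-free case, and spatially varying \alpha the nonlinear case; the constraint \mathbf{B}\cdot\nabla\alpha = 0 follows from taking the divergence of the first equation.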
Li, Shuhuai; Li, Jianping; Luo, Jinhui; Xu, Zhi; Ma, Xionghui
2018-05-11
An electrochemical microfluidic chip is described for the determination of the insecticide carbofuran. It makes use of a molecularly imprinted film (MIP) and a DNA aptamer as dual recognition units. The analyte (carbofuran) is transported to the MIP and captured at the identification site in the channel. Carbofuran is then eluted with carbinol-acetic acid and transported to the DNA aptamer at the testing position of the chip, where it is captured again, this time by the aptamer, and detected by differential pulse voltammetry (DPV). The dual recognition (by aptamer and MIP) results in outstanding selectivity. Additionally, graphene oxide-supported gold nanoparticles (GO-AuNPs) were used to improve the sensitivity of the electrochemical detector. The DPV response is linear in the 0.2 to 50 nM carbofuran concentration range at a potential of -1.2 V, with a 67 pM detection limit. The method has attractive features such as its potential for high throughput, high degree of automation, and high integration. Conceivably, the method may be extended to other analytes for which appropriate MIPs and aptamers are available. Graphical abstract: Schematic of an electrochemical microfluidic chip for carbofuran detection based on a molecularly imprinted film (MIP) and a DNA aptamer as dual recognition units. In the chip, targets are recognized by the MIP and the aptamer in turn. It shows promising potential for the design of electrochemical devices with high throughput, high automation, and high integration.
Alternatives to current flow cytometry data analysis for clinical and research studies.
Gondhalekar, Carmen; Rajwa, Bartek; Patsekin, Valery; Ragheb, Kathy; Sturgis, Jennifer; Robinson, J Paul
2018-02-01
Flow cytometry has well-established methods for data analysis based on traditional data collection techniques. These techniques typically involved manual insertion of tube samples into an instrument that, historically, could only measure 1-3 colors. The field has since evolved to incorporate new technologies for faster and highly automated sample preparation and data collection. For example, the use of microwell plates on benchtop instruments is now a standard on virtually every new instrument, and so users can easily accumulate multiple data sets quickly. Further, because the user must carefully define the layout of the plate, this information is already defined when considering the analytical process, expanding the opportunities for automated analysis. Advances in multi-parametric data collection, as demonstrated by the development of hyperspectral flow-cytometry, 20-40 color polychromatic flow cytometry, and mass cytometry (CyTOF), are game-changing. As data and assay complexity increase, so too does the complexity of data analysis. Complex data analysis is already a challenge to traditional flow cytometry software. New methods for reviewing large and complex data sets can provide rapid insight into processes difficult to define without more advanced analytical tools. In settings such as clinical labs where rapid and accurate data analysis is a priority, rapid, efficient and intuitive software is needed. This paper outlines opportunities for analysis of complex data sets using examples of multiplexed bead-based assays, drug screens and cell cycle analysis - any of which could become integrated into the clinical environment. Copyright © 2017. Published by Elsevier Inc.
Fan, Sufang; Li, Qiang; Zhang, Xiaoguang; Cui, Xiaobin; Zhang, Dongsheng; Zhang, Yan
2015-05-01
A novel fully automated method based on dual-column switching using turbulent flow chromatography followed by liquid chromatography with tandem mass spectrometry was developed for the determination of aflatoxins B1, B2, G1, and G2 in corn powder, edible oil, peanut butter, and soy sauce samples. After ultrasound-assisted extraction, samples were directly injected into the chromatographic system and the analytes were concentrated on the clean-up loading column. Through purge switching, the analytes were transferred to the analytical column for subsequent detection by mass spectrometry. Different types of TurboFlow™ columns, transfer flow rates, and transfer times were optimized. The limits of detection and quantification of this method ranged over 0.2-2.0 and 0.5-4.0 μg/kg, respectively, for aflatoxins in the different matrixes. Recoveries of aflatoxins were in the range of 83-108.1% for all samples, and matrix effects were in the range of 34.1-104.7%. The developed method has been successfully applied to the analysis of aflatoxins B1, B2, G1, and G2 in real samples. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Lattin, Frank G.; Paul, Donald G.
1996-11-01
A sorbent-based gas chromatographic method provides continuous quantitative measurement of phosgene, hydrogen cyanide, and cyanogen chloride in ambient air. These compounds are subject to workplace exposure limits as well as regulation under the terms of the Chemical Arms Treaty and Title III of the 1990 Clean Air Act amendments. The method was developed for on-site use in a mobile laboratory during remediation operations. Incorporated into the method are automated multi-level calibrations at time-weighted average concentrations, or lower. Gaseous standards are prepared in fused-silica-lined air sampling canisters, then transferred to the analytical system through dynamic spiking. Precision and accuracy studies performed to validate the method are described. Also described are system deactivation and passivation techniques critical to optimum method performance.
Cordero-Vaca, María; Trujillo-Rodríguez, María J; Zhang, Cheng; Pino, Verónica; Anderson, Jared L; Afonso, Ana M
2015-06-01
Four different crosslinked polymeric ionic liquid (PIL)-based sorbent coatings were evaluated in an automated direct-immersion solid-phase microextraction method (automated DI-SPME) in combination with gas chromatography (GC). The crosslinked PIL coatings were based on vinyl-alkylimidazolium (ViCnIm-) or vinylbenzyl-alkylimidazolium (ViBzCnIm-) IL monomers, and di(vinylimidazolium)dodecane ((ViIm)2C12-) or di(vinylbenzylimidazolium)dodecane ((ViBzIm)2C12-) dicationic IL crosslinkers. In addition, a PIL-based hybrid coating containing multi-walled carbon nanotubes (MWCNTs) was also studied. The studied PIL coatings were covalently attached to derivatized nitinol wires and mounted onto the Supelco assembly to ensure automation when acting as SPME coatings. Their behavior was evaluated in the determination of a group of water pollutants after proper optimization. A comparison was carried out with three common commercial SPME fibers. It was observed that those PILs containing a benzyl group in their structures, either in the IL monomer and crosslinker (PIL-1-1) or only in the crosslinker (PIL-0-1), were the most efficient sorbents for the selected analytes. Validation of the overall automated DI-SPME-GC-flame ionization detection (FID) method gave limits of detection down to 135 μg·L⁻¹ for p-cresol when using PIL-1-1 and down to 270 μg·L⁻¹ when using PIL-0-1, despite their coating thicknesses of only ~2 and ~5 μm, respectively. Average relative recoveries from waters were 85 ± 14% and 87 ± 15% for PIL-1-1 and PIL-0-1, respectively. Precision values as relative standard deviation were always lower than 4.9% and 7.6% (spiked levels between 10 and 750 μg·L⁻¹, as intra-day precision). Graphical abstract: Automated DI-SPME-GC-FID using crosslinked PIL sorbent coatings for the determination of water pollutants.
Method for improving accuracy in full evaporation headspace analysis.
Xie, Wei-Qi; Chai, Xin-Sheng
2017-05-01
We report a new headspace analytical method in which multiple headspace extraction is incorporated with the full evaporation technique. The pressure uncertainty caused by changes in the solid content of samples has a great impact on measurement accuracy in conventional full evaporation headspace analysis. The results (using ethanol solution as the model sample) showed that the present technique effectively minimizes this problem. The proposed full evaporation multiple headspace extraction technique is also automated and practical, and could greatly broaden the applications of full-evaporation-based headspace analysis. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
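The multiple headspace extraction arithmetic being incorporated is the standard Kolb-Ettre relation: repeated extractions of one vial deplete the analyte geometrically, so the total analyte-proportional peak area follows from the first area and a fitted depletion constant.

A_{i} = A_{1}\, e^{-q(i-1)}, \qquad \ln A_{i} = \ln A_{1} - q\,(i-1), \qquad A_{\mathrm{total}} = \sum_{i=1}^{\infty} A_{i} = \frac{A_{1}}{1 - e^{-q}},

where A_i is the peak area of the i-th extraction and q is obtained by linear regression of \ln A_i on i-1. In this combined method, the MHE sum is what compensates for the vial-pressure uncertainty described above.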
Srinivas, Nuggehally R
2006-05-01
The development of sound bioanalytical method(s) is of paramount importance during the process of drug discovery and development culminating in a marketing approval. Although the bioanalytical procedure(s) originally developed during the discovery stage may not necessarily be fit to support the drug development scenario, they may be suitably modified and validated, as deemed necessary. Several reviews have appeared over the years describing analytical approaches including various techniques, detection systems, automation tools that are available for an effective separation, enhanced selectivity and sensitivity for quantitation of many analytes. The intention of this review is to cover various key areas where analytical method development becomes necessary during different stages of drug discovery research and development process. The key areas covered in this article with relevant case studies include: (a) simultaneous assay for parent compound and metabolites that are purported to display pharmacological activity; (b) bioanalytical procedures for determination of multiple drugs in combating a disease; (c) analytical measurement of chirality aspects in the pharmacokinetics, metabolism and biotransformation investigations; (d) drug monitoring for therapeutic benefits and/or occupational hazard; (e) analysis of drugs from complex and/or less frequently used matrices; (f) analytical determination during in vitro experiments (metabolism and permeability related) and in situ intestinal perfusion experiments; (g) determination of a major metabolite as a surrogate for the parent molecule; (h) analytical approaches for universal determination of CYP450 probe substrates and metabolites; (i) analytical applicability to prodrug evaluations-simultaneous determination of prodrug, parent and metabolites; (j) quantitative determination of parent compound and/or phase II metabolite(s) via direct or indirect approaches; (k) applicability in analysis of multiple compounds in select disease areas and/or in clinically important drug-drug interaction studies. A tabular representation of select examples of analysis is provided covering areas of separation conditions, validation aspects and applicable conclusion. A limited discussion is provided on relevant aspects of the need for developing bioanalytical procedures for speedy drug discovery and development. Additionally, some key elements such as internal standard selection, likely issues of mass detection, matrix effect, chiral aspects etc. are provided for consideration during method development.
Automated Conflict Resolution For Air Traffic Control
NASA Technical Reports Server (NTRS)
Erzberger, Heinz
2005-01-01
The ability to detect and resolve conflicts automatically is considered to be an essential requirement for the next generation air traffic control system. While systems for automated conflict detection have been used operationally by controllers for more than 20 years, automated resolution systems have so far not reached the level of maturity required for operational deployment. Analytical models and algorithms for automated resolution have yet to be demonstrated under realistic traffic conditions to show that they can handle the complete spectrum of conflict situations encountered in actual operations. The resolution algorithm described in this paper was formulated to meet the performance requirements of the Automated Airspace Concept (AAC). The AAC, which was described in a recent paper [1], is a candidate for the next generation air traffic control system. The AAC's performance objectives are to increase safety and airspace capacity and to accommodate user preferences in flight operations to the greatest extent possible. In the AAC, resolution trajectories are generated by an automation system on the ground and sent to the aircraft autonomously via data link. The algorithm generating the trajectories must take into account the performance characteristics of the aircraft and the route structure of the airway system, and it must be capable of resolving all types of conflicts for properly equipped aircraft without requiring supervision and approval by a controller. Furthermore, the resolution trajectories should be compatible with the clearances, vectors, and flight plan amendments that controllers customarily issue to pilots in resolving conflicts. The algorithm described herein, although formulated specifically to meet the needs of the AAC, provides a generic engine for resolving conflicts. Thus, it can be incorporated into any operational concept that requires a method for automated resolution, including concepts for autonomous air-to-air resolution.
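For a concrete feel for the detection side of the problem, here is a generic closest-point-of-approach check for two aircraft on straight, constant-velocity tracks. It is a textbook illustration under simplified assumptions, not the AAC resolution algorithm described above, which must also honor aircraft performance characteristics and route structure.

import numpy as np

def cpa(p1, v1, p2, v2):
    """Time (h) and distance of closest approach for straight-line motion."""
    dp, dv = p2 - p1, v2 - v1
    denom = float(dv @ dv)
    t = 0.0 if denom == 0.0 else max(0.0, float(-(dp @ dv)) / denom)
    return t, float(np.linalg.norm(dp + dv * t))

# Positions in nautical miles, velocities in knots.
p1, v1 = np.array([0.0, 0.0]), np.array([480.0, 0.0])
p2, v2 = np.array([40.0, 5.0]), np.array([-480.0, -60.0])

t, d = cpa(p1, v1, p2, v2)
SEPARATION_NM = 5.0   # typical en-route horizontal separation minimum
print(f"closest approach in {t * 60:.1f} min at {d:.2f} NM -> "
      f"{'CONFLICT' if d < SEPARATION_NM else 'clear'}")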
We've Got Plenty of Data, Now How Can We Use It?
ERIC Educational Resources Information Center
Weiler, Jeffrey K.; Mears, Robert L.
1999-01-01
To mine a large store of school data, a new technology (variously termed data warehousing, data marts, online analytical processing, and executive information systems) is emerging. Data warehousing helps school districts extract and restructure desired data from automated systems and create new databases designed to enhance analytical and…
ERIC Educational Resources Information Center
Gerontas, Apostolos
2014-01-01
Chromatographic instrumentation has been highly influential in shaping modern chemical practice, and yet it has been largely overlooked by the history of science. Gas chromatography in the 1960s was considered the analytical technique closest to becoming dominant, and, being the first automated chromatography, set the standards that all the subsequent…
2015-08-18
An Arena 60 Discrete Photometric Analyzer System and ancillary instrumentation were acquired to increase analytical infrastructure at West Virginia State University. Among the principal accomplishments reported, one postdoctoral fellow was trained using the automated Arena 60 Discrete Photometric Analyzer.
NASA Technical Reports Server (NTRS)
1996-01-01
Open Sesame! is the first commercial software product that learns a user's behavior and offers automation and coaching suggestions to the user. The neural learning module looks for repetitive patterns that have not been automated; when it finds one, it creates an observation and, upon approval, automates the task. The manufacturer, Charles River Analytics, credits Langley Research Center and Johnson Space Center Small Business Innovation Research grants, and the time the president and vice president spent at the two centers in the 1970s, as essential to the development of its product line.
Automation is key to managing a population's health.
Matthews, Michael B; Hodach, Richard
2012-04-01
Online tools for automating population health management can help healthcare organizations meet their patients' needs both during and between encounters with the healthcare system. These tools can facilitate: the use of registries to track patients' health status and care gaps; outbound messaging to notify patients when they need care; care team management of more patients at different levels of risk; automation of workflows related to case management and transitions of care; online educational and mobile health interventions to engage patients in their care; and analytics programs to identify opportunities for improvement.
This NASA Dryden F/A-18 is participating in the Automated Aerial Refueling (AAR) project.
NASA Technical Reports Server (NTRS)
2002-01-01
A NASA Dryden F/A-18 is participating in the Automated Aerial Refueling (AAR) project. F/A-18 (No. 847) is acting as an in-flight refueling tanker in the study to develop analytical models for an automated aerial refueling system for unmanned vehicles. A 300-gallon aerodynamic pod containing air-refueling equipment is seen beneath the fuselage. The hose and refueling basket are extended during an assessment of their dynamics on the F/A-18A.
Anthemidis, A; Kazantzi, V; Samanidou, V; Kabir, A; Furton, K G
2016-08-15
A novel flow injection-fabric disk sorptive extraction (FI-FDSE) system was developed for the automated determination of trace metals. The platform was based on a minicolumn packed with sol-gel-coated fabric media in the form of disks, incorporated into an on-line solid-phase extraction system coupled with flame atomic absorption spectrometry (FAAS). This configuration produces low backpressure, allowing high loading flow rates and shorter analytical cycles. The potential of this technique was demonstrated for trace lead and cadmium determination in environmental water samples. The applicability of different sol-gel-coated FPSE media was investigated. The complex of the metal with ammonium pyrrolidine dithiocarbamate (APDC), formed on-line, was retained on the fabric surface, and methyl isobutyl ketone (MIBK) was used to elute the analytes prior to atomization. For a 90 s preconcentration time, enrichment factors of 140 and 38 and detection limits (3σ) of 1.8 and 0.4 μg L⁻¹ were achieved for lead and cadmium determination, respectively, with a sampling frequency of 30 h⁻¹. The accuracy of the proposed method was estimated by analyzing standard reference materials and spiked water samples. Copyright © 2016 Elsevier B.V. All rights reserved.
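The two headline figures of merit in such work, the enrichment factor and the 3σ detection limit, reduce to simple slope arithmetic. A back-of-envelope sketch with invented calibration numbers (not the paper's data):

import numpy as np

# Calibration with on-line preconcentration vs. direct aspiration.
conc = np.array([0.0, 5.0, 10.0, 20.0, 40.0])            # ug/L
abs_preconc = np.array([0.002, 0.140, 0.281, 0.562, 1.120])
abs_direct = np.array([0.002, 0.001, 0.002, 0.004, 0.008])

slope_pre = np.polyfit(conc, abs_preconc, 1)[0]
slope_dir = np.polyfit(conc, abs_direct, 1)[0]
sd_blank = 0.0017    # standard deviation of replicate blank readings

# Enrichment factor: ratio of calibration slopes with/without preconcentration.
print(f"enrichment factor ~ {slope_pre / slope_dir:.0f}")
# Detection limit: 3 x blank standard deviation over the preconcentrated slope.
print(f"LOD (3 sigma) = {3 * sd_blank / slope_pre:.2f} ug/L")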
Wu, Jingming; Ee, Kim Huey; Lee, Hian Kee
2005-08-05
Automated dynamic liquid-liquid-liquid microextraction (D-LLLME), controlled by a programmable syringe pump and combined with HPLC-UV, was investigated for the extraction and determination of five phenoxy acid herbicides in aqueous samples. In the extraction procedure, the acceptor phase was repeatedly withdrawn into and discharged from the hollow fiber by the syringe pump. This repetitive movement of the acceptor phase into and out of the hollow fiber channel facilitated the transfer of analytes from the donor phase, through the organic phase held in the pores of the fiber, into the acceptor phase. Parameters such as the organic solvent, the concentrations of the donor and acceptor phases, the plunger movement pattern, the speed of agitation, and the ionic strength of the donor phase were evaluated. Good linearity was achieved in the range of 0.5-500 ng/mL with coefficients of determination r2 > 0.9994. Good repeatability of extraction performance was obtained, with relative standard deviations lower than 7.5%. The method provided up to 490-fold enrichment within 13 min. In addition, the limits of detection (LODs) ranged from 0.1 to 0.4 ng/mL (S/N = 3). D-LLLME was successfully applied to the analysis of phenoxy acid herbicides in real environmental water samples.
Theanponkrang, Somjai; Suginta, Wipa; Weingart, Helge; Winterhalter, Mathias; Schulte, Albert
2015-01-01
A new automated pharmacoanalytical technique for convenient quantification of redox-active antibiotics has been established by combining the benefits of a carbon nanotube (CNT) sensor modification, which provides electrocatalytic activity for analyte detection, with the merits of a robotic electrochemical device capable of sequential non-manual sample measurements in 24-well microtiter plates. Norfloxacin (NFX) and ciprofloxacin (CFX), two standard fluoroquinolone antibiotics, were used in automated calibration measurements by differential pulse voltammetry (DPV), and linear ranges of 1-10 μM and 2-100 μM were accomplished for NFX and CFX, respectively. The lowest detectable levels were estimated to be 0.3±0.1 μM (n=7) for NFX and 1.6±0.1 μM (n=7) for CFX. In standard solutions or tablet samples of known content, both analytes could be quantified with the robotic DPV microtiter plate assay, with recoveries within ±4% of 100%. Recoveries were equally good when NFX was evaluated in human serum samples spiked with NFX. The simple instrumentation, convenience of execution, and high effectiveness in analyte quantitation recommend the merger of automated microtiter plate voltammetry and CNT-supported electrochemical drug detection as a novel methodology for antibiotic testing in pharmaceutical and clinical research and quality control laboratories.
NASA Astrophysics Data System (ADS)
Makarycheva, A. I.; Faerman, V. A.
2017-02-01
Automation patterns are analyzed, and a software solution for automated processing of chromatographic data and their subsequent storage, built on Mathcad and MS Excel spreadsheets, is developed. The proposed approach allows the data-processing algorithm to be modified without the involvement of programming specialists. The approach provides calculation of retention times and retention volumes, specific retention volumes, differential molar free energies of adsorption, partial molar solution enthalpies, and isosteric heats of adsorption. The developed solution is aimed at use in a small research group and was tested on a series of new gas chromatography sorbents. Retention parameters and thermodynamic sorption quantities were calculated for more than 20 analytes. The resulting data are presented in a form suitable for comparative analysis and can be used to identify sorbents with the most favorable properties for specific analytical problems.
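As an illustration of one of the quantities listed above, the isosteric heat of adsorption can be estimated from the temperature dependence of the specific retention volume via a van 't Hoff-type plot. A minimal sketch with invented retention data; the linearization and the sign convention are assumptions, not the authors' exact procedure:

```python
import numpy as np

R = 8.314  # gas constant, J/(mol*K)
T = np.array([423.0, 443.0, 463.0, 483.0])   # column temperatures, K (illustrative)
Vg = np.array([12.1, 8.3, 5.9, 4.4])         # specific retention volumes, mL/g (illustrative)

# ln(Vg) vs 1/T is approximately linear; the slope equals Qst / R
slope, intercept = np.polyfit(1.0 / T, np.log(Vg), 1)
Qst = slope * R  # isosteric heat of adsorption, J/mol (sign conventions vary)

print(f"Isosteric heat of adsorption: {Qst / 1000:.1f} kJ/mol")
```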
Automated flow cytometric analysis across large numbers of samples and cell types.
Chen, Xiaoyi; Hasan, Milena; Libri, Valentina; Urrutia, Alejandra; Beitz, Benoît; Rouilly, Vincent; Duffy, Darragh; Patin, Étienne; Chalmond, Bernard; Rogge, Lars; Quintana-Murci, Lluis; Albert, Matthew L; Schwikowski, Benno
2015-04-01
Multi-parametric flow cytometry is a key technology for characterization of immune cell phenotypes. However, robust high-dimensional post-analytic strategies for automated data analysis in large numbers of donors are still lacking. Here, we report a computational pipeline, called FlowGM, which minimizes operator input, is insensitive to compensation settings, and can be adapted to different analytic panels. A Gaussian Mixture Model (GMM)-based approach was utilized for initial clustering, with the number of clusters determined using Bayesian Information Criterion. Meta-clustering in a reference donor permitted automated identification of 24 cell types across four panels. Cluster labels were integrated into FCS files, thus permitting comparisons to manual gating. Cell numbers and coefficient of variation (CV) were similar between FlowGM and conventional gating for lymphocyte populations, but notably FlowGM provided improved discrimination of "hard-to-gate" monocyte and dendritic cell (DC) subsets. FlowGM thus provides rapid high-dimensional analysis of cell phenotypes and is amenable to cohort studies. Copyright © 2015. Published by Elsevier Inc.
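The core clustering step described above, a Gaussian mixture model with the number of components selected by the Bayesian Information Criterion, can be sketched generically. This is not the FlowGM code; the synthetic two-dimensional events merely stand in for compensated cytometry data:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
events = np.vstack([
    rng.normal(loc=[2.0, 5.0], scale=0.3, size=(500, 2)),  # e.g. lymphocyte-like cloud
    rng.normal(loc=[6.0, 3.0], scale=0.5, size=(300, 2)),  # e.g. monocyte-like cloud
    rng.normal(loc=[4.0, 8.0], scale=0.4, size=(200, 2)),  # e.g. DC-like cloud
])

# Fit GMMs over a range of component counts; keep the lowest-BIC model.
fits = [GaussianMixture(n_components=k, random_state=0).fit(events)
        for k in range(1, 8)]
best = min(fits, key=lambda m: m.bic(events))
labels = best.predict(events)
print(f"Selected {best.n_components} clusters by BIC")
```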
Automated MALDI matrix deposition method with inkjet printing for imaging mass spectrometry.
Baluya, Dodge L; Garrett, Timothy J; Yost, Richard A
2007-09-01
Careful matrix deposition on tissue samples for matrix-assisted laser desorption/ionization (MALDI) is critical for producing reproducible analyte ion signals. Traditional methods for matrix deposition are often considered an art rather than a science, with significant sample-to-sample variability. Here we report an automated method for matrix deposition, employing a desktop inkjet printer (<$200) with 5760 x 1440 dpi resolution and a six-channel piezoelectric head that delivers 3 pL/drop. The inkjet printer tray, designed to hold CDs and DVDs, was modified to hold microscope slides. Empty ink cartridges were filled with MALDI matrix solutions, including DHB in methanol/water (70:30) at concentrations up to 40 mg/mL. Various samples (including rat brain tissue sections and standards of small drug molecules) were prepared using three deposition methods (electrospray, airbrush, inkjet). A linear ion trap equipped with an intermediate-pressure MALDI source was used for analyses. Optical microscopic examination showed that matrix crystals were formed evenly across the sample. There was minimal background signal after storing the matrix in the cartridges over a 6-month period. Overall, the mass spectral images gathered from inkjet-printed tissue specimens were of better quality and more reproducible than from specimens prepared by the electrospray and airbrush methods.
IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics.
Hoyt, Robert Eugene; Snider, Dallas; Thompson, Carla; Mantravadi, Sarita
2016-10-11
We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. The objective of this study was to report the features of IBMWA and to discuss how this software compares, subjectively and objectively, with other data mining programs. The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Using a validated dataset, IBMWA delivered predictions similar to those of several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix. IBMWA is a new alternative for data analytics software that automates descriptive, predictive, and visual analytics. This program is very user-friendly but requires data preprocessing, statistical conceptual understanding, and domain expertise.
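Since the review notes that sensitivity, specificity, and a confusion matrix are absent from IBMWA's output, it is worth recalling how cheaply these are derived from any binary classifier's predictions. A generic sketch with placeholder labels:

```python
from sklearn.metrics import confusion_matrix

y_true = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]  # placeholder ground-truth labels
y_pred = [1, 0, 1, 0, 0, 1, 1, 0, 1, 0]  # placeholder model predictions

# For binary {0, 1} labels, ravel() yields TN, FP, FN, TP in that order.
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)   # true positive rate
specificity = tn / (tn + fp)   # true negative rate
print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")
```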
Microfluidic devices to enrich and isolate circulating tumor cells
Myung, J. H.; Hong, S.
2015-01-01
Given the potential clinical impact of circulating tumor cells (CTCs) in blood as a clinical biomarker for the diagnosis and prognosis of various cancers, a myriad of detection methods for CTCs have recently been introduced. Among those, microfluidic devices are particularly promising, as they uniquely offer micro-scale analytical systems highlighted by low consumption of samples and reagents, high flexibility to accommodate other cutting-edge technologies, precise and well-defined flow behaviors, and automation capability, presenting significant advantages over conventional larger-scale systems. In this review, we highlight the advantages of microfluidic devices and their translational potential for CTC detection, categorized by miniaturization of bench-top analytical instruments, integration capability with nanotechnologies, and in situ or sequential analysis of captured CTCs. This review provides a comprehensive overview of recent advances in CTC detection achieved through the application of microfluidic devices, and of the challenges these promising technologies must overcome to be clinically impactful. PMID:26549749
Burtis, Carl A.; Johnson, Wayne F.; Walker, William A.
1988-01-01
A rotor and disc assembly for use in a centrifugal fast analyzer. The assembly is designed to process multiple samples of whole blood followed by aliquoting of the resultant serum into precisely measured samples for subsequent chemical analysis. The assembly requires minimal operator involvement with no mechanical pipetting. The system comprises (1) a whole blood sample disc, (2) a serum sample disc, (3) a sample preparation rotor, and (4) an analytical rotor. The blood sample disc and serum sample disc are designed with a plurality of precision bore capillary tubes arranged in a spoked array. Samples of blood are loaded into the blood sample disc in capillary tubes filled by capillary action and centrifugally discharged into cavities of the sample preparation rotor where separation of serum and solids is accomplished. The serum is loaded into the capillaries of the serum sample disc by capillary action and subsequently centrifugally expelled into cuvettes of the analytical rotor for analysis by conventional methods.
A Variational Approach to the Analysis of Dissipative Electromechanical Systems
Allison, Andrew; Pearce, Charles E. M.; Abbott, Derek
2014-01-01
We develop a method for systematically constructing Lagrangian functions for dissipative mechanical, electrical, and electromechanical systems. We derive the equations of motion for some typical electromechanical systems using deterministic principles that are strictly variational. We do not use any ad hoc features that are added on after the analysis has been completed, such as the Rayleigh dissipation function. We generalise the concept of potential, and define generalised potentials for dissipative lumped system elements. Our innovation offers a unified approach to the analysis of electromechanical systems where there are energy and power terms in both the mechanical and electrical parts of the system. Using our novel technique, we can take advantage of the analytic approach from mechanics, and we can apply these powerful analytical methods to electrical and to electromechanical systems. We can analyse systems that include non-conservative forces. Our methodology is deterministic, does not require any special intuition, and is thus suitable for automation via a computer-based algebra package. PMID:24586221
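For readers unfamiliar with generalised (velocity-dependent) potentials, the textbook identity below shows how such a potential U enters the Euler-Lagrange equations. This is standard classical mechanics offered for orientation, not the authors' specific construction for dissipative elements:

```latex
% L = T - U with a velocity-dependent potential U(q, \dot{q}, t);
% the generalised force derived from U is then:
\begin{align}
  \frac{d}{dt}\frac{\partial L}{\partial \dot{q}_j}
    - \frac{\partial L}{\partial q_j} &= 0,
  \qquad L = T - U(q,\dot{q},t), \\
  Q_j &= -\frac{\partial U}{\partial q_j}
        + \frac{d}{dt}\,\frac{\partial U}{\partial \dot{q}_j}.
\end{align}
```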
Total Triiodothyronine by Fluorescence Polarization Immunoassay (FPIA),
Graves' disease. Traditionally, radioimmunoassays (RIA) have been employed for the determination of total T3. Enzyme immunoassays (EIA) and fluorescence immunoassays (FIA) have been developed for many of the analytes that formerly were measured using RIA. One variation of this new generation of immunoassays is fluorescence polarization. A fluorescence polarization immunoassay (FPIA) method for total T3 has been automated by adaptation to the TDx (Abbott, Chicago, IL) clinical analyzer. The TDx total T3 assay has been evaluated as a replacement for an RIA total T3
Boom Minimization Framework for Supersonic Aircraft Using CFD Analysis
NASA Technical Reports Server (NTRS)
Ordaz, Irian; Rallabhandi, Sriram K.
2010-01-01
A new framework is presented for shape optimization using analytical shape functions and high-fidelity computational fluid dynamics (CFD) via Cart3D. The focus of the paper is the system-level integration of several key enabling analysis tools and automation methods to perform shape optimization and reduce sonic boom footprint. A boom mitigation case study subject to performance, stability and geometrical requirements is presented to demonstrate a subset of the capabilities of the framework. Lastly, a design space exploration is carried out to assess the key parameters and constraints driving the design.
Buscombe, Daniel D.; Rubin, David M.
2012-01-01
In this, the second of a pair of papers on the structure of well-sorted natural granular material (sediment), new methods are described for automated measurements from images of sediment, of: 1) particle-size standard deviation (arithmetic sorting) with and without apparent void fraction; and 2) mean particle size in material with void fraction. A variety of simulations of granular material are used for testing purposes, in addition to images of natural sediment. Simulations are also used to establish that the effects on automated particle sizing of grains visible through the interstices of the grains at the very surface of a granular material continue to a depth of approximately 4 grain diameters and that this is independent of mean particle size. Ensemble root-mean squared error between observed and estimated arithmetic sorting coefficients for 262 images of natural silts, sands and gravels (drawn from 8 populations) is 31%, which reduces to 27% if adjusted for bias (slope correction between observed and estimated values). These methods allow non-intrusive and fully automated measurements of surfaces of unconsolidated granular material. With no tunable parameters or empirically derived coefficients, they should be broadly universal in appropriate applications. However, empirical corrections may need to be applied for the most accurate results. Finally, analytical formulas are derived for the one-step pore-particle transition probability matrix, estimated from the image's autocorrelogram, from which void fraction of a section of granular material can be estimated directly. This model gives excellent predictions of bulk void fraction yet imperfect predictions of pore-particle transitions.
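The closing idea, a one-step pore-particle transition probability matrix from which void fraction follows, can be illustrated by direct counting on a binarized section image. Note the paper estimates the matrix from the image's autocorrelogram rather than by counting, and the synthetic image here is only a stand-in:

```python
import numpy as np

rng = np.random.default_rng(1)
img = (rng.random((64, 64)) > 0.4).astype(int)  # 1 = particle, 0 = pore (synthetic)

# Count horizontal neighbor pairs (state at x, state at x+1).
pairs = np.stack([img[:, :-1].ravel(), img[:, 1:].ravel()], axis=1)
counts = np.zeros((2, 2))
for a, b in pairs:
    counts[a, b] += 1
P = counts / counts.sum(axis=1, keepdims=True)  # row-stochastic transition matrix

void_fraction = 1.0 - img.mean()
print("one-step transition matrix P:\n", P)
print(f"bulk void fraction: {void_fraction:.2f}")
```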
Cheng, Wing-Chi; Yau, Tsan-Sang; Wong, Ming-Kei; Chan, Lai-Ping; Mok, Vincent King-Kuen
2006-10-16
A rapid urinalysis system based on SPE-LC-MS/MS with an in-house post-analysis data management system has been developed for the simultaneous identification and semi-quantitation of opiates (morphine, codeine), methadone, amphetamines (amphetamine, methylamphetamine (MA), 3,4-methylenedioxyamphetamine (MDA) and 3,4-methylenedioxymethamphetamine (MDMA)), 11 benzodiazepines or their metabolites, and ketamine. The urine samples are subjected to automated solid-phase extraction prior to analysis by LC-MS (Finnigan Surveyor LC connected to a Finnigan LCQ Advantage) fitted with an Alltech Rocket Platinum EPS C-18 column. With a single-point calibration at the cut-off concentration for each analyte, simultaneous identification and semi-quantitation of the above-mentioned drugs can be achieved in a 10 min run per urine sample. A computer macro-program package was developed to automatically retrieve appropriate data from the analytical data files, compare results with preset values (such as cut-off concentrations and MS matching scores) for each drug being analyzed, and generate user-defined Excel reports that flag all positive and negative results in a batch-wise manner for ease of checking. The final analytical results are automatically copied into an Access database for report generation purposes. Through the use of automation in sample preparation, simultaneous identification and semi-quantitation by LC-MS/MS, and a tailor-made post-analysis data management system, this new urinalysis system significantly improves the quality of results, reduces post-analysis data treatment time and data-transfer errors, and is suitable for high-throughput laboratories operating in a batch-wise manner.
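The single-point-calibration logic lends itself to a compact sketch: each analyte's peak area is scaled by a calibrator run at its cut-off concentration, and the resulting estimate is compared against that cut-off. The drug names, areas, and cut-offs below are illustrative, not the laboratory's actual values:

```python
# Illustrative cut-off concentrations (ng/mL) and calibrator peak areas.
CUTOFFS_NG_ML = {"morphine": 300.0, "codeine": 300.0, "ketamine": 100.0}
CALIBRATOR_AREAS = {"morphine": 1.5e5, "codeine": 1.4e5, "ketamine": 9.0e4}

def screen(sample_areas: dict) -> dict:
    """Semi-quantitate each analyte via single-point calibration at its cut-off."""
    report = {}
    for drug, cutoff in CUTOFFS_NG_ML.items():
        # Estimated concentration = (sample area / calibrator area) * cut-off.
        est = sample_areas.get(drug, 0.0) / CALIBRATOR_AREAS[drug] * cutoff
        report[drug] = (round(est, 1), "POSITIVE" if est >= cutoff else "negative")
    return report

print(screen({"morphine": 2.1e5, "ketamine": 3.0e4}))
```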
Cabrera, Carlos; Chang, Lei; Stone, Mars; Busch, Michael; Wilson, David H
2015-11-01
Nucleic acid testing (NAT) has become the standard for high sensitivity in detecting low levels of virus. However, adoption of NAT can be cost prohibitive in low-resource settings where access to extreme sensitivity could be clinically advantageous for early detection of infection. We report development and preliminary validation of a simple, low-cost, fully automated digital p24 antigen immunoassay with the sensitivity of quantitative NAT viral load (NAT-VL) methods for detection of acute HIV infection. We developed an investigational 69-min immunoassay for p24 capsid protein for use on a novel digital analyzer on the basis of single-molecule-array technology. We evaluated the assay for sensitivity by dilution of standardized preparations of p24, cultured HIV, and preseroconversion samples. We characterized analytical performance and concordance with 2 NAT-VL methods and 2 contemporary p24 Ag/Ab combination immunoassays with dilutions of viral isolates and samples from the earliest stages of HIV infection. Analytical sensitivity was 0.0025 ng/L p24, equivalent to 60 HIV RNA copies/mL. The limit of quantification was 0.0076 ng/L, and imprecision across 10 runs was <10% for samples as low as 0.09 ng/L. Clinical specificity was 95.1%. Sensitivity concordance vs NAT-VL on dilutions of preseroconversion samples and Group M viral isolates was 100%. The digital immunoassay exhibited >4000-fold greater sensitivity than contemporary immunoassays for p24 and sensitivity equivalent to that of NAT methods for early detection of HIV. The data indicate that NAT-level sensitivity for acute HIV infection is possible with a simple, low-cost digital immunoassay. © 2015 American Association for Clinical Chemistry.
NASA Astrophysics Data System (ADS)
Clark, A. E.; Yoon, S.; Sheesley, R. J.; Usenko, S.
2014-12-01
DISCOVER-AQ is a NASA mission seeking to better understand air quality in cities across the United States. In September 2013, flight, satellite, and ground-based data were collected in Houston, TX and the surrounding metropolitan area. Over 300 particulate matter filter samples were collected at four sites across Houston as part of the ground-based sampling efforts. Samples include total suspended particulate matter (TSP) and fine particulate matter (less than 2.5 μm in aerodynamic diameter; PM2.5). For this project, an analytical method has been developed for the pressurized liquid extraction (PLE) of a wide variety of organic tracers and contaminants from quartz fiber filters (QFFs). Over 100 compounds were selected, including polycyclic aromatic hydrocarbons (PAHs), hopanes, levoglucosan, organochlorine pesticides, polychlorinated biphenyls (PCBs), polybrominated diphenyl ethers (PBDEs), and organophosphate flame retardants (OPFRs). Currently, there is no analytical method validated for the reproducible extraction of all seven compound classes in a single automated technique. Prior to extraction, QFF samples were spiked with known amounts of target analyte standards and isotopically labeled surrogate standards. The QFFs were then extracted with methylene chloride:acetone at high temperature (100 °C) and pressure (1500 psi) using a Thermo Dionex Accelerated Solvent Extractor system (ASE 350). Extracts were concentrated, spiked with known amounts of isotopically labeled internal standards, and analyzed by gas chromatography coupled with mass spectrometry utilizing electron ionization and electron capture negative ionization. Target analytes were surrogate recovery-corrected to account for analyte loss during sample preparation. Ambient concentrations of over 100 organic tracers and contaminants will be presented for four sites in Houston during DISCOVER-AQ.
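Surrogate recovery correction, mentioned in the penultimate sentence, amounts to dividing each measured amount by the fractional recovery of a matched isotopically labeled surrogate. A one-screen sketch with invented numbers:

```python
# Illustrative values, not data from the study.
surrogate_spiked = 50.0    # ng of labeled surrogate added to the filter before extraction
surrogate_measured = 38.5  # ng of surrogate found after extraction and analysis
analyte_measured = 12.4    # ng of a native target analyte measured in the extract

recovery = surrogate_measured / surrogate_spiked      # fractional recovery
analyte_corrected = analyte_measured / recovery       # recovery-corrected amount
print(f"recovery = {recovery:.0%}, corrected amount = {analyte_corrected:.1f} ng")
```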
Automated Deployment of Advanced Controls and Analytics in Buildings
NASA Astrophysics Data System (ADS)
Pritoni, Marco
Buildings use 40% of primary energy in the US. Recent studies show that developing energy analytics and enhancing control strategies can significantly improve their energy performance. However, the deployment of advanced control software applications has been mostly limited to academic studies; larger-scale implementations are held back by the substantial engineering time and customization required to accommodate the differences among buildings. This study demonstrates how physics-inspired data-driven models can be used to develop portable analytics and control applications for buildings. Specifically, I demonstrate application of these models in all phases of the deployment of advanced controls and analytics in buildings: in the first phase, "Site Preparation and Interface with Legacy Systems", I used models to discover or map relationships among building components, automatically gathering the metadata (information about data points) necessary to run the applications. During the second phase, "Application Deployment and Commissioning", models automatically learn system parameters used for advanced controls and analytics. In the third phase, "Continuous Monitoring and Verification", I utilized models to automatically measure the energy performance of a building that has implemented advanced control strategies. In the conclusions, I discuss future challenges and suggest potential strategies for these innovative control systems to be widely deployed in the market. This dissertation provides useful new tools in terms of procedures, algorithms, and models to facilitate the automated deployment of advanced controls and analytics and accelerate their wide adoption in buildings.
Automated Trait Scores for "GRE"® Writing Tasks. Research Report. ETS RR-15-15
ERIC Educational Resources Information Center
Attali, Yigal; Sinharay, Sandip
2015-01-01
The "e-rater"® automated essay scoring system is used operationally in the scoring of the argument and issue tasks that form the Analytical Writing measure of the "GRE"® General Test. For each of these tasks, this study explored the value added of reporting 4 trait scores for each of these 2 tasks over the total e-rater score.…
Dialogue as Data in Learning Analytics for Productive Educational Dialogue
ERIC Educational Resources Information Center
Knight, Simon; Littleton, Karen
2015-01-01
This paper provides a novel, conceptually driven stance on the state of the contemporary analytic challenges faced in the treatment of dialogue as a form of data across on- and offline sites of learning. In prior research, preliminary steps have been taken to detect occurrences of such dialogue using automated analysis techniques. Such advances…
From thermometric to spectrophotometric kinetic-catalytic methods of analysis. A review.
Cerdà, Víctor; González, Alba; Danchana, Kaewta
2017-05-15
Kinetic-catalytic analytical methods have proved to be very simple and highly sensitive strategies for chemical analysis that rely on straightforward instrumentation [1,2]. Molecular absorption spectrophotometry is commonly used as the detection technique. However, other detection systems, such as electrochemical or thermometric ones, offer interesting possibilities since they are not affected by the color or turbidity of the samples. This review describes our initial experience with thermometric kinetic-catalytic methods through to our current experience exploiting spectrophotometric flow techniques to automate this kind of reaction, including the use of integrated chips. Procedures for the determination of inorganic and organic species in organic and inorganic matrices are presented. Copyright © 2017 Elsevier B.V. All rights reserved.
Green aspects, developments and perspectives of liquid phase microextraction techniques.
Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek
2014-02-01
Determination of analytes at trace levels in complex samples (e.g. biological samples or contaminated water or soils) is often required for environmental assessment and monitoring, as well as for scientific research in the field of environmental pollution. Few analytical techniques are sensitive enough for the direct determination of trace components in such samples, so in many cases a preliminary isolation/enrichment step is required prior to analysis. In this work the newest trends and innovations in liquid-phase microextraction, such as single-drop microextraction (SDME), hollow fiber liquid-phase microextraction (HF-LPME), and dispersive liquid-liquid microextraction (DLLME), are discussed, including their critical evaluation and possible application in analytical practice. The described modifications of extraction techniques deal with system miniaturization and/or automation, the use of ultrasound and physical agitation, and electrochemical methods. Particular attention is given to pro-ecological aspects; therefore, the possible use of novel, non-toxic extracting agents (inter alia, ionic liquids, coacervates, surfactant solutions and reverse micelles) in liquid-phase microextraction techniques is evaluated in depth. Also presented are new methodological solutions and the related instruments and devices for the efficient liquid-phase microextraction of analytes, which have found application at the stage of sample preparation prior to chromatographic determination. © 2013 Published by Elsevier B.V.
Visualizing statistical significance of disease clusters using cartograms.
Kronenfeld, Barry J; Wong, David W S
2017-05-15
Health officials and epidemiological researchers often use maps of disease rates to identify potential disease clusters. Because these maps exaggerate the prominence of low-density districts and hide potential clusters in urban (high-density) areas, many researchers have used density-equalizing maps (cartograms) as a basis for epidemiological mapping. However, no existing guidelines address the visual assessment of statistical uncertainty on such maps. To address this shortcoming, we develop techniques for visual determination of the statistical significance of clusters spanning one or more districts on a cartogram. We developed the techniques within a geovisual analytics framework that does not rely on automated significance testing, and can therefore facilitate visual analysis to detect clusters that automated techniques might miss. On a cartogram of the at-risk population, the statistical significance of a disease cluster can be determined from the rate, area, and shape of the cluster under standard hypothesis testing scenarios. We develop formulae to determine, for a given rate, the area required for statistical significance of a priori and a posteriori designated regions under certain test assumptions. Uniquely, our approach enables dynamic inference for aggregate regions formed by combining individual districts. The method is implemented in interactive software that provides choropleth mapping, automated legend construction and dynamic search tools to facilitate cluster detection and assessment of the validity of tested assumptions. A case study of leukemia incidence analysis in California demonstrates the ability to visually distinguish between statistically significant and insignificant regions. The proposed geovisual analytics approach enables intuitive visual assessment of the statistical significance of arbitrarily defined regions on a cartogram. Our research prompts a broader discussion of the role of geovisual exploratory analyses in disease mapping and the appropriate framework for visually assessing the statistical significance of spatial clusters.
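The significance test underlying such formulae can be sketched compactly: under the null hypothesis, the case count in a designated region is Poisson-distributed with mean equal to the background rate times the at-risk population, so the cluster's p-value is a single Poisson tail. The rate, population, and count below are invented for illustration:

```python
from scipy.stats import poisson

background_rate = 12.0 / 100_000   # cases per person-year, assumed background
population = 45_000                # at-risk population of the tested region
observed_cases = 12

expected = background_rate * population
# One-sided p-value: probability of observing at least this many cases under the null.
p_value = poisson.sf(observed_cases - 1, expected)
print(f"expected={expected:.1f}, observed={observed_cases}, p={p_value:.4f}")
```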
Kema, I P; Meijer, W G; Meiborg, G; Ooms, B; Willemse, P H; de Vries, E G
2001-10-01
Profiling of the plasma indoles tryptophan, 5-hydroxytryptophan (5-HTP), serotonin, and 5-hydroxyindoleacetic acid (5-HIAA) is useful in the diagnosis and follow-up of patients with carcinoid tumors. We describe an automated method for the profiling of these indoles in protein-containing matrices, as well as the plasma indole concentrations in healthy controls and patients with carcinoid tumors. Plasma, cerebrospinal fluid, and tissue homogenates were prepurified by automated on-line solid-phase extraction (SPE) in Hysphere Resin SH SPE cartridges containing strong hydrophobic polystyrene resin. Analytes were eluted from the SPE cartridge by column switching. Subsequent separation and detection were performed by reversed-phase HPLC combined with fluorometric detection in a total cycle time of 20 min. We obtained samples from 14 healthy controls and 17 patients with metastasized midgut carcinoid tumors for plasma indole analysis. In the patient group, urinary excretion of 5-HIAA and serotonin was compared with concentrations of plasma indoles. Within- and between-series CVs for indoles in platelet-rich plasma were 0.6-6.2% and 3.7-12%, respectively. Results for platelet-rich plasma serotonin compared favorably with those obtained by single-component analysis. Plasma 5-HIAA, but not 5-HTP, was detectable in 8 of 17 patients with carcinoid tumors. In the patient group, platelet-rich plasma total tryptophan correlated negatively with platelet-rich plasma serotonin (P = 0.021; r = -0.56), urinary 5-HIAA (P = 0.003; r = -0.68), and urinary serotonin (P < 0.0001; r = -0.80). The present chromatographic approach reduces analytical variation and the time needed for analysis, and gives more detailed information about metabolic deviations in indole metabolism than do manual, single-component analyses.
Development of Process Analytical Technology (PAT) methods for controlled release pellet coating.
Avalle, P; Pollitt, M J; Bradley, K; Cooper, B; Pearce, G; Djemai, A; Fitzpatrick, S
2014-07-01
This work focused on the control of the manufacturing process for a controlled release (CR) pellet product, within a Quality by Design (QbD) framework. The manufacturing process was Wurster coating: firstly layering active pharmaceutical ingredient (API) onto sugar pellet cores and secondly a controlled release (CR) coating. For each of these two steps, development of a Process Analytical Technology (PAT) method is discussed and also a novel application of automated microscopy as the reference method. Ultimately, PAT methods should link to product performance and the two key Critical Quality Attributes (CQAs) for this CR product are assay and release rate, linked to the API and CR coating steps respectively. In this work, the link between near infra-red (NIR) spectra and those attributes was explored by chemometrics over the course of the coating process in a pilot scale industrial environment. Correlations were built between the NIR spectra and coating weight (for API amount), CR coating thickness and dissolution performance. These correlations allow the coating process to be monitored at-line and so better control of the product performance in line with QbD requirements. Copyright © 2014 Elsevier B.V. All rights reserved.
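The chemometric link described above, from NIR spectra to coating weight or thickness, is typically a latent-variable regression. A minimal partial least squares sketch on random stand-in spectra; PLS itself is an assumption here, since the abstract says only that correlations were built by chemometrics:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(2)
spectra = rng.random((60, 400))  # 60 samples x 400 wavelength channels (synthetic)
# Synthetic response: coating weight loosely tied to one spectral region plus noise.
coating_weight = spectra[:, 120:140].mean(axis=1) * 10 + rng.normal(0, 0.05, 60)

pls = PLSRegression(n_components=3).fit(spectra, coating_weight)
print("R^2 on training data:", round(pls.score(spectra, coating_weight), 3))
```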
Stability of ricinine, abrine, and alpha-amanitin in finished tap ...
Journal Article. Ricinine and abrine are potential indicators of drinking water contamination by the biotoxins ricin and abrin, respectively. Simultaneous detection of ricinine and abrine, along with α-amanitin, another potential biotoxin water contaminant, is reported through the use of automated sample preparation via solid-phase extraction and detection using liquid chromatography/tandem mass spectrometry. Performance of the method was characterized over eight analytical batches with quality control samples analyzed over 10 days. For solutions of analytes prepared with appropriate preservatives, the minimum reporting level (MRL) was 0.50 μg/L for ricinine and abrine and 2.0 μg/L for α-amanitin. Among the analytes, the accuracy of the analysis ranged between 93 and 100% at concentrations of 1-2.5 x the MRL, with analytical precision ranging from 4 to 8%. Five drinking waters representing a range of water quality parameters and disinfection practices were fortified with the analytes and analyzed over a 28 day period to determine their storage stability in these waters. Ricinine was observed to be stable for 28 days in all tap waters. The analytical signal decreased within 5 h of sample preparation for abrine and α-amanitin in some waters but afterwards remained stable for 28 days. The magnitude of the decrease correlated with common water quality parameters potentially related to sorption of contaminants onto dissolved and colloidal components within
Integrating laboratory robots with analytical instruments--must it really be so difficult?
Kramer, G W
1990-09-01
Creating a reliable system from discrete laboratory instruments is often a task fraught with difficulties. While many modern analytical instruments are marvels of detection and data handling, attempts to create automated analytical systems incorporating such instruments are often frustrated by their human-oriented control structures and their egocentricity. The laboratory robot, while fully susceptible to these problems, extends such compatibility issues to the physical dimensions involving sample interchange, manipulation, and event timing. The workcell concept was conceived to describe the procedure and equipment necessary to carry out a single task during sample preparation. This notion can be extended to organize all operations in an automated system. Each workcell, no matter how complex its local repertoire of functions, must be minimally capable of accepting information (commands, data), returning information on demand (status, results), and being started, stopped, and reset by a higher level device. Even the system controller should have a mode where it can be directed by instructions from a higher level.
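The workcell contract described above (accept information, return status or results on demand, and respond to start, stop, and reset from a higher-level device) maps naturally onto a small interface. A hypothetical sketch; the DilutionStation class and its command names are invented for illustration:

```python
from abc import ABC, abstractmethod

class Workcell(ABC):
    """Minimal contract every workcell must satisfy, however complex internally."""
    @abstractmethod
    def send(self, command: str, data: dict = None) -> None: ...  # accept information
    @abstractmethod
    def status(self) -> dict: ...   # return state and results on demand
    @abstractmethod
    def start(self) -> None: ...
    @abstractmethod
    def stop(self) -> None: ...
    @abstractmethod
    def reset(self) -> None: ...

class DilutionStation(Workcell):
    def __init__(self) -> None:
        self._state = "idle"
    def send(self, command: str, data: dict = None) -> None:
        self._state = f"queued:{command}"
    def status(self) -> dict:
        return {"state": self._state}
    def start(self) -> None:
        self._state = "running"
    def stop(self) -> None:
        self._state = "stopped"
    def reset(self) -> None:
        self._state = "idle"

# A higher-level controller drives the cell only through the common interface.
cell = DilutionStation()
cell.send("dilute", {"factor": 10})
cell.start()
print(cell.status())
```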
Hematology of healthy Florida manatees (Trichechus manatus)
Harvey, J.W.; Harr, K.E.; Murphy, D.; Walsh, M.T.; Nolan, E.C.; Bonde, R.K.; Pate, M.G.; Deutsch, C.J.; Edwards, H.H.; Clapp, W.L.
2009-01-01
Background: Hematologic analysis is an important tool in evaluating the general health status of free-ranging manatees and in the diagnosis and monitoring of rehabilitating animals. Objectives: The purpose of this study was to evaluate diagnostically important hematologic analytes in healthy manatees (Trichechus manatus) and to assess variations with respect to location (free ranging vs captive), age class (small calves, large calves, subadults, and adults), and gender. Methods: Blood was collected from 55 free-ranging and 63 captive healthy manatees. Most analytes were measured using a CELL-DYN 3500R; automated reticulocytes were measured with an ADVIA 120. Standard manual methods were used for differential leukocyte counts, reticulocyte and Heinz body counts, and plasma protein and fibrinogen concentrations. Results: Rouleaux, slight polychromasia, stomatocytosis, and low numbers of schistocytes and nucleated RBCs (NRBCs) were seen often in stained blood films. Manual reticulocyte counts were higher than automated reticulocyte counts. Heinz bodies were present in erythrocytes of most manatees. Compared with free-ranging manatees, captive animals had slightly lower MCV, MCH, and eosinophil counts and slightly higher heterophil and NRBC counts, and fibrinogen concentration. Total leukocyte, heterophil, and monocyte counts tended to be lower in adults than in younger animals. Small calves tended to have higher reticulocyte counts and NRBC counts than older animals. Conclusions: Hematologic findings were generally similar between captive and free-ranging manatees. Higher manual reticulocyte counts suggest the ADVIA detects only reticulocytes containing large amounts of RNA. Higher reticulocyte and NRBC counts in young calves probably reflect an increased rate of erythropoiesis compared with older animals. © 2009 American Society for Veterinary Clinical Pathology.
Stolker, Alida A. M.; Peters, Ruud J. B.; Zuiderent, Richard; DiBussolo, Joseph M.
2010-01-01
There is an increasing interest in screening methods for quick and sensitive analysis of various classes of veterinary drugs with limited sample pre-treatment. Turbulent flow chromatography in combination with tandem mass spectrometry has been applied for the first time as an efficient screening method in routine analysis of milk samples. Eight veterinary drugs, belonging to seven different classes, were selected for this study. After developing and optimising the method, parameters such as linearity, repeatability, matrix effects and carry-over were studied. The screening method was then tested in the routine analysis of 12 raw milk samples. Even without internal standards, the linearity of the method was found to be good in the concentration range of 50 to 500 µg/L. Regarding repeatability, RSDs below 12% were obtained for all analytes, with only a few exceptions. The limits of detection were between 0.1 and 5.2 µg/L, far below the maximum residue levels for milk set by the EU regulations. Although matrix effects (ion suppression or enhancement) were observed for all analytes, the method proved useful for screening purposes because of its sensitivity, linearity and repeatability. Furthermore, in the routine analysis of the raw milk samples, no false positive or false negative results were obtained. PMID:20379812
Analytical Ultrasonics in Materials Research and Testing
NASA Technical Reports Server (NTRS)
Vary, A.
1986-01-01
Research results in analytical ultrasonics for characterizing structural materials from metals and ceramics to composites are presented. General topics covered by the conference included: status and advances in analytical ultrasonics for characterizing material microstructures and mechanical properties; status and prospects for ultrasonic measurements of microdamage, degradation, and underlying morphological factors; status and problems in precision measurements of frequency-dependent velocity and attenuation for materials analysis; procedures and requirements for automated, digital signal acquisition, processing, analysis, and interpretation; incentives for analytical ultrasonics in materials research and materials processing, testing, and inspection; and examples of progress in ultrasonics for interrelating microstructure, mechanical properties, and dynamic response.
NASA Technical Reports Server (NTRS)
Barrientos, Francesca; Castle, Joseph; McIntosh, Dawn; Srivastava, Ashok
2007-01-01
This document presents a preliminary evaluation of the utility of the FAA Safety Analytics Thesaurus (SAT) in enhancing automated document processing applications under development at NASA Ames Research Center (ARC). Current development efforts at ARC are described, including overviews of the statistical machine learning techniques that have been investigated. An analysis of opportunities for applying thesaurus knowledge to improving algorithm performance is then presented.
ERIC Educational Resources Information Center
Field, Christopher Ryan
2009-01-01
Developments in analytical chemistry were made using acoustically levitated small volumes of liquid to study enzyme reaction kinetics and by detecting volatile organic compounds in the gas phase using single-walled carbon nanotubes. Experience gained in engineering, electronics, automation, and software development from the design and…
Information Tailoring Enhancements for Large-Scale Social Data
2015-12-15
Linked Accounts: In addition to linking Twitter accounts, users can now link their Instagram accounts. This is encouraged because users can use their... Twitter Limits: All limits and privileges regarding Twitter data collection are shown here. Instagram Limits: All limits and privileges regarding Instagram data collection are shown here. Analytics Limits: The availability of analytics for Twitter and Instagram is shown here.
Acquisition of Real-Time Operation Analytics for an Automated Serial Sectioning System
Madison, Jonathan D.; Underwood, O. D.; Poulter, Gregory A.; ...
2017-03-22
Mechanical serial sectioning is a highly repetitive technique employed in metallography for the rendering of 3D reconstructions of microstructure. While alternate techniques such as ultrasonic detection, micro-computed tomography, and focused ion beam milling have progressed much in recent years, few alternatives provide equivalent opportunities for comparatively high resolutions over significantly sized cross-sectional areas and volumes. To that end, the introduction of automated serial sectioning systems has greatly heightened repeatability and increased data collection rates while diminishing opportunity for mishandling and other user-introduced errors. Unfortunately, even among current, state-of-the-art automated serial sectioning systems, challenges in data collection have not been fully eradicated. Therefore, this paper highlights two specific advances to assist in this area: a non-contact laser triangulation method for assessment of material removal rates and a newly developed graphical user interface providing real-time monitoring of experimental progress. Furthermore, both are shown to be helpful in the rapid identification of anomalies and interruptions, while also providing comparable and less error-prone measures of removal rate over the course of these long-term, challenging, and innately destructive characterization experiments.
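The removal-rate measurement reduces to differencing surface heights between successive sections. A toy sketch with invented laser triangulation readings:

```python
import numpy as np

# Illustrative surface heights measured by laser triangulation, one per section (um).
heights_um = np.array([1000.0, 996.9, 993.7, 990.8, 987.6])
removal_per_section = -np.diff(heights_um)  # material removed by each cut

print("removal per section (um):", removal_per_section)
print(f"mean rate: {removal_per_section.mean():.2f} um/section "
      f"(std {removal_per_section.std(ddof=1):.2f})")
```

An anomaly such as a skipped or doubled cut shows up immediately as an outlier in this series, which is consistent with the rapid anomaly identification the paper emphasizes.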
Schmitt, J Eric; Scanlon, Mary H; Servaes, Sabah; Levin, Dayna; Cook, Tessa S
2015-10-01
The advent of the ACGME's Next Accreditation System represents a significant new challenge for residencies and fellowships, owing to its requirements for more complex and detailed information. We developed a system of online assessment tools to provide comprehensive coverage of the twelve ACGME Milestones and digitized them using freely available cloud-based productivity tools. These tools include a combination of point-of-care procedural assessments, electronic quizzes, online modules, and other data entry forms. Using free statistical analytic tools, we also developed an automated system for management, processing, and data reporting. After one year of use, our Milestones project has resulted in the submission of over 20,000 individual data points. The use of automated statistical methods to generate resident-specific profiles has allowed for dynamic reports of individual residents' progress. These profiles both summarize data and also allow program directors access to more granular information as needed. Informatics-driven strategies for data assessment and processing represent feasible solutions to Milestones assessment and analysis, reducing the potential administrative burden for program directors, residents, and staff. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.
Lo, Andy; Tang, Yanan; Chen, Lu; Li, Liang
2013-07-25
Isotope labeling liquid chromatography-mass spectrometry (LC-MS) is a major analytical platform for quantitative proteome analysis. Incorporation of the isotopes used to distinguish samples plays a critical role in the success of this strategy. In this work, we optimized and automated a chemical derivatization protocol (dimethylation after guanidination, 2MEGA) to increase labeling reproducibility and reduce human intervention. We also evaluated the reagent compatibility of this protocol for handling biological samples in different types of buffers and surfactants. A commercially available liquid handler was used for reagent dispensation to minimize analyst intervention, and at least twenty protein digest samples could be prepared in a single run. Different front-end sample preparation methods for protein solubilization (SDS, urea, Rapigest™, and ProteaseMAX™) and two commercially available cell lysis buffers were evaluated for compatibility with the automated protocol. Better than 94% of the desired labeling could be obtained under all conditions studied except urea, where the rate was reduced to about 92% due to carbamylation of the peptide amines. This work illustrates that the automated 2MEGA labeling process can handle a wide range of protein samples containing the various reagents often encountered in protein sample preparation for quantitative proteome analysis. Copyright © 2013 Elsevier B.V. All rights reserved.
An Automated High-Throughput System to Fractionate Plant Natural Products for Drug Discovery
Tu, Ying; Jeffries, Cynthia; Ruan, Hong; Nelson, Cynthia; Smithson, David; Shelat, Anang A.; Brown, Kristin M.; Li, Xing-Cong; Hester, John P.; Smillie, Troy; Khan, Ikhlas A.; Walker, Larry; Guy, Kip; Yan, Bing
2010-01-01
The development of an automated, high-throughput fractionation procedure to prepare and analyze natural product libraries for drug discovery screening is described. Natural products obtained from plant materials worldwide were extracted and first prefractionated on polyamide solid-phase extraction cartridges to remove polyphenols, followed by high-throughput automated fractionation, drying, weighing, and reformatting for screening and storage. The analysis of fractions with UPLC coupled with MS, PDA and ELSD detectors provides information that facilitates characterization of compounds in active fractions. Screening of a portion of fractions yielded multiple assay-specific hits in several high-throughput cellular screening assays. This procedure modernizes the traditional natural product fractionation paradigm by seamlessly integrating automation, informatics, and multimodal analytical interrogation capabilities. PMID:20232897
Evaluation of tamoxifen and metabolites by LC-MS/MS and HPLC methods.
Heath, D D; Flat, S W; Wu, A H B; Pruitt, M A; Rock, C L
2014-01-01
Epidemiological and laboratory evidence suggests that quantification of serum or plasma levels of tamoxifen and its metabolites, 4-hydroxy-N-desmethyl-tamoxifen (endoxifen), Z-4-hydroxytamoxifen (4HT), N-desmethyl-tamoxifen (ND-tam), is a clinically useful tool in the assessment and monitoring of breast cancer status in patients taking adjuvant tamoxifen. A liquid chromatographic mass spectrometric method (LC-MS/MS) was used to measure the blood levels of tamoxifen and its metabolites. This fully automated analytical method is specific, accurate and sensitive. The LC-MS/MS automated technique has now become a widely accepted reference method. This study analysed a randomly selected batch of blood samples from participants enrolled in a breast cancer study to compare results from this reference method in 40 samples with those obtained from a recently developed high-performance liquid chromatography (HPLC) method with fluorescence detection. The mean (SD) concentrations for the LC-MS/MS method (endoxifen 12.6 [7.5] ng/mL, tamoxifen 105 [44] ng/mL, 4-HT 1.9 [1.0] ng/mL, ND-tam 181 [69] ng/mL) and the HPLC method (endoxifen 13.1 [7.8] ng/mL, tamoxifen 108 [55] ng/mL, 4-HT 1.8 [0.8] ng/mL, ND-tam 184 [81] ng/mL) did not show any significant differences. The results confirm that the HPLC method offers an accurate and comparable alternative for the quantification of tamoxifen and tamoxifen metabolites.
Chung, Chi-Jung; Kuo, Yu-Chen; Hsieh, Yun-Yu; Li, Tsai-Chung; Lin, Cheng-Chieh; Liang, Wen-Miin; Liao, Li-Na; Li, Chia-Ing; Lin, Hsueh-Chun
2017-11-01
This study applied open source technology to establish a subject-enabled analytics model that can enhance measurement statistics of case studies with public health data in cloud computing. The infrastructure of the proposed model comprises three domains: 1) the health measurement data warehouse (HMDW) as the case study repository, 2) self-developed modules for online health risk information statistics (HRIStat) in cloud computing, and 3) the prototype of a Web-based process automation system in statistics (PASIS) for the health risk assessment of case studies with subject-enabled evaluation. The system design employed freeware including Java applications, MySQL, and R packages to drive a health risk expert system (HRES). In the design, the HRIStat modules enforce the typical analytics methods for biomedical statistics, and the PASIS interfaces enable process automation of the HRES for cloud computing. The Web-based model supports two modes, step-by-step analysis and an auto-computing process, for preliminary evaluation and real-time computation, respectively. The proposed model was evaluated by recomputing prior studies on the epidemiological measurement of diseases caused by either heavy metal exposure in the environment or clinical complications in hospital. The validity of the simulations was confirmed against commercial statistics software. The model was installed on a stand-alone computer and on a cloud-server workstation to verify computing performance for more than 230K data sets; both setups reached an efficiency of about 10⁵ sets per second. The Web-based PASIS interface can be used for cloud computing, and the HRIStat module can be flexibly expanded with advanced subjects for measurement statistics. The analytics procedure of the HRES prototype is capable of providing assessment criteria prior to estimating the potential risk to public health. Copyright © 2017 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Benthem, Mark Hilary; Mowry, Curtis Dale; Kotula, Paul Gabriel
Thermal decomposition of the poly(dimethylsiloxane) compounds Sylgard® 184 and 186 was examined using thermal desorption coupled gas chromatography-mass spectrometry (TD/GC-MS) and multivariate analysis. This work describes a method of producing multiway data using a stepped thermal desorption. The technique involves sequentially heating a sample of the material of interest with subsequent analysis in a commercial GC/MS system. The decomposition chromatograms were analyzed using multivariate analysis tools including principal component analysis (PCA), factor rotation employing the varimax criterion, and multivariate curve resolution. The results of the analysis show seven components related to offgassing of various fractions of siloxanes that vary as a function of temperature. Thermal desorption coupled with gas chromatography-mass spectrometry (TD/GC-MS) is a powerful analytical technique for analyzing chemical mixtures. It has great potential in numerous analytic areas, including materials analysis, sports medicine, the detection of designer drugs, and biological research for metabolomics. Data analysis is complicated, far from automated, and can result in high false positive or false negative rates. We have demonstrated a step-wise TD/GC-MS technique that removes more volatile compounds from a sample before extracting the less volatile compounds. This creates an additional dimension of separation before the GC column, while simultaneously generating three-way data. Sandia's proven multivariate analysis methods, when applied to these data, have several advantages over current commercial options. The approach has also demonstrated potential for success in finding and enabling identification of trace compounds. Several challenges remain, however, including understanding the sources of noise in the data, outlier detection, improving the data pretreatment and analysis methods, developing a software tool for ease of use by the chemist, and demonstrating our belief that this multivariate analysis will enable superior differentiation capabilities. In addition, noise and system artifacts challenge the analysis of GC-MS data collected on lower-cost equipment, ubiquitous in commercial laboratories. This research has the potential to affect many areas of analytical chemistry including materials analysis, medical testing, and environmental surveillance. It could also provide a method to measure adsorption parameters for chemical interactions on various surfaces by measuring desorption as a function of temperature for mixtures. We have presented results of a novel method for examining offgas products of a common PDMS material. Our method involves utilizing a stepped TD/GC-MS data acquisition scheme that may be almost totally automated, coupled with multivariate analysis schemes. This method of data generation and analysis can be applied to a number of materials aging and thermal degradation studies.
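The first multivariate step named above, PCA on a matrix of chromatograms collected at successive desorption temperatures, can be sketched generically. Random numbers stand in for total-ion chromatograms, and the choice of seven components simply mirrors the count reported above:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
# Rows: desorption temperature steps x replicates; columns: retention-time bins.
chromatograms = rng.random((24, 600))

pca = PCA(n_components=7)
scores = pca.fit_transform(chromatograms)  # sample scores in the 7-component space
print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))
```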
Fleischer, Heidi; Ramani, Kinjal; Blitti, Koffi; Roddelkopf, Thomas; Warkentin, Mareike; Behrend, Detlef; Thurow, Kerstin
2018-02-01
Automation systems are well established in industry and in life science laboratories, especially in bioscreening and high-throughput applications. An increasing demand for automation solutions can be seen in the field of analytical measurement in chemical synthesis, quality control, medical and pharmaceutical applications, and research and development. In this study, an automation solution was developed and optimized for the investigation of new biliary endoprostheses (stents), which should reduce clogging after implantation in the human body. The material inside the stents (incrustations) has to be monitored regularly and under identical conditions. The elemental composition is one criterion to be monitored in stent development. The manual procedure was transferred to an automated process including sample preparation, elemental analysis using inductively coupled plasma mass spectrometry (ICP-MS), and data evaluation. For safety reasons, microwave-assisted acid digestion was executed outside the automation system. The performance of the automated process was determined and validated, and the measurement results and processing times were compared for the manual and automated procedures. Finally, real samples of stent incrustations and pig bile were analyzed using the automation system.
Kristensen, Anne F; Kristensen, Søren R; Falkmer, Ursula; Münster, Anna-Marie B; Pedersen, Shona
2018-05-01
The Calibrated Automated Thrombography (CAT) is an in vitro thrombin generation (TG) assay that holds promise as a valuable tool within clinical diagnostics. However, the technique has a considerable analytical variation, and we therefore investigated the analytical and between-subject variation of CAT systematically. Moreover, we assessed the application of an internal standard for normalization to diminish variation. 20 healthy volunteers donated one blood sample each, which was subsequently centrifuged, aliquoted and stored at -80 °C prior to analysis. The analytical variation was determined over eight runs, in which plasma from the same seven volunteers was processed in triplicate; for the between-subject variation, TG analysis was performed on plasma from all 20 volunteers. The trigger reagents used for the TG assays included both PPP reagent containing 5 pM tissue factor (TF) and PPPlow with 1 pM TF. Plasma drawn from a single donor was applied to all plates as an internal standard for each TG analysis and was subsequently used for normalization. The total analytical variation for TG analysis is 3-14% with PPPlow reagent and 9-13% with PPP reagent. This variation can be reduced only marginally by using an internal standard, and mainly for the ETP (endogenous thrombin potential). The between-subject variation is higher when using PPPlow than PPP, and this variation is considerably higher than the analytical variation. TG thus has a rather high inherent analytical variation, but one considerably lower than the between-subject variation when using PPPlow as reagent.
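A minimal sketch of the internal-standard normalization described above, assuming the internal-standard plasma has a known reference ETP; the values and function names are illustrative, not the study's protocol.

```python
import statistics

def normalize_etp(sample_etp, plate_is_etp, is_reference_etp):
    """Scale a sample's endogenous thrombin potential (ETP) by the ratio of the
    internal standard's reference value to its value measured on the same plate."""
    return sample_etp * (is_reference_etp / plate_is_etp)

def percent_cv(values):
    """Coefficient of variation (%) across replicate runs."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Illustrative: one sample measured on three plates, each with its own IS reading
raw = [(1450.0, 1600.0), (1380.0, 1500.0), (1520.0, 1650.0)]  # (sample ETP, plate IS ETP)
normalized = [normalize_etp(s, p, is_reference_etp=1550.0) for s, p in raw]
print(percent_cv([s for s, _ in raw]), percent_cv(normalized))
```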
Kouri, T T; Gant, V A; Fogazzi, G B; Hofmann, W; Hallander, H O; Guder, W G
2000-07-01
Improved standardized performance is needed because urinalysis continues to be one of the most frequently requested laboratory tests. Since 1997, the European Confederation of Laboratory Medicine (ECLM) has been supporting an interdisciplinary project aiming to produce European urinalysis guidelines. More than seventy clinical chemists, microbiologists and ward-based clinicians, as well as representatives of manufacturers are taking part. These guidelines aim to improve the quality and consistency of chemical urinalysis, particle counting and bacterial culture by suggesting optimal investigative processes that could be applied in Europe. The approach is based on medical needs for urinalysis. The importance of the pre-analytical stage for total quality is stressed by detailed illustrative advice for specimen collection. Attention is also given to emerging automated technology. For cost containment reasons, both optimum (ideal) procedures and minimum analytical approaches are suggested. Since urinalysis mostly lacks genuine reference methods (primary reference measurement procedures; Level 4), a novel classification of the methods is proposed: comparison measurement procedures (Level 3), quantitative routine procedures (Level 2), and ordinal scale examinations (Level 1). Stepwise strategies are suggested to save costs, applying different rules for general and specific patient populations. New analytical quality specifications have been created. After a consultation period, the final written text will be published in full as a separate document.
NASA Astrophysics Data System (ADS)
den Hollander, Richard J. M.; Bouma, Henri; Baan, Jan; Eendebak, Pieter T.; van Rest, Jeroen H. C.
2015-10-01
Person tracking across non-overlapping cameras and other types of video analytics benefit from spatial calibration information that allows an estimation of the distance between cameras and a relation between pixel coordinates and world coordinates within a camera. In a large environment with many cameras, or for frequent ad-hoc deployments of cameras, the cost of this calibration is high. This creates a barrier for the use of video analytics. Automating the calibration allows for a short configuration time, and the use of video analytics in a wider range of scenarios, including ad-hoc crisis situations and large-scale surveillance systems. We show an autocalibration method entirely based on pedestrian detections in surveillance video in multiple non-overlapping cameras. In this paper, we show the two main components of automatic calibration. The first is intra-camera geometry estimation, which leads to an estimate of the tilt angle, focal length and camera height and is important for the conversion from pixels to meters and vice versa. The second is inter-camera topology inference, which leads to an estimate of the distance between cameras and is important for spatio-temporal analysis of multi-camera tracking. This paper describes each of these methods and provides results on realistic video data.
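The pixel-to-meter conversion enabled by the intra-camera estimates can be illustrated with a simplified pinhole, flat-ground model; this sketch and its parameter values are assumptions for illustration, not the authors' algorithm.

```python
import math

def ground_distance(v_pixel, f_pixels, cy, tilt_rad, cam_height_m):
    """Horizontal distance to a ground point imaged at vertical pixel v_pixel,
    for a pinhole camera at height cam_height_m, tilted tilt_rad below horizontal."""
    alpha = math.atan((v_pixel - cy) / f_pixels)  # angle below the optical axis
    return cam_height_m / math.tan(tilt_rad + alpha)

# Illustrative values: 1080p camera, focal length 1000 px, 10 degree tilt, 4 m height
print(ground_distance(v_pixel=700, f_pixels=1000.0, cy=540.0,
                      tilt_rad=math.radians(10), cam_height_m=4.0))
```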
Barbesi, Donato; Vicente Vilas, Víctor; Millet, Sylvain; Sandow, Miguel; Colle, Jean-Yves; Aldave de Las Heras, Laura
2017-01-01
A LabVIEW®-based software for the control of the fully automated multi-sequential flow injection analysis Lab-on-Valve (MSFIA-LOV) platform AutoRAD performing radiochemical analysis is described. The analytical platform interfaces an Arduino®-based device triggering multiple detectors, providing a flexible and fit-for-purpose choice of detection systems. The different analytical devices are interfaced to the PC running the LabVIEW® VI software using USB and RS232 interfaces, both for sending commands and for receiving confirmation or error responses. The AutoRAD platform has been successfully applied to the chemical separation and determination of Sr, an important fission product pertinent to nuclear waste.
Bolton, Matthew L.; Bass, Ellen J.; Siminiceanu, Radu I.
2012-01-01
Breakdowns in complex systems often occur as a result of system elements interacting in unanticipated ways. In systems with human operators, human-automation interaction associated with both normative and erroneous human behavior can contribute to such failures. Model-driven design and analysis techniques provide engineers with formal methods tools and techniques capable of evaluating how human behavior can contribute to system failures. This paper presents a novel method for automatically generating task analytic models encompassing both normative and erroneous human behavior from normative task models. The generated erroneous behavior is capable of replicating Hollnagel's zero-order phenotypes of erroneous action for omissions, jumps, repetitions, and intrusions. Multiple phenotypical acts can occur in sequence, thus allowing for the generation of higher-order phenotypes. The task behavior model pattern capable of generating erroneous behavior can be integrated into a formal system model so that system safety properties can be formally verified with a model checker. This allows analysts to prove that a human-automation interactive system (as represented by the model) will or will not satisfy safety properties with both normative and generated erroneous human behavior. We present benchmarks related to the size of the state space and verification time of models to show how the erroneous human behavior generation process scales. We demonstrate the method with a case study: the operation of a radiation therapy machine. A potential problem resulting from a generated erroneous human action is discovered. A design intervention is presented which prevents this problem from occurring. We discuss how our method could be used to evaluate larger applications and recommend future paths of development. PMID:23105914
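Hollnagel's zero-order phenotypes can be sketched as simple transformations of a normative action sequence; the task and action names below are hypothetical, and this illustrative generation is not the paper's formal-model pattern.

```python
normative = ["select_dose", "confirm_dose", "position_patient", "fire_beam"]

def omission(seq, i):
    """Omit the i-th action."""
    return seq[:i] + seq[i + 1:]

def repetition(seq, i):
    """Repeat the i-th action."""
    return seq[:i + 1] + [seq[i]] + seq[i + 1:]

def jump(seq, i, j):
    """Jump from position i directly to position j, skipping the actions between."""
    return seq[:i + 1] + seq[j:]

def intrusion(seq, i, action):
    """Insert a foreign action after position i."""
    return seq[:i + 1] + [action] + seq[i + 1:]

# Higher-order phenotypes arise by composing zero-order acts in sequence
erroneous = intrusion(omission(normative, 1), 0, "override_interlock")
print(erroneous)
```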
Robandt, P V; Klette, K L; Sibum, M
2009-10-01
An automated solid-phase extraction coupled with liquid chromatography and tandem mass spectrometry (SPE-LC-MS-MS) method for the analysis of 11-nor-Δ9-tetrahydrocannabinol-9-carboxylic acid (THC-COOH) in human urine specimens was developed. The method was linear (R² = 0.9986) to 1000 ng/mL with no carryover evidenced at 2000 ng/mL. Limits of quantification and detection were found to be 2 ng/mL. Interrun precision was evaluated at the 15 ng/mL level over nine batches spanning 15 days (n = 45); the coefficient of variation (%CV) was found to be 5.5% over the course of the validation. Intrarun precision of a 15 ng/mL control (n = 5) ranged from 0.58% CV to 7.4% CV for the same set of analytical batches. Interference was tested using (±)-11-hydroxy-Δ9-tetrahydrocannabinol, cannabidiol, (-)-Δ8-tetrahydrocannabinol, and cannabinol. One hundred and nineteen specimens found to contain THC-COOH by a previously validated gas chromatography-mass spectrometry (GC-MS) procedure were compared to the SPE-LC-MS-MS method, with excellent agreement (R² = 0.9925) in the parallel comparison study. The automated SPE procedure eliminates the human factors of specimen handling, extraction, and derivatization, thereby reducing labor costs and rework resulting from human error or technique issues. Additionally, method runtime is greatly reduced (e.g., during parallel studies the SPE-LC-MS-MS instrument was often finished with analysis by the time the technician finished the offline SPE and derivatization procedure prior to the GC-MS analysis).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Granderson, Jessica; Touzani, Samir; Taylor, Cody
Trustworthy savings calculations are critical to convincing regulators of both the cost-effectiveness of energy efficiency program investments and their ability to defer supply-side capital investments. Today's methods for measurement and verification (M&V) of energy savings constitute a significant portion of the total costs of energy efficiency programs. They also require time-consuming data acquisition. A spectrum of savings calculation approaches is used, with some relying more heavily on measured data and others relying more heavily on estimated, modeled, or stipulated data. The rising availability of "smart" meters and devices that report near-real-time data, combined with new analytical approaches to quantifying savings, offers the potential to conduct M&V more quickly and at lower cost, with comparable or improved accuracy. Commercial energy management and information systems (EMIS) technologies are beginning to offer M&V capabilities, and program administrators want to understand how they might assist programs in quickly and accurately measuring energy savings. This paper presents the results of recent testing of the ability to use automation to streamline some parts of M&V. We detail metrics to assess the performance of these new M&V approaches, and a framework to compute the metrics. We also discuss the accuracy, cost, and time trade-offs between more traditional M&V and these emerging streamlined methods that use high-resolution energy data and automated computational intelligence. Finally, we discuss the potential evolution of M&V and early results of pilots currently underway to incorporate M&V automation into ratepayer-funded programs and professional implementation and evaluation practice.
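Typical baseline-model accuracy metrics in this space are the normalized mean bias error (NMBE) and CV(RMSE); the sketch below assumes these are among the metrics in the framework (the paper's specific metric set may differ), and the data are made up.

```python
import numpy as np

def nmbe(measured, predicted, n_params=1):
    """Normalized mean bias error (%), following ASHRAE Guideline 14 conventions."""
    resid = np.asarray(measured) - np.asarray(predicted)
    n = len(resid)
    return 100.0 * resid.sum() / ((n - n_params) * np.mean(measured))

def cv_rmse(measured, predicted, n_params=1):
    """Coefficient of variation of the RMSE (%)."""
    resid = np.asarray(measured) - np.asarray(predicted)
    n = len(resid)
    return 100.0 * np.sqrt((resid ** 2).sum() / (n - n_params)) / np.mean(measured)

measured = [120.0, 115.0, 130.0, 128.0]   # e.g., daily kWh from interval meters
predicted = [118.0, 117.0, 126.0, 131.0]  # baseline model output
print(nmbe(measured, predicted), cv_rmse(measured, predicted))
```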
Evaluation of Tamoxifen and metabolites by LC-MS/MS and HPLC Methods
Heath, D.D.; Flatt, S.W.; Wu, A.H.B.; Pruitt, M.A.; Rock, C.L.
2015-01-01
Epidemiological and laboratory evidence suggests that quantification of serum or plasma levels of tamoxifen and the metabolites of tamoxifen, 4-hydroxy-N-desmethyl-tamoxifen (endoxifen), Z-4-hydroxy-tamoxifen (4-HT), and N-desmethyl-tamoxifen (ND-tam), is a clinically useful tool in the assessment and monitoring of breast cancer status in patients taking adjuvant tamoxifen. A liquid chromatography-tandem mass spectrometry (LC-MS/MS) method was used to measure the blood levels of tamoxifen and its metabolites. This fully automated analytical method is specific, accurate and sensitive, and the LC-MS/MS technique has now become a widely accepted reference method. We analyzed a randomly selected batch of blood samples from participants enrolled in a breast cancer study to compare results from this reference method in 40 samples with those obtained from a recently developed high-performance liquid chromatography (HPLC) method with fluorescence detection. The mean (SD) concentrations were, for LC-MS/MS: endoxifen 12.6 (7.5) ng/mL, tamoxifen 105 (44) ng/mL, 4-HT 1.9 (1.0) ng/mL, ND-tam 181 (69) ng/mL; and for HPLC: endoxifen 13.1 (7.8) ng/mL, tamoxifen 108 (55) ng/mL, 4-HT 1.8 (0.8) ng/mL, ND-tam 184 (81) ng/mL. The methods did not show any significant differences. Our results confirm that the HPLC method offers an accurate and comparable alternative for the quantification of tamoxifen and tamoxifen metabolites. PMID:24693573
Liu, Hsu-Chuan; Den, Walter; Chan, Shu-Fei; Kin, Kuan Tzu
2008-04-25
The present study was aimed to develop a procedure modified from the conventional solid-phase extraction (SPE) method for the analysis of trace concentration of phthalate esters in industrial ultrapure water (UPW). The proposed procedure allows UPW sample to be drawn through a sampling tube containing hydrophobic sorbent (Tenax TA) to concentrate the aqueous phthalate esters. The solid trap was then demoisturized by two-stage gas drying before subjecting to thermal desorption and analysis by gas chromatography-mass spectrometry. This process removes the solvent extraction procedure necessary for the conventional SPE method, and permits automation of the analytical procedure for high-volume analyses. Several important parameters, including desorption temperature and duration, packing quantity and demoisturizing procedure, were optimized in this study based on the analytical sensitivity for a standard mixture containing five different phthalate esters. The method detection limits for the five phthalate esters were between 36 ng l(-1) and 95 ng l(-1) and recovery rates between 15% and 101%. Dioctyl phthalate (DOP) was not recovered adequately because the compound was both poorly adsorbed and desorbed on and off Tenax TA sorbents. Furthermore, analyses of material leaching from poly(vinyl chloride) (PVC) tubes as well as the actual water samples showed that di-n-butyl phthalate (DBP) and di(2-ethylhexyl) phthalate (DEHP) were the common contaminants detected from PVC contaminated UPW and the actual UPW, as well as in tap water. The reduction of DEHP in the production processes of actual UPW was clearly observed, however a DEHP concentration of 0.20 microg l(-1) at the point of use was still being quantified, suggesting that the contamination of phthalate esters could present a barrier to the future cleanliness requirement of UPW. The work demonstrated that the proposed modified SPE procedure provided an effective method for rapid analysis and contamination identification in UPW production lines.
Oxygen Measurements in Liposome Encapsulated Hemoglobin
NASA Astrophysics Data System (ADS)
Phiri, Joshua Benjamin
Liposome encapsulated hemoglobins (LEHs) are of current interest as blood substitutes. An analytical methodology for rapid non-invasive measurements of oxygen in artificial oxygen carriers is examined. High-resolution optical absorption spectra are calculated by means of a one-dimensional diffusion approximation. The encapsulated hemoglobin is prepared from fresh defibrinated bovine blood. Liposomes are prepared from hydrogenated soy phosphatidylcholine (HSPC), cholesterol and dicetylphosphate using a bath sonication method. An integrating sphere spectrophotometer is employed for diffuse optics measurements. Data are collected using an automated data acquisition system employing lock-in amplifiers. The concentrations of hemoglobin derivatives are evaluated from the corresponding extinction coefficients using the numerical technique of singular value decomposition, and the results are verified using Monte Carlo simulations. In situ measurements are required for the determination of hemoglobin derivatives because most encapsulation methods invariably lead to the formation of methemoglobin, a nonfunctional form of hemoglobin. The methods employed in this work lead to high-resolution absorption spectra of oxyhemoglobin and other derivatives in red blood cells and liposome encapsulated hemoglobin (LEH). The analysis using the singular value decomposition method offers a quantitative means of calculating the fractions of oxyhemoglobin and other hemoglobin derivatives in LEH samples. The analytical methods developed in this work will become even more useful when production of LEH as a blood substitute is scaled up to large volumes.
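The singular-value-decomposition step can be sketched as a linear unmixing problem: the measured absorption spectrum is modeled as a weighted sum of known extinction-coefficient spectra of the hemoglobin derivatives. The matrices below are randomly generated stand-ins, not real extinction data.

```python
import numpy as np

rng = np.random.default_rng(0)
wavelengths = 100

# Columns: extinction spectra of oxy-, deoxy-, and methemoglobin (stand-in values)
E = np.abs(rng.normal(size=(wavelengths, 3)))

true_fractions = np.array([0.70, 0.20, 0.10])
measured = E @ true_fractions + rng.normal(scale=1e-3, size=wavelengths)

# Least-squares solve (np.linalg.lstsq uses an SVD internally)
fractions, *_ = np.linalg.lstsq(E, measured, rcond=None)
print(fractions / fractions.sum())  # normalized derivative fractions
```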
Takeuchi, Masaki; Tsunoda, Hiromichi; Tanaka, Hideji; Shiramizu, Yoshimi
2011-01-01
This paper describes the performance of our automated monitor for acidic gases (CH₃COOH, HCOOH, HCl, HNO₂, SO₂, and HNO₃) utilizing a parallel-plate wet denuder (PPWD). The PPWD quantitatively collects gaseous contaminants at a high sample flow rate (∼8 dm³ min⁻¹) compared with the conventional methods used in a clean room. Rapid response to any variability in the sample concentration enables near-real-time monitoring. In the developed monitor, the analyte collected with the PPWD is pumped into one of two preconcentration columns for 15 min and determined by means of ion chromatography. While one preconcentration column is used for chromatographic separation, the other is used for loading the sample solution. The system allows continuous monitoring of the common acidic gases in an advanced semiconductor manufacturing clean room. © 2011 The Japan Society for Analytical Chemistry
Progress Towards an Open Data Ecosystem for Australian Geochemistry and Geochronology Data
NASA Astrophysics Data System (ADS)
McInnes, B.; Rawling, T.; Brown, W.; Liffers, M.; Wyborn, L. A.; Brown, A.; Cox, S. J. D.
2016-12-01
Technological improvements in laboratory automation and microanalytical methods are producing an unprecedented volume of high-value geochemical data for use by geoscientists in understanding geological and planetary processes. In contrast, the research infrastructure necessary to systematically manage, deliver and archive analytical data has not progressed much beyond the minimum effort necessary to produce a peer-reviewed publication. Anecdotal evidence indicates that the majority of publicly funded data is underreported, and what is published is relatively undiscoverable to experienced researchers, let alone the general public. Government-funded "open data" initiatives have a role to play in the development of networks of data management and delivery ecosystems and practices allowing access to publicly funded data. This paper reports on progress in Australia towards creation of an open data ecosystem involving multiple academic and government research institutions cooperating to create an open data architecture linking researchers, physical samples, sample metadata, laboratory metadata, analytical data and consumers.
Faraji, Hakim; Helalizadeh, Masoumeh; Kordi, Mohammad Reza
2018-01-01
A rapid, simple, and sensitive approach to the analysis of trihalomethanes (THMs) in swimming pool water samples has been developed. The main goal of this study was to overcome or improve upon the shortcomings of conventional dispersive liquid-liquid microextraction (DLLME) and to maximize the realization of green analytical chemistry principles. The method involves a simple vortex-assisted microextraction step in the absence of a dispersive solvent, followed by a salting-out effect that eliminates the centrifugation step. A bell-shaped device and a solidifiable solvent were used to simplify collection of the extraction solvent after phase separation. Optimization of the independent variables was performed using chemometric methods in three steps. The method was statistically validated against recognized guidance documents. The completion time for extraction was less than 8 min, and the limits of detection were in the range between 4 and 72 ng L⁻¹. Using this method, good linearity and precision were achieved. The results of THM determination in different real samples showed that in some cases the concentration of total THMs exceeded the threshold values set by accredited healthcare organizations. The method showed satisfactory analytical figures of merit. Graphical Abstract: A novel green microextraction technique for overcoming the challenges of conventional DLLME. The proposed procedure complies with the principles of green/sustainable analytical chemistry, comprising decreasing the sample size, easing automation of the process, reducing organic waste, diminishing energy consumption, replacing toxic reagents with safer ones, and enhancing operator safety.
Idaho National Laboratory
2017-12-09
Automated portable device that concentrates and packages a sample of suspected contaminated water for safe, efficient transport to a qualified analytical laboratory. This technology will help safeguard against pathogen contamination or chemical and biological contamination.
Automated measurement of respiratory gas exchange by an inert gas dilution technique
NASA Technical Reports Server (NTRS)
Sawin, C. F.; Rummel, J. A.; Michel, E. L.
1974-01-01
A respiratory gas analyzer (RGA) has been developed wherein a mass spectrometer is the sole transducer required for measurement of respiratory gas exchange. The mass spectrometer maintains all signals in absolute phase relationships, precluding the need to synchronize flow and gas composition as required in other systems. The RGA system was evaluated by comparison with the Douglas bag technique. The RGA system established the feasibility of the inert gas dilution method for measuring breath-by-breath respiratory gas exchange. This breath-by-breath analytical capability permits detailed study of transient respiratory responses to exercise.
Bridging the Gap between Human Judgment and Automated Reasoning in Predictive Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanfilippo, Antonio P.; Riensche, Roderick M.; Unwin, Stephen D.
2010-06-07
Events occur daily that impact the health, security and sustainable growth of our society. If we are to address the challenges that emerge from these events, anticipatory reasoning has to become an everyday activity. Strong advances have been made in using integrated modeling for analysis and decision making. However, a wider impact of predictive analytics is currently hindered by the lack of systematic methods for integrating predictive inferences from computer models with human judgment. In this paper, we present a predictive analytics approach that supports anticipatory analysis and decision-making through a concerted reasoning effort that interleaves human judgment and automated inferences. We describe a systematic methodology for integrating modeling algorithms within a serious gaming environment in which role-playing by human agents provides updates to model nodes and the ensuing model outcomes in turn influence the behavior of the human players. The approach ensures a strong functional partnership between human players and computer models while maintaining a high degree of independence and greatly facilitating the connection between model and game structures.
Automation of POST Cases via External Optimizer and "Artificial p2" Calculation
NASA Technical Reports Server (NTRS)
Dees, Patrick D.; Zwack, Mathew R.; Michelson, Diane K.
2017-01-01
During conceptual design, speed and accuracy are often at odds. Specifically in the realm of launch vehicles, optimizing the ascent trajectory requires a large pool of analytical power and expertise. Experienced analysts working on familiar vehicles can produce optimal trajectories in a short time frame; however, whenever either "experienced" or "familiar" is not applicable, the optimization process can become quite lengthy. In order to construct a vehicle-agnostic method, an established global optimization algorithm is needed. In this work the authors develop an "artificial" error term to map arbitrary control vectors to non-zero error, by which a global method can operate. Two global methods are compared alongside Design of Experiments and random sampling and are shown to produce results comparable to analysis done by a human expert.
ERIC Educational Resources Information Center
Teplovs, Chris
2015-01-01
This commentary reflects on the contributions to learning analytics and theory by a paper that describes how multiple theoretical frameworks were woven together to inform the creation of a new, automated discourse analysis tool. The commentary highlights the contributions of the original paper, provides some alternative approaches, and touches on…
Algorithms and software for U-Pb geochronology by LA-ICPMS
NASA Astrophysics Data System (ADS)
McLean, Noah M.; Bowring, James F.; Gehrels, George
2016-07-01
The past 15 years have produced numerous innovations in geochronology, including experimental methods, instrumentation, and software that are revolutionizing the acquisition and application of geochronological data. For example, exciting advances are being driven by Laser-Ablation ICP Mass Spectrometry (LA-ICPMS), which allows for rapid determination of U-Th-Pb ages with spatial resolution of tens of micrometers. This method has become the most commonly applied tool for dating zircons, constraining a host of geological problems. The LA-ICPMS community is now faced with archiving these data with associated analytical results and, more importantly, ensuring that data meet the highest standards for precision and accuracy and that interlaboratory biases are minimized. However, there is little consensus with regard to analytical strategies and data reduction protocols for LA-ICPMS geochronology. The result is systematic interlaboratory bias and both underestimation and overestimation of uncertainties on calculated dates that, in turn, decrease the value of data in repositories such as EarthChem, which archives data and analytical results from participating laboratories. We present free open-source software that implements new algorithms for evaluating and resolving many of these discrepancies. This solution is the result of a collaborative effort to extend the U-Pb_Redux software for the ID-TIMS community to the LA-ICPMS community. Now named ET_Redux, our new software automates the analytical and scientific workflows of data acquisition, statistical filtering, data analysis and interpretation, publication, community-based archiving, and the compilation and comparison of data from different laboratories to support collaborative science.
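The core date calculation such software automates is the standard U-Pb decay equation; this sketch uses the accepted 238U decay constant with an illustrative ratio, and is not ET_Redux code.

```python
import math

LAMBDA_238 = 1.55125e-10  # 238U decay constant, 1/yr (Jaffey et al., 1971)

def pb206_u238_date(ratio_206_238):
    """Date (Ma) from a radiogenic 206Pb*/238U ratio: t = ln(1 + ratio) / lambda."""
    return math.log(1.0 + ratio_206_238) / LAMBDA_238 / 1e6

print(pb206_u238_date(0.0156))  # roughly a ~100 Ma zircon
```

Uncertainty propagation, common-Pb and downhole-fractionation corrections, and interlaboratory standard calibration are the layers the paper's algorithms add on top of this relation.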
Applying Nyquist's method for stability determination to solar wind observations
NASA Astrophysics Data System (ADS)
Klein, Kristopher G.; Kasper, Justin C.; Korreck, K. E.; Stevens, Michael L.
2017-10-01
The role instabilities play in governing the evolution of solar and astrophysical plasmas is a matter of considerable scientific interest. The large number of sources of free energy accessible to such nearly collisionless plasmas makes general modeling of unstable behavior, accounting for the temperatures, densities, anisotropies, and relative drifts of a large number of populations, analytically difficult. We therefore seek a general method of stability determination that may be automated for future analysis of solar wind observations. This work describes an efficient application of the Nyquist instability method to the Vlasov dispersion relation appropriate for hot, collisionless, magnetized plasmas, including the solar wind. The algorithm recovers the familiar proton temperature anisotropy instabilities, as well as instabilities that had been previously identified using fits extracted from in situ observations in Gary et al. (2016). Future proposed applications of this method are discussed.
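By the argument principle, the number of unstable roots enclosed by a contour equals the winding number of the dispersion function's image about the origin. The generic sketch below uses a toy polynomial dispersion function rather than the hot-plasma Vlasov relation the paper treats.

```python
import numpy as np

def winding_number(D, contour):
    """Count zeros of D inside a closed contour via the accumulated phase of D."""
    vals = D(contour)
    phase = np.unwrap(np.angle(vals))
    return int(round((phase[-1] - phase[0]) / (2 * np.pi)))

# Toy dispersion function with zeros at +/- i; only +i lies in the upper half plane
D = lambda w: w**2 + 1

# Contour: real axis from -R to R, closed by a large upper semicircle
R = 50.0
real_axis = np.linspace(-R, R, 4000)
semicircle = R * np.exp(1j * np.linspace(0, np.pi, 4000))
contour = np.concatenate([real_axis, semicircle])

print(winding_number(D, contour))  # -> 1 root in the upper half plane (unstable)
```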
Automated workflows for modelling chemical fate, kinetics and toxicity.
Sala Benito, J V; Paini, Alicia; Richarz, Andrea-Nicole; Meinl, Thorsten; Berthold, Michael R; Cronin, Mark T D; Worth, Andrew P
2017-12-01
Automation is universal in today's society, from operating equipment such as machinery, in factory processes, to self-parking automobile systems. While these examples show the efficiency and effectiveness of automated mechanical processes, automated procedures that support the chemical risk assessment process are still in their infancy. Future human safety assessments will rely increasingly on the use of automated models, such as physiologically based kinetic (PBK) and dynamic models and the virtual cell based assay (VCBA). These biologically based models will be coupled with chemistry-based prediction models that also automate the generation of key input parameters such as physicochemical properties. The development of automated software tools is an important step in harmonising and expediting the chemical safety assessment process. In this study, we illustrate how the KNIME Analytics Platform can be used to provide a user-friendly graphical interface for these biokinetic models, such as PBK models and the VCBA, which simulate the fate of chemicals within the body and in in vitro test systems, respectively. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Managing laboratory automation in a changing pharmaceutical industry
Rutherford, Michael L.
1995-01-01
The health care reform movement in the USA and increased requirements by regulatory agencies continue to have a major impact on the pharmaceutical industry and the laboratory. Laboratory management is expected to improve efficiency by providing more analytical results at a lower cost, increasing customer service and reducing cycle time, while ensuring accurate results and more effective use of staff. To achieve these expectations, many laboratories are using robotics and automated workstations. Establishing automated systems presents many challenges for laboratory management, including project and hardware selection, budget justification, implementation, validation, training, and support. To address these management challenges, the rationale for project selection and implementation, the obstacles encountered, project outcomes, and learning points for several automated systems recently implemented in the Quality Control Laboratories at Eli Lilly are presented. PMID:18925014
Bhatta, R S; Kumar, D; Chhonker, Y S; Jain, G K
2011-09-01
A sensitive and selective liquid chromatography/tandem mass spectrometric method was developed for simultaneous determination of the E- and Z-guggulsterone isomers (an antihyperlipidemic drug) in rabbit plasma. Both isomers were resolved on a Symmetry-Shield C18 (5 µm, 4.6 × 150 mm) column using gradient elution with a mobile phase of methanol, 0.5% v/v formic acid and acetonitrile. With dexamethasone as internal standard, plasma samples were extracted by an automated solid-phase extraction method using C18 cartridges. Detection was performed by electrospray ionization in multiple reaction monitoring (MRM) in positive mode. The calibration curve was linear over the concentration range of 1.56-200 ng/mL (r² ≥ 0.998) for both analytes. The intra-day and inter-day accuracy and precision were within -0.96 to 4.12 (%bias) and 2.73 to 8.00 (%RSD), respectively. The analytes were stable after three freeze-thaw cycles. The method was successfully applied to study the stereospecific pharmacokinetics of E- and Z-guggulsterone in New Zealand rabbit. Copyright © 2011 John Wiley & Sons, Ltd.
Fully 3D-Printed Preconcentrator for Selective Extraction of Trace Elements in Seawater.
Su, Cheng-Kuan; Peng, Pei-Jin; Sun, Yuh-Chang
2015-07-07
In this study, we used a stereolithographic 3D printing technique and polyacrylate polymers to manufacture a solid phase extraction preconcentrator for the selective extraction of trace elements and the removal of unwanted salt matrices, enabling accurate and rapid analyses of trace elements in seawater samples when combined with a quadrupole-based inductively coupled plasma mass spectrometer. To maximize the extraction efficiency, we evaluated the effect of filling the extraction channel with ordered cuboids to improve liquid mixing. Upon automation of the system and optimization of the method, the device allowed highly sensitive and interference-free determination of Mn, Ni, Zn, Cu, Cd, and Pb, with detection limits comparable with those of most conventional methods. The system's analytical reliability was further confirmed through analyses of reference materials and spike analyses of real seawater samples. This study suggests that 3D printing can be a powerful tool for building multilayer fluidic manipulation devices, simplifying the construction of complex experimental components, and facilitating the operation of sophisticated analytical procedures for most sample pretreatment applications.
Lafrenière, Nelson M; Mudrik, Jared M; Ng, Alphonsus H C; Seale, Brendon; Spooner, Neil; Wheeler, Aaron R
2015-04-07
There is great interest in the development of integrated tools allowing for miniaturized sample processing, including solid phase extraction (SPE). We introduce a new format for microfluidic SPE relying on C18-functionalized magnetic beads that can be manipulated in droplets in a digital microfluidic platform. This format provides the opportunity to tune the amount (and potentially the type) of stationary phase on the fly, and allows the removal of beads after the extraction (to enable other operations in the same device space), maintaining device reconfigurability. Using the new method, we employed a design of experiments (DOE) operation to enable automated on-chip optimization of elution solvent composition for reversed-phase SPE of a model system. Further, conditions were selected to enable on-chip fractionation of multiple analytes. Finally, the method was demonstrated to be useful for online cleanup of extracts from dried blood spot (DBS) samples. We anticipate this combination of features will prove useful for separating a wide range of analytes, from small molecules to peptides, from complex matrices.
Surface acoustic wave nebulization facilitating lipid mass spectrometric analysis.
Yoon, Sung Hwan; Huang, Yue; Edgar, J Scott; Ting, Ying S; Heron, Scott R; Kao, Yuchieh; Li, Yanyan; Masselon, Christophe D; Ernst, Robert K; Goodlett, David R
2012-08-07
Surface acoustic wave nebulization (SAWN) is a novel method to transfer nonvolatile analytes directly from the aqueous phase to the gas phase for mass spectrometric analysis. The lower ion energetics of SAWN and its planar nature make it appealing for analytically challenging lipid samples. The challenge is a result of their amphipathic character, their lability, and their tendency to form aggregates, which readily precipitate and clog the capillaries used for electrospray ionization (ESI). Here, we report the use of SAWN to characterize the complex glycolipid lipid A, which serves as the membrane anchor component of lipopolysaccharide (LPS) and has a pronounced tendency to clog nano-ESI capillaries. We also show that, unlike ESI, SAWN is capable of ionizing labile phospholipids without fragmentation. Lastly, we compare the ease of use of SAWN to the more conventional infusion-based ESI methods and demonstrate the ability to generate higher-order tandem mass spectral data of lipid A for automated structure assignment using our previously reported hierarchical tandem mass spectrometry (HiTMS) algorithm. The ease of generating SAWN-MSn data combined with HiTMS interpretation offers the potential for high-throughput lipid A structure analysis.
Presidential Green Chemistry Challenge: 2009 Greener Reaction Conditions Award
Presidential Green Chemistry Challenge 2009 award winner, CEM Corporation, developed a fast, automated analytical process using less toxic reagents and less energy to distinguish protein from the food adulterant, melamine.
Loss Factor Estimation Using the Impulse Response Decay Method on a Stiffened Structure
NASA Technical Reports Server (NTRS)
Cabell, Randolph; Schiller, Noah; Allen, Albert; Moeller, Mark
2009-01-01
High-frequency vibroacoustic modeling is typically performed using energy-based techniques such as Statistical Energy Analysis (SEA). Energy models require an estimate of the internal damping loss factor. Unfortunately, the loss factor is difficult to estimate analytically, and experimental methods such as the power injection method can require extensive measurements over the structure of interest. This paper discusses the implications of estimating damping loss factors using the impulse response decay method (IRDM) from a limited set of response measurements. An automated procedure for implementing IRDM is described and then evaluated using data from a finite element model of a stiffened, curved panel. Estimated loss factors are compared with loss factors computed using a power injection method and a manual curve fit. The paper discusses the sensitivity of the IRDM loss factor estimates to damping of connected subsystems and the number and location of points in the measurement ensemble.
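A minimal sketch of IRDM under common assumptions: backward-integrate the squared impulse response (Schroeder decay), fit the early decay slope in dB/s, and convert the decay rate DR to a loss factor via η = DR/(27.3 f). The synthetic single-band signal here is illustrative, not the paper's panel data.

```python
import numpy as np

fs = 8192.0       # sample rate, Hz
f_band = 1000.0   # band center frequency, Hz
eta_true = 0.02   # loss factor used to synthesize a test decay

t = np.arange(0, 1.0, 1.0 / fs)
h = np.exp(-eta_true * np.pi * f_band * t) * np.sin(2 * np.pi * f_band * t)

# Schroeder backward integration of the squared impulse response, in dB
edc = np.cumsum(h[::-1] ** 2)[::-1]
edc_db = 10 * np.log10(edc / edc[0])

# Linear fit over the early decay (0 to -20 dB) gives the decay rate DR (dB/s)
mask = edc_db > -20.0
slope, _ = np.polyfit(t[mask], edc_db[mask], 1)
eta_est = -slope / (27.3 * f_band)
print(eta_est)  # should approximate eta_true
```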
Wang, Yang; Wang, Lu; Tian, Tian; Hu, Xiaoya; Yang, Chun; Xu, Qin
2012-05-21
In this study, an automated sequential injection lab-on-valve (SI-LOV) system was designed for the on-line matrix removal and preconcentration of quercetin. Octadecyl-functionalized magnetic silica nanoparticles were prepared and packed into the microcolumn of the LOV as adsorbents. After being adsorbed through hydrophobic interaction, the analyte was eluted and subsequently introduced into the electrochemical flow cell for voltammetric quantification. The main parameters affecting the performance of the solid-phase extraction, such as sample pH and flow rate, eluent solution and volume, and accumulation potential and time, were investigated in detail. Under the optimum experimental conditions, a linear calibration curve was obtained in the range of 1.0 × 10⁻⁸ to 1 × 10⁻⁵ mol L⁻¹ with R² = 0.9979. The limit of detection (LOD) and limit of quantitation (LOQ) were 1.3 × 10⁻⁹ and 4.3 × 10⁻⁹ mol L⁻¹, respectively. The relative standard deviation (RSD) for the determination of 1.0 × 10⁻⁶ mol L⁻¹ quercetin was found to be 2.9% (n = 11), with a sampling frequency of 40 h⁻¹. The applicability and reliability of the automated method were demonstrated by the determination of quercetin in human urine and red wine samples through recovery experiments, and the results were in good agreement with those obtained by an HPLC method.
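Figures of merit like those above follow from a linear calibration; this generic sketch (illustrative data, not the paper's) computes the slope and LOD/LOQ using one common convention (3σ/slope and 10σ/slope; 3σ of the blank is another).

```python
import numpy as np

# Illustrative calibration: quercetin concentration (mol/L) vs. peak current (a.u.)
conc = np.array([1e-8, 5e-8, 1e-7, 5e-7, 1e-6, 5e-6, 1e-5])
signal = np.array([0.021, 0.105, 0.198, 1.02, 2.01, 9.8, 20.3])

slope, intercept = np.polyfit(conc, signal, 1)
residual_sd = np.std(signal - (slope * conc + intercept), ddof=2)

lod = 3.0 * residual_sd / slope
loq = 10.0 * residual_sd / slope
print(f"slope={slope:.3e}, LOD={lod:.2e} mol/L, LOQ={loq:.2e} mol/L")
```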
A novel method for automated grid generation of ice shapes for local-flow analysis
NASA Astrophysics Data System (ADS)
Ogretim, Egemen; Huebsch, Wade W.
2004-02-01
Modelling a complex geometry, such as ice roughness, plays a key role in computational flow analysis over rough surfaces. This paper presents two enhancements in modelling roughness geometry for local flow analysis over an aerodynamic surface. The first enhancement is the use of the leading-edge region of an airfoil as a perturbation to a parabola surface. The reasons for using a parabola as the base geometry are that it resembles the airfoil leading edge in the vicinity of its apex and that it allows the use of a lower apparent Reynolds number. The second enhancement makes use of Fourier analysis for modelling complex ice roughness on the leading edge of airfoils. This modelling approach provides an analytical expression that describes the roughness geometry and the corresponding derivatives. The factors affecting the performance of the Fourier analysis were also investigated; it was shown that the number of sine-cosine terms and the number of control points are of importance. Finally, these enhancements are incorporated into an automated grid generation method over the airfoil ice accretion surface. The validations for both enhancements demonstrate that they can improve the current capability of grid generation and computational flow field analysis around airfoils with ice roughness.
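A sketch of the Fourier-series surface model: fit a truncated series to sampled roughness heights by least squares, which yields an analytical expression whose derivatives are exact. The profile data and term counts here are synthetic assumptions, not the paper's cases.

```python
import numpy as np

def fourier_design(x, L, n_terms):
    """Design matrix [1, cos(2*pi*k*x/L), sin(2*pi*k*x/L)] for k = 1..n_terms."""
    cols = [np.ones_like(x)]
    for k in range(1, n_terms + 1):
        cols.append(np.cos(2 * np.pi * k * x / L))
        cols.append(np.sin(2 * np.pi * k * x / L))
    return np.column_stack(cols)

# Synthetic roughness profile sampled at control points along the surface
L = 1.0
x = np.linspace(0, L, 201)
y = 0.02 * np.sin(2 * np.pi * 3 * x) + 0.005 * np.random.default_rng(1).normal(size=x.size)

coeffs, *_ = np.linalg.lstsq(fourier_design(x, L, n_terms=10), y, rcond=None)

# The fitted series is differentiable term by term, giving smooth surface slopes
dydx = sum((-coeffs[2*k-1] * np.sin(2*np.pi*k*x/L) + coeffs[2*k] * np.cos(2*np.pi*k*x/L))
           * (2*np.pi*k/L) for k in range(1, 11))
print(float(np.max(np.abs(dydx))))  # peak surface slope of the fitted model
```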
Oxygen isotope corrections for online δ34S analysis
Fry, B.; Silva, S.R.; Kendall, C.; Anderson, R.K.
2002-01-01
Elemental analyzers have been successfully coupled to stable-isotope-ratio mass spectrometers for online measurements of the δ34S isotopic composition of plants, animals and soils. We found that the online technology for automated δ34S isotopic determinations did not yield reproducible oxygen isotopic compositions in the SO2 produced, and as a result calculated δ34S values were often 1–3‰ too high versus their correct values, particularly for plant and animal samples with high C/S ratio. Here we provide empirical and analytical methods for correcting the S isotope values for oxygen isotope variations, and further detail a new SO2-SiO2 buffering method that minimizes detrimental oxygen isotope variations in SO2.
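The physical origin of the bias can be stated compactly: to first order (neglecting 17O and doubly substituted isotopologues), the measured m/z 66/64 ion-current ratio of SO2 mixes the sulfur and oxygen isotope ratios. The relation below is a simplified sketch of that mixing, not the authors' exact correction:

$$\frac{R_{66}}{R_{64}} \;\approx\; {}^{34}R + 2\,{}^{18}R, \qquad {}^{34}R = \frac{{}^{34}\mathrm{S}}{{}^{32}\mathrm{S}}, \quad {}^{18}R = \frac{{}^{18}\mathrm{O}}{{}^{16}\mathrm{O}},$$

so an uncorrected δ34S computed from the 66/64 ratio rises or falls with the oxygen isotopic composition of the SO2, and constraining δ18O allows the sulfur term to be isolated.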
An automated microreactor for semi-continuous biosensor measurements.
Buffi, Nina; Beggah, Siham; Truffer, Frederic; Geiser, Martial; van Lintel, Harald; Renaud, Philippe; van der Meer, Jan Roelof
2016-04-21
Living bacteria or yeast cells are frequently used as bioreporters for the detection of specific chemical analytes or conditions of sample toxicity. In particular, bacteria or yeast equipped with synthetic gene circuitry that allows the production of a reliable non-cognate signal (e.g., fluorescent protein or bioluminescence) in response to a defined target make robust and flexible analytical platforms. We report here how bacterial cells expressing a fluorescence reporter ("bactosensors"), which are mostly used for batch sample analysis, can be deployed for automated semi-continuous target analysis in a single concise biochip. Escherichia coli-based bactosensor cells were continuously grown in a 13 or 50 nanoliter-volume reactor on a two-layered polydimethylsiloxane-on-glass microfluidic chip. Physiologically active cells were directed from the nl-reactor to a dedicated sample exposure area, where they were concentrated and reacted for 40 minutes with the target chemical, reporting by localized emission of the fluorescent reporter signal. We demonstrate the functioning of the bactosensor chip by the automated detection of 50 μg arsenite-As l⁻¹ in water on consecutive days and after one week of constant operation. The best induction of the bactosensors, 6- to 9-fold at 50 μg l⁻¹, was found at an apparent dilution rate of 0.12 h⁻¹ in the 50 nl microreactor. The bactosensor-chip principle could be widely applicable to the construction of automated monitoring devices for a variety of targets in different environments.
Thielmann, Yvonne; Koepke, Juergen; Michel, Hartmut
2012-06-01
Structure determination of membrane proteins and membrane protein complexes is still a very challenging field. To facilitate work on membrane proteins, the Core Centre follows a strategy that comprises four labs for protein analytics and crystal handling, covering mass spectrometry, calorimetry, crystallization and X-ray diffraction. This general workflow is presented, and a capacity of 20% of the operating time of all systems is provided to the European structural biology community within the ESFRI Instruct program. A description of the crystallization service offered at the Core Centre is given, with detailed information on screening strategy, the screens used, and changes made to adapt high throughput to membrane proteins. Our aim is to constantly develop the Core Centre towards the use of more efficient methods. This strategy might also include the ability to automate all steps from crystallization trials to crystal screening; here we look ahead to how this aim might be realized at the Core Centre.
Automated Performance Prediction of Message-Passing Parallel Programs
NASA Technical Reports Server (NTRS)
Block, Robert J.; Sarukkai, Sekhar; Mehra, Pankaj; Woodrow, Thomas S. (Technical Monitor)
1995-01-01
The increasing use of massively parallel supercomputers to solve large-scale scientific problems has generated a need for tools that can predict scalability trends of applications written for these machines. Much work has been done to create simple models that represent important characteristics of parallel programs, such as latency, network contention, and communication volume. But many of these methods still require substantial manual effort to represent an application in the model's format. The NIK toolkit described in this paper is the result of an ongoing effort to automate the formation of analytic expressions of program execution time, with a minimum of programmer assistance. In this paper we demonstrate the feasibility of our approach by extending previous work to detect and model communication patterns automatically, with and without overlapped computations. The predictions derived from these models agree, within reasonable limits, with execution times of programs measured on the Intel iPSC/860 and Paragon. Further, we demonstrate the use of NIK in selecting optimal computational grain size and studying various scalability metrics.
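The analytic expressions such a toolkit derives typically combine a computation term with per-message and per-byte communication terms; the generic latency-bandwidth sketch below is an assumption-level illustration, not the NIK model itself.

```python
def predicted_time(work_flops, flop_rate, n_messages, latency_s, bytes_sent, bandwidth_Bps, procs):
    """Simple analytic execution-time model: perfectly divided computation plus
    per-message latency and volume-proportional transfer costs."""
    compute = work_flops / (procs * flop_rate)
    communicate = n_messages * latency_s + bytes_sent / bandwidth_Bps
    return compute + communicate

# Scalability trend: sweep processor counts for a fixed problem size
for p in (8, 16, 32, 64):
    t = predicted_time(1e12, 5e9, n_messages=200 * p, latency_s=5e-5,
                       bytes_sent=1e9, bandwidth_Bps=1e9, procs=p)
    print(p, round(t, 3))
```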
Choi, Jonghyeon; Park, Yongjung; Kim, Jeong-Ho; Kim, Hyon-Suk
2011-12-01
We evaluated two new autoanalyzers, μTAS and Lumipulse, for the des-γ-carboxyprothrombin (DCP) assay. Analytical performance was evaluated, and the upper reference limit of the 97.5th percentile for DCP was re-established using sera from 140 healthy individuals. DCP levels were determined by the two autoanalyzers and EIA in a total of 239 sera from HCC patients (n=120) and those without HCC (n=119). Total imprecision of the two automated assays was <5% CV. Analytical measurement ranges (AMRs) were verified to be linear. The new reference limits were 29.5 mAU/mL for μTAS and 35.0 mAU/mL for Lumipulse. There were proportional and constant biases between the results from the autoanalyzers and those from EIA. The two newly developed DCP assays showed high analytical performance, but re-establishment of reference limits would be necessary. The new analyzers could be useful for clinical laboratories because of their convenience of operation and wide AMRs. Copyright © 2011 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Kelly, Nick; Thompson, Kate; Yeoman, Pippa
2015-01-01
This paper describes theory-led design as a way of developing novel tools for learning analytics (LA). It focuses upon the domain of automated discourse analysis (ADA) of group learning activities to help an instructor to orchestrate online groups in real-time. The paper outlines the literature on the development of LA tools within the domain of…
Cost and schedule analytical techniques development
NASA Technical Reports Server (NTRS)
1994-01-01
This contract provided technical services and products to the Marshall Space Flight Center's Engineering Cost Office (PP03) and the Program Plans and Requirements Office (PP02) for the period of 3 Aug. 1991 - 30 Nov. 1994. Accomplishments summarized cover the REDSTAR data base, NASCOM hard copy data base, NASCOM automated data base, NASCOM cost model, complexity generators, program planning, schedules, NASA computer connectivity, other analytical techniques, and special project support.
Systematic comparison of static and dynamic headspace sampling techniques for gas chromatography.
Kremser, Andreas; Jochmann, Maik A; Schmidt, Torsten C
2016-09-01
Six automated, headspace-based sample preparation techniques were used to extract volatile analytes from water, with the goal of establishing a systematic comparison between commonly available instrumental alternatives. To that end, these six techniques were used in conjunction with the same gas chromatography instrument for analysis of a common set of volatile organic compound (VOC) analytes. The methods were divided into three classes: static sampling (by syringe or loop), static enrichment (SPME and PAL SPME Arrow), and dynamic enrichment (ITEX and trap sampling). For PAL SPME Arrow, different sorption phase materials were also included in the evaluation. To enable an effective comparison, method detection limits (MDLs), relative standard deviations (RSDs), and extraction yields were determined and are discussed for all techniques. While static sampling techniques exhibited extraction yields (approx. 10-20%) sufficient for reliable use down to approx. 100 ng L⁻¹, enrichment techniques displayed extraction yields of up to 80%, resulting in MDLs down to the picogram per liter range. RSDs for all techniques were below 27%. The choice among the different instrumental modes of operation (the aforementioned classes) was the most influential parameter in terms of extraction yields and MDLs. Individual methods within each class showed smaller deviations, and the smallest influences were observed when evaluating different sorption phase materials for the individual enrichment techniques. The option of selecting specialized sorption phase materials may, however, be more important when analyzing analytes with different properties, such as high polarity or the capability of specific molecular interactions. Graphical Abstract: PAL SPME Arrow during the extraction of volatile analytes from the headspace of an aqueous sample.
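MDL figures like those cited are conventionally derived from replicate low-level spikes; a sketch of the standard EPA-style computation follows, with made-up replicate values.

```python
import statistics
from scipy import stats

# Seven replicate analyses of a low-level spiked sample, ng/L (illustrative values)
replicates = [52.0, 48.5, 50.2, 47.8, 53.1, 49.6, 51.4]

s = statistics.stdev(replicates)
t_crit = stats.t.ppf(0.99, df=len(replicates) - 1)  # one-sided 99% Student's t
mdl = t_crit * s
print(f"s = {s:.2f} ng/L, t = {t_crit:.3f}, MDL = {mdl:.1f} ng/L")
```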
Summers, Richard L; Pipke, Matt; Wegerich, Stephan; Conkright, Gary; Isom, Kristen C
2014-01-01
Background. Monitoring cardiovascular hemodynamics in the modern clinical setting is a major challenge. Increasing amounts of physiologic data must be analyzed and interpreted in the context of the individual patient's pathology and inherent biologic variability. Certain data-driven analytical methods are currently being explored for smart monitoring of data streams from patients as a first-tier automated detection system for clinical deterioration. As a prelude to human clinical trials, an empirical multivariate machine learning method called Similarity-Based Modeling (SBM) was tested in an in silico experiment using data generated with the aid of a detailed computer simulator of human physiology (Quantitative Circulatory Physiology, or QCP), which contains complex control systems with realistic integrated feedback loops. Methods. SBM is a kernel-based, multivariate machine learning method that uses monitored clinical information to generate an empirical model of a patient's physiologic state. This platform allows for the use of predictive analytic techniques to identify early changes in a patient's condition that are indicative of a state of deterioration or instability. The integrity of the technique was tested through an in silico experiment using QCP in which computer simulations of a slowly evolving cardiac tamponade produced a progressive state of cardiovascular decompensation. Simulator outputs for the variables under consideration were generated at a 2-min data rate (0.083 Hz), with the tamponade introduced at a point 420 minutes into the simulation sequence. The ability of the SBM predictive analytics methodology to identify clinical deterioration was compared to the thresholds used by conventional monitoring methods. Results. The SBM modeling method was found to closely track the normal physiologic variation as simulated by QCP. With the slow development of the tamponade, the SBM model and the simulated biosignals are seen to disagree in the early stages of physiologic deterioration, while the variables are still within normal ranges. Thus, the SBM system was found to identify pathophysiologic conditions in a timeframe in which they would not have been detected in a usual clinical monitoring scenario. Conclusion. In this study, the functionality of a multivariate machine learning predictive methodology that incorporates commonly monitored clinical information was tested using a computer model of human physiology. SBM and predictive analytics were able to differentiate a state of decompensation while the monitored variables were still within normal clinical ranges. This finding suggests that SBM could provide early identification of clinical deterioration using predictive analytic techniques. Keywords: predictive analytics, hemodynamics, monitoring.
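Similarity-based modeling can be sketched as a kernel-weighted reconstruction against a memory matrix of known-normal observations, with the residual serving as a deterioration index; the kernel choice and data below are illustrative assumptions, not the vendor implementation.

```python
import numpy as np

rng = np.random.default_rng(2)

# Memory matrix: rows are multivariate observations captured during normal operation
# (columns might be heart rate, mean arterial pressure, SpO2, ...)
D = rng.normal(loc=[80.0, 90.0, 97.0], scale=[5.0, 4.0, 1.0], size=(200, 3))

def sbm_estimate(x, D, bandwidth=5.0):
    """Kernel-weighted estimate of the 'normal' state closest to observation x."""
    w = np.exp(-np.sum((D - x) ** 2, axis=1) / (2 * bandwidth ** 2))
    return (w[:, None] * D).sum(axis=0) / w.sum()

def residual(x, D):
    """Deterioration index: distance between an observation and its SBM estimate."""
    return float(np.linalg.norm(x - sbm_estimate(x, D)))

print(residual(np.array([81.0, 89.0, 97.2]), D))   # near-normal: small residual
print(residual(np.array([110.0, 60.0, 93.0]), D))  # drifting state: larger residual
```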
Shaaban, Heba; Górecki, Tadeusz
2015-01-01
Green analytical chemistry is an aspect of green chemistry that was introduced in the late nineties. The main objectives of green analytical chemistry are to develop new analytical technologies or to modify old methods to incorporate procedures that use less hazardous chemicals. There are several approaches to achieving this goal, such as using environmentally benign solvents and reagents, reducing chromatographic separation times, and miniaturizing analytical devices. Traditional methods used for the analysis of pharmaceutically active compounds require large volumes of organic solvents and generate large amounts of waste; most of these solvents are volatile and harmful to the environment. With growing environmental awareness, the development of green technologies has been receiving increasing attention, aiming at eliminating or reducing the amount of organic solvents consumed every day worldwide without loss in chromatographic performance. This review provides the state of the art of green analytical methodologies for environmental analysis of pharmaceutically active compounds in the aquatic environment, with special emphasis on strategies for greening liquid chromatography (LC). The current trends of fast LC applied to environmental analysis, including elevated mobile phase temperature, as well as different column technologies such as monolithic columns, fully porous sub-2 μm and superficially porous particles, are presented. In addition, green aspects of gas chromatography (GC) and supercritical fluid chromatography (SFC) are discussed. We pay special attention to new green approaches such as automation, miniaturization, direct analysis and the possibility of locating the chromatograph on-line or at-line as a step forward in reducing the environmental impact of chromatographic analyses. Copyright © 2014 Elsevier B.V. All rights reserved.
Manoni, Fabio; Gessoni, Gianluca; Fogazzi, Giovanni Battista; Alessio, Maria Grazia; Caleffi, Alberta; Gambaro, Giovanni; Epifani, Maria Grazia; Pieretti, Barbara; Perego, Angelo; Ottomano, Cosimo; Saccani, Graziella; Valverde, Sara; Secchiero, Sandra
2016-01-01
With these guidelines the Intersociety Urinalysis Group (GIAU) aims to stimulate the following: improvement and standardization of the analytical approach to the physical, chemical and morphological examination of urine (ECMU); improvement of the chemical analysis of urine, with particular regard to reconsidering the diagnostic significance of the parameters traditionally evaluated in dipstick analysis, together with an increasing awareness of the limits of sensitivity and specificity of this analytical method; increased awareness of the importance of professional skills in the field of urinary morphology and of the relationship with clinicians; implementation of a policy of evaluation of analytical quality by using, in addition to traditional internal and external controls, a program for the evaluation of morphological competence; and stimulation of the diagnostics industry to focus research and development efforts, in both methodology and instrumentation, on the needs of clinical diagnosis. The hope is to revalue the enormous diagnostic potential of ECMU, implementing a urinalysis tailored to the personalized diagnostic needs of each patient, and to emphasize the value added to ECMU by automated analyzers for the study of the morphology of the corpuscular fraction of urine.
Orazbayeva, Dina; Kenessov, Bulat; Psillakis, Elefteria; Nassyrova, Dayana; Bektassov, Marat
2018-06-22
A new, sensitive and simple method based on vacuum-assisted headspace solid-phase microextraction (Vac-HSSPME) followed by gas chromatography-mass spectrometry (GC-MS) is proposed for the quantification of transformation products of the rocket fuel unsymmetrical dimethylhydrazine (UDMH) in water samples. The target transformation products were: pyrazine, 1-methyl-1H-pyrazole, N-nitrosodimethylamine, N,N-dimethylformamide, 1-methyl-1H-1,2,4-triazole, 1-methylimidazole and 1H-pyrazole. For these analytes, and within shorter sampling times, Vac-HSSPME yielded detection limits (0.5-100 ng L⁻¹) 3-10 times lower than those reported for regular HSSPME. Vac-HSSPME sampling for 30 min at 50 °C yielded the best combination of analyte responses and standard deviations (<15%). 1-Formyl-2,2-dimethylhydrazine and formamide were discarded because of poor precision and accuracy when using Vac-HSSPME. The recoveries for the rest of the analytes ranged between 80 and 119%. The modified Mininert valve and Thermogreen septum could be used for automated extraction, as they ensured stable analyte signals even after long waiting times (>24 h). Finally, multiple Vac-HSSPME proved to be an efficient tool for controlling the matrix effect and quantifying UDMH transformation products. Copyright © 2018 Elsevier B.V. All rights reserved.
Automated MALDI Matrix Coating System for Multiple Tissue Samples for Imaging Mass Spectrometry
NASA Astrophysics Data System (ADS)
Mounfield, William P.; Garrett, Timothy J.
2012-03-01
Uniform matrix deposition on tissue samples for matrix-assisted laser desorption/ionization (MALDI) is key for reproducible analyte ion signals. Current methods often result in nonhomogeneous matrix deposition, and take time and effort to produce acceptable ion signals. Here we describe a fully automated method for matrix deposition using an enclosed spray chamber and spray nozzle for matrix solution delivery. A commercial air-atomizing spray nozzle was modified and combined with solenoid-controlled valves and a Programmable Logic Controller (PLC) to control and deliver the matrix solution. A spray chamber was employed to contain the nozzle, sample, and atomized matrix solution stream, to prevent any interference from outside conditions, and to allow complete control of the sample environment. A gravity cup was filled with MALDI matrix solutions, including DHB in chloroform/methanol (50:50) at concentrations up to 60 mg/mL. Various samples (including rat brain tissue sections) were prepared using two deposition methods (spray chamber, inkjet). A linear ion trap equipped with an intermediate-pressure MALDI source was used for analyses. Optical microscopic examination showed a uniform coating of matrix crystals across the sample. Overall, the mass spectral images gathered from tissues coated using the spray chamber system were of better quality and more reproducible than those from tissue specimens prepared by the inkjet deposition method.
Winterfield, Craig; van de Voort, F R
2014-12-01
The Fluid Life Corporation assessed and implemented Fourier transform infrared spectroscopy (FTIR)-based methods using American Society for Testing and Materials (ASTM)-like stoichiometric reactions for determination of acid and base number for in-service mineral-based oils. The basic protocols, quality control procedures, calibration, validation, and performance of these new quantitative methods are assessed. ASTM correspondence is attained using a mixed-mode calibration, anchored by primary reference standards and supplemented by representative sample lubricants analyzed by ASTM procedures. A partial least squares calibration is devised by combining primary acid/base reference standards and representative samples, focusing on the main spectral stoichiometric response, with chemometrics assisting in accounting for matrix variability. FTIR(AN/BN) methodology is precise, accurate, and free of most of the interferences that affect ASTM D664 and D4739 results. Extensive side-by-side operational runs produced normally distributed differences with mean differences close to zero and standard deviations of 0.18 and 0.26 mg KOH/g, respectively. Statistically, the FTIR methods are a direct match to the ASTM methods, with superior performance in terms of analytical throughput, preparation time, and solvent use. FTIR(AN/BN) analysis is a viable, significant advance for in-service lubricant analysis, providing an economic means of trending samples instead of tedious and expensive conventional ASTM(AN/BN) procedures. © 2014 Society for Laboratory Automation and Screening.
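For orientation, the following minimal Python sketch shows the general shape of a PLS calibration like the one described, relating spectra to acid number with scikit-learn. The spectra, the band location, the noise level and the component count are all invented for illustration; they are not Fluid Life's actual data or settings.

import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
spectra = rng.normal(size=(120, 300))   # 120 calibration samples x 300 wavenumber points
w = np.zeros(300)
w[40:60] = 0.05                         # a hypothetical acid-related band drives the response
an = spectra @ w + rng.normal(scale=0.05, size=120)  # reference acid numbers, mg KOH/g

pls = PLSRegression(n_components=4)     # component count would be chosen by cross-validation
print("mean CV R^2:", cross_val_score(pls, spectra, an, cv=5).mean())
pls.fit(spectra, an)
print("predicted AN for first 3 samples:", pls.predict(spectra[:3]).ravel())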
NASA Astrophysics Data System (ADS)
McInnes, B.; Danišík, M.; Evans, N.; McDonald, B.; Becker, T.; Vermeesch, P.
2015-12-01
We present a new laser-based technique for rapid, quantitative and automated in situ microanalysis of U, Th, Sm, Pb and He for applications in geochronology, thermochronometry and geochemistry (Evans et al., 2015). This novel capability permits a detailed interrogation of the time-temperature history of rocks containing apatite, zircon and other accessory phases by providing both (U-Th-Sm)/He and U-Pb ages (plus trace element analysis) on single crystals. In situ laser microanalysis offers several advantages over conventional bulk crystal methods in terms of safety, cost, productivity and spatial resolution. We developed and integrated a suite of analytical instruments including a 193 nm ArF excimer laser system (RESOlution M-50A-LR), a quadrupole ICP-MS (Agilent 7700s), an Alphachron helium mass spectrometry system and swappable flow-through and ultra-high vacuum analytical chambers. The analytical protocols include the following steps: mounting/polishing in PFA Teflon using methods similar to those adopted for fission track etching; laser He extraction and analysis using a 2 s ablation at 5 Hz and 2-3 J/cm² fluence; He pit volume measurement using atomic force microscopy; and U-Th-Sm-Pb (plus optional trace element) analysis using traditional laser ablation methods. The major analytical challenges for apatite include the low U, Th and He contents relative to zircon and the elevated common Pb content. On the other hand, apatite typically has less extreme and less complex zoning of parent isotopes (primarily U and Th). A freeware application has been developed for determining (U-Th-Sm)/He ages from the raw analytical data, and Iolite software was used for U-Pb age and trace element determination. In situ double-dating has successfully replicated conventional U-Pb and (U-Th)/He age variations in xenocrystic zircon from the diamondiferous Ellendale lamproite pipe, Western Australia, and increased zircon analytical throughput by a factor of 50 over conventional methods. Reference: Evans NJ, McInnes BIA, McDonald B, Becker T, Vermeesch P, Danisik M, Shelley M, Marillo-Sialer E and Patterson D. An in situ technique for (U-Th-Sm)/He and U-Pb double dating. J. Analytical Atomic Spectrometry, 30, 1636-1645.
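The (U-Th-Sm)/He ages mentioned above are conventionally obtained by solving the standard alpha-ingrowth equation for t; the textbook form is reproduced below for reference (this is the generic equation, not necessarily the exact formulation coded in the authors' freeware).

\begin{equation}
{}^{4}\mathrm{He} = 8\,{}^{238}\mathrm{U}\left(e^{\lambda_{238}t}-1\right)
 + 7\,{}^{235}\mathrm{U}\left(e^{\lambda_{235}t}-1\right)
 + 6\,{}^{232}\mathrm{Th}\left(e^{\lambda_{232}t}-1\right)
 + {}^{147}\mathrm{Sm}\left(e^{\lambda_{147}t}-1\right)
\end{equation}

Because t appears in every exponential, the equation is solved iteratively (e.g., by Newton's method) from the measured He, U, Th and Sm abundances.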
Non-Contact Conductivity Measurement for Automated Sample Processing Systems
NASA Technical Reports Server (NTRS)
Beegle, Luther W.; Kirby, James P.
2012-01-01
A new method has been developed for monitoring and control of automated sample processing and preparation, especially focusing on desalting of samples before analytical analysis (described in more detail in Automated Desalting Apparatus, NPO-45428, NASA Tech Briefs, Vol. 34, No. 8, August 2010, page 44). The use of non-contact conductivity probes, one at the inlet and one at the outlet of the solid-phase sample preparation media, allows monitoring of the process and acts as a trigger for the start of the next step in the sequence. At each step of the multi-step process, the system is flushed with low-conductivity water, which sets the system back to an overall low-conductivity state. This measurement then triggers the next stage of the sample processing protocol and greatly minimizes the use of consumables. In the case of amino acid sample preparation for desalting, the conductivity measurement defines three key conditions for the sample preparation process: first, when the system is neutralized (low conductivity, by washing with excess de-ionized water); second, when the system is acidified by washing with a strong acid (high conductivity); and third, when the system is at a basic condition of high pH (high conductivity). Taken together, this non-contact conductivity measurement will not only facilitate automation of the sample preparation and processing, but will also act as a way to optimize operational time and the use of consumables.
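As a rough illustration of the triggering logic just described, the Python sketch below polls a simulated outlet probe and advances through the neutral/acid/base states on conductivity thresholds. The threshold values, probe interface, and simulated readings are all assumptions for illustration, not the laboratory or flight implementation.

import itertools
import time

LOW, HIGH = 5.0, 200.0   # assumed thresholds, microsiemens/cm

# Simulated outlet readings for one cycle: DI flush (low), acid wash (high),
# DI re-flush (low), basic elution (high). Real probe hardware replaces this.
_sim = itertools.cycle([2.0, 3.0, 250.0, 4.0, 240.0])

def read_conductivity(probe: str) -> float:
    """Stand-in for the non-contact probe readout at 'inlet' or 'outlet'."""
    return next(_sim)

def wait_until(probe, predicate, poll_s=0.01):
    """Poll the probe until the predicate on conductivity is satisfied."""
    while not predicate(read_conductivity(probe)):
        time.sleep(poll_s)

def run_desalting_cycle():
    wait_until("outlet", lambda c: c < LOW)    # 1) neutral: flushed with DI water
    print("neutralized")
    wait_until("outlet", lambda c: c > HIGH)   # 2) acidified: strong-acid wash detected
    print("acidified")
    wait_until("outlet", lambda c: c < LOW)    # back to low-conductivity baseline
    wait_until("outlet", lambda c: c > HIGH)   # 3) basic elution (high pH) detected
    print("eluted at high pH; cycle complete")

run_desalting_cycle()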
NASA Astrophysics Data System (ADS)
Steinberg, P. D.; Brener, G.; Duffy, D.; Nearing, G. S.; Pelissier, C.
2017-12-01
Hyperparameterization of statistical models, i.e., automated model scoring and selection via methods such as evolutionary algorithms, grid searches, and randomized searches, can improve forecast model skill by reducing errors associated with model parameterization, model structure, and the statistical properties of training data. Ensemble Learning Models (Elm), and the related Earthio package, provide a flexible interface for automating the selection of parameters and model structure for machine learning models common in climate science and land cover classification, offering convenient tools for loading NetCDF, HDF, Grib, or GeoTiff files, decomposition methods like PCA and manifold learning, and parallel training and prediction with unsupervised and supervised classification, clustering, and regression estimators. Continuum Analytics is using Elm to experiment with statistical soil moisture forecasting based on meteorological forcing data from NASA's North American Land Data Assimilation System (NLDAS). There, Elm uses the NSGA-2 multiobjective optimization algorithm to optimize statistical preprocessing of forcing data to improve goodness-of-fit for statistical models (i.e., feature engineering). This presentation will discuss Elm and its components, including dask (distributed task scheduling), xarray (data structures for n-dimensional arrays), and scikit-learn (statistical preprocessing, clustering, classification, regression), and will show how NSGA-2 is being used to automate the selection of soil moisture forecast statistical models for North America.
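As a concrete, minimal example of the kind of automated model scoring and selection the abstract describes, the sketch below runs a randomized hyperparameter search with scikit-learn directly. It does not use the Elm or Earthio APIs, and the data and parameter ranges are invented.

import numpy as np
from scipy.stats import randint
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import RandomizedSearchCV

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 8))                              # stand-in for forcing features
y = 0.5 * X[:, 0] - 0.3 * X[:, 1] + rng.normal(scale=0.1, size=300)  # stand-in target

# Randomized search: sample 10 candidate configurations, score each by 3-fold CV.
search = RandomizedSearchCV(
    RandomForestRegressor(random_state=0),
    param_distributions={"n_estimators": randint(20, 150),
                         "max_depth": randint(2, 10)},
    n_iter=10, cv=3, random_state=0)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))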
Weber, Emanuel; Pinkse, Martijn W. H.; Bener-Aksam, Eda; Vellekoop, Michael J.; Verhaert, Peter D. E. M.
2012-01-01
We present a fully automated setup for performing in-line mass spectrometry (MS) analysis of conditioned media in cell cultures, in particular focusing on the peptides therein. The goal is to assess peptides secreted by cells under different culture conditions. The developed system is compatible with MS as an analytical technique, as this is one of the most powerful analysis methods for peptide detection and identification. Proof of concept was achieved using the well-known mating-factor signaling in baker's yeast, Saccharomyces cerevisiae. Our concept system holds 1 mL of cell culture medium and allows maintaining a yeast culture for at least 40 hours with continuous supernatant extraction (and medium replenishing). The device's small dimensions result in reduced costs for reagents and open perspectives towards full integration on-chip. Experimental data that can be obtained are time-resolved peptide profiles in a yeast culture, including information about the appearance of mating-factor-related peptides. We emphasize that the system operates without any manual intervention or pipetting steps, which allows for improved overall sensitivity compared to non-automated alternatives. MS data confirmed previously reported aspects of the physiology of the yeast-mating process. Moreover, mating-factor breakdown products (as well as evidence for a potentially responsible protease) were found. PMID:23091722
Ghani, Milad; Font Picó, Maria Francesca; Salehinia, Shima; Palomino Cabello, Carlos; Maya, Fernando; Berlier, Gloria; Saraji, Mohammad; Cerdà, Víctor; Turnes Palomino, Gemma
2017-03-10
We present for the first time the application of metal-organic framework (MOF) mixed-matrix disks (MMDs) for the automated flow-through solid-phase extraction (SPE) of environmental pollutants. Zirconium terephthalate UiO-66 and UiO-66-NH2 MOFs of different crystal sizes (90, 200 and 300 nm) have been incorporated into mechanically stable polyvinylidene difluoride (PVDF) disks. The performance of the MOF-MMDs for automated SPE of seven substituted phenols prior to HPLC analysis has been evaluated using the sequential injection analysis technique. MOF-MMDs enabled the simultaneous extraction of phenols with the concomitant size exclusion of molecules of larger size. The best extraction performance was obtained using a MOF-MMD containing 90 nm UiO-66-NH2 crystals. Using the selected MOF-MMD, detection limits ranging from 0.1 to 0.2 μg L-1 were obtained. Relative standard deviations ranged from 3.9 to 5.3% intra-day and from 4.7 to 5.7% inter-day. Membrane batch-to-batch reproducibility was from 5.2 to 6.4%. Three different groundwater samples were analyzed with the proposed method using MOF-MMDs, obtaining recoveries ranging from 90 to 98% for all tested analytes. Copyright © 2017 Elsevier B.V. All rights reserved.
Programmable, automated transistor test system
NASA Technical Reports Server (NTRS)
Truong, L. V.; Sundburg, G. R.
1986-01-01
A programmable, automated transistor test system was built to supply experimental data on new and advanced power semiconductors. The data will be used for analytical models and by engineers in designing space and aircraft electric power systems. A pulsed power technique was used at low duty cycles in a nondestructive test to examine the dynamic switching characteristic curves of power transistors in the 500 to 1000 V, 10 to 100 A range. Data collection, manipulation, storage, and output are operator interactive but are guided and controlled by the system software.
2003-09-18
NASA Dryden's Automated Aerial Refueling (AAR) project evaluated the capability of an F/A-18A aircraft as an in-flight refueling tanker with the objective of developing analytical models for an automated aerial refueling system for unmanned air vehicles. The F/A-18 "tanker" aircraft (No. 847) underwent flight test envelope expansion with an aerodynamic pod containing air-refueling equipment carried beneath the fuselage. The second aircraft flew as the receiver aircraft during the study to assess the free-stream hose and drogue dynamics on the F/A-18A.
NASA Astrophysics Data System (ADS)
Wong, Kin-Yiu; Gao, Jiali
2007-12-01
Based on Kleinert's variational perturbation (KP) theory [Path Integrals in Quantum Mechanics, Statistics, Polymer Physics, and Financial Markets, 3rd ed. (World Scientific, Singapore, 2004)], we present an analytic path-integral approach for computing the effective centroid potential. The approach enables the KP theory to be applied to any realistic system beyond the first-order perturbation (i.e., the original Feynman-Kleinert [Phys. Rev. A 34, 5080 (1986)] variational method). Accurate values are obtained for several systems in which exact quantum results are known. Furthermore, the computed kinetic isotope effects for a series of proton transfer reactions, in which the potential energy surfaces are evaluated by density-functional theory, are in good agreement with experiments. We hope that our method could be used by non-path-integral experts or experimentalists as a "black box" for any given system.
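For readers new to the topic, the first-order member of this hierarchy, the Feynman-Kleinert effective classical potential, has the following familiar form, transcribed from the standard literature as an orientation aid (conventions may differ from the paper, whose contribution is the extension beyond this first order):

\begin{align}
V_{a^2}(x_0) &= \int_{-\infty}^{\infty} \frac{dx'}{\sqrt{2\pi a^2}}\;
  e^{-(x'-x_0)^2/2a^2}\, V(x') , \\
W_1(x_0) &= \frac{1}{\beta}\ln\frac{\sinh\!\bigl(\beta\hbar\Omega(x_0)/2\bigr)}{\beta\hbar\Omega(x_0)/2}
  + V_{a^2}(x_0) - \frac{M}{2}\,\Omega^2(x_0)\,a^2(x_0) ,
\end{align}

where the Gaussian smearing width a^2(x_0) and trial frequency Omega(x_0) are fixed self-consistently by minimizing W_1 at each centroid position x_0.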
Patton, Charles J.; Kryskalla, Jennifer R.
2011-01-01
In addition to operational details and performance benchmarks for these new DA-AtNaR2 nitrate + nitrite assays, this report also provides results of interference studies for common inorganic and organic matrix constituents at 1, 10, and 100 times their median concentrations in surface-water and groundwater samples submitted annually to the NWQL for nitrate + nitrite analyses. Paired t-test and Wilcoxon signed-rank statistical analyses of results determined by CFA-CdR methods and DA-AtNaR2 methods indicate that nitrate concentration differences between population means or sign ranks were either statistically equivalent to zero at the 95 percent confidence level (p ≥ 0.05) or analytically equivalent to zero; that is, when p < 0.05, concentration differences between population means or medians were less than MDLs.
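A minimal sketch of the paired equivalence checks named above, using SciPy on invented paired nitrate results (the NWQL data are not reproduced here):

import numpy as np
from scipy.stats import ttest_rel, wilcoxon

cfa  = np.array([0.52, 1.10, 3.75, 0.88, 2.40, 5.01])  # mg-N/L by CFA-CdR, hypothetical
dana = np.array([0.50, 1.12, 3.70, 0.90, 2.38, 5.05])  # same samples by DA-AtNaR2

t_stat, t_p = ttest_rel(cfa, dana)        # paired t-test on the mean difference
w_stat, w_p = wilcoxon(cfa - dana)        # Wilcoxon signed-rank on paired differences
print(f"paired t: p={t_p:.3f}; Wilcoxon: p={w_p:.3f}")
# Per the report's logic, p >= 0.05 indicates differences statistically
# equivalent to zero at the 95 percent confidence level.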
Sorbello, Alfred; Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier
2017-03-22
We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. A prototype open-source, web-based software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end-user satisfaction.
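The abstract does not spell out which disproportionality statistic the tool computes; two standard choices, the proportional reporting ratio (PRR) and the reporting odds ratio (ROR), are sketched below on an invented 2x2 drug-event table as a plausible illustration.

def prr(a, b, c, d):
    """Proportional reporting ratio for a drug-event pair.
    a: reports with drug and event; b: drug without event;
    c: other drugs with event; d: other drugs without event."""
    return (a / (a + b)) / (c / (c + d))

def ror(a, b, c, d):
    """Reporting odds ratio for the same 2x2 contingency table."""
    return (a * d) / (b * c)

# Invented counts: 40 co-mentions of the drug and event, 960 drug-only,
# 200 event with other drugs, 98800 neither.
print(round(prr(40, 960, 200, 98800), 1), round(ror(40, 960, 200, 98800), 1))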
Low speed hybrid generalized predictive control of a gasoline-propelled car.
Romero, M; de Madrid, A P; Mañoso, C; Milanés, V
2015-07-01
Low-speed driving in traffic jams causes significant pollution and wasted time for commuters. Additionally, from the passengers' standpoint, this is an uncomfortable, stressful and tedious scene that is suitable for automation. The highly nonlinear dynamics of car engines at low speed turn its automation into a complex problem that still remains unsolved. Considering the hybrid nature of vehicle longitudinal control at low speed, constantly switching between throttle and brake pedal actions, hybrid control is a good candidate to solve this problem. This work presents the analytical formulation of a hybrid predictive controller for automated low-speed driving. It takes advantage of valuable characteristics supplied by predictive control strategies, both for compensating unmodeled dynamics and for guaranteeing passenger security and comfort analytically by means of the treatment of constraints. The proposed controller was implemented in a gasoline-propelled vehicle to experimentally validate the adopted solution. To this end, different scenarios were analyzed, varying road layouts and vehicle speeds, within a private test track. The production vehicle is a commercial Citroën C3 Pluriel which has been modified to automatically act over its throttle and brake pedals. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
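For context, a generic (non-hybrid) generalized predictive controller minimizes a cost of the following textbook form at each step; the paper's hybrid formulation adds the throttle/brake switching logic and constraint handling on top of this. The notation below is the standard GPC one, not necessarily the paper's:

\begin{equation}
J = \sum_{j=N_1}^{N_2} \delta(j)\,\bigl[\hat{y}(t+j\mid t) - w(t+j)\bigr]^2
  + \sum_{j=1}^{N_u} \lambda(j)\,\bigl[\Delta u(t+j-1)\bigr]^2
\end{equation}

Here \hat{y}(t+j\mid t) is the predicted output (e.g., vehicle speed), w the reference trajectory, \Delta u the control increments (pedal actions), N_1, N_2 the prediction horizon, N_u the control horizon, and \delta, \lambda weighting sequences.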
Sheldon, E M; Downar, J B
2000-08-15
Novel approaches to the development of analytical procedures for monitoring incoming starting material in support of chemical/pharmaceutical processes are described. High-technology solutions were utilized for timely process development and preparation of high-quality clinical supplies. A single robust HPLC method was developed and characterized for the analysis of the key starting material from three suppliers. Each supplier used a different process for the preparation of this material and, therefore, each supplier's material exhibited a unique impurity profile. The HPLC method utilized standard techniques acceptable for release testing in a QC/manufacturing environment. An automated experimental design protocol was used to characterize the robustness of the HPLC method. The method was evaluated for linearity, limit of quantitation, solution stability, and precision of replicate injections. An LC-MS method that emulated the release HPLC method was developed and the identities of impurities were mapped between the two methods.
NASA Technical Reports Server (NTRS)
Spencer, Maegan K.; Liu, De-Ling; Kanik, Isik; Beegle, Luther
2010-01-01
Because salt and metals can mask the signature of a variety of organic molecules (like amino acids) in any given sample, an automated system to purify complex field samples has been created for the analytical techniques of electrospray ionization mass spectrometry (ESI/MS), capillary electrophoresis (CE), and biological assays where unique identification requires at least some processing of complex samples. This development allows for automated sample preparation in the laboratory and analysis of complex samples in the field with multiple types of analytical instruments. Rather than using tedious, exacting protocols for desalting samples by hand, this innovation, called the Automated Sample Processing System (ASPS), takes analytes that have been extracted through high-temperature solvent extraction and introduces them into the desalting column. After 20 minutes, the eluent is produced. This clear liquid can then be directly analyzed by the techniques listed above. The current apparatus, including the computer and power supplies, is sturdy, has an approximate mass of 10 kg and outer dimensions of about 20 × 20 × 20 cm, and is undergoing further miniaturization. This system currently targets amino acids. For these molecules, a slurry of 1 g cation exchange resin in deionized water is packed into a column of the apparatus. Initial generation of the resin is done by sequentially flowing 2.3 bed volumes of 2 N NaOH and 2 N HCl (1 mL each) to rinse the resin, followed by 0.5 mL of deionized water. This makes the pH of the resin near neutral and eliminates cross-sample contamination. Afterward, 2.3 mL of extracted sample is loaded into the column onto the top of the resin bed. Because the column is packed tightly, the sample can be applied without disturbing the resin bed. This is a vital step needed to ensure that the analytes adhere to the resin. After the sample is drained, oxalic acid (1 mL, pH 1.6-1.8, adjusted with NH4OH) is pumped into the column. Oxalic acid works as a chelating reagent to draw out metal ions, such as calcium and iron, which would otherwise interfere with amino acid analysis. After the oxalic acid, 1 mL of 0.01 N HCl and 1 mL of deionized water are used to sequentially rinse the resin. Finally, the amino acids still attached to the resin are eluted using 2.5 M NH4OH (1 mL), and the NH4OH eluent is collected in a vial for analysis.
Sreemany, Arpita; Bera, Melinda Kumar; Sarkar, Anindya
2017-12-30
The elaborate sampling and analytical protocol associated with conventional dual-inlet isotope ratio mass spectrometry has long hindered high-resolution climate studies of biogenic accretionary carbonates. Laser-based on-line systems, in comparison, produce rapid data, but suffer from unresolvable matrix effects. It is, therefore, necessary to resolve these matrix effects to take advantage of the automated laser-based method. Two marine bivalve shells (one aragonite and one calcite) and one fish otolith (aragonite) were first analysed using a CO2 laser ablation system attached to a continuous-flow isotope ratio mass spectrometer under different experimental conditions (different laser power, sample untreated vs vacuum roasted). The shells and the otolith were then micro-drilled and the isotopic compositions of the powders were measured in a dual-inlet isotope ratio mass spectrometer following the conventional acid digestion method. The vacuum-roasted samples (both aragonite and calcite) produced mean isotopic ratios (with a reproducibility of ±0.2‰ for both δ18O and δ13C values) almost identical to the values obtained using the conventional acid digestion method. As the isotopic ratios of the acid-digested samples fall within the analytical precision (±0.2‰) of the laser ablation system, this suggests the usefulness of the method for studying the biogenic accretionary carbonate matrix. When using laser-based continuous-flow isotope ratio mass spectrometry for high-resolution isotopic measurements of biogenic carbonates, the employment of a vacuum-roasting step will reduce the matrix effect. This method will be of immense help to geologists and sclerochronologists in exploring short-term changes in climatic parameters (e.g. seasonality) in geological times. Copyright © 2017 John Wiley & Sons, Ltd.
Assessment of an Automated Touchdown Detection Algorithm for the Orion Crew Module
NASA Technical Reports Server (NTRS)
Gay, Robert S.
2011-01-01
Orion Crew Module (CM) touchdown detection is critical to activating the post-landing sequence that safes the Reaction Control System (RCS) jets, ensures that the vehicle remains upright, and establishes communication with recovery forces. In order to accommodate safe landing of an unmanned vehicle or an incapacitated crew, an onboard automated detection system is required. An Orion-specific touchdown detection algorithm was developed and evaluated to differentiate landing events from in-flight events. The proposed method will be used to initiate post-landing cutting of the parachute riser lines, to prevent CM rollover, and to terminate RCS jet firing prior to submersion. The RCS jets continue to fire until touchdown to maintain proper CM orientation with respect to the flight path and to limit impact loads, but have potentially hazardous consequences if submerged while firing. The time available after impact to cut risers and initiate the CM Up-righting System (CMUS) is measured in minutes, whereas the time from touchdown to RCS jet submersion is a function of descent velocity and sea state conditions, and is often less than one second. Evaluation of the detection algorithms was performed for in-flight events (e.g. descent under chutes) using high-fidelity rigid body analyses in the Decelerator Systems Simulation (DSS), whereas water impacts were simulated using a rigid finite element model of the Orion CM in LS-DYNA. Two touchdown detection algorithms were evaluated with various thresholds: acceleration magnitude spike detection, and accumulated velocity change (over a given time window) spike detection. Data for both detection methods are acquired from an onboard Inertial Measurement Unit (IMU) sensor. The detection algorithms were tested with analytically generated in-flight and landing IMU data simulations. The acceleration spike detection proved to be faster while maintaining the desired safety margin. Time to RCS jet submersion was predicted analytically across a series of simulated Orion landing conditions. This paper details the touchdown detection method chosen and the analysis used to support the decision.
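A schematic Python rendering of the two candidate detectors follows; the thresholds, sample rate, and window length are placeholders, not Orion flight values.

import numpy as np

def accel_spike_index(acc_xyz, threshold_g=8.0):
    """Return index of first sample whose acceleration magnitude exceeds the threshold."""
    mag = np.linalg.norm(acc_xyz, axis=1)                 # |a| per sample, in g
    hits = np.nonzero(mag > threshold_g)[0]
    return int(hits[0]) if hits.size else None

def delta_v_spike_index(acc_xyz, dt, window, threshold_mps=3.0):
    """Return index where accumulated |a|*dt over a sliding window exceeds the threshold."""
    mag = np.linalg.norm(acc_xyz, axis=1) * 9.81          # convert g to m/s^2
    dv = np.convolve(mag * dt, np.ones(window), mode="valid")
    hits = np.nonzero(dv > threshold_mps)[0]
    return int(hits[0]) if hits.size else None

# Demo: quiet 1 g descent followed by a brief high-g landing transient at sample 100.
acc = np.full((120, 3), [0.0, 0.0, 1.0])
acc[100:103] = [0.0, 0.0, 12.0]
print(accel_spike_index(acc), delta_v_spike_index(acc, dt=0.01, window=20))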
Karadağ, Sevinç; Görüşük, Emine M; Çetinkaya, Ebru; Deveci, Seda; Dönmez, Koray B; Uncuoğlu, Emre; Doğu, Mustafa
2018-01-25
A fully automated flow injection analysis (FIA) system was developed for determination of phosphate ion in nutrient solutions. This newly developed FIA system is a portable, rapid and sensitive measuring instrument that allows on-line analysis and monitoring of phosphate ion concentration in nutrient solutions. The molybdenum blue method, which is widely used in FIA phosphate analysis, was adapted to the developed FIA system. The method is based on the formation of ammonium Mo(VI) ion by reaction of ammonium molybdate with the phosphate ion present in the medium. The Mo(VI) ion then reacts with ascorbic acid and is reduced to the spectrometrically measurable Mo(V) ion. New software specific for flow analysis was developed in the LabVIEW development environment to control all the components of the FIA system. The important factors affecting the analytical signal were identified as reagent flow rate, injection volume and post-injection flow path length, and they were optimized using Box-Behnken experimental design and response surface methodology. The optimum point for the maximum analytical signal was calculated as 0.50 mL min-1 reagent flow rate, 100 µL sample injection volume and 60 cm post-injection flow path length. The proposed FIA system had a sampling frequency of 100 samples per hour over a linear working range of 3-100 mg L-1 (R² = 0.9995). The relative standard deviation (RSD) was 1.09% and the limit of detection (LOD) was 0.34 mg L-1. Various nutrient solutions from a tomato-growing hydroponic greenhouse were analyzed with the developed FIA system and the results were found to be in good agreement with vanadomolybdate chemical method findings. © 2018 Society of Chemical Industry.
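The calibration figures quoted (working range, R², LOD) come from routine linear-calibration arithmetic. Below is a minimal sketch with invented absorbance readings and an assumed blank noise level, using the common LOD ≈ 3.3·σ(blank)/slope convention rather than the authors' exact procedure.

import numpy as np

conc = np.array([3, 10, 25, 50, 75, 100])                        # mg/L phosphate standards
signal = np.array([0.031, 0.101, 0.252, 0.506, 0.751, 1.002])    # absorbance, made up

slope, intercept = np.polyfit(conc, signal, 1)   # least-squares calibration line
r2 = np.corrcoef(conc, signal)[0, 1] ** 2
lod = 3.3 * 0.001 / slope                        # assumed blank standard deviation of 0.001 AU
print(f"slope={slope:.4f} AU per mg/L, R^2={r2:.4f}, LOD={lod:.2f} mg/L")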
Multiplex Quantitative Histologic Analysis of Human Breast Cancer Cell Signaling and Cell Fate
2008-05-01
[Only fragments of this report abstract survive in the source; the remainder is report-form residue. Recoverable content: subject terms are breast cancer, cell signaling, cell proliferation, histology, and image analysis. The fragments indicate that DAPI-stained nuclei are often not counted during subsequent image analysis, and that tumor sections were stained for the analytes (p-ERK, p-AKT, Ki67) and for epithelial cytokeratin (CK) so that tumor cells may be identified during subsequent automated image analysis.]
$ANBA; a rapid, combined data acquisition and correction program for the SEMQ electron microprobe
McGee, James J.
1983-01-01
$ANBA is a program developed for rapid data acquisition and correction on an automated SEMQ electron microprobe. The program provides increased analytical speed and reduced disk read/write operations compared with the manufacturer's software, resulting in a doubling of analytical throughput. In addition, the program provides enhanced analytical features such as averaging, rapid and compact data storage, and on-line plotting. The program is described with design philosophy, flow charts, variable names, a complete program listing, and system requirements. A complete operating example and notes to assist in running the program are included.
Campone, Luca; Piccinelli, Anna Lisa; Celano, Rita; Pagano, Imma; Russo, Mariateresa; Rastrelli, Luca
2016-01-08
This study reports a fast and automated analytical procedure for the analysis of aflatoxin M1 (AFM1) in milk and dairy products. The method is based on simultaneous protein precipitation and AFM1 extraction by salt-induced liquid-liquid extraction (SI-LLE), followed by online solid-phase extraction (online SPE) coupled to ultra-high-pressure liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) analysis for the automatic pre-concentration, clean-up, and sensitive and selective determination of AFM1. The main parameters affecting the extraction efficiency and accuracy of the analytical method were studied in detail. Under the optimal conditions, acetonitrile and NaCl were used as the extraction/denaturant solvent and the salting-out agent in SI-LLE, respectively. After centrifugation, the organic phase (acetonitrile) was diluted with water (1:9 v/v) and purified (1 mL) by an online C18 cartridge coupled with a UHPLC column. Finally, selected reaction monitoring (SRM) acquisition mode was applied to the detection of AFM1. Validation studies were carried out on different dairy products (whole and skimmed cow milk, yogurt, goat milk, and powder infant formula), providing method quantification limits about 25 times lower than the AFM1 maximum levels permitted by EU regulation 1881/2006 in milk and dairy products for direct human consumption. Recoveries (86-102%) and repeatability (RSD < 3%, n = 6) meet the performance criteria required by EU regulation No. 401/2006 for the determination of the levels of mycotoxins in foodstuffs. Moreover, no matrix effects were observed in the different milk and dairy products studied. The proposed method improves the performance of AFM1 analysis in milk samples, as AFM1 determination is performed with a degree of accuracy higher than conventional methods. Other advantages are the reduction of the sample preparation procedure, time and cost of the analysis, enabling a high sample throughput that meets the current concerns of food safety and public health protection. Copyright © 2015 Elsevier B.V. All rights reserved.
An evolutionary view of chromatography data systems used in bioanalysis.
McDowall, R D
2010-02-01
This is a personal view of how chromatographic peak measurement and analyte quantification for bioanalysis have evolved from the manual methods of 1970 to the electronic working possible in 2010. In four decades there have been major changes from a simple chart recorder output (that was interpreted and quantified manually) through simple automation of peak measurement, calculation of standard curves and quality control values and instrument control to the networked chromatography data systems of today that are capable of interfacing with Laboratory Information Management Systems and other IT applications. The incorporation of electronic signatures to meet regulatory requirements offers a great opportunity for business improvement and electronic working.
Talluri, Murali V N Kumar; Kalariya, Pradipbhai D; Dharavath, Shireesha; Shaikh, Naeem; Garg, Prabha; Ramisetti, Nageswara Rao; Ragampeta, Srinivas
2016-09-01
A novel ultra-high performance liquid chromatography method development strategy was devised by applying a quality by design approach. The systematic approach was divided into five steps: (i) analytical target profile, (ii) critical quality attributes, (iii) risk assessment of critical parameters using design of experiments (screening and optimization phases), (iv) generation of the design space, and (v) process capability analysis (Cp) for the robustness study using Monte Carlo simulation. The complete quality-by-design-based method development was automated and expedited by employing a sub-2 μm particle column with an ultra-high performance liquid chromatography system. Successful chromatographic separation of Coenzyme Q10 from its biotechnological process-related impurities was achieved on a Waters Acquity phenyl hexyl (100 mm × 2.1 mm, 1.7 μm) column with gradient elution of 10 mM ammonium acetate buffer (pH 4.0) and a mixture of acetonitrile/2-propanol (1:1) as the mobile phase. Through this study, a fast and organized method development workflow was established and the robustness of the method was demonstrated. The method was validated for specificity, linearity, accuracy, precision, and robustness in compliance with the International Conference on Harmonization Q2(R1) guidelines. The impurities were identified by the atmospheric pressure chemical ionization mass spectrometry technique. Further, the in silico toxicity of the impurities was analyzed using TOPKAT and DEREK software. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
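Step (v) can be pictured with a small Monte Carlo sketch: perturb method parameters around their set points, propagate them through a response model, and compute Cp = (USL - LSL) / (6*sigma). The response model, noise levels, and specification limits below are invented for illustration, not taken from the paper.

import numpy as np

rng = np.random.default_rng(7)
n = 10_000
ph   = rng.normal(4.0, 0.05, n)     # buffer pH set point with assumed variation
temp = rng.normal(30.0, 0.5, n)     # column temperature, deg C, assumed variation

# Hypothetical linear response model for a critical quality attribute (resolution):
resolution = 2.5 + 0.8 * (ph - 4.0) - 0.05 * (temp - 30.0) + rng.normal(0, 0.02, n)

lsl, usl = 2.3, 2.7                 # assumed specification limits
cp = (usl - lsl) / (6 * resolution.std())
print(f"Cp = {cp:.2f}")             # Cp >= 1.33 is a common robustness target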
Kim, Jimin P; Xie, Zhiwei; Creer, Michael; Liu, Zhiwen; Yang, Jian
2017-01-01
Chloride is an essential electrolyte that maintains homeostasis within the body, and abnormal chloride levels in biological fluids may indicate various diseases such as Cystic Fibrosis. However, current analytical solutions for chloride detection fail to meet the clinical needs of both high performance and low material or labor costs, hindering translation into clinical settings. Here we present a new class of fluorescence chloride sensors derived from a facile citrate-based synthesis platform that utilize dynamic quenching mechanisms. Based on this low-cost platform, we demonstrate for the first time a selective sensing strategy that uses a single fluorophore to detect multiple halides simultaneously, promising both selectivity and automation to improve performance and reduce labor costs. We also demonstrate the clinical utility of citrate-based sensors as a new sweat chloride test method for the diagnosis of Cystic Fibrosis by performing analytical validation with sweat controls and clinical validation with sweat from individuals with or without Cystic Fibrosis. Lastly, molecular modeling studies reveal the structural mechanism behind chloride sensing, serving to expand this class of fluorescence sensors with improved chloride sensitivities. Thus citrate-based fluorescent materials may enable low-cost, automated multi-analysis systems for simpler, yet accurate, point-of-care diagnostics that can be readily translated into clinical settings. More broadly, a wide range of medical, industrial, and environmental applications can be achieved with such a facile synthesis platform, as demonstrated in our citrate-based biodegradable polymers with intrinsic fluorescence sensing.
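Dynamic (collisional) quenching, the mechanism the abstract invokes, follows the Stern-Volmer relation; it is stated here for reference in its standard form, not as taken from the paper:

\begin{equation}
\frac{F_0}{F} = 1 + K_{\mathrm{SV}}[Q] = 1 + k_q \tau_0 [Q]
\end{equation}

where F_0 and F are the fluorescence intensities without and with the quencher Q (here, the halide ion), K_SV is the Stern-Volmer constant, k_q the bimolecular quenching rate constant, and tau_0 the unquenched fluorophore lifetime. A linear F_0/F versus [Q] plot is what makes a ratiometric halide readout, and hence automated quantification, possible.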
Hematology of healthy Florida manatees (Trichechus manatus).
Harvey, John W; Harr, Kendal E; Murphy, David; Walsh, Michael T; Nolan, Elizabeth C; Bonde, Robert K; Pate, Melanie G; Deutsch, Charles J; Edwards, Holly H; Clapp, William L
2009-06-01
Hematologic analysis is an important tool in evaluating the general health status of free-ranging manatees and in the diagnosis and monitoring of rehabilitating animals. The purpose of this study was to evaluate diagnostically important hematologic analytes in healthy manatees (Trichechus manatus) and to assess variations with respect to location (free ranging vs captive), age class (small calves, large calves, subadults, and adults), and gender. Blood was collected from 55 free-ranging and 63 captive healthy manatees. Most analytes were measured using a CELL-DYN 3500R; automated reticulocytes were measured with an ADVIA 120. Standard manual methods were used for differential leukocyte counts, reticulocyte and Heinz body counts, and plasma protein and fibrinogen concentrations. Rouleaux, slight polychromasia, stomatocytosis, and low numbers of schistocytes and nucleated RBCs (NRBCs) were seen often in stained blood films. Manual reticulocyte counts were higher than automated reticulocyte counts. Heinz bodies were present in erythrocytes of most manatees. Compared with free-ranging manatees, captive animals had slightly lower MCV, MCH, and eosinophil counts and slightly higher heterophil and NRBC counts, and fibrinogen concentration. Total leukocyte, heterophil, and monocyte counts tended to be lower in adults than in younger animals. Small calves tended to have higher reticulocyte counts and NRBC counts than older animals. Hematologic findings were generally similar between captive and free-ranging manatees. Higher manual reticulocyte counts suggest the ADVIA detects only reticulocytes containing large amounts of RNA. Higher reticulocyte and NRBC counts in young calves probably reflect an increased rate of erythropoiesis compared with older animals.
Forecasting Significant Societal Events Using The Embers Streaming Predictive Analytics System
Doyle, Andy; Katz, Graham; Summers, Kristen; Ackermann, Chris; Zavorin, Ilya; Lim, Zunsik; Muthiah, Sathappan; Butler, Patrick; Self, Nathan; Zhao, Liang; Lu, Chang-Tien; Khandpur, Rupinder Paul; Fayed, Youssef; Ramakrishnan, Naren
2014-01-01
Developed under the Intelligence Advanced Research Project Activity Open Source Indicators program, Early Model Based Event Recognition using Surrogates (EMBERS) is a large-scale big data analytics system for forecasting significant societal events, such as civil unrest events, on the basis of continuous, automated analysis of large volumes of publicly available data. It has been operational since November 2012 and delivers approximately 50 predictions each day for countries of Latin America. EMBERS is built on a streaming, scalable, loosely coupled, shared-nothing architecture using ZeroMQ as its messaging backbone and JSON as its wire data format. It is deployed on Amazon Web Services using an entirely automated deployment process. We describe the architecture of the system, some of the design tradeoffs encountered during development, and specifics of the machine learning models underlying EMBERS. We also present a detailed prospective evaluation of EMBERS in forecasting significant societal events in the past 2 years. PMID:25553271
Advanced, Analytic, Automated (AAA) Measurement of Engagement During Learning
D’Mello, Sidney; Dieterle, Ed; Duckworth, Angela
2017-01-01
It is generally acknowledged that engagement plays a critical role in learning. Unfortunately, the study of engagement has been stymied by a lack of valid and efficient measures. We introduce the advanced, analytic, and automated (AAA) approach to measure engagement at fine-grained temporal resolutions. The AAA measurement approach is grounded in embodied theories of cognition and affect, which advocate a close coupling between thought and action. It uses machine-learned computational models to automatically infer mental states associated with engagement (e.g., interest, flow) from machine-readable behavioral and physiological signals (e.g., facial expressions, eye tracking, click-stream data) and from aspects of the environmental context. We present 15 case studies that illustrate the potential of the AAA approach for measuring engagement in digital learning environments. We discuss strengths and weaknesses of the AAA approach, concluding that it has significant promise to catalyze engagement research. PMID:29038607
Systematic Assessment of the Hemolysis Index: Pros and Cons.
Lippi, Giuseppe
2015-01-01
Preanalytical quality is as important as the analytical and postanalytical quality in laboratory diagnostics. After decades of visual inspection to establish whether or not a diagnostic sample may be suitable for testing, automated assessment of hemolysis index (HI) has now become available in a large number of laboratory analyzers. Although most national and international guidelines support systematic assessment of sample quality via HI, there is widespread perception that this indication has not been thoughtfully acknowledged. Potential explanations include concern of increased specimen rejection rate, poor harmonization of analytical techniques, lack of standardized units of measure, differences in instrument-specific cutoff, negative impact on throughput, organization and laboratory economics, and lack of a reliable quality control system. Many of these concerns have been addressed. Evidence now supports automated HI in improving quality and patient safety. These will be discussed. © 2015 Elsevier Inc. All rights reserved.
Prüller, Florian; Wagner, Jasmin; Raggam, Reinhard B; Hoenigl, Martin; Kessler, Harald H; Truschnig-Wilders, Martie; Krause, Robert
2014-07-01
Testing for (1→3)-beta-D-glucan (BDG) is used for the detection of invasive fungal infection. However, current assays lack automation and the ability to conduct rapid single-sample testing. The Fungitell assay was adapted for automation and evaluated using clinical samples from patients with culture-proven candidemia and from culture-negative controls, in duplicate. A comparison with the standard assay protocol was made in order to establish analytical specifications. With the automated protocol, the analytical measuring range was 8-2500 pg/ml of BDG, and precision testing resulted in coefficients of variation that ranged from 3.0% to 5.5%. Samples from 15 patients with culture-proven candidemia and 94 culture-negative samples were evaluated. All culture-proven samples showed BDG values >80 pg/ml (mean 1247 pg/ml; range 116-2990 pg/ml), which were considered positive. Of the 94 culture-negative samples, 92 had BDG values <60 pg/ml (mean 28 pg/ml), which were considered negative, and 2 samples were false-positive (≥80 pg/ml; up to 124 pg/ml). Results could be obtained within 45 min and showed excellent agreement with results obtained with the standard assay protocol. The automated Fungitell assay proved to be reliable and rapid for the diagnosis of candidemia. It was demonstrated to be feasible and cost-efficient for both single-sample and large-scale testing of serum BDG. Its 1-h time-to-result will allow better support for clinicians in the management of antifungal therapy. © The Author 2014. Published by Oxford University Press on behalf of The International Society for Human and Animal Mycology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Alexovič, Michal; Horstkotte, Burkhard; Solich, Petr; Sabo, Ján
2016-02-11
A critical overview of the automation of modern liquid-phase microextraction (LPME) approaches based on the liquid impregnation of porous sorbents and membranes is presented. It is the continuation of part 1, in which non-dispersive LPME techniques based on the use of the extraction phase (EP) in the form of a drop, plug, film, or microflow were surveyed. Compared to the approaches described in part 1, porous materials provide an improved support for the EP. Simultaneously, they enlarge its contact surface and reduce the risk of loss by incident flow or by components of the surrounding matrix. Solvent-impregnated membranes or hollow fibres are further ideally suited for analyte extraction with simultaneous or subsequent back-extraction. Their use can therefore improve procedural robustness and reproducibility, and it "opens the door" to new operation modes and fields of application. However, additional work and time are required for membrane replacement and renewed impregnation. Automation of porous support-based and membrane-based approaches plays an important role in achieving better reliability, rapidness, and reproducibility compared to manual assays. Automated renewal of the extraction solvent and coupling of sample pretreatment with the detection instrumentation can be named as examples. The different LPME methodologies using impregnated membranes and porous supports for the extraction phase, the different strategies for their automation, and their analytical applications are comprehensively described and discussed in this part. Finally, an outlook on future demands and perspectives of the LPME techniques from both parts, as a promising area in the field of sample pretreatment, is given. Copyright © 2015 Elsevier B.V. All rights reserved.
Applications of Raman Spectroscopy in Biopharmaceutical Manufacturing: A Short Review.
Buckley, Kevin; Ryder, Alan G
2017-06-01
The production of active pharmaceutical ingredients (APIs) is currently undergoing its biggest transformation in a century. The changes are based on the rapid and dramatic introduction of protein- and macromolecule-based drugs (collectively known as biopharmaceuticals) and can be traced back to the huge investment in biomedical science (in particular in genomics and proteomics) that has been ongoing since the 1970s. Biopharmaceuticals (or biologics) are manufactured using biological-expression systems (such as mammalian, bacterial, insect cells, etc.) and have spawned a large (>€35 billion sales annually in Europe) and growing biopharmaceutical industry (BioPharma). The structural and chemical complexity of biologics, combined with the intricacy of cell-based manufacturing, imposes a huge analytical burden to correctly characterize and quantify both processes (upstream) and products (downstream). In small molecule manufacturing, advances in analytical and computational methods have been extensively exploited to generate process analytical technologies (PAT) that are now used for routine process control, leading to more efficient processes and safer medicines. In the analytical domain, biologic manufacturing is considerably behind and there is both a huge scope and need to produce relevant PAT tools with which to better control processes, and better characterize product macromolecules. Raman spectroscopy, a vibrational spectroscopy with a number of useful properties (nondestructive, non-contact, robustness) has significant potential advantages in BioPharma. Key among them are intrinsically high molecular specificity, the ability to measure in water, the requirement for minimal (or no) sample pre-treatment, the flexibility of sampling configurations, and suitability for automation. Here, we review and discuss a representative selection of the more important Raman applications in BioPharma (with particular emphasis on mammalian cell culture). The review shows that the properties of Raman have been successfully exploited to deliver unique and useful analytical solutions, particularly for online process monitoring. However, it also shows that its inherent susceptibility to fluorescence interference and the weakness of the Raman effect mean that it can never be a panacea. In particular, Raman-based methods are intrinsically limited by the chemical complexity and wide analyte-concentration-profiles of cell culture media/bioprocessing broths which limit their use for quantitative analysis. Nevertheless, with appropriate foreknowledge of these limitations and good experimental design, robust analytical methods can be produced. In addition, new technological developments such as time-resolved detectors, advanced lasers, and plasmonics offer potential of new Raman-based methods to resolve existing limitations and/or provide new analytical insights.
Measurement of late-night salivary cortisol with an automated immunoassay system.
Vogeser, Michael; Durner, Jürgen; Seliger, Ewald; Auernhammer, Christoph
2006-01-01
Measurement of late-night salivary cortisol concentrations is increasingly used as a screening test in suspected Cushing's syndrome. Cortisol concentrations are typically extremely low in late-night samples and discordant assay-specific reference ranges have been reported. Therefore, the aim of our study was to assess the analytical performance of the first automated cortisol immunoassay specified for salivary measurements and to establish late-night sampling reference-range data for this test. Salivary cortisol was measured using the Roche Cobas Cortisol assay (Roche Diagnostics). Five salivary pools in different concentration ranges were used to assess the inter-assay imprecision of this test in a two-centre evaluation protocol including two reagent lots. Linearity was tested by serial dilution. Salivary samples were obtained at 23:00 h from 100 apparently healthy volunteers using a commercially available salivary sampling device (Salivette, Sarstedt). A subset of 20 samples was used for method comparison with isotope dilution liquid chromatography-tandem mass spectrometry. Inter-assay coefficients of variation (n=20) between 11.6% and 40.4% were found for mean cortisol concentrations between 12.9 and 2.6 nmol/L, with an estimated functional sensitivity of approximately 5.0 nmol/L. The test also gave linear results in the lowest concentration range between 1.0 and 8.3 nmol/L. Mean late-night salivary cortisol of 5.0 nmol/L was found for healthy individuals; the absolute range was 1.4-16.7 nmol/L, and the 95th percentile was 8.9 nmol/L. Substantially lower concentrations were found with isotope dilution LC-MS/MS compared to immunoassay results (mean concentrations 1.8 and 4.4 nmol/L, respectively). The automated assay investigated was found to offer acceptable analytical performance in the very low concentration range required for late-night salivary cortisol, despite a very short turn-around time. Using this assay, late-night salivary cortisol concentrations below 8.9 nmol/L are typically found in healthy volunteers.
Seiden-Long, Isolde; Schnabl, Kareena; Skoropadyk, Wendy; Lennon, Nola; McKeage, Arlayne
2014-08-01
Adaptation of the Randox Enzymatic Manual UV Ammonia method for use on the Roche Cobas 6000 (c501) automated analyzer platform. The Randox ammonia reagent was evaluated for precision, linearity, accuracy and interference from hemolysis, icterus and lipemia on the Roche c501 analyzer. Comparison studies were conducted for the Randox reagent between the Roche c501, Siemens Vista, Ortho Vitros 250, and Beckman DxC methods. The Randox reagent demonstrates acceptable within-run precision (L1 = 65 μmol/L, CV 3.4%; L2 = 168 μmol/L, CV 1.9%), between-run precision (L1 = 29 μmol/L, CV 7.3%; L2 = 102 μmol/L, CV 3.0%), analytical measurement range (7-940 μmol/L), and accuracy. The interference profile is superior for the Randox method (hemolysis index up to 600, icteric index up to 60, lipemic index up to 100) as compared to the Roche method (hemolysis index up to 200, icteric index up to 10, lipemic index up to 50). Comparison was very good between the Randox reagent and two other wet chemistry platforms. The Randox Enzymatic Manual UV Ammonia reagent is an available alternative to the Roche Cobas c501 reagent. The method is more robust to endogenous interferences and less prone to instrument error flags, thus allowing the majority of clinical specimens to be reported without additional sample handling at our institution. Copyright © 2014 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
Wiemken, Timothy L; Furmanek, Stephen P; Mattingly, William A; Haas, Janet; Ramirez, Julio A; Carrico, Ruth M
2018-06-01
Hand hygiene is one of the most important interventions in the quest to eliminate healthcare-associated infections, yet adherence rates in healthcare facilities are markedly low. Since hand hygiene observation and feedback are critical to improving adherence, we created an easy-to-use, platform-independent hand hygiene data collection process and an automated, on-demand reporting engine. A 3-step approach was used for this project: 1) creation of a data collection form using Google Forms, 2) transfer of data from the form to a spreadsheet using Google Spreadsheets, and 3) creation of an automated, cloud-based analytics platform for report generation using R and RStudio Shiny software. A video tutorial of all steps in the creation and use of this free tool can be found on our YouTube channel: https://www.youtube.com/watch?v=uFatMR1rXqU&t. The on-demand reporting tool can be accessed at: https://crsp.louisville.edu/shiny/handhygiene. This data collection and automated analytics engine provides an easy-to-use environment for evaluating hand hygiene data; it also provides rapid feedback to healthcare workers. By reducing some of the data management workload required of the infection preventionist, more focused interventions may be instituted to increase global hand hygiene rates and reduce infection. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
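For readers who want to reproduce the spirit of step 3 in another environment, the sketch below (Python rather than the authors' R/Shiny stack) reads a form-response export and computes adherence rates by unit. The export URL and column names are hypothetical; the published tool's actual schema may differ.

import pandas as pd

# Hypothetical CSV export of the Google Form responses; the real sheet ID
# and column names used by the published tool may differ.
CSV_URL = "https://docs.google.com/spreadsheets/d/EXAMPLE_ID/export?format=csv"

def adherence_report(csv_source):
    df = pd.read_csv(csv_source)
    # Assumed schema: one row per observed opportunity, with a 'unit' column
    # and a yes/no 'performed' column.
    df["performed"] = df["performed"].str.lower().eq("yes")
    summary = (df.groupby("unit")["performed"]
                 .agg(opportunities="count", adherent="sum"))
    summary["rate_%"] = 100 * summary["adherent"] / summary["opportunities"]
    return summary.sort_values("rate_%")

# adherence_report(CSV_URL) would return per-unit adherence percentages.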
Lippi, Giuseppe; Montagnana, Martina; Giavarina, Davide
2006-01-01
Owing to remarkable advances in automation, laboratory technology and informatics, the pre-analytical phase has become the major source of variability in laboratory testing. The present survey investigated the development of several pre-analytical processes within a representative cohort of Italian clinical laboratories. A seven-point questionnaire was designed to investigate the following issues: 1a) the mean outpatient waiting time before check-in and 1b) the mean time from check-in to sample collection; 2) the mean time from sample collection to analysis; 3) the type of specimen collected for clinical chemistry testing; 4) the degree of pre-analytical automation; 5a) the number of samples shipped to other laboratories and 5b) the availability of standardised protocols for transportation; 6) the conditions for specimen storage; and 7) the availability and type of guidelines for management of unsuitable specimens. The questionnaire was administered to 150 laboratory specialists attending the SIMEL (Italian Society of Laboratory Medicine) National Meeting in June 2006. A total of 107 questionnaires (71.3%) were returned. Data analysis revealed a high degree of variability among laboratories for the time required for check-in, outpatient sampling, sample transportation to the referral laboratory and analysis upon arrival. Only 31% of laboratories have automated some pre-analytical steps. Of the 87% of laboratories that ship specimens to other facilities without sample preparation, 19% have no standardised protocol for transportation. For conventional clinical chemistry testing, 74% of the laboratories use serum evacuated tubes (59% with and 15% without serum separator), whereas the remaining 26% use lithium-heparin evacuated tubes (11% with and 15% without plasma separator). The storage period and conditions for rerun/retest vary widely. Only 63% of laboratories have a codified procedure for the management of unsuitable specimens, which are recognised by visual inspection (69%) or automatic detection (29%). Only 56% of the laboratories have standardised procedures for the management of unsuitable specimens, which vary widely on a local basis. The survey highlights broad heterogeneity in several pre-analytical processes among Italian laboratories. The lack of reliable guidelines encompassing evidence-based practice is a major problem for the standardisation of this crucial part of the testing process and represents a major challenge for laboratory medicine in the 2000s.
Taylor, R. Andrew; Pare, Joseph R.; Venkatesh, Arjun K.; Mowafi, Hani; Melnick, Edward R.; Fleischman, William; Hall, M. Kennedy
2018-01-01
Objectives: Predictive analytics in emergency care has mostly been limited to the use of clinical decision rules (CDRs) in the form of simple heuristics and scoring systems. In the development of CDRs, limitations in analytic methods and concerns with usability have generally constrained models to a preselected small set of variables judged to be clinically relevant and to rules that are easily calculated. Furthermore, CDRs frequently suffer from questions of generalizability, take years to develop, and lack the ability to be updated as new information becomes available. Newer analytic and machine learning techniques capable of harnessing the large number of variables that are already available through electronic health records (EHRs) may better predict patient outcomes and facilitate automation and deployment within clinical decision support systems. In this proof-of-concept study, a local, big data–driven, machine learning approach is compared to existing CDRs and traditional analytic methods using the prediction of sepsis in-hospital mortality as the use case. Methods: This was a retrospective study of adult ED visits admitted to the hospital meeting criteria for sepsis from October 2013 to October 2014. Sepsis was defined as meeting criteria for systemic inflammatory response syndrome with an infectious admitting diagnosis in the ED. ED visits were randomly partitioned into an 80%/20% split for training and validation. A random forest model (machine learning approach) was constructed using over 500 clinical variables from data available within the EHRs of four hospitals to predict in-hospital mortality. The machine learning prediction model was then compared to a classification and regression tree (CART) model, logistic regression model, and previously developed prediction tools on the validation data set using area under the receiver operating characteristic curve (AUC) and chi-square statistics. Results: There were 5,278 visits among 4,676 unique patients who met criteria for sepsis. Of the 4,222 patients in the training group, 210 (5.0%) died during hospitalization, and of the 1,056 patients in the validation group, 50 (4.7%) died during hospitalization. The AUCs with 95% confidence intervals (CIs) for the different models were as follows: random forest model, 0.86 (95% CI = 0.82 to 0.90); CART model, 0.69 (95% CI = 0.62 to 0.77); logistic regression model, 0.76 (95% CI = 0.69 to 0.82); CURB-65, 0.73 (95% CI = 0.67 to 0.80); MEDS, 0.71 (95% CI = 0.63 to 0.77); and mREMS, 0.72 (95% CI = 0.65 to 0.79). The random forest model AUC was statistically different from all other models (p ≤ 0.003 for all comparisons). Conclusions: In this proof-of-concept study, a local big data–driven, machine learning approach outperformed existing CDRs as well as traditional analytic techniques for predicting in-hospital mortality of ED patients with sepsis. Future research should prospectively evaluate the effectiveness of this approach and whether it translates into improved clinical outcomes for high-risk sepsis patients. The methods developed serve as an example of a new model for predictive analytics in emergency care that can be automated, applied to other clinical outcomes of interest, and deployed in EHRs to enable locally relevant clinical predictions. PMID:26679719
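A minimal sketch of the model comparison described above, using scikit-learn on synthetic data in place of the EHR feature matrix; the sample size, class balance (about 5% mortality) and the 80%/20% split mirror the paper, but nothing else is taken from it.

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the >500-variable EHR matrix (5,278 visits, ~5% deaths).
X, y = make_classification(n_samples=5278, n_features=500, n_informative=40,
                           weights=[0.95], random_state=42)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.20, random_state=42)

rf = RandomForestClassifier(n_estimators=500, random_state=42).fit(X_tr, y_tr)
lr = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

for name, model in [("random forest", rf), ("logistic regression", lr)]:
    auc = roc_auc_score(y_va, model.predict_proba(X_va)[:, 1])
    print(f"{name}: AUC = {auc:.2f}")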
Development of Fully Automated Low-Cost Immunoassay System for Research Applications.
Wang, Guochun; Das, Champak; Ledden, Bradley; Sun, Qian; Nguyen, Chien
2017-10-01
Enzyme-linked immunosorbent assay (ELISA) automation for routine operation in a small research environment would be very attractive. A portable fully automated low-cost immunoassay system was designed, developed, and evaluated with several protein analytes. It features disposable capillary columns as the reaction sites and uses real-time calibration for improved accuracy. It reduces the overall assay time to less than 75 min with the ability of easy adaptation of new testing targets. The running cost is extremely low due to the nature of automation, as well as reduced material requirements. Details about system configuration, components selection, disposable fabrication, system assembly, and operation are reported. The performance of the system was initially established with a rabbit immunoglobulin G (IgG) assay, and an example of assay adaptation with an interleukin 6 (IL6) assay is shown. This system is ideal for research use, but could work for broader testing applications with further optimization.
Seamless Digital Environment – Data Analytics Use Case Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oxstrand, Johanna
Multiple research efforts in the U.S. Department of Energy Light Water Reactor Sustainability (LWRS) Program study the need for, and the design of, an underlying architecture to support the increased amount and use of data in the nuclear power plant. More specifically, the LWRS research efforts Digital Architecture for an Automated Plant, Automated Work Packages, Computer-Based Procedures for Field Workers, and Online Monitoring have all identified the need for a digital architecture and, more importantly, for a Seamless Digital Environment (SDE). An SDE provides a means to access multiple applications, gather the data points needed, conduct the analysis requested, and present the result to the user with minimal or no effort by the user. During the 2016 annual Nuclear Information Technology Strategic Leadership (NITSL) group meeting, the nuclear utilities identified the need for research focused on data analytics. The effort was to develop and evaluate use cases for data mining and analytics employing information from plant sensors and databases for use in developing improved business analytics. The goal of the study is to research potential approaches to building an analytics solution for equipment reliability, on a small scale, focusing on either a single piece of equipment or a single system. The analytics solution will likely consist of a data integration layer, a predictive and machine learning layer, and a user interface layer that displays the output of the analysis in a straightforward, easy-to-consume manner. This report describes the use case study initiated by NITSL and conducted in a collaboration between Idaho National Laboratory, Arizona Public Service – Palo Verde Nuclear Generating Station, and NextAxiom Inc.
NASA Astrophysics Data System (ADS)
Sheu, R.; Marcotte, A.; Khare, P.; Ditto, J.; Charan, S.; Gentner, D. R.
2017-12-01
Intermediate-volatility and semi-volatile organic compounds (I/SVOCs) are major precursors to secondary organic aerosol, and contribute to tropospheric ozone formation. Their wide volatility range, chemical complexity, behavior in analytical systems, and trace concentrations present numerous hurdles to characterization. We present an integrated sampling-to-analysis system for the collection and offline analysis of trace gas-phase organic compounds with the goal of preserving and recovering analytes throughout sample collection, transport, storage, and thermal desorption for accurate analysis. Custom multi-bed adsorbent tubes are used to collect samples for offline analysis by advanced analytical detectors. The analytical instrumentation comprises an automated thermal desorption system that introduces analytes from the adsorbent tubes into a gas chromatograph, which is coupled with an electron ionization mass spectrometer (GC-EIMS) and other detectors. In order to optimize the collection and recovery for a wide range of analyte volatility and functionalization, we evaluated a variety of commercially available materials, including Res-Sil beads, quartz wool, glass beads, Tenax TA, and silica gel. Key properties for optimization include inertness, versatile chemical capture, minimal affinity for water, and minimal artifacts or degradation byproducts; these properties were assessed with a diverse mix of traditionally-measured and functionalized analytes. Along with a focus on material selection, we provide recommendations spanning the entire sampling-and-analysis process to improve the accuracy of future comprehensive I/SVOC measurements, including oxygenated and other functionalized I/SVOCs. We demonstrate the performance of our system by providing results on speciated VOCs and SVOCs from indoor, outdoor, and chamber studies that establish the utility of our protocols and pave the way for precise laboratory characterization via a mix of detection methods.
Petruzziello, Filomena; Grand-Guillaume Perrenoud, Alexandre; Thorimbert, Anita; Fogwill, Michael; Rezzi, Serge
2017-07-18
Analytical solutions enabling the quantification of circulating levels of liposoluble micronutrients such as vitamins and carotenoids are currently limited to either single or a reduced panel of analytes. The requirement to use multiple approaches hampers the investigation of the biological variability on a large number of samples in a time- and cost-efficient manner. With the goal to develop high-throughput and robust quantitative methods for the profiling of micronutrients in human plasma, we introduce a novel, validated workflow for the determination of 14 fat-soluble vitamins and carotenoids in a single run. Automated supported liquid extraction was optimized and implemented to simultaneously parallelize 48 samples in 1 h, and the analytes were measured using ultrahigh-performance supercritical fluid chromatography coupled to tandem mass spectrometry in less than 8 min. Improved mass spectrometry interface hardware was built to minimize the post-decompression volume and to allow better control of the chromatographic effluent density on its route toward and into the ion source. In addition, a specific make-up solvent condition was developed to ensure both analyte and matrix constituent solubility after mobile phase decompression. The optimized interface resulted in improved spray plume stability and conserved matrix compound solubility, leading to enhanced hyphenation robustness while ensuring suitable analytical repeatability and improved detection sensitivity. The overall developed methodology gives recoveries within 85-115%, as well as within-day and between-day coefficients of variation of 2% and 14%, respectively.
Ho, Sirikit; Lukacs, Zoltan; Hoffmann, Georg F; Lindner, Martin; Wetter, Thomas
2007-07-01
In newborn screening with tandem mass spectrometry, multiple intermediary metabolites are quantified in a single analytical run for the diagnosis of fatty-acid oxidation disorders, organic acidurias, and aminoacidurias. Published diagnostic criteria for these disorders normally incorporate a primary metabolic marker combined with secondary markers, often analyte ratios, for which the markers have been chosen to reflect metabolic pathway deviations. We applied a procedure to extract new markers and diagnostic criteria for newborn screening to the data of newborns with confirmed medium-chain acyl-CoA dehydrogenase deficiency (MCADD) and a control group from the newborn screening program, Heidelberg, Germany. We validated the results with external data of the screening center in Hamburg, Germany. We extracted new markers by performing a systematic search for analyte combinations (features) with high discriminatory performance for MCADD. To select feature thresholds, we applied automated procedures to separate controls and cases on the basis of the feature values. Finally, we built classifiers from these new markers to serve as diagnostic criteria in screening for MCADD. On the basis of χ² scores, we identified approximately 800 of >628,000 new analyte combinations with superior discriminatory performance compared with the best published combinations. Classifiers built with the new features achieved diagnostic sensitivities and specificities approaching 100%. Feature construction methods provide ways to disclose information hidden in the set of measured analytes. Other diagnostic tasks based on high-dimensional metabolic data might also profit from this approach.
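The feature-construction step can be pictured as an exhaustive scoring of analyte combinations. The sketch below ranks pairwise concentration ratios by χ² separation of cases from controls on simulated data; it is a simplified stand-in for the published procedure, and the analyte names and cut-point rule are illustrative.

from itertools import permutations
import numpy as np
from scipy.stats import chi2_contingency

def score_ratio_features(X, labels, names):
    # Rank pairwise analyte ratios by chi-square separation of cases/controls,
    # dichotomizing each ratio at its median as a crude automated cut-point.
    scores = {}
    for i, j in permutations(range(X.shape[1]), 2):
        ratio = X[:, i] / X[:, j]
        hi = ratio > np.median(ratio)
        table = [[np.sum(hi & (labels == 1)), np.sum(hi & (labels == 0))],
                 [np.sum(~hi & (labels == 1)), np.sum(~hi & (labels == 0))]]
        scores[f"{names[i]}/{names[j]}"] = chi2_contingency(table)[0]
    return sorted(scores.items(), key=lambda kv: -kv[1])

rng = np.random.default_rng(1)
X = rng.lognormal(size=(200, 4))      # simulated acylcarnitine concentrations
y = rng.integers(0, 2, 200)           # 1 = confirmed MCADD, 0 = control
print(score_ratio_features(X, y, ["C8", "C10", "C2", "C12"])[:3])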
The Status and Promise of Advanced M&V: An Overview of “M&V 2.0” Methods, Tools, and Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Franconi, Ellen; Gee, Matt; Goldberg, Miriam
Advanced measurement and verification (M&V) of energy efficiency savings, often referred to as M&V 2.0 or advanced M&V, is currently an object of much industry attention. Thus far, however, there has been a lack of clarity about what techniques M&V 2.0 includes, how those techniques differ from traditional approaches, what the key considerations are for their use, and what value propositions M&V 2.0 presents to different stakeholders. The objective of this paper is to provide background information and frame key discussion points related to advanced M&V. The paper identifies the benefits, methods, and requirements of advanced M&V and outlines keymore » technical issues for applying these methods. It presents an overview of the distinguishing elements of M&V 2.0 tools and of how the industry is addressing needs for tool testing, consistency, and standardization, and it identifies opportunities for collaboration. In this paper, we consider two key features of M&V 2.0: (1) automated analytics that can provide ongoing, near-real-time savings estimates, and (2) increased data granularity in terms of frequency, volume, or end-use detail. Greater data granularity for large numbers of customers, such as that derived from comprehensive implementation of advanced metering infrastructure (AMI) systems, leads to very large data volumes. This drives interest in automated processing systems. It is worth noting, however, that automated processing can provide value even when applied to less granular data, such as monthly consumption data series. Likewise, more granular data, such as interval or end-use data, delivers value with or without automated processing, provided the processing is manageable. But it is the combination of greater data detail with automated processing that offers the greatest opportunity for value. Using M&V methods that capture load shapes together with automated processing1 can determine savings in near-real time to provide stakeholders with more timely and detailed information. This information can be used to inform ongoing building operations, provide early input on energy efficiency program design, or assess the impact of efficiency by location and time of day. Stakeholders who can make use of such information include regulators, energy efficiency program administrators, program evaluators, contractors and aggregators, building owners, the investment community, and grid planners. Although each stakeholder has its own priorities and challenges related to savings measurement and verification, the potential exists for all to draw from a single set of efficiency valuation data. Such an integrated approach could provide a base consistency across stakeholder uses.« less
ETE: a python Environment for Tree Exploration.
Huerta-Cepas, Jaime; Dopazo, Joaquín; Gabaldón, Toni
2010-01-13
Many bioinformatics analyses, ranging from gene clustering to phylogenetics, produce hierarchical trees as their main result. These are used to represent the relationships among different biological entities, thus facilitating their analysis and interpretation. A number of standalone programs are available that focus on tree visualization or that perform specific analyses on them. However, such applications are rarely suitable for large-scale surveys, in which a higher level of automation is required. Currently, many genome-wide analyses rely on tree-like data representation and hence there is a growing need for scalable tools to handle tree structures at large scale. Here we present the Environment for Tree Exploration (ETE), a python programming toolkit that assists in the automated manipulation, analysis and visualization of hierarchical trees. ETE libraries provide a broad set of tree handling options as well as specific methods to analyze phylogenetic and clustering trees. Among other features, ETE allows for the independent analysis of tree partitions, has support for the extended newick format, provides an integrated node annotation system and permits linking trees to external data such as multiple sequence alignments or numerical arrays. In addition, ETE implements a number of built-in analytical tools, including phylogeny-based orthology prediction and cluster validation techniques. Finally, ETE's programmable tree drawing engine can be used to automate the graphical rendering of trees with customized node-specific visualizations. ETE provides a complete set of methods to manipulate tree data structures that extends current functionality in other bioinformatic toolkits of a more general purpose. ETE is free software and can be downloaded from http://ete.cgenomics.org.
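A minimal usage sketch, assuming the toolkit's current distribution as the ete3 package (the URL above dates from the original release); the tree, names, and annotations are invented for illustration.

# pip install ete3  -- the toolkit is distributed today as "ete3".
from ete3 import Tree

# Load a newick string (format=1 keeps internal node names) and inspect it.
t = Tree("((Hsa_gene:0.7,Ptr_gene:0.6)primates:0.3,Mmu_gene:1.1);", format=1)
print(t.get_ascii(show_internal=True))

# Attach custom annotations to nodes.
for leaf in t:
    leaf.add_features(species=leaf.name.split("_")[0])

# Analyze a tree partition independently: extract the 'primates' subtree.
primates = t.search_nodes(name="primates")[0]
print(primates.get_leaf_names())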
DOT National Transportation Integrated Search
1982-01-01
The Detailed Station Model (DSM) provides operational and performance measures of alternative station configurations and management policies with respect to vehicle and passenger capabilities. It provides an analytic tool to support tradeoff studies ...
Automated drug identification system
NASA Technical Reports Server (NTRS)
Campen, C. F., Jr.
1974-01-01
System speeds up analysis of blood and urine and is capable of identifying 100 commonly abused drugs. System includes computer that controls entire analytical process by ordering various steps in specific sequences. Computer processes data output and has readout of identified drugs.
Strategies for the Successful Implementation of Viral Laboratory Automation
Avivar, Cristóbal
2012-01-01
It has been estimated that more than 70% of all medical activity is directly related to information providing analytical data. Substantial technological advances have taken place recently, which have allowed a previously unimagined number of analytical samples to be processed while offering high quality results. Concurrently, yet more new diagnostic determinations have been introduced - all of which has led to a significant increase in the prescription of analytical parameters. This increased workload has placed great pressure on the laboratory with respect to health costs. The present manager of the Clinical Laboratory (CL) has had to examine cost control as well as rationing - meaning that the CL’s focus has not been strictly metrological, as if it were purely a system producing results, but instead has had to concentrate on its efficiency and efficacy. By applying re-engineering criteria, an emphasis has had to be placed on improved organisation and operating practice within the CL, focussing on the current criteria of the Integrated Management Areas where the technical and human resources are brought together. This re-engineering has been based on the concepts of consolidating and integrating the analytical platforms, while differentiating the production areas (CORE Laboratory) from the information areas. With these present concepts in mind, automation and virological treatment, along with serology in general, follow the same criteria as the rest of the operating methodology in the Clinical Laboratory. PMID:23248733
Plikus, Maksim V; Zhang, Zina; Chuong, Cheng-Ming
2006-01-01
Background: Understanding research activity within any given biomedical field is important. Search outputs generated by MEDLINE/PubMed are not well classified and require lengthy manual citation analysis. Automation of citation analytics can be very useful and timesaving for both novices and experts. Results: The PubFocus web server automates analysis of MEDLINE/PubMed search queries by enriching them with two widely used human factor-based bibliometric indicators of publication quality: journal impact factor and volume of forward references. In addition to providing basic volumetric statistics, PubFocus also prioritizes citations and evaluates authors' impact on the field of search. PubFocus also analyses the presence and occurrence of biomedical key terms within citations by utilizing controlled vocabularies. Conclusion: We have developed a citation prioritisation algorithm based on journal impact factor, forward referencing volume, referencing dynamics, and author's contribution level. It can be applied either to the primary set of PubMed search results or to the subsets of these results identified through key terms from controlled biomedical vocabularies and ontologies. The NCI (National Cancer Institute) thesaurus and MGD (Mouse Genome Database) mammalian gene orthology have been implemented for key term analytics. PubFocus provides a scalable platform for the integration of multiple available ontology databases. PubFocus analytics can be adapted for input sources of biomedical citations other than PubMed. PMID:17014720
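The prioritisation idea can be sketched as a weighted composite of the bibliometric indicators named above. The weights and normalization below are invented for illustration and do not reproduce PubFocus's actual algorithm.

from dataclasses import dataclass

@dataclass
class Citation:
    title: str
    journal_if: float     # journal impact factor
    forward_refs: int     # number of forward references (times cited)
    years_since_pub: int

def priority(c, w_if=1.0, w_refs=1.0):
    # Citations per year crudely approximates the referencing-dynamics component.
    refs_per_year = c.forward_refs / max(c.years_since_pub, 1)
    return w_if * c.journal_if + w_refs * refs_per_year

papers = [Citation("older, well-cited", 3.2, 40, 10),
          Citation("recent, high-IF", 28.0, 15, 2)]
for p in sorted(papers, key=priority, reverse=True):
    print(f"{p.title}: score {priority(p):.1f}")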
NASA Technical Reports Server (NTRS)
Hwang, Emma Y.; Pappas, Dimitri; Jeevarajan, Antony S.; Anderson, Melody M.
2004-01-01
BACKGROUND: Compact and automated sensors are desired for assessing the health of cell cultures in biotechnology experiments. While several single-analyte sensors exist to measure culture health, a multi-analyte sensor would simplify the cell culture system. One such multi-analyte sensor, the Paratrend 7 manufactured by Diametrics Medical, consists of three optical fibers for measuring pH, dissolved carbon dioxide (pCO(2)), dissolved oxygen (pO(2)), and a thermocouple to measure temperature. The sensor bundle was designed for intra-vascular measurements in clinical settings, and can be used in bioreactors operated both on the ground and in NASA's Space Shuttle and International Space Station (ISS) experiments. METHODS: A Paratrend 7 sensor was placed at the outlet of a bioreactor inoculated with BHK-21 (baby hamster kidney) cells. The pH, pCO(2), pO(2), and temperature data were transferred continuously to an external computer. Cell culture medium, manually extracted from the bioreactor through a sampling port, was also assayed using a bench top blood gas analyzer (BGA). RESULTS: Two Paratrend 7 sensors were used over a single cell culture experiment (64 days). When compared to the manually obtained BGA samples, the sensor had good agreement for pH, pCO(2), and pO(2) with bias (and precision) 0.005 (0.024), 8.0 mmHg (4.4 mmHg), and 11 mmHg (17 mmHg), respectively, for the first two sensors. A third Paratrend sensor (operated for 141 days) had similar agreement (0.02+/-0.15 for pH, -4+/-8 mm Hg for pCO(2), and 24+/-18 mmHg for pO(2)). CONCLUSION: The resulting biases and precisions are comparable to Paratrend sensor clinical results. Although the pO(2) differences may be acceptable for clinically relevant measurement ranges, the O(2) sensor in this bundle may not be reliable enough for the ranges of pO(2) in these cell culture studies without periodic calibration.
A visual analytic framework for data fusion in investigative intelligence
NASA Astrophysics Data System (ADS)
Cai, Guoray; Gross, Geoff; Llinas, James; Hall, David
2014-05-01
Intelligence analysis depends on data fusion systems to provide capabilities of detecting and tracking important objects, events, and their relationships in connection to an analytical situation. However, automated data fusion technologies are not mature enough to offer reliable and trustworthy information for situation awareness. Given the trend of increasing sophistication of data fusion algorithms and loss of transparency in the data fusion process, analysts are left out of the data fusion process cycle with little to no control and confidence in the data fusion outcome. Following the recent rethinking of data fusion as a human-centered process, this paper proposes a conceptual framework towards developing an alternative data fusion architecture. This idea is inspired by the recent advances in our understanding of human cognitive systems, the science of visual analytics, and the latest thinking about human-centered data fusion. Our conceptual framework is supported by an analysis of the limitations of existing fully automated data fusion systems, where the effectiveness of important algorithmic decisions depends on the availability of expert knowledge or the knowledge of the analyst's mental state in an investigation. The success of this effort will result in next-generation data fusion systems that can be better trusted while maintaining high throughput.
Biosensor technology: technology push versus market pull.
Luong, John H T; Male, Keith B; Glennon, Jeremy D
2008-01-01
Biosensor technology is based on a specific biological recognition element in combination with a transducer for signal processing. Since the field's inception, biosensors have been expected to play a significant analytical role in medicine, agriculture, food safety, homeland security, and environmental and industrial monitoring. However, the commercialization of biosensor technology has significantly lagged behind the research output as reflected by a plethora of publications and patenting activities. The rationale behind the slow and limited technology transfer could be attributed to cost considerations and some key technical barriers. Analytical chemistry has changed considerably, driven by automation, miniaturization, and system integration with high throughput for multiple tasks. Such requirements pose a great challenge to biosensor technology, which is often designed to detect one single or a few target analytes. Successful biosensors must be versatile to support interchangeable biorecognition elements, and in addition miniaturization must be feasible to allow automation for parallel sensing with ease of operation at a competitive cost. A significant upfront investment in research and development is a prerequisite in the commercialization of biosensors. The progress in such endeavors is incremental with limited success; thus, the market entry for a new venture is very difficult unless a niche product can be developed with a considerable market volume.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang Shicheng; Department of Environmental Science and Engineering, Fudan University, Shanghai 200433; Cai Lingshuang
2009-05-23
Characterization and quantification of livestock odorants is one of the most challenging analytical tasks because odor-causing gases are very reactive, polar and often present at very low concentrations in a complex matrix of less important or irrelevant gases. The objective of this research was to develop a novel analytical method for characterization of livestock odorants, including their odor character, odor intensity, and hedonic tone, and to apply this method for quantitative analysis of the key odorants responsible for livestock odor. Sorbent tubes packed with Tenax TA were used for field sampling. The automated one-step thermal desorption module coupled with a multidimensional gas chromatography-mass spectrometry/olfactometry system was used for simultaneous chemical and odor analysis. Fifteen odorous VOCs and semi-VOCs identified from different livestock species operations were quantified. Method detection limits range from 40 pg for skatole to 3590 pg for acetic acid. In addition, the odor character, odor intensity and hedonic tone associated with each of the target odorants are analyzed simultaneously. We found that the mass of each VOC in the sample correlates well with the log stimulus intensity. All of the correlation coefficients (R²) are greater than 0.74, and the top 10 correlation coefficients were greater than 0.90.
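The reported mass-to-intensity correlation can be checked with an ordinary least-squares fit of panelist odor intensity against the logarithm of analyte mass. The data points below are invented, not taken from the study.

import numpy as np

def log_stimulus_fit(mass_ng, intensity):
    # Fit intensity = a*log10(mass) + b and report the fit's R^2.
    x = np.log10(np.asarray(mass_ng, dtype=float))
    y = np.asarray(intensity, dtype=float)
    slope, intercept = np.polyfit(x, y, 1)
    r2 = np.corrcoef(x, y)[0, 1] ** 2
    return slope, intercept, r2

mass = [0.5, 1.0, 2.0, 4.0, 8.0]        # ng of odorant collected on the tube
intensity = [1.2, 1.9, 2.6, 3.2, 4.1]   # panelist log stimulus intensity
print(log_stimulus_fit(mass, intensity))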
Guale, Fessessework; Shahreza, Shahriar; Walterscheid, Jeffrey P.; Chen, Hsin-Hung; Arndt, Crystal; Kelly, Anna T.; Mozayani, Ashraf
2013-01-01
Liquid chromatography time-of-flight mass spectrometry (LC–TOF-MS) analysis provides an expansive technique for identifying many known and unknown analytes. This study developed a screening method that utilizes automated solid-phase extraction to purify a wide array of analytes involving stimulants, benzodiazepines, opiates, muscle relaxants, hypnotics, antihistamines, antidepressants and newer synthetic “Spice/K2” cannabinoids and cathinone “bath salt” designer drugs. The extract was applied to LC–TOF-MS analysis, implementing a 13 min chromatography gradient with mobile phases of ammonium formate and methanol using positive mode electrospray. Several common drugs and metabolites can share the same mass and chemical formula among unrelated compounds, but they are structurally different. In this method, the LC–TOF-MS was able to resolve many isobaric compounds by accurate mass correlation within 15 ppm mass units and a narrow retention time interval of less than 10 s of separation. Drug recovery yields varied among spiked compounds, but resulted in overall robust area counts to deliver an average match score of 86 when compared to the retention time and mass of authentic standards. In summary, this method represents a rapid, enhanced screen for blood and urine specimens in postmortem, driving under the influence, and drug facilitated sexual assault forensic toxicology casework. PMID:23118149
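The dual accurate-mass/retention-time criterion translates directly into code. In the sketch below the tolerances come from the text (15 ppm, 10 s); the example m/z is the well-known theoretical value for protonated cocaine, and the retention times are invented.

def matches(observed_mz, theoretical_mz, observed_rt_s, library_rt_s,
            ppm_tol=15.0, rt_tol_s=10.0):
    # Accurate-mass match within a ppm window plus a retention-time window.
    ppm_error = (observed_mz - theoretical_mz) / theoretical_mz * 1e6
    return abs(ppm_error) <= ppm_tol and abs(observed_rt_s - library_rt_s) <= rt_tol_s

# Cocaine [M+H]+ has a theoretical m/z of 304.1543.
print(matches(304.1561, 304.1543, observed_rt_s=402.0, library_rt_s=398.5))  # True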
NASA Astrophysics Data System (ADS)
Vondran, Gary; Chao, Hui; Lin, Xiaofan; Beyer, Dirk; Joshi, Parag; Atkins, Brian; Obrador, Pere
2006-02-01
Running a targeted campaign involves coordination and management across numerous organizations and complex process flows. Everything from market analytics on customer databases, through acquiring content and images, composing the materials, meeting the sponsoring enterprise's brand standards, and driving production and fulfillment, to evaluating results is currently performed by experienced, highly trained staff. Presented is a solution that not only brings together technologies that automate each process, but also automates the entire flow so that a novice user can easily run a successful campaign from the desktop. This paper presents the technologies, structure, and process flows used to bring this system together. We highlight how the complexity of running a targeted campaign is hidden from the user through these technologies, all while providing the benefits of a professionally managed campaign.
Turbulent Motion of Liquids in Hydraulic Resistances with a Linear Cylindrical Slide-Valve
Velescu, C.; Popa, N. C.
2015-01-01
We analyze the motion of viscous and incompressible liquids in the annular space of controllable hydraulic resistances with a cylindrical linear slide-valve. This theoretical study focuses on the turbulent and steady-state motion regimes. The hydraulic resistances mentioned above are the most frequent type of hydraulic resistances used in hydraulic actuators and automation systems. To study the liquids' motion in the controllable hydraulic resistances with a linear cylindrical slide-valve, the report proposes an original analytic method. This study can similarly be applied to any other type of hydraulic resistance. Another purpose of this study is to determine certain mathematical relationships useful to approach the theoretical functionality of hydraulic resistances with magnetic controllable fluids as incompressible fluids in the presence of a controllable magnetic field. In this report, we established general analytic equations to calculate (i) velocity and pressure distributions, (ii) average velocity, (iii) volume flow rate of the liquid, (iv) pressures difference, and (v) radial clearance. PMID:26167532
Riley, Paul W.; Gallea, Benoit; Valcour, Andre
2017-01-01
Background: Testing coagulation factor activities requires that multiple dilutions be assayed and analyzed to produce a single result. The slope of the line created by plotting measured factor concentration against sample dilution is evaluated to discern the presence of inhibitors giving rise to nonparallelism. Moreover, samples producing results on initial dilution falling outside the analytic measurement range of the assay must be tested at additional dilutions to produce reportable results. Methods: The complexity of this process has motivated a large clinical reference laboratory to develop advanced computer algorithms with automated reflex testing rules to complete coagulation factor analysis. A method was developed for autoverification of coagulation factor activity using expert rules developed on an off-the-shelf, commercially available data manager system integrated into an automated coagulation platform. Results: Here, we present an approach allowing for the autoverification and reporting of factor activity results with greatly diminished technologist effort. Conclusions: To the best of our knowledge, this is the first report of its kind providing a detailed procedure for implementation of autoverification expert rules as applied to coagulation factor activity testing. Advantages of this system include ease of training for new operators, minimization of technologist time spent, reduction of staff fatigue, minimization of unnecessary reflex tests, optimization of turnaround time, and assurance of the consistency of the testing and reporting process. PMID:28706751
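A hypothetical expert rule of the kind described, sketched in Python rather than in a data manager's rule language: dilution-corrected results from serial dilutions must agree, and excess scatter (nonparallelism, suggesting an inhibitor) triggers reflex testing. The threshold and values are illustrative, not the laboratory's actual rules.

import numpy as np

def factor_rule(dilutions, measured, cv_limit_pct=15.0):
    # Dilution-corrected activities should be constant for a parallel sample.
    corrected = np.asarray(measured, float) * np.asarray(dilutions, float)
    cv = 100.0 * corrected.std(ddof=1) / corrected.mean()
    if cv <= cv_limit_pct:
        return {"verified": True, "activity": corrected.mean(), "cv_%": round(cv, 1)}
    return {"verified": False, "action": "reflex: repeat at additional dilutions"}

print(factor_rule([10, 20, 40], [8.1, 4.2, 2.0]))   # parallel -> autoverified
print(factor_rule([10, 20, 40], [8.1, 5.5, 3.9]))   # inhibitor-like pattern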
Fast mass spectrometry-based enantiomeric excess determination of proteinogenic amino acids.
Fleischer, Heidi; Thurow, Kerstin
2013-03-01
A rapid determination of the enantiomeric excess of proteinogenic amino acids is of great importance in various fields of chemical and biologic research and industry. Owing to their different biologic effects, enantiomers are interesting research subjects in drug development for the design of new and more efficient pharmaceuticals. Usually, the enantiomeric composition of amino acids is determined by conventional analytical methods such as liquid or gas chromatography or capillary electrophoresis. These analytical techniques do not fulfill the requirements of high-throughput screening due to their relatively long analysis times. The method presented allows a fast analysis of chiral amino acids without prior time-consuming chromatographic separation. The analytical measurements are based on parallel kinetic resolution with pseudoenantiomeric mass-tagged auxiliaries and were carried out by mass spectrometry with electrospray ionization. All 19 chiral proteinogenic amino acids were tested, and Pro, Ser, Trp, His, and Glu were selected as model substrates for verification measurements. The enantiomeric excesses of amino acids with non-polar and aliphatic side chains, as well as Trp and Phe (aromatic side chains), were determined with maximum deviations from the expected value of less than or equal to 10 ee%. Ser, Cys, His, Glu, and Asp were determined with deviations of less than or equal to 14 ee%, and the enantiomeric excess of Tyr was calculated with 17 ee% deviation. The total screening process is fully automated, from sample pretreatment to data processing. The method presented enables fast measurement times of about 1.38 min per sample and is applicable in the scope of high-throughput screening.
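In its simplest form, the readout reduces to an intensity ratio of the two mass-tagged products. The formula and numbers below are a simplified illustration; the published method relies on calibration against standards of known composition.

def enantiomeric_excess(intensity_r, intensity_s):
    # ee% from the MS intensities of the two mass-tagged, pseudoenantiomer-
    # derived products (idealized: assumes equal response factors).
    return 100.0 * (intensity_r - intensity_s) / (intensity_r + intensity_s)

# Illustrative ion intensities for the R- and S-derived products of proline.
print(f"{enantiomeric_excess(8.2e5, 1.4e5):+.1f} ee%")   # about +70.8 ee%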
Anslan, Sten; Bahram, Mohammad; Hiiesalu, Indrek; Tedersoo, Leho
2017-11-01
High-throughput sequencing methods have become a routine analysis tool in environmental sciences as well as in the public and private sectors. These methods provide vast amounts of data, which need to be analysed in several steps. Although the bioinformatics may be carried out using several public tools, many analytical pipelines offer too few options for the optimal analysis of more complicated or customized designs. Here, we introduce PipeCraft, a flexible and handy bioinformatics pipeline with a user-friendly graphical interface that links several public tools for analysing amplicon sequencing data. Users are able to customize the pipeline by selecting the most suitable tools and options to process raw sequences from Illumina, Pacific Biosciences, Ion Torrent and Roche 454 sequencing platforms. We described the design and options of PipeCraft and evaluated its performance by analysing data sets from three different sequencing platforms. We demonstrated that PipeCraft is able to process large data sets within 24 hr. The graphical user interface and the automated links between various bioinformatics tools enable easy customization of the workflow. All analytical steps and options are recorded in log files and are easily traceable. © 2017 John Wiley & Sons Ltd.
Serrano, María; Gallego, Mercedes; Silva, Manuel
2016-03-11
Endogenous aldehydes (EAs) generated during oxidative stress and cell processes are associated with many pathogenic and toxicogenic processes. The aim of this research was to develop a solvent-free and automated analytical method for the determination of EAs in human urine using a static headspace generator sampler coupled with gas chromatography-mass spectrometry (HS-GC-MS). Twelve significant EAs used as markers of different biochemical and physiological processes, namely short- and medium-chain alkanals, α,β-unsaturated aldehydes and dicarbonyl aldehydes, were selected as target analytes. Human urine samples (no dilution is required) were derivatized with O-2,3,4,5,6-pentafluorobenzylhydroxylamine in alkaline medium (hydrogen carbonate-carbonate buffer, pH 10.3). The HS-GC-MS method developed provides an efficient tool for the sensitive and precise determination of EAs in human urine, with limits of detection from 1 to 15 ng/L and relative standard deviations (RSDs) from 6.0 to 7.9%. Average recoveries from enriched urine samples ranged between 92 and 95%. Aldehydes were readily determined at 0.005-50 μg/L levels in human urine from healthy subjects, smokers and diabetic adults. Copyright © 2016 Elsevier B.V. All rights reserved.
Küme, Tuncay; Sağlam, Barıs; Ergon, Cem; Sisman, Ali Rıza
2018-01-01
The aim of this study is to evaluate and compare the analytical performance characteristics of two creatinine methods based on the Jaffe and enzymatic principles. The two original creatinine methods, Jaffe and enzymatic, were evaluated on an Architect c16000 automated analyzer for limit of detection (LOD), limit of quantitation (LOQ), linearity, intra-assay and inter-assay precision, and comparability in serum and urine samples. The method comparison and bias estimation using patient samples according to the CLSI guideline were performed on 230 serum and 141 urine samples analyzed on the same auto-analyzer. The LODs were determined as 0.1 mg/dL for both serum methods, and as 0.25 and 0.07 mg/dL for the Jaffe and enzymatic urine methods, respectively. The LOQs were similar for both serum methods, at 0.05 mg/dL, and the enzymatic urine method had a lower LOQ than the Jaffe urine method (0.5 and 2 mg/dL, respectively). Both methods were linear up to 65 mg/dL for serum and 260 mg/dL for urine. The intra-assay and inter-assay precision data were below desirable levels for both methods. High correlations between the two methods were found in both serum and urine (r=.9994 and r=.9998, respectively). On the other hand, the Jaffe method gave higher creatinine results than the enzymatic method, especially at low concentrations in both serum and urine. Both the Jaffe and enzymatic methods were found to meet the analytical performance requirements for routine use. However, the enzymatic method was found to have better performance at low creatinine levels. © 2017 Wiley Periodicals, Inc.
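A minimal sketch of the paired comparison, reporting Pearson correlation and mean bias (a Bland-Altman-style summary rather than the full CLSI protocol). The paired results are invented for illustration.

import numpy as np

def compare_methods(jaffe, enzymatic):
    # Pearson r plus mean bias; a positive bias means Jaffe reads higher.
    jaffe = np.asarray(jaffe, float)
    enzymatic = np.asarray(enzymatic, float)
    r = np.corrcoef(jaffe, enzymatic)[0, 1]
    bias = np.mean(jaffe - enzymatic)
    return {"r": round(r, 4), "mean_bias_mg_dL": round(bias, 3)}

jaffe     = [0.62, 0.85, 1.10, 1.95, 3.40, 5.10]   # illustrative serum values
enzymatic = [0.51, 0.76, 1.04, 1.90, 3.38, 5.05]
print(compare_methods(jaffe, enzymatic))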
Harik-Khan, R; Moats, W A
1995-01-01
A procedure for identifying and quantitating violative beta-lactams in milk is described. This procedure integrates beta-lactam residue detection kits with the multiresidue automated liquid chromatographic (LC) cleanup method developed in our laboratory. Spiked milk was deproteinized, extracted, and subjected to reversed-phase LC using a gradient program that concentrated the beta-lactams. Amoxicillin, ampicillin, cephapirin, ceftiofur, cloxacillin, and penicillin G were thus separated into 5 fractions that were subsequently tested for activity by using 4 kits. Beta-lactams in the positive fractions were quantitated by analytical LC methods developed in our laboratory. The LC cleanup method separated beta-lactam antibiotics from each other and from interferences in the matrix and also concentrated the antibiotics, thus increasing the sensitivity of the kits to the beta-lactam antibiotics. The procedure facilitated the task of identifying and measuring the beta-lactam antibiotics that may be present in milk samples.
Semiautomated Device for Batch Extraction of Metabolites from Tissue Samples
2012-01-01
Metabolomics has become a mainstream analytical strategy for investigating metabolism. The quality of data derived from these studies is proportional to the consistency of the sample preparation. Although considerable research has been devoted to finding optimal extraction protocols, most of the established methods require extensive sample handling. Manual sample preparation can be highly effective in the hands of skilled technicians, but an automated tool for purifying metabolites from complex biological tissues would be of obvious utility to the field. Here, we introduce the semiautomated metabolite batch extraction device (SAMBED), a new tool designed to simplify metabolomics sample preparation. We discuss SAMBED’s design and show that SAMBED-based extractions are of comparable quality to extracts produced through traditional methods (13% mean coefficient of variation from SAMBED versus 16% from manual extractions). Moreover, we show that aqueous SAMBED-based methods can be completed in less than a quarter of the time required for manual extractions. PMID:22292466
Controlling Wafer Contamination Using Automated On-Line Metrology during Wet Chemical Cleaning
NASA Astrophysics Data System (ADS)
Wang, Jason; Kingston, Skip; Han, Ye; Saini, Harmesh; McDonald, Robert; Mui, Rudy
2003-09-01
The capabilities of a trace contamination analyzer are discussed and demonstrated. This analytical tool utilizes an electrospray, time-of-flight mass spectrometer (ES-TOF-MS) for fully automated on-line monitoring of wafer cleaning solutions. The analyzer provides rich information on metallic, anionic, cationic, elemental, and organic species through its ability to provide harsh (elemental) and soft (molecular) ionization under both positive and negative modes. It is designed to meet semiconductor process control and yield management needs for the ever increasing complex new chemistries present in wafer fabrication.
NASA Technical Reports Server (NTRS)
1983-01-01
In the mid-1960s, under contract with NASA, Dr. Benjamin W. Grunbaum was responsible for the development of an automated electrophoresis device that would work in the weightless environment of space. The device was never used in space but was revived during the mid-1970s as a technology utilization project aimed at an automated system for use on Earth. The advanced system became known as the Grunbaum System for electrophoresis. It is a versatile, economical assembly for the rapid separation of specific blood proteins in very small quantities, permitting their subsequent identification and quantification.
2002-12-19
NASA Dryden's Automated Aerial Refueling (AAR) project evaluated the capability of an F/A-18A aircraft as an in-flight refueling tanker with the objective of developing analytical models for an automated aerial refueling system for unmanned air vehicles. The F/A-18 "tanker" aircraft (No. 847) underwent flight test envelope expansion with an aerodynamic pod containing air-refueling equipment carried beneath the fuselage. The second aircraft (No. 843) flew as the receiver aircraft during the study to assess the free-stream hose and drogue dynamics on the F/A-18A.
Improvement in the stability of serum samples stored in an automated refrigerated module.
Parra-Robert, Marina; Rico-Santana, Naira; Alcaraz-Quiles, José; Sandalinas, Silvia; Fernández, Esther; Falcón, Isabel; Pérez-Riedweg, Margarita; Bedini, Josep Lluís
2016-12-01
In clinical laboratories it is necessary to know how long analytes remain stable in samples under specific storage conditions. Our laboratory has implemented the new Aptio Automation System (AAS) (Siemens Healthcare Diagnostics), in which analyzed samples are stored in a refrigerated storage module (RSM) after being sealed. The aim of the study was to evaluate the stability of serum samples with the AAS, comparing the results with a previous study using a conventional refrigerated system. Serum samples from a total of 50 patients were collected, and for each of them 27 biochemical analytes were analyzed. The samples were divided into 5 sets of 10 samples. Each set was re-analyzed at one of the following times: 24, 48, 72, 96 and 120 h. Stability was evaluated according to the Total Limit of Change (TLC) criteria, which combine both analytical and biological variation. A total of 26 out of 27 analytes were stable at the end of the study according to the TLC criteria. Lactate dehydrogenase was not stable at 48 h, showing a decrease in its concentration until the end of the study. In the previous study (conventional storage system), 9 biochemical analytes were not stable, showing increased levels due to evaporation. The RSM connected to the AAS improves the stability of serum samples. This system avoids evaporation by sealing the samples and allows better control of the samples during their storage. Copyright © 2016 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
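Stability testing of this kind reduces to comparing the percent deviation at each storage time against an analyte-specific limit. The sketch below uses a placeholder limit; the study's Total Limit of Change combines analytical and biological variation, and the LDH values shown are invented.

import numpy as np

def is_stable(baseline, stored, limit_pct):
    # Mean percent deviation of stored results from baseline vs. a limit.
    pd_pct = 100.0 * (np.mean(stored) - np.mean(baseline)) / np.mean(baseline)
    return abs(pd_pct) <= limit_pct, round(pd_pct, 1)

# Lactate dehydrogenase (U/L) in 10 sera re-analyzed after 48 h (illustrative).
baseline = [182, 199, 175, 210, 190, 188, 205, 180, 195, 185]
at_48h   = [170, 185, 166, 196, 178, 177, 191, 168, 182, 174]
print(is_stable(baseline, at_48h, limit_pct=5.0))   # (False, -6.4): not stable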
Ramírez Fernández, María del Mar; Van Durme, Filip; Wille, Sarah M R; di Fazio, Vincent; Kummer, Natalie; Samyn, Nele
2014-06-01
The aim of this work was to automate a sample preparation procedure extracting morphine, hydromorphone, oxymorphone, norcodeine, codeine, dihydrocodeine, oxycodone, 6-monoacetyl-morphine, hydrocodone, ethylmorphine, benzoylecgonine, cocaine, cocaethylene, tramadol, meperidine, pentazocine, fentanyl, norfentanyl, buprenorphine, norbuprenorphine, propoxyphene, methadone and 2-ethylidene-1,5-dimethyl-3,3-diphenylpyrrolidine from urine samples. Samples were extracted by solid-phase extraction (SPE) with cation exchange cartridges using a TECAN Freedom Evo 100 base robotic system, including a hydrolysis step prior to extraction when required. Block modules were carefully selected in order to use the same consumable material as in manual procedures and thereby reduce cost and/or manual sample transfers. Moreover, the present configuration included pressure-monitored pipetting, which increased pipetting accuracy and detected sampling errors. The compounds were then separated in a chromatographic run of 9 min using a BEH Phenyl analytical column on an ultra-performance liquid chromatography-tandem mass spectrometry system. Optimization of the SPE was performed with different wash conditions and elution solvents. Intra- and inter-day relative standard deviations (RSDs) were within ±15% and bias was within ±15% for most of the compounds. Recovery was >69% (RSD < 11%) and matrix effects ranged from 1 to 26% when compensated with the internal standard. The limits of quantification ranged from 3 to 25 ng/mL depending on the compound. No cross-contamination in the automated SPE system was observed. The extracted samples were stable for 72 h in the autosampler (4°C). This method was applied to authentic samples (from forensic and toxicology cases) and to proficiency testing schemes containing cocaine, heroin, buprenorphine and methadone, offering fast and reliable results. Automation resulted in improved precision and accuracy, minimal operator intervention, safer sample handling and less time-consuming procedures.
Hernández-Zavala, Araceli; Matoušek, Tomáš; Drobná, Zuzana; Paul, David S.; Walton, Felecia; Adair, Blakely M.; Dědina, Jiří; Thomas, David J.
2008-01-01
Analyses of arsenic (As) species in tissues and body fluids of individuals chronically exposed to inorganic arsenic (iAs) provide essential information about the exposure level and pattern of iAs metabolism. We have previously described an oxidation state-specific analysis of As species in biological matrices by hydride-generation atomic absorption spectrometry (HG-AAS), using cryotrapping (CT) for preconcentration and separation of arsines. To improve the performance and detection limits of the method, the HG and CT steps were automated and the conventional flame-in-tube atomizer was replaced with a recently developed multiple microflame quartz tube atomizer (multiatomizer). In this system, arsines from As(III) species are generated in a mixture of Tris-HCl (pH 6) and sodium borohydride. For generation of arsines from both As(III) and As(V) species, samples are pretreated with L-cysteine. Under these conditions, dimethylthioarsinic acid, a newly described metabolite of iAs, does not interfere significantly with detection and quantification of methylated trivalent arsenicals. The analytical performance of the automated HG-CT-AAS was characterized by analyses of cultured cells and mouse tissues that contained mono- and dimethylated metabolites of iAs. The capacity to detect methylated As(III) and As(V) species was verified using an in vitro methylation system containing recombinant rat arsenic (+3 oxidation state) methyltransferase and cultured rat hepatocytes treated with iAs. Compared with the previous HG-CT-AAS design, detection limits for iAs and its metabolites have improved significantly with the current system, ranging from 8 to 20 pg. Recoveries of As were between 78 and 117%. The precision of the method was better than 5% for all biological matrices examined. Thus, the automated HG-CT-AAS system provides an effective and sensitive tool for analysis of all major human metabolites of iAs in complex biological matrices. PMID:18677417
Li, Xiangtang; Zhao, Shulin; Hu, Hankun; Liu, Yi-Ming
2016-06-17
Capillary electrophoresis-based single cell analysis has become an essential approach in research at the cellular level. However, automation of single cell analysis has been a challenge due to the difficulty of controlling the number of cells injected and the irreproducibility associated with cell aggregation. Herein we report the development of a new microfluidic platform deploying the double nano-electrode cell lysis technique for automated analysis of single cells with mass spectrometric detection. The proposed microfluidic chip features integration of a cell-sized high voltage zone for quick single cell lysis, a microfluidic channel for electrophoretic separation, and a nanoelectrospray emitter for ionization in MS detection. Built upon this platform, a microchip electrophoresis-mass spectrometric method (MCE-MS) has been developed for automated single cell analysis. In the method, cell introduction, cell lysis, and MCE-MS separation are computer controlled and integrated as a cycle into consecutive assays. Analysis of large numbers of individual PC-12 neuronal cells (both intact and exposed to 25 mM KCl) was carried out to determine intracellular levels of dopamine (DA) and glutamic acid (Glu). It was found that DA content in PC-12 cells was higher than Glu content, and both varied from cell to cell. The ratio of intracellular DA to Glu was 4.20±0.8 (n=150). Interestingly, the ratio drastically decreased to 0.38±0.20 (n=150) after the cells were exposed to 25 mM KCl for 8 min, suggesting that the cells released DA promptly and heavily while releasing Glu at a much slower pace in response to KCl-induced depolarization. These results indicate that the proposed MCE-MS analytical platform may have great potential in research at the cellular level. Copyright © 2016 Elsevier B.V. All rights reserved.
A Skills Approach to Career Development.
ERIC Educational Resources Information Center
Grites, Thomas J.
1983-01-01
A counseling approach encourages students' development of job-applicable, career-transferable skills to meet the changing demands of specialization, automation, mobility, urban growth, and industrial trends in the job market. These include writing; speaking; research; and analytical, organizational, leadership, interpersonal, and quantitative…
Performance characteristics of the ARCHITECT Active-B12 (Holotranscobalamin) assay.
Merrigan, Stephen D; Owen, William E; Straseski, Joely A
2015-01-01
Vitamin B12 (cobalamin) is a necessary cofactor in methionine and succinyl-CoA metabolism. Studies estimate the prevalence of deficiency to be as high as 30% in the elderly population. Ten to thirty percent of circulating cobalamin is bound to transcobalamin (holotranscobalamin, holoTC), which can readily enter cells and is therefore considered the bioactive form. The objective of our study was to evaluate the analytical performance of a high-throughput, automated holoTC assay (ARCHITECT i2000(SR) Active-B12 (Holotranscobalamin)) and compare it to other available methods. Manufacturer-specified limits of blank (LoB), detection (LoD), and quantitation (LoQ), imprecision, interference, and linearity were evaluated for the ARCHITECT HoloTC assay. Residual de-identified serum samples were used to compare the ARCHITECT HoloTC assay with the automated AxSYM Active-B12 (Holotranscobalamin) assay (Abbott Diagnostics) and the manual Active-B12 (Holotranscobalamin) Enzyme Immunoassay (EIA) (Axis-Shield Diagnostics, Dundee, Scotland, UK). Manufacturer's claims of LoB, LoD, LoQ, imprecision, interference, and linearity to the highest point tested (113.4 pmol/L) were verified for the ARCHITECT HoloTC assay. Method comparison of the ARCHITECT HoloTC to the AxSYM HoloTC produced the following Deming regression statistics: (ARCHITECT(HoloTc)) = 0.941 (AxSYM(HoloTC)) + 1.2 pmol/L, S(y/x) = 6.4, r = 0.947 (n = 98). Comparison to the Active-B12 EIA produced: (ARCHITECT(HoloTC)) = 1.105 (EIA(Active-B12)) - 6.8 pmol/L, S(y/x) = 11.0, r = 0.950 (n = 221). This assay performed acceptably for LoB, LoD, LoQ, imprecision, interference, linearity and method comparison to the predicate device (AxSYM). An additional comparison to a manual Active-B12 EIA method performed similarly, with minor exceptions. This study determined that the ARCHITECT HoloTC assay is suitable for routine clinical use, providing a high-throughput alternative for automated testing of this emerging marker of cobalamin deficiency.
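Deming regression, used for the method comparisons above, accounts for measurement error in both methods rather than only in the dependent variable. A minimal sketch with an assumed error-variance ratio of 1 and invented paired holoTC results (not the study's data):

```python
import math

def deming(x, y, lam=1.0):
    # Deming regression; lam is the ratio of y-error variance to x-error variance.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x) / (n - 1)
    syy = sum((yi - my) ** 2 for yi in y) / (n - 1)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / (n - 1)
    slope = (syy - lam * sxx
             + math.sqrt((syy - lam * sxx) ** 2 + 4 * lam * sxy ** 2)) / (2 * sxy)
    return slope, my - slope * mx

# Hypothetical paired holoTC results (pmol/L) from two analyzers.
axsym = [12.0, 25.0, 40.0, 55.0, 70.0, 90.0]
architect = [12.5, 24.1, 39.0, 53.2, 67.5, 86.1]
m, b = deming(axsym, architect)
print(f"ARCHITECT = {m:.3f} * AxSYM + {b:.1f} pmol/L")
```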
Wang, Xin; Garcia, Carlos T; Gong, Guanyu; Wishnok, John S; Tannenbaum, Steven R
2018-02-06
S-Nitrosothiols (RSNOs) constitute a circulating endogenous reservoir of nitric oxide and have important biological activities. In this study, an online coupling of solid-phase derivatization (SPD) with liquid chromatography-mass spectrometry (LC-MS) was developed and applied in the analysis of low-molecular-mass RSNOs. A derivatizing-reagent-modified polymer monolithic column was prepared and adapted for online SPD-LC-MS. Analytes from the LC autosampler flowed through the monolithic column for derivatization and then directly into the LC-MS for analysis. This integration of online derivatization, LC separation, and MS detection facilitated system automation, allowing rapid, labor-saving, and sensitive detection of RSNOs. S-Nitrosoglutathione (GSNO) was quantified using this automated online method with good linearity (R² = 0.9994); the limit of detection was 0.015 nM. The online SPD-LC-MS method was used to determine GSNO levels in mouse samples; 138 ± 13.2 nM of endogenous GSNO was detected in mouse plasma. In addition, the GSNO concentrations in liver (64.8 ± 11.3 pmol/mg protein), kidney (47.2 ± 6.1 pmol/mg protein), heart (8.9 ± 1.8 pmol/mg protein), muscle (1.9 ± 0.3 pmol/mg protein), hippocampus (5.3 ± 0.9 pmol/mg protein), striatum (6.7 ± 0.6 pmol/mg protein), cerebellum (31.4 ± 6.5 pmol/mg protein), and cortex (47.9 ± 4.6 pmol/mg protein) were successfully quantified. Derivatization was complete within 8 min, and with subsequent LC-MS detection samples could be analyzed far more rapidly than with the offline manual method. Other low-molecular-mass RSNOs, such as S-nitrosocysteine and S-nitrosocysteinylglycine, were captured by rapid precursor-ion scanning, showing that the proposed method is a potentially powerful tool for capture, identification, and quantification of RSNOs in biological samples.
Schaefer, Kristin E; Chen, Jessie Y C; Szalma, James L; Hancock, P A
2016-05-01
We used meta-analysis to assess research concerning human trust in automation to understand the foundation upon which future autonomous systems can be built. Trust is increasingly important in the growing need for synergistic human-machine teaming. Thus, we expand on our previous meta-analytic foundation in the field of human-robot interaction to include all of automation interaction. We used meta-analysis to assess trust in automation. Thirty studies provided 164 pairwise effect sizes, and 16 studies provided 63 correlational effect sizes. The overall effect size of all factors on trust development was ḡ = +0.48, and the correlational effect was r̄ = +0.34, each of which represents a medium effect. Moderator effects were observed for the human-related (ḡ = +0.49; r̄ = +0.16) and automation-related (ḡ = +0.53; r̄ = +0.41) factors. Moderator effects specific to environmental factors proved insufficient in number to calculate at this time. Findings provide a quantitative representation of factors influencing the development of trust in automation and identify additional areas where empirical research is needed. This work has important implications for the enhancement of current and future human-automation interaction, especially in high-risk or extreme performance environments. © 2016, Human Factors and Ergonomics Society.
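For readers unfamiliar with the effect-size machinery, the sketch below computes a bias-corrected standardized mean difference (Hedges' g) for a single study and pools several studies with inverse-variance weights under a fixed-effect model. All numbers are invented, and the paper's actual pooling procedure may differ.

```python
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    # Bias-corrected standardized mean difference (Hedges' g).
    sp = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2))
    j = 1.0 - 3.0 / (4.0 * (n1 + n2 - 2) - 1.0)  # small-sample correction
    return j * (m1 - m2) / sp

def fixed_effect_mean(effects, variances):
    # Inverse-variance weighted mean effect size (fixed-effect model).
    weights = [1.0 / v for v in variances]
    return sum(w * g for w, g in zip(weights, effects)) / sum(weights)

# One invented study: trust ratings with vs. without reliable automation.
g1 = hedges_g(m1=5.8, sd1=1.1, n1=30, m2=5.2, sd2=1.2, n2=30)
# Pool with two more invented effects (effect size, sampling variance).
effects, variances = [g1, 0.40, 0.55], [0.07, 0.02, 0.09]
print(f"g-bar = {fixed_effect_mean(effects, variances):+.2f}")
```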
Total plasma magnesium in healthy and critically ill foals.
Mariella, J; Isani, G; Andreani, G; Freccero, F; Carpenè, E; Castagnetti, C
2016-01-15
Abnormalities in total Mg (tMg) concentration in plasma and/or serum are common in critically ill humans, and the association with increased mortality has been documented in several clinical studies of adults and of newborns with hypoxic-ischemic encephalopathy. Abnormalities in tMg have been studied in hospitalized dogs, cats, and adult horses, but few studies have examined Mg concentration in newborn foals. The aims of the present study were: (1) to compare two analytical methods for the determination of tMg in plasma, the automated colorimetric method and atomic absorption spectrometry; (2) to measure plasma tMg in healthy foals during the first 72 hours after birth and in sick foals during the first 72 hours of hospitalization; (3) to compare total plasma Mg concentration among healthy foals and foals affected by perinatal asphyxia syndrome (PAS), prematurity and/or dysmaturity, and sepsis; and (4) to evaluate plasma tMg concentration in surviving and non-surviving foals. One hundred seventeen foals were included in the study: 20 healthy and 97 sick foals. The automated method used in clinical practice probably overestimates plasma tMg. Because of its higher sensitivity and specificity, atomic absorption spectrometry should be considered the method of choice from an analytical point of view, but it requires instrumentation not readily available in every laboratory, as well as specific technical skills and competencies. Plasma tMg in healthy foals ranged from 0.52 to 1.01 mmol/L and did not show any time-dependent change during the first 72 hours of life. In sick foals, tMg evaluated at T0 was statistically higher than tMg measured at subsequent times. Foals affected by PAS had a tMg at T0 significantly higher (P < 0.01) than healthy, septic, and premature and/or dysmature foals. Plasma tMg measured at T0 was also significantly higher (P < 0.01) in non-surviving than in surviving foals. Plasma tMg could be a useful parameter for the diagnosis of PAS and the formulation of a prognosis in critically ill foals. Copyright © 2016 Elsevier Inc. All rights reserved.
Ab initio simulation of diffractometer instrumental function for high-resolution X-ray diffraction
Mikhalychev, Alexander; Benediktovitch, Andrei; Ulyanenkova, Tatjana; Ulyanenkov, Alex
2015-01-01
Modeling of the X-ray diffractometer instrumental function for a given optics configuration is important both for planning experiments and for the analysis of measured data. A fast and universal method for instrumental function simulation, suitable for fully automated computer realization and describing both coplanar and noncoplanar measurement geometries for any combination of X-ray optical elements, is proposed. The method can be identified as semi-analytical backward ray tracing and is based on the calculation of a detected signal as an integral of X-ray intensities for all the rays reaching the detector. The high speed of calculation is provided by the expressions for analytical integration over the spatial coordinates that describe the detection point. Consideration of the three-dimensional propagation of rays without restriction to the diffraction plane provides the applicability of the method for noncoplanar geometry and the accuracy for characterization of the signal from a two-dimensional detector. The correctness of the simulation algorithm is checked in the following two ways: by verifying the consistency of the calculated data with the patterns expected for certain simple limiting cases and by comparing measured reciprocal-space maps with the corresponding maps simulated by the proposed method for the same diffractometer configurations. Both kinds of tests demonstrate the agreement of the simulated instrumental function shape with the measured data. PMID:26089760
Lee, E.A.; Strahan, A.P.; Thurman, E.M.
2001-01-01
An analytical method for the determination of glyphosate, its principal degradation compound, aminomethylphosphonic acid (AMPA), and glufosinate in water with varying matrices has been developed. Four different sample matrices fortified at 0.2 and 2.0 µg/L (micrograms per liter) were analyzed using precolumn derivatization with 9-fluorenylmethylchloroformate (FMOC). After derivatization, cleanup and concentration were accomplished using automated online solid-phase extraction followed by elution with the mobile phase allowing for direct injection into a liquid chromatograph/mass spectrometer (LC/MS). Analytical conditions for MS detection were optimized, and quantitation was carried out using the following representative ions: 390 and 168 for glyphosate; 332, 110, and 136 for AMPA; and 402, 180, and 206 for glufosinate. Matrix effects were minimized by utilizing standard addition for quantification and an isotope-labeled glyphosate (2-13C,15N) as the internal standard. Method detection limits (MDLs) were 0.084 µg/L for glyphosate, 0.078 µg/L for AMPA, and 0.057 µg/L for glufosinate. The method reporting limits (MRLs) were set at 0.1 µg/L for all three compounds. The mean recovery values ranged from 88.0 to 128.7 percent, and relative standard deviation values ranged from 5.6 to 32.6 percent.
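Standard-addition quantification, as used above, fits a line to the responses of successively spiked aliquots and takes the magnitude of the x-intercept as the sample concentration. A minimal sketch with invented spike levels and internal-standard-normalized responses:

```python
def linear_fit(x, y):
    # Ordinary least squares: returns (slope, intercept).
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical standard-addition series: spiked glyphosate (ug/L) vs. peak
# area of the m/z 390 quantitation ion, normalized to the internal standard.
added = [0.0, 0.5, 1.0, 2.0]
response = [0.21, 0.46, 0.72, 1.22]
m, b = linear_fit(added, response)
# The unknown concentration is the magnitude of the x-intercept, b/m.
print(f"estimated sample concentration: {b / m:.2f} ug/L")
```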
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ozanich, Rich M.; Antolick, Kathryn C.; Bruckner-Lea, Cindy J.
2007-09-15
Automated devices and methods for biological sample preparation often utilize surface functionalized microbeads (superparamagnetic or non-magnetic) to allow capture, purification and pre-concentration of trace amounts of proteins, cells, or nucleic acids (DNA/RNA) from complex samples. We have developed unique methods and hardware for trapping either magnetic or non-magnetic functionalized beads that allow samples and reagents to be efficiently perfused over a micro-column of beads. This approach yields enhanced mass transport and up to 5-fold improvements in assay sensitivity or speed, dramatically improving assay capability relative to assays conducted in more traditional "batch modes" (i.e., in tubes or microplate wells). Summary results are given that highlight the analytical performance improvements obtained for automated microbead processing systems utilizing novel microbead trap/flow-cells for various applications, including: 1) simultaneous capture of multiple cytokines using an antibody-coupled polystyrene bead assay with subsequent flow cytometry detection; 2) capture of nucleic acids using oligonucleotide coupled polystyrene beads with flow cytometry detection; and 3) capture of Escherichia coli O157:H7 (E. coli) from 50 mL sample volumes using antibody-coupled superparamagnetic microbeads with subsequent culturing to assess capture efficiency.
Woldegebriel, Michael; Zomer, Paul; Mol, Hans G J; Vivó-Truyols, Gabriel
2016-08-02
In this work, we introduce an automated, efficient, and elegant model to combine all pieces of evidence (e.g., expected retention times, peak shapes, isotope distributions, fragment-to-parent ratio) obtained from liquid chromatography-tandem mass spectrometry (LC-MS/MS) data for screening purposes. Combining all these pieces of evidence requires a careful assessment of the uncertainties in the analytical system as well as all possible outcomes. To date, the majority of existing algorithms are highly dependent on user input parameters, and the screening process is tackled as a deterministic problem. In this work we present a Bayesian framework for combining all these pieces of evidence. Contrary to conventional algorithms, the information is treated in a probabilistic way, and a final probability assessment of the presence/absence of a compound feature is computed. Additionally, all the necessary parameters except the chromatographic band broadening are learned from the data in the training and learning phase of the algorithm, avoiding the introduction of a large number of user-defined parameters. The proposed method was validated with a large data set and has shown improved sensitivity and specificity in comparison to a threshold-based commercial software package.
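The flavor of such a probabilistic combination can be illustrated with a naive Bayes update in odds form, where each piece of evidence contributes a likelihood ratio. The prior and ratios below are invented placeholders; the paper's model is considerably richer than this independence assumption.

```python
def posterior_presence(prior, likelihood_ratios):
    # Combine independent pieces of evidence via Bayes' rule in odds form:
    # posterior odds = prior odds * product of likelihood ratios, where
    # LR_i = p(evidence_i | compound present) / p(evidence_i | absent).
    odds = prior / (1.0 - prior)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1.0 + odds)

# Hypothetical likelihood ratios for a retention-time match, isotope pattern,
# peak shape, and fragment-to-parent ratio of one candidate compound.
print(f"P(present) = {posterior_presence(0.05, [8.0, 5.0, 2.5, 3.0]):.3f}")
```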
Duffy, G; Regan, F
2017-11-20
The demand for autonomous sensors for unattended, continuous nutrient monitoring in water is growing rapidly with the increasing need for more frequent and widespread environmental pollution monitoring. Legislative bodies, local authorities and industries all require frequent water quality monitoring; however, this is a time-consuming, labour-intensive and expensive undertaking. Autonomous sensors allow for frequent, unattended data collection, which addresses the time- and labour-intensive aspects of water monitoring, but the sensors themselves can be very expensive. Development of low-cost sensors is essential to realise the concept of the Internet of Things (IoT), yet much work remains to be done in this field. This article reviews current literature on research and development efforts towards deployable autonomous sensors for phosphorus (in the form of phosphate) and nitrogen (in the form of nitrate), with a focus on analytical performance and cost considerations. Additionally, some recent sensing approaches that could be automated in the future are included, along with an overview of approaches to monitoring both nutrients. These approaches are compared with standard laboratory methods and also with commercially available sensors for both phosphate and nitrate. Application of nutrient sensors in agriculture is discussed as an example of how sensor networks can improve decision making.
openECA Platform and Analytics Alpha Test Results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robertson, Russell
The objective of the Open and Extensible Control and Analytics (openECA) Platform for Phasor Data project is to develop an open source software platform that significantly accelerates the production, use, and ongoing development of real-time decision support tools, automated control systems, and off-line planning systems that (1) incorporate high-fidelity synchrophasor data and (2) enhance system reliability while enabling the North American Electric Reliability Corporation (NERC) operating functions of reliability coordinator, transmission operator, and/or balancing authority to be executed more effectively.
openECA Platform and Analytics Beta Demonstration Results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robertson, Russell
The objective of the Open and Extensible Control and Analytics (openECA) Platform for Phasor Data project is to develop an open source software platform that significantly accelerates the production, use, and ongoing development of real-time decision support tools, automated control systems, and off-line planning systems that (1) incorporate high-fidelity synchrophasor data and (2) enhance system reliability while enabling the North American Electric Reliability Corporation (NERC) operating functions of reliability coordinator, transmission operator, and/or balancing authority to be executed more effectively.
Ghirlanda, G; Lear, J D; Lombardi, A; DeGrado, W F
1998-08-14
A series of synthetic receptors capable of binding to the calmodulin-binding domain of calcineurin (CN393-414) was designed, synthesized and characterized. The design was accomplished by docking CN393-414 against a two-helix receptor, using an idealized three-stranded coiled coil as a starting geometry. The sequence of the receptor was chosen using a side-chain re-packing program, which employed a genetic algorithm to select potential binders from a total of 7.5x10(6) possible sequences. A total of 25 receptors were prepared, representing 13 sequences predicted by the algorithm as well as 12 related sequences that were not predicted. The receptors were characterized by CD spectroscopy, analytical ultracentrifugation, and binding assays. The receptors predicted by the algorithm bound CN393-414 with apparent dissociation constants ranging from 0.2 microM to >50 microM. Many of the receptors that were not predicted by the algorithm also bound to CN393-414. Methods to circumvent this problem and to improve the automated design of functional proteins are discussed. Copyright 1998 Academic Press
Design of CGMP Production of ¹⁸F- and ⁶⁸Ga-Radiopharmaceuticals
Chu, Pei-Chun; Chao, Hao-Yu; Shieh, Wei-Chen; Chen, Chuck C.
2014-01-01
Objective. Radiopharmaceutical production must adhere to current good manufacturing practice (CGMP) to ensure that the quality of the precursor, the prodrug (active pharmaceutical ingredient, API), and the final drug product meets acceptance criteria. We aimed to develop an automated system for production of CGMP-grade PET radiopharmaceuticals. Methods. The hardware and software of an automated synthesizer that fits in the hot cell under CGMP requirements were developed. Production yield and purity for ⁶⁸Ga-DOTATATE and ¹⁸F-FDG at a CGMP facility were optimized. Analytical assays and acceptance criteria for CGMP-grade ⁶⁸Ga-DOTATATE and ¹⁸F-FDG were established. Results. A CGMP facility for the production of PET radiopharmaceuticals has been established. Radio-TLC and HPLC analyses of ⁶⁸Ga-DOTATATE and ¹⁸F-FDG showed radiochemical purities of 92% and 96%, respectively. The products were sterile and pyrogen-free. Conclusion. CGMP compliance of radiopharmaceuticals has been reviewed. ⁶⁸Ga-DOTATATE and ¹⁸F-FDG were synthesized with high radiochemical yield under a CGMP process. PMID:25276810
Benefits of an automated GLP final report preparation software solution.
Elvebak, Larry E
2011-07-01
The final product of analytical laboratories performing US FDA-regulated (or GLP) method validation and bioanalysis studies is the final report. Although there are commercial-off-the-shelf (COTS) software/instrument systems available to laboratory managers to automate and manage almost every aspect of the instrumental and sample-handling processes of GLP studies, there are few software systems available to fully manage the GLP final report preparation process. This lack of appropriate COTS tools results in the implementation of rather Byzantine and manual processes to cobble together all the information needed to generate a GLP final report. The manual nature of these processes results in the need for several iterative quality control and quality assurance events to ensure data accuracy and report formatting. The industry is in need of a COTS solution that gives laboratory managers and study directors the ability to manage as many portions as possible of the GLP final report writing process and the ability to generate a GLP final report with the click of a button. This article describes the COTS software features needed to give laboratory managers and study directors such a solution.
Šrámková, Ivana; Amorim, Célia G; Sklenářová, Hana; Montenegro, Maria C B M; Horstkotte, Burkhard; Araújo, Alberto N; Solich, Petr
2014-01-01
In this work, an application of an enzymatic reaction for the determination of the highly hydrophobic drug propofol in an emulsion dosage form is presented. Emulsions represent a complex and therefore challenging matrix for analysis. Ethanol was used to break the lipid emulsion, which enabled optical detection. A fully automated method based on Sequential Injection Analysis was developed, allowing propofol determination without tedious sample pre-treatment. The method was based on spectrophotometric detection after enzymatic oxidation catalysed by horseradish peroxidase and subsequent coupling with 4-aminoantipyrine, leading to a coloured product with an absorbance maximum at 485 nm. This procedure was compared with a simple fluorimetric method based on the direct selective fluorescence emission of propofol in ethanol at 347 nm. Both methods provide comparable validation parameters, with linear working ranges of 0.005-0.100 mg mL⁻¹ and 0.004-0.243 mg mL⁻¹ for the spectrophotometric and fluorimetric methods, respectively. The detection and quantitation limits achieved with the spectrophotometric method were 0.0016 and 0.0053 mg mL⁻¹, respectively; the fluorimetric method provided a detection limit of 0.0013 mg mL⁻¹ and a quantitation limit of 0.0043 mg mL⁻¹. The RSD did not exceed 5% and 2% (n=10), respectively. A sample throughput of approximately 14 h⁻¹ for the spectrophotometric and 68 h⁻¹ for the fluorimetric detection was achieved. Both methods proved suitable for the determination of propofol in a pharmaceutical formulation, with average recovery values of 98.1 and 98.5%. © 2013 Elsevier B.V. All rights reserved.
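The abstract does not state how the detection and quantitation limits were derived; one widely used convention (ICH-style) estimates them from the blank noise and the calibration slope, as sketched below with invented numbers.

```python
def lod_loq_from_calibration(sd_blank, slope):
    # ICH-style estimates: LOD = 3.3*sigma/S and LOQ = 10*sigma/S, where
    # sigma is the standard deviation of blank responses and S is the
    # calibration slope. Assumed convention, not necessarily the paper's.
    return 3.3 * sd_blank / slope, 10.0 * sd_blank / slope

# Hypothetical spectrophotometric calibration: absorbance units per (mg/mL).
lod, loq = lod_loq_from_calibration(sd_blank=0.004, slope=8.2)
print(f"LOD = {lod:.4f} mg/mL, LOQ = {loq:.4f} mg/mL")
```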
HTAPP: High-Throughput Autonomous Proteomic Pipeline
Yu, Kebing; Salomon, Arthur R.
2011-01-01
Recent advances in the speed and sensitivity of mass spectrometers and in analytical methods, the exponential acceleration of computer processing speeds, and the availability of genomic databases from an array of species and protein information databases have led to a deluge of proteomic data. The development of a lab-based automated proteomic software platform for the automated collection, processing, storage, and visualization of expansive proteomic datasets is critically important. The high-throughput autonomous proteomic pipeline (HTAPP) described here is designed from the ground up to provide critically important flexibility for diverse proteomic workflows and to streamline the total analysis of a complex proteomic sample. This tool comprises software that controls the acquisition of mass spectral data along with automation of post-acquisition tasks such as peptide quantification, clustered MS/MS spectral database searching, statistical validation, and data exploration within a user-configurable lab-based relational database. The software design of HTAPP focuses on accommodating diverse workflows and providing missing software functionality to a wide range of proteomic researchers to accelerate the extraction of biological meaning from immense proteomic data sets. Although individual software modules in our integrated technology platform may have some similarities to existing tools, the true novelty of the approach described here is in the synergistic and flexible combination of these tools to provide an integrated and efficient analysis of proteomic samples. PMID:20336676
Evaluation of mouse red blood cell and platelet counting with an automated hematology analyzer.
Fukuda, Teruko; Asou, Eri; Nogi, Kimiko; Goto, Kazuo
2017-10-07
An evaluation of mouse red blood cell (RBC) and platelet (PLT) counting with an automated hematology analyzer was performed with three strains of mice, C57BL/6 (B6), BALB/c (BALB) and DBA/2 (D2). There were no significant differences in RBC and PLT counts between manual and automated optical methods in any of the samples, except for D2 mice. For D2, RBC counts obtained using the manual method were significantly lower than those obtained using the automated optical method (P<0.05), and PLT counts obtained using the manual method were higher than those obtained using the automated optical method (P<0.05). An automated hematology analyzer can be used for RBC and PLT counting; however, an appropriate method should be selected when D2 mice samples are used.
Meshing of a Spiral Bevel Gearset with 3D Finite Element Analysis
NASA Technical Reports Server (NTRS)
Bibel, George D.; Handschuh, Robert
1996-01-01
Recent advances in spiral bevel gear geometry and finite element technology make it practical to conduct a structural analysis and analytically roll the gearset through mesh. With the advent of user-specific programming linked to 3D solid modelers and mesh generators, model generation has become greatly automated. Contact algorithms available in general purpose finite element codes eliminate the need for the use and alignment of gap elements. Once the gearset is placed in mesh, user subroutines attached to the FE code easily roll the gearset through mesh. The method is described in detail. Preliminary results for a gearset segment, showing the progression of the contact line load as the gears roll through mesh, are given.
The detection and correction of outlying determinations that may occur during geochemical analysis
Harvey, P.K.
1974-01-01
'Wild', 'rogue' or outlying determinations occur periodically during geochemical analysis. Existing tests in the literature for the detection of such determinations within a set of replicate measurements are often misleading. This account describes the chances of detecting outliers and the extent to which correction may be made for their presence in sample sizes of three to seven replicate measurements. A systematic procedure for monitoring data for outliers is outlined. The problem of outliers becomes more important as instrumental methods of analysis become faster and more highly automated; a state in which it becomes increasingly difficult for the analyst to examine every determination. The recommended procedure is easily adapted to such analytical systems. © 1974.
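The abstract does not name a particular test; one standard single-outlier screen for small replicate sets (n = 3-7) is Grubbs' test, sketched below purely as an illustration of automated outlier monitoring, not as the paper's procedure.

```python
import math
from scipy import stats

def grubbs_outlier(values, alpha=0.05):
    # Two-sided Grubbs' test for a single outlier in a small replicate set,
    # assuming roughly normal measurement errors.
    n = len(values)
    mean = sum(values) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
    g = max(abs(v - mean) for v in values) / sd
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)
    g_crit = ((n - 1) / math.sqrt(n)) * math.sqrt(t ** 2 / (n - 2 + t ** 2))
    suspect = max(values, key=lambda v: abs(v - mean))
    return suspect, g > g_crit

# Five replicate determinations of an element (ppm), one suspiciously high.
print(grubbs_outlier([10.2, 10.4, 10.3, 10.1, 12.9]))
```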
Conflict Probability Estimation for Free Flight
NASA Technical Reports Server (NTRS)
Paielli, Russell A.; Erzberger, Heinz
1996-01-01
The safety and efficiency of free flight will benefit from automated conflict prediction and resolution advisories. Conflict prediction is based on trajectory prediction, however, and becomes less certain the farther in advance the prediction is made. An estimate is therefore needed of the probability that a conflict will occur, given a pair of predicted trajectories and their levels of uncertainty. A method is developed in this paper to estimate that conflict probability. The trajectory prediction errors are modeled as normally distributed, and the two error covariances for an aircraft pair are combined into a single equivalent covariance of the relative position. A coordinate transformation is then used to derive an analytical solution. Numerical examples and Monte Carlo validation are presented.
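A compact way to see the construction: model each aircraft's prediction error as a zero-mean Gaussian, sum the two covariances to get the relative-position error, and compute the probability that the relative position falls inside the separation disk. The sketch below uses the Monte Carlo estimate that the paper employs for validation (its main result is analytical); all numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def conflict_probability(rel_mean, cov_a, cov_b, sep_nmi=5.0, n=200_000):
    # Independent Gaussian prediction errors for the two aircraft combine
    # into a single relative-position covariance cov_a + cov_b. Estimate
    # P(|relative position| < separation) by sampling.
    cov = np.asarray(cov_a) + np.asarray(cov_b)
    samples = rng.multivariate_normal(rel_mean, cov, size=n)
    return np.mean(np.hypot(samples[:, 0], samples[:, 1]) < sep_nmi)

# Hypothetical numbers: predicted miss distance of 6 nmi along-track, with
# per-aircraft along-/cross-track error variances in nmi^2.
cov_a = [[4.0, 0.0], [0.0, 1.0]]
cov_b = [[3.0, 0.0], [0.0, 0.5]]
print(f"P(conflict) = {conflict_probability([6.0, 0.0], cov_a, cov_b):.3f}")
```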
The NIST Quantitative Infrared Database
Chu, P. M.; Guenther, F. R.; Rhoderick, G. C.; Lafferty, W. J.
1999-01-01
With the recent developments in Fourier transform infrared (FTIR) spectrometers it is becoming more feasible to place these instruments in field environments. As a result, there has been an enormous increase in the use of FTIR techniques for a variety of qualitative and quantitative chemical measurements. These methods offer the possibility of fully automated real-time quantitation of many analytes; therefore FTIR has great potential as an analytical tool. Recently, the U.S. Environmental Protection Agency (U.S. EPA) has developed protocol methods for emissions monitoring using both extractive and open-path FTIR measurements. Depending upon the analyte, the experimental conditions and the analyte matrix, approximately 100 of the hazardous air pollutants (HAPs) listed in the 1990 U.S. EPA Clean Air Act amendment (CAAA) can be measured. The National Institute of Standards and Technology (NIST) has initiated a program to provide quality-assured infrared absorption coefficient data based on NIST-prepared primary gas standards. Currently, absorption coefficient data have been acquired for approximately 20 of the HAPs. For each compound, the absorption coefficient spectrum was calculated using nine transmittance spectra at 0.12 cm⁻¹ resolution and the Beer's law relationship. The uncertainties in the absorption coefficient data were estimated from the linear regressions of the transmittance data and considerations of other error sources such as the nonlinear detector response. For absorption coefficient values greater than 1 × 10⁻⁴ (μmol/mol)⁻¹ m⁻¹ the average relative expanded uncertainty is 2.2%. This quantitative infrared database is currently an ongoing project at NIST. Additional spectra will be added to the database as they are acquired. Our current plans include continued data acquisition of the compounds listed in the CAAA, as well as the compounds that contribute to global warming and ozone depletion.
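The absorption-coefficient computation described above reduces to a Beer's law regression: absorbance A = -log10(T) is fitted against the concentration-pathlength product, and the fit statistics feed the uncertainty estimate. A minimal sketch with synthetic transmittances; the zero-intercept constraint is an assumption, not necessarily NIST's exact procedure.

```python
import numpy as np

def absorption_coefficient(transmittances, conc_pathlengths):
    # Beer's law: A = -log10(T) = alpha * (c * L). A zero-intercept least
    # squares fit of absorbance against c*L yields alpha as the slope;
    # the residuals give a standard error for the slope.
    A = -np.log10(np.asarray(transmittances, dtype=float))
    x = np.asarray(conc_pathlengths, dtype=float)
    alpha = np.sum(x * A) / np.sum(x * x)
    resid = A - alpha * x
    se = np.sqrt(np.sum(resid ** 2) / (len(x) - 1) / np.sum(x * x))
    return alpha, se

# Synthetic standards: c*L in (umol/mol)*m and measured transmittance
# at one wavenumber.
cl = [50, 100, 150, 200, 250, 300, 350, 400, 450]
T = [0.944, 0.891, 0.841, 0.794, 0.750, 0.708, 0.668, 0.631, 0.596]
alpha, se = absorption_coefficient(T, cl)
print(f"alpha = {alpha:.3e} +/- {se:.1e} (umol/mol)^-1 m^-1")
```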
Diving deeper into Zebrafish development of social behavior: analyzing high resolution data.
Buske, Christine; Gerlai, Robert
2014-08-30
Vertebrate model organisms have been utilized in high throughput screening, but only with substantial cost and human capital investment. The zebrafish is a vertebrate model species that is a promising and cost-effective candidate for efficient high throughput screening. Larval zebrafish have already been successfully employed in this regard (Lessman, 2011), but adult zebrafish also show great promise. High throughput screening requires the use of a large number of subjects and the collection of a substantial amount of data. Collection of data, however, is only one of the demanding aspects of screening: in most screening approaches that involve behavioral data, the main bottleneck that slows throughput is the time-consuming analysis of the collected data. Some automated analytical tools do exist, but they often work for only one subject at a time, which prevents zebrafish from being fully utilized as a screening tool. This is a particularly important limitation for such complex phenotypes as social behavior. Testing multiple fish at a time can reveal complex social interactions, and it may also allow the identification of outliers within a group of mutagenized or pharmacologically treated fish. Here, we describe a novel method using a custom software tool developed within our laboratory, which enables tracking of multiple fish, in combination with a sophisticated analytical approach for summarizing and analyzing high resolution behavioral data. This paper focuses on the latter, the analytic tool, which we have developed using the R programming language and environment for statistical computing. We argue that combining sophisticated data collection methods with appropriate analytical tools will propel zebrafish into the future of neurobehavioral genetic research. Copyright © 2014. Published by Elsevier B.V.
Grand, Maxime M; Chocholouš, Petr; Růžička, Jarda; Solich, Petr; Measures, Christopher I
2016-06-07
By virtue of their compactness, long-term stability, minimal reagent consumption and robustness, miniaturized sequential injection instruments are well suited for the automation of assays onboard research ships. However, in order to reach the sensitivity and limit of detection required for open-ocean determinations of trace elements, it is necessary to preconcentrate the analyte prior to its derivatization and subsequent detection by fluorescence. In this work, a novel method for the determination of dissolved zinc (Zn) at subnanomolar levels in seawater is described. The proposed method combines, for the first time, automated matrix removal, extraction of the target element, and fluorescence detection within a miniaturized flow manifold based on the Lab-On-Valve (LOV) concept. The key feature of the microfluidic manipulation of the sample is flow programming, designed to pass sample through a mini-column where the target analyte and other complexable cations are retained while the seawater matrix is washed out. Next, zinc is eluted and merged with a Zn-selective fluorescent probe (FluoZin-3) at the confluence point of the LOV central channel using two high-precision stepper-motor-driven pumps operated in concert. Finally, the Zn complex thus formed is transported to the LOV flow cell for selective fluorescence measurement. This work describes the characterization and optimization of the method, including solid-phase extraction using the Toyopearl AF-Chelate-650M resin, and the detailed assay protocol, controlled by commercially available software and instrumentation. The proposed method features a LOD of 0.02 nM, high precision (<3% at 0.1 and 2 nM Zn levels), an assay cycle of 13 min and a reagent consumption of 150 μL FluoZin-3 per sample, which makes it highly suitable for oceanographic shipboard analysis. The accuracy of the method has been validated through the analysis of seawater reference standards and comparison with ICP-MS determinations on seawater samples collected in the upper 1300 m of the subtropical south Indian Ocean. This work confirms that integration of sample pretreatment with optical detection in the LOV format offers a widely applicable approach to trace analysis of seawater. Copyright © 2016. Published by Elsevier B.V.
Human-machine analytics for closed-loop sense-making in time-dominant cyber defense problems
NASA Astrophysics Data System (ADS)
Henry, Matthew H.
2017-05-01
Many defense problems are time-dominant: attacks progress at speeds that outpace human-centric systems designed for monitoring and response. Despite this shortcoming, these well-honed and ostensibly reliable systems pervade most domains, including cyberspace. The argument that often prevails when considering the automation of defense is that while technological systems are suitable for simple, well-defined tasks, only humans possess sufficiently nuanced understanding of problems to act appropriately under complicated circumstances. While this perspective is founded in verifiable truths, it does not account for a middle ground in which human-managed technological capabilities extend well into the territory of complex reasoning, thereby automating more nuanced sense-making and dramatically increasing the speed at which it can be applied. Snort and platforms like it enable humans to build, refine, and deploy sense-making tools for network defense. Shortcomings of these platforms include a reliance on rule-based logic, which confounds analyst knowledge of how bad actors behave with the means by which bad behaviors can be detected, and a lack of feedback-informed automation of sensor deployment. We propose an approach in which human-specified computational models hypothesize bad behaviors independent of indicators and then allocate sensors to estimate and forecast the state of an intrusion. State estimates and forecasts inform the proactive deployment of additional sensors and detection logic, thereby closing the sense-making loop. All the while, humans are on the loop, rather than in it, permitting nuanced management of fast-acting automated measurement, detection, and inference engines. This paper motivates and conceptualizes analytics to facilitate this human-machine partnership.
Kaminsky, Jan; Rodt, Thomas; Gharabaghi, Alireza; Forster, Jan; Brand, Gerd; Samii, Madjid
2005-06-01
The FE modeling of complex anatomical structures has not yet been solved satisfactorily. Voxel-based, as opposed to contour-based, algorithms allow automated mesh generation based on the image data; nonetheless, their geometric precision is limited. We developed an automated mesh generator that combines the advantages of voxel-based generation with improved representation of the geometry by displacement of nodes on the object surface. Models of an artificial 3D pipe section and a skull base were generated with different mesh densities using the newly developed geometric, unsmoothed and smoothed voxel generators. Compared with the analytic calculation for the 3D pipe-section model, the normalized RMS error of the surface stress was 0.173-0.647 for the unsmoothed voxel models, 0.111-0.616 for the smoothed voxel models with small volume error, and 0.126-0.273 for the geometric models. The highest element-energy error, used as a criterion for mesh quality, was 2.61×10⁻² N mm, 2.46×10⁻² N mm and 1.81×10⁻² N mm for the unsmoothed, smoothed and geometric voxel models, respectively. The geometric model of the 3D skull base resulted in the lowest element-energy error and volume error. This algorithm also allowed the best representation of anatomical details. The presented geometric mesh generator is universally applicable and allows automated and accurate modeling by combining the advantages of the voxel technique with improved surface modeling.
Deep learning for galaxy surface brightness profile fitting
NASA Astrophysics Data System (ADS)
Tuccillo, D.; Huertas-Company, M.; Decencière, E.; Velasco-Forero, S.; Domínguez Sánchez, H.; Dimauro, P.
2018-03-01
Numerous ongoing and future large area surveys (e.g. Dark Energy Survey, EUCLID, Large Synoptic Survey Telescope, Wide Field Infrared Survey Telescope) will increase by several orders of magnitude the volume of data that can be exploited for galaxy morphology studies. The full potential of these surveys can be unlocked only with the development of automated, fast, and reliable analysis methods. In this paper, we present DeepLeGATo, a new method for 2-D photometric galaxy profile modelling, based on convolutional neural networks. Our code is trained and validated on analytic profiles (HST/CANDELS F160W filter) and is able to retrieve the full set of parameters of one-component Sérsic models: total magnitude, effective radius, Sérsic index, and axis ratio. We show detailed comparisons between our code and GALFIT. On simulated data, our method is more accurate than GALFIT and ~3000 times faster on GPU (~50 times faster when running on the same CPU). On real data, DeepLeGATo trained on simulations behaves similarly to GALFIT on isolated galaxies. With a fast domain adaptation step made with 0.1-0.8 per cent of the size of the training set, our code easily reproduces the results obtained with GALFIT even in crowded regions. DeepLeGATo requires no human intervention beyond the training step, rendering it much more automated than traditional profiling methods. The development of this method for more complex models (two-component galaxies, variable point spread function, dense sky regions) could constitute a fundamental tool in the era of big data in astronomy.
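A toy version of this supervised approach, a convolutional network regressing the four Sérsic parameters from image stamps, is sketched below. The architecture, stamp size, and random placeholder data are assumptions for illustration and do not reproduce the published DeepLeGATo design or its training on simulated analytic profiles.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

# Hypothetical CNN regressing (magnitude, effective radius, Sersic index,
# axis ratio) from 64x64 single-band galaxy stamps.
model = tf.keras.Sequential([
    layers.Input(shape=(64, 64, 1)),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(4),  # mag, r_eff, n, b/a
])
model.compile(optimizer="adam", loss="mse")

# Random placeholders standing in for simulated profiles and their labels.
x = np.random.rand(256, 64, 64, 1).astype("float32")
y = np.random.rand(256, 4).astype("float32")
model.fit(x, y, epochs=1, batch_size=32, verbose=0)
print(model.predict(x[:1]))
```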
Jarolim, Petr; Patel, Purvish P; Conrad, Michael J; Chang, Lei; Melenovsky, Vojtech; Wilson, David H
2015-10-01
The association between increases in cardiac troponin and adverse cardiac outcomes is well established. There is a growing interest in exploring routine cardiac troponin monitoring as a potential early indicator of adverse heart health trends. Prognostic use of cardiac troponin measurements requires an assay with very high sensitivity and outstanding analytical performance. We report the development and preliminary validation of an investigational assay meeting these requirements and demonstrate its applicability to cohorts of healthy individuals and patients with heart failure. On the basis of single molecule array technology, we developed a 45-min immunoassay for cardiac troponin I (cTnI) for use on a novel, fully automated digital analyzer. We characterized its analytical performance and measured cTnI in healthy individuals and heart failure patients in a preliminary study of assay analytical efficacy. The assay exhibited a limit of detection of 0.01 ng/L, a limit of quantification of 0.08 ng/L, and a total CV of 10% at 2.0 ng/L. cTnI concentrations were well above the assay limit of detection for all samples tested, including samples from healthy individuals. cTnI was significantly higher in heart failure patients and exhibited increasing median and interquartile concentrations with increasing New York Heart Association classification of heart failure severity. The robust 2-log increase in sensitivity relative to contemporary high-sensitivity cardiac troponin immunoassays, combined with full automation, makes this assay suitable for exploring cTnI concentrations in cohorts of healthy individuals and for the potential prognostic application of serial cardiac troponin measurements in both apparently healthy and diseased individuals. © 2015 American Association for Clinical Chemistry.
Ibrahim, Sarah A; Martini, Luigi
2014-08-01
Dissolution method transfer is a complicated yet common process in the pharmaceutical industry. With increased pharmaceutical product manufacturing and dissolution acceptance requirements, dissolution testing has become one of the most labor-intensive quality control testing methods. There is an increasing trend toward automation in dissolution testing, particularly in large pharmaceutical companies, to reduce variability and increase personnel efficiency. There is no official guideline for transferring a dissolution testing method from a manual or semi-automated to a fully automated dissolution tester. In this study, a manual multipoint dissolution testing procedure for an enteric-coated aspirin tablet was transferred effectively and reproducibly to a fully automated dissolution testing device, RoboDis II. Enteric-coated aspirin samples were used as a model formulation to assess the feasibility and accuracy of media pH change during continuous automated dissolution testing. Several RoboDis II parameters were evaluated to ensure the integrity and equivalency of the dissolution method transferred from a manual dissolution tester. The current study provides a systematic outline for transferring a manual dissolution testing protocol to an automated dissolution tester and further supports the finding that automated dissolution testers compliant with regulatory requirements and comparable to manual testers facilitate method transfer. © 2014 Society for Laboratory Automation and Screening.
Microreactor Cells for High-Throughput X-ray Absorption Spectroscopy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beesley, Angela; Tsapatsaris, Nikolaos; Weiher, Norbert
2007-01-19
High-throughput experimentation has been applied to X-ray absorption spectroscopy as a novel route for increasing research productivity in the catalysis community. Suitable instrumentation has been developed for the rapid determination of the local structure of the metal component of precursors for supported catalysts. An automated analytical workflow was implemented that is much faster than traditional individual spectrum analysis and allows the generation of structural data in quasi-real time. We describe initial results obtained from the automated high-throughput (HT) data reduction and analysis of a sample library implemented through the 96-well plate industry standard. The results show that a fully automated HT-XAS technology based on existing industry standards is feasible and useful for the rapid elucidation of the geometric and electronic structure of materials.
Kumar, Rajiv B; Goren, Nira D; Stark, David E; Wall, Dennis P; Longhurst, Christopher A
2016-01-01
The diabetes healthcare provider plays a key role in interpreting blood glucose trends, but few institutions have successfully integrated patient home glucose data in the electronic health record (EHR). Published implementations to date have required custom interfaces, which limit wide-scale replication. We piloted automated integration of continuous glucose monitor data in the EHR using widely available consumer technology for 10 pediatric patients with insulin-dependent diabetes. Establishment of a passive data communication bridge via a patient’s/parent’s smartphone enabled automated integration and analytics of patient device data within the EHR between scheduled clinic visits. It is feasible to utilize available consumer technology to assess and triage home diabetes device data within the EHR, and to engage patients/parents and improve healthcare provider workflow. PMID:27018263
Simultaneous determination of three anticonvulsants using hydrophilic interaction LC-MS.
Oertel, Reinhard; Arenz, Norman; Pietsch, Jörg; Kirch, Wilhelm
2009-01-01
A specific and automated method was developed to quantify the anticonvulsants gabapentin, pregabalin and vigabatrin simultaneously in human serum. Samples were prepared by protein precipitation. Hydrophilic interaction chromatography (HILIC) with a mobile-phase gradient was used to separate matrix ions from the analytes. Four different HILIC columns and two different column temperatures were tested; the Tosoh Amide column gave the best results, with single small peaks. The anticonvulsants were detected in multiple reaction monitoring (MRM) mode with ESI-MS-MS. Using a 100 µL volume of biological sample, the lowest point of the standard curve, i.e. the lower limit of quantification, was 312 ng/mL. The described HILIC-MS-MS method is suitable for therapeutic drug monitoring and for clinical and pharmacokinetic investigations of these anticonvulsants.
How can knowledge discovery methods uncover spatio-temporal patterns in environmental data?
NASA Astrophysics Data System (ADS)
Wachowicz, Monica
2000-04-01
This paper proposes the integration of KDD, GVis and STDB as a long-term strategy, which will allow users to apply knowledge discovery methods for uncovering spatio-temporal patterns in environmental data. The main goal is to combine innovative techniques and associated tools for exploring very large environmental data sets in order to arrive at valid, novel, potentially useful, and ultimately understandable spatio-temporal patterns. The GeoInsight approach is described using the principles and key developments in the research domains of KDD, GVis, and STDB. The GeoInsight approach aims at the integration of these research domains in order to provide tools for performing information retrieval, exploration, analysis, and visualization. The result is a knowledge-based design, which involves visual thinking (perceptual-cognitive process) and automated information processing (computer-analytical process).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Purdie, Thomas G., E-mail: Tom.Purdie@rmp.uhn.on.ca; Department of Radiation Oncology, University of Toronto, Toronto, Ontario; Techna Institute, University Health Network, Toronto, Ontario
Purpose: To demonstrate the large-scale clinical implementation and performance of an automated treatment planning methodology for tangential breast intensity modulated radiation therapy (IMRT). Methods and Materials: Automated planning was used to prospectively plan tangential breast IMRT treatment for 1661 patients between June 2009 and November 2012. The automated planning method emulates the manual steps performed by the user during treatment planning, including anatomical segmentation, beam placement, optimization, dose calculation, and plan documentation. The user specifies clinical requirements of the plan to be generated through a user interface embedded in the planning system. The automated method uses heuristic algorithms to define and simplify the technical aspects of the treatment planning process. Results: Automated planning was used in 1661 of 1708 patients receiving tangential breast IMRT during the time interval studied. Therefore, automated planning was applicable in greater than 97% of cases. The time for treatment planning using the automated process is routinely 5 to 6 minutes on standard commercially available planning hardware. We have shown a consistent reduction in plan rejections from plan reviews through the standard quality control process or weekly quality review multidisciplinary breast rounds as we have automated the planning process for tangential breast IMRT. Clinical plan acceptance increased from 97.3% using our previous semiautomated inverse method to 98.9% using the fully automated method. Conclusions: Automation has become the routine standard method for treatment planning of tangential breast IMRT at our institution and is clinically feasible on a large scale. The method has wide clinical applicability and can add tremendous efficiency, standardization, and quality to the current treatment planning process. The use of automated methods can allow centers to more rapidly adopt IMRT and enhance access to the documented improvements in care for breast cancer patients, using technologies that are widely available and already in clinical use.
Alternative Approaches to Mission Control Automation at NASA's Goddard Space Flight Center
NASA Technical Reports Server (NTRS)
Rackley, Michael; Cooter, Miranda; Davis, George; Mackey, Jennifer
2001-01-01
To meet its objective of reducing operations costs without incurring a corresponding increase in risk, NASA is seeking new methods to automate mission operations. This paper examines the state of the art in automating ground operations for space missions. A summary of available technologies and methods for automating mission operations is provided. Responses from interviews with several space mission FOTs (Flight Operations Teams) to assess the degree and success of those technologies and methods implemented are presented. Mission operators that were interviewed approached automation using different tools and methods resulting in varying degrees of success - from nearly completely automated to nearly completely manual. Two key criteria for successful automation are the active participation of the FOT in the planning, designing, testing, and implementation of the system and the relative degree of complexity of the mission.
Abbatiello, Susan E.; Mani, D. R.; Keshishian, Hasmik; Carr, Steven A.
2010-01-01
BACKGROUND Multiple reaction monitoring mass spectrometry (MRM-MS) of peptides with stable isotope–labeled internal standards (SISs) is increasingly being used to develop quantitative assays for proteins in complex biological matrices. These assays can be highly precise and quantitative, but the frequent occurrence of interferences requires that MRM-MS data be manually reviewed, a time-intensive process subject to human error. We developed an algorithm that identifies inaccurate transition data based on the presence of interfering signal or inconsistent recovery among replicate samples. METHODS The algorithm objectively evaluates MRM-MS data with 2 orthogonal approaches. First, it compares the relative product ion intensities of the analyte peptide to those of the SIS peptide and uses a t-test to determine if they are significantly different. A CV is then calculated from the ratio of the analyte peak area to the SIS peak area from the sample replicates. RESULTS The algorithm identified problematic transitions and achieved accuracies of 94%–100%, with a sensitivity and specificity of 83%–100% for correct identification of errant transitions. The algorithm was robust when challenged with multiple types of interferences and problematic transitions. CONCLUSIONS This algorithm for automated detection of inaccurate and imprecise transitions (AuDIT) in MRM-MS data reduces the time required for manual and subjective inspection of data, improves the overall accuracy of data analysis, and is easily implemented into the standard data-analysis work flow. AuDIT currently works with results exported from MRM-MS data-processing software packages and may be implemented as an analysis tool within such software. PMID:20022980
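As a rough illustration of the two orthogonal checks described above, the following Python sketch flags transitions by (1) a t-test on the relative product-ion intensities of the analyte versus SIS peptide and (2) the CV of the analyte/SIS peak-area ratio across replicates. The array layout, significance level, and CV cutoff are assumptions for illustration, not AuDIT's published defaults.

```python
# Minimal sketch of AuDIT-style transition QC (assumed thresholds and data layout).
import numpy as np
from scipy import stats

def relative_intensities(areas):
    """Normalize transition peak areas to sum to 1 within each replicate."""
    areas = np.asarray(areas, dtype=float)
    return areas / areas.sum(axis=1, keepdims=True)

def flag_transitions(analyte_areas, sis_areas, alpha=0.05, cv_limit=0.20):
    """Return (interference, imprecise) boolean flags per transition.
    analyte_areas, sis_areas: replicates x transitions peak-area matrices."""
    rel_a = relative_intensities(analyte_areas)
    rel_s = relative_intensities(sis_areas)
    # Check 1: analyte vs SIS relative product-ion intensity (t-test per transition)
    pvals = np.array([stats.ttest_ind(rel_a[:, j], rel_s[:, j]).pvalue
                      for j in range(rel_a.shape[1])])
    interference = pvals < alpha
    # Check 2: CV of the analyte/SIS peak-area ratio across replicate samples
    ratio = np.asarray(analyte_areas, float) / np.asarray(sis_areas, float)
    cv = ratio.std(axis=0, ddof=1) / ratio.mean(axis=0)
    return interference, cv > cv_limit

analyte = [[1000, 520, 260], [980, 505, 450], [1010, 498, 380]]  # 3 replicates x 3 transitions
sis = [[2000, 1040, 500], [1985, 1020, 505], [2020, 1050, 498]]
print(flag_transitions(analyte, sis))
```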
Fabbretti, G
2010-06-01
Because of its complex nature, surgical pathology practice is prone to error. In this report, we describe our methods for reducing error as much as possible during the pre-analytical and analytical phases. This was achieved by revising procedures, and by using computer technology and automation. Most mistakes are the result of human error in the identification and matching of patients and samples. To avoid faulty data interpretation, we employed a new comprehensive computer system that acquires all patient ID information directly from the hospital's database with remote order entry; it also provides label and request forms via Web where clinical information is required before sending the sample. Both patient and sample are identified directly and immediately at the site where the surgical procedures are performed. Barcode technology is used to input information at every step, and automation is used for sample blocks and slides to avoid errors that occur when information is recorded or transferred by hand. Quality control checks occur at every step of the process to ensure that none of the steps are left to chance and that no phase is dependent on a single operator. The system also provides statistical analysis of errors so that new strategies can be implemented to avoid repetition. In addition, the staff receives frequent training on avoiding errors and on new developments. The results have been promising, with a very low error rate (0.27%). None of the errors compromised patient health, and all were detected before release of the diagnostic report.
Automation of a spectrophotometric method for measuring L -carnitine in human blood serum
Galan, Amparo; Padros, Anna; Arambarri, Marta; Martin, Silvia
1998-01-01
A spectrometric method for the determination of L-carnitine has been developed based on the reaction of 5,5′-dithiobis-(2-nitrobenzoic) acid (DTNB) and adapted to a Technicon RA-2000 automatic analyser (Química Farmacéutica Bayer, S.A.). The detection limit of the method is 13.2 μmol/l, with a measurement interval ranging from 30 to 320 μmol/l. Imprecision and accuracy are good even at levels close to the detection limit (coefficient of variation of 5.4% for within-run imprecision at a concentration of 35 μmol/l). A good correlation was observed between the method studied and the radiometric method. The method evaluated has sufficient analytical sensitivity to diagnose carnitine deficiencies. The short time required for sample processing (30 samples in 40 min), the simple methodology and apparatus, the ease of personnel training and the low cost of the reagents make this method a good alternative to the classical radiometric method for evaluating serum L-carnitine in clinical laboratories without radioactive installations. PMID:18924818
Teaching Electronics and Laboratory Automation Using Microcontroller Boards
ERIC Educational Resources Information Center
Mabbott, Gary A.
2014-01-01
Modern microcontroller boards offer the analytical chemist a powerful and inexpensive means of interfacing computers and laboratory equipment. The availability of a host of educational materials, compatible sensors, and electromechanical devices make learning to implement microcontrollers fun and empowering. This article describes the advantages…
NASA Astrophysics Data System (ADS)
Gerontas, Apostolos
2014-08-01
Chromatographic instrumentation has been highly influential in shaping modern chemical practice, and yet it has been largely overlooked by the history of science. In the 1960s, gas chromatography was considered the analytical technique closest to becoming dominant, and as the first automated chromatography it set the standards that all subsequent chromatographic instrumentation had to fulfill. Networks of specialists, groups of actors, corporate strategies and analytical practice itself were all affected, in many ways, by the entrance of gas chromatography into the chemical laboratory and the instrumentation market. This paper gives a view of the early history of gas chromatography instrumentation, relates it to the broader research-technology phenomenon, and discusses issues of education and group reproduction among the groups of technologists of the era. The chaotic elements of knowledge transfer during the instrumentation revolution in chemistry are highlighted and connected to the radical innovation observable in the period.
Toward best practice: leveraging the electronic patient record as a clinical data warehouse.
Ledbetter, C S; Morgan, M W
2001-01-01
Automating clinical and administrative processes via an electronic patient record (EPR) gives clinicians the point-of-care tools they need to deliver better patient care. However, to improve clinical practice as a whole and then evaluate it, healthcare must go beyond basic automation and convert EPR data into aggregated, multidimensional information. Unfortunately, few EPR systems have the established, powerful analytical clinical data warehouses (CDWs) required for this conversion. This article describes how an organization can support best practice by leveraging a CDW that is fully integrated into its EPR and clinical decision support (CDS) system. The article (1) discusses the requirements for comprehensive CDS, including on-line analytical processing (OLAP) of data at both transactional and aggregate levels, (2) suggests that the transactional data acquired by an OLTP EPR system must be remodeled to support retrospective, population-based, aggregate analysis of those data, and (3) concludes that this aggregate analysis is best provided by a separate CDW system.
Khan, F I; Abbasi, S A
2000-07-10
Fault tree analysis (FTA) is based on constructing a hypothetical tree of base events (initiating events) branching into numerous other sub-events, propagating the fault and eventually leading to the top event (accident). It has been a powerful technique traditionally used in identifying hazards in nuclear installations and the power industry. As the systematic articulation of the fault tree involves assigning probabilities to each fault, the exercise is also sometimes called probabilistic risk assessment. But powerful as this technique is, it is also very cumbersome and costly, limiting its area of application. We have developed a new algorithm based on analytical simulation (named AS-II), which makes the application of FTA simpler, quicker, and cheaper, thus opening up the possibility of its wider use in risk assessment in the chemical process industries. Based on this methodology, we have developed a computer-automated tool. The details are presented in this paper.
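The probabilistic core of FTA is simple gate arithmetic over independent base-event probabilities. The three-event tree below is invented purely for illustration and has no connection to AS-II itself.

```python
# Worked toy example of fault-tree gate arithmetic (independent base events assumed).
def gate_and(probs):
    # AND gate: all independent base events must occur
    p = 1.0
    for q in probs:
        p *= q
    return p

def gate_or(probs):
    # OR gate: at least one independent event occurs
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical tree: top = OR(AND(pump_fails, valve_stuck), sensor_fails)
p_top = gate_or([gate_and([1e-3, 5e-2]), 2e-4])
print(f"Top event probability: {p_top:.2e}")  # ~2.5e-04
```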
Rubino, F M; Floridia, L; Pietropaolo, A M; Tavazzani, M; Colombi, A
1999-01-01
Within the context of continuing interest in the occupational hygiene of hospitals as workplaces, the authors report the results of a preliminary study on surface contamination by certain antineoplastic drugs (ANDs), recently performed in eight cancer departments of two large general hospitals in Milan, Italy. Reliable quantitative information on exposure levels to individual drugs is mandatory to establish a sound interpretative framework for assessing the health risks associated with handling ANDs and to rationally advise intervention priorities for exposure abatement. Two automated analytical methods were therefore set up, using reversed-phase high-performance liquid chromatography, for the measurement of contamination by 1) methotrexate (MTX) and 2) the three most important nucleoside-analogue antineoplastic drugs (5-fluorouracil, 5FU; cytarabine, CYA; gemcitabine, GCA) on surfaces such as preparation hoods and workbenches in the pharmacies of cancer wards. The methods are characterized by short analysis time (7 min) under isocratic conditions, by the use of a mobile phase with minimal organic solvent content, and by high sensitivity, adequate to detect surface contamination in the 5-10 micrograms/m2 range. To exemplify the performance of the analytical methods in the assessment of contamination levels from the target analyte ANDs, data are reported on the contamination levels measured on various surfaces (such as handles, floor surfaces and window panes, even far from the preparation hood). Analyte amounts corresponding to 0.8-1.5 micrograms of 5FU were measured on telephones, 0.85-28 micrograms/m2 of CYA were measured on tables, and 1.2-1150 micrograms/m2 of GCA on furniture and floors. Spillage fractions of 1-5% of the used ANDs (daily use: 5FU 7-13 g; CYA 0.1-7.1 g; GCA 0.2-5 g) were measured on the disposable polythene-backed paper cover sheet of the preparation hood.
González, Natalia; Grünhut, Marcos; Šrámková, Ivana; Lista, Adriana G; Horstkotte, Burkhard; Solich, Petr; Sklenářová, Hana; Acebal, Carolina C
2018-02-01
A fully automated spectrophotometric method based on flow-batch analysis has been developed for the determination of clenbuterol, including on-line solid phase extraction using a molecularly imprinted polymer (MIP) as the sorbent. The molecularly imprinted solid phase extraction (MISPE) procedure allowed analyte extraction from complex matrices at low concentration levels and with high selectivity towards the analyte. The MISPE procedure was performed using a commercial MIP cartridge that was introduced into a guard column holder and integrated into the analyzer system. Optimized parameters included the volume of the sample, the type and volume of the conditioning and washing solutions, and the type and volume of the eluent. Quantification of clenbuterol was carried out by spectrophotometry after in-system post-elution analyte derivatization based on azo-coupling, using N-(1-naphthyl)ethylenediamine as the coupling agent to yield a red-colored compound with maximum absorbance at 500 nm. Both the chromogenic reaction and spectrophotometric detection were performed in a lab-made flow-batch mixing chamber that replaced the cuvette holder of the spectrophotometer. The calibration curve was linear in the 0.075-0.500 mg/L range with a correlation coefficient of 0.998. The precision of the proposed method, evaluated in terms of relative standard deviation, was 1.1% for intra-day and 3.0% for inter-day precision. The detection limit was 0.021 mg/L and the sample throughput for the entire process was 3.4 samples per hour. The proposed method was applied to the determination of clenbuterol in human urine and milk substitute samples, with recoveries in the range of 94.0-100.0%. Copyright © 2017 Elsevier B.V. All rights reserved.
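For readers unfamiliar with how figures of merit like those quoted above are typically obtained, the sketch below fits a linear calibration and derives a detection limit from the residual standard deviation. The standards and responses are invented, and the 3.3·s/slope convention is one common choice, not necessarily the one used in this study.

```python
# Illustrative calibration linearity and detection-limit estimate (invented data).
import numpy as np

conc = np.array([0.075, 0.15, 0.25, 0.35, 0.50])     # standards, mg/L (hypothetical)
absorb = np.array([0.021, 0.043, 0.070, 0.099, 0.141])

slope, intercept = np.polyfit(conc, absorb, 1)        # least-squares line
r = np.corrcoef(conc, absorb)[0, 1]                   # correlation coefficient

# One common convention: LOD = 3.3 x (sd of residuals) / slope
residuals = absorb - (slope * conc + intercept)
sd_res = residuals.std(ddof=2)                        # 2 fitted parameters
lod = 3.3 * sd_res / slope
print(f"r = {r:.4f}, LOD = {lod:.3f} mg/L")
```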
Huang, Jianyan; Maram, Jyotsna; Tepelus, Tudor C; Modak, Cristina; Marion, Ken; Sadda, SriniVas R; Chopra, Vikas; Lee, Olivia L
2017-08-07
To determine the reliability of corneal endothelial cell density (ECD) obtained by automated specular microscopy versus that of validated manual methods, and the factors that predict such reliability. Sharp central images from 94 control and 106 glaucomatous eyes were captured with a Konan NSP-9900 specular microscope. All images were analyzed by trained graders using Konan CellChek Software, employing the fully- and semi-automated methods as well as the Center Method. Images with low cell count (input cell number <100) and/or guttata were compared with the Center and Flex-Center Methods. ECDs were compared and absolute error was used to assess variation. The effect on ECD of age, cell count, cell size, and cell size variation was evaluated. No significant difference was observed between the Center and Flex-Center Methods in corneas with guttata (p=0.48) or low ECD (p=0.11). No difference (p=0.32) was observed in ECD of normal controls younger than 40 years between the fully-automated method and the manual Center Method. However, in older controls and glaucomatous eyes, ECD was overestimated by the fully-automated method (p=0.034) and the semi-automated method (p=0.025) compared to the manual method. Our findings show that automated analysis significantly overestimates ECD in eyes with high polymegathism and/or large cell size compared to the manual method. Therefore, we discourage reliance upon the fully-automated method alone to perform specular microscopy analysis, particularly if an accurate ECD value is imperative. Copyright © 2017. Published by Elsevier España, S.L.U.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, Ronald C.; Sanfilippo, Antonio P.; McDermott, Jason E.
2011-02-18
Transcriptional regulatory networks are being determined using “reverse engineering” methods that infer connections based on correlations in gene state. Corroboration of such networks through independent means such as evidence from the biomedical literature is desirable. Here, we explore a novel approach, a bootstrapping version of our previous Cross-Ontological Analytic method (XOA), that can be used for semi-automated annotation and verification of inferred regulatory connections, as well as for discovery of additional functional relationships between the genes. First, we use our annotation and network expansion method on a biological network learned entirely from the literature. We show how new relevant links between genes can be iteratively derived using a gene similarity measure based on the Gene Ontology that is optimized on the input network at each iteration. Second, we apply our method to annotation, verification, and expansion of a set of regulatory connections found by the Context Likelihood of Relatedness algorithm.
Salinas, Maria; Lopez-Garrigos, Maite; Flores, Emilio; Leiva-Salinas, Carlos
2018-06-01
To study the urinalysis request, pre-analytical sample conditions, and analytical procedures. Laboratories were asked to provide the number of primary care urinalyses requested, and to fill out a questionnaire regarding pre-analytical conditions and analytical procedures. 110 laboratories participated in the study. 232.5 urinalyses/1,000 inhabitants were reported. 75.4% used the first morning urine. The sample reached the laboratory in less than 2 hours in 18.8%, between 2 - 4 hours in 78.3%, and between 4 - 6 hours in the remaining 2.9%. 92.5% combined the use of test strip and particle analysis, and only 7.5% used the strip exclusively. All participants except one performed automated particle analysis depending on strip results; in 16.2% the procedure was only manual. Urinalysis was highly requested. There was a lack of compliance with guidelines regarding time between micturition and analysis that usually involved the combination of strip followed by particle analysis.
Microfluidic-Based sample chips for radioactive solutions
Tripp, J. L.; Law, J. D.; Smith, T. E.; ...
2015-01-01
Historical nuclear fuel cycle process sampling techniques required sample volumes ranging in the tens of milliliters. The radiation levels experienced by analytical personnel and equipment, in addition to the waste volumes generated from analysis of these samples, have been significant. These sample volumes also impacted accountability inventories of required analytes during process operations. To mitigate radiation dose and other issues associated with the historically larger sample volumes, a microcapillary sample chip was chosen for further investigation. The ability to obtain microliter-volume samples, coupled with a remote automated means of sample loading, tracking, and transporting to the analytical instrument, would greatly improve analytical efficiency while reducing both personnel exposure and radioactive waste volumes. Sample chip testing was completed to determine the accuracy, repeatability, and issues associated with the use of microfluidic sample chips used to supply µL sample volumes of lanthanide analytes dissolved in nitric acid for introduction to an analytical instrument for elemental analysis.
Verplaetse, Ruth; Henion, Jack
2016-07-05
A workflow overcoming microsample collection issues and hematocrit (HCT)-related bias would facilitate more widespread use of dried blood spots (DBS). This report describes comparative results between the use of a pipet and a microfluidic-based sampling device for the creation of volumetric DBS. Both approaches were successfully coupled to HCT-independent, fully automated sample preparation and online liquid chromatography coupled to tandem mass spectrometry (LC-MS/MS) analysis allowing detection of five stimulants in finger prick blood. Reproducible, selective, accurate, and precise responses meeting generally accepted regulated bioanalysis guidelines were observed over the range of 5-1000 ng/mL whole blood. The applied heated flow-through solvent desorption of the entire spot and online solid phase extraction (SPE) procedure were unaffected by the blood's HCT value within the tested range of 28.0-61.5% HCT. Enhanced stability for mephedrone on DBS compared to liquid whole blood was observed. Finger prick blood samples were collected using both volumetric sampling approaches over a time course of 25 h after intake of a single oral dose of phentermine. A pharmacokinetic curve for the incurred phentermine was successfully produced using the described validated method. These results suggest that either volumetric sample collection method may be amenable to field-use followed by fully automated, HCT-independent DBS-SPE-LC-MS/MS bioanalysis for the quantitation of these representative controlled substances. Analytical data from DBS prepared with a pipet and microfluidic-based sampling devices were comparable, but the latter is easier to operate, making this approach more suitable for sample collection by unskilled persons.
Taylor, R Andrew; Pare, Joseph R; Venkatesh, Arjun K; Mowafi, Hani; Melnick, Edward R; Fleischman, William; Hall, M Kennedy
2016-03-01
Predictive analytics in emergency care has mostly been limited to the use of clinical decision rules (CDRs) in the form of simple heuristics and scoring systems. In the development of CDRs, limitations in analytic methods and concerns with usability have generally constrained models to a preselected small set of variables judged to be clinically relevant and to rules that are easily calculated. Furthermore, CDRs frequently suffer from questions of generalizability, take years to develop, and lack the ability to be updated as new information becomes available. Newer analytic and machine learning techniques capable of harnessing the large number of variables that are already available through electronic health records (EHRs) may better predict patient outcomes and facilitate automation and deployment within clinical decision support systems. In this proof-of-concept study, a local, big data-driven, machine learning approach is compared to existing CDRs and traditional analytic methods using the prediction of sepsis in-hospital mortality as the use case. This was a retrospective study of adult ED visits admitted to the hospital meeting criteria for sepsis from October 2013 to October 2014. Sepsis was defined as meeting criteria for systemic inflammatory response syndrome with an infectious admitting diagnosis in the ED. ED visits were randomly partitioned into an 80%/20% split for training and validation. A random forest model (machine learning approach) was constructed using over 500 clinical variables from data available within the EHRs of four hospitals to predict in-hospital mortality. The machine learning prediction model was then compared to a classification and regression tree (CART) model, logistic regression model, and previously developed prediction tools on the validation data set using area under the receiver operating characteristic curve (AUC) and chi-square statistics. There were 5,278 visits among 4,676 unique patients who met criteria for sepsis. Of the 4,222 patients in the training group, 210 (5.0%) died during hospitalization, and of the 1,056 patients in the validation group, 50 (4.7%) died during hospitalization. The AUCs with 95% confidence intervals (CIs) for the different models were as follows: random forest model, 0.86 (95% CI = 0.82 to 0.90); CART model, 0.69 (95% CI = 0.62 to 0.77); logistic regression model, 0.76 (95% CI = 0.69 to 0.82); CURB-65, 0.73 (95% CI = 0.67 to 0.80); MEDS, 0.71 (95% CI = 0.63 to 0.77); and mREMS, 0.72 (95% CI = 0.65 to 0.79). The random forest model AUC was statistically different from all other models (p ≤ 0.003 for all comparisons). In this proof-of-concept study, a local big data-driven, machine learning approach outperformed existing CDRs as well as traditional analytic techniques for predicting in-hospital mortality of ED patients with sepsis. Future research should prospectively evaluate the effectiveness of this approach and whether it translates into improved clinical outcomes for high-risk sepsis patients. The methods developed serve as an example of a new model for predictive analytics in emergency care that can be automated, applied to other clinical outcomes of interest, and deployed in EHRs to enable locally relevant clinical predictions. © 2015 by the Society for Academic Emergency Medicine.
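A minimal sketch of the modeling comparison follows, with synthetic stand-in data: an 80/20 split, a random forest, a single decision tree (CART), and logistic regression scored by AUC. The hyperparameters, feature counts, and synthetic outcome are placeholders, not those of the study.

```python
# Sketch of a random forest vs. CART vs. logistic regression AUC comparison
# (synthetic data standing in for >500 EHR variables; all settings assumed).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(5278, 50))                  # stand-in feature matrix
y = (X[:, 0] + rng.normal(size=5278)) > 1.5      # synthetic mortality outcome

X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "random forest": RandomForestClassifier(n_estimators=500, random_state=0),
    "CART": DecisionTreeClassifier(max_depth=5, random_state=0),
    "logistic regression": LogisticRegression(max_iter=1000),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_va, model.predict_proba(X_va)[:, 1])
    print(f"{name}: AUC = {auc:.2f}")
```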
Siontorou, Christina G.
1997-01-01
This paper describes the results of analytical applications of electrochemical biosensors based on bilayer lipid membranes (BLMs) for the automated, rapid and sensitive flow monitoring of substrates of hydrolytic enzymes, antigens and triazine herbicides. BLMs, composed of mixtures of egg phosphatidylcholine (egg PC) and dipalmitoylphosphatidic acid (DPPA), were supported on ultrafiltration membranes (glass microfibre or polycarbonate filters), which were found to enhance their stability for flow experiments. The proteins (enzymes, antibodies) were incorporated into a floating lipid matrix at an air-electrolyte interface, and a casting procedure was then used to deliver the lipid onto the filter supports for BLM formation. Injections of the analyte were made into flowing streams of the carrier electrolyte solution, and a current transient signal was obtained with a magnitude related to the analyte concentration. Substrates of hydrolytic enzyme reactions (acetylcholine, urea and penicillin) could be determined at the micromolar level with a maximum rate of 220 samples/h, whereas antigens (thyroxin) and triazine herbicides (simazine, atrazine and propazine) could be monitored at the nanomolar level in less than 2 min. The time of appearance of the transient response for the herbicides increased in the order simazine < atrazine < propazine, which permitted analysis of these triazines in mixtures. PMID:18924789
Lab-on-a-Chip Device for Rapid Measurement of Vitamin D Levels.
Peter, Harald; Bistolas, Nikitas; Schumacher, Soeren; Laurisch, Cecilia; Guest, Paul C; Höller, Ulrich; Bier, Frank F
2018-01-01
Lab-on-a-chip assays allow rapid analysis of one or more molecular analytes on an automated user-friendly platform. Here we describe a fully automated assay and readout for measurement of vitamin D levels in less than 15 min using the Fraunhofer in vitro diagnostics platform. Vitamin D (25-hydroxyvitamin D3 [25(OH)D3]) dilution series in buffer were successfully tested down to 2 ng/mL. This could be applied in the future as an inexpensive point-of-care analysis for patients suffering from a variety of conditions marked by vitamin D deficiencies.
Purdie, Thomas G; Dinniwell, Robert E; Fyles, Anthony; Sharpe, Michael B
2014-11-01
To demonstrate the large-scale clinical implementation and performance of an automated treatment planning methodology for tangential breast intensity modulated radiation therapy (IMRT). Automated planning was used to prospectively plan tangential breast IMRT treatment for 1661 patients between June 2009 and November 2012. The automated planning method emulates the manual steps performed by the user during treatment planning, including anatomical segmentation, beam placement, optimization, dose calculation, and plan documentation. The user specifies clinical requirements of the plan to be generated through a user interface embedded in the planning system. The automated method uses heuristic algorithms to define and simplify the technical aspects of the treatment planning process. Automated planning was used in 1661 of 1708 patients receiving tangential breast IMRT during the time interval studied. Therefore, automated planning was applicable in greater than 97% of cases. The time for treatment planning using the automated process is routinely 5 to 6 minutes on standard commercially available planning hardware. We have shown a consistent reduction in plan rejections from plan reviews through the standard quality control process or weekly quality review multidisciplinary breast rounds as we have automated the planning process for tangential breast IMRT. Clinical plan acceptance increased from 97.3% using our previous semiautomated inverse method to 98.9% using the fully automated method. Automation has become the routine standard method for treatment planning of tangential breast IMRT at our institution and is clinically feasible on a large scale. The method has wide clinical applicability and can add tremendous efficiency, standardization, and quality to the current treatment planning process. The use of automated methods can allow centers to more rapidly adopt IMRT and enhance access to the documented improvements in care for breast cancer patients, using technologies that are widely available and already in clinical use. Copyright © 2014 Elsevier Inc. All rights reserved.
TU-D-201-06: HDR Plan Prechecks Using Eclipse Scripting API
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palaniswaamy, G; Morrow, A; Kim, S
Purpose: Automate brachytherapy treatment plan quality checks using the Eclipse v13.6 scripting API, based on pre-configured rules, to minimize human error and maximize efficiency. Methods: The HDR Precheck system is developed based on a rules-driven approach using the Eclipse scripting API. This system checks critical plan parameters such as channel length, first source position, source step size and channel mapping. The planned treatment time is verified independently based on analytical methods. For interstitial or SAVI APBI treatment plans, a Patterson-Parker system calculation is performed to verify the planned treatment time. For endobronchial treatments, an analytical formula from TG-59 is used. Acceptable tolerances were defined based on clinical experience in our department. The system was designed to show PASS/FAIL status levels. Additional information, if necessary, is indicated in a separate comments field in the user interface. Results: The HDR Precheck system has been developed and tested to verify the treatment plan parameters that are routinely checked by the clinical physicist. The report also serves as a reminder or checklist for the planner to perform any additional critical checks, such as applicator digitization or scenarios where the channel mapping was intentionally changed. It is expected to reduce the current manual plan check time from 15 minutes to <1 minute. Conclusion: Automating brachytherapy plan prechecks significantly reduces treatment plan precheck time and reduces human errors. When fully developed, this system will be able to perform a TG-43-based second check of the treatment planning system's dose calculation using random points in the target and critical structures. A histogram will be generated along with tabulated mean and standard deviation values for each structure. A knowledge database will also be developed for Brachyvision plans, which will then be used for knowledge-based plan quality checks to further reduce treatment planning errors and increase confidence in the planned treatment.
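The rules-driven PASS/FAIL idea can be pictured in a few lines of Python. The plan fields and tolerance values below are placeholders, and the real system reads plan parameters through the Eclipse scripting API rather than a dataclass.

```python
# Sketch of a rules-driven plan precheck; fields and tolerances are hypothetical.
from dataclasses import dataclass

@dataclass
class ChannelPlan:
    channel_length_mm: float
    first_source_position_mm: float
    step_size_mm: float

# Pre-configured rules: (expected value, +/- tolerance), values assumed
TOLERANCES = {
    "channel_length_mm": (1300.0, 0.5),
    "first_source_position_mm": (1295.0, 0.5),
    "step_size_mm": (5.0, 0.01),
}

def precheck(plan: ChannelPlan) -> dict:
    """Return a PASS/FAIL status per parameter against pre-configured tolerances."""
    results = {}
    for field, (expected, tol) in TOLERANCES.items():
        value = getattr(plan, field)
        results[field] = "PASS" if abs(value - expected) <= tol else "FAIL"
    return results

print(precheck(ChannelPlan(1300.2, 1295.1, 5.0)))
```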
Patient Health Record Systems Scope and Functionalities: Literature Review and Future Directions
2017-01-01
Background A new generation of user-centric information systems is emerging in health care as patient health record (PHR) systems. These systems create a platform supporting the new vision of health services that empowers patients and enables patient-provider communication, with the goal of improving health outcomes and reducing costs. This evolution has generated new sets of data and capabilities, providing opportunities and challenges at the user, system, and industry levels. Objective The objective of our study was to assess PHR data types and functionalities through a review of the literature to inform the health care informatics community, and to provide recommendations for PHR design, research, and practice. Methods We conducted a review of the literature to assess PHR data types and functionalities. We searched PubMed, Embase, and MEDLINE databases from 1966 to 2015 for studies of PHRs, resulting in 1822 articles, from which we selected a total of 106 articles for a detailed review of PHR data content. Results We present several key findings related to the scope and functionalities in PHR systems. We also present a functional taxonomy and chronological analysis of PHR data types and functionalities, to improve understanding and provide insights for future directions. Functional taxonomy analysis of the extracted data revealed the presence of new PHR data sources such as tracking devices and data types such as time-series data. Chronological data analysis showed an evolution of PHR system functionalities over time, from simple data access to data modification and, more recently, automated assessment, prediction, and recommendation. Conclusions Efforts are needed to improve (1) PHR data quality through patient-centered user interface design and standardized patient-generated data guidelines, (2) data integrity through consolidation of various types and sources, (3) PHR functionality through application of new data analytics methods, and (4) metrics to evaluate clinical outcomes associated with automated PHR system use, and costs associated with PHR data storage and analytics. PMID:29141839
Managing Multi-center Flow Cytometry Data for Immune Monitoring
White, Scott; Laske, Karoline; Welters, Marij JP; Bidmon, Nicole; van der Burg, Sjoerd H; Britten, Cedrik M; Enzor, Jennifer; Staats, Janet; Weinhold, Kent J; Gouttefangeas, Cécile; Chan, Cliburn
2014-01-01
With the recent results of promising cancer vaccines and immunotherapy, immune monitoring has become increasingly relevant for measuring treatment-induced effects on T cells, and an essential tool for shedding light on the mechanisms responsible for a successful treatment. Flow cytometry is the canonical multi-parameter assay for the fine characterization of single cells in solution, and is ubiquitously used in pre-clinical tumor immunology and in cancer immunotherapy trials. Current state-of-the-art polychromatic flow cytometry involves multi-step, multi-reagent assays followed by sample acquisition on sophisticated instruments capable of capturing up to 20 parameters per cell at a rate of tens of thousands of cells per second. Given the complexity of flow cytometry assays, reproducibility is a major concern, especially for multi-center studies. A promising approach for improving reproducibility is the use of automated analysis borrowing from statistics, machine learning and information visualization, as these methods directly address the subjectivity, operator dependence, labor intensity and low fidelity of manual analysis. However, it is quite time-consuming to investigate and test new automated analysis techniques on large data sets without some centralized information management system. For large-scale automated analysis to be practical, the presence of consistent and high-quality data linked to the raw FCS files is indispensable. In particular, the use of machine-readable standard vocabularies to characterize channel metadata is essential when constructing analytic pipelines to avoid errors in processing, analysis and interpretation of results. For automation, this high-quality metadata needs to be programmatically accessible, implying the need for a consistent Application Programming Interface (API). In this manuscript, we propose that upfront time spent normalizing flow cytometry data to conform to carefully designed data models enables automated analysis, potentially saving time in the long run. The ReFlow informatics framework was developed to address these data management challenges. PMID:26085786
ERIC Educational Resources Information Center
Economou, A.; Papargyris, D.; Stratis, J.
2004-01-01
The development of an FI analyzer for chemiluminescence detection using a low-cost photodiode is presented. The experiment clearly demonstrates, in a single interdisciplinary project, the way in which different aspects of chemical instrumentation fit together to produce a working analytical system.
Robandt, Paul P; Reda, Louis J; Klette, Kevin L
2008-10-01
A fully automated system utilizing a liquid handler and an online solid-phase extraction (SPE) device coupled with liquid chromatography-tandem mass spectrometry (LC-MS-MS) was designed to process, detect, and quantify benzoylecgonine (BZE), meta-hydroxybenzoylecgonine (m-OH BZE), para-hydroxybenzoylecgonine (p-OH BZE), and norbenzoylecgonine (nor-BZE) metabolites in human urine. The method was linear for BZE, m-OH BZE, and p-OH BZE from 1.2 to 10,000 ng/mL with limits of detection (LOD) and quantification (LOQ) of 1.2 ng/mL. Nor-BZE was linear from 5 to 10,000 ng/mL with an LOD and LOQ of 1.2 and 5 ng/mL, respectively. The intrarun precision measured as the coefficient of variation of 10 replicates of a 100 ng/mL control was less than 2.6%, and the interrun precision for 5 replicates of the same control across 8 batches was less than 4.8% for all analytes. No assay interference was noted from controls containing cocaine, cocaethylene, and ecgonine methyl ester. Excellent data concordance (R2 > 0.994) was found for direct comparison of the automated SPE-LC-MS-MS procedure and an existing gas chromatography-MS procedure using 94 human urine samples previously determined to be positive for BZE. The automated specimen handling and SPE procedure, when compared to the traditional extraction schema, eliminates the human factors of specimen handling, processing, extraction, and derivatization, thereby reducing labor costs and rework resulting from batch handling issues, and may reduce the number of fume hoods required in the laboratory.
Evaluation of a CLEIA automated assay system for the detection of a panel of tumor markers.
Falzarano, Renato; Viggiani, Valentina; Michienzi, Simona; Longo, Flavia; Tudini, Silvestra; Frati, Luigi; Anastasi, Emanuela
2013-10-01
Tumor markers are commonly used to detect relapse of disease in oncologic patients during follow-up. It is important to evaluate new assay systems for better and more precise assessment, as a standardized method is currently lacking. The aim of this study was to assess the concordance between an automated chemiluminescent enzyme immunoassay system (LUMIPULSE® G1200) and our reference methods using seven tumor markers. Serum samples from 787 subjects representing a variety of diagnoses, including oncologic, were analyzed using LUMIPULSE® G1200 and our reference methods. Serum values were measured for the following analytes: prostate-specific antigen (PSA), alpha-fetoprotein (AFP), carcinoembryonic antigen (CEA), cancer antigen 125 (CA125), carbohydrate antigen 15-3 (CA15-3), carbohydrate antigen 19-9 (CA19-9), and cytokeratin 19 fragment (CYFRA 21-1). For the determination of CEA, AFP, and PSA, an automatic analyzer based on chemiluminescence was used as the reference method. To assess CYFRA 21-1, CA125, CA19-9, and CA15-3, a manual immunoradiometric system was employed. Method comparison by Passing-Bablok analysis resulted in slopes ranging from 0.9728 to 1.9089 and correlation coefficients from 0.9335 to 0.9977. The precision of each assay was assessed by testing six serum samples. Each sample was analyzed for all tumor biomarkers in duplicate and in three different runs. The coefficients of variation were less than 6.3% and 6.2% for within-run and between-run variation, respectively. Our data suggest an overall good interassay agreement for all markers. The comparison with our reference methods showed good precision and reliability, highlighting its usefulness in routine clinical laboratory work.
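Passing-Bablok regression, used for the method comparison above, estimates the slope as a shifted median of all pairwise slopes between measurement pairs. The simplified sketch below (synthetic data, minimal handling of ties and of the odd/even case) shows the idea.

```python
# Simplified Passing-Bablok slope/intercept estimate (illustrative data).
import numpy as np

def passing_bablok(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    slopes = []
    for i in range(n - 1):
        for j in range(i + 1, n):
            dx, dy = x[j] - x[i], y[j] - y[i]
            if dx != 0 and dy / dx != -1:   # exclude undefined and -1 slopes
                slopes.append(dy / dx)
    slopes = np.sort(slopes)
    k = int(np.sum(slopes < -1))            # offset for slopes below -1
    slope = slopes[len(slopes) // 2 + k]    # shifted median (odd/even handling omitted)
    intercept = np.median(y - slope * x)
    return slope, intercept

x = np.array([1.0, 2.0, 3.5, 5.0, 7.5, 10.0])   # method A values
y = 1.05 * x + 0.1                              # method B values
print(passing_bablok(x, y))                     # slope ~1.05, intercept ~0.1
```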
Rothenhöfer, Martin; Scherübl, Rosmarie; Bernhardt, Günther; Heilmann, Jörg; Buschauer, Armin
2012-07-27
Purified oligomers of hyalobiuronic acid are indispensable tools to elucidate the physiological and pathophysiological role of hyaluronan degradation by various hyaluronidase isoenzymes. Therefore, we established and validated a novel sensitive, convenient, rapid, and cost-effective high performance thin layer chromatography (HPTLC) method for the qualitative and quantitative analysis of small saturated hyaluronan oligosaccharides consisting of 2-4 hyalobiuronic acid moieties. The use of amino-modified silica as stationary phase allows a simple reagent-free in situ derivatization by heating, resulting in a very low limit of detection (7-19 pmol per band, depending on the analyzed saturated oligosaccharide). By this derivatization procedure for the first time densitometric quantification of the analytes could be performed by HPTLC. The validated method showed a quantification limit of 37-71 pmol per band and was proven to be superior in comparison to conventional detection of hyaluronan oligosaccharides. The analytes were identified by hyphenation of normal phase planar chromatography to mass spectrometry (TLC-MS) using electrospray ionization. As an alternative to sequential techniques such as high performance liquid chromatography (HPLC) and capillary electrophoresis (CE), the validated HPTLC quantification method can easily be automated and is applicable to the analysis of multiple samples in parallel. Copyright © 2012 Elsevier B.V. All rights reserved.
Determination of technetium-99 in environmental samples: a review.
Shi, Keliang; Hou, Xiaolin; Roos, Per; Wu, Wangsuo
2012-01-04
Due to the lack of a stable technetium isotope, and the high mobility and long half-life, (99)Tc is considered to be one of the most important radionuclides in safety assessment of environmental radioactivity as well as nuclear waste management. (99)Tc is also an important tracer for oceanographic research due to the high technetium solubility in seawater as TcO(4)(-). A number of analytical methods, using chemical separation combined with radiometric and mass spectrometric measurement techniques, have been developed over the past decades for determination of (99)Tc in different environmental samples. This article summarizes and compares recently reported chemical separation procedures and measurement methods for determination of (99)Tc. Due to the extremely low concentration of (99)Tc in environmental samples, the sample preparation, pre-concentration, chemical separation and purification for removal of the interferences for detection of (99)Tc are the most important issues governing the accurate determination of (99)Tc. These aspects are discussed in detail in this article. Meanwhile, the different measurement techniques for (99)Tc are also compared with respect to advantages and drawbacks. Novel automated analytical methods for rapid determination of (99)Tc using solid extraction or ion exchange chromatography for separation of (99)Tc, employing flow injection or sequential injection approaches are also discussed. Copyright © 2011 Elsevier B.V. All rights reserved.
López-Bascón, María Asunción; Calderón-Santiago, Mónica; Priego-Capote, Feliciano
2016-11-02
A novel class of endogenous mammalian lipids endowed with antidiabetic and anti-inflammatory properties has recently been discovered. These are fatty acid esters of hydroxy fatty acids (FAHFAs), formed by condensation between a hydroxy fatty acid and a fatty acid. FAHFAs are present in human serum and tissues at low nanomolar concentrations; therefore, high-sensitivity, high-selectivity profiling of these compounds in clinical samples is in demand. An automated qualitative and quantitative method based on on-line coupling between solid phase extraction and liquid chromatography-tandem mass spectrometry has been developed here for the determination of FAHFAs in serum with the required sensitivity and selectivity. Matrix effects were evaluated by preparing calibration models in serum and methanol. Recovery factors ranged between 73.8 and 100% in serum. The within-day variability ranged from 7.1 to 13.8%, and the between-days variability from 9.3 to 21.6%, which are quite acceptable values considering the low concentration levels at which the target analytes are found. The method has been applied to a cohort of human serum samples to estimate concentration profiles as a function of glycaemic state and obesity. Statistical analysis revealed three FAHFAs with levels significantly different depending on the glycaemic state or body mass index. This automated method could be implemented in high-throughput analysis with minimum user assistance. Copyright © 2016 Elsevier B.V. All rights reserved.
Ivanov, Iliya V; Leitritz, Martin A; Norrenberg, Lars A; Völker, Michael; Dynowski, Marek; Ueffing, Marius; Dietter, Johannes
2016-02-01
Abnormalities of blood vessel anatomy, morphology, and ratio can serve as important diagnostic markers for retinal diseases such as AMD or diabetic retinopathy. Large cohort studies demand automated and quantitative image analysis of vascular abnormalities. We therefore developed an analytical software tool to enable automated, standardized classification of blood vessels supporting clinical reading. A dataset of 61 images was collected from a total of 33 women and 8 men with a median age of 38 years. The pupils were not dilated, and images were taken after dark adaptation. In contrast to current methods, in which classification is based on vessel profile intensity averages, and similar to human vision, local color contrast was chosen as a discriminator to allow artery-vein discrimination and arterial-venous ratio (AVR) calculation without vessel tracking. With 83% ± 1 (standard error of the mean) on our dataset, the best classification was achieved with weighted lightness information from a combination of the red, green, and blue channels. Tested on an independent dataset, our method reached 89% correct classification which, when benchmarked against conventional ophthalmologic classification, shows significantly improved classification scores. Our study demonstrates that vessel classification based on local color contrast can cope with inter- or intraimage lightness variability and allows consistent AVR calculation. We offer an open-source implementation of this method upon request, which can be integrated into existing tool sets and applied to general diagnostic exams.
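The two quantitative ingredients named above, a weighted lightness channel and the AVR, reduce to short computations. The weights, masks, and vessel widths in this sketch are invented and do not reproduce the paper's actual contrast measure or weighting.

```python
# Toy sketch: weighted lightness, local color contrast, and AVR (all values assumed).
import numpy as np

def weighted_lightness(rgb, w=(0.4, 0.5, 0.1)):
    """Combine R, G, B planes into one lightness image (weights hypothetical)."""
    return sum(wi * rgb[..., i] for i, wi in enumerate(w))

def local_contrast(lightness, vessel_mask, background_mask):
    """Local color contrast: vessel brightness relative to its surroundings;
    the sign/magnitude can then drive artery-vein discrimination."""
    return lightness[vessel_mask].mean() - lightness[background_mask].mean()

def avr(artery_widths, vein_widths):
    """Arterial-venous ratio from calibres of classified vessels."""
    return np.mean(artery_widths) / np.mean(vein_widths)

img = np.random.rand(64, 64, 3)        # stand-in fundus image
L = weighted_lightness(img)
print(avr([95, 100, 105], [130, 140, 135]))  # ~0.74
```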
CLEIA CA125 evidences: good analytical performance avoiding "Hook effect".
Falzarano, R; Viggiani, V; Michienzi, S; Colaprisca, B; Longo, F; Frati, L; Anastasi, E
2013-02-01
Cancer antigen 125 (CA125) is a coelomic epithelium-related antigen carried by a high molecular weight glycoprotein complex. It is commonly used as a tumor marker for ovarian cancer to monitor disease progression and response to therapy, and for early detection of recurrence after treatment. The aim of this study was to test the reliability of two different assay methods, a radioimmunometric assay (RIA) and an automated chemiluminescent enzyme immunoassay (CLEIA) system, by measuring CA125 serum levels with both methods in 357 patients and comparing the results. Patients were recruited from Oncologic Unit A, Policlinico Umberto I, Roma. Eighty-six were healthy donors, while 271 were oncologic patients representing a variety of diagnoses. Within this group, 76 patients were diagnosed with an ovarian-related pathology (28 cancerous and 48 benign). The evaluation of CA125 blood levels showed high agreement in the healthy donor group (R² = 0.9003). Interesting results emerged when sera collected from oncologic patients were assessed: significant differences between the two assays were found in nine samples. When assayed again with RIA after dilution, the new values agreed with the undiluted CLEIA values (R² = 0.9847). Our data suggest an overall good comparison between the two methods. However, some artifacts were obtained with RIA, indicating an underlying "hook effect". The CLEIA automated assay showed good reliability and should be preferred to one-step radioimmunoassays in order to minimize errors.
van der Slegt, Jasper; Verbogt, Nathalie Pa; Mulder, Paul Gh; Steunenberg, Stijn L; Steunenberg, Bastiaan E; van der Laan, Lijckle
2016-10-01
An automated ankle-brachial index device could lead to potential time savings and greater accuracy in ankle-brachial index determination after vascular surgery. This prospective cross-sectional study compared postprocedural ankle-brachial indices measured by a manual method with those of an automated plethysmographic method. Forty-two patients were included. No significant difference in the time to perform a measurement was observed (1.1 min, 95% CI: -0.2 to +2.4; P = 0.095). The mean ankle-brachial index with the automated method was 0.105 higher (95% CI: 0.017 to 0.193; P = 0.020) than with the manual method, with limits of agreement of -0.376 and +0.587. Total variance amounted to 0.0759 and the correlation between the two methods was 0.60. Reliability, expressed as the maximum absolute difference (95% level) between duplicate ankle-brachial index measurements under identical conditions, was 0.350 (manual) and 0.152 (automated), although not significant (p = 0.053). Finally, the automated method had a failure rate 34 percentage points higher than the manual method. In conclusion, based on this study, the automated ankle-brachial index method does not seem clinically applicable for measuring the ankle-brachial index postoperatively in patients with vascular disease. © The Author(s) 2016.
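The agreement statistics quoted above (mean difference and limits of agreement) follow the standard Bland-Altman recipe; the paired ABI values below are invented for illustration.

```python
# Bland-Altman mean difference and 95% limits of agreement (invented ABI pairs).
import numpy as np

manual = np.array([0.85, 0.92, 1.05, 0.78, 1.10, 0.95])
automated = np.array([0.95, 1.00, 1.12, 0.95, 1.20, 1.08])

diff = automated - manual
mean_diff = diff.mean()
sd = diff.std(ddof=1)
loa = (mean_diff - 1.96 * sd, mean_diff + 1.96 * sd)  # 95% limits of agreement
print(f"mean difference {mean_diff:.3f}, "
      f"limits of agreement {loa[0]:.3f} to {loa[1]:.3f}")
```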
Xie, Rui; Wen, Jun; Wei, Hua; Fan, Guorong; Zhang, Dabing
2010-05-01
An automated system using on-line solid-phase extraction and HPLC with UV detection was developed for the determination of faropenem in human plasma and urine. The analytical process was performed isocratically with two reversed-phase columns connected by a switching valve. After simple pretreatment of plasma and urine with acetonitrile, a volume of 100 µl of the upper layer of the plasma or urine samples was injected for on-line SPE column-switching HPLC-UV analysis. The analytes were retained on the self-made trap column (Lichrospher C18, 4.6 mm × 37 mm, 25 µm) with the loading solvent (20 mM NaH2PO4, adjusted to pH 3.5) at a flow rate of 2 ml/min, and most matrix materials were removed from the column to waste. After 0.5 min of washing, the valve was switched to another position so that the target analytes could be eluted from the trap column to the analytical column in back-flush mode by the mobile phase (acetonitrile-20 mM NaH2PO4 adjusted to pH 3.5, 16:84, v/v) at a flow rate of 1.5 ml/min, and then separated on the analytical column (Ultimate XB-C18, 4.6 mm × 50 mm, 5 µm). The complete cycle of on-line SPE preconcentration, purification, and HPLC separation of the analytes was 5 min. Calibration curves with good linearity (r=0.9994 for plasma and r=0.9988 for urine) were obtained in the range 0.02-5 µg/ml in plasma and 0.05-10 µg/ml in urine for faropenem. The optimized method showed good performance in terms of specificity, linearity, detection and quantification limits, precision and accuracy. The method was successfully used to quantify faropenem in human plasma and urine to support clinical pharmacokinetic studies. Copyright 2009 Elsevier B.V. All rights reserved.
Ferreira, Vicente; Herrero, Paula; Zapata, Julián; Escudero, Ana
2015-08-14
SPME is extremely sensitive to experimental parameters affecting liquid-gas and gas-solid distribution coefficients. Our aims were to measure the weights of these factors and to design a multivariate strategy, based on the addition of a pool of internal standards, to minimize matrix effects. Synthetic but realistic wines containing selected analytes and variable amounts of ethanol, non-volatile constituents and major volatile compounds were prepared following a factorial design. The ANOVA study revealed that, even with strong matrix dilution, matrix effects are important and additive with non-significant interaction effects, and that the presence of major volatile constituents is the dominant factor. A single internal standard provided a robust calibration for 15 out of 47 analytes. Two different multivariate calibration strategies based on Partial Least Squares Regression were then run in order to build calibration functions based on 13 different internal standards able to cope with matrix effects. The first is based on the calculation of Multivariate Internal Standards (MIS), linear combinations of the normalized signals of the 13 internal standards, which provide the expected area of a given unit of analyte present in each sample. The second strategy is a direct calibration relating concentration to the 13 relative areas measured in each sample for each analyte. Overall, 47 different compounds can be reliably quantified in a single fully automated method with overall uncertainties better than 15%. Copyright © 2015 Elsevier B.V. All rights reserved.
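The second calibration strategy, regressing concentration directly on the 13 relative areas, can be sketched with a PLS model; the data here are synthetic placeholders and the component count is an assumption.

```python
# Sketch of a direct PLS calibration on relative areas vs. 13 internal standards
# (synthetic data; component count assumed).
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
n_samples, n_is = 40, 13
relative_areas = rng.normal(1.0, 0.1, size=(n_samples, n_is))  # analyte area / IS area
conc = relative_areas @ rng.normal(0.5, 0.1, n_is) + rng.normal(0, 0.05, n_samples)

pls = PLSRegression(n_components=4)
pls.fit(relative_areas, conc)
predicted = pls.predict(relative_areas).ravel()
rmse = np.sqrt(np.mean((predicted - conc) ** 2))
print(f"calibration RMSE: {rmse:.3f}")
```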
Jiang, Xiaogang; Feng, Shun; Tian, Ruijun; Han, Guanghui; Jiang, Xinning; Ye, Mingliang; Zou, Hanfa
2007-02-01
An approach was developed to automate sample introduction for nanoflow LC-MS/MS (µLC-MS/MS) analysis using a strong cation exchange (SCX) trap column. The system consisted of a 100 µm i.d. × 2 cm SCX trap column and a 75 µm i.d. × 12 cm C18 RP analytical column. During the sample loading step, the flow passing through the SCX trap column was directed to waste, allowing a large sample volume to be loaded at high flow rate. The peptides bound on the SCX trap column were then eluted onto the RP analytical column by a high-salt buffer, followed by RP chromatographic separation of the peptides at nanoliter flow rates. Higher separation performance was achieved with the SCX trap column than with a C18 trap column. The high proteomic coverage of this approach was demonstrated in the analysis of tryptic digests of BSA and yeast cell lysate. In addition, the system was applied to two-dimensional separation of a tryptic digest of the human hepatocellular carcinoma cell line SMMC-7721 for large-scale proteome analysis. The system was fully automated and required minimal changes to a current µLC-MS/MS system, representing a promising platform for routine proteome analysis.
Podevin, Michael; Fotidis, Ioannis A; Angelidaki, Irini
2018-08-01
Microalgae are well known for their ability to accumulate lipids intracellularly, which can be used for biofuels and to mitigate CO2 emissions. However, due to economic challenges, microalgae bioprocesses have maneuvered towards the simultaneous production of food, feed, fuel, and various high-value chemicals in a biorefinery concept. On-line and in-line monitoring of macromolecules such as lipids, proteins, carbohydrates, and high-value pigments will be more critical to maintain product quality and consistency for downstream processing in a biorefinery and to maintain and valorize these markets. The main contribution of this review is to present current and prospective advances of on-line and in-line process analytical technology (PAT) with high selectivity - the capability of monitoring several analytes simultaneously - in the interest of improving product quality, productivity, and process automation of a microalgal biorefinery. The high-selectivity PAT under consideration are mid-infrared (MIR), near-infrared (NIR), and Raman vibrational spectroscopies. The current review contains a critical assessment of these technologies in the context of recent advances in software and hardware in order to move microalgae production towards process automation through multivariate process control (MVPC) and software sensors trained on "big data". The paper also includes a comprehensive overview of off-line implementations of vibrational spectroscopy in microalgal research as it pertains to spectral interpretation and process automation to aid and motivate development.
Parra, Marina; Foj, Laura; Filella, Xavier
2016-07-01
Because of its potential value in several pathologies, clinical interest in 25-hydroxy vitamin D (25OH-D) is increasing. However, the standardisation of assays remains a significant problem. Our aim was to evaluate the performance of the novel Lumipulse G 25-OH Vitamin D assay (Fujirebio), comparing results with the Liaison (Diasorin) method. Analytical verification of the Lumipulse G 25-OH Vitamin D assay was performed. Both methods were compared using sera from 226 patients, including 111 patients with chronic renal failure (39 on haemodialysis) and 115 patients without renal failure. In addition, clinical concordance between the assays was assessed. For the Lumipulse G 25-OH Vitamin D assay, the limit of detection was 0.3 ng/mL, and the limit of quantification was 2.5 ng/mL with a coefficient of variation of 9.7%. Intra- and inter-assay coefficients of variation were <2.3% and <1.8% (25.4-50.0 ng/mL), respectively. Dilution linearity was in the range of 4.5-144.5 ng/mL. Method comparison resulted in a mean difference of -6.5% (95% CI from -8.8 to -4.1) between Liaison and Lumipulse G for all samples. Clinical concordance assessed by kappa index was 0.66. Lumipulse G 25-OH Vitamin D showed good clinical concordance with the Liaison assay, although overall results measured with Lumipulse were higher by an average of 6.5%.
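The clinical concordance figure above is a Cohen's kappa; a minimal computation from a 2×2 agreement table is shown below, with illustrative counts that are not the study's actual cross-tabulation.

```python
# Cohen's kappa from a 2x2 agreement table (counts are illustrative only).
import numpy as np

# rows: Liaison deficient / sufficient; cols: Lumipulse deficient / sufficient
table = np.array([[80, 15],
                  [20, 111]], dtype=float)
n = table.sum()
po = np.trace(table) / n                          # observed agreement
pe = (table.sum(1) * table.sum(0)).sum() / n**2   # agreement expected by chance
kappa = (po - pe) / (1 - pe)
print(f"kappa = {kappa:.2f}")
```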
Cao, Jianmin; Sun, Na; Yu, Weisong; Pang, Xueli; Lin, Yingnan; Kong, Fanyu; Qiu, Jun
2016-12-01
A sensitive and robust multiresidue method for the simultaneous analysis of 114 pesticides in tobacco was developed based on solid-phase extraction coupled with gas chromatography and tandem mass spectrometry. In this strategy, tobacco samples were extracted with acetonitrile and cleaned up with a multilayer solid-phase extraction cartridge (Cleanert TPT) using acetonitrile/toluene (3:1) as the elution solvent. Two internal standards of different polarity were used to meet simultaneous pesticide quantification demands in the tobacco matrix. Satisfactory linearity in the range of 10-500 ng/mL was obtained for all 114 pesticides, with linear regression coefficients higher than 0.994. The limit of detection and limit of quantification values were 0.02-5.27 and 0.06-17.6 ng/g, respectively. For most of the pesticides, acceptable recoveries in the range of 70-120% and repeatabilities (relative standard deviation) of <11% were achieved at spiking levels of 20, 100, and 400 ng/g. Compared with previously reported multiresidue analytical methods, the proposed method provided a cleaner test solution with smaller amounts of pigments, fatty acids, and other undesirable interferences. Its high sensitivity, high selectivity, easy automation, and high throughput mean that the developed and validated method can be successfully used for the determination of pesticides in tobacco samples. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
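Detection and quantification limits of the kind quoted above are commonly estimated from the calibration line as 3.3·σ/S and 10·σ/S, where σ is the residual standard deviation and S the slope. A minimal sketch for one pesticide, with invented calibration points used purely for illustration:

```python
import numpy as np

# Hypothetical internal-standard-normalized calibration, 10-500 ng/mL
conc = np.array([10, 25, 50, 100, 250, 500], dtype=float)       # ng/mL
resp = np.array([0.021, 0.049, 0.104, 0.198, 0.51, 1.002])      # area ratio

slope, intercept = np.polyfit(conc, resp, 1)
residuals = resp - (slope * conc + intercept)
sigma = residuals.std(ddof=2)        # residual SD, 2 fitted parameters

r = np.corrcoef(conc, resp)[0, 1]
print(f"r = {r:.4f}")                # linearity check (study required >0.994)
print(f"LOD = {3.3 * sigma / slope:.2f} ng/mL")
print(f"LOQ = {10  * sigma / slope:.2f} ng/mL")
```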
Toledano, R M; Díaz-Plaza, E M; Cortes, J M; Aragón, A; Vázquez, A M; Villén, J; Muñoz-Guerra, J
2014-11-28
Boldenone (Bo), androsta-1,4-dien-17β-ol-3-one, is an anabolic androgenic steroid not clinically approved for human use. Despite this, many cases are reported every year of athletes testing positive for Bo or its main metabolite, 5β-androst-1-en-17β-ol-3-one (BoM). Recently, the capability of different human intestinal bacteria to produce enzymes able to convert endogenous steroids into Bo has been demonstrated. When a urinary concentration of Bo and/or BoM between 5 and 30 ng/mL is measured, a complementary analysis by gas chromatography combustion isotope ratio mass spectrometry (GC-C-IRMS) must be carried out to discriminate between an endogenous and an exogenous origin. In the present work, a novel analytical method that couples LC-GC by means of the TOTAD interface with C-IRMS is described. The method is based on a first RPLC separation of unacetylated steroids, followed by acetylation and automated on-line LC-GC-C-IRMS, which includes a second RPLC clean-up of acetylated Bo and BoM, isolation of the two fractions in a fraction collector, and their consecutive analysis by GC-C-IRMS. The method has been applied to the analysis of urine samples fortified at 5 and 10 ng/mL, where it showed good performance. Copyright © 2014 Elsevier B.V. All rights reserved.
Resin transfer molding of textile preforms for aircraft structural applications
NASA Technical Reports Server (NTRS)
Hasko, Gregory H.; Dexter, H. Benson; Weideman, Mark H.
1992-01-01
NASA LaRC is conducting and supporting research to develop cost-effective fabrication methods that are applicable to primary composite aircraft structures. One of the most promising fabrication methods that has evolved is resin transfer molding (RTM) of dry textile material forms. RTM has been used for many years for secondary structures, but has received increased emphasis because it is an excellent method for applying resin to damage-tolerant textile preforms at low cost. Textile preforms based on processes such as weaving, braiding, knitting, stitching, and combinations of these have been shown to offer significant improvements in damage tolerance compared to laminated tape composites. The use of low-cost resins combined with textile preforms could provide a major breakthrough in achieving cost-effective composite aircraft structures. RTM uses resin in its lowest cost form, and storage and spoilage costs are minimal. Near-net-shape textile preforms are expected to be cost-effective because automated machines can be used to produce the preforms, post-cure operations such as machining and fastening are minimized, and the material scrap rate may be reduced in comparison with traditional prepreg molding. The purpose of this paper is to discuss experimental and analytical techniques that are under development at NASA Langley to aid the engineer in developing RTM processes for airframe structural elements. Included are experimental techniques to characterize preform and resin behavior, and analytical methods that were developed to predict resin flow and cure kinetics.
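Resin flow predictions of the kind mentioned above typically start from Darcy's law; for one-dimensional filling of a preform at constant injection pressure, integrating Darcy's law gives a fill time of t = φμL²/(2KΔP). A minimal sketch with generic textbook property magnitudes, not NASA data:

```python
# 1D RTM mold-filling estimate from Darcy's law at constant injection pressure:
#   t_fill = (phi * mu * L^2) / (2 * K * dP)
# All property values below are representative placeholders for illustration.

phi = 0.45        # preform porosity (-)
mu  = 0.25        # resin viscosity (Pa*s)
L   = 0.5         # flow length (m)
K   = 1e-10       # preform permeability (m^2)
dP  = 3e5         # injection pressure differential (Pa)

t_fill = phi * mu * L**2 / (2 * K * dP)
print(f"estimated fill time: {t_fill:.0f} s ({t_fill/60:.1f} min)")
```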
USDA-ARS's Scientific Manuscript database
With rapid advances in DNA sequencing, phenotyping has become the rate-limiting step in using large-scale genomic data to understand and improve agricultural crops. Here, the Bellwether Phenotyping platform for controlled-environment plant growth and automated, multimodal phenotyping is described. T...
Total organic carbon (TOC) and dissolved organic carbon (DOC) have long been used to estimate the amount of natural organic matter (NOM) found in raw and finished drinking water. In recent years, computer automation and improved instrumental analysis technologies have created a ...
USDA-ARS's Scientific Manuscript database
Automated sensing of macronutrients in hydroponic solution would allow more efficient management of nutrients for crop growth in closed hydroponic systems. Ion-selective microelectrode technology requires an ion-selective membrane or a solid metal material that responds selectively to one analyte in...
Transaction-Level Learning Analytics in Online Authentic Assessments
ERIC Educational Resources Information Center
Nyland, Rob; Davies, Randall S.; Chapman, John; Allen, Gove
2017-01-01
This paper presents a case for the use of transaction-level data when analyzing automated online assessment results to identify knowledge gaps and misconceptions for individual students. Transaction-level data, which records all of the steps a student uses to complete an assessment item, are preferred over traditional assessment formats that…
Ali, M A; Ahsan, Z; Amin, M; Latif, S; Ayyaz, A; Ayyaz, M N
2016-05-01
Globally, disease surveillance systems are playing a significant role in outbreak detection and response management for infectious diseases (IDs). However, in developing countries like Pakistan, epidemic outbreaks are difficult to detect due to the scarcity of public health data and the absence of automated surveillance systems. Our research is intended to formulate an integrated service-oriented visual analytics architecture for ID surveillance, identify its key constituents, and set up a baseline for easy reproducibility of such systems in the future. This research focuses on the development of ID-Viewer, a visual analytics decision support system for ID surveillance. It is a blend of intelligent approaches that make use of real-time streaming data from Emergency Departments (EDs) for early outbreak detection, health care resource allocation, and epidemic response management. We have developed a robust service-oriented visual analytics architecture for ID surveillance, which provides automated mechanisms for ID data acquisition, outbreak detection, and epidemic response management. Classification of chief complaints is accomplished using a dynamic classification module, which employs neural networks and fuzzy logic to categorize syndromes. Standard routines from the Centers for Disease Control and Prevention (CDC), i.e., c1-c3 (c1-mild, c2-medium, and c3-ultra), and spatial scan statistics are employed for the detection of temporal and spatio-temporal disease outbreaks, respectively. Prediction of imminent disease threats is accomplished using support vector regression for early warnings and response planning. Geographical visual analytics displays were developed that allow interactive visualization of syndromic clusters, monitoring of disease spread patterns, and identification of spatio-temporal risk zones. We analysed the performance of the surveillance framework using ID data for the years 2011-2015. The dynamic syndromic classifier assigns chief complaints to the appropriate syndromes with high classification accuracy. The outbreak detection methods detect ID outbreaks at the start of the epidemic periods. The prediction model forecasts the dengue trend 20 weeks ahead with a nominal normalized root mean square error of 0.29. Interactive geo-spatiotemporal displays, i.e., heat maps and choropleth maps, are shown in the respective sections. The proposed framework will set a standard and provide the necessary details for future implementations of such systems in resource-constrained regions. It will improve early detection of outbreaks attributable to natural and man-made biological threats, monitor spatio-temporal epidemic trends, and provide assurance that an outbreak has or has not occurred. The advanced analytics features will be beneficial for the timely organization and formulation of health management policies, disease control activities, and efficient health care resource allocation. Copyright © 2016 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
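The CDC C1-C3 family of aberration-detection routines referenced above flags a day as aberrant when its count exceeds the mean of a recent baseline window by roughly three standard deviations. A minimal C1-style sketch on a synthetic daily syndrome count series; the 7-day window and 3-sigma threshold follow the common C1 convention, and details of the ID-Viewer implementation may differ:

```python
import numpy as np

def ears_c1(counts, baseline=7, threshold=3.0):
    """Flag day t if count[t] > mean + threshold*std of the previous `baseline` days."""
    counts = np.asarray(counts, dtype=float)
    flags = np.zeros(len(counts), dtype=bool)
    for t in range(baseline, len(counts)):
        window = counts[t - baseline:t]
        mu, sd = window.mean(), max(window.std(ddof=1), 1e-9)  # guard zero SD
        flags[t] = counts[t] > mu + threshold * sd
    return flags

rng = np.random.default_rng(2)
daily = rng.poisson(12, size=60)      # synthetic endemic baseline
daily[45:50] += 25                    # injected outbreak
print(np.nonzero(ears_c1(daily))[0])  # days flagged as aberrant
```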
A multisyringe flow-based system for kinetic-catalytic determination of cobalt(II).
Chaparro, Laura; Ferrer, Laura; Leal, Luz; Cerdà, Víctor
2015-02-01
A kinetic-catalytic method for cobalt determination, based on the catalytic effect of cobalt(II) on the oxidative coupling of 1,2-dihydroxyanthraquinone (alizarin), was automated exploiting multisyringe flow injection analysis (MSFIA). The proposed method was performed at pH 9.2, resulting in a discoloration process in the presence of hydrogen peroxide. The fixed-time approach was employed for analytical signal measurement. Spectrophotometric detection was performed with a liquid waveguide capillary cell (LWCC) of 1 m optical path length at 465 nm. The optimization was carried out by a multivariate approach, reaching critical values of 124 µmol L⁻¹ and 0.22 mol L⁻¹ for alizarin and hydrogen peroxide, respectively, and a reagent temperature of 67 °C. A sample volume of 150 µL was used, allowing a sampling rate of 30 h⁻¹. Under optimal conditions, the calibration curve was linear in the range of 1-200 µg L⁻¹ Co, achieving a detection limit of 0.3 µg L⁻¹ Co. The repeatability, expressed as relative standard deviation (RSD), was lower than 1%. The proposed analytical procedure was applied to the determination of cobalt in cobalt gluconate and in different forms of vitamin B12, cyanocobalamin and hydroxocobalamin, with successful results showing recoveries around 95%. Copyright © 2014 Elsevier B.V. All rights reserved.
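In the fixed-time approach used here, the analytical signal is the absorbance change over a fixed interval after mixing, which for a catalytic reaction scales with catalyst concentration. A minimal sketch of signal extraction and linear calibration; the kinetic trace, time window, and calibration values are invented for illustration:

```python
import numpy as np

def fixed_time_signal(absorbance, t, t0=10.0, t1=70.0):
    """Fixed-time kinetic signal: absorbance drop between t0 and t1 seconds."""
    return np.interp(t0, t, absorbance) - np.interp(t1, t, absorbance)

# Synthetic kinetic trace: alizarin discoloration, absorbance decays with time
t = np.linspace(0.0, 90.0, 181)                       # s
trace = 0.85 * np.exp(-0.012 * t)                     # invented decay curve
print(f"signal for this trace: {fixed_time_signal(trace, t):.3f}")

# Invented calibration: Co(II) standards vs. their fixed-time signals
co_std = np.array([1.0, 10.0, 50.0, 100.0, 200.0])    # ug/L
signal = np.array([0.004, 0.041, 0.20, 0.41, 0.80])   # delta-A at 465 nm

slope, intercept = np.polyfit(co_std, signal, 1)
unknown = 0.12                                        # delta-A of an unknown
print(f"Co(II) = {(unknown - intercept) / slope:.1f} ug/L")
```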
Taoka, Masato; Yamauchi, Yoshio; Nobe, Yuko; Masaki, Shunpei; Nakayama, Hiroshi; Ishikawa, Hideaki; Takahashi, Nobuhiro; Isobe, Toshiaki
2009-11-01
We describe here a mass spectrometry (MS)-based analytical platform for RNA, which combines direct nano-flow reversed-phase liquid chromatography (RPLC) on a spray tip column with a high-resolution LTQ-Orbitrap mass spectrometer. Operating RPLC at a very low flow rate with volatile solvents and MS in the negative mode, we could estimate mass values with accuracy sufficient to predict the nucleotide composition of an approximately 21-nucleotide small interfering RNA, detect post-transcriptional modifications in yeast tRNA, and perform collision-induced dissociation/tandem MS-based structural analysis of nucleolytic fragments of RNA at a sub-femtomole level. Importantly, the method allowed the identification and chemical analysis of small RNAs in ribonucleoprotein (RNP) complexes, such as the pre-spliceosomal RNP complex, which was pulled down from cultured cells with a tagged protein cofactor as bait. We have recently developed a unique genome-oriented database search engine, Ariadne, which allows tandem MS-based identification of RNAs in biological samples. Thus, the method presented here has broad potential for automated analysis of RNA; it complements conventional molecular biology-based techniques and is particularly suited for simultaneous analysis of the composition, structure, interaction, and dynamics of RNA and protein components in various cellular RNP complexes.
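Predicting a nucleotide composition from an accurate mass amounts to searching the small space of A/C/G/U counts for compositions whose summed residue masses match the measurement within the instrument's tolerance. A minimal brute-force sketch, assuming standard monoisotopic residue masses, a 5'-OH/3'-OH oligonucleotide (so one water is added), and a 5 ppm tolerance; these assumptions are illustrative and must be matched to the actual terminal chemistry:

```python
from itertools import product

# Standard monoisotopic masses of RNA nucleotide residues (Da) and water.
# A 5'-OH/3'-OH oligo is assumed; a 5'-phosphate would add ~79.96633 Da.
RESIDUE = {"A": 329.05252, "C": 305.04129, "G": 345.04744, "U": 306.02530}
H2O = 18.01056

def compositions(target_mass, length=21, tol_ppm=5.0):
    """Brute-force all A/C/G/U compositions of a given length within tolerance."""
    hits = []
    for a, c, g in product(range(length + 1), repeat=3):
        u = length - a - c - g
        if u < 0:
            continue
        mass = (a * RESIDUE["A"] + c * RESIDUE["C"]
                + g * RESIDUE["G"] + u * RESIDUE["U"] + H2O)
        if abs(mass - target_mass) / target_mass * 1e6 <= tol_ppm:
            hits.append((a, c, g, u, mass))
    return hits

# Example: target computed for a hypothetical composition A5 C6 G5 U5
target = 5*RESIDUE["A"] + 6*RESIDUE["C"] + 5*RESIDUE["G"] + 5*RESIDUE["U"] + H2O
for a, c, g, u, m in compositions(target):
    print(f"A{a} C{c} G{g} U{u}  {m:.4f} Da")
```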
Focant, Jean-François; Eppe, Gauthier; Massart, Anne-Cécile; Scholl, Georges; Pirard, Catherine; De Pauw, Edwin
2006-10-13
We report on the use of a state-of-the-art method for the measurement of selected polychlorinated dibenzo-p-dioxins, polychlorinated dibenzofurans, and polychlorinated biphenyls in human serum specimens. The sample preparation procedure is based on manual small-size solid-phase extraction (SPE) followed by automated clean-up and fractionation using multi-sorbent liquid chromatography columns. SPE cartridges and all clean-up columns are disposable. Samples are processed in batches of 20 units, including one blank control (BC) sample and one quality control (QC) sample. The analytical measurement is performed using gas chromatography coupled to isotope dilution high-resolution mass spectrometry. Once the procedure has been started and series of samples continue to be produced, the sample throughput corresponds to one series of 20 samples per day, from sample reception to data quality cross-check and reporting. Four analysts are required to ensure proper performance of the procedure. The entire procedure has been validated under International Organization for Standardization (ISO) 17025 criteria and further tested on more than 1500 unknown samples during various epidemiological studies. The method is further discussed in terms of reproducibility, efficiency, and long-term stability with regard to the 35 target analytes. Data related to quality control and limit of quantification (LOQ) calculations are also presented and discussed.
Taylor, Carl; Lough, Fraser; Stanforth, Stephen P; Schwalbe, Edward C; Fowlis, Ian A; Dean, John R
2017-07-01
Listeria monocytogenes is a Gram-positive bacterium and an opportunistic food-borne pathogen which poses a significant risk to immunocompromised and pregnant individuals, due to the increased likelihood of acquiring infection and the potential transmission of infection to the unborn child. Conventional methods of analysis suffer from long turn-around times or lack the ability to reliably discriminate between Listeria spp. This paper investigates an alternative method of detecting Listeria spp. using two novel enzyme substrates that liberate exogenous volatile organic compounds in the presence of α-mannosidase and D-alanyl aminopeptidase. The discriminating capabilities of this approach for distinguishing L. monocytogenes from other species of Listeria are investigated. The liberated volatile organic compounds (VOCs) are detected using an automated analytical technique based on static headspace-multi-capillary column-gas chromatography-ion mobility spectrometry (SHS-MCC-GC-IMS). The results obtained by SHS-MCC-GC-IMS are compared with those obtained by the more conventional analytical technique of headspace-solid-phase microextraction-gas chromatography-mass spectrometry (HS-SPME-GC-MS). The results show that it was possible to differentiate between L. monocytogenes and L. ivanovii based on their VOC response from α-mannosidase activity.
NASA Astrophysics Data System (ADS)
Boldyreff, Anton S.; Bespalov, Dmitry A.; Adzhiev, Anatoly Kh.
2017-05-01
Methods of artificial intelligence are a good solution for forecasting weather phenomena, since they make it possible to process large amounts of diverse data. In this paper, recirculation neural networks are implemented in a system for predicting thunderstorm events. Large amounts of experimental data from lightning sensor and electric field mill networks were received and analyzed, and the average recognition accuracy for the sensor signals is calculated. It is shown that recirculation neural networks are a promising solution for forecasting thunderstorms and related weather phenomena: they recognize elements of the sensor signals with high efficiency, and they can compress images and highlight their characteristic features for subsequent recognition.
Modelling and temporal performances evaluation of networked control systems using (max, +) algebra
NASA Astrophysics Data System (ADS)
Ammour, R.; Amari, S.
2015-01-01
In this paper, we address the problem of temporal performance evaluation of producer/consumer networked control systems. The aim is to develop a formal method for evaluating the response time of this type of control system. Our approach consists of modelling, using Petri net classes, the behaviour of the whole architecture, including the switches that support the multicast communications used by this protocol. The (max, +) algebra formalism is then exploited to obtain analytical formulas for the response time and its maximal and minimal bounds. The main novelty is that our approach takes into account all delays experienced at the different stages of networked automation systems. Finally, we show how to apply the obtained results through an example of a networked control system.
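In (max, +) algebra, addition is replaced by max and multiplication by ordinary addition, so the dater equations of a timed event graph take the linear form x(k+1) = A ⊗ x(k). A minimal sketch of the ⊗ matrix product and the firing-date recursion; the matrix entries are illustrative delays, not values from the paper's case study:

```python
import numpy as np

EPS = -np.inf   # the (max,+) "zero": neutral for max, absorbing for +

def mp_matmul(A, B):
    """(max,+) matrix product: (A (x) B)[i,j] = max_k (A[i,k] + B[k,j])."""
    n, m = A.shape[0], B.shape[1]
    C = np.full((n, m), EPS)
    for i in range(n):
        for j in range(m):
            C[i, j] = np.max(A[i, :] + B[:, j])
    return C

# Illustrative 2-transition timed event graph: entries are delays (e.g. ms)
A = np.array([[3.0, 7.0],
              [EPS, 4.0]])
x = np.array([[0.0], [0.0]])   # initial firing dates

for k in range(3):             # successive firing dates x(k+1) = A (x) x(k)
    x = mp_matmul(A, x)
    print(f"x({k+1}) =", x.ravel())
```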
Holographic interferometry of oil films and droplets in water with a single-beam mirror-type scheme.
Kukhtarev, Nickolai; Kukhtareva, Tatiana; Gallegos, Sonia C
2011-03-01
The application of single-beam reflective laser optical interferometry to the detection and characterization of oil films and droplets in water is discussed. Oil films can be detected by the appearance of characteristic interference patterns. Analytical expressions describing the intensity distribution in these interference patterns allow determination of oil film thickness, the size of oil droplets, and the distance from the observation plane to the oil film. Results from these analyses indicate that oil spill aging and breakup can be monitored in real time by analyzing time-dependent holographic fringe patterns. Interferometric methods of oil spill detection and characterization can be automated using digital holography with three-dimensional reconstruction of the time-changing oil spill topography. In this effort, the interferometric methods were applied to samples of Chevron oil and of British Petroleum MC252 oil obtained during the Deepwater Horizon oil spill in the Gulf of Mexico. © 2011 Optical Society of America
NASA Technical Reports Server (NTRS)
Rickman, Doug; Shire, J.; Qualters, J.; Mitchell, K.; Pollard, S.; Rao, R.; Kajumba, N.; Quattrochi, D.; Estes, M., Jr.; Meyer, P.;
2009-01-01
Objectives. To provide an overview of four environmental public health surveillance projects developed by CDC and its partners for the Health and Environment Linked for Information Exchange, Atlanta (HELIX-Atlanta) and to illustrate common issues and challenges encountered in developing an environmental public health tracking system. Methods. HELIX-Atlanta, initiated in October 2003 to develop data linkage and analysis methods that can be used by the National Environmental Public Health Tracking Network (Tracking Network), conducted four projects. We highlight the projects' work, assess attainment of the HELIX-Atlanta goals, and discuss three surveillance attributes. Results. Among the major challenges was the complexity of the analytic issues, which required multidisciplinary teams with technical expertise; this expertise and the data resided across multiple organizations. Conclusions. Establishing formal procedures for sharing data, defining data analysis standards and automating analyses, and committing staff with appropriate expertise are needed to support wide implementation of environmental public health tracking.
Optimization and automation of quantitative NMR data extraction.
Bernstein, Michael A; Sýkora, Stan; Peng, Chen; Barba, Agustín; Cobas, Carlos
2013-06-18
NMR is routinely used to quantitate chemical species. The necessary experimental procedures for acquiring quantitative data are well known, but relatively little attention has been paid to data processing and analysis. We describe here a robust expert system that can be used to automatically choose the best signals in a sample for overall concentration determination and to determine analyte concentration using all accepted methods. The algorithm is based on complete deconvolution of the spectrum, which makes it tolerant of cases where signals are very close to one another, and it includes robust methods for the automatic classification of NMR resonances and molecule-to-spectrum multiplet assignments. With the functionality in place and optimized, it is then a relatively simple matter to apply the same workflow to data in a fully automatic way. The procedure is desirable for both its inherent performance and its applicability to NMR data acquired for very large sample sets.
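The core arithmetic behind NMR quantitation is that integrated signal area per contributing nucleus is proportional to molar concentration, so an analyte can be quantified against an internal standard as c_a = c_std · (I_a/I_std) · (N_std/N_a). A minimal sketch of just this step, with invented numbers; the expert system described above additionally automates deconvolution and signal selection:

```python
def qnmr_concentration(I_analyte, n_analyte, I_std, n_std, c_std):
    """Analyte concentration from integrals relative to an internal standard.

    I_*  : integrated peak areas
    n_*  : number of nuclei contributing to each integrated signal
    c_std: known standard concentration (same units as the result)
    """
    return c_std * (I_analyte / I_std) * (n_std / n_analyte)

# Example: analyte CH3 singlet (3H) vs. a 2H standard signal at 10.0 mM
print(qnmr_concentration(I_analyte=1.85, n_analyte=3,
                         I_std=1.00, n_std=2, c_std=10.0))  # ~12.3 mM
```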
Quantitative Analysis and Stability of the Rodenticide TETS ...
Journal Article. The determination of the rodenticide tetramethylenedisulfotetramine (TETS) in drinking water is reported, using automated sample preparation via solid-phase extraction and detection by isotope dilution gas chromatography-mass spectrometry. The method was characterized over twenty-two analytical batches with quality control samples. Accuracies for the low- and high-concentration quality control pools were 100 and 101%, respectively. The minimum reporting level (MRL) for TETS in this method is 0.50 µg/L. Five drinking waters representing a range of water quality parameters and disinfection practices were fortified with TETS at ten times the MRL and analyzed over a 28-day period to determine the stability of TETS in these waters. The amount of TETS measured in these samples averaged 100 ± 6% of the amount fortified, suggesting that tap water samples may be held for up to 28 days prior to analysis.
Automated measurement and monitoring of bioprocesses: key elements of the M(3)C strategy.
Sonnleitner, Bernhard
2013-01-01
The state-of-routine monitoring items established in the bioprocess industry as well as some important state-of-the-art methods are briefly described and the potential pitfalls discussed. Among those are physical and chemical variables such as temperature, pressure, weight, volume, mass and volumetric flow rates, pH, redox potential, gas partial pressures in the liquid and molar fractions in the gas phase, infrared spectral analysis of the liquid phase, and calorimetry over an entire reactor. Classical as well as new optical versions are addressed. Biomass and bio-activity monitoring (as opposed to "measurement") via turbidity, permittivity, in situ microscopy, and fluorescence are critically analyzed. Some new(er) instrumental analytical tools, interfaced to bioprocesses, are explained. Among those are chromatographic methods, mass spectrometry, flow and sequential injection analyses, field flow fractionation, capillary electrophoresis, and flow cytometry. This chapter surveys the principles of monitoring rather than compiling instruments.
Myneni, Sahiti; Cobb, Nathan K; Cohen, Trevor
2016-01-01
Analysis of user interactions in online communities could improve our understanding of health-related behaviors and inform the design of technological solutions that support behavior change. However, to achieve this we would need methods that provide granular perspective, yet are scalable. In this paper, we present a methodology for high-throughput semantic and network analysis of large social media datasets, combining semi-automated text categorization with social network analytics. We apply this method to derive content-specific network visualizations of 16,492 user interactions in an online community for smoking cessation. Performance of the categorization system was reasonable (average F-measure of 0.74, with system-rater reliability approaching rater-rater reliability). The resulting semantically specific network analysis of user interactions reveals content- and behavior-specific network topologies. Implications for socio-behavioral health and wellness platforms are also discussed.
End-point detection in potentiometric titration by continuous wavelet transform.
Jakubowska, Małgorzata; Baś, Bogusław; Kubiak, Władysław W
2009-10-15
The aim of this work was the construction of a new wavelet function and verification that a continuous wavelet transform with a specially defined, dedicated mother wavelet is a useful tool for precise detection of the end-point in a potentiometric titration. The proposed algorithm does not require any initial information about the nature or type of the analyte and/or the shape of the titration curve. Signal imperfections, as well as random noise or spikes, have no influence on the operation of the procedure. The optimization of the new algorithm was done using simulated curves, and then experimental data were considered. In the case of well-shaped and noise-free titration data, the proposed method gives the same accuracy and precision as commonly used algorithms. In the case of noisy or badly shaped curves, however, the presented approach works well (relative error mainly below 2% and coefficients of variability below 5%) while traditional procedures fail. Therefore, the proposed algorithm may be useful in the interpretation of experimental data and also in the automation of typical titration analyses, especially when random noise interferes with the analytical signal.
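The idea behind the method is that convolving the titration curve with a scaled antisymmetric mother wavelet produces a response that peaks at the inflection (end) point, and averaging over several scales makes the located maximum robust to noise. A minimal sketch using a generic Haar-like antisymmetric wavelet built with NumPy; the dedicated mother wavelet constructed in the paper is not reproduced here:

```python
import numpy as np

def haar_like(width):
    """Antisymmetric step wavelet: responds maximally at sharp transitions."""
    w = np.ones(width)
    w[: width // 2] = -1.0
    return w / width

def cwt_endpoint(volume, potential, scales=(5, 9, 15, 21)):
    """End-point volume = location of the largest summed CWT response magnitude."""
    response = np.zeros_like(potential, dtype=float)
    for s in scales:
        response += np.abs(np.convolve(potential, haar_like(s), mode="same"))
    margin = max(scales)          # suppress convolution edge artifacts
    response[:margin] = 0.0
    response[-margin:] = 0.0
    return volume[np.argmax(response)]

# Synthetic noisy sigmoidal titration curve with an end point at 12.5 mL
v = np.linspace(0, 25, 500)
E = 300 / (1 + np.exp(-(v - 12.5) * 4)) + np.random.default_rng(3).normal(0, 3, 500)
print(f"detected end point: {cwt_endpoint(v, E):.2f} mL")
```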
Multicenter evaluation of the Bayer Immuno 1 CA 15-3 assay.
Cheli, C D; Morris, D L; Kish, L; Goldblatt, J; Neaman, I; Allard, W J; Yeung, K K; Wu, A H; Moore, R; Chan, D W; Fritsche, H A; Schwartz, M K; Very, D L
1998-04-01
We conducted a multicenter evaluation of the analytical and clinical features of the automated Bayer Immuno 1 CA 15-3 assay and compared assay performance to two manual tests. Results of the 10-day imprecision study of the Bayer Immuno 1 assay, pooled across four evaluation sites and three lots of reagent, produced total CVs ≤4%. Lot-to-lot reproducibility for 26 different lots of reagents and calibrators manufactured over a 2-year period was demonstrated (CV, 1.1%). Results for the Bayer Immuno 1 assay correlated well with the Biomira TRUQUANT BR 27.29 and Centocor CA 15-3 RIAs (r ≥0.94). The upper limit of the reference interval for the Bayer Immuno 1 assay was 35.9 kilounits/L (35.9 units/mL); values were similar for all methods. Longitudinal monitoring of healthy women yielded assay values with average CVs of 11% and 21% for the Bayer Immuno 1 and Biomira assays, respectively. The Bayer Immuno 1 assay demonstrated the analytical features, intermethod correlation, and long-term performance characteristics that are essential for longitudinal monitoring of breast cancer patients.
Riley, Paul W; Gallea, Benoit; Valcour, Andre
2017-01-01
Testing coagulation factor activities requires that multiple dilutions be assayed and analyzed to produce a single result. The slope of the line created by plotting measured factor concentration against sample dilution is evaluated to discern the presence of inhibitors giving rise to nonparallelism. Moreover, samples producing results on initial dilution that fall outside the analytic measurement range of the assay must be tested at additional dilutions to produce reportable results. The complexity of this process has motivated a large clinical reference laboratory to develop advanced computer algorithms with automated reflex-testing rules to complete coagulation factor analysis. A method was developed for autoverification of coagulation factor activity using expert rules built on an off-the-shelf, commercially available data manager system integrated into an automated coagulation platform. Here, we present an approach allowing for the autoverification and reporting of factor activity results with greatly diminished technologist effort. To the best of our knowledge, this is the first report of its kind providing a detailed procedure for the implementation of autoverification expert rules as applied to coagulation factor activity testing. Advantages of this system include ease of training for new operators, minimization of technologist time spent, reduction of staff fatigue, minimization of unnecessary reflex tests, optimization of turnaround time, and assurance of the consistency of the testing and reporting process.
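A central autoverification rule of this kind checks parallelism: each dilution's measured result, corrected by its dilution factor, should yield the same factor activity, and a systematic trend across dilutions suggests an inhibitor. A minimal sketch of such a rule; the thresholds, measurement range, and reflex logic are invented placeholders, not the laboratory's validated rules:

```python
import numpy as np

# Hypothetical thresholds, for illustration only (not validated lab rules)
MAX_CV = 0.15          # max allowed scatter among dilution-corrected results
MIN_N  = 2             # need at least two on-scale dilutions

def autoverify(corrected_activities):
    """Autoverification rule for dilution-corrected factor activities (%).

    Nonparallelism (e.g. an inhibitor) shows up as a systematic trend across
    dilutions; here it is flagged via the coefficient of variation.
    """
    a = np.asarray(corrected_activities, dtype=float)
    if len(a) < MIN_N:
        return None, "REFLEX: test additional dilutions"
    cv = a.std(ddof=1) / a.mean()
    if cv > MAX_CV:
        return None, "HOLD: nonparallelism suspected, review required"
    return a.mean(), "RELEASE"

print(autoverify([82, 80, 84]))    # parallel -> auto-released
print(autoverify([35, 52, 78]))    # rises with dilution -> inhibitor pattern
```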
Som, Dipasree; Tak, Megha; Setia, Mohit; Patil, Asawari; Sengupta, Amit; Chilakapati, C Murali Krishna; Srivastava, Anurag; Parmar, Vani; Nair, Nita; Sarin, Rajiv; Badwe, R
2016-01-01
Raman spectroscopy, which is based upon inelastic scattering of photons, has the potential to emerge as a noninvasive bedside in vivo or ex vivo molecular diagnostic tool, but there is a need to improve its sensitivity and predictability. We developed a grid matrix-based tissue mapping protocol to acquire cell-specific spectra, which also involved digital microscopy for localizing malignant and lymphocytic cells in sentinel lymph node biopsy samples. Biosignals acquired from specific cellular milieus were subjected to advanced supervised analytical methods, i.e., cross-correlation and peak-to-peak ratios, in addition to PCA and PC-LDA. We observed decreased spectral intensity as well as shifts in the spectral peaks of the amide and lipid bands in completely metastatic (cancer cell) lymph nodes with high cellular density. The spectral library of normal lymphocytes and metastatic cancer cells created using this cell-specific mapping technique can be utilized to develop an automated smart diagnostic tool for bench-side screening of sampled lymph nodes, supported by ongoing global research in developing better technology and signal and big-data processing algorithms.
Fricke, Jens; Pohlmann, Kristof; Jonescheit, Nils A; Ellert, Andree; Joksch, Burkhard; Luttmann, Reiner
2013-06-01
The identification of optimal expression conditions for state-of-the-art production of pharmaceutical proteins is a very time-consuming and expensive process. In this report a method for rapid and reproducible optimization of protein expression in an in-house designed small-scale BIOSTAT® multi-bioreactor plant is described. A newly developed BioPAT® MFCS/win Design of Experiments (DoE) module (Sartorius Stedim Systems, Germany) connects the process control system MFCS/win and the DoE software MODDE® (Umetrics AB, Sweden) and therefore enables the implementation of fully automated optimization procedures. As a proof of concept, a commercial Pichia pastoris strain KM71H was transformed for the expression of potential malaria vaccines. The DoE optimization procedure doubled intact-protein secretion productivity compared to the initial cultivation results. In a next step, robustness with respect to process parameter variability around the determined optimum was demonstrated. Thereby, a significantly improved pharmaceutical production process was established within seven 24-hour cultivation cycles. Specifically, regarding the regulatory demands set out in the process analytical technology (PAT) initiative of the United States Food and Drug Administration (FDA), the combination of a highly instrumented, fully automated multi-bioreactor platform with proper cultivation strategies and extended DoE software solutions opens up promising benefits and opportunities for pharmaceutical protein production. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
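An automated DoE run of this kind typically executes a small factorial design over the process factors and fits a quadratic response surface to locate the optimum. A minimal sketch using a full factorial with center point and an ordinary least-squares quadratic fit; the factors, the `run_cultivation` stand-in, and all response values are invented, and the actual MODDE/BioPAT workflow is proprietary:

```python
import numpy as np
from itertools import product

# Coded factor levels (-1, 0, +1) for, e.g., temperature and methanol feed rate
levels = [-1.0, 0.0, 1.0]
design = np.array(list(product(levels, levels)))      # 3^2 full factorial

# Invented response: product titre (arbitrary units) with an interior optimum
def run_cultivation(x1, x2):
    return 120 - 15 * (x1 - 0.3)**2 - 20 * (x2 + 0.2)**2

y = np.array([run_cultivation(x1, x2) for x1, x2 in design])

# Quadratic response-surface model: 1, x1, x2, x1^2, x2^2, x1*x2
X = np.column_stack([np.ones(len(design)), design[:, 0], design[:, 1],
                     design[:, 0]**2, design[:, 1]**2,
                     design[:, 0] * design[:, 1]])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Stationary point of the fitted surface: solve grad(Y) = 0
b1, b2, b11, b22, b12 = beta[1], beta[2], beta[3], beta[4], beta[5]
H = np.array([[2 * b11, b12], [b12, 2 * b22]])
opt = np.linalg.solve(H, -np.array([b1, b2]))
print("predicted optimum (coded units):", opt)   # -> [0.3, -0.2]
```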
Points, Laurie J; Taylor, James Ward; Grizou, Jonathan; Donkers, Kevin; Cronin, Leroy
2018-01-30
Protocell models are used to investigate how cells might have first assembled on Earth. Some, like oil-in-water droplets, are seemingly simple models, yet they are able to exhibit complex and unpredictable behaviors. How such simple oil-in-water systems can come together to yield complex and life-like behaviors remains a key question. Herein, we illustrate how the combination of automated experimentation and image processing, physicochemical analysis, and machine learning allows significant advances to be made in understanding the driving forces behind oil-in-water droplet behaviors. Utilizing >7,000 experiments collected using an autonomous robotic platform, we illustrate how smart automation can not only help with exploration, optimization, and discovery of new behaviors, but can also be core to developing a fundamental understanding of such systems. Using this process, we were able to relate droplet formulation to behavior via predicted physical properties, and to identify and predict more occurrences of a rare collective droplet behavior, droplet swarming. Proton NMR spectroscopic and qualitative pH methods enabled us to better understand oil dissolution, chemical change, phase transitions, and droplet and aqueous-phase flows, illustrating the utility of combining smart automation with traditional analytical chemistry techniques. We further extended our study to the simultaneous exploration of both the oil and aqueous phases using a robotic platform. Overall, this work shows that the combination of chemistry, robotics, and artificial intelligence enables discovery, prediction, and mechanistic understanding in ways that no one approach could achieve alone.
Norwood, Daniel L; Mullis, James O; Davis, Mark; Pennino, Scott; Egert, Thomas; Gonnella, Nina C
2013-01-01
The structural analysis (i.e., identification) of organic chemical entities leached into drug product formulations has traditionally been accomplished with techniques involving the combination of chromatography with mass spectrometry. These include gas chromatography/mass spectrometry (GC/MS) for volatile and semi-volatile compounds, and various forms of liquid chromatography/mass spectrometry (LC/MS or HPLC/MS) for semi-volatile and relatively non-volatile compounds. GC/MS and LC/MS techniques are complementary for structural analysis of leachables and potentially leachable organic compounds produced via laboratory extraction of pharmaceutical container closure/delivery system components and corresponding materials of construction. Both hyphenated analytical techniques possess the separating capability, compound-specific detection attributes, and sensitivity required to effectively analyze complex mixtures of trace-level organic compounds. However, hyphenated techniques based on mass spectrometry are limited by the inability to determine complete bond connectivity, the inability to distinguish between many types of structural isomers, and the inability to unambiguously determine aromatic substitution patterns. Nuclear magnetic resonance spectroscopy (NMR) does not have these limitations; hence it can serve as a complement to mass spectrometry. However, NMR technology is inherently insensitive and its ability to interface with chromatography has historically been challenging. This article describes the application of NMR coupled with liquid chromatography and automated solid phase extraction (SPE-LC/NMR) to the structural analysis of extractable organic compounds from a pharmaceutical packaging material of construction. The SPE-LC/NMR technology combined with micro-cryoprobe technology afforded the sensitivity and sample mass required for full structure elucidation. Optimization of the SPE-LC/NMR analytical method was achieved using a series of model compounds representing the chemical diversity of extractables. This study demonstrates the complementary nature of SPE-LC/NMR with LC/MS for this particular pharmaceutical application. The identification of impurities leached into drugs from pharmaceutical containers, packaging components, and their associated materials of construction has historically been accomplished using laboratory techniques based on the combination of chromatography with mass spectrometry. Such analytical techniques are widely recognized as having the selectivity and sensitivity required to separate the complex mixtures of impurities often encountered in such identification studies, including both the identification of leachable impurities as well as potential leachable impurities produced by laboratory extraction of packaging components and materials. However, mass spectrometry-based analytical techniques have limitations for this application, and newer analytical techniques based on the combination of chromatography with nuclear magnetic resonance spectroscopy provide an added dimension of structural definition. This article describes the development, optimization, and application of an analytical technique based on the combination of chromatography and nuclear magnetic resonance spectroscopy to the identification of potential leachable impurities from a pharmaceutical packaging material. The complementary nature of the analytical techniques for this particular pharmaceutical application is demonstrated.
Interpretation of Blood Microbiology Results - Function of the Clinical Microbiologist.
Kristóf, Katalin; Pongrácz, Júlia
2016-04-01
The proper use and interpretation of blood microbiology results may be one of the most challenging and most important functions of clinical microbiology laboratories. Effective implementation of this function requires careful consideration of specimen collection and processing, pathogen detection techniques, and prompt and precise reporting of identification and susceptibility results. The responsibility of the treating physician is the proper formulation of the analytical request and providing the laboratory with complete and precise patient information, which are essential prerequisites of proper testing and interpretation. The clinical microbiologist can offer advice concerning the differential diagnosis, sampling techniques, and detection methods to facilitate diagnosis. Rapid detection methods are essential, since the sooner a pathogen is detected, the better the patient's chance of being cured. Besides the gold-standard blood culture technique, microbiological methods that decrease the time to a relevant result are increasingly utilized today. In the case of certain pathogens, the pathogen can be identified directly from the blood culture bottle after propagation, using serological or automated/semi-automated systems, molecular methods, or MALDI-TOF MS (matrix-assisted laser desorption-ionization time-of-flight mass spectrometry). Molecular biology methods are also suitable for the rapid detection and identification of pathogens from aseptically collected blood samples. Another important duty of the microbiology laboratory is to notify the treating physician immediately of all relevant information when a positive sample is detected. The clinical microbiologist may provide important guidance regarding the clinical significance of blood isolates, since one-third to one-half of blood culture isolates are contaminants or isolates of unknown clinical significance. To fully exploit the benefits of blood culture and other (non-culture-based) diagnostics, the microbiologist and the clinician should interact directly.