Cluster Size Optimization in Sensor Networks with Decentralized Cluster-Based Protocols
Amini, Navid; Vahdatpour, Alireza; Xu, Wenyao; Gerla, Mario; Sarrafzadeh, Majid
2011-01-01
Network lifetime and energy efficiency are viewed as the dominating considerations in designing cluster-based communication protocols for wireless sensor networks. This paper analytically provides the optimal cluster size that minimizes the total energy expenditure in such networks, where all sensors communicate data through their elected cluster heads to the base station in a decentralized fashion. LEACH, LEACH-Coverage, and DBS are the three cluster-based protocols investigated in this paper; none requires centralized support from a particular node. The analytical outcomes are given in the form of closed-form expressions for various widely used network configurations. Extensive simulations on different networks are used to confirm the expectations based on the analytical results. To obtain a thorough understanding of the results, the cluster-number variability problem is identified and inspected from the energy consumption point of view. PMID:22267882
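As a rough illustration of this kind of analysis (not the derivation used in the paper), the sketch below evaluates total energy per round under the standard first-order radio model as a function of the number of clusters and finds the minimizing cluster count numerically; all parameter values and the single-hop topology are illustrative assumptions.

```python
import math

# Illustrative first-order radio model parameters (assumptions, not from the paper)
E_ELEC = 50e-9       # J/bit, transceiver electronics energy
EPS_FS = 10e-12      # J/bit/m^2, free-space amplifier
EPS_MP = 0.0013e-12  # J/bit/m^4, multipath amplifier
E_DA   = 5e-9        # J/bit, data aggregation at the cluster head
BITS   = 4000        # packet size (bits)
N      = 100         # nodes
M      = 100.0       # side of the square field (m)
D_BS   = 100.0       # average cluster head to base station distance (m)

def energy_per_round(k):
    """Approximate total energy per round with k clusters (single hop to the BS)."""
    d_to_ch_sq = M**2 / (2 * math.pi * k)           # expected squared member-to-CH distance
    e_member = BITS * (E_ELEC + EPS_FS * d_to_ch_sq)
    e_head = BITS * (E_ELEC * (N / k - 1)           # receive from cluster members
                     + E_DA * (N / k)               # aggregate the cluster's data
                     + E_ELEC + EPS_MP * D_BS**4)   # forward to the base station
    return k * e_head + (N - k) * e_member

best_k = min(range(1, N), key=energy_per_round)
print(f"energy-minimizing cluster count ~ {best_k}, "
      f"E = {energy_per_round(best_k) * 1e3:.3f} mJ/round")
```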
Faller, Maximilian; Wilhelmsson, Peter; Kjelland, Vivian; Andreassen, Åshild; Dargis, Rimtas; Quarsten, Hanne; Dessau, Ram; Fingerle, Volker; Margos, Gabriele; Noraas, Sølvi; Ornstein, Katharina; Petersson, Ann-Cathrine; Matussek, Andreas; Lindgren, Per-Eric; Henningsson, Anna J.
2017-01-01
Introduction Lyme borreliosis (LB) is the most common tick-transmitted disease in Europe. The diagnosis of LB today is based on the patient's medical history, clinical presentation and laboratory findings. The laboratory diagnostics are mainly based on antibody detection, but in certain conditions molecular detection by polymerase chain reaction (PCR) may serve as a complement. Aim The purpose of this study was to evaluate the analytical sensitivity, analytical specificity and concordance of eight different real-time PCR methods at five laboratories in Sweden, Norway and Denmark. Method Each participating laboratory was asked to analyse three different sets of samples (reference panels; all blinded): i) cDNA extracted and transcribed from water spiked with cultured Borrelia strains, ii) cerebrospinal fluid spiked with cultured Borrelia strains, and iii) DNA dilution series extracted from cultured Borrelia and relapsing fever strains. The results and the method descriptions of each laboratory were systematically evaluated. Results and conclusions The analytical sensitivities and the concordance between the eight protocols were in general high. The concordance was especially high between the protocols using 16S rRNA as the target gene; however, this concordance was mainly related to cDNA as the type of template. When comparing cDNA and DNA as the type of template, the analytical sensitivity was in general higher for the protocols using DNA as template, regardless of the target gene used. The analytical specificity of all eight protocols was high; however, some protocols were not able to detect Borrelia spielmanii, Borrelia lusitaniae or Borrelia japonica. PMID:28937997
Kennedy, Jacob J.; Whiteaker, Jeffrey R.; Schoenherr, Regine M.; Yan, Ping; Allison, Kimberly; Shipley, Melissa; Lerch, Melissa; Hoofnagle, Andrew N.; Baird, Geoffrey Stuart; Paulovich, Amanda G.
2016-01-01
Despite a clinical, economic, and regulatory imperative to develop companion diagnostics, precious few new biomarkers have been successfully translated into clinical use, due in part to inadequate protein assay technologies to support large-scale testing of hundreds of candidate biomarkers in formalin-fixed paraffin embedded (FFPE) tissues. While the feasibility of using targeted, multiple reaction monitoring-mass spectrometry (MRM-MS) for quantitative analyses of FFPE tissues has been demonstrated, protocols have not been systematically optimized for robust quantification across a large number of analytes, nor has the performance of peptide immuno-MRM been evaluated. To address this gap, we used a test battery approach coupled to MRM-MS with the addition of stable isotope labeled standard peptides (targeting 512 analytes) to quantitatively evaluate the performance of three extraction protocols in combination with three trypsin digestion protocols (i.e. 9 processes). A process based on RapiGest buffer extraction and urea-based digestion was identified to enable similar quantitation results from FFPE and frozen tissues. Using the optimized protocols for MRM-based analysis of FFPE tissues, median precision was 11.4% (across 249 analytes). There was excellent correlation between measurements made on matched FFPE and frozen tissues, both for direct MRM analysis (R2 = 0.94) and immuno-MRM (R2 = 0.89). The optimized process enables highly reproducible, multiplex, standardizable, quantitative MRM in archival tissue specimens. PMID:27462933
A Protocol Layer Trust-Based Intrusion Detection Scheme for Wireless Sensor Networks
Wang, Jian; Jiang, Shuai; Fapojuwo, Abraham O.
2017-01-01
This article proposes a protocol layer trust-based intrusion detection scheme for wireless sensor networks. Unlike existing work, the trust value of a sensor node is evaluated according to the deviations of key parameters at each protocol layer, considering that attacks initiated at different protocol layers inevitably affect the parameters of the corresponding layers. For simplicity, the paper mainly considers three aspects of trustworthiness, namely physical layer trust, media access control layer trust and network layer trust. The per-layer trust metrics are then combined to determine the overall trust metric of a sensor node. The performance of the proposed intrusion detection mechanism is then analyzed using the t-distribution to derive analytical results for the false positive and false negative probabilities. Numerical analytical results, validated by simulation results, are presented for different attack scenarios. It is shown that the proposed protocol layer trust-based intrusion detection scheme outperforms a state-of-the-art scheme in terms of detection probability and false probability, demonstrating its usefulness for detecting cross-layer attacks. PMID:28555023
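The sketch below illustrates the general idea of deviation-based per-layer trust scores combined into an overall node trust; the monitored parameters, tolerances, weights and detection threshold are illustrative assumptions, not the scheme defined in the paper.

```python
def layer_trust(observed, expected, tolerance):
    """Map the deviation of a protocol-layer parameter to a trust score in [0, 1]."""
    deviation = abs(observed - expected) / tolerance
    return max(0.0, 1.0 - deviation)

def node_trust(phy, mac, net, weights=(0.3, 0.3, 0.4)):
    """Weighted combination of physical-, MAC-, and network-layer trust (illustrative weights)."""
    w_phy, w_mac, w_net = weights
    return w_phy * phy + w_mac * mac + w_net * net

# Hypothetical observations for one sensor node
phy = layer_trust(observed=-72.0, expected=-70.0, tolerance=10.0)   # RSSI (dBm)
mac = layer_trust(observed=0.18, expected=0.05, tolerance=0.20)     # collision rate
net = layer_trust(observed=0.60, expected=0.95, tolerance=0.50)     # packet forwarding ratio

overall = node_trust(phy, mac, net)
print(f"trust: phy={phy:.2f} mac={mac:.2f} net={net:.2f} overall={overall:.2f}")
if overall < 0.5:   # illustrative detection threshold
    print("node flagged as potentially malicious")
```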
USDA-ARS GRACEnet Project Protocols, Chapter 3. Chamber-based trace gas flux measurements
USDA-ARS's Scientific Manuscript database
This protocol addresses N2O, CO2 and CH4 flux measurement by soil chamber methodology. The reactivities of other gases of interest, such as NOx, O3, CO, and NH3, will require different chambers and associated instrumentation. Carbon dioxide is included as an analyte with this protocol; however, when p...
Beigi, Manije; Afarande, Fatemeh; Ghiasi, Hosein
2016-01-01
Aim The aim of this study was to compare two bunkers, one designed using only protocol recommendations and one using Monte Carlo (MC)-derived data, for an 18 MV Varian Clinac 2100C accelerator. Background High-energy radiation therapy is associated with fast and thermal photoneutrons. Adequate shielding against the contaminant neutrons has been recommended by the new IAEA and NCRP protocols. Materials and methods The latest protocols released by the IAEA (Safety Report No. 47) and NCRP Report No. 151 were used for the bunker design calculations; MC-based data were also derived. Two bunkers, one based on the protocols and one on the MC data, were designed and discussed. Results Regarding the door, the thicknesses obtained from the MC simulation and from the Wu–McGinley analytical method were close in both BPE and lead. In the case of the primary and secondary barriers, MC simulation resulted in 440.11 mm for the ordinary concrete, and a total concrete thickness of 1709 mm was required. Calculating the same parameters with the recommended analytical methods resulted in a required thickness of 1762 mm, using the 445 mm TVL recommended for concrete. Additionally, for the secondary barrier a thickness of 752.05 mm was obtained. Conclusion Our results showed that the MC simulation and the protocol recommendations are in good agreement for the radiation contamination dose calculation. The differences between the analytical and MC simulation methods revealed that applying only one method for bunker design may lead to underestimation or overestimation in dose and shielding calculations. PMID:26900357
LC-MS based analysis of endogenous steroid hormones in human hair.
Gao, Wei; Kirschbaum, Clemens; Grass, Juliane; Stalder, Tobias
2016-09-01
The quantification of endogenous steroid hormone concentrations in hair is increasingly used as a method for obtaining retrospective information on long-term integrated hormone exposure. Several different analytical procedures have been employed for hair steroid analysis, with liquid chromatography-mass spectrometry (LC-MS) being recognized as a particularly powerful analytical tool. Several methodological aspects affect the performance of LC-MS systems for hair steroid analysis, including sample preparation and pretreatment, steroid extraction, post-incubation purification, LC methodology, ionization techniques and MS specifications. Here, we critically review the differential value of such protocol variants for hair steroid hormones analysis, focusing on both analytical quality and practical feasibility issues. Our results show that, when methodological challenges are adequately addressed, LC-MS protocols can not only yield excellent sensitivity and specificity but are also characterized by relatively simple sample processing and short run times. This makes LC-MS based hair steroid protocols particularly suitable as a high-quality option for routine application in research contexts requiring the processing of larger numbers of samples. Copyright © 2016 Elsevier Ltd. All rights reserved.
Effect of different analyte diffusion/adsorption protocols on SERS signals
NASA Astrophysics Data System (ADS)
Li, Ruoping; Petschek, Rolfe G.; Han, Junhe; Huang, Mingju
2018-07-01
The effect of different analyte diffusion/adsorption protocols, an aspect often overlooked in the surface-enhanced Raman scattering (SERS) technique, was studied. Three protocols were compared: a highly concentrated dilution (HCD) protocol, a half-half dilution (HHD) protocol and a layered adsorption (LA) protocol; the SERS substrates were monolayer films of 80 nm Ag nanoparticles (NPs) modified by polyvinylpyrrolidone. The diffusion/adsorption mechanisms were modelled using the diffusion equation, and the electromagnetic field distribution of two adjacent Ag NPs was simulated by the finite-difference time-domain method. All experimental data and theoretical analysis suggest that different diffusion/adsorption behaviour of analytes will cause different SERS signal enhancements. The HHD protocol could produce the most uniform and reproducible samples, and the corresponding signal intensity of the analyte is the strongest. This study will help to understand and promote the use of the SERS technique in quantitative analysis.
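As a rough numerical companion to the diffusion-equation modelling mentioned above (not the authors' model), this sketch integrates a 1D diffusion equation with an absorbing boundary standing in for adsorption onto the nanoparticle film; the diffusion coefficient, geometry and boundary conditions are illustrative assumptions.

```python
import numpy as np

# Illustrative 1D diffusion toward an adsorbing substrate (assumed parameters):
# analyte in a thin liquid layer above the Ag NP film.
D = 1e-9                   # m^2/s, analyte diffusion coefficient
L = 1e-3                   # m, liquid layer thickness
nx = 200
dx = L / (nx - 1)
dt = 0.4 * dx**2 / D       # respects the explicit-scheme stability limit
steps = 20000

c = np.ones(nx)            # uniform initial concentration (arbitrary units)
c[0] = 0.0                 # substrate surface acts as a perfect sink
total0 = c.sum()

for _ in range(steps):
    lap = np.zeros_like(c)
    lap[1:-1] = (c[2:] - 2.0 * c[1:-1] + c[:-2]) / dx**2
    c += D * dt * lap
    c[0] = 0.0             # adsorption: concentration pinned at zero on the film
    c[-1] = c[-2]          # no flux through the top of the liquid layer

adsorbed_fraction = 1.0 - c.sum() / total0
print(f"after {steps * dt:.0f} s, adsorbed fraction ~ {adsorbed_fraction:.2f}")
```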
Protocol for Detection of Yersinia pestis in Environmental ...
Methods Report. This is the first open-access, detailed protocol available to all government departments and agencies, and their contractors, to detect Yersinia pestis, the pathogen that causes plague, in multiple environmental sample types including water. Each analytical method includes a step-by-step sample processing procedure for each sample type. The protocol includes real-time PCR, traditional microbiological culture, and Rapid Viability PCR (RV-PCR) analytical methods. For large-volume water samples it also includes an ultrafiltration-based sample concentration procedure. Because of this non-restrictive availability to all government departments and agencies, and their contractors, the nation will now have increased laboratory capacity to analyze a large number of samples during a wide-area plague incident.
Comparison of Gluten Extraction Protocols Assessed by LC-MS/MS Analysis.
Fallahbaghery, Azadeh; Zou, Wei; Byrne, Keren; Howitt, Crispin A; Colgrave, Michelle L
2017-04-05
The efficiency of gluten extraction is of critical importance to the results derived from any analytical method for gluten detection and quantitation, whether it employs reagent-based technology (antibodies) or analytical instrumentation (mass spectrometry). If the target proteins are not efficiently extracted, the end result will be an underestimation of the gluten content, posing a health risk to people affected by conditions such as celiac disease (CD) and nonceliac gluten sensitivity (NCGS). Five different extraction protocols were investigated using LC-MRM-MS for their ability to efficiently and reproducibly extract gluten. The rapid and simple "IPA/DTT" protocol and the related "two-step" protocol yielded extracts enriched for gluten proteins, which accounted for 55/86% (trypsin/chymotrypsin) and 41/68% of all protein identifications, respectively, with both methods showing high reproducibility (CV < 15%). When using multistep protocols, it was critical to examine all fractions, as coextraction of proteins occurred across fractions, with significant levels of proteins found in unexpected fractions and not all proteins within a particular gluten class behaving the same.
Bernard, Elyse D; Nguyen, Kathy C; DeRosa, Maria C; Tayabali, Azam F; Aranda-Rodriguez, Rocio
2017-01-01
Aptamers are short oligonucleotide sequences used in detection systems because of their high affinity binding to a variety of macromolecules. With the introduction of aptamers over 25 years ago came the exploration of their use in many different applications as a substitute for antibodies. Aptamers have several advantages; they are easy to synthesize, can bind to analytes for which it is difficult to obtain antibodies, and in some cases bind better than antibodies. As such, aptamer applications have significantly expanded as an adjunct to a variety of different immunoassay designs. The Multiple-Analyte Profiling (xMAP) technology developed by Luminex Corporation commonly uses antibodies for the detection of analytes in small sample volumes through the use of fluorescently coded microbeads. This technology permits the simultaneous detection of multiple analytes in each sample tested and hence could be applied in many research fields. Although little work has been performed adapting this technology for use with aptamers, optimizing aptamer-based xMAP assays would dramatically increase the versatility of analyte detection. We report herein on the development of an xMAP bead-based aptamer/antibody sandwich assay for a biomarker of inflammation (C-reactive protein or CRP). Protocols for the coupling of aptamers to xMAP beads, validation of coupling, and for an aptamer/antibody sandwich-type assay for CRP are detailed. The optimized conditions, protocols and findings described in this research could serve as a starting point for the development of new aptamer-based xMAP assays.
McClements, Jake; McClements, David Julian
2016-06-10
There has been a rapid increase in the fabrication of various kinds of edible nanoparticles for oral delivery of bioactive agents, such as those constructed from proteins, carbohydrates, lipids, and/or minerals. It is currently difficult to compare the relative advantages and disadvantages of different kinds of nanoparticle-based delivery systems because researchers use different analytical instruments and protocols to characterize them. In this paper, we briefly review the various analytical methods available for characterizing the properties of edible nanoparticles, such as composition, morphology, size, charge, physical state, and stability. This information is then used to propose a number of standardized protocols for characterizing nanoparticle properties, for evaluating their stability to environmental stresses, and for predicting their biological fate. Implementation of these protocols would facilitate comparison of the performance of nanoparticles under standardized conditions, which would facilitate the rational selection of nanoparticle-based delivery systems for different applications in the food, health care, and pharmaceutical industries.
Batt, Angela L; Furlong, Edward T; Mash, Heath E; Glassmeyer, Susan T; Kolpin, Dana W
2017-02-01
A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds, and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs which allowed us to assess and compare performances of the methods used. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method comparison analytes that were determined in two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds measured did not consistently meet predetermined quality standards. Methodologies that did not seem suitable for these analytes are overviewed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols. Published by Elsevier B.V.
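A minimal sketch of the spiked-recovery calculation underlying this kind of method-performance comparison; the formula is the generic one, and the measurements and acceptance window are hypothetical, not the study's criteria.

```python
def percent_recovery(spiked_result, unspiked_result, spike_added):
    """Spiked recovery (%) for one analyte in one matrix."""
    return 100.0 * (spiked_result - unspiked_result) / spike_added

# Hypothetical measurements (ng/L) for one pharmaceutical in treated drinking water
rec = percent_recovery(spiked_result=94.0, unspiked_result=12.0, spike_added=100.0)
acceptable = 70.0 <= rec <= 130.0   # illustrative acceptance window
print(f"recovery = {rec:.0f}% -> {'pass' if acceptable else 'flag for review'}")
```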
A Mobility-Aware QoS Signaling Protocol for Ambient Networks
NASA Astrophysics Data System (ADS)
Jeong, Seong-Ho; Lee, Sung-Hyuck; Bang, Jongho
Mobility-aware quality of service (QoS) signaling is crucial to provide seamless multimedia services in the ambient environment where mobile nodes may move frequently between different wireless access networks. The mobility of an IP-based node in ambient networks affects routing paths, and as a result, can have a significant impact on the operation and state management of QoS signaling protocols. In this paper, we first analyze the impact of mobility on QoS signaling protocols and how the protocols operate in mobility scenarios. We then propose an efficient mobility-aware QoS signaling protocol which can operate adaptively in ambient networks. The key features of the protocol include the fast discovery of a crossover node where the old and new paths converge or diverge due to handover and the localized state management for seamless services. Our analytical and simulation/experimental results show that the proposed/implemented protocol works better than existing protocols in the IP-based mobile environment.
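A toy sketch of one way to locate a crossover node, i.e., the last node shared by the pre- and post-handover routing paths when walking from the fixed end toward the mobile node; the path representation and node names are assumptions for illustration, not the discovery signalling specified by the protocol.

```python
def crossover_node(old_path, new_path):
    """Return the last node shared by both paths, walking from the fixed end
    (e.g., the correspondent node) toward the mobile node."""
    crossover = None
    for a, b in zip(old_path, new_path):
        if a != b:
            break
        crossover = a
    return crossover

# Hypothetical routing paths before and after a handover
old_path = ["CN", "R1", "R2", "R3", "AR_old", "MN"]
new_path = ["CN", "R1", "R2", "R4", "AR_new", "MN"]
print("crossover node:", crossover_node(old_path, new_path))   # -> R2
```

Localizing QoS state updates to the segment below this node is what avoids re-signalling the entire end-to-end path after each handover.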
Sefuba, Maria; Walingo, Tom; Takawira, Fambirai
2015-01-01
This paper presents an Energy Efficient Medium Access Control (MAC) protocol for clustered wireless sensor networks that aims to improve energy efficiency and delay performance. The proposed protocol employs an adaptive cross-layer intra-cluster scheduling and an inter-cluster relay selection diversity. The scheduling is based on available data packets and the remaining energy level of the source node (SN). This helps to minimize idle listening on nodes without data to transmit as well as reducing control packet overhead. The relay selection diversity is carried out between clusters, by the cluster head (CH), and the base station (BS). The diversity helps to improve network reliability and prolong the network lifetime. Relay selection is determined based on the communication distance, the remaining energy and the channel quality indicator (CQI) of the relay cluster head (RCH). An analytical framework for energy consumption and transmission delay of the proposed MAC protocol is presented in this work. The performance of the proposed MAC protocol is evaluated in terms of transmission delay, energy consumption, and network lifetime. The results obtained indicate that the proposed MAC protocol provides improved performance compared with traditional cluster-based MAC protocols. PMID:26393608
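The sketch below illustrates relay cluster-head selection of the general kind described, with a score built from communication distance, remaining energy and CQI; the normalization and weights are illustrative assumptions rather than the paper's exact criterion.

```python
from dataclasses import dataclass

@dataclass
class RelayCandidate:
    name: str
    distance_m: float         # distance on the CH -> RCH -> BS route
    residual_energy_j: float  # remaining battery energy
    cqi: float                # channel quality indicator, higher is better

def relay_score(c, d_max=150.0, e_max=2.0, cqi_max=15.0, weights=(0.4, 0.4, 0.2)):
    """Higher score = better relay; normalization bounds and weights are illustrative."""
    w_d, w_e, w_q = weights
    return (w_d * (1.0 - c.distance_m / d_max)
            + w_e * (c.residual_energy_j / e_max)
            + w_q * (c.cqi / cqi_max))

candidates = [
    RelayCandidate("RCH-1", distance_m=120.0, residual_energy_j=1.6, cqi=9.0),
    RelayCandidate("RCH-2", distance_m=90.0,  residual_energy_j=0.8, cqi=12.0),
    RelayCandidate("RCH-3", distance_m=140.0, residual_energy_j=1.9, cqi=6.0),
]
best = max(candidates, key=relay_score)
print("selected relay cluster head:", best.name)
```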
Validating Analytical Protocols to Determine Selected Pesticides and PCBs Using Routine Samples.
Pindado Jiménez, Oscar; García Alonso, Susana; Pérez Pastor, Rosa María
2017-01-01
This study aims to provide recommendations concerning the validation of analytical protocols using routine samples. It is intended as a case study of how to validate analytical methods in different environmental matrices. In order to analyze the selected compounds (pesticides and polychlorinated biphenyls) in two different environmental matrices, the current work developed and validated two analytical procedures by GC-MS. A description is given of the validation of the two protocols by the analysis of more than 30 samples of water and sediments collected over nine months. The present work also addresses the uncertainty associated with both analytical protocols. In detail, the uncertainty for the water samples was estimated through a conventional approach, whereas for the sediment matrices the estimation of proportional/constant bias was also included because of their inhomogeneity. Results for the sediment matrix are reliable, showing a range of 25-35% of analytical variability associated with intermediate conditions. The analytical methodology for the water matrix determines the selected compounds with acceptable recoveries, and the combined uncertainty ranges between 20 and 30%. Analysis of routine samples is rarely applied to assess the trueness of novel analytical methods, and up to now this methodology had not been focused on organochlorine compounds in environmental matrices.
Multi-Agency Radiological Laboratory Analytical Protocols Manual (MARLAP)
The Multi-Agency Radiological Laboratory Analytical Protocols Manual (MARLAP) provides guidance for the planning, implementation and assessment phases of projects that require laboratory analysis of radionuclides.
Successful attack on permutation-parity-machine-based neural cryptography.
Seoane, Luís F; Ruttor, Andreas
2012-02-01
An algorithm is presented which implements a probabilistic attack on the key-exchange protocol based on permutation parity machines. Instead of imitating the synchronization of the communicating partners, the strategy consists of a Monte Carlo method to sample the space of possible weights during inner rounds and an analytic approach to convey the extracted information from one outer round to the next one. The results show that the protocol under attack fails to synchronize faster than an eavesdropper using this algorithm.
Martín-Sabroso, Cristina; Tavares-Fernandes, Daniel Filipe; Espada-García, Juan Ignacio; Torres-Suárez, Ana Isabel
2013-12-15
In this work a protocol is developed to validate analytical procedures for the quantification of drug substances formulated in polymeric systems, covering both drug entrapped in the polymeric matrix (assay:content test) and drug released from the systems (assay:dissolution test). This protocol is applied to the validation of two isocratic HPLC analytical procedures for the analysis of dexamethasone phosphate disodium microparticles for parenteral administration. Preparation of authentic samples and artificially "spiked" and "unspiked" samples is described. Specificity (the ability to quantify dexamethasone phosphate disodium in the presence of constituents of the dissolution medium and other microparticle constituents), linearity, accuracy and precision are evaluated, in the range from 10 to 50 μg mL(-1) for the assay:content test procedure and from 0.25 to 10 μg mL(-1) for the assay:dissolution test procedure. The robustness of the analytical method for extracting drug from the microparticles is also assessed. The validation protocol developed allows us to conclude that both analytical methods are suitable for their intended purpose, but the lack of proportionality of the assay:dissolution analytical method should be taken into account. The validation protocol designed in this work could be applied to the validation of any analytical procedure for the quantification of drugs formulated in controlled-release polymeric microparticles. Copyright © 2013 Elsevier B.V. All rights reserved.
HSRP and HSRP Partner Analytical Methods and Protocols
HSRP has worked with various partners to develop and test analytical methods and protocols for use by laboratories charged with analyzing environmental and/or building material samples following a contamination incident.
Park, In-Sun; Park, Jae-Woo
2011-01-30
Total petroleum hydrocarbon (TPH) is an important environmental contaminant that is toxic to human and environmental receptors. However, human health risk assessment for petroleum, oil, and lubricant (POL)-contaminated sites is especially challenging because TPH is not a single compound, but rather a mixture of numerous substances. To address this concern, this study recommends a new human health risk assessment strategy for POL-contaminated sites. The strategy is based on a newly modified TPH fractionation method and includes an improved analytical protocol. The proposed TPH fractionation method is composed of ten fractions (e.g., aliphatic and aromatic EC8-10, EC10-12, EC12-16, EC16-22 and EC22-40). Physicochemical properties and toxicity values of each fraction were newly defined in this study. The stepwise ultrasonication-based analytical process was established to measure TPH fractions. Analytical results were compared with those from the TPH Criteria Working Group (TPHCWG) Direct Method. Better analytical efficiencies in TPH, aliphatic, and aromatic fractions were achieved when contaminated soil samples were analyzed with the new analytical protocol. Finally, a human health risk assessment was performed based on the developed tiered risk assessment framework. Results showed that a detailed quantitative risk assessment should be conducted to determine scientifically and economically appropriate cleanup target levels, although the phase II process is useful for determining the potency of human health risks posed by POL-contamination. Copyright © 2010 Elsevier B.V. All rights reserved.
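To make the fraction-based risk logic concrete, here is a minimal hazard-quotient/hazard-index sketch over a few TPH fractions; the intakes and reference doses are hypothetical placeholders, not values defined in the study.

```python
# Hypothetical chronic daily intakes and reference doses (both mg/kg-day) for a few
# TPH fractions; values are placeholders for illustration only.
fractions = {
    "aliphatic EC8-10":  {"intake": 0.0020, "rfd": 0.10},
    "aliphatic EC10-12": {"intake": 0.0040, "rfd": 0.10},
    "aromatic EC10-12":  {"intake": 0.0010, "rfd": 0.04},
    "aromatic EC16-22":  {"intake": 0.0005, "rfd": 0.03},
}

hazard_index = 0.0
for name, f in fractions.items():
    hq = f["intake"] / f["rfd"]          # hazard quotient for one fraction
    hazard_index += hq
    print(f"{name:20s} HQ = {hq:.3f}")

print(f"hazard index = {hazard_index:.3f} "
      f"({'below' if hazard_index < 1 else 'above'} the usual screening level of 1)")
```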
Fabricating a UV-Vis and Raman Spectroscopy Immunoassay Platform.
Hanson, Cynthia; Israelsen, Nathan D; Sieverts, Michael; Vargis, Elizabeth
2016-11-10
Immunoassays are used to detect proteins based on the presence of associated antibodies. Because of their extensive use in research and clinical settings, a large infrastructure of immunoassay instruments and materials can be found. For example, 96- and 384-well polystyrene plates are available commercially and have a standard design to accommodate ultraviolet-visible (UV-Vis) spectroscopy machines from various manufacturers. In addition, a wide variety of immunoglobulins, detection tags, and blocking agents for customized immunoassay designs such as enzyme-linked immunosorbent assays (ELISA) are available. Despite the existing infrastructure, standard ELISA kits do not meet all research needs, requiring individualized immunoassay development, which can be expensive and time-consuming. For example, ELISA kits have low multiplexing (detection of more than one analyte at a time) capabilities as they usually depend on fluorescence or colorimetric methods for detection. Colorimetric and fluorescent-based analyses have limited multiplexing capabilities due to broad spectral peaks. In contrast, Raman spectroscopy-based methods have a much greater capability for multiplexing due to narrow emission peaks. Another advantage of Raman spectroscopy is that Raman reporters experience significantly less photobleaching than fluorescent tags. Despite the advantages that Raman reporters have over fluorescent and colorimetric tags, protocols to fabricate Raman-based immunoassays are limited. The purpose of this paper is to provide a protocol to prepare functionalized probes to use in conjunction with polystyrene plates for direct detection of analytes by UV-Vis analysis and Raman spectroscopy. This protocol will allow researchers to take a do-it-yourself approach for future multi-analyte detection while capitalizing on pre-established infrastructure.
Metabolic profiling of body fluids and multivariate data analysis.
Trezzi, Jean-Pierre; Jäger, Christian; Galozzi, Sara; Barkovits, Katalin; Marcus, Katrin; Mollenhauer, Brit; Hiller, Karsten
2017-01-01
Metabolome analyses of body fluids are challenging due to pre-analytical variations, such as pre-processing delay and temperature, and constant dynamic changes of biochemical processes within the samples. Therefore, proper sample handling, starting from the time of collection up to the analysis, is crucial to obtain high-quality samples and reproducible results. A metabolomics analysis is divided into 4 main steps: 1) sample collection, 2) metabolite extraction, 3) data acquisition and 4) data analysis. Here, we describe a protocol for gas chromatography coupled to mass spectrometry (GC-MS) based metabolic analysis of biological matrices, especially body fluids. This protocol can be applied to blood serum/plasma, saliva and cerebrospinal fluid (CSF) samples of humans and other vertebrates. It covers sample collection, sample pre-processing, metabolite extraction, GC-MS measurement and guidelines for the subsequent data analysis. Advantages of this protocol include:
• Robust and reproducible metabolomics results, taking into account pre-analytical variations that may occur during the sampling process
• Small sample volume required
• Rapid and cost-effective processing of biological samples
• Logistic regression based determination of biomarker signatures for in-depth data analysis (see the sketch below).
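The final bullet mentions logistic-regression-based biomarker signatures; the snippet below is a generic scikit-learn sketch of that step on a hypothetical metabolite feature matrix, not the authors' pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples, n_metabolites = 60, 20
X = rng.normal(size=(n_samples, n_metabolites))   # hypothetical metabolite levels
y = rng.integers(0, 2, size=n_samples)            # hypothetical case/control labels
X[y == 1, 0] += 1.0                               # make metabolite 0 weakly informative

# L1-penalized logistic regression keeps only a sparse "signature" of metabolites
model = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
model.fit(X, y)
signature = np.flatnonzero(model.coef_[0])        # metabolites retained by the penalty

print(f"cross-validated AUC: {scores.mean():.2f}")
print("candidate biomarker signature (feature indices):", signature)
```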
Hu, Miao; Zhong, Zhangdui; Ni, Minming; Baiocchi, Andrea
2016-01-01
Large volume content dissemination is pursued by a growing number of high-quality applications for Vehicular Ad hoc NETworks (VANETs), e.g., live road surveillance and video-based overtaking assistance. Given the highly dynamic vehicular network topology, beacon-less routing protocols have been proven efficient in balancing system performance against control overhead. However, to the authors' best knowledge, routing design for large volume content has not been well considered in previous work, and it introduces new challenges, e.g., an enhanced connectivity requirement for a radio link. In this paper, a link Lifetime-aware Beacon-less Routing Protocol (LBRP) is designed for large volume content delivery in VANETs. Each vehicle makes the forwarding decision based on the message header information and its current state, including its speed and position. A semi-Markov process analytical model is proposed to evaluate the expected delay in constructing a routing path under LBRP. Simulations show that the proposed LBRP scheme outperforms traditional dissemination protocols in providing a low end-to-end delay. The analytical model is also shown to exhibit a good match with Monte Carlo simulations on the delay estimation. PMID:27809285
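A small sketch of the link-lifetime idea underpinning such a protocol: given positions, speeds and a radio range, estimate how long two vehicles stay connected and use that estimate in the forwarding decision. The one-dimensional motion model, radio range and lifetime threshold are illustrative assumptions, not the LBRP specification.

```python
def link_lifetime(x1, v1, x2, v2, radio_range):
    """Time (s) until two vehicles moving along the same road leave radio range.
    Positions in metres, speeds in m/s; returns inf if they never separate."""
    dx = x2 - x1
    dv = v2 - v1
    if dv == 0:
        return float("inf") if abs(dx) <= radio_range else 0.0
    # solve |dx + dv*t| = radio_range and take the latest non-negative crossing
    t_candidates = [(radio_range - dx) / dv, (-radio_range - dx) / dv]
    return max(max(t_candidates), 0.0)

# Hypothetical next-hop candidates for a large-volume transfer needing ~8 s per hop
sender = (0.0, 30.0)                                      # position (m), speed (m/s)
candidates = {"V2": (40.0, 25.0), "V3": (-60.0, 33.0)}
for name, (x, v) in candidates.items():
    t = link_lifetime(sender[0], sender[1], x, v, radio_range=250.0)
    ok = t >= 8.0
    print(f"{name}: expected link lifetime {t:.1f} s -> {'eligible' if ok else 'skip'}")
```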
Hanford analytical sample projections FY 1998--FY 2002
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joyce, S.M.
1998-02-12
Analytical Services projections are compiled for the Hanford site based on inputs from the major programs for the years 1998 through 2002. Projections are categorized by radiation level, protocol, sample matrix and program. Analysis requirements are also presented. This document summarizes the Hanford sample projections for fiscal years 1998 to 2002. Sample projections are based on inputs submitted to Analytical Services covering Environmental Restoration, Tank Waste Remediation Systems (TWRS), Solid Waste, Liquid Effluents, Spent Nuclear Fuels, Transition Projects, Site Monitoring, Industrial Hygiene, Analytical Services and miscellaneous Hanford support activities. In addition, details on laboratory-scale technology (development) work, Sample Management, and Data Management activities are included. This information will be used by Hanford Analytical Services (HAS) and the Sample Management Working Group (SMWG) to assure that laboratories and resources are available and effectively utilized to meet these documented needs.
Squeezed-state quantum key distribution with a Rindler observer
NASA Astrophysics Data System (ADS)
Zhou, Jian; Shi, Ronghua; Guo, Ying
2018-03-01
Lengthening the maximum transmission distance of quantum key distribution plays a vital role in quantum information processing. In this paper, we propose a directional squeezed-state protocol with signals detected by a Rindler observer in the relativistic quantum field framework. We derive an analytical solution to the transmission problem of squeezed states from the inertial sender to the accelerated receiver. The variance of the involved signal mode is closer to optimality than that of the coherent-state-based protocol. Simulation results show that the proposed protocol has better performance than the coherent-state counterpart especially in terms of the maximal transmission distance.
Predicting thermal history a-priori for magnetic nanoparticle hyperthermia of internal carcinoma
NASA Astrophysics Data System (ADS)
Dhar, Purbarun; Sirisha Maganti, Lakshmi
2017-08-01
This article proposes a simple and realistic method by which a direct analytical expression can be derived for the temperature field within a tumour during magnetic nanoparticle hyperthermia. The approximate analytical expression for the thermal history within the tumour is derived based on the lumped capacitance approach and considers all therapy protocols and parameters. The method provides an easy framework for estimating hyperthermia protocol parameters promptly. The model has been validated against several experimental reports on animal models such as mice/rabbit/hamster and human clinical trials, and it has been observed that the model can accurately estimate the thermal history within the carcinoma during hyperthermia therapy. The present approach may find use in a priori estimation of the thermal history in internal tumours for optimizing magnetic hyperthermia treatment protocols with respect to the ablation time, tumour size, magnetic drug concentration, field strength, field frequency, nanoparticle material and size, tumour location, and so on.
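A minimal sketch of a lumped-capacitance temperature history of the kind described: a single energy balance between nanoparticle heating power and perfusion/conduction losses, with a closed-form solution. All tissue and therapy parameters are illustrative placeholders, not values from the article.

```python
import math

# Illustrative lumped-capacitance parameters for a small tumour (placeholders)
rho = 1050.0      # kg/m^3, tissue density
c_p = 3600.0      # J/(kg K), specific heat
V   = 4.2e-6      # m^3, tumour volume (~2 cm diameter sphere)
hA  = 0.04        # W/K, effective loss coefficient (perfusion + conduction)
P   = 0.25        # W, heating power delivered by the magnetic nanoparticles
T_body, T0 = 37.0, 37.0

tau = rho * c_p * V / hA      # thermal time constant (s)
T_ss = T_body + P / hA        # steady-state temperature (deg C)

def temperature(t):
    """Closed-form lumped-capacitance temperature history T(t)."""
    return T_ss + (T0 - T_ss) * math.exp(-t / tau)

for t in (60, 300, 600, 1200):
    print(f"t = {t:5d} s  T = {temperature(t):.1f} degC")
print(f"time constant tau = {tau / 60:.1f} min, steady state = {T_ss:.1f} degC")
```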
Wong, Anthony F; Pielmeier, Ulrike; Haug, Peter J; Andreassen, Steen
2016-01-01
Objective Develop an efficient non-clinical method for identifying promising computer-based protocols for clinical study. An in silico comparison can provide information that informs the decision to proceed to a clinical trial. The authors compared two existing computer-based insulin infusion protocols: eProtocol-insulin from Utah, USA, and Glucosafe from Denmark. Materials and Methods The authors used eProtocol-insulin to manage intensive care unit (ICU) hyperglycemia with intravenous (IV) insulin from 2004 to 2010. Recommendations accepted by the bedside clinicians directly link the subsequent blood glucose values to eProtocol-insulin recommendations and provide a unique clinical database. The authors retrospectively compared in silico 18 984 eProtocol-insulin continuous IV insulin infusion rate recommendations from 408 ICU patients with those of Glucosafe, the candidate computer-based protocol. The subsequent blood glucose measurement value (low, on target, high) was used to identify if the insulin recommendation was too high, on target, or too low. Results Glucosafe consistently provided more favorable continuous IV insulin infusion rate recommendations than eProtocol-insulin for on target (64% of comparisons), low (80% of comparisons), or high (70% of comparisons) blood glucose. Aggregated eProtocol-insulin and Glucosafe continuous IV insulin infusion rates were clinically similar though statistically significantly different (Wilcoxon signed rank test P = .01). In contrast, when stratified by low, on target, or high subsequent blood glucose measurement, insulin infusion rates from eProtocol-insulin and Glucosafe were statistically significantly different (Wilcoxon signed rank test, P < .001), and clinically different. Discussion This in silico comparison appears to be an efficient nonclinical method for identifying promising computer-based protocols. Conclusion Preclinical in silico comparison analytical framework allows rapid and inexpensive identification of computer-based protocol care strategies that justify expensive and burdensome clinical trials. PMID:26228765
MASTER ANALYTICAL SCHEME FOR ORGANIC COMPOUNDS IN WATER: PART 1. PROTOCOLS
A Master Analytical Scheme (MAS) has been developed for the analysis of volatile (gas chromatographable) organic compounds in water. In developing the MAS, it was necessary to evaluate and modify existing analysis procedures and develop new techniques to produce protocols that pr...
Prigoff, Jake G; Swain, Gary W; Divino, Celia M
2016-05-01
Predicting the presence of a persistent common bile duct (CBD) stone is a difficult and expensive task. The aim of this study was to determine whether a previously described protocol-based scoring system is a cost-effective strategy. The protocol includes all patients with gallstone pancreatitis and stratifies them, based on laboratory values and imaging, into high, medium, and low likelihood of persistent stones; the patient's stratification then dictates the next course of management. A decision analytic model was developed to compare the costs for patients who followed the protocol versus those who did not. Clinical data model inputs were obtained from a prospective study conducted at The Mount Sinai Medical Center to validate the protocol from October 2009 to May 2013. The study included all patients presenting with gallstone pancreatitis regardless of disease severity. Seventy-three patients followed the proposed protocol and 32 did not. The protocol group cost an average of $14,962/patient and the non-protocol group $17,138/patient in procedural costs. Mean length of stay for protocol and non-protocol patients was 5.6 and 7.7 days, respectively. The proposed protocol is a cost-effective way to determine the course of management for patients with gallstone pancreatitis, reducing total procedural costs by over 12%.
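A stripped-down sketch of the decision-analytic comparison described (expected procedural cost per strategy); the branch probabilities and costs are hypothetical placeholders, not the study's inputs.

```python
def expected_cost(branches):
    """branches: list of (probability, cost) pairs for one management strategy."""
    assert abs(sum(p for p, _ in branches) - 1.0) < 1e-9
    return sum(p * c for p, c in branches)

# Hypothetical branch probabilities and per-patient procedural costs (USD)
protocol_strategy = [
    (0.20, 21000),   # high likelihood -> ERCP then cholecystectomy
    (0.30, 17000),   # medium likelihood -> imaging work-up, then surgery as indicated
    (0.50, 12000),   # low likelihood -> straight to cholecystectomy
]
non_protocol_strategy = [
    (0.60, 19000),   # routine pre-operative duct evaluation for most patients
    (0.40, 14000),
]

ec_p = expected_cost(protocol_strategy)
ec_np = expected_cost(non_protocol_strategy)
print(f"protocol: ${ec_p:,.0f}   non-protocol: ${ec_np:,.0f}   "
      f"difference: ${ec_np - ec_p:,.0f} per patient")
```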
QA/QC in the laboratory. Session F
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hood, F.C.
1992-05-01
Quality assurance and quality control (QA/QC) of analytical chemistry laboratory activities are essential to the validity and usefulness of resultant data. However, in themselves, conventional QA/QC measures will not always ensure that fraudulent data are not generated. Conventional QA/QC measures are based on the assumption that work will be done in good faith; to assure against fraudulent practices, QA/QC measures must be tailored to specific analysis protocols in anticipation of intentional misapplication of those protocols. Application of specific QA/QC measures to ensure against fraudulent practices results in an increased administrative burden being placed on the analytical process; accordingly, in keeping with graded QA philosophy, data quality objectives must be used to identify specific points of concern for special control to minimize the administrative impact.
MASTER ANALYTICAL SCHEME FOR ORGANIC COMPOUNDS IN WATER. PART 2. APPENDICES TO PROTOCOLS
A Master Analytical Scheme (MAS) has been developed for the analysis of volatile (gas chromatographable) organic compounds in water. In developing the MAS, it was necessary to evaluate and modify existing analysis procedures and develop new techniques to produce protocols that pr...
Modelling the protocol stack in NCS with deterministic and stochastic Petri nets
NASA Astrophysics Data System (ADS)
Hui, Chen; Chunjie, Zhou; Weifeng, Zhu
2011-06-01
The protocol stack is the basis of networked control systems (NCS). Full or partial reconfiguration of the protocol stack offers both optimised communication service and system performance. Nowadays, field testing is unrealistic for determining the performance of a reconfigurable protocol stack, and the Petri net formal description technique offers the best combination of intuitive representation, tool support and analytical capabilities. Traditionally, separation between the different layers of the OSI model has been common practice. Nevertheless, such a layered modelling and analysis framework for the protocol stack lacks global optimisation for protocol reconfiguration. In this article, we propose a general modelling and analysis framework for NCS based on the cross-layer concept, which establishes an efficient system scheduling model by abstracting the time constraints, task interrelations, processor and bus sub-models from the upper and lower layers (application, data link and physical layers). Cross-layer design can help to overcome the inadequacy of global optimisation by sharing information between protocol layers. To illustrate the framework, we take the controller area network (CAN) as a case study. The simulation results of the deterministic and stochastic Petri net (DSPN) model can help us adjust the message scheduling scheme and obtain better system performance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jenkins, T.F.; Thorne, P.G.; Myers, K.F.
Salting-out solvent extraction (SOE) was compared with cartridge and membrane solid-phase extraction (SPE) for preconcentration of nitroaromatics, nitramines, and aminonitroaromatics prior to determination by reversed-phase high-performance liquid chromatography. The solid phases used were manufacturer-cleaned materials, Porapak RDX for the cartridge method and Empore SDB-RPS for the membrane method. Thirty-three groundwater samples from the Naval Surface Warfare Center, Crane, Indiana, were analyzed using the direct analysis protocol specified in SW846 Method 8330, and the results were compared with analyses conducted after preconcentration using SOE with acetonitrile, cartridge-based SPE, and membrane-based SPE. For high-concentration samples, analytical results from the three preconcentration techniques were compared with results from the direct analysis protocol; good recovery of all target analytes was achieved by all three preconcentration methods. For low-concentration samples, results from the two SPE methods were correlated with results from the SOE method; very similar data were obtained by the SOE and SPE methods, even at concentrations well below 1 microgram/L.
A splay tree-based approach for efficient resource location in P2P networks.
Zhou, Wei; Tan, Zilong; Yao, Shaowen; Wang, Shipu
2014-01-01
Resource location in structured P2P systems has a critical influence on system performance. Existing analytical studies of the Chord protocol have shown some potential for performance improvement. In this paper, a new splay tree-based Chord structure called SChord is proposed to improve the efficiency of locating resources. We consider a novel implementation of the Chord finger table (routing table) based on the splay tree, which extends the Chord finger table with additional routing entries. An adaptive routing algorithm is proposed for this implementation, and it can be shown that the hop count is significantly reduced without introducing any other protocol overhead. We analyze the hop count of the adaptive routing algorithm, as compared to Chord variants, and demonstrate sharp upper and lower bounds for both worst-case and average-case settings. In addition, we theoretically analyze the hop reduction in SChord and show that SChord can significantly reduce the routing hops compared to Chord. Several simulations are presented to evaluate the performance of the algorithm and support our analytical findings. The simulation results show the efficiency of SChord.
Distributed event-triggered consensus strategy for multi-agent systems under limited resources
NASA Astrophysics Data System (ADS)
Noorbakhsh, S. Mohammad; Ghaisari, Jafar
2016-01-01
The paper proposes a distributed structure to address an event-triggered consensus problem for multi-agent systems, aiming at a concurrent reduction in inter-agent communication, control input actuation and energy consumption. Following the proposed approach, asymptotic convergence of all agents to consensus requires that each agent broadcast its sampled state to its neighbours and update its control input only at its own triggering instants, unlike existing related works. This decreases network bandwidth usage, sensor energy consumption, computation resource usage and actuator wear. As a result, it facilitates the implementation of the proposed consensus protocol in real-world applications with limited resources. The stability of the closed-loop system under the event-based protocol is proved analytically. Numerical results are presented which confirm the analytical discussion of the effectiveness of the proposed design.
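For illustration, the sketch below simulates a simple single-integrator, event-triggered consensus loop in which each agent broadcasts its state and updates its control only when its measurement error exceeds a threshold; the graph, step size and static trigger rule are generic textbook choices, not the specific protocol of the paper (a static threshold gives practical rather than exact asymptotic consensus).

```python
import numpy as np

# Undirected ring of 4 agents (illustrative topology)
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
n, dt, steps = 4, 0.01, 3000
x = np.array([1.0, -2.0, 0.5, 3.0])   # initial states
x_hat = x.copy()                      # last broadcast states
u = np.zeros(n)
events = 0

for _ in range(steps):
    for i in range(n):
        # static threshold trigger on the local measurement error (illustrative rule)
        if abs(x[i] - x_hat[i]) > 0.05:
            x_hat[i] = x[i]           # broadcast sampled state to neighbours
            events += 1
    for i in range(n):
        # control uses only broadcast states, so it changes only at triggering instants
        u[i] = -sum(A[i, j] * (x_hat[i] - x_hat[j]) for j in range(n))
    x = x + dt * u

print("final states:", np.round(x, 3))
print(f"broadcast events: {events} (vs {n * steps} with periodic communication)")
```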
Oyaert, Matthijs; Van Maerken, Tom; Bridts, Silke; Van Loon, Silvi; Laverge, Heleen; Stove, Veronique
2018-03-01
Point-of-care blood gas test results may benefit therapeutic decision making through their immediate impact on patient care. We evaluated the (pre-)analytical performance of a novel cartridge-type blood gas analyzer, the GEM Premier 5000 (Werfen), for the determination of pH, partial carbon dioxide pressure (pCO2), partial oxygen pressure (pO2), sodium (Na+), potassium (K+), chloride (Cl-), ionized calcium (iCa2+), glucose, lactate, and total hemoglobin (tHb). Total imprecision was estimated according to the CLSI EP5-A2 protocol. The estimated total error was calculated based on the mean of the range claimed by the manufacturer. Based on the CLSI EP9-A2 evaluation protocol, a method comparison with the Siemens RapidPoint 500 and Abbott i-STAT CG8+ was performed. The obtained data were compared against preset quality specifications. Interference of potential pre-analytical confounders on co-oximetry and electrolyte concentrations was studied. The analytical performance was acceptable for all parameters tested. The method comparison demonstrated good agreement with the RapidPoint 500 and i-STAT CG8+, except for some parameters (RapidPoint 500: pCO2, K+, lactate and tHb; i-STAT CG8+: pO2, Na+, iCa2+ and tHb) for which significant differences between analyzers were recorded. No interference of lipemia or methylene blue on co-oximetry results was found. On the contrary, significant interference of benzalkonium and hemolysis on electrolyte measurements was found, for which the user is notified by an interferent-specific flag. Identification of sample errors from pre-analytical sources, such as interferences, and automatic corrective actions, along with the analytical performance, ease of use and low maintenance time of the instrument, make the evaluated instrument a suitable blood gas analyzer for both POCT and laboratory use. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
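The total-error figure referenced above is commonly approximated from imprecision and bias; a generic sketch (not the manufacturer's or CLSI's exact computation) is shown below, with hypothetical verification data.

```python
def total_error(bias_pct, cv_pct, z=1.65):
    """Rough total analytical error estimate: |bias| + z * imprecision (both in %)."""
    return abs(bias_pct) + z * cv_pct

# Hypothetical verification data for two analytes on a blood gas analyzer:
# (analyte, bias %, CV %, allowable total error %)
for analyte, bias, cv, tea in (("glucose", 2.0, 2.5, 10.0), ("lactate", -3.0, 4.0, 15.0)):
    te = total_error(bias, cv)
    verdict = "within" if te <= tea else "exceeds"
    print(f"{analyte}: TE = {te:.1f}% ({verdict} the allowable total error of {tea:.0f}%)")
```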
Advertisement-Based Energy Efficient Medium Access Protocols for Wireless Sensor Networks
NASA Astrophysics Data System (ADS)
Ray, Surjya Sarathi
One of the main challenges that prevents the large-scale deployment of Wireless Sensor Networks (WSNs) is providing the applications with the required quality of service (QoS) given the sensor nodes' limited energy supplies. WSNs are an important tool in supporting applications ranging from environmental and industrial monitoring, to battlefield surveillance and traffic control, among others. Most of these applications require sensors to function for long periods of time without human intervention and without battery replacement. Therefore, energy conservation is one of the main goals for protocols for WSNs. Energy conservation can be performed in different layers of the protocol stack. In particular, as the medium access control (MAC) layer can access and control the radio directly, large energy savings is possible through intelligent MAC protocol design. To maximize the network lifetime, MAC protocols for WSNs aim to minimize idle listening of the sensor nodes, packet collisions, and overhearing. Several approaches such as duty cycling and low power listening have been proposed at the MAC layer to achieve energy efficiency. In this thesis, I explore the possibility of further energy savings through the advertisement of data packets in the MAC layer. In the first part of my research, I propose Advertisement-MAC or ADV-MAC, a new MAC protocol for WSNs that utilizes the concept of advertising for data contention. This technique lets nodes listen dynamically to any desired transmission and sleep during transmissions not of interest. This minimizes the energy lost in idle listening and overhearing while maintaining an adaptive duty cycle to handle variable loads. Additionally, ADV-MAC enables energy efficient MAC-level multicasting. An analytical model for the packet delivery ratio and the energy consumption of the protocol is also proposed. The analytical model is verified with simulations and is used to choose an optimal value of the advertisement period. Simulations show that the optimized ADV-MAC provides substantial energy gains (50% to 70% less than other MAC protocols for WSNs such as T-MAC and S-MAC for the scenarios investigated) while faring as well as T-MAC in terms of packet delivery ratio and latency. Although ADV-MAC provides substantial energy gains over S-MAC and T-MAC, it is not optimal in terms of energy savings because contention is done twice -- once in the Advertisement Period and once in the Data Period. In the next part of my research, the second contention in the Data Period is eliminated and the advantages of contention-based and TDMA-based protocols are combined to form Advertisement based Time-division Multiple Access (ATMA), a distributed TDMA-based MAC protocol for WSNs. ATMA utilizes the bursty nature of the traffic to prevent energy waste through advertisements and reservations for data slots. Extensive simulations and qualitative analysis show that with bursty traffic, ATMA outperforms contention-based protocols (S-MAC, T-MAC and ADV-MAC), a TDMA based protocol (TRAMA) and hybrid protocols (Z-MAC and IEEE 802.15.4). ATMA provides energy reductions of up to 80%, while providing the best packet delivery ratio (close to 100%) and latency among all the investigated protocols. Simulations alone cannot reflect many of the challenges faced by real implementations of MAC protocols, such as clock-drift, synchronization, imperfect physical layers, and irregular interference from other transmissions. 
Such issues may cripple a protocol that otherwise performs very well in software simulations. Hence, to validate my research, I conclude with a hardware implementation of the ATMA protocol on SORA (Software Radio), developed by Microsoft Research Asia. SORA is a reprogrammable Software Defined Radio (SDR) platform that satisfies the throughput and timing requirements of modern wireless protocols while utilizing the rich general purpose PC development environment. Experimental results obtained from the hardware implementation of ATMA closely mirror the simulation results obtained for a single hop network with 4 nodes.
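To make the idle-listening argument concrete, the following minimal Python sketch compares the per-cycle energy of a node that wakes only for a short advertisement period (and for the data periods it is actually interested in) against a node that listens continuously. The power levels, period lengths and traffic mix are illustrative assumptions, not figures from the thesis.

```python
# Illustrative energy model for advertisement-based duty cycling in a WSN MAC.
# All power and timing figures are assumptions for this sketch, not values from the thesis.
P_RX = 0.060      # receive / idle-listening power (W), assumed
P_SLEEP = 3e-6    # sleep power (W), assumed

def cycle_energy(t_adv, t_data, interested, t_cycle=1.0):
    """Energy per cycle for a node that listens to a short advertisement period
    and sleeps through data periods it is not interested in."""
    listen = t_adv + (t_data if interested else 0.0)
    return P_RX * listen + P_SLEEP * (t_cycle - listen)

# Node interested in 1 of every 10 data transmissions vs. a node that idles the whole cycle.
e_adv = 0.9 * cycle_energy(0.01, 0.05, False) + 0.1 * cycle_energy(0.01, 0.05, True)
e_idle = P_RX * 1.0
print(f"advertisement-based: {e_adv*1e3:.3f} mJ/cycle, always listening: {e_idle*1e3:.3f} mJ/cycle")
```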
Methodological aspects of crossover and maximum fat-oxidation rate point determination.
Michallet, A-S; Tonini, J; Regnier, J; Guinot, M; Favre-Juvin, A; Bricout, V; Halimi, S; Wuyam, B; Flore, P
2008-11-01
Indirect calorimetry during exercise provides two metabolic indices of substrate oxidation balance: the crossover point (COP) and maximum fat oxidation rate (LIPOXmax). We aimed to study the effects of the analytical device, protocol type and ventilatory response on variability of these indices, and the relationship with lactate and ventilation thresholds. After maximum exercise testing, 14 relatively fit subjects (aged 32+/-10 years; nine men, five women) performed three submaximum graded tests: one was based on a theoretical maximum power (tMAP) reference; and two were based on the true maximum aerobic power (MAP). Gas exchange was measured concomitantly using a Douglas bag (D) and an ergospirometer (E). All metabolic indices were interpretable only when obtained by the D reference method and MAP protocol. Bland and Altman analysis showed overestimation of both indices with E versus D. Despite no mean differences between COP and LIPOXmax whether tMAP or MAP was used, the individual data clearly showed disagreement between the two protocols. Ventilation explained 10-16% of the metabolic index variations. COP was correlated with ventilation (r=0.96, P<0.01) and the rate of increase in blood lactate (r=0.79, P<0.01), and LIPOXmax correlated with the ventilation threshold (r=0.95, P<0.01). This study shows that, in fit healthy subjects, the analytical device, reference used to build the protocol and ventilation responses affect metabolic indices. In this population, and particularly to obtain interpretable metabolic indices, we recommend a protocol based on the true MAP or one adapted to include the transition from fat to carbohydrate. The correlation between metabolic indices and lactate/ventilation thresholds suggests that shorter, classical maximum progressive exercise testing may be an alternative means of estimating these indices in relatively fit subjects. However, this needs to be confirmed in patients who have metabolic defects.
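For readers unfamiliar with how such indices are derived from gas-exchange data, the sketch below estimates a fat-oxidation curve with Frayn's stoichiometric equations and picks the LIPOXmax stage and a crossover stage from it. The VO2/VCO2 values are invented, and the study's own COP/LIPOXmax calculations may differ in detail.

```python
import numpy as np

# Hypothetical graded-exercise gas-exchange data (not from the study).
power = np.array([20, 40, 60, 80, 100, 120, 140], dtype=float)   # W
vo2   = np.array([0.90, 1.20, 1.50, 1.80, 2.10, 2.40, 2.70])     # L/min
vco2  = np.array([0.75, 1.00, 1.30, 1.60, 1.95, 2.30, 2.65])     # L/min

fat_ox = 1.67 * vo2 - 1.67 * vco2     # g/min, Frayn (1983)
cho_ox = 4.55 * vco2 - 3.21 * vo2     # g/min, Frayn (1983)

lipoxmax_power = power[np.argmax(fat_ox)]          # stage of maximum fat oxidation

# Crossover point: first stage at which carbohydrate-derived energy exceeds fat-derived energy.
fat_kcal, cho_kcal = 9.0 * fat_ox, 4.0 * cho_ox
cop_power = power[np.argmax(cho_kcal > fat_kcal)]  # assumes a crossing occurs within the data
print(lipoxmax_power, cop_power)
```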
Building America House Simulation Protocols
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hendron, Robert; Engebrecht, Cheryn
2010-09-01
The House Simulation Protocol document was developed to track and manage progress toward Building America's multi-year, average whole-building energy reduction research goals for new construction and existing homes, using a consistent analytical reference point. This report summarizes the guidelines for developing and reporting these analytical results in a consistent and meaningful manner for all home energy uses using standard operating conditions.
Hilliard, Mark; Alley, William R; McManus, Ciara A; Yu, Ying Qing; Hallinan, Sinead; Gebler, John; Rudd, Pauline M
Glycosylation is an important attribute of biopharmaceutical products to monitor from development through production. However, glycosylation analysis has traditionally been a time-consuming process with long sample preparation protocols and manual interpretation of the data. To address the challenges associated with glycan analysis, we developed a streamlined analytical solution that covers the entire process from sample preparation to data analysis. In this communication, we describe the complete analytical solution that begins with a simplified and fast N-linked glycan sample preparation protocol that can be completed in less than 1 hr. The sample preparation includes labelling with RapiFluor-MS tag to improve both fluorescence (FLR) and mass spectral (MS) sensitivities. Following HILIC-UPLC/FLR/MS analyses, the data are processed and a library search based on glucose units has been included to expedite the task of structural assignment. We then applied this total analytical solution to characterize the glycosylation of the NIST Reference Material mAb 8761. For this glycoprotein, we confidently identified 35 N-linked glycans and all three major classes, high mannose, complex, and hybrid, were present. The majority of the glycans were neutral and fucosylated; glycans featuring N-glycolylneuraminic acid and those with two galactoses connected via an α1,3-linkage were also identified.
Cheating and Anti-Cheating in Gossip-Based Protocol: An Experimental Investigation
NASA Astrophysics Data System (ADS)
Xiao, Xin; Shi, Yuanchun; Tang, Yun; Zhang, Nan
In recent years, there has been rapid growth in the deployment of gossip-based protocols in many multicast applications. In a typical gossip-based protocol, each node plays the dual roles of receiver and sender, independently exchanging data with its neighbors to facilitate scalability and resilience. However, most previous work in this area has paid little attention to cheating by end users, even though mutual cooperation inherently determines overall system performance. In this paper, we investigate dishonest behaviors in decentralized gossip-based protocols through an extensive experimental study. Our original contributions are two-fold. In the first part, on cheating, we analytically discuss two typical cheating strategies, namely intentionally increasing subscription requests and untruthfully calculating the forwarding probability, and evaluate their negative impacts. The results indicate that more attention should be paid to defending against cheating behaviors in gossip-based protocols. In the second part, on anti-cheating, we propose a receiver-driven measurement mechanism that evaluates individual forwarding traffic from the perspective of receivers and thus identifies cheating nodes with a high incoming/outgoing ratio. Furthermore, we extend the mechanism by introducing a reliable factor to further improve its accuracy. Experiments under various conditions show that it performs well in cases of serious cheating and achieves considerable performance in other cases.
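As a rough illustration of the receiver-driven idea, the snippet below flags peers whose measured incoming/outgoing traffic ratio is anomalously high. The threshold and the traffic bookkeeping are assumptions made for this sketch; the paper's mechanism (including its reliable factor) is more elaborate.

```python
# Hedged sketch: flag gossip peers whose incoming/outgoing traffic ratio looks suspicious.
def suspected_cheaters(traffic, ratio_threshold=3.0):
    """traffic maps node_id -> (bytes_received_from_peers, bytes_forwarded_to_peers)."""
    flagged = []
    for node, (incoming, outgoing) in traffic.items():
        ratio = incoming / max(outgoing, 1)   # avoid division by zero
        if ratio > ratio_threshold:
            flagged.append((node, round(ratio, 2)))
    return flagged

print(suspected_cheaters({"a": (900, 850), "b": (1200, 150)}))   # "b" looks like a free-rider
```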
W-MAC: A Workload-Aware MAC Protocol for Heterogeneous Convergecast in Wireless Sensor Networks
Xia, Ming; Dong, Yabo; Lu, Dongming
2011-01-01
The power consumption and latency of existing MAC protocols for wireless sensor networks (WSNs) are high in heterogeneous convergecast, where each sensor node generates different amounts of data in one convergecast operation. To solve this problem, we present W-MAC, a workload-aware MAC protocol for heterogeneous convergecast in WSNs. A subtree-based iterative cascading scheduling mechanism and a workload-aware time slice allocation mechanism are proposed to minimize the power consumption of nodes, while offering a low data latency. In addition, an efficient schedule adjustment mechanism is provided for adapting to data traffic variation and network topology change. Analytical and simulation results show that the proposed protocol provides a significant energy saving and latency reduction in heterogeneous convergecast, and can effectively support data aggregation to further improve the performance. PMID:22163753
Young, Brian; King, Jonathan L; Budowle, Bruce; Armogida, Luigi
2017-01-01
Amplicon (targeted) sequencing by massively parallel sequencing (PCR-MPS) is a potential method for use in forensic DNA analyses. In this application, PCR-MPS may supplement or replace other instrumental analysis methods such as capillary electrophoresis and Sanger sequencing for STR and mitochondrial DNA typing, respectively. PCR-MPS also may enable the expansion of forensic DNA analysis methods to include new marker systems such as single nucleotide polymorphisms (SNPs) and insertion/deletions (indels) that currently are assayable using various instrumental analysis methods including microarray and quantitative PCR. Acceptance of PCR-MPS as a forensic method will depend in part upon developing protocols and criteria that define the limitations of a method, including a defensible analytical threshold or method detection limit. This paper describes an approach to establish objective analytical thresholds suitable for multiplexed PCR-MPS methods. A definition is proposed for PCR-MPS method background noise, and an analytical threshold based on background noise is described.
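One common way to turn a background-noise model into an analytical threshold is to set the threshold at the mean noise level plus a multiple of its standard deviation, as in the hedged sketch below. The read counts and the factor k are illustrative; the paper defines its own noise distribution and threshold rule.

```python
import statistics

def analytical_threshold(noise_reads, k=3.0):
    """Threshold = mean background read count + k standard deviations (one common convention)."""
    return statistics.mean(noise_reads) + k * statistics.stdev(noise_reads)

noise = [4, 7, 2, 9, 5, 3, 6, 8]    # hypothetical non-allelic read counts at one locus
print(analytical_threshold(noise))
```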
Silva, Thalita G; de Araujo, William R; Muñoz, Rodrigo A A; Richter, Eduardo M; Santana, Mário H P; Coltro, Wendell K T; Paixão, Thiago R L C
2016-05-17
We report the development of a simple, portable, low-cost, high-throughput visual colorimetric paper-based analytical device for the detection of procaine in seized cocaine samples. The interference of most common cutting agents found in cocaine samples was verified, and a novel electrochemical approach was used for sample pretreatment in order to increase the selectivity. Under the optimized experimental conditions, a linear analytical curve was obtained for procaine concentrations ranging from 5 to 60 μmol L(-1), with a detection limit of 0.9 μmol L(-1). The accuracy of the proposed method was evaluated using seized cocaine samples and an addition and recovery protocol.
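The calibration treatment implied by the abstract (a linear fit over 5-60 µmol/L and a detection limit below 1 µmol/L) can be reproduced in outline with an ordinary least-squares fit and the usual LOD = 3.3·s/slope rule, as sketched below with made-up intensities and an assumed blank standard deviation.

```python
import numpy as np

conc = np.array([5, 10, 20, 30, 40, 50, 60], dtype=float)         # µmol/L
signal = np.array([12, 23, 45, 68, 90, 113, 134], dtype=float)    # colour intensity, hypothetical

slope, intercept = np.polyfit(conc, signal, 1)    # linear calibration curve
sd_blank = 0.35                                   # assumed standard deviation of the blank signal
lod = 3.3 * sd_blank / slope                      # common LOD convention
print(f"slope={slope:.2f}, LOD ~ {lod:.2f} umol/L")
```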
Code of Federal Regulations, 2014 CFR
2014-07-01
..., and Process Heaters Using Portable Analyzers”, EMC Conditional Test Protocol 30 (CTM-30), Gas Research... cell design(s) conforming to this protocol will determine the analytical range for each gas component..., selective gas scrubbers, etc.) to meet the design specifications of this protocol. Do not make changes to...
Code of Federal Regulations, 2013 CFR
2013-07-01
..., and Process Heaters Using Portable Analyzers”, EMC Conditional Test Protocol 30 (CTM-30), Gas Research... cell design(s) conforming to this protocol will determine the analytical range for each gas component..., selective gas scrubbers, etc.) to meet the design specifications of this protocol. Do not make changes to...
MICROORGANISMS IN BIOSOLIDS: ANALYTICAL METHODS DEVELOPMENT, STANDARDIZATION, AND VALIDATION
The objective of this presentation is to discuss pathogens of concern in biosolids, the analytical techniques used to evaluate microorganisms in biosolids, and to discuss standardization and validation of analytical protocols for microbes within such a complex matrix. Implicatio...
NASA Astrophysics Data System (ADS)
Liu, Yongfang; Zhao, Yu; Chen, Guanrong
2016-11-01
This paper studies the distributed consensus and containment problems for a group of harmonic oscillators with a directed communication topology. First, for consensus without a leader, a class of distributed consensus protocols is designed by using motion planning and Pontryagin's principle. The proposed protocol only requires relative information measurements at the sampling instants, without requiring information exchange over the sampled interval. By using stability theory and the properties of stochastic matrices, it is proved that the distributed consensus problem can be solved in the motion planning framework. Second, for the case with multiple leaders, a class of distributed containment protocols is developed for followers such that their positions and velocities can ultimately converge to the convex hull formed by those of the leaders. Compared with the existing consensus algorithms, a remarkable advantage of the proposed sampled-data-based protocols is that the sampling periods, communication topologies and control gains are all decoupled and can be separately designed, which relaxes many restrictions in controller design. Finally, some numerical examples are given to illustrate the effectiveness of the analytical results.
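A much simpler, discrete-time analogue of the role played by stochastic matrices in such protocols is the classic consensus iteration x(k+1) = W x(k) with a row-stochastic weight matrix W, sketched below. This toy example only illustrates why stochastic-matrix properties yield agreement; the paper's sampled-data oscillator protocol is considerably richer.

```python
import numpy as np

# Row-stochastic weights over a strongly connected directed graph (assumed values).
W = np.array([[0.6, 0.4, 0.0],
              [0.3, 0.4, 0.3],
              [0.0, 0.5, 0.5]])
x = np.array([1.0, 5.0, -2.0])   # initial agent states

for _ in range(50):
    x = W @ x                    # repeated weighted averaging drives the states together
print(x)                         # all entries end up close to a common consensus value
```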
Analytical approach to cross-layer protocol optimization in wireless sensor networks
NASA Astrophysics Data System (ADS)
Hortos, William S.
2008-04-01
In the distributed operations of route discovery and maintenance, strong interaction occurs across mobile ad hoc network (MANET) protocol layers. Quality of service (QoS) requirements of multimedia service classes must be satisfied by the cross-layer protocol, along with minimization of the distributed power consumption at nodes and along routes to battery-limited energy constraints. In previous work by the author, cross-layer interactions in the MANET protocol are modeled in terms of a set of concatenated design parameters and associated resource levels by multivariate point processes (MVPPs). Determination of the "best" cross-layer design is carried out using the optimal control of martingale representations of the MVPPs. In contrast to the competitive interaction among nodes in a MANET for multimedia services using limited resources, the interaction among the nodes of a wireless sensor network (WSN) is distributed and collaborative, based on the processing of data from a variety of sensors at nodes to satisfy common mission objectives. Sensor data originates at the nodes at the periphery of the WSN, is successively transported to other nodes for aggregation based on information-theoretic measures of correlation and ultimately sent as information to one or more destination (decision) nodes. The "multimedia services" in the MANET model are replaced by multiple types of sensors, e.g., audio, seismic, imaging, thermal, etc., at the nodes; the QoS metrics associated with MANETs become those associated with the quality of fused information flow, i.e., throughput, delay, packet error rate, data correlation, etc. Significantly, the essential analytical approach to MANET cross-layer optimization, now based on the MVPPs for discrete random events occurring in the WSN, can be applied to develop the stochastic characteristics and optimality conditions for cross-layer designs of sensor network protocols. Functional dependencies of WSN performance metrics are described in terms of the concatenated protocol parameters. New source-to-destination routes are sought that optimize cross-layer interdependencies to achieve the "best available" performance in the WSN. The protocol design, modified from a known reactive protocol, adapts the achievable performance to the transient network conditions and resource levels. Control of network behavior is realized through the conditional rates of the MVPPs. Optimal cross-layer protocol parameters are determined by stochastic dynamic programming conditions derived from models of transient packetized sensor data flows. Moreover, the defining conditions for WSN configurations, grouping sensor nodes into clusters and establishing data aggregation at processing nodes within those clusters, lead to computationally tractable solutions to the stochastic differential equations that describe network dynamics. Closed-form solution characteristics provide an alternative to the "directed diffusion" methods for resource-efficient WSN protocols published previously by other researchers. Performance verification of the resulting cross-layer designs is found by embedding the optimality conditions for the protocols in actual WSN scenarios replicated in a wireless network simulation environment. Performance tradeoffs among protocol parameters remain for a sequel to the paper.
Analytic few-photon scattering in waveguide QED
NASA Astrophysics Data System (ADS)
Hurst, David L.; Kok, Pieter
2018-04-01
We develop an approach to light-matter coupling in waveguide QED based upon scattering amplitudes evaluated via Dyson series. For optical states containing more than single photons, terms in this series become increasingly complex, and we provide a diagrammatic recipe for their evaluation, which is capable of yielding analytic results. Our method fully specifies a combined emitter-optical state that permits investigation of light-matter entanglement generation protocols. We use our expressions to study two-photon scattering from a Λ -system and find that the pole structure of the transition amplitude is dramatically altered as the two ground states are tuned from degeneracy.
Critical factors for assembling a high volume of DNA barcodes
Hajibabaei, Mehrdad; deWaard, Jeremy R; Ivanova, Natalia V; Ratnasingham, Sujeevan; Dooh, Robert T; Kirk, Stephanie L; Mackie, Paula M; Hebert, Paul D.N
2005-01-01
Large-scale DNA barcoding projects are now moving toward activation while the creation of a comprehensive barcode library for eukaryotes will ultimately require the acquisition of some 100 million barcodes. To satisfy this need, analytical facilities must adopt protocols that can support the rapid, cost-effective assembly of barcodes. In this paper we discuss the prospects for establishing high volume DNA barcoding facilities by evaluating key steps in the analytical chain from specimens to barcodes. Alliances with members of the taxonomic community represent the most effective strategy for provisioning the analytical chain with specimens. The optimal protocols for DNA extraction and subsequent PCR amplification of the barcode region depend strongly on their condition, but production targets of 100K barcode records per year are now feasible for facilities working with compliant specimens. The analysis of museum collections is currently challenging, but PCR cocktails that combine polymerases with repair enzyme(s) promise future success. Barcode analysis is already a cost-effective option for species identification in some situations and this will increasingly be the case as reference libraries are assembled and analytical protocols are simplified. PMID:16214753
Suhr, Anna Catharina; Vogeser, Michael; Grimm, Stefanie H
2016-05-30
For quotable quantitative analysis of endogenous analytes in complex biological samples by isotope dilution LC-MS/MS, the creation of appropriate calibrators is a challenge, since analyte-free authentic material is in general not available. Thus, surrogate matrices are often used to prepare calibrators and controls. However, currently employed validation protocols do not include specific experiments to verify the suitability of a surrogate matrix calibration for quantification of authentic matrix samples. The aim of the study was the development of a novel validation experiment to test whether surrogate matrix-based calibrators enable correct quantification of authentic matrix samples. The key element of the novel validation experiment is the inversion of nonlabelled analytes and their stable isotope labelled (SIL) counterparts with respect to their functions, i.e. the SIL compound is the analyte and the nonlabelled substance is employed as the internal standard. As a consequence, both surrogate and authentic matrix are analyte-free regarding SIL analytes, which allows a comparison of both matrices. We called this approach the Isotope Inversion Experiment. As a figure of merit, we defined the accuracy of inverse quality controls in authentic matrix quantified by means of a surrogate matrix calibration curve. As a proof-of-concept application, an LC-MS/MS assay addressing six corticosteroids (cortisol, cortisone, corticosterone, 11-deoxycortisol, 11-deoxycorticosterone, and 17-OH-progesterone) was chosen. The integration of the Isotope Inversion Experiment in the validation protocol for the steroid assay was successfully realized. The accuracy results of the inverse quality controls were, all in all, very satisfactory. As a consequence, the suitability of a surrogate matrix calibration for quantification of the targeted steroids in human serum as authentic matrix could be successfully demonstrated. The Isotope Inversion Experiment fills a gap in the validation process for LC-MS/MS assays quantifying endogenous analytes. We consider it a valuable and convenient tool to evaluate the correct quantification of authentic matrix samples based on a calibration curve in surrogate matrix. Copyright © 2016 Elsevier B.V. All rights reserved.
León, Ileana R.; Schwämmle, Veit; Jensen, Ole N.; Sprenger, Richard R.
2013-01-01
The majority of mass spectrometry-based protein quantification studies uses peptide-centric analytical methods and thus strongly relies on efficient and unbiased protein digestion protocols for sample preparation. We present a novel objective approach to assess protein digestion efficiency using a combination of qualitative and quantitative liquid chromatography-tandem MS methods and statistical data analysis. In contrast to previous studies we employed both standard qualitative as well as data-independent quantitative workflows to systematically assess trypsin digestion efficiency and bias using mitochondrial protein fractions. We evaluated nine trypsin-based digestion protocols, based on standard in-solution or on spin filter-aided digestion, including new optimized protocols. We investigated various reagents for protein solubilization and denaturation (dodecyl sulfate, deoxycholate, urea), several trypsin digestion conditions (buffer, RapiGest, deoxycholate, urea), and two methods for removal of detergents before analysis of peptides (acid precipitation or phase separation with ethyl acetate). Our data-independent quantitative liquid chromatography-tandem MS workflow quantified over 3700 distinct peptides with 96% completeness between all protocols and replicates, with an average 40% protein sequence coverage and an average of 11 peptides identified per protein. Systematic quantitative and statistical analysis of physicochemical parameters demonstrated that deoxycholate-assisted in-solution digestion combined with phase transfer allows for efficient, unbiased generation and recovery of peptides from all protein classes, including membrane proteins. This deoxycholate-assisted protocol was also optimal for spin filter-aided digestions as compared with existing methods. PMID:23792921
Development of the Diabetes Technology Society Blood Glucose Monitor System Surveillance Protocol
Klonoff, David C.; Lias, Courtney; Beck, Stayce; Parkes, Joan Lee; Kovatchev, Boris; Vigersky, Robert A.; Arreaza-Rubin, Guillermo; Burk, Robert D.; Kowalski, Aaron; Little, Randie; Nichols, James; Petersen, Matt; Rawlings, Kelly; Sacks, David B.; Sampson, Eric; Scott, Steve; Seley, Jane Jeffrie; Slingerland, Robbert; Vesper, Hubert W.
2015-01-01
Background: Inaccurate blood glucose monitoring systems (BGMSs) can lead to adverse health effects. The Diabetes Technology Society (DTS) Surveillance Program for cleared BGMSs is intended to protect people with diabetes from inaccurate, unreliable BGMS products that are currently on the market in the United States. The Surveillance Program will provide an independent assessment of the analytical performance of cleared BGMSs. Methods: The DTS BGMS Surveillance Program Steering Committee included experts in glucose monitoring, surveillance testing, and regulatory science. Over one year, the committee engaged in meetings and teleconferences aiming to describe how to conduct BGMS surveillance studies in a scientifically sound manner that is in compliance with good clinical practice and all relevant regulations. Results: A clinical surveillance protocol was created that contains performance targets and analytical accuracy-testing studies with marketed BGMS products conducted by qualified clinical and laboratory sites. This protocol, entitled “Protocol for the Diabetes Technology Society Blood Glucose Monitor System Surveillance Program”, is attached as supplementary material. Conclusion: This program is needed because, currently, once a BGMS product has been cleared for use by the FDA, no systematic postmarket Surveillance Program exists that can monitor analytical performance and detect potential problems. This protocol will allow identification of inaccurate and unreliable BGMSs currently available on the US market. The DTS Surveillance Program will provide BGMS manufacturers a benchmark to understand the postmarket analytical performance of their products. Furthermore, patients, health care professionals, payers, and regulatory agencies will be able to use the results of the study to make informed decisions to, respectively, select, prescribe, finance, and regulate BGMSs on the market. PMID:26481642
Development of the Diabetes Technology Society Blood Glucose Monitor System Surveillance Protocol.
Klonoff, David C; Lias, Courtney; Beck, Stayce; Parkes, Joan Lee; Kovatchev, Boris; Vigersky, Robert A; Arreaza-Rubin, Guillermo; Burk, Robert D; Kowalski, Aaron; Little, Randie; Nichols, James; Petersen, Matt; Rawlings, Kelly; Sacks, David B; Sampson, Eric; Scott, Steve; Seley, Jane Jeffrie; Slingerland, Robbert; Vesper, Hubert W
2016-05-01
Inaccurate blood glucose monitoring systems (BGMSs) can lead to adverse health effects. The Diabetes Technology Society (DTS) Surveillance Program for cleared BGMSs is intended to protect people with diabetes from inaccurate, unreliable BGMS products that are currently on the market in the United States. The Surveillance Program will provide an independent assessment of the analytical performance of cleared BGMSs. The DTS BGMS Surveillance Program Steering Committee included experts in glucose monitoring, surveillance testing, and regulatory science. Over one year, the committee engaged in meetings and teleconferences aiming to describe how to conduct BGMS surveillance studies in a scientifically sound manner that is in compliance with good clinical practice and all relevant regulations. A clinical surveillance protocol was created that contains performance targets and analytical accuracy-testing studies with marketed BGMS products conducted by qualified clinical and laboratory sites. This protocol, entitled "Protocol for the Diabetes Technology Society Blood Glucose Monitor System Surveillance Program", is attached as supplementary material. This program is needed because, currently, once a BGMS product has been cleared for use by the FDA, no systematic postmarket Surveillance Program exists that can monitor analytical performance and detect potential problems. This protocol will allow identification of inaccurate and unreliable BGMSs currently available on the US market. The DTS Surveillance Program will provide BGMS manufacturers a benchmark to understand the postmarket analytical performance of their products. Furthermore, patients, health care professionals, payers, and regulatory agencies will be able to use the results of the study to make informed decisions to, respectively, select, prescribe, finance, and regulate BGMSs on the market. © 2015 Diabetes Technology Society.
Code of Federal Regulations, 2013 CFR
2013-07-01
... and laboratory purposes. Pursuant to Decision XI/15 of the Parties to the Montreal Protocol, effective... laboratory and analytical purposes is authorized provided that these laboratory and analytical chemicals..., restricted to laboratory use and analytical purposes and specifying that used or surplus substances should be...
Code of Federal Regulations, 2010 CFR
2010-07-01
... and laboratory purposes. Pursuant to Decision XI/15 of the Parties to the Montreal Protocol, effective... laboratory and analytical purposes is authorized provided that these laboratory and analytical chemicals..., restricted to laboratory use and analytical purposes and specifying that used or surplus substances should be...
Code of Federal Regulations, 2014 CFR
2014-07-01
... and laboratory purposes. Pursuant to Decision XI/15 of the Parties to the Montreal Protocol, effective... laboratory and analytical purposes is authorized provided that these laboratory and analytical chemicals..., restricted to laboratory use and analytical purposes and specifying that used or surplus substances should be...
Code of Federal Regulations, 2012 CFR
2012-07-01
... and laboratory purposes. Pursuant to Decision XI/15 of the Parties to the Montreal Protocol, effective... laboratory and analytical purposes is authorized provided that these laboratory and analytical chemicals..., restricted to laboratory use and analytical purposes and specifying that used or surplus substances should be...
Code of Federal Regulations, 2011 CFR
2011-07-01
... and laboratory purposes. Pursuant to Decision XI/15 of the Parties to the Montreal Protocol, effective... laboratory and analytical purposes is authorized provided that these laboratory and analytical chemicals..., restricted to laboratory use and analytical purposes and specifying that used or surplus substances should be...
Plant gum identification in historic artworks
Granzotto, Clara; Arslanoglu, Julie; Rolando, Christian; Tokarski, Caroline
2017-01-01
We describe an integrated and straightforward new analytical protocol that identifies plant gums from various sample sources including cultural heritage. Our approach is based on the identification of saccharidic fingerprints using mass spectrometry following controlled enzymatic hydrolysis. We developed an enzyme cocktail suitable for plant gums of unknown composition. Distinctive MS profiles of gums such as arabic, cherry and locust-bean gums were successfully identified. A wide range of oligosaccharidic combinations of pentose, hexose, deoxyhexose and hexuronic acid were accurately identified in gum arabic whereas cherry and locust bean gums showed respectively PentxHexy and Hexn profiles. Optimized for low sample quantities, the analytical protocol was successfully applied to contemporary and historic samples including ‘Colour Box Charles Roberson & Co’ dating 1870s and drawings from the American painter Arthur Dove (1880–1946). This is the first time that a gum is accurately identified in a cultural heritage sample using structural information. Furthermore, this methodology is applicable to other domains (food, cosmetic, pharmaceutical, biomedical). PMID:28425501
Chen, Ping-Hung; Chen, Shun-Niang; Tseng, Sheng-Hao; Deng, Ming-Jay; Lin, Yang-Wei; Sun, Yuh-Chang
2016-01-01
This paper describes a fabrication protocol for a dipole-assisted solid phase extraction (SPE) microchip available for trace metal analysis in water samples. A brief overview of the evolution of chip-based SPE techniques is provided. This is followed by an introduction to specific polymeric materials and their role in SPE. To develop an innovative dipole-assisted SPE technique, a chlorine (Cl)-containing SPE functionality was implanted into a poly(methyl methacrylate) (PMMA) microchip. Herein, diverse analytical techniques including contact angle analysis, Raman spectroscopic analysis, and laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS) analysis were employed to validate the utility of the implantation protocol of the C-Cl moieties on the PMMA. The analytical results of the X-ray absorption near-edge structure (XANES) analysis also demonstrated the feasibility of the Cl-containing PMMA used as an extraction medium by virtue of the dipole-ion interactions between the highly electronegative C-Cl moieties and the positively charged metal ions. PMID:27584954
Buch, Jesse S; Clark, Genevieve H; Cahill, Roberta; Thatcher, Brendon; Smith, Peter; Chandrashekar, Ramaswamy; Leutenegger, Christian M; O'Connor, Thomas P; Beall, Melissa J
2017-09-01
Feline leukemia virus (FeLV) is an oncogenic retrovirus of cats. Immunoassays for the p27 core protein of FeLV aid in the detection of FeLV infections. Commercial microtiter-plate ELISAs have rapid protocols and visual result interpretation, limiting their usefulness in high-throughput situations. The purpose of our study was to validate the PetChek FeLV 15 ELISA, which is designed for the reference laboratory, and incorporates sequential, orthogonal screening and confirmatory protocols. A cutoff for the screening assay was established with 100% accuracy using 309 feline samples (244 negative, 65 positive) defined by the combined results of FeLV PCR and an independent reference p27 antigen ELISA. Precision of the screening assay was measured using a panel of 3 samples (negative, low-positive, and high-positive). The intra-assay coefficient of variation (CV) was 3.9-7.9%; the inter-assay CV was 6.0-8.6%. For the confirmatory assay, the intra-assay CV was 3.0-4.7%, and the inter-assay CV was 7.4-9.7%. The analytical sensitivity for p27 antigen was 3.7 ng/mL for inactivated whole FeLV and 1.2 ng/mL for purified recombinant FeLV p27. Analytical specificity was demonstrated based on the absence of cross-reactivity to related retroviruses. No interference was observed for samples containing added bilirubin, hemoglobin, or lipids. Based on these results, the new high-throughput design of the PetChek FeLV 15 ELISA makes it suitable for use in reference laboratory settings and maintains overall analytical performance.
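The precision figures quoted above follow the usual coefficient-of-variation definition (standard deviation divided by mean, expressed as a percentage). The sketch below shows that computation on hypothetical replicate values of a low-positive control; the numbers are not the study's data.

```python
import statistics

def percent_cv(values):
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

low_positive_replicates = [0.41, 0.44, 0.43, 0.40, 0.42, 0.45]   # assay units, hypothetical
print(f"intra-assay CV = {percent_cv(low_positive_replicates):.1f}%")
```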
Technical pre-analytical effects on the clinical biochemistry of Atlantic salmon (Salmo salar L.).
Braceland, M; Houston, K; Ashby, A; Matthews, C; Haining, H; Rodger, H; Eckersall, P D
2017-01-01
Clinical biochemistry has long been utilized in human and veterinary medicine as a vital diagnostic tool, but despite occasional studies showing its usefulness in monitoring health status in Atlantic salmon (Salmo salar L.), it has not yet been widely utilized within the aquaculture industry. This is due, in part, to a lack of an agreed protocol for collection and processing of blood prior to analysis. Moreover, while the analytical phase of clinical biochemistry is well controlled, there is a growing understanding that technical pre-analytical variables can influence analyte concentrations or activities. In addition, post-analytical interpretation of treatment effects is variable in the literature, thus making the true effect of sample treatment hard to evaluate. Therefore, a number of pre-analytical treatments have been investigated to examine their effect on analyte concentrations and activities. In addition, reference ranges for salmon plasma biochemical analytes have been established to inform veterinary practitioners and the aquaculture industry of the importance of clinical biochemistry in health and disease monitoring. Furthermore, a standardized protocol for blood collection has been proposed. © 2016 The Authors Journal of Fish Diseases Published by John Wiley & Sons Ltd.
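Reference ranges of the kind established here are often reported as the central 95% of values from healthy individuals. The sketch below computes a nonparametric interval (2.5th to 97.5th percentiles) on simulated data; the analyte, units and values are purely illustrative and not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(1)
plasma_analyte = rng.normal(5.5, 1.2, 120)   # simulated values from 120 healthy fish (hypothetical)

lower, upper = np.percentile(plasma_analyte, [2.5, 97.5])
print(f"nonparametric reference interval: {lower:.2f} to {upper:.2f} (same units as input)")
```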
DOT National Transportation Integrated Search
2016-12-25
The key objectives of this study were to: 1. Develop advanced analytical techniques that make use of a dynamically configurable connected vehicle message protocol to predict traffic flow regimes in near-real time in a virtual environment and examine ...
Performance Analysis of the IEEE 802.11p Multichannel MAC Protocol in Vehicular Ad Hoc Networks
2017-01-01
Vehicular Ad Hoc Networks (VANETs) employ multichannel to provide a variety of safety and non-safety applications, based on the IEEE 802.11p and IEEE 1609.4 protocols. The safety applications require timely and reliable transmissions, while the non-safety applications require efficient and high throughput. In the IEEE 1609.4 protocol, operating interval is divided into alternating Control Channel (CCH) interval and Service Channel (SCH) interval with an identical length. During the CCH interval, nodes transmit safety-related messages and control messages, and Enhanced Distributed Channel Access (EDCA) mechanism is employed to allow four Access Categories (ACs) within a station with different priorities according to their criticality for the vehicle’s safety. During the SCH interval, the non-safety massages are transmitted. An analytical model is proposed in this paper to evaluate performance, reliability and efficiency of the IEEE 802.11p and IEEE 1609.4 protocols. The proposed model improves the existing work by taking serval aspects and the character of multichannel switching into design consideration. Extensive performance evaluations based on analysis and simulation help to validate the accuracy of the proposed model and analyze the capabilities and limitations of the IEEE 802.11p and IEEE 1609.4 protocols, and enhancement suggestions are given. PMID:29231882
Performance Analysis of the IEEE 802.11p Multichannel MAC Protocol in Vehicular Ad Hoc Networks.
Song, Caixia
2017-12-12
Vehicular Ad Hoc Networks (VANETs) employ multichannel to provide a variety of safety and non-safety applications, based on the IEEE 802.11p and IEEE 1609.4 protocols. The safety applications require timely and reliable transmissions, while the non-safety applications require efficient and high throughput. In the IEEE 1609.4 protocol, operating interval is divided into alternating Control Channel (CCH) interval and Service Channel (SCH) interval with an identical length. During the CCH interval, nodes transmit safety-related messages and control messages, and Enhanced Distributed Channel Access (EDCA) mechanism is employed to allow four Access Categories (ACs) within a station with different priorities according to their criticality for the vehicle's safety. During the SCH interval, the non-safety massages are transmitted. An analytical model is proposed in this paper to evaluate performance, reliability and efficiency of the IEEE 802.11p and IEEE 1609.4 protocols. The proposed model improves the existing work by taking serval aspects and the character of multichannel switching into design consideration. Extensive performance evaluations based on analysis and simulation help to validate the accuracy of the proposed model and analyze the capabilities and limitations of the IEEE 802.11p and IEEE 1609.4 protocols, and enhancement suggestions are given.
XPS Protocol for the Characterization of Pristine and Functionalized Single Wall Carbon Nanotubes
NASA Technical Reports Server (NTRS)
Sosa, E. D.; Allada, R.; Huffman, C. B.; Arepalli, S.
2009-01-01
Recent interest in developing new applications for carbon nanotubes (CNT) has fueled the need to use accurate macroscopic and nanoscopic techniques to characterize and understand their chemistry. X-ray photoelectron spectroscopy (XPS) has proved to be a useful analytical tool for nanoscale surface characterization of materials including carbon nanotubes. Recent nanotechnology research at NASA Johnson Space Center (NASA-JSC) helped to establish a characterization protocol for quality assessment for single wall carbon nanotubes (SWCNTs). Here, a review of some of the major factors of the XPS technique that can influence the quality of analytical data, suggestions for methods to maximize the quality of data obtained by XPS, and the development of a protocol for XPS characterization as a complementary technique for analyzing the purity and surface characteristics of SWCNTs is presented. The XPS protocol is then applied to a number of experiments including impurity analysis and the study of chemical modifications for SWCNTs.
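Quantitative XPS reporting of elemental composition typically converts peak areas to atomic percent by dividing each area by its relative sensitivity factor (RSF) and normalizing, as in the sketch below. The peak areas and RSFs are placeholders; actual RSFs depend on the instrument and the protocol in use.

```python
peaks = {                      # element: (peak area, relative sensitivity factor), assumed values
    "C 1s": (12000.0, 0.296),
    "O 1s": (1500.0, 0.711),
    "Fe 2p": (300.0, 2.957),   # residual catalyst signal, hypothetical
}

normalized = {name: area / rsf for name, (area, rsf) in peaks.items()}
total = sum(normalized.values())
atomic_percent = {name: 100.0 * value / total for name, value in normalized.items()}
print(atomic_percent)
```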
Base Realignment and Closure Environmental Evaluation (BRAC EE) Fort Devens, Massachusetts
1995-09-01
Contents fragments: Transformer Sites Not Sampled; 2.3.2 Transformer Sites Sampled; 2.4 Soil Sampling Protocol and Analytical Program; ... Evaluation (AREE) 66. The study included evaluating the current PCB Transformer Management Program administered by the Fort Devens Environmental Management Office (EMO), the Fort Devens Spill Contingency Plan, and the ongoing transformer inspection program. Personnel in both the Fort Devens EMO and the Fort ...
Comparison of methods for estimating density of forest songbirds from point counts
Jennifer L. Reidy; Frank R. Thompson; J. Wesley. Bailey
2011-01-01
New analytical methods have been promoted for estimating the probability of detection and density of birds from count data but few studies have compared these methods using real data. We compared estimates of detection probability and density from distance and time-removal models and survey protocols based on 5- or 10-min counts and outer radii of 50 or 100 m. We...
Maier, Barbara; Vogeser, Michael
2013-04-01
Isotope dilution LC-MS/MS methods used in the clinical laboratory typically involve multi-point external calibration in each analytical series. Our aim was to test the hypothesis that determination of target analyte concentrations directly derived from the relation of the target analyte peak area to the peak area of a corresponding stable isotope labelled internal standard compound [direct isotope dilution analysis (DIDA)] is not inferior to conventional external calibration with respect to accuracy and reproducibility. Quality control samples and human serum pools were analysed in a comparative validation protocol for cortisol as an exemplary analyte by LC-MS/MS. Accuracy and reproducibility were compared between quantification involving a six-point external calibration function and result calculation based merely on peak area ratios of unlabelled and labelled analyte. Both quantification approaches resulted in similar accuracy and reproducibility. For specified analytes, reliable analyte quantification directly derived from the ratio of peak areas of labelled and unlabelled analyte, without the need for a time-consuming multi-point calibration series, is possible. This DIDA approach is of considerable practical importance for the application of LC-MS/MS in the clinical laboratory where short turnaround times often have high priority.
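The arithmetic behind DIDA is the direct proportionality between the analyte/internal-standard peak-area ratio and the analyte concentration, given a known amount of labelled standard and (implicitly) equal response factors. The numbers below are illustrative, not the paper's data.

```python
area_cortisol = 8.4e5        # peak area of unlabelled cortisol (hypothetical)
area_d4_cortisol = 7.0e5     # peak area of the stable-isotope-labelled internal standard (hypothetical)
conc_d4_cortisol = 100.0     # nmol/L of labelled standard spiked into the sample (assumed)

# Direct isotope dilution: concentration from the area ratio, assuming equal response factors.
conc_cortisol = (area_cortisol / area_d4_cortisol) * conc_d4_cortisol
print(f"cortisol ~ {conc_cortisol:.1f} nmol/L")
```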
Zarzycki, Paweł K; Portka, Joanna K
2015-09-01
Pentacyclic triterpenoids, particularly hopanoids, are organism-specific compounds and are generally considered useful biomarkers that allow fingerprinting and classification of biological, environmental and geological samples. Simultaneous quantification of various hopanoids, together with a battery of related non-polar and low-molecular-mass compounds, may provide principal information for geochemical and environmental research focusing on both modern and ancient investigations. Target compounds can be derived from microbial biomass, water columns, sediments, coals, crude fossils or rocks. This creates a number of analytical problems due to the differing composition of the analytical matrices and interfering compounds, and therefore proper optimization of quantification protocols for such biomarkers remains a challenge. In this work we summarize typical analytical protocols that have recently been applied for quantification of hopanoid-like compounds from different samples. The main steps, including extraction of the components of interest, pre-purification, fractionation, derivatization and quantification involving gas (1D and 2D) as well as liquid separation techniques (liquid-liquid extraction, solid-phase extraction, planar and low-resolution column chromatography, high-performance liquid chromatography), are described and discussed from a practical point of view, mainly based on experimental papers published within the last two years, in which a significant increase in hopanoid research was noticed. The second aim of this review is to describe the latest research trends concerning the determination of hopanoids and related low-molecular-mass lipids analyzed in various samples including sediments, rocks, coals, crude oils and plant fossils as well as stromatolites and microbial biomass cultivated under different conditions. It has been found that the majority of the most recent papers are based on a uni- or bivariate approach to complex data analysis. Data interpretation involves a number of physicochemical parameters and hopanoid quantities, or given biomarker mass ratios, derived from high-throughput separation and detection systems, typically GC-MS and HPLC-MS. Based on quantitative data reported in recently published experimental works, it has been demonstrated that multivariate data analysis using, e.g., principal component computations may significantly extend our knowledge concerning proper biomarker selection and sample classification by means of hopanoids and related non-polar compounds. Copyright © 2015 Elsevier Ltd. All rights reserved.
Bazakos, Christos; Khanfir, Emna; Aoun, Mariem; Spano, Thodhoraq; Zein, Zeina El; Chalak, Lamis; Riachy, Milad El; Abou-Sleymane, Gretta; Ali, Sihem Ben; Grati Kammoun, Naziha; Kalaitzis, Panagiotis
2016-07-01
Authentication and traceability of extra virgin olive oil is a challenging research task due to the complexity of fraudulent practices. In this context, the monovarietal olive oils of Protected Designation of Origin (PDO) and Protected Geographical Indication (PGI) require new tests and cutting edge analytical technologies to detect mislabeling and misleading origin. Toward this direction, DNA-based technologies could serve as a complementary to the analytical techniques assay. Single nucleotide polymorphisms are ideal molecular markers since they require short PCR analytical targets which are a prerequisite for forensic applications in olive oil sector. In the present study, a small number of polymorphic SNPs were used with an SNP-based PCR-RFLP capillary electrophoresis platform to discriminate six out of 13 monovarietal olive oils of Mediterranean origin from three different countries, Greece, Tunisia, and Lebanon. Moreover, the high sensitivity of capillary electrophoresis in combination with the DNA extraction protocol lowered the limit of detection to 10% in an admixture of Tsounati in a Koroneiki olive oil matrix. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Cache-based error recovery for shared memory multiprocessor systems
NASA Technical Reports Server (NTRS)
Wu, Kun-Lung; Fuchs, W. Kent; Patel, Janak H.
1989-01-01
A multiprocessor cache-based checkpointing and recovery scheme for recovering from transient processor errors in a shared-memory multiprocessor with private caches is presented. New implementation techniques that use checkpoint identifiers and recovery stacks to reduce performance degradation in processor utilization during normal execution are examined. This cache-based checkpointing technique prevents rollback propagation, provides for rapid recovery, and can be integrated into standard cache coherence protocols. An analytical model is used to estimate the relative performance of the scheme during normal execution. Extensions that take error latency into account are presented.
Sreemany, Arpita; Bera, Melinda Kumar; Sarkar, Anindya
2017-12-30
The elaborate sampling and analytical protocol associated with conventional dual-inlet isotope ratio mass spectrometry has long hindered high-resolution climate studies from biogenic accretionary carbonates. Laser-based on-line systems, in comparison, produce rapid data, but suffer from unresolvable matrix effects. It is, therefore, necessary to resolve these matrix effects to take advantage of the automated laser-based method. Two marine bivalve shells (one aragonite and one calcite) and one fish otolith (aragonite) were first analysed using a CO2 laser ablation system attached to a continuous flow isotope ratio mass spectrometer under different experimental conditions (different laser power, sample untreated vs vacuum roasted). The shells and the otolith were then micro-drilled and the isotopic compositions of the powders were measured in a dual-inlet isotope ratio mass spectrometer following the conventional acid digestion method. The vacuum-roasted samples (both aragonite and calcite) produced mean isotopic ratios (with a reproducibility of ±0.2 ‰ for both δ18O and δ13C values) almost identical to the values obtained using the conventional acid digestion method. As the isotopic ratios of the acid-digested samples fall within the analytical precision (±0.2 ‰) of the laser ablation system, this suggests the usefulness of the method for studying the biogenic accretionary carbonate matrix. When using laser-based continuous flow isotope ratio mass spectrometry for high-resolution isotopic measurements of biogenic carbonates, the employment of a vacuum-roasting step will reduce the matrix effect. This method will be of immense help to geologists and sclerochronologists in exploring short-term changes in climatic parameters (e.g. seasonality) in geological times. Copyright © 2017 John Wiley & Sons, Ltd.
Waters, Ryan A.; Fowler, Veronica L.; Armson, Bryony; Nelson, Noel; Gloster, John; Paton, David J.; King, Donald P.
2014-01-01
Rapid, field-based diagnostic assays are desirable tools for the control of foot-and-mouth disease (FMD). Current approaches involve either: 1) detection of FMD virus (FMDV) with immunochromatographic antigen lateral flow devices (LFD), which have relatively low analytical sensitivity, or 2) portable RT-qPCR, which has high analytical sensitivity but is expensive. Loop-mediated isothermal amplification (LAMP) may provide a platform upon which to develop field-based assays without these drawbacks. The objective of this study was to modify an FMDV-specific reverse transcription–LAMP (RT-LAMP) assay to enable detection of dual-labelled LAMP products with an LFD, and to evaluate simple sample processing protocols without nucleic acid extraction. The limit of detection of this assay was demonstrated to be equivalent to that of a laboratory-based real-time RT-qPCR assay and to have a 10,000-fold higher analytical sensitivity than the FMDV-specific antigen LFD currently used in the field. Importantly, this study demonstrated that FMDV RNA could be detected from epithelial suspensions without the need for prior RNA extraction, utilising a rudimentary heat source for amplification. Once optimised, this RT-LAMP-LFD protocol was able to detect multiple serotypes from field epithelial samples, in addition to detecting FMDV in the air surrounding infected cattle, pigs and sheep, including pre-clinical detection. This study describes the development and evaluation of an assay format, which may be used as a future basis for rapid and low-cost detection of FMDV. In addition, it provides "proof of concept" for the future use of LAMP assays to tackle other challenging diagnostic scenarios encompassing veterinary and human health. PMID:25165973
Analytical Protocols for Analysis of Organic Molecules in Mars Analog Materials
NASA Technical Reports Server (NTRS)
Mahaffy, Paul R.; Brinkerhoff, W.; Buch, A.; Demick, J.; Glavin, D. P.
2004-01-01
A range of analytical techniques and protocols that might be applied to in situ investigations of martian fines, ices, and rock samples are evaluated by analysis of organic molecules in Mars analogues. These simulants from terrestrial (i.e. tephra from Hawaii) or extraterrestrial (meteoritic) samples are examined by pyrolysis gas chromatography mass spectrometry (GCMS), organic extraction followed by chemical derivatization GCMS, and laser desorption mass spectrometry (LDMS). The combination of techniques imparts analysis breadth since each technique provides a unique analysis capability for certain classes of organic molecules.
Kochunov, Peter; Jahanshad, Neda; Sprooten, Emma; Nichols, Thomas E; Mandl, René C; Almasy, Laura; Booth, Tom; Brouwer, Rachel M; Curran, Joanne E; de Zubicaray, Greig I; Dimitrova, Rali; Duggirala, Ravi; Fox, Peter T; Hong, L Elliot; Landman, Bennett A; Lemaitre, Hervé; Lopez, Lorna M; Martin, Nicholas G; McMahon, Katie L; Mitchell, Braxton D; Olvera, Rene L; Peterson, Charles P; Starr, John M; Sussmann, Jessika E; Toga, Arthur W; Wardlaw, Joanna M; Wright, Margaret J; Wright, Susan N; Bastin, Mark E; McIntosh, Andrew M; Boomsma, Dorret I; Kahn, René S; den Braber, Anouk; de Geus, Eco J C; Deary, Ian J; Hulshoff Pol, Hilleke E; Williamson, Douglas E; Blangero, John; van 't Ent, Dennis; Thompson, Paul M; Glahn, David C
2014-07-15
Combining datasets across independent studies can boost statistical power by increasing the numbers of observations and can achieve more accurate estimates of effect sizes. This is especially important for genetic studies where a large number of observations are required to obtain sufficient power to detect and replicate genetic effects. There is a need to develop and evaluate methods for joint-analytical analyses of rich datasets collected in imaging genetics studies. The ENIGMA-DTI consortium is developing and evaluating approaches for obtaining pooled estimates of heritability through meta-and mega-genetic analytical approaches, to estimate the general additive genetic contributions to the intersubject variance in fractional anisotropy (FA) measured from diffusion tensor imaging (DTI). We used the ENIGMA-DTI data harmonization protocol for uniform processing of DTI data from multiple sites. We evaluated this protocol in five family-based cohorts providing data from a total of 2248 children and adults (ages: 9-85) collected with various imaging protocols. We used the imaging genetics analysis tool, SOLAR-Eclipse, to combine twin and family data from Dutch, Australian and Mexican-American cohorts into one large "mega-family". We showed that heritability estimates may vary from one cohort to another. We used two meta-analytical (the sample-size and standard-error weighted) approaches and a mega-genetic analysis to calculate heritability estimates across-population. We performed leave-one-out analysis of the joint estimates of heritability, removing a different cohort each time to understand the estimate variability. Overall, meta- and mega-genetic analyses of heritability produced robust estimates of heritability. Copyright © 2014 Elsevier Inc. All rights reserved.
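The two meta-analytic weighting schemes mentioned (sample-size weighting and standard-error, i.e. inverse-variance, weighting) reduce to simple weighted averages of the per-cohort estimates, as sketched below with invented cohort values.

```python
import numpy as np

h2 = np.array([0.55, 0.62, 0.48, 0.70, 0.58])   # per-cohort heritability estimates (hypothetical)
se = np.array([0.08, 0.10, 0.12, 0.07, 0.09])   # standard errors (hypothetical)
n  = np.array([400, 350, 250, 800, 450])        # cohort sizes (hypothetical)

h2_sample_size_weighted = np.sum(n * h2) / np.sum(n)

w = 1.0 / se**2                                 # inverse-variance (standard-error) weights
h2_inverse_variance_weighted = np.sum(w * h2) / np.sum(w)

print(h2_sample_size_weighted, h2_inverse_variance_weighted)
```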
Heavy vehicle driver workload assessment. Task 1, task analysis data and protocols review
DOT National Transportation Integrated Search
This report contains a review of available task analytic data and protocols pertinent to heavy vehicle operation and determination of the availability and relevance of such data to heavy vehicle driver workload assessment. Additionally, a preliminary...
Fidelity of Majorana-based quantum operations
NASA Astrophysics Data System (ADS)
Tanhayi Ahari, Mostafa; Ortiz, Gerardo; Seradjeh, Babak
2015-03-01
It is well known that the one-dimensional p-wave superconductor, the so-called Kitaev model, has topologically distinct phases that are distinguished by the presence of Majorana fermions. Owing to their topological protection, these Majorana fermions have emerged as candidates for fault-tolerant quantum computation. They furnish the operation of such a computation via processes that produce, braid, and annihilate them in pairs. In this work we study some of these processes from the dynamical perspective. In particular, we determine the fidelity of the Majorana fermions when they are produced or annihilated by tuning the system through the corresponding topological phase transition. For a simple linear protocol, we derive analytical expressions for the fidelity and test various perturbative schemes. For more general protocols, we present exact numerics. Our results are relevant for the operation of Majorana-based quantum gates and quantum memories.
Du, Guanyao; Yu, Jianjun
2016-01-01
This paper investigates the system achievable rate for the multiple-input multiple-output orthogonal frequency division multiplexing (MIMO-OFDM) system with an energy harvesting (EH) relay. Firstly we propose two protocols, time switching-based decode-and-forward relaying (TSDFR) and a flexible power splitting-based DF relaying (PSDFR) protocol by considering two practical receiver architectures, to enable the simultaneous information processing and energy harvesting at the relay. In PSDFR protocol, we introduce a temporal parameter to describe the time division pattern between the two phases which makes the protocol more flexible and general. In order to explore the system performance limit, we discuss the system achievable rate theoretically and formulate two optimization problems for the proposed protocols to maximize the system achievable rate. Since the problems are non-convex and difficult to solve, we first analyze them theoretically and get some explicit results, then design an augmented Lagrangian penalty function (ALPF) based algorithm for them. Numerical results are provided to validate the accuracy of our analytical results and the effectiveness of the proposed ALPF algorithm. It is shown that, PSDFR outperforms TSDFR to achieve higher achievable rate in such a MIMO-OFDM relaying system. Besides, we also investigate the impacts of the relay location, the number of antennas and the number of subcarriers on the system performance. Specifically, it is shown that, the relay position greatly affects the system performance of both protocols, and relatively worse achievable rate is achieved when the relay is placed in the middle of the source and the destination. This is different from the MIMO-OFDM DF relaying system without EH. Moreover, the optimal factor which indicates the time division pattern between the two phases in the PSDFR protocol is always above 0.8, which means that, the common division of the total transmission time into two equal phases in previous work applying PS-based receiver is not optimal.
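A stripped-down version of the time-switching trade-off reads as follows: a fraction alpha of each block is spent harvesting energy, and the remainder is split between the two DF hops, so the rate scales with (1-alpha)/2 while the relay's power grows with alpha. The sketch below searches this trade-off numerically for a single subcarrier under assumed channel gains; it is not the paper's TSDFR/PSDFR model or optimization.

```python
import numpy as np

def ts_df_rate(alpha, p_s=1.0, g_sr=1.0, g_rd=1.0, eta=0.6, noise=1e-3):
    """Achievable rate of a simplified time-switching DF relay link (assumed parameters)."""
    e_harvested = eta * alpha * p_s * g_sr          # energy harvested at the relay
    p_r = e_harvested / ((1.0 - alpha) / 2.0)       # relay power over its half of the remaining time
    snr_sr = p_s * g_sr / noise
    snr_rd = p_r * g_rd / noise
    return (1.0 - alpha) / 2.0 * np.log2(1.0 + min(snr_sr, snr_rd))

alphas = np.linspace(0.05, 0.95, 19)
best = max(alphas, key=ts_df_rate)
print(f"best alpha ~ {best:.2f}, rate ~ {ts_df_rate(best):.2f} bit/s/Hz")
```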
Channel MAC Protocol for Opportunistic Communication in Ad Hoc Wireless Networks
NASA Astrophysics Data System (ADS)
Ashraf, Manzur; Jayasuriya, Aruna; Perreau, Sylvie
2008-12-01
Despite significant research effort, the performance of distributed medium access control methods has failed to meet theoretical expectations. This paper proposes a protocol named "Channel MAC" performing a fully distributed medium access control based on opportunistic communication principles. In this protocol, nodes access the channel when the channel quality increases beyond a threshold, while neighbouring nodes are deemed to be silent. Once a node starts transmitting, it will keep transmitting until the channel becomes "bad." We derive an analytical throughput limit for Channel MAC in a shared multiple access environment. Furthermore, three performance metrics of Channel MAC—throughput, fairness, and delay—are analysed in single hop and multihop scenarios using NS2 simulations. The simulation results show throughput performance improvement of up to 130% with Channel MAC over IEEE 802.11. We also show that the severe resource starvation problem (unfairness) of IEEE 802.11 in some network scenarios is reduced by the Channel MAC mechanism.
Traffic Adaptive Energy Efficient and Low Latency Medium Access Control for Wireless Sensor Networks
NASA Astrophysics Data System (ADS)
Yadav, Rajesh; Varma, Shirshu; Malaviya, N.
2008-05-01
Medium access control for wireless sensor networks has been a very active research area in recent years. Traditional wireless medium access control protocols such as IEEE 802.11 are not suitable for sensor network applications because sensor nodes are battery powered, and recharging them is expensive and often not possible. Most of the literature on medium access for sensor networks focuses on energy efficiency. The proposed MAC protocol addresses the energy inefficiency caused by idle listening, control packet overhead and overhearing, while taking node latency into consideration based on the network traffic. Simulation experiments have been performed to demonstrate the effectiveness of the proposed approach. The simulation results of the proposed MAC have been validated by comparison with an analytical model. The protocol has been simulated in Network Simulator ns-2.
Rate-loss analysis of an efficient quantum repeater architecture
NASA Astrophysics Data System (ADS)
Guha, Saikat; Krovi, Hari; Fuchs, Christopher A.; Dutton, Zachary; Slater, Joshua A.; Simon, Christoph; Tittel, Wolfgang
2015-08-01
We analyze an entanglement-based quantum key distribution (QKD) architecture that uses a linear chain of quantum repeaters employing photon-pair sources, spectral-multiplexing, linear-optic Bell-state measurements, multimode quantum memories, and classical-only error correction. Assuming perfect sources, we find an exact expression for the secret-key rate, and an analytical description of how errors propagate through the repeater chain, as a function of various loss-and-noise parameters of the devices. We show via an explicit analytical calculation, which separately addresses the effects of the principal nonidealities, that this scheme achieves a secret-key rate that surpasses the Takeoka-Guha-Wilde bound—a recently found fundamental limit to the rate-vs-loss scaling achievable by any QKD protocol over a direct optical link—thereby providing one of the first rigorous proofs of the efficacy of a repeater protocol. We explicitly calculate the end-to-end shared noisy quantum state generated by the repeater chain, which could be useful for analyzing the performance of other non-QKD quantum protocols that require establishing long-distance entanglement. We evaluate that shared state's fidelity and the achievable entanglement-distillation rate, as a function of the number of repeater nodes, total range, and various loss-and-noise parameters of the system. We extend our theoretical analysis to encompass sources with nonzero two-pair-emission probability, using an efficient exact numerical evaluation of the quantum state propagation and measurements. We expect our results to spur formal rate-loss analysis of other repeater protocols and also to provide useful abstractions to seed analyses of quantum networks of complex topologies.
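For orientation, the Takeoka-Guha-Wilde bound referenced above is commonly written as log2((1+eta)/(1-eta)) secret bits per mode for a pure-loss channel of transmissivity eta. The sketch below compares that direct-link ceiling with a generic repeater-assisted scaling proportional to sqrt(eta); the sqrt(eta) form and its prefactor are illustrative assumptions, not the paper's derived rate expression.

```python
# Illustrative rate-vs-loss comparison: TGW direct-link bound versus an assumed
# repeater-assisted scaling c*sqrt(eta), to show how a repeater chain can win at high loss.
import math

def tgw_bound(eta):
    return math.log2((1.0 + eta) / (1.0 - eta))   # secret bits per mode, pure-loss channel

def repeater_rate(eta, c=0.01):
    return c * math.sqrt(eta)                     # placeholder scaling, not the paper's formula

for loss_db in (30, 40, 50, 60):
    eta = 10.0 ** (-loss_db / 10.0)
    print(f"{loss_db} dB loss: TGW = {tgw_bound(eta):.2e}, repeater ~ {repeater_rate(eta):.2e}")
```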
Clinical implementation of RNA signatures for pharmacogenomic decision-making
Tang, Weihua; Hu, Zhiyuan; Muallem, Hind; Gulley, Margaret L
2011-01-01
RNA profiling is increasingly used to predict drug response, dose, or toxicity based on analysis of drug pharmacokinetic or pharmacodynamic pathways. Before implementing multiplexed RNA arrays in clinical practice, validation studies are carried out to demonstrate sufficient evidence of analytic and clinical performance, and to establish an assay protocol with quality assurance measures. Pathologists assure quality by selecting input tissue and by interpreting results in the context of the input tissue as well as the technologies that were used and the clinical setting in which the test was ordered. A strength of RNA profiling is the array-based measurement of tens to thousands of RNAs at once, including redundant tests for critical analytes or pathways to promote confidence in test results. Instrument and reagent manufacturers are crucial for supplying reliable components of the test system. Strategies for quality assurance include careful attention to RNA preservation and quality checks at pertinent steps in the assay protocol, beginning with specimen collection and proceeding through the various phases of transport, processing, storage, analysis, interpretation, and reporting. Specimen quality is checked by probing housekeeping transcripts, while spiked and exogenous controls serve as a check on analytic performance of the test system. Software is required to manipulate abundant array data and present it for interpretation by a laboratory physician who reports results in a manner facilitating therapeutic decision-making. Maintenance of the assay requires periodic documentation of personnel competency and laboratory proficiency. These strategies are shepherding genomic arrays into clinical settings to provide added value to patients and to the larger health care system. PMID:23226056
Indoor Exposure Product Testing Protocols Version 2
EPA’s Office of Pollution Prevention and Toxics (OPPT) has developed a set of ten indoor exposure testing protocols intended to provide information on the purpose of the testing, general description of the sampling and analytical procedures, and references for tests that will be ...
STANDARDIZATION AND VALIDATION OF MICROBIOLOGICAL METHODS FOR EXAMINATION OF BIOSOLIDS
The objective of this presentation is to discuss pathogens of concern in biosolids, the analytical techniques used to evaluate microorganisms in biosolids, and to discuss standardization and validation of analytical protocols for microbes within a complex matrix. Implications of ...
LABORATORY MISCONDUCT - WHAT CAN HAPPEN TO YOU?
Contracted laboratories perform a vast number of routine and special analytical services that are the foundation of decisions upon which rests the fate of the environment. Guiding these laboratories in the generation of environmental data has been the analytical protocols and ...
NASA Technical Reports Server (NTRS)
Dhas, Chris
2000-01-01
NASA's Glenn Research Center (GRC) defines and develops advanced technology for high priority national needs in communications technologies for application to aeronautics and space. GRC tasked Computer Networks and Software Inc. (CNS) to examine protocols and architectures for an In-Space Internet Node. CNS has developed a methodology for network reference models to support NASA's four mission areas: Earth Science, Space Science, Human Exploration and Development of Space (HEDS), and Aerospace Technology. CNS previously developed a report which applied the methodology to three space Internet-based communications scenarios for future missions. CNS conceptualized, designed, and developed space Internet-based communications protocols and architectures for each of the independent scenarios. GRC selected for further analysis the scenario that involved unicast communications between a Low-Earth-Orbit (LEO) International Space Station (ISS) and a ground terminal Internet node via a Tracking and Data Relay Satellite (TDRS) transfer. This report contains a tradeoff analysis on the selected scenario. The analysis examines the performance characteristics of the various protocols and architectures. The tradeoff analysis incorporates the results of a CNS developed analytical model that examined performance parameters.
Rota, Paola; Anastasia, Luigi; Allevi, Pietro
2015-05-07
The current analytical protocol used for the GC-MS determination of free or 1,7-lactonized natural sialic acids (Sias), as heptafluorobutyrates, overlooks several transformations. Using authentic reference standards and by combining GC-MS and NMR analyses, flaws in the analytical protocol were pinpointed and elucidated, thus establishing the scope and limitations of the method. It was demonstrated that (a) Sias 1,7-lactones, even if present in biological samples, decompose under the acidic hydrolysis conditions used for their release; (b) Sias 1,7-lactones are unpredicted artifacts, accidentally generated from their parent acids; (c) the N-acetyl group is quantitatively exchanged with that of the derivatizing perfluorinated anhydride; (d) the partial or complete failure of the Sias esterification-step with diazomethane leads to the incorrect quantification and structure attribution of all free Sias. While these findings prompt an urgent correction and improvement of the current analytical protocol, they could be instrumental for a critical revision of many incorrect claims reported in the literature.
Peyron, Pierre-Antoine; Baccino, Éric; Nagot, Nicolas; Lehmann, Sylvain; Delaby, Constance
2017-02-01
Determination of skin wound vitality is an important issue in forensic practice. No reliable biomarker currently exists. Quantification of inflammatory cytokines in injured skin with MSD® technology is an innovative and promising approach. This preliminary study aims to develop a protocol for the preparation and the analysis of skin samples. Samples from ante mortem wounds, post mortem wounds, and intact skin ("control samples") were taken from corpses at autopsy. After the pre-analytical protocol had been optimized in terms of skin homogenization and protein extraction, the concentration of TNF-α was measured in each sample with the MSD® approach. Then five other cytokines of interest (IL-1β, IL-6, IL-10, IL-12p70 and IFN-γ) were simultaneously quantified with an MSD® multiplex assay. The optimal pre-analytical conditions consist of protein extraction from a 6 mm diameter skin sample in a PBS buffer with 0.05% Triton. Our results show the linearity and the reproducibility of the TNF-α quantification with MSD®, and an inter- and intra-individual variability of the protein concentrations. The MSD® multiplex assay is likely to detect differential skin concentrations for each cytokine of interest. This preliminary study was used to develop and optimize the pre-analytical and analytical conditions of the MSD® method using injured and healthy skin samples, for the purpose of identifying the cytokine, or set of cytokines, that may serve as biomarkers of skin wound vitality.
A review of blood sample handling and pre-processing for metabolomics studies.
Hernandes, Vinicius Veri; Barbas, Coral; Dudzik, Danuta
2017-09-01
Metabolomics has been found to be applicable to a wide range of clinical studies, bringing a new era for improving clinical diagnostics, early disease detection, therapy prediction and treatment efficiency monitoring. A major challenge in metabolomics, particularly in untargeted studies, is the extremely diverse and complex nature of biological specimens. Despite great advances in the field, there is still a fundamental need to account for pre-analytical variability, which can introduce bias into the subsequent analytical process, decrease the reliability of the results and, moreover, confound final research outcomes. Many researchers are mainly focused on the instrumental aspects of the biomarker discovery process, and sample-related variables sometimes seem to be overlooked. To bridge the gap, critical information and standardized protocols regarding experimental design and sample handling and pre-processing are highly desired. Characterization of the range of variation among sample collection methods is necessary to prevent misinterpretation of results and to ensure that observed differences are not due to an experimental bias caused by inconsistencies in sample processing. Herein, a systematic discussion of pre-analytical variables affecting metabolomics studies based on blood-derived samples is performed. Furthermore, we provide a set of recommendations concerning experimental design, collection, pre-processing procedures and storage conditions as a practical review that can guide and serve for the standardization of protocols and the reduction of undesirable variation. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, JM; Samei, E; Departments of Physics, Electrical and Computer Engineering, and Biomedical Engineering, and Medical Physics Graduate Program, Duke University, Durham, NC
2016-06-15
Purpose: Recent legislative and accreditation requirements have driven rapid development and implementation of CT radiation dose monitoring solutions. Institutions must determine how to improve quality, safety, and consistency of their clinical performance. The purpose of this work was to design a strategy and meaningful characterization of results from an in-house, clinically-deployed dose monitoring solution. Methods: A dose monitoring platform was designed by our imaging physics group that focused on extracting protocol parameters, dose metrics, and patient demographics and size. Compared to most commercial solutions, which focus on individual exam alerts and global thresholds, the program sought to characterize overall consistency and targeted thresholds based on eight analytic interrogations. Those were based on explicit questions related to protocol application, national benchmarks, protocol and size-specific dose targets, operational consistency, outliers, temporal trends, intra-system variability, and consistent use of electronic protocols. Using historical data since the start of 2013, 95% and 99% intervals were used to establish yellow and amber parameterized dose alert thresholds, respectively, as a function of protocol, scanner, and size. Results: Quarterly reports have been generated for three hospitals for 3 quarters of 2015, totaling 27,880, 28,502, and 30,631 exams, respectively. Four adult and two pediatric protocols were higher than external institutional benchmarks. Four protocol dose levels were being inconsistently applied as a function of patient size. For the three hospitals, the minimum and maximum amber outlier percentages were [1.53%, 2.28%], [0.76%, 1.8%], and [0.94%, 1.17%], respectively. Compared with the electronic protocols, 10 protocols were found to be used with some inconsistency. Conclusion: Dose monitoring can satisfy requirements with global alert thresholds and patient dose records, but the real value is in optimizing patient-specific protocols, balancing image quality trade-offs that dose-reduction strategies promise, and improving the performance and consistency of a clinical operation. Data plots that capture patient demographics and scanner performance demonstrate that value.
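A percentile-based threshold table of the kind described in the Methods can be sketched directly from historical exam records. The snippet below is a hypothetical illustration, not the group's software; the record fields ('protocol', 'scanner', 'size_bin', 'ctdi_vol') are assumed names.

```python
# Hypothetical sketch: 95%/99% ("yellow"/"amber") dose-alert thresholds stratified by
# protocol, scanner, and patient size bin, derived from historical dose metrics.
import numpy as np
from collections import defaultdict

def alert_thresholds(exams):
    """exams: iterable of dicts with assumed keys 'protocol', 'scanner', 'size_bin', 'ctdi_vol'."""
    groups = defaultdict(list)
    for e in exams:
        groups[(e["protocol"], e["scanner"], e["size_bin"])].append(float(e["ctdi_vol"]))
    thresholds = {}
    for key, doses in groups.items():
        doses = np.asarray(doses)
        thresholds[key] = {
            "yellow": np.percentile(doses, 95.0),  # 95% interval -> yellow alert
            "amber": np.percentile(doses, 99.0),   # 99% interval -> amber alert
        }
    return thresholds
```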
Development of characterization protocol for mixed liquid radioactive waste classification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zakaria, Norasalwa, E-mail: norasalwa@nuclearmalaysia.gov.my; Wafa, Syed Asraf; Wo, Yii Mei
2015-04-29
Mixed liquid organic waste generated from health-care and research activities containing tritium, carbon-14, and other radionuclides poses specific challenges in its management. Often, these wastes become legacy waste in many nuclear facilities and are considered ‘problematic’ waste. One of the most important recommendations made by the IAEA is to perform multistage processes aiming at declassification of the waste. At this moment, approximately 3000 bottles of mixed liquid waste, with an estimated volume of 6000 litres, are currently stored at the National Radioactive Waste Management Centre, Malaysia, and some have been stored for more than 25 years. The aim of this study is to develop a characterization protocol towards reclassification of these wastes. The characterization protocol entails waste identification, waste screening and segregation, and analytical radionuclide profiling using various analytical procedures including gross alpha/gross beta, gamma spectrometry, and the LSC method. The results obtained from the characterization protocol are used to establish criteria for speedy classification of the waste.
Duester, Lars; Fabricius, Anne-Lena; Jakobtorweihen, Sven; Philippe, Allan; Weigl, Florian; Wimmer, Andreas; Schuster, Michael; Nazar, Muhammad Faizan
2016-11-01
Coacervate-based techniques are intensively used in environmental analytical chemistry to enrich and extract different kinds of analytes. Most methods focus on the total content or the speciation of inorganic and organic substances. Size fractionation is less commonly addressed. Within coacervate-based techniques, cloud point extraction (CPE) is characterized by a phase separation of non-ionic surfactants dispersed in an aqueous solution when the respective cloud point temperature is exceeded. In this context, the feature article raises the following question: May CPE in future studies serve as a key tool (i) to enrich and extract nanoparticles (NPs) from complex environmental matrices prior to analyses and (ii) to preserve the colloidal status of unstable environmental samples? With respect to engineered NPs, a significant gap between environmental concentrations and size- and element-specific analytical capabilities is still visible. CPE may support efforts to overcome this "concentration gap" via analyte enrichment. In addition, most environmental colloidal systems are known to be unstable, dynamic, and sensitive to changes of the environmental conditions during sampling and sample preparation. This delivers a so far unsolved "sample preparation dilemma" in the analytical process. The authors are of the opinion that CPE-based methods have the potential to preserve the colloidal status of these unstable samples. Focusing on NPs, this feature article aims to support the discussion on the creation of a convention called the "CPE extractable fraction" by connecting current knowledge on CPE mechanisms and on available applications, via the uncertainties visible and modeling approaches available, with potential future benefits from CPE protocols.
A carrier sensed multiple access protocol for high data rate ring networks
NASA Technical Reports Server (NTRS)
Foudriat, E. C.; Maly, Kurt J.; Overstreet, C. Michael; Khanna, S.; Paterra, Frank
1990-01-01
The results of the study of a simple but effective media access protocol for high data rate networks are presented. The protocol is based on the fact that at high data rates networks can contain multiple messages simultaneously over their span, and that in a ring, nodes can detect the presence of a message arriving from the immediate upstream neighbor. When an incoming signal is detected, the node must either abort or truncate a message it is presently sending. Thus, the protocol with local carrier sensing and multiple access is designated CSMA/RN. The performance of CSMA/RN with attempt and truncate is studied using analytic and simulation models. Three performance factors, wait or access time, service time and response or end-to-end travel time, are presented. The service time is basically a function of the network rate; it changes by a factor of 1 between no load and full load. Wait time, which is zero for no load, remains small for load factors up to 70 percent of full load. Response time, which adds travel time while on the network to wait and service time, is mainly a function of network length, especially for longer distance networks. Simulation results are shown for CSMA/RN where messages are removed at the destination. A wide range of local and metropolitan area network parameters including variations in message size, network length, and node count are studied. Finally, a scaling factor based upon the ratio of message to network length demonstrates that the results, and hence the CSMA/RN protocol, are applicable to wide area networks.
Ślączka-Wilk, Magdalena M; Włodarczyk, Elżbieta; Kaleniecka, Aleksandra; Zarzycki, Paweł K
2017-07-01
There is increasing interest in the development of simple analytical systems enabling the fast screening of target components in complex samples. A number of newly invented protocols are based on quasi-separation techniques involving microfluidic paper-based analytical devices and/or micro total analysis systems. Under such conditions, the quantification of target components can be performed mainly due to selective detection. The main goal of this paper is to demonstrate that miniaturized planar chromatography has the capability to work as an efficient separation and quantification tool for the analysis of multiple targets within complex environmental samples isolated and concentrated using an optimized SPE method. In particular, we analyzed various samples collected from surface water ecosystems (lakes, rivers, and the Baltic Sea of Middle Pomerania in the northern part of Poland) in different seasons, as well as samples collected during key wastewater technological processes (originating from the "Jamno" wastewater treatment plant in Koszalin, Poland). We documented that the multiple detection of chromatographic spots on RP-18W microplates, under visible light, fluorescence, and fluorescence quenching conditions and using the visualization reagent phosphomolybdic acid, enables fast and robust sample classification. The presented data reveal that the proposed micro-TLC system is useful, inexpensive, and can be considered a complementary method for the fast control of treated sewage water discharged by a municipal wastewater treatment plant, particularly for the detection of low-molecular-mass micropollutants with polarity ranging from estetrol to progesterone, as well as chlorophyll-related dyes. Due to the low consumption of mobile phases composed of water-alcohol binary mixtures (less than 1 mL/run for the simultaneous separation of up to nine samples), this method can be considered an environmentally friendly and green chemistry analytical tool. The described analytical protocol can be complementary to those involving classical column chromatography (HPLC) or various planar microfluidic devices.
NASA Astrophysics Data System (ADS)
Migliozzi, D.; Nguyen, H. T.; Gijs, M. A. M.
2018-02-01
Immunohistochemistry (IHC) is one of the main techniques currently used in the clinics for biomarker characterization. It consists of colorimetric labeling with specific antibodies followed by microscopy analysis. The results are then used for diagnosis and therapeutic targeting. Well-known drawbacks of such protocols are their limited accuracy and precision, which prevent clinicians from obtaining quantitative and robust IHC results. In our work, we combined rapid microfluidic immunofluorescent staining with efficient image-based cell segmentation and signal quantification to increase the robustness of both the experimental and analytical protocols. The experimental protocol is very simple and based on fast fluidic exchange in a microfluidic chamber created on top of the formalin-fixed, paraffin-embedded (FFPE) slide by clamping a silicon chip onto it with a polydimethylsiloxane (PDMS) sealing ring. The image-processing protocol is based on enhancement and subsequent thresholding of the local contrast of the obtained fluorescence image. As a case study, given that the human epidermal growth factor receptor 2 (HER2) protein is often used as a biomarker for breast cancer, we applied our method to HER2+ and HER2- cell lines. We report very fast (5 minutes) immunofluorescence staining of both HER2 and cytokeratin (a marker used to define the tumor region) on FFPE slides. The image-processing program can segment cells correctly and give a cell-based quantitative immunofluorescent signal. With this method, we found a reproducible, well-defined separation of the HER2-to-cytokeratin ratio between positive and negative control samples.
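The image-processing steps named here (local-contrast enhancement, thresholding, segmentation, per-cell signal readout) can be approximated with standard scikit-image calls. The snippet below is an illustrative re-creation under assumed inputs, not the authors' program; the arrays her2_img and ck_img are placeholders for the two fluorescence channels.

```python
# Illustrative per-cell fluorescence quantification: enhance local contrast of the
# segmentation channel, threshold it, label connected components as cells, and read
# out the mean marker intensity inside each cell.
import numpy as np
from skimage import exposure, filters, measure

def per_cell_signal(marker_img, segmentation_img):
    enhanced = exposure.equalize_adapthist(segmentation_img)   # local contrast enhancement
    mask = enhanced > filters.threshold_otsu(enhanced)         # threshold the enhanced image
    labels = measure.label(mask)                               # connected components ~ cells
    props = measure.regionprops(labels, intensity_image=marker_img)
    return {r.label: r.mean_intensity for r in props}

her2_img = np.random.rand(256, 256)   # placeholder for the HER2 channel
ck_img = np.random.rand(256, 256)     # placeholder for the cytokeratin channel
her2_per_cell = per_cell_signal(her2_img, ck_img)
```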
Chen, Yi; Fisher, Kate J.; Lloyd, Mark; Wood, Elizabeth R.; Coppola, Domenico; Siegel, Erin; Shibata, David; Chen, Yian A.; Koomen, John M.
2017-01-01
Quantitative evaluation of protein expression across multiple cancer-related signaling pathways (e.g. Wnt/β-catenin, TGF-β, receptor tyrosine kinases (RTK), MAP kinases, NF-κB, and apoptosis) in tumor tissues may enable the development of a molecular profile for each individual tumor that can aid in the selection of appropriate targeted cancer therapies. Here, we describe the development of a broadly applicable protocol to develop and implement quantitative mass spectrometry assays using cell line models and frozen tissue specimens from colon cancer patients. Cell lines are used to develop peptide-based assays for protein quantification, which are incorporated into a method based on SDS-PAGE protein fractionation, in-gel digestion, and liquid chromatography-multiple reaction monitoring mass spectrometry (LC-MRM/MS). This analytical platform is then applied to frozen tumor tissues. This protocol can be broadly applied to the study of human disease using multiplexed LC-MRM assays. PMID:28808993
USDA-ARS?s Scientific Manuscript database
The antibody is central to the performance of an ELISA providing the basis of analyte selection and detection. It is the interaction of antibody with analyte under defined conditions that dictates the outcome of the ELISA and deviations in those conditions will impact assay performance. The aim of...
The research approached the large number and complexity of the analytes as four separate groups: technical toxaphene, toxaphene congeners (eight in number), chlordane, and organochlorine pesticides. This approach was advantageous because it eliminated potential interferences amon...
Maternal Serologic Screening to Prevent Congenital Toxoplasmosis: A Decision-Analytic Economic Model
Stillwaggon, Eileen; Carrier, Christopher S.; Sautter, Mari; McLeod, Rima
2011-01-01
Objective To determine a cost-minimizing option for congenital toxoplasmosis in the United States. Methodology/Principal Findings A decision-analytic and cost-minimization model was constructed to compare monthly maternal serological screening, prenatal treatment, and post-natal follow-up and treatment according to the current French (Paris) protocol, versus no systematic screening or perinatal treatment. Costs are based on published estimates of lifetime societal costs of developmental disabilities and current diagnostic and treatment costs. Probabilities are based on published results and clinical practice in the United States and France. One- and two-way sensitivity analyses are used to evaluate robustness of results. Universal monthly maternal screening for congenital toxoplasmosis with follow-up and treatment, following the French protocol, is found to be cost-saving, with savings of $620 per child screened. Results are robust to changes in test costs, value of statistical life, seroprevalence in women of childbearing age, fetal loss due to amniocentesis, and to bivariate analysis of test costs and incidence of primary T. gondii infection in pregnancy. Given the parameters in this model and a maternal screening test cost of $12, screening is cost-saving for rates of congenital infection above 1 per 10,000 live births. If universal testing generates economies of scale in diagnostic tools—lowering test costs to about $2 per test—universal screening is cost-saving at rates of congenital infection well below the lowest reported rates in the United States of 1 per 10,000 live births. Conclusion/Significance Universal screening according to the French protocol is cost saving for the US population within broad parameters for costs and probabilities. PMID:21980546
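The decision-analytic structure can be sketched as a simple expected-cost comparison per pregnancy, with the break-even incidence at the point where the screening and no-screening cost curves cross. In the sketch below every numeric parameter is an invented placeholder, not a published model input; only the structure follows the abstract.

```python
# Minimal cost-minimization sketch: expected cost per pregnancy with and without monthly
# maternal screening, as a function of the incidence of primary infection. All values
# (test cost, number of tests, treatment and lifetime costs, risk reduction) are placeholders.
def expected_cost(incidence, screen, test_cost=12.0, n_tests=7,
                  treat_cost=500.0, lifetime_cost=1_000_000.0, risk_reduction=0.9):
    if screen:
        residual = incidence * (1.0 - risk_reduction)       # infections still causing sequelae
        return n_tests * test_cost + incidence * treat_cost + residual * lifetime_cost
    return incidence * lifetime_cost                        # untreated congenital infection costs

for per_10k in (0.5, 1.0, 2.0, 10.0):
    p = per_10k / 10_000
    saving = expected_cost(p, screen=False) - expected_cost(p, screen=True)
    print(f"{per_10k}/10,000 live births: net saving per pregnancy = ${saving:,.2f}")
```

With these placeholder inputs the crossover happens to fall near 1 per 10,000; the point is only to show how a break-even incidence emerges from the model, not to reproduce the published figure.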
Multipinhole SPECT helical scan parameters and imaging volume
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yao, Rutao, E-mail: rutaoyao@buffalo.edu; Deng, Xiao; Wei, Qingyang
Purpose: The authors developed SPECT imaging capability on an animal PET scanner using a multiple-pinhole collimator and step-and-shoot helical data acquisition protocols. The objective of this work was to determine the preferred helical scan parameters, i.e., the angular and axial step sizes, and the imaging volume, that provide optimal imaging performance. Methods: The authors studied nine helical scan protocols formed by permuting three rotational and three axial step sizes. These step sizes were chosen around the reference values analytically calculated from the estimated spatial resolution of the SPECT system and the Nyquist sampling theorem. The nine helical protocols were evaluated by two figures-of-merit: the sampling completeness percentage (SCP) and the root-mean-square (RMS) resolution. SCP was an analytically calculated numerical index based on projection sampling. RMS resolution was derived from the reconstructed images of a sphere-grid phantom. Results: The RMS resolution results show that (1) the start and end pinhole planes of the helical scheme determine the axial extent of the effective field of view (EFOV), and (2) the diameter of the transverse EFOV is adequately calculated from the geometry of the pinhole opening, since the peripheral region beyond the EFOV would introduce projection multiplexing and consequent effects. The RMS resolution results of the nine helical scan schemes show that optimal resolution is achieved when the axial step size is half, and the angular step size is about twice, the corresponding values derived from the Nyquist theorem. The SCP results agree in general with those of RMS resolution but are less critical in assessing the effects of the helical parameters and EFOV. Conclusions: The authors quantitatively validated the effective FOV of multiple-pinhole helical scan protocols and proposed a simple method to calculate optimal helical scan parameters.
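The Nyquist-derived reference step sizes, and the empirically preferred rescaling reported in the Results (axial step about half, angular step about twice the reference), can be illustrated with a short calculation. The convention used below (sampling interval of half the resolvable distance, converted to a rotation angle at the field-of-view radius) is an assumption for illustration, not necessarily the authors' exact derivation.

```python
# Illustrative helical scan parameter calculation for a given estimated spatial
# resolution and transverse field-of-view radius (both in mm; values are examples).
import math

def helical_steps(resolution_mm, fov_radius_mm):
    nyquist_axial_mm = resolution_mm / 2.0                                     # Nyquist axial sampling interval
    nyquist_angular_deg = math.degrees((resolution_mm / 2.0) / fov_radius_mm)  # arc length -> rotation angle
    return {
        "axial_step_mm": 0.5 * nyquist_axial_mm,        # preferred: half the Nyquist value
        "angular_step_deg": 2.0 * nyquist_angular_deg,  # preferred: about twice the Nyquist value
    }

print(helical_steps(resolution_mm=1.5, fov_radius_mm=30.0))
```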
Ganau, Mario; Prisco, Lara; Cebula, Helene; Todeschi, Julien; Abid, Houssem; Ligarotti, Gianfranco; Pop, Raoul; Proust, Francois; Chibbaro, Salvatore
2017-11-01
To analytically discuss some protocols for deep vein thrombosis (DVT)/pulmonary embolism (PE) prophylaxis currently in use in neurosurgical departments around the world. Analysis of the prophylaxis protocols in the English literature: an analytical and narrative review of the literature concerning DVT prophylaxis protocols in neurosurgery was conducted via a PubMed search (back to 1978). 80 abstracts were reviewed, and 74 articles were extracted. The majority of DVTs seem to develop within the first week after a neurosurgical procedure, and a linear correlation between the duration of surgery and DVT occurrence has been highlighted. The incidence of DVT seems greater for cranial (7.7%) than spinal procedures (1.5%). Although intermittent pneumatic compression (IPC) devices provided adequate reduction of DVT/PE in some cranial and combined cranial/spinal series, low-dose subcutaneous unfractionated heparin (UFH) or low molecular-weight heparin (LMWH) further reduced the incidence, not always of DVT, but of PE. Nevertheless, low-dose heparin-based prophylaxis in cranial and spinal series risks minor and major postoperative haemorrhages: 2-4% in cranial series, 3.4% minor and 3.4% major haemorrhages in combined cranial/spinal series, and a 0.7% incidence of major/minor haemorrhages in spinal series. This analysis showed that currently most of the articles are case series and case reports. As long as clear guidelines are not defined and universally applied to this diverse group of patients, any prophylaxis for DVT and PE should be tailored to the individual patient with cautious assessment of benefits versus risks. Copyright © 2017 Elsevier Ltd. All rights reserved.
SCIENCE MISCONDUCT ACTIVITIES IN ENVIRONMENTAL ANALYSIS - FRAUD DETECTION IN GC/MS/ICP ACTIVITIES
Contracted laboratories perform a vast number of routine and special analytical services that are the foundation of decisions upon which rests the fate of the environment. Guiding these laboratories in the generation of environmental data has been the analytical protocols and th...
21 CFR 314.50 - Content and format of an application.
Code of Federal Regulations, 2013 CFR
2013-04-01
... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...
21 CFR 314.50 - Content and format of an application.
Code of Federal Regulations, 2012 CFR
2012-04-01
... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...
21 CFR 314.50 - Content and format of an application.
Code of Federal Regulations, 2014 CFR
2014-04-01
... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...
21 CFR 314.50 - Content and format of an application.
Code of Federal Regulations, 2011 CFR
2011-04-01
... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...
21 CFR 314.50 - Content and format of an application.
Code of Federal Regulations, 2010 CFR
2010-04-01
... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...
NASA Astrophysics Data System (ADS)
Hahn, K. E.; Turner, E. C.; Kontak, D. J.; Fayek, M.
2018-02-01
Ancient carbonate rocks commonly contain numerous post-depositional phases (carbonate minerals; quartz) recording successive diagenetic events that can be deciphered and tied to known or inferred geological events using a multi-pronged in situ analytical protocol. The framework voids of large, deep-water microbial carbonate seep-mounds in Arctic Canada (Mesoproterozoic Ikpiarjuk Formation) contain multiple generations of synsedimentary and late cement. An in situ analytical study of the post-seafloor cements used optical and cathodoluminescence petrography, SEM-EDS analysis, fluid inclusion (FI) microthermometry and evaporate mound analysis, LA-ICP-MS analysis, and SIMS δ18O to decipher the mounds' long-term diagenetic history. The six void-filling late cements include, in paragenetic order: inclusion-rich euhedral dolomite (ED), finely crystalline clear dolomite (FCD), hematite-bearing dolomite (HD), coarsely crystalline clear dolomite (CCD), quartz (Q), replacive calcite (RC) and late calcite (LC). Based on the combined analytical results, the following fluid-flow history is defined: (1) ED precipitation by autocementation during shallow burial (fluid 1; Mesoproterozoic); (2) progressive mixing of Ca-rich hydrothermal fluid with the connate fluid, resulting in precipitation of FCD followed by HD (fluid 2; also Mesoproterozoic); (3) precipitation of hydrothermal dolomite (CCD) from high-Ca and K-rich fluids (fluid 3; possibly Mesoproterozoic, but timing unclear); (4) hydrothermal Q precipitation (fluid 4; timing unclear), and (5) RC and LC precipitation from a meteoric-derived water (fluid 5) in or since the Mesozoic. Fluids associated with FCD, HD, and CCD may have been mobilised during deposition of the upper Bylot Supergroup; this time interval was the most tectonically active episode in the region's Mesoproterozoic to Recent history. The entire history of intermittent fluid migration and cement precipitation recorded in seemingly unimportant void-filling mineral phases spans over 1 billion years, and was decipherable only because of the in situ protocol used. The multiple-method in situ analytical protocol employed in this study substantially augments the knowledge of an area's geological history, parts of which cannot be discerned by means other than meticulous study of diagenetic phases, and should become routine in similar studies.
Dinov, Ivo D; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W; Price, Nathan D; Van Horn, John D; Ames, Joseph; Clark, Kristi; Hood, Leroy; Hampstead, Benjamin M; Dauer, William; Toga, Arthur W
2016-01-01
A unique archive of Big Data on Parkinson's Disease is collected, managed and disseminated by the Parkinson's Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson's disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data-large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources-all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. We evaluated several complementary model-based predictive approaches, which failed to generate accurate and reliable diagnostic predictions. However, the results of several machine-learning based classification methods indicated significant power to predict Parkinson's disease in the PPMI subjects (consistent accuracy, sensitivity, and specificity exceeding 96%, confirmed using statistical n-fold cross-validation). Clinical (e.g., Unified Parkinson's Disease Rating Scale (UPDRS) scores), demographic (e.g., age), genetics (e.g., rs34637584, chr12), and derived neuroimaging biomarker (e.g., cerebellum shape index) data all contributed to the predictive analytics and diagnostic forecasting. Model-free Big Data machine learning-based classification methods (e.g., adaptive boosting, support vector machines) can outperform model-based techniques in terms of predictive precision and reliability (e.g., forecasting patient diagnosis). We observed that statistical rebalancing of cohort sizes yields better discrimination of group differences, specifically for predictive analytics based on heterogeneous and incomplete PPMI data. UPDRS scores play a critical role in predicting diagnosis, which is expected based on the clinical definition of Parkinson's disease. Even without longitudinal UPDRS data, however, the accuracy of model-free machine learning based classification is over 80%. 
The methods, software and protocols developed here are openly shared and can be employed to study other neurodegenerative disorders (e.g., Alzheimer's, Huntington's, amyotrophic lateral sclerosis), as well as for other predictive Big Data analytics applications.
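The model-free classification workflow outlined above (cohort rebalancing, boosted classifiers, n-fold cross-validation) can be sketched with standard scikit-learn calls. The snippet below runs on a synthetic imbalanced dataset; the features, rebalancing choice and classifier settings are illustrative stand-ins, not the PPMI variables or the study's exact pipeline.

```python
# Hedged sketch: up-sample the minority class, train an adaptive-boosting classifier,
# and report 5-fold cross-validated accuracy on a synthetic imbalanced cohort.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score
from sklearn.utils import resample

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))                   # placeholder features
y = (rng.random(500) < 0.2).astype(int)          # imbalanced labels (~20% positives)

minority, majority = X[y == 1], X[y == 0]
minority_up = resample(minority, n_samples=len(majority), random_state=0)  # simple rebalancing
X_bal = np.vstack([majority, minority_up])
y_bal = np.array([0] * len(majority) + [1] * len(minority_up))

scores = cross_val_score(AdaBoostClassifier(), X_bal, y_bal, cv=5)
print("cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```

In practice the rebalancing should be performed inside each training fold to avoid leakage of up-sampled duplicates into the validation folds; it is done once here only for brevity.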
Yousaf, Sidrah; Javaid, Nadeem; Qasim, Umar; Alrajeh, Nabil; Khan, Zahoor Ali; Ahmed, Mansoor
2016-01-01
In this study, we analyse incremental cooperative communication for wireless body area networks (WBANs) with different numbers of relays. Energy efficiency (EE) and the packet error rate (PER) are investigated for different schemes. We propose a new cooperative communication scheme with three-stage relaying and compare it to existing schemes. Our proposed scheme provides reliable communication with less PER at the cost of surplus energy consumption. Analytical expressions for the EE of the proposed three-stage cooperative communication scheme are also derived, taking into account the effect of PER. Later on, the proposed three-stage incremental cooperation is implemented in a network layer protocol; enhanced incremental cooperative critical data transmission in emergencies for static WBANs (EInCo-CEStat). Extensive simulations are conducted to validate the proposed scheme. Results of incremental relay-based cooperative communication protocols are compared to two existing cooperative routing protocols: cooperative critical data transmission in emergencies for static WBANs (Co-CEStat) and InCo-CEStat. It is observed from the simulation results that incremental relay-based cooperation is more energy efficient than the existing conventional cooperation protocol, Co-CEStat. The results also reveal that EInCo-CEStat proves to be more reliable with less PER and higher throughput than both of the counterpart protocols. However, InCo-CEStat has less throughput with a greater stability period and network lifetime. Due to the availability of more redundant links, EInCo-CEStat achieves a reduced packet drop rate at the cost of increased energy consumption. PMID:26927104
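The energy-efficiency bookkeeping implied here (useful bits delivered per joule, with PER shrinking the goodput and each incremental relay stage adding energy) can be written down in a few lines. The link model and all numbers below are illustrative assumptions, not the paper's derived analytical expressions.

```python
# Hedged sketch: energy efficiency of a packet delivered over a source hop plus a given
# number of incremental relay stages, counting transmit and receive energy per stage.
def energy_efficiency(packet_bits, per, tx_energy_j, rx_energy_j, n_relays):
    stages = 1 + n_relays                          # direct hop plus incremental relay hops
    energy = stages * (tx_energy_j + rx_energy_j)  # total energy spent on the packet
    goodput_bits = packet_bits * (1.0 - per)       # expected error-free payload
    return goodput_bits / energy                   # bits per joule

print(energy_efficiency(packet_bits=1024, per=0.05, tx_energy_j=2e-3, rx_energy_j=1e-3, n_relays=0))
print(energy_efficiency(packet_bits=1024, per=0.01, tx_energy_j=2e-3, rx_energy_j=1e-3, n_relays=2))
```

The two calls mirror the qualitative trade-off reported above: adding relay stages lowers PER but costs extra energy per packet.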
Martos, Laura; Fernández-Pardo, Álvaro; Oto, Julia; Medina, Pilar; España, Francisco; Navarro, Silvia
2017-01-01
MicroRNAs are promising biomarkers in biological fluids in several diseases. Different plasma RNA isolation protocols and carriers are available, but their efficiencies have rarely been compared. Plasma microRNAs were isolated using a phenol- and column-based procedure and a column-based procedure, in the presence or absence of two RNA carriers (yeast RNA and MS2 RNA). We evaluated the presence of PCR inhibitors and the relative abundance of certain microRNAs by qRT-PCR. Furthermore, we analyzed the association between the different isolation protocols, the relative abundance of the miRNAs in the sample, the GC content and the free energy of the microRNAs. For all microRNAs analyzed, the addition of yeast RNA as a carrier in the different isolation protocols gave lower raw Cq values, indicating higher microRNA recovery. Moreover, this increase in microRNA recovery was dependent on their relative abundance in the sample, their GC content and the free energy of their most stable secondary structure. Furthermore, normalization of microRNA levels by an endogenous microRNA is more reliable than normalization by plasma volume, as it reduced the difference in microRNA fold abundance between the different isolation protocols evaluated. Our thorough study indicates that a standardization of pre-analytical and analytical conditions is necessary to obtain reproducible inter-laboratory results in plasma microRNA studies. PMID:29077772
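The two normalization strategies compared in this study can be expressed with the standard 2^-ΔCq relative-quantification formula. The Cq values in the example are invented, and the helper names are hypothetical.

```python
# Sketch of microRNA normalization: against an endogenous reference microRNA (delta-Cq)
# versus against the plasma volume used for isolation. Input Cq values are illustrative.
def rel_abundance_endogenous(cq_target, cq_reference):
    """Relative abundance of the target normalized to an endogenous reference miRNA."""
    return 2.0 ** -(cq_target - cq_reference)

def rel_abundance_per_volume(cq_target, plasma_volume_ml):
    """Raw recovery expressed per millilitre of input plasma."""
    return (2.0 ** -cq_target) / plasma_volume_ml

# Hypothetical Cq values for the same sample processed with two isolation protocols:
print(rel_abundance_endogenous(cq_target=27.1, cq_reference=22.4))
print(rel_abundance_endogenous(cq_target=25.9, cq_reference=21.2))
```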
Chemical and biological threat-agent detection using electrophoresis-based lab-on-a-chip devices.
Borowsky, Joseph; Collins, Greg E
2007-10-01
The ability to separate complex mixtures of analytes has made capillary electrophoresis (CE) a powerful analytical tool since its modern configuration was first introduced over 25 years ago. The technique found new utility with its application to the microfluidics-based lab-on-a-chip platform (i.e., the microchip), which resulted in ever smaller footprints, sample volumes, and analysis times. These features, coupled with the technique's potential for portability, have prompted recent interest in the development of novel analyzers for chemical and biological threat agents. This article comments on three main areas of microchip CE as applied to the separation and detection of threat agents: detection techniques and their corresponding limits of detection, sampling protocol and preparation time, and system portability. These three areas typify the broad utility of lab-on-a-chip for meeting critical, present-day security needs, in addition to illustrating areas wherein advances are necessary.
On Equivalence between Critical Probabilities of Dynamic Gossip Protocol and Static Site Percolation
NASA Astrophysics Data System (ADS)
Ishikawa, Tetsuya; Hayakawa, Tomohisa
The relationship between the critical probability of gossip protocol on the square lattice and the critical probability of site percolation on the square lattice is discussed. Specifically, these two critical probabilities are analytically shown to be equal to each other. Furthermore, we present a way of evaluating the critical probability of site percolation by approximating the saturation of gossip protocol. Finally, we provide numerical results which support the theoretical analysis.
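A quick numerical check of the site-percolation side of this equivalence is straightforward: estimate, by Monte Carlo, the probability that open sites span a finite square lattice and watch it rise sharply near the known threshold (approximately 0.5927). The simulation below is a generic illustration, not the authors' evaluation scheme.

```python
# Monte Carlo estimate of the probability that open sites connect the top row to the
# bottom row of an n x n square lattice, for site-occupation probability p.
import random
from collections import deque

def spans(p, n):
    open_site = [[random.random() < p for _ in range(n)] for _ in range(n)]
    seen = [[False] * n for _ in range(n)]
    q = deque((0, j) for j in range(n) if open_site[0][j])
    for _, j in q:
        seen[0][j] = True
    while q:
        i, j = q.popleft()
        if i == n - 1:
            return True                      # reached the bottom row
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            a, b = i + di, j + dj
            if 0 <= a < n and 0 <= b < n and open_site[a][b] and not seen[a][b]:
                seen[a][b] = True
                q.append((a, b))
    return False

def spanning_probability(p, n=64, trials=200):
    return sum(spans(p, n) for _ in range(trials)) / trials

for p in (0.55, 0.59, 0.63):                 # the threshold is near 0.5927
    print(p, spanning_probability(p))
```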
Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit
2016-03-01
Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of the biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs LifeMap Science's GeneCards suite, including GeneCards®, the human gene database; MalaCards, the human diseases database; and PathCards, the biological pathways database. Expression-based analysis in GeneAnalytics relies on LifeMap Discovery®, the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analysis and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics, and others yet to emerge on the postgenomics horizon.
USDA-ARS?s Scientific Manuscript database
Current methods for generating malting quality metrics have been developed largely to support commercial malting and brewing operations, providing accurate, reproducible analytical data to guide malting and brewing production. Infrastructure to support these analytical operations often involves sub...
Student Career Decisions: The Limits of Rationality.
ERIC Educational Resources Information Center
Baumgardner, Steve R.; Rappoport, Leon
This study compares modes of cognitive functioning revealed in student selection of a college major. Students were interviewed in-depth concerning reasons for their choice of majors. Protocol data suggested two distinct modes of thinking were evident on an analytic-intuitive dimension. For operational purposes analytic thinking was defined by…
Estimating Aquifer Properties Using Sinusoidal Pumping Tests
NASA Astrophysics Data System (ADS)
Rasmussen, T. C.; Haborak, K. G.; Young, M. H.
2001-12-01
We develop the theoretical and applied framework for using sinusoidal pumping tests to estimate aquifer properties for confined, leaky, and partially penetrating conditions. The framework 1) derives analytical solutions for three boundary conditions suitable for many practical applications, 2) validates the analytical solutions against a finite element model, 3) establishes a protocol for conducting sinusoidal pumping tests, and 4) estimates aquifer hydraulic parameters based on the analytical solutions. The analytical solutions to sinusoidal stimuli in radial coordinates are derived for boundary value problems that are analogous to the Theis (1935) confined aquifer solution, the Hantush and Jacob (1955) leaky aquifer solution, and the Hantush (1964) partially penetrated confined aquifer solution. The analytical solutions compare favorably to a finite-element solution of a simulated flow domain, except in the region immediately adjacent to the pumping well where the implicit assumption of zero borehole radius is violated. The procedure is demonstrated in one unconfined and two confined aquifer units near the General Separations Area at the Savannah River Site, a federal nuclear facility located in South Carolina. Aquifer hydraulic parameters estimated using this framework provide independent confirmation of parameters obtained from conventional aquifer tests. The sinusoidal approach also resulted in the elimination of investigation-derived wastes.
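The first data-reduction step in a sinusoidal pumping test is to recover the amplitude and phase lag of the observation-well response at the known pumping frequency; those two quantities then feed the analytical solutions referenced above to back out aquifer parameters. The least-squares fit below is a generic sketch on synthetic data, not the authors' field procedure.

```python
# Fit amplitude and phase lag of a sinusoidal drawdown record at a known angular
# frequency by linear least squares (cosine/sine/constant basis).
import numpy as np

def fit_amplitude_phase(t, head, omega):
    """t: times [s]; head: observed drawdown; omega: pumping angular frequency [rad/s]."""
    A = np.column_stack([np.cos(omega * t), np.sin(omega * t), np.ones_like(t)])
    (a, b, _offset), *_ = np.linalg.lstsq(A, head, rcond=None)
    return np.hypot(a, b), np.arctan2(b, a)       # amplitude, phase lag [rad]

t = np.linspace(0.0, 3600.0, 721)
omega = 2.0 * np.pi / 600.0                       # 10-minute pumping period (illustrative)
obs = 0.4 * np.cos(omega * t - 0.7) + 0.01 * np.random.randn(t.size)
print(fit_amplitude_phase(t, obs, omega))         # ~ (0.4, 0.7)
```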
Chaining for Flexible and High-Performance Key-Value Systems
2012-09-01
store that is fault tolerant achieves high performance and availability, and offers strong data consistency? We present a new replication protocol...effective high performance data access and analytics, many sites use simpler data model "NoSQL" systems. These systems store and retrieve data only by...DRAM, Flash, and disk-based storage; can act as an unreliable cache or a durable store; and can offer strong or weak data consistency. The value of
Klassen, Tara L.; von Rüden, Eva-Lotta; Drabek, Janice; Noebels, Jeffrey L.; Goldman, Alica M.
2013-01-01
Genetic testing and research have increased the demand for high-quality DNA that has traditionally been obtained by venipuncture. However, venous blood collection may prove difficult in special populations and when large-scale specimen collection or exchange is prerequisite for international collaborative investigations. Guthrie/FTA card–based blood spots, buccal scrapes, and finger nail clippings are DNA-containing specimens that are uniquely accessible and thus attractive as alternative tissue sources (ATS). The literature details a variety of protocols for extraction of nucleic acids from a singular ATS type, but their utility has not been systematically analyzed in comparison with conventional sources such as venous blood. Additionally, the efficacy of each protocol is often equated with the overall nucleic acid yield but not with the analytical performance of the DNA during mutation detection. Together with a critical in-depth literature review of published extraction methods, we developed and evaluated an all-inclusive approach for serial, systematic, and direct comparison of DNA utility from multiple biological samples. Our results point to the often underappreciated value of these alternative tissue sources and highlight ways to maximize the ATS-derived DNA for optimal quantity, quality, and utility as a function of extraction method. Our comparative analysis clarifies the value of ATS in genomic analysis projects for population-based screening, diagnostics, molecular autopsy, medico-legal investigations, or multi-organ surveys of suspected mosaicisms. PMID:22796560
Technology-assisted psychoanalysis.
Scharff, Jill Savege
2013-06-01
Teleanalysis, that is, remote psychoanalysis by telephone, voice over internet protocol (VoIP), or videoteleconference (VTC), has been thought of as a distortion of the frame that cannot support authentic analytic process. Yet it can augment continuity, permit optimum frequency of analytic sessions for in-depth analytic work, and enable outreach to analysands in areas far from specialized psychoanalytic centers. Theoretical arguments against teleanalysis are presented and countered, and its advantages and disadvantages discussed. Vignettes of analytic process from teleanalytic sessions are presented, and indications, contraindications, and ethical concerns are addressed. The aim is to provide material from which to judge the authenticity of analytic process supported by technology.
Durning, Steven J; Graner, John; Artino, Anthony R; Pangaro, Louis N; Beckman, Thomas; Holmboe, Eric; Oakes, Terrance; Roy, Michael; Riedy, Gerard; Capaldi, Vincent; Walter, Robert; van der Vleuten, Cees; Schuwirth, Lambert
2012-09-01
Clinical reasoning is essential to medical practice, but because it entails internal mental processes, it is difficult to assess. Functional magnetic resonance imaging (fMRI) and think-aloud protocols may improve understanding of clinical reasoning as these methods can more directly assess these processes. The objective of our study was to use a combination of fMRI and think-aloud procedures to examine fMRI correlates of a leading theoretical model in clinical reasoning based on experimental findings to date: analytic (i.e., actively comparing and contrasting diagnostic entities) and nonanalytic (i.e., pattern recognition) reasoning. We hypothesized that there would be functional neuroimaging differences between analytic and nonanalytic reasoning theory. 17 board-certified experts in internal medicine answered and reflected on validated U.S. Medical Licensing Exam and American Board of Internal Medicine multiple-choice questions (easy and difficult) during an fMRI scan. This procedure was followed by completion of a formal think-aloud procedure. fMRI findings provide some support for the presence of analytic and nonanalytic reasoning systems. Statistically significant activation of prefrontal cortex distinguished answering incorrectly versus correctly (p < 0.01), whereas activation of precuneus and midtemporal gyrus distinguished not guessing from guessing (p < 0.01). We found limited fMRI evidence to support analytic and nonanalytic reasoning theory, as our results indicate functional differences with correct vs. incorrect answers and guessing vs. not guessing. However, our findings did not suggest one consistent fMRI activation pattern of internal medicine expertise. This model of employing fMRI correlates offers opportunities to enhance our understanding of theory, as well as improve our teaching and assessment of clinical reasoning, a key outcome of medical education.
2013-01-01
Background The use of computerized systems to support evidence-based practice is commonplace in contemporary medicine. Despite the prolific use of electronic support systems there has been relatively little research on the uptake of web-based systems in the oncology setting. Our objective was to examine the uptake of a web-based oncology protocol system (http://www.eviq.org.au) by Australian cancer clinicians. Methods We used web-logfiles and Google Analytics to examine the characteristics of eviQ registrants from October 2009-December 2011 and patterns of use by cancer clinicians during a typical month. Results As of December 2011, there were 16,037 registrants; 85% of whom were Australian health care professionals. During a typical month 87% of webhits occurred in standard clinical hours (08:00 to 18:00 weekdays). Raw webhits were proportional to the size of clinician groups: nurses (47% of Australian registrants), followed by doctors (20%), and pharmacists (14%). However, pharmacists had up to three times the webhit rate of other clinical groups. Clinicians spent five times longer viewing chemotherapy protocol pages than other content and the protocols viewed reflect the most common cancers: lung, breast and colorectal. Conclusions Our results demonstrate eviQ is used by a range of health professionals involved in cancer treatment at the point-of-care. Continued monitoring of electronic decision support systems is vital to understanding how they are used in clinical practice and their impact on processes of care and patient outcomes. PMID:23497080
A Systematic Evaluation of Blood Serum and Plasma Pre-Analytics for Metabolomics Cohort Studies
Jobard, Elodie; Trédan, Olivier; Postoly, Déborah; André, Fabrice; Martin, Anne-Laure; Elena-Herrmann, Bénédicte; Boyault, Sandrine
2016-01-01
The recent thriving development of biobanks and associated high-throughput phenotyping studies requires the elaboration of large-scale approaches for monitoring biological sample quality and compliance with standard protocols. We present a metabolomic investigation of human blood samples that delineates pitfalls and guidelines for the collection, storage and handling procedures for serum and plasma. A series of eight pre-processing technical parameters is systematically investigated along variable ranges commonly encountered across clinical studies. While metabolic fingerprints, as assessed by nuclear magnetic resonance, are not significantly affected by altered centrifugation parameters or delays between sample pre-processing (blood centrifugation) and storage, our metabolomic investigation highlights that both the delay and storage temperature between blood draw and centrifugation are the primary parameters impacting serum and plasma metabolic profiles. Storing the blood drawn at 4 °C is shown to be a reliable routine to confine variability associated with idle time prior to sample pre-processing. Based on their fine sensitivity to pre-analytical parameters and protocol variations, metabolic fingerprints could be exploited as valuable ways to determine compliance with standard procedures and quality assessment of blood samples within large multi-omic clinical and translational cohort studies. PMID:27929400
Predicting the behavior of microfluidic circuits made from discrete elements
Bhargava, Krisna C.; Thompson, Bryant; Iqbal, Danish; Malmstadt, Noah
2015-01-01
Microfluidic devices can be used to execute a variety of continuous flow analytical and synthetic chemistry protocols with a great degree of precision. The growing availability of additive manufacturing has enabled the design of microfluidic devices with new functionality and complexity. However, these devices are prone to larger manufacturing variation than is typical of those made with micromachining or soft lithography. In this report, we demonstrate a design-for-manufacturing workflow that addresses performance variation at the microfluidic element and circuit level, in the context of mass manufacturing and additive manufacturing. Our approach relies on discrete microfluidic elements that are characterized by their terminal hydraulic resistance and associated tolerance. Network analysis is employed to construct simple analytical design rules for model microfluidic circuits. Monte Carlo analysis is employed at both the individual element and circuit level to establish expected performance metrics for several specific circuit configurations. A protocol based on osmometry is used to experimentally probe mixing behavior in circuits in order to validate these approaches. The overall workflow is applied to two application circuits with immediate use on the benchtop: series and parallel mixing circuits that are modularly programmable, virtually predictable, highly precise, and operable by hand. PMID:26516059
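As an illustration of the element-level tolerance analysis described in the abstract above, the following Python sketch propagates a resistance tolerance through a simple series/parallel hydraulic network by Monte Carlo sampling; the element values, tolerance, pressure drop, and circuit topology are hypothetical and are not taken from the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    def sample_resistance(nominal, tol_pct, n):
        # Draw element resistances from a normal distribution whose 3-sigma
        # spread equals the stated manufacturing tolerance.
        sigma = nominal * tol_pct / 100.0 / 3.0
        return rng.normal(nominal, sigma, size=n)

    n = 100_000                                # Monte Carlo trials
    r1 = sample_resistance(1.0e12, 10.0, n)    # Pa*s/m^3, hypothetical values
    r2 = sample_resistance(2.0e12, 10.0, n)
    r3 = sample_resistance(2.0e12, 10.0, n)

    # Hydraulic analogue of Ohm's law: Q = dP / R_total for one series element
    # feeding two parallel branches (r2 || r3).
    r_parallel = 1.0 / (1.0 / r2 + 1.0 / r3)
    r_total = r1 + r_parallel
    dP = 1.0e3                                 # applied pressure drop, Pa (hypothetical)
    Q = dP / r_total

    print(f"mean flow {Q.mean():.3e} m^3/s, CV {100*Q.std()/Q.mean():.1f}%")

The coefficient of variation of the flow rate is the kind of circuit-level performance metric that such a tolerance analysis yields.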
The interim final Response Protocol Toolbox: Planning for and Responding to Contamination Threats to Drinking Water Systems is designed to help the water sector effectively and appropriately respond to intentional contamination threats and incidents. It was produced by EPA, buil...
Development and Preliminary Evaluation of a FAP Protocol: Brief Relationship Enhancement
ERIC Educational Resources Information Center
Holman, Gareth; Kohlenberg, Robert J.; Tsai, Mavis
2012-01-01
The purpose of this study was to develop a brief Functional Analytic Psychotherapy (FAP) protocol that will facilitate reliable implementation of FAP interventions, thus supporting research on FAP process and outcome. The treatment was a four-session individual therapy for clients who were interested in improving their relationship with their…
A Field-Based Cleaning Protocol for Sampling Devices Used in Life-Detection Studies
NASA Astrophysics Data System (ADS)
Eigenbrode, Jennifer; Benning, Liane G.; Maule, Jake; Wainwright, Norm; Steele, Andrew; Amundsen, Hans E. F.
2009-06-01
Analytical approaches to extant and extinct life detection involve molecular detection often at trace levels. Thus, removal of biological materials and other organic molecules from the surfaces of devices used for sampling is essential for ascertaining meaningful results. Organic decontamination to levels consistent with null values on life-detection instruments is particularly challenging at remote field locations where Mars analog field investigations are carried out. Here, we present a seven-step, multi-reagent decontamination method that can be applied to sampling devices while in the field. In situ lipopolysaccharide detection via low-level endotoxin assays and molecular detection via gas chromatography-mass spectrometry were used to test the effectiveness of the decontamination protocol for sampling of glacial ice with a coring device and for sampling of sediments with a rover scoop during deployment at Arctic Mars-analog sites in Svalbard, Norway. Our results indicate that the protocols and detection technique sufficiently remove and detect low levels of molecular constituents necessary for life-detection tests.
Evaluation of Aspergillus PCR protocols for testing serum specimens.
White, P Lewis; Mengoli, Carlo; Bretagne, Stéphane; Cuenca-Estrella, Manuel; Finnstrom, Niklas; Klingspor, Lena; Melchers, Willem J G; McCulloch, Elaine; Barnes, Rosemary A; Donnelly, J Peter; Loeffler, Juergen
2011-11-01
A panel of human serum samples spiked with various amounts of Aspergillus fumigatus genomic DNA was distributed to 23 centers within the European Aspergillus PCR Initiative to determine analytical performance of PCR. Information regarding specific methodological components and PCR performance was requested. The information provided was made anonymous, and meta-regression analysis was performed to determine any procedural factors that significantly altered PCR performance. Ninety-seven percent of protocols were able to detect a threshold of 10 genomes/ml on at least one occasion, with 83% of protocols reproducibly detecting this concentration. Sensitivity and specificity were 86.1% and 93.6%, respectively. Positive associations between sensitivity and the use of larger sample volumes, an internal control PCR, and PCR targeting the internal transcribed spacer (ITS) region were shown. Negative associations between sensitivity and the use of larger elution volumes (≥100 μl) and PCR targeting the mitochondrial genes were demonstrated. Most Aspergillus PCR protocols used to test serum generate satisfactory analytical performance. Testing serum requires less standardization, and the specific recommendations shown in this article will only improve performance.
Ramp and periodic dynamics across non-Ising critical points
NASA Astrophysics Data System (ADS)
Ghosh, Roopayan; Sen, Arnab; Sengupta, K.
2018-01-01
We study ramp and periodic dynamics of ultracold bosons in a one-dimensional (1D) optical lattice which supports quantum critical points separating a uniform and a Z3 or Z4 symmetry-broken density-wave ground state. Our protocol involves both linear and periodic drives which take the system from the uniform state to the quantum critical point (for the linear drive protocol) or to the ordered state and back (for the periodic drive protocols) via controlled variation of a parameter of the system Hamiltonian. We provide exact numerical computation, for finite-size boson chains with L ≤ 24 using exact diagonalization (ED), of the excitation density D, the wave function overlap F, and the excess energy Q at the end of the drive protocol. For the linear ramp protocol, we identify the range of ramp speeds for which D and Q show Kibble-Zurek scaling. We find, based on numerical analysis with L ≤ 24, that such scaling is consistent with that expected from the critical exponents of the q-state Potts universality class with q = 3, 4. For the periodic protocol, we show that the model displays near-perfect dynamical freezing at specific frequencies; at these frequencies D, Q → 0 and |F| → 1. We provide a semi-analytic explanation of this freezing behavior and relate the phenomenon to a many-body version of Stückelberg interference. We suggest experiments which can test our theory.
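For readers who want to check a Kibble-Zurek scaling hypothesis against numerical data such as D versus ramp speed, a log-log power-law fit is a minimal sketch of the analysis; the data points below are invented for illustration, and the fitted slope is simply the apparent exponent, not a value from the paper.

    import numpy as np

    # Hypothetical excitation densities D measured at several ramp speeds v.
    v = np.array([1e-3, 3e-3, 1e-2, 3e-2, 1e-1])
    D = np.array([2.1e-3, 4.5e-3, 1.0e-2, 2.2e-2, 4.8e-2])

    # Kibble-Zurek predicts D ~ v^alpha; estimate alpha as the slope of log D vs log v.
    alpha, log_prefactor = np.polyfit(np.log(v), np.log(D), 1)
    print(f"fitted exponent alpha = {alpha:.3f}")

Comparing the fitted exponent with the value implied by the critical exponents of the conjectured universality class is the consistency check described in the abstract.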
Laser direct-write for fabrication of three-dimensional paper-based devices.
He, P J W; Katis, I N; Eason, R W; Sones, C L
2016-08-16
We report the use of a laser-based direct-write (LDW) technique that allows the design and fabrication of three-dimensional (3D) structures within a paper substrate that enables implementation of multi-step analytical assays via a 3D protocol. The technique is based on laser-induced photo-polymerisation, and through adjustment of the laser writing parameters such as the laser power and scan speed we can control the depths of hydrophobic barriers that are formed within a substrate which, when carefully designed and integrated, produce 3D flow paths. So far, we have successfully used this depth-variable patterning protocol for stacking and sealing of multi-layer substrates, for assembly of backing layers for two-dimensional (2D) lateral flow devices and finally for fabrication of 3D devices. Since the 3D flow paths can also be formed via a single laser-writing process by controlling the patterning parameters, this is a distinct improvement over other methods that require multiple complicated and repetitive assembly procedures. This technique is therefore suitable for cheap, rapid and large-scale fabrication of 3D paper-based microfluidic devices.
Fast assessment of planar chromatographic layers quality using pulse thermovision method.
Suszyński, Zbigniew; Świta, Robert; Loś, Joanna; Zarzycka, Magdalena B; Kaleniecka, Aleksandra; Zarzycki, Paweł K
2014-12-19
The main goal of this paper is to demonstrate the capability of pulse thermovision (thermal-wave) methodology for sensitive detection of photothermal non-uniformities within light-scattering and semi-transparent planar stationary phases. Successful visualization of stationary phase defects required signal processing protocols based on wavelet filtration, correlation analysis and k-means 3D segmentation. This post-processing approach allows extremely sensitive detection of thickness and structural changes within commercially available planar chromatographic layers. In particular, a number of TLC and HPTLC stationary phases including silica, cellulose, aluminum oxide, polyamide and octadecylsilane, coated with adsorbent layers ranging from 100 to 250 μm, were investigated. The presented detection protocol can be used as an efficient tool for fast screening of the overall heterogeneity of any layered material. Moreover, the described procedure is very fast (a few seconds including acquisition and data processing) and may be applied to online control of fabrication processes. Beyond planar chromatographic plates, this protocol can be used for assessment of other planar separation tools such as paper-based analytical devices or micro total analysis systems consisting of organic and non-organic layers. Copyright © 2014 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hendron, R.; Engebrecht, C.
The House Simulation Protocol document was developed to track and manage progress toward Building America's multi-year, average whole-building energy reduction research goals for new construction and existing homes, using a consistent analytical reference point. This report summarizes the guidelines for developing and reporting these analytical results in a consistent and meaningful manner for all home energy uses using standard operating conditions.
Eckfeldt, J H; Copeland, K R
1993-04-01
Proficiency testing using stabilized control materials has been used for decades as a means of monitoring and improving performance in the clinical laboratory. Often, the commonly used proficiency testing materials exhibit "matrix effects" that cause them to behave differently from fresh human specimens in certain clinical analytic systems. Because proficiency testing is the primary method in which regulatory agencies have chosen to evaluate clinical laboratory performance, the College of American Pathologists (CAP) has proposed guidelines for investigating the influence of matrix effects on their Survey results. The purpose of this investigation was to determine the feasibility, usefulness, and potential problems associated with this CAP Matrix Effect Analytical Protocol, in which fresh patient specimens and CAP proficiency specimens are analyzed simultaneously by a field method and a definitive, reference, or other comparative method. The optimal outcome would be that both the fresh human and CAP Survey specimens agree closely with the comparative method result. However, this was not always the case. Using several different analytic configurations, we were able to demonstrate matrix and calibration biases for several of the analytes investigated.
Hybrid optimal scheduling for intermittent androgen suppression of prostate cancer
NASA Astrophysics Data System (ADS)
Hirata, Yoshito; di Bernardo, Mario; Bruchovsky, Nicholas; Aihara, Kazuyuki
2010-12-01
We propose a method for achieving an optimal protocol of intermittent androgen suppression for the treatment of prostate cancer. Since the model that reproduces the dynamical behavior of the surrogate tumor marker, prostate specific antigen, is piecewise linear, we can obtain an analytical solution for the model. Based on this, we derive conditions for either stopping or delaying recurrent disease. The solution also provides a design principle for the most favorable schedule of treatment that minimizes the rate of expansion of the malignant cell population.
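The on/off scheduling idea behind intermittent androgen suppression can be prototyped with a toy piecewise-linear PSA model: the marker decays exponentially while treatment is on, grows while it is off, and therapy toggles when PSA crosses thresholds. The growth and decay rates, thresholds, and time step below are placeholders and are not the parameters of the cited model.

    import numpy as np

    def simulate_ias(psa0=4.0, on_rate=-0.08, off_rate=0.05,
                     upper=10.0, lower=4.0, days=1000, dt=1.0):
        """Toy intermittent androgen suppression schedule.

        PSA follows a linear ODE (exponential solution) in each phase; the
        on/off switching makes the overall system piecewise linear.
        All parameters are illustrative placeholders.
        """
        psa, on = psa0, False
        history = []
        for step in range(int(days / dt)):
            rate = on_rate if on else off_rate
            psa *= np.exp(rate * dt)      # exponential growth/decay within a phase
            if psa >= upper:
                on = True                 # start suppression
            elif psa <= lower:
                on = False                # pause treatment
            history.append((step * dt, psa, on))
        return history

    trace = simulate_ias()
    print(f"final PSA: {trace[-1][1]:.2f}, treatment on: {trace[-1][2]}")

Because each phase admits a closed-form solution, scheduling questions (when to stop or delay recurrence) reduce to algebra on the phase durations, which is the essence of the analytical treatment described above.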
Engineering fluidic delays in paper-based devices using laser direct-writing.
He, P J W; Katis, I N; Eason, R W; Sones, C L
2015-10-21
We report the use of a new laser-based direct-write technique that allows programmable and timed fluid delivery in channels within a paper substrate which enables implementation of multi-step analytical assays. The technique is based on laser-induced photo-polymerisation, and through adjustment of the laser writing parameters such as the laser power and scan speed we can control the depth and/or the porosity of hydrophobic barriers which, when fabricated in the fluid path, produce controllable fluid delay. We have patterned these flow delaying barriers at pre-defined locations in the fluidic channels using either a continuous wave laser at 405 nm, or a pulsed laser operating at 266 nm. Using this delay patterning protocol we generated flow delays spanning from a few minutes to over half an hour. Since the channels and flow delay barriers can be written via a common laser-writing process, this is a distinct improvement over other methods that require specialist operating environments, or custom-designed equipment. This technique can therefore be used for rapid fabrication of paper-based microfluidic devices that can perform single or multistep analytical assays.
The purpose of this protocol is to provide guidelines for the analysis of hair samples for total mercury by cold vapor atomic fluorescence (CVAFS) spectrometry. This protocol describes the methodology and all other analytical aspects involved in the analysis. Keywords: hair; s...
Comparison of PCR methods for the detection of genetic variants of carp edema virus.
Adamek, Mikolaj; Matras, Marek; Jung-Schroers, Verena; Teitge, Felix; Heling, Max; Bergmann, Sven M; Reichert, Michal; Way, Keith; Stone, David M; Steinhagen, Dieter
2017-09-20
The infection of common carp and its ornamental variety, koi, with the carp edema virus (CEV) is often associated with the occurrence of a clinical disease called 'koi sleepy disease'. The disease may lead to high mortality in both koi and common carp populations. To prevent further spread of the infection and the disease, a reliable detection method for this virus is required. However, the high genetic variability of the CEV p4a gene used for PCR-based diagnostics could be a serious obstacle for successful and reliable detection of virus infection in field samples. By analysing 39 field samples from different geographical origins obtained from koi and farmed carp and from all 3 genogroups of CEV, using several recently available PCR protocols, we investigated which of the protocols would allow the detection of CEV from all known genogroups present in samples from Central European carp or koi populations. The comparison of 5 different PCR protocols showed that the PCR assays (both end-point and quantitative) developed in the Centre for Environment, Fisheries and Aquaculture Science exhibited the highest analytical inclusivity and diagnostic sensitivity. Currently, this makes them the most suitable protocols for detecting viruses from all known CEV genogroups.
Annual banned-substance review: analytical approaches in human sports drug testing.
Thevis, Mario; Kuuranne, Tiia; Geyer, Hans; Schänzer, Wilhelm
2015-01-01
Within the mosaic display of international anti-doping efforts, analytical strategies based on up-to-date instrumentation as well as most recent information about physiology, pharmacology, metabolism, etc., of prohibited substances and methods of doping are indispensable. The continuous emergence of new chemical entities and the identification of arguably beneficial effects of established or even obsolete drugs on endurance, strength, and regeneration, necessitate frequent and adequate adaptations of sports drug testing procedures. These largely rely on exploiting new technologies, extending the substance coverage of existing test protocols, and generating new insights into metabolism, distribution, and elimination of compounds prohibited by the World Anti-Doping Agency (WADA). In reference of the content of the 2014 Prohibited List, literature concerning human sports drug testing that was published between October 2013 and September 2014 is summarized and reviewed in this annual banned-substance review, with particular emphasis on analytical approaches and their contribution to enhanced doping controls. Copyright © 2014 John Wiley & Sons, Ltd.
Obstetrical complications associated with abnormal maternal serum markers analytes.
Gagnon, Alain; Wilson, R Douglas
2008-10-01
To review the obstetrical outcomes associated with abnormally elevated or decreased levels of one or more of the most frequently measured maternal serum marker analytes used in screening for aneuploidy. To provide guidance to facilitate the management of pregnancies that have abnormal levels of one or more markers and to assess the usefulness of these markers as a screening test. Perinatal outcomes associated with abnormal levels of maternal serum markers analytes are compared with the outcomes of pregnancies with normal levels of the same analytes or the general population. The Cochrane Library and Medline were searched for English-language articles published from 1966 to February 2007, relating to maternal serum markers and perinatal outcomes. Search terms included PAPP-A (pregnancy associated plasma protein A), AFP (alphafetoprotein), hCG (human chorionic gonadotropin), estriol, unconjugated estriol, inhibin, inhibin-A, maternal serum screen, triple marker screen, quadruple screen, integrated prenatal screen, first trimester screen, and combined prenatal screen. All study types were reviewed. Randomized controlled trials were considered evidence of the highest quality, followed by cohort studies. Key individual studies on which the recommendations are based are referenced. Supporting data for each recommendation are summarized with evaluative comments and references. The evidence was evaluated using the guidelines developed by the Canadian Task Force on Preventive Health Care. The evidence collected was reviewed by the Genetics Committee of the Society of Obstetricians and Gynaecologists of Canada. The benefit expected from this guideline is to facilitate early detection of potential adverse pregnancy outcomes when risks are identified at the time of a maternal serum screen. It will help further stratification of risk and provide options for pregnancy management to minimize the impact of pregnancy complications. The potential harms resulting from such practice are associated with the so-called false positives (i.e., uncomplicated pregnancies labelled at increased risk for adverse perinatal outcomes), the potential stress associated with such a label, and the investigations performed for surveillance in this situation. No cost-benefit analysis is available to assess costs and savings associated with this guideline. SUMMARY STATEMENTS: 1. An unexplained level of a maternal serum marker analyte is defined as an abnormal level after confirmation of gestational age by ultrasound and exclusion of maternal, fetal, or placental causes for the abnormal level. (III) 2. Abnormally elevated levels of serum markers are associated with adverse pregnancy outcomes in twin pregnancies, after correction for the number of fetuses. Spontaneous or planned multifetal reductions may result in abnormal elevations of serum markers. (II-2) RECOMMENDATIONS: 1. In the first trimester, an unexplained low PAPP-A (< 0.4 MoM) and/or a low hCG (< 0.5 MoM) are associated with an increased frequency of adverse obstetrical outcomes, and, at present, no specific protocol for treatment is available. (II-2A) In the second trimester, an unexplained elevation of maternal serum AFP (> 2.5 MoM), hCG (> 3.0 MoM), and/or inhibin-A (> or =2.0 MoM) or a decreased level of maternal serum AFP (< 0.25 MoM) and/or unconjugated estriol (< 0.5 MoM) are associated with an increased frequency of adverse obstetrical outcomes, and, at present, no specific protocol for treatment is available. (II-2A) 2. 
Pregnant women with an unexplained elevated PAPP-A or hCG in the first trimester and an unexplained low hCG or inhibin-A and an unexplained elevated unconjugated estriol in the second trimester should receive normal antenatal care, as this pattern of analytes is not associated with adverse perinatal outcomes. (II-2A) 3. The combination of second or third trimester placenta previa and an unexplained elevated maternal serum AFP should increase the index of suspicion for placenta accreta, increta, or percreta. (II-2B) An assessment (ultrasound, MRI) of the placental-uterine interface should be performed. Abnormal invasion should be strongly suspected, and the planning of delivery location and technique should be done accordingly. (III-C) 4. A prenatal consultation with the medical genetics department is recommended for low unconjugated estriol levels (<0.3 MoM), as this analyte pattern can be associated with genetic conditions. (II-2B) 5. The clinical management protocol for identification of potential adverse obstetrical outcomes should be guided by one or more abnormal maternal serum marker analyte values rather than the false positive screening results for the trisomy 21 and/or the trisomy 18 screen. (II-2B) 6. Pregnant women who are undergoing renal dialysis or who have had a renal transplant should be offered maternal serum screening, but interpretation of the result is difficult as the level of serum hCG is not reliable. (II-2A) 7. Abnormal maternal uterine artery Doppler in association with elevated maternal serum AFP, hCG, or inhibin-A or decreased PAPP-A identifies a group of women at greater risk of IUGR and gestational hypertension with proteinuria. Uterine artery Doppler measurements may be used in the evaluation of an unexplained abnormal level of either of these markers. (II-2B) 8. Further research is recommended to identify the best protocol for pregnancy management and surveillance in women identified at increased risk of adverse pregnancy outcomes based on an abnormality of a maternal serum screening analyte. (III-A) 9. In the absence of evidence supporting any specific surveillance protocol, an obstetrician should be consulted in order to establish a fetal surveillance plan specific to the increased obstetrical risks (maternal and fetal) identified. This plan may include enhanced patient education on signs and symptoms of the most common complications, increased frequency of antenatal visits, increased ultrasound (fetal growth, amniotic fluid levels), and fetal surveillance (biophysical profile, arterial and venous Doppler), and cervical length assessment. (III-A) 10. Limited information suggests that, in women with elevated hCG in the second trimester and/or abnormal uterine artery Doppler (at 22-24 weeks), low-dose aspirin (60-81 mg daily) is associated with higher birthweight and lower incidence of gestational hypertension with proteinuria. This therapy may be used in women who are at risk. (II-2B) 11. Further studies are recommended in order to assess the benefits of low-dose aspirin, low molecular weight heparin, or other therapeutic options in pregnancies determined to be at increased risk on the basis of an abnormal maternal serum screening analyte. (III-A) 12. 
Multiple maternal serum markers screening should not be used at present as a population-based screening method for adverse pregnancy outcomes (such as preeclampsia, placental abruption, and stillbirth) outside an established research protocol, as sensitivity is low, false positive rates are high, and no management protocol has been shown to clearly improve outcomes. (II-2D) When maternal serum screening is performed for the usual clinical indication (fetal aneuploidy and/or neural tube defect), abnormal analyte results can be utilized for the identification of pregnancies at risk and to direct their clinical management. (II-2B) Further studies are recommended to determine the optimal screening method for poor maternal and/or perinatal outcomes. (III-A).
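A screening laboratory could encode the unexplained-analyte cut-offs quoted in the recommendations above (e.g., PAPP-A < 0.4 MoM, AFP > 2.5 MoM) as a simple rule check. The sketch below merely restates those thresholds in Python with hypothetical input values; it is not a clinical decision tool.

    # Illustrative thresholds transcribed from the recommendations above (MoM).
    FIRST_TRIMESTER_FLAGS = {
        "PAPP-A": lambda x: x < 0.4,
        "hCG":    lambda x: x < 0.5,
    }
    SECOND_TRIMESTER_FLAGS = {
        "AFP":       lambda x: x > 2.5 or x < 0.25,
        "hCG":       lambda x: x > 3.0,
        "inhibin-A": lambda x: x >= 2.0,
        "uE3":       lambda x: x < 0.5,
    }

    def flag_analytes(values_mom, trimester):
        """Return the analytes whose MoM values fall outside the quoted limits."""
        rules = FIRST_TRIMESTER_FLAGS if trimester == 1 else SECOND_TRIMESTER_FLAGS
        return [name for name, rule in rules.items()
                if name in values_mom and rule(values_mom[name])]

    print(flag_analytes({"AFP": 2.8, "hCG": 1.1, "uE3": 0.9}, trimester=2))  # ['AFP']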
Ullah, Sana; Kwak, Kyung Sup
2012-06-01
A Wireless Body Area Network (WBAN) consists of low-power, miniaturized, and autonomous wireless sensor nodes that enable physicians to remotely monitor vital signs of patients and provide real-time feedback with medical diagnosis and consultations. It is a reliable and inexpensive way to take care of patients suffering from chronic diseases such as asthma, diabetes and cardiovascular diseases. Among the most important requirements for a WBAN are low power consumption and low delay. These can be achieved by introducing flexible duty cycling techniques on the energy-constrained sensor nodes. Stated otherwise, low duty cycle nodes should not receive frequent synchronization and control packets if they have no data to send or receive. In this paper, we introduce a Traffic-adaptive MAC protocol (TaMAC) that takes into account the traffic information of the sensor nodes. The protocol dynamically adjusts the duty cycle of the sensor nodes according to their traffic patterns, thus solving the idle listening and overhearing problems. The traffic patterns of all sensor nodes are organized and maintained by the coordinator. The TaMAC protocol is supported by a wakeup radio that is used to accommodate emergency and on-demand events in a reliable manner. The wakeup radio uses a separate control channel along with the data channel and therefore has considerably low power consumption requirements. Analytical expressions are derived to analyze and compare the performance of the TaMAC protocol with the well-known beacon-enabled IEEE 802.15.4 MAC, WiseMAC, and SMAC protocols. The analytical derivations are further validated by simulation results. It is shown that the TaMAC protocol outperforms all other protocols in terms of power consumption and delay.
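The basic energy argument behind traffic-adaptive duty cycling can be illustrated with a first-order average-power expression; the radio current figures and duty cycles below are placeholders rather than values from the TaMAC analysis.

    def average_power(duty_cycle, p_active_mw=60.0, p_sleep_mw=0.03):
        """First-order average power of a duty-cycled radio (placeholder figures)."""
        return duty_cycle * p_active_mw + (1.0 - duty_cycle) * p_sleep_mw

    # A node that adapts its duty cycle to its own traffic (0.1%) versus a node
    # that wakes on every beacon interval regardless of traffic (2%).
    for name, d in [("traffic-adaptive", 0.001), ("fixed beacon-tracking", 0.02)]:
        print(f"{name:>22}: {average_power(d):.3f} mW average")

The gap between the two figures is exactly what removing unnecessary synchronization and control listening buys a low-traffic node.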
S-curve networks and an approximate method for estimating degree distributions of complex networks
NASA Astrophysics Data System (ADS)
Guo, Jin-Li
2010-12-01
In the study of complex networks almost all theoretical models have the property of infinite growth, but the size of actual networks is finite. Using statistics on China's Internet IPv4 (Internet Protocol version 4) addresses, this paper proposes a forecasting model based on the S curve (logistic curve), and the growing trend of IPv4 addresses in China is forecasted. The forecast provides reference values for optimizing the allocation of IPv4 address resources and for the development of IPv6. Based on the observed laws of IPv4 growth, namely bulk growth and a finite growth limit, the paper proposes a finite network model with bulk growth. The model is said to be an S-curve network. Analysis demonstrates that the analytic method based on uniform distributions (i.e., the Barabási-Albert method) is not suitable for this network. The paper develops an approximate method to predict the growth dynamics of the individual nodes, and uses this to calculate analytically the degree distribution and the scaling exponents. The analytical result agrees well with the simulation, obeying an approximately power-law form. This method can overcome a shortcoming of the Barabási-Albert method commonly used in current network research.
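A minimal version of the logistic (S-curve) forecast described above can be obtained with scipy's curve_fit; the yearly address counts below are fabricated for illustration only and do not reproduce the paper's dataset.

    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(t, K, r, t0):
        """Logistic growth: carrying capacity K, rate r, midpoint t0."""
        return K / (1.0 + np.exp(-r * (t - t0)))

    # Hypothetical yearly counts of allocated IPv4 addresses (millions).
    years = np.array([2000, 2002, 2004, 2006, 2008, 2010], dtype=float)
    addrs = np.array([10.0, 25.0, 60.0, 130.0, 220.0, 290.0])

    popt, _ = curve_fit(logistic, years, addrs, p0=[350.0, 0.5, 2006.0])
    K, r, t0 = popt
    print(f"fitted limit K={K:.0f}M, rate r={r:.2f}/yr, midpoint t0={t0:.1f}")
    print(f"forecast for 2012: {logistic(2012.0, *popt):.0f}M")

The fitted carrying capacity K plays the role of the finite growth limit that motivates the S-curve network model.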
A slotted access control protocol for metropolitan WDM ring networks
NASA Astrophysics Data System (ADS)
Baziana, P. A.; Pountourakis, I. E.
2009-03-01
In this study we focus on the serious scalability problems that many access protocols for WDM ring networks introduce due to the use of a dedicated wavelength per access node for either transmission or reception. We propose an efficient slotted MAC protocol suitable for WDM ring metropolitan area networks. The proposed network architecture employs a separate wavelength for control information exchange prior to the data packet transmission. Each access node is equipped with a pair of tunable transceivers for data communication and a pair of fixed tuned transceivers for control information exchange. Also, each access node includes a set of fixed delay lines for synchronization reasons, to hold the data packets while the control information is processed. An efficient access algorithm is applied to avoid both data wavelength and receiver collisions. In our protocol, each access node is capable of transmitting and receiving over any of the data wavelengths, addressing the scalability issues. Two different slot reuse schemes are assumed: the source and the destination stripping schemes. For both schemes, performance measures are evaluated via an analytic model. The analytical results are validated by a discrete event simulation model that uses Poisson traffic sources. Simulation results show that the proposed protocol achieves efficient bandwidth utilization, especially under high load. Also, comparative simulation results prove that our protocol achieves significant performance improvement as compared with other WDMA protocols which restrict transmission to a dedicated data wavelength. Finally, performance measures are evaluated for diverse buffer sizes, numbers of access nodes, and numbers of data wavelengths.
2017-01-01
In an ideal plasmonic surface sensor, the bioactive area, where analytes are recognized by specific biomolecules, is surrounded by an area that is generally composed of a different material. The latter, often the surface of the supporting chip, is generally hard to functionalize selectively with respect to the active area. As a result, cross-talk between the active area and the surrounding one may occur. In designing a plasmonic sensor, various issues must be addressed: the specificity of analyte recognition, the orientation of the immobilized biomolecule that acts as the analyte receptor, and the selectivity of surface coverage. The objective of this tutorial review is to introduce the main rational tools required for a correct and complete approach to chemically functionalizing plasmonic surface biosensors. After a short introduction, the review discusses, in detail, the most common strategies for achieving effective surface functionalization. The most important issues, such as the orientation of active molecules and spatial and chemical selectivity, are considered. A list of well-defined protocols is suggested for the most common practical situations. Importantly, for the reported protocols, we also present direct comparisons in terms of costs, labor demand, and risk-benefit balance. In addition, a survey of the most commonly used characterization techniques necessary to validate the chemical protocols is reported. PMID:28796479
Evaluation of two methods to determine glyphosate and AMPA in soils of Argentina
NASA Astrophysics Data System (ADS)
De Geronimo, Eduardo; Lorenzon, Claudio; Iwasita, Barbara; Faggioli, Valeria; Aparicio, Virginia; Costa, Jose Luis
2017-04-01
Argentine agricultural production is fundamentally based on a technological package combining no-tillage with dependence on glyphosate applications to control weeds in transgenic crops (soybean, maize and cotton). Glyphosate is therefore the most heavily used herbicide in the country, where 180 to 200 million liters are applied every year. Due to its widespread use, it is important to assess its impact on the environment and, therefore, reliable analytical methods are mandatory. The glyphosate molecule exhibits unique physical and chemical characteristics which make its quantification difficult, especially in soils with high organic matter content, such as the central eastern Argentine soils, where strong interferences are normally observed. The objective of this work was to compare two methods for extraction and quantification of glyphosate and AMPA in samples of 8 representative soils of Argentina. The first analytical method (method 1) was based on the use of phosphate buffer as the extracting solution and dichloromethane to minimize matrix organic content. In the second method (method 2), potassium hydroxide was used to extract the analytes followed by a clean-up step using solid phase extraction (SPE) to minimize strong interferences. Sensitivity, recoveries, matrix effects and robustness were evaluated. Both methodologies involved derivatization with 9-fluorenyl-methyl-chloroformate (FMOC) in borate buffer and detection based on ultra-high-pressure liquid chromatography coupled to tandem mass spectrometry (UHPLC-MS/MS). Recoveries obtained from soil samples spiked at 0.1 and 1 mg kg-1 were satisfactory for both methods (70%-120%). However, there was a remarkable difference regarding the matrix effect: the SPE clean-up step (method 2) was insufficient to remove the interferences, whereas the dilution and clean-up with dichloromethane (method 1) were more effective at minimizing ionic suppression. Moreover, method 1 had fewer steps in the sample processing protocol than method 2. This can be highly valuable in routine lab work because it reduces potential errors such as loss of analyte or sample contamination. In addition, the substitution of SPE by this alternative involved a considerable reduction of analytical costs in method 1. We conclude that method 1 is simpler and cheaper than method 2, and reliable for quantifying glyphosate in Argentinean soils. We hope that this experience can be useful to simplify the protocols of glyphosate quantification and contribute to the understanding of the fate of this herbicide in the environment.
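The recovery and matrix-effect figures discussed above are typically computed as simple ratios; the sketch below shows the arithmetic with made-up numbers, not data from the study.

    def recovery_pct(measured, spiked):
        """Recovery (%) of a spiked analyte."""
        return 100.0 * measured / spiked

    def matrix_effect_pct(slope_in_matrix, slope_in_solvent):
        """Signal suppression/enhancement (%) from matrix-matched vs. solvent calibration."""
        return 100.0 * (slope_in_matrix / slope_in_solvent - 1.0)

    # Hypothetical example: soil spiked at 0.1 mg/kg, 0.085 mg/kg recovered.
    print(f"recovery: {recovery_pct(0.085, 0.1):.0f}%")           # 85%, within 70-120%
    print(f"matrix effect: {matrix_effect_pct(0.62, 1.0):.0f}%")  # -38%, i.e., ion suppression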
Electroanalytical sensing of chromium(III) and (VI) utilising gold screen printed macro electrodes.
Metters, Jonathan P; Kadara, Rashid O; Banks, Craig E
2012-02-21
We report the fabrication of gold screen printed macro electrodes which are electrochemically characterised and contrasted to polycrystalline gold macroelectrodes with their potential analytical application towards the sensing of chromium(III) and (VI) critically explored. It is found that while these gold screen printed macro electrodes have electrode kinetics typically one order of magnitude lower than polycrystalline gold macroelectrodes as is measured via a standard redox probe, in terms of analytical sensing, these gold screen printed macro electrodes mimic polycrystalline gold in terms of their analytical performance towards the sensing of chromium(III) and (VI), whilst boasting additional advantages over the macro electrode due to their disposable one-shot nature and the ease of mass production. An additional advantage of these gold screen printed macro electrodes compared to polycrystalline gold is the alleviation of the requirement to potential cycle the latter to form the required gold oxide which aids in the simplification of the analytical protocol. We demonstrate that gold screen printed macro electrodes allow the low micro-molar sensing of chromium(VI) in aqueous solutions over the range 10 to 1600 μM with a limit of detection (3σ) of 4.4 μM. The feasibility of the analytical protocol is also tested through chromium(VI) detection in environmental samples.
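The 3σ detection limit quoted above follows from the calibration slope and the blank noise; a generic calculation looks like the following, with invented calibration data standing in for the chromium(VI) measurements.

    import numpy as np

    # Hypothetical calibration: chromium(VI) concentration (uM) vs. peak current (uA).
    conc = np.array([10, 50, 100, 400, 800, 1600], dtype=float)
    current = np.array([0.21, 1.05, 2.1, 8.3, 16.9, 33.5])

    slope, intercept = np.polyfit(conc, current, 1)
    blank_sd = 0.0031   # standard deviation of repeated blank measurements (uA), hypothetical

    lod = 3.0 * blank_sd / slope   # 3-sigma limit of detection
    print(f"sensitivity {slope*1000:.1f} nA/uM, LOD {lod:.2f} uM")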
2017-01-01
Acetamide has been classified as a possible human carcinogen, but uncertainties exist about its levels in foods. This report presents evidence that thermal decomposition of N-acetylated sugars and amino acids in heated gas chromatograph injectors contributes to artifactual acetamide in milk and beef. An alternative gas chromatography/mass spectrometry protocol based on derivatization of acetamide with 9-xanthydrol was optimized and shown to be free of artifactual acetamide formation. The protocol was validated using a surrogate analyte approach based on d3-acetamide and applied to analyze 23 pasteurized whole milk, 44 raw sirloin beef, and raw milk samples from 14 different cows, and yielded levels about 10-fold lower than those obtained by direct injection without derivatization. The xanthydrol derivatization procedure detected acetamide in every food sample tested at 390 ± 60 ppb in milk, 400 ± 80 ppb in beef, and 39 000 ± 9000 ppb in roasted coffee beans. PMID:29186951
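In its simplest single-point form, quantification against an isotope-labelled standard such as d3-acetamide reduces to a response-ratio calculation; the sketch below illustrates that arithmetic with invented peak areas and is a simplification of the surrogate-analyte calibration actually used in the study.

    def concentration_from_ratio(area_analyte, area_labelled, spike_ng_per_g,
                                 response_factor=1.0):
        """Estimate analyte concentration (ng/g) from the peak-area ratio to a
        co-injected isotope-labelled surrogate spiked at a known level.
        response_factor corrects for any difference in response between the
        native and labelled forms (assumed 1.0 here)."""
        return (area_analyte / area_labelled) * spike_ng_per_g / response_factor

    # Hypothetical milk sample: d3-acetamide spiked at 500 ng/g.
    print(f"{concentration_from_ratio(7.8e4, 1.0e5, 500.0):.0f} ng/g acetamide")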
SACRB-MAC: A High-Capacity MAC Protocol for Cognitive Radio Sensor Networks in Smart Grid
Yang, Zhutian; Shi, Zhenguo; Jin, Chunlin
2016-01-01
The Cognitive Radio Sensor Network (CRSN) is considered as a viable solution to enhance various aspects of the electric power grid and to realize a smart grid. However, several challenges for CRSNs are generated due to the harsh wireless environment in a smart grid. As a result, throughput and reliability become critical issues. On the other hand, the spectrum aggregation technique is expected to play an important role in CRSNs in a smart grid. By using spectrum aggregation, the throughput of CRSNs can be improved efficiently, so as to address the unique challenges of CRSNs in a smart grid. In this regard, we proposed Spectrum Aggregation Cognitive Receiver-Based MAC (SACRB-MAC), which employs the spectrum aggregation technique to improve the throughput performance of CRSNs in a smart grid. Moreover, SACRB-MAC is a receiver-based MAC protocol, which can provide a good reliability performance. Analytical and simulation results demonstrate that SACRB-MAC is a promising solution for CRSNs in a smart grid. PMID:27043573
Dinov, Ivo D.; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W.; Price, Nathan D.; Van Horn, John D.; Ames, Joseph; Clark, Kristi; Hood, Leroy; Hampstead, Benjamin M.; Dauer, William; Toga, Arthur W.
2016-01-01
Background A unique archive of Big Data on Parkinson’s Disease is collected, managed and disseminated by the Parkinson’s Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson’s disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data–large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources–all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Methods and Findings Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. We evaluated several complementary model-based predictive approaches, which failed to generate accurate and reliable diagnostic predictions. However, the results of several machine-learning based classification methods indicated significant power to predict Parkinson’s disease in the PPMI subjects (consistent accuracy, sensitivity, and specificity exceeding 96%, confirmed using statistical n-fold cross-validation). Clinical (e.g., Unified Parkinson's Disease Rating Scale (UPDRS) scores), demographic (e.g., age), genetics (e.g., rs34637584, chr12), and derived neuroimaging biomarker (e.g., cerebellum shape index) data all contributed to the predictive analytics and diagnostic forecasting. Conclusions Model-free Big Data machine learning-based classification methods (e.g., adaptive boosting, support vector machines) can outperform model-based techniques in terms of predictive precision and reliability (e.g., forecasting patient diagnosis). We observed that statistical rebalancing of cohort sizes yields better discrimination of group differences, specifically for predictive analytics based on heterogeneous and incomplete PPMI data. UPDRS scores play a critical role in predicting diagnosis, which is expected based on the clinical definition of Parkinson’s disease. Even without longitudinal UPDRS data, however, the accuracy of model-free machine learning based classification is over 80%. 
The methods, software and protocols developed here are openly shared and can be employed to study other neurodegenerative disorders (e.g., Alzheimer’s, Huntington’s, amyotrophic lateral sclerosis), as well as for other predictive Big Data analytics applications. PMID:27494614
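A stripped-down version of the model-free classification workflow sketched above (cohort rebalancing, boosting, and n-fold cross-validation) can be expressed with scikit-learn; the feature matrix here is synthetic and the pipeline only illustrates the kind of analysis described, not the PPMI code or data.

    import numpy as np
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.model_selection import StratifiedKFold, cross_val_score
    from sklearn.utils import resample

    rng = np.random.default_rng(42)

    # Synthetic stand-in for clinical/imaging/genetic features and PD labels.
    X = rng.normal(size=(400, 20))
    y = np.array([1] * 300 + [0] * 100)   # imbalanced cohorts
    X[y == 1, 0] += 1.0                   # inject a weak group difference

    # Rebalance by upsampling the minority cohort (for a real study, do this
    # inside each training fold to avoid information leakage).
    X_min, y_min = resample(X[y == 0], y[y == 0], n_samples=300, random_state=0)
    X_bal = np.vstack([X[y == 1], X_min])
    y_bal = np.concatenate([y[y == 1], y_min])

    clf = AdaBoostClassifier(n_estimators=200, random_state=0)
    scores = cross_val_score(clf, X_bal, y_bal,
                             cv=StratifiedKFold(n_splits=5, shuffle=True, random_state=0))
    print(f"5-fold accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")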
Leonard, Susan R.; Mammel, Mark K.; Lacher, David W.
2015-01-01
Culture-independent diagnostics reduce the reliance on traditional (and slower) culture-based methodologies. Here we capitalize on advances in next-generation sequencing (NGS) to apply this approach to food pathogen detection utilizing NGS as an analytical tool. In this study, spiking spinach with Shiga toxin-producing Escherichia coli (STEC) following an established FDA culture-based protocol was used in conjunction with shotgun metagenomic sequencing to determine the limits of detection, sensitivity, and specificity levels and to obtain information on the microbiology of the protocol. We show that an expected level of contamination (∼10 CFU/100 g) could be adequately detected (including key virulence determinants and strain-level specificity) within 8 h of enrichment at a sequencing depth of 10,000,000 reads. We also rationalize the relative benefit of static versus shaking culture conditions and the addition of selected antimicrobial agents, thereby validating the long-standing culture-based parameters behind such protocols. Moreover, the shotgun metagenomic approach was informative regarding the dynamics of microbial communities during the enrichment process, including initial surveys of the microbial loads associated with bagged spinach; the microbes found included key genera such as Pseudomonas, Pantoea, and Exiguobacterium. Collectively, our metagenomic study highlights and considers various parameters required for transitioning to such sequencing-based diagnostics for food safety and the potential to develop better enrichment processes in a high-throughput manner not previously possible. Future studies will investigate new species-specific DNA signature target regimens, rational design of medium components in concert with judicious use of additives, such as antibiotics, and alterations in the sample processing protocol to enhance detection. PMID:26386062
Bapiro, Tashinga E; Richards, Frances M; Goldgraben, Mae A; Olive, Kenneth P; Madhu, Basetti; Frese, Kristopher K; Cook, Natalie; Jacobetz, Michael A; Smith, Donna-Michelle; Tuveson, David A; Griffiths, John R; Jodrell, Duncan I
2011-11-01
To develop a sensitive analytical method to quantify gemcitabine (2',2'-difluorodeoxycytidine, dFdC) and its metabolites 2',2'-difluorodeoxyuridine (dFdU) and 2',2'-difluorodeoxycytidine-5'-triphosphate (dFdCTP) simultaneously from tumour tissue. Pancreatic ductal adenocarcinoma tumour tissue from genetically engineered mouse models of pancreatic cancer (KP ( FL/FL ) C and KP ( R172H/+) C) was collected after dosing the mice with gemcitabine. (19)F NMR spectroscopy and LC-MS/MS protocols were optimised to detect gemcitabine and its metabolites in homogenates of the tumour tissue. A (19)F NMR protocol was developed, which was capable of distinguishing the three analytes in tumour homogenates. However, it required at least 100 mg of the tissue in question and a long acquisition time per sample, making it impractical for use in large PK/PD studies or clinical trials. The LC-MS/MS protocol was developed using porous graphitic carbon to separate the analytes, enabling simultaneous detection of all three analytes from as little as 10 mg of tissue, with a sensitivity for dFdCTP of 0.2 ng/mg tissue. Multiple pieces of tissue from single tumours were analysed, showing little intra-tumour variation in the concentrations of dFdC or dFdU (both intra- and extra-cellular). Intra-tumoural variation was observed in the concentration of dFdCTP, an intra-cellular metabolite, which may reflect regions of different cellularity within a tumour. We have developed a sensitive LC-MS/MS method capable of quantifying gemcitabine, dFdU and dFdCTP in pancreatic tumour tissue. The requirement for only 10 mg of tissue enables this protocol to be used to analyse multiple areas from a single tumour and to spare tissue for additional pharmacodynamic assays.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferrell, Jack R; Ware, Anne E
Two catalytic fast pyrolysis (CFP) oils (bottom/heavy fraction) were analyzed in various solvents that are used in common analytical methods (nuclear magnetic resonance - NMR, gas chromatography - GC, gel permeation chromatography - GPC, thermogravimetric analysis - TGA) for oil characterization and speciation. A more accurate analysis of the CFP oils can be obtained by identification and exploitation of solvent miscibility characteristics. Acetone and tetrahydrofuran can be used to completely solubilize CFP oils for analysis by GC and tetrahydrofuran can be used for traditional organic GPC analysis of the oils. DMSO-d6 can be used to solubilize CFP oils for analysis by 13C NMR. The fractionation of oils into solvents that did not completely solubilize the whole oils showed that miscibility can be related to the oil properties. This allows for solvent selection based on physico-chemical properties of the oils. However, based on semi-quantitative comparisons of the GC chromatograms, the organic solvent fractionation schemes did not speciate the oils based on specific analyte type. On the other hand, chlorinated solvents did fractionate the oils based on analyte size to a certain degree. Unfortunately, like raw pyrolysis oil, the matrix of the CFP oils is complicated and is not amenable to simple liquid-liquid extraction (LLE) or solvent fractionation to separate the oils based on the chemical and/or physical properties of individual components. For reliable analyses, for each analytical method used, it is critical that the bio-oil sample is both completely soluble and also not likely to react with the chosen solvent. The adoption of the standardized solvent selection protocols presented here will allow for greater reproducibility of analysis across different users and facilities.
Xu, Zhezhuang; Chen, Liquan; Liu, Ting; Cao, Lianyang; Chen, Cailian
2015-10-20
Multi-hop data collection in wireless sensor networks (WSNs) is a challenging issue due to the limited energy resources and transmission range of wireless sensors. The hybrid clustering and routing (HCR) strategy has provided an effective solution, which can generate a connected and efficient cluster-based topology for multi-hop data collection in WSNs. However, it suffers from imbalanced energy consumption, which results in poor network lifetime. In this paper, we evaluate the energy consumption of HCR and discover an important result: the imbalanced energy consumption generally appears in gradient k = 1, i.e., the nodes that can communicate with the sink directly. Based on this observation, we propose a new protocol called HCR-1, which includes adaptive relay selection and tunable cost functions to balance the energy consumption. Guidelines for setting the parameters of HCR-1 are provided based on simulations. The analytical and numerical results prove that, with minor modification of the topology in gradient k = 1, the HCR-1 protocol effectively balances the energy consumption and prolongs the network lifetime.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hennebert, Pierre, E-mail: pierre.hennebert@ineris.fr; Papin, Arnaud; Padox, Jean-Marie
Highlights: • Knowledge of the substances present in wastes will be necessary to assess HP1–HP15 hazard properties. • A new analytical protocol is proposed for this and tested by two service laboratories on 32 samples. • Sixty-three percent of the samples have a satisfactory analytical balance between 90% and 110%. • Eighty-four percent of the samples were classified identically (Seveso Directive) for their hazardousness by the two laboratories. • The method, in progress, is being normalized in France and is to be proposed to CEN. - Abstract: The classification of waste as hazardous could soon be assessed in Europe using largely the hazard properties of its constituents, according to the Classification, Labelling and Packaging (CLP) regulation. Comprehensive knowledge of the component constituents of a given waste will therefore be necessary. An analytical protocol for determining waste composition is proposed, which includes using inductively coupled plasma (ICP) screening methods to identify major elements and gas chromatography/mass spectrometry (GC–MS) screening techniques to measure organic compounds. The method includes a gross or indicator measure of ‘pools’ of higher molecular weight organic substances that are taken to be less bioactive and less hazardous, and of unresolved ‘mass’ during the chromatography of volatile and semi-volatile compounds. The concentrations of some elements and specific compounds that are linked to specific hazard properties and are subject to specific regulation (examples include: heavy metals, chromium(VI), cyanides, organo-halogens, and PCBs) are determined by classical quantitative analysis. To check the consistency of the analysis, the sum of the concentrations (including unresolved ‘pools’) should give a mass balance between 90% and 110%. Thirty-two laboratory samples comprising different industrial wastes (liquids and solids) were tested by two routine service laboratories, to give circa 7000 parameter results. Despite discrepancies in some parameters, a satisfactory sum of estimated or measured concentrations (analytical balance) of 90% was reached for 20 samples (63% of the overall total) during this first test exercise, with identified reasons for most of the unsatisfactory results. Regular use of this protocol (which is now included in the French legislation) has enabled service laboratories to reach a 90% mass balance for nearly all the solid samples tested, and most of the liquid samples (difficulties were caused in some samples by polymers in solution and vegetable oil). The protocol has been submitted to the French and European normalization bodies (AFNOR and CEN) and further improvements are awaited.
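The 90-110% analytical-balance criterion described above amounts to summing all measured or estimated constituent concentrations; a trivial check function, with made-up constituent values, is shown below.

    def analytical_balance_pct(constituents_mg_per_kg):
        """Sum of all measured/estimated constituent concentrations, as % of the sample mass."""
        return 100.0 * sum(constituents_mg_per_kg.values()) / 1_000_000.0

    # Hypothetical waste sample composition (mg/kg), including unresolved 'pools'.
    sample = {
        "mineral matrix (ICP majors as oxides)": 610_000,
        "water": 180_000,
        "heavy organic pool (GC-unresolved)": 120_000,
        "identified organics (GC-MS)": 45_000,
        "heavy metals": 8_000,
    }
    balance = analytical_balance_pct(sample)
    print(f"analytical balance: {balance:.0f}% -> "
          f"{'OK' if 90 <= balance <= 110 else 'investigate'}")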
Network Analysis Tools: from biological networks to clusters and pathways.
Brohée, Sylvain; Faust, Karoline; Lima-Mendez, Gipsi; Vanderstocken, Gilles; van Helden, Jacques
2008-01-01
Network Analysis Tools (NeAT) is a suite of computer tools that integrate various algorithms for the analysis of biological networks: comparison between graphs, between clusters, or between graphs and clusters; network randomization; analysis of degree distribution; network-based clustering and path finding. The tools are interconnected to enable a stepwise analysis of the network through a complete analytical workflow. In this protocol, we present a typical case of utilization, where the tasks above are combined to decipher a protein-protein interaction network retrieved from the STRING database. The results returned by NeAT are typically subnetworks, networks enriched with additional information (i.e., clusters or paths) or tables displaying statistics. Typical networks comprising several thousands of nodes and arcs can be analyzed within a few minutes. The complete protocol can be read and executed in approximately 1 h.
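NeAT itself is a web-based suite, so the sketch below is not its API; it simply reproduces two steps of the described workflow (network-based clustering, then path finding) on a toy interaction graph using the networkx library, to illustrate the kind of stepwise analysis meant here.

```python
# Toy stand-in for a NeAT-style workflow: cluster a small interaction graph and
# find a path between two nodes of interest. networkx replaces the web tools.
import networkx as nx

g = nx.Graph()
g.add_edges_from([
    ("A", "B"), ("B", "C"), ("A", "C"),   # a dense module
    ("C", "D"),                           # bridge between modules
    ("D", "E"), ("E", "F"), ("D", "F"),   # a second module
])

# Network-based clustering: greedy modularity communities stand in for MCL/RNSC.
clusters = list(nx.algorithms.community.greedy_modularity_communities(g))
print("clusters:", [sorted(c) for c in clusters])

# Path finding between two proteins of interest.
print("shortest path A -> F:", nx.shortest_path(g, "A", "F"))
```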
Coherence in quantum estimation
NASA Astrophysics Data System (ADS)
Giorda, Paolo; Allegra, Michele
2018-01-01
The geometry of quantum states provides a unifying framework for estimation processes based on quantum probes, and it establishes the ultimate bounds of the achievable precision. We show a relation between the statistical distance between infinitesimally close quantum states and the second order variation of the coherence of the optimal measurement basis with respect to the state of the probe. In quantum phase estimation protocols, this leads us to propose coherence as the relevant resource that one has to engineer and control to optimize the estimation precision. Furthermore, the main object of the theory, i.e., the symmetric logarithmic derivative, in many cases allows one to identify a proper factorization of the whole Hilbert space into two subsystems. The factorization allows one to discuss the role of coherence versus correlations in estimation protocols; to show how certain estimation processes can be completely or effectively described within a single-qubit subsystem; and to derive lower bounds for the scaling of the estimation precision with the number of probes used. We illustrate how the framework works for both noiseless and noisy estimation procedures, in particular those based on multi-qubit GHZ-states. Finally, we succinctly analyze estimation protocols based on zero-temperature critical behavior. We identify the coherence that is at the heart of their efficiency, and we show how it exhibits the non-analyticities and scaling behavior characteristic of a large class of quantum phase transitions.
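For context, the textbook relations behind the quantities named in the abstract (the symmetric logarithmic derivative, the quantum Fisher information, and the associated precision bound) read, in standard notation:

```latex
% Standard definitions, not the paper's specific results: the symmetric
% logarithmic derivative L_\theta, the quantum Fisher information F_Q, the
% quantum Cramer-Rao bound for \nu probes, and the link to the Bures distance.
\begin{align}
  \partial_\theta \rho_\theta &= \tfrac{1}{2}\left(L_\theta \rho_\theta + \rho_\theta L_\theta\right),\\
  F_Q(\theta) &= \operatorname{Tr}\!\left[\rho_\theta L_\theta^2\right],\\
  \operatorname{Var}(\hat{\theta}) &\ge \frac{1}{\nu\, F_Q(\theta)},
  \qquad
  ds^2_{\mathrm{Bures}} = \tfrac{1}{4}\, F_Q(\theta)\, d\theta^2 .
\end{align}
```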
Makadia, Rupa; Matcho, Amy; Ma, Qianli; Knoll, Chris; Schuemie, Martijn; DeFalco, Frank J; Londhe, Ajit; Zhu, Vivienne; Ryan, Patrick B
2015-01-01
Objectives To evaluate the utility of applying the Observational Medical Outcomes Partnership (OMOP) Common Data Model (CDM) across multiple observational databases within an organization and to apply standardized analytics tools for conducting observational research. Materials and methods Six deidentified patient-level datasets were transformed to the OMOP CDM. We evaluated the extent of information loss that occurred through the standardization process. We developed a standardized analytic tool to replicate the cohort construction process from a published epidemiology protocol and applied the analysis to all 6 databases to assess time-to-execution and comparability of results. Results Transformation to the CDM resulted in minimal information loss across all 6 databases. Excluded patients and observations were due to identified data quality issues in the source system; 96% to 99% of condition records and 90% to 99% of drug records were successfully mapped into the CDM using the standard vocabulary. The full cohort replication and descriptive baseline summary was executed for 2 cohorts in 6 databases in less than 1 hour. Discussion The standardization process improved data quality, increased efficiency, and facilitated cross-database comparisons to support a more systematic approach to observational research. Comparisons across data sources showed consistency in the impact of the protocol's inclusion criteria and identified differences in patient characteristics and coding practices across databases. Conclusion Standardizing data structure (through a CDM), content (through a standard vocabulary with source code mappings), and analytics can enable an institution to apply a network-based approach to observational research across multiple, disparate observational health databases. PMID:25670757
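As a rough illustration of what a standardized cohort-construction step looks like once data are in the OMOP CDM, the sketch below filters a condition_occurrence table for a target concept set and takes each person's first qualifying date as cohort entry. The column names follow the CDM, but the concept set and the index-date rule are illustrative assumptions, not the published protocol.

```python
# Sketch of a cohort-construction step against an OMOP CDM condition_occurrence
# table loaded as a pandas DataFrame. Concept IDs and the "first diagnosis =
# cohort entry" rule are illustrative assumptions.
import pandas as pd

def build_cohort(condition_occurrence: pd.DataFrame, target_concept_ids: set) -> pd.DataFrame:
    """Return one row per person: first qualifying diagnosis date (cohort entry)."""
    hits = condition_occurrence[condition_occurrence["condition_concept_id"].isin(target_concept_ids)]
    return (
        hits.groupby("person_id", as_index=False)["condition_start_date"]
        .min()
        .rename(columns={"condition_start_date": "cohort_start_date"})
    )

# The same function runs unchanged against each CDM-formatted database, which is
# the point of standardizing structure and vocabulary before analysis.
```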
Eggert, Corinne; Moselle, Kenneth; Protti, Denis; Sanders, Dale
2017-01-01
Closed Loop Analytics© is receiving growing interest in healthcare as a term referring to information technology, local data and clinical analytics working together to generate evidence for improvement. The Closed Loop Analytics model consists of three loops corresponding to the decision-making levels of an organization and the associated data within each loop - Patients, Protocols, and Populations. The authors propose that each of these levels should utilize the same ecosystem of electronic health record (EHR) and enterprise data warehouse (EDW) enabled data, in a closed-loop fashion, with that data being repackaged and delivered to suit the analytic and decision support needs of each level, in support of better outcomes.
Superhydrophobic Analyte Concentration Utilizing Colloid-Pillar Array SERS Substrates
Wallace, Ryan A.; Charlton, Jennifer J.; Kirchner, Teresa B.; ...
2014-11-04
Detecting a few molecules present in a large sample requires knowledge of the trace components in medicinal and environmental samples. Surface enhanced Raman spectroscopy (SERS) is a technique that can be utilized to detect molecules at very low absolute numbers. However, detection at trace concentration levels in real samples requires properly designed delivery and detection systems. The work presented here involves superhydrophobic surfaces that include silicon pillar arrays formed by lithographic and dewetting protocols. To generate the necessary plasmonic substrate for SERS detection, a simple and flow-stable Ag colloid was added to the functionalized pillar array system via soaking. The pillars are used native and with hydrophobic modification. The pillars provide a means to concentrate analyte via superhydrophobic droplet evaporation effects. A 100-fold concentration of analyte was estimated, with a limit of detection of 2.9 × 10⁻¹² M for mitoxantrone dihydrochloride. Additionally, analytes were delivered to the surface via a multiplex approach in order to demonstrate an ability to control droplet size and placement for scaled-up, real-world applications. Finally, a concentration process involving transport and sequestration based on surface-treatment-selective wicking is demonstrated.
Efficient Online Optimized Quantum Control for Adiabatic Quantum Computation
NASA Astrophysics Data System (ADS)
Quiroz, Gregory
Adiabatic quantum computation (AQC) relies on controlled adiabatic evolution to implement a quantum algorithm. While control evolution can take many forms, properly designed time-optimal control has been shown to be particularly advantageous for AQC. Grover's search algorithm is one such example where analytically-derived time-optimal control leads to improved scaling of the minimum energy gap between the ground state and first excited state and thus, the well-known quadratic quantum speedup. Analytical extensions beyond Grover's search algorithm present a daunting task that requires potentially intractable calculations of energy gaps and a significant degree of model certainty. Here, an in situ quantum control protocol is developed for AQC. The approach is shown to yield controls that approach the analytically-derived time-optimal controls for Grover's search algorithm. In addition, the protocol's convergence rate as a function of iteration number is shown to be essentially independent of system size. Thus, the approach is potentially scalable to many-qubit systems.
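The controlled adiabatic evolution referred to above is usually written as a time-dependent interpolation between a driver and a problem Hamiltonian; the local (time-optimal) schedule below is the textbook form for Grover's search, not the paper's in situ protocol.

```latex
% Textbook form of controlled adiabatic evolution: interpolating Hamiltonian H(s)
% and the local schedule that adapts the sweep rate to the instantaneous gap g(s).
\begin{align}
  H\big(s(t)\big) &= \big(1 - s(t)\big)\, H_{0} + s(t)\, H_{P},
  \qquad s(0)=0,\; s(T)=1,\\
  \frac{ds}{dt} &= \varepsilon\, g^{2}(s),
  \qquad g(s) = E_{1}(s) - E_{0}(s).
\end{align}
% Running fast where g(s) is large and slowly near its minimum yields
% T = O(\sqrt{N}) for an N-item Grover search, versus O(N) for a linear ramp.
```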
Hanaoka, Shigeyuki; Nomura, Koji; Kudo, Shinichi
2005-09-02
Knowledge of the exact nature of the constituents of abandoned chemical weapons (ACW) is a prerequisite for their orderly destruction. Here we report the development of analytical procedures to identify diphenylchloroarsine (DA/Clark I), diphenylcyanoarsine (DC/Clark II) and related substances employed in one of the munitions known as the "Red canister". Both DA and DC are relatively unstable under conventional analytical procedures without thiol derivatization. Unfortunately, however, thiol derivatization affords the same volatile organo-arsenic derivative from several different diphenylarsenic compounds, making it impossible to identify and quantify the original compounds. Further, diminishing the analytical interference caused by the celluloid powder used as a stacking material in the weapons is also essential for accurate analysis. In this study, extraction and instrumental conditions were evaluated and an optimal protocol was determined. The analysis of Red canister samples following this protocol showed that most of the DA and DC associated with pumice had degraded to bis(diphenylarsine)oxide (BDPAO), while those associated with celluloid had predominantly degraded to diphenylarsinic acid (DPAA).
Development of the Basis for an Analytical Protocol for Feeds and Products of Bio-oil Hydrotreatment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oasmaa, Anja; Kuoppala, Eeva; Elliott, Douglas C.
2012-04-02
Methods for easily following the main changes in the composition, stability, and acidity of bio-oil during hydrotreatment are presented, and their correlation with more conventional methods is provided. Depending on the final use, the upgrading requirements differ, which also creates challenges for the analytical protocol. Polar pyrolysis liquids and their products can be divided by solvent fractionation into five main groups, changes in which are easy to follow. This method has been used successfully for over ten years to compare fast pyrolysis bio-oil quality and to track changes during handling and storage, and it provides the basis of the analytical protocol presented in this paper. The method has most recently been used also for the characterisation of bio-oil hydrotreatment products. The use of gas chromatographic and spectroscopic methods is discussed. In addition, fuel oil analyses suitable for fast pyrolysis bio-oils and hydrotreatment products are discussed.
An ultra-low power wireless sensor network for bicycle torque performance measurements.
Gharghan, Sadik K; Nordin, Rosdiadee; Ismail, Mahamod
2015-05-21
In this paper, we propose an energy-efficient transmission technique known as the sleep/wake algorithm for a bicycle torque sensor node. This paper aims to highlight the trade-off between energy efficiency and the communication range between the cyclist and coach. Two experiments were conducted. The first experiment utilised the ZigBee protocol (XBee S2), and the second experiment used the Advanced and Adaptive Network Technology (ANT) protocol based on the Nordic nRF24L01 radio transceiver chip. The current consumption of ANT was measured, simulated and compared with that of a torque sensor node using the XBee S2 protocol. In addition, an analytical model was derived to correlate the sensor node average current consumption with the crank arm cadence. The sensor node achieved 98% power savings for ANT relative to ZigBee when the radios were compared alone, and the power savings amounted to 30% when all components of the sensor node were considered. The achievable communication range was 65 and 50 m for ZigBee and ANT, respectively, during measurement on an outdoor cycling track (i.e., velodrome). The conclusions indicate that the ANT protocol is more suitable for use in a torque sensor node when power consumption is a crucial demand, whereas the ZigBee protocol is more convenient in ensuring data communication between cyclist and coach.
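A back-of-the-envelope version of the kind of current-consumption model the abstract derives is sketched below: the node wakes once per crank revolution, so the duty cycle, and hence the average current, grows with cadence. All current and timing values are illustrative assumptions, not the measured figures from the paper.

```python
# Duty-cycle model of a sleep/wake torque sensor node: one wake-up per crank
# revolution to sample and transmit, then sleep. Values are illustrative.

def average_current_mA(cadence_rpm, i_active_mA=14.0, i_sleep_mA=0.003, t_active_s=0.004):
    """Average node current as a function of crank cadence."""
    period_s = 60.0 / cadence_rpm           # one wake-up per crank revolution
    duty = min(1.0, t_active_s / period_s)  # fraction of time spent awake
    return duty * i_active_mA + (1.0 - duty) * i_sleep_mA

for cadence in (60, 90, 120):
    print(f"{cadence:>3} rpm -> {average_current_mA(cadence):.4f} mA average")
```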
An analytical approach to test and design upper limb prosthesis.
Veer, Karan
2015-01-01
In this work, the signal acquisition technique, the analysis models and the design protocols of the prosthesis are discussed. The different methods to estimate the motion intended by the amputee from surface electromyogram (SEMG) signals, based on time and frequency domain parameters, are presented. The experiments showed that the techniques used can help significantly in discriminating the amputee's motions among four independent activities using a dual-channel set-up. Further, based on the experimental results, the design and working of an artificial arm are covered under two constituents: the electronics design and the mechanical assembly. Finally, the developed hand prosthesis allows amputated persons to perform daily routine activities easily.
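The time-domain SEMG parameters typically used for this kind of motion discrimination (mean absolute value, root-mean-square, zero crossings) can be computed as below; the abstract does not list its exact feature set, so these are standard choices rather than the paper's.

```python
# Standard time-domain SEMG features for one analysis window; a per-channel
# feature vector from the dual-channel set-up would feed a simple classifier.
import numpy as np

def semg_features(x: np.ndarray, zc_threshold: float = 0.01) -> dict:
    mav = np.mean(np.abs(x))                  # mean absolute value
    rms = np.sqrt(np.mean(x ** 2))            # root-mean-square amplitude
    # Zero crossings: sign changes whose amplitude step exceeds a noise threshold.
    zc = np.sum((x[:-1] * x[1:] < 0) & (np.abs(x[:-1] - x[1:]) > zc_threshold))
    return {"MAV": mav, "RMS": rms, "ZC": int(zc)}

signal = np.random.default_rng(0).normal(scale=0.1, size=2000)  # stand-in SEMG window
print(semg_features(signal))
```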
Onda, Yuichi; Kato, Hiroaki; Hoshi, Masaharu; Takahashi, Yoshio; Nguyen, Minh-Long
2015-01-01
The Fukushima Dai-ichi Nuclear Power Plant (FDNPP) accident resulted in extensive radioactive contamination of the environment via deposited radionuclides such as radiocesium and (131)I. Evaluating the extent and level of environmental contamination is critical to protecting citizens in affected areas and to planning decontamination efforts. However, a standardized soil sampling protocol is needed in such emergencies to facilitate the collection of large, tractable samples for measuring gamma-emitting radionuclides. In this study, we developed an emergency soil sampling protocol based on preliminary sampling from the FDNPP accident-affected area. We also present the results of a preliminary experiment aimed to evaluate the influence of various procedures (e.g., mixing, number of samples) on measured radioactivity. Results show that sample mixing strongly affects measured radioactivity in soil samples. Furthermore, for homogenization, shaking the plastic sample container at least 150 times or disaggregating soil by hand-rolling in a disposable plastic bag is required. Finally, we determined that five soil samples within a 3 m × 3 m area are the minimum number required for reducing measurement uncertainty in the emergency soil sampling protocol proposed here. Copyright © 2014 Elsevier Ltd. All rights reserved.
An orientation soil survey at the Pebble Cu-Au-Mo porphyry deposit, Alaska
Smith, Steven M.; Eppinger, Robert G.; Fey, David L.; Kelley, Karen D.; Giles, S.A.
2009-01-01
Soil samples were collected in 2007 and 2008 along three traverses across the giant Pebble Cu-Au-Mo porphyry deposit. Within each soil pit, four subsamples were collected following recommended protocols for each of ten commonly used and proprietary leach/digestion techniques. The significance of geochemical patterns generated by these techniques was classified by visual inspection of plots showing individual element concentrations by each analytical method along the 2007 traverse. A simple element-versus-method matrix, populated with values based on the significance classification, provides a means of ranking the utility of methods and elements at this deposit. The interpretation of a complex multi-element dataset derived from multiple analytical techniques is challenging. An example of vanadium results from a single leach technique is used to illustrate several possible interpretations of the data.
Ciceri, E; Recchia, S; Dossi, C; Yang, L; Sturgeon, R E
2008-01-15
The development and validation of a method for the determination of mercury in sediments using a sector field inductively coupled plasma mass spectrometer (SF-ICP-MS) for detection is described. The utilization of isotope dilution (ID) calibration is shown to solve analytical problems related to matrix composition. Mass bias is corrected using an internal mass bias correction technique, validated against the traditional standard bracketing method. The overall analytical protocol is validated against NRCC PACS-2 marine sediment CRM. The estimated limit of detection is 12 ng/g. The proposed procedure was applied to the analysis of a real sediment core sampled to a depth of 160 m in Lake Como, where Hg concentrations ranged from 66 to 750 ng/g.
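For reference, the single-spike isotope dilution relation that underlies ID calibration can be written as follows (textbook form, not the paper's exact working equations):

```latex
% Standard isotope dilution relation. R denotes the ratio of reference isotope a
% to spike isotope b; subscripts x, s and m refer to sample, spike and measured
% blend, respectively.
\begin{equation}
  n_{b,x} \;=\; n_{b,s}\,\frac{R_{s} - R_{m}}{R_{m} - R_{x}},
\end{equation}
% so the amount of analyte in the sample follows from the known spike amount and
% the three isotope ratios, once the mass-bias correction has been applied to R_m.
```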
Møller, Mette F; Søndergaard, Tove R; Kristensen, Helle T; Münster, Anna-Marie B
2017-09-01
Background Centrifugation of blood samples is an essential preanalytical step in the clinical biochemistry laboratory. Centrifugation settings are often altered to optimize sample flow and turnaround time. Few studies have addressed the effect of altering centrifugation settings on analytical quality, and almost all studies have been done using collection tubes with gel separator. Methods In this study, we compared a centrifugation time of 5 min at 3000 × g to a standard protocol of 10 min at 2200 × g. Nine selected general chemistry and immunochemistry analytes and interference indices were studied in lithium heparin plasma tubes and serum tubes without gel separator. Results were evaluated using mean bias, difference plots and coefficient of variation, compared with maximum allowable bias and coefficient of variation used in laboratory routine quality control. Results For all analytes except lactate dehydrogenase, the results were within the predefined acceptance criteria, indicating that the analytical quality was not compromised. Lactate dehydrogenase showed higher values after centrifugation for 5 min at 3000 × g, mean bias was 6.3 ± 2.2% and the coefficient of variation was 5%. Conclusions We found that a centrifugation protocol of 5 min at 3000 × g can be used for the general chemistry and immunochemistry analytes studied, with the possible exception of lactate dehydrogenase, which requires further assessment.
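Since the two protocols are specified as relative centrifugal force (× g), the corresponding spin speeds follow from the standard RCF formula; the rotor radius below is an illustrative assumption.

```python
# Relationship between relative centrifugal force (the "x g" settings above) and
# spin speed for a given rotor radius; standard formula, shown only to make the
# two protocols comparable. The radius is an assumed value.

def rcf(rpm, radius_cm):
    """Relative centrifugal force in multiples of g."""
    return 1.118e-5 * radius_cm * rpm ** 2

def rpm_for(rcf_target, radius_cm):
    """Spin speed needed to reach a target RCF at the given radius."""
    return (rcf_target / (1.118e-5 * radius_cm)) ** 0.5

radius = 15.0  # cm, assumed swing-out rotor
print(f"{rpm_for(2200, radius):.0f} rpm for 2200 x g, {rpm_for(3000, radius):.0f} rpm for 3000 x g")
```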
de Hoogt, Ronald; Estrada, Marta F; Vidic, Suzana; Davies, Emma J; Osswald, Annika; Barbier, Michael; Santo, Vítor E; Gjerde, Kjersti; van Zoggel, Hanneke J A A; Blom, Sami; Dong, Meng; Närhi, Katja; Boghaert, Erwin; Brito, Catarina; Chong, Yolanda; Sommergruber, Wolfgang; van der Kuip, Heiko; van Weerden, Wytske M; Verschuren, Emmy W; Hickman, John; Graeser, Ralph
2017-11-21
Two-dimensional (2D) culture of cancer cells in vitro does not recapitulate the three-dimensional (3D) architecture, heterogeneity and complexity of human tumors. More representative models are required that better reflect key aspects of tumor biology. These are essential for studies of cancer biology and immunology, as well as for target validation and drug discovery. The Innovative Medicines Initiative (IMI) consortium PREDECT (www.predect.eu) characterized in vitro models of three solid tumor types with the goal of capturing elements of tumor complexity and heterogeneity. 2D culture and 3D mono- and stromal co-cultures of increasing complexity, and precision-cut tumor slice models were established. Robust protocols for the generation of these platforms are described. Tissue microarrays were prepared from all the models, permitting immunohistochemical analysis of individual cells, capturing heterogeneity. 3D cultures were also characterized using image analysis. Detailed step-by-step protocols, exemplary datasets from the 2D, 3D, and slice models, and refined analytical methods were established and are presented.
The international fine aerosol networks
NASA Astrophysics Data System (ADS)
Cahill, Thomas A.
1993-04-01
The adoption by the United States of a PIXE-based protocol for its fine aerosol network, after open competitions involving numerous laboratories and methods, has encouraged cooperation with other countries possessing similar capabilities and similar needs. These informal cooperative programs, involving about a dozen countries at the end of 1991, almost all use PIXE as a major component of the analytical protocols. The University of California, Davis, Air Quality Group assisted such programs through indefinite loans of a quality assurance sampler, the IMPROVE Channel A, and analyses at no cost of a small fraction of the samples taken in a side-by-side configuration. In December 1991, the World Meteorological Organization chose a protocol essentially identical to IMPROVE for the Global Atmospheric Watch (GAW) network and began deploying units, the IMPROVE Channel A, to sites around the world. Preferred analyses include fine (less than about 2.5 μm) mass, ions by ion chromatography and elements by PIXE + PESA (or, lacking that, XRF). This paper will describe progress in both programs, giving examples of the utility of the data and projecting the future expansion of the network to about 20 GAW sites by 1994.
Manns, David C; Mansfield, Anna Katharine
2012-08-17
Four high-throughput reverse-phase chromatographic protocols utilizing two different core-shell column chemistries have been developed to analyze the phenolic profiles of complex matrices, specifically targeting juices and wines produced from interspecific hybrid grape cultivars. Following pre-fractionation via solid-phase extraction or direct injection, individual protocols were designed to resolve, identify and quantify specific chemical classes of compounds including non-anthocyanin monomeric phenolics, condensed tannins following acid hydrolysis, and anthocyanins. Detection levels ranging from 1.2 ppb to 27.5 ppb, analyte %RSDs ranging from 0.04 to 0.38, and linear ranges of quantitation approaching five orders of magnitude were achieved using conventional HPLC instrumentation. Using C(18) column chemistry, the non-anthocyanin monomeric protocol effectively separated a set of 16 relevant phenolic compounds comprising flavan-3-ols, hydroxycinnamic acids, and flavonols in under 14 min. The same column was used to develop a 15-min protocol for hydrolyzed condensed tannin analysis. Two anthocyanin protocols are presented, one utilizing the same C(18) column, best suited for anthocyanidin and monoglucoside analysis, the other utilizing a pentafluorophenyl chemistry optimized to effectively separate complex mixtures of coexisting mono- and diglucoside anthocyanins. These protocols and column chemistries have been used initially to explore a wide variety of complex phenolic matrices, including red and white juices and wines produced from Vitis vinifera and interspecific hybrid grape cultivars, juices, teas, and plant extracts. Each protocol displayed robust matrix responses as written, yet is flexible enough to be easily modified to suit specifically tailored analytical requirements. Copyright © 2012 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Kuzmiakova, Adele; Dillner, Ann M.; Takahama, Satoshi
2016-06-01
A growing body of research on statistical applications for characterization of atmospheric aerosol Fourier transform infrared (FT-IR) samples collected on polytetrafluoroethylene (PTFE) filters (e.g., Russell et al., 2011; Ruthenburg et al., 2014) and a rising interest in analyzing FT-IR samples collected by air quality monitoring networks call for an automated PTFE baseline correction solution. The existing polynomial technique (Takahama et al., 2013) is not scalable to a project with a large number of aerosol samples because it contains many parameters and requires expert intervention. Therefore, the question of how to develop an automated method for baseline correcting hundreds to thousands of ambient aerosol spectra given the variability in both environmental mixture composition and PTFE baselines remains. This study approaches the question by detailing the statistical protocol, which allows for the precise definition of analyte and background subregions, applies nonparametric smoothing splines to reproduce sample-specific PTFE variations, and integrates performance metrics from atmospheric aerosol and blank samples alike in the smoothing parameter selection. Referencing 794 atmospheric aerosol samples from seven Interagency Monitoring of PROtected Visual Environments (IMPROVE) sites collected during 2011, we start by identifying key FT-IR signal characteristics, such as non-negative absorbance or analyte segment transformation, to capture sample-specific transitions between background and analyte. While referring to qualitative properties of PTFE background, the goal of smoothing splines interpolation is to learn the baseline structure in the background region to predict the baseline structure in the analyte region. We then validate the model by comparing smoothing splines baseline-corrected spectra with uncorrected and polynomial baseline (PB)-corrected equivalents via three statistical applications: (1) clustering analysis, (2) functional group quantification, and (3) thermal optical reflectance (TOR) organic carbon (OC) and elemental carbon (EC) predictions. The discrepancy rate for a four-cluster solution is 10 %. For all functional groups but carboxylic COH the discrepancy is ≤ 10 %. Performance metrics obtained from TOR OC and EC predictions (R2 ≥ 0.94, bias ≤ 0.01 µg m-3, and error ≤ 0.04 µg m-3) are on a par with those obtained from uncorrected and PB-corrected spectra. The proposed protocol leads to estimates that are visually and analytically similar to those generated by the polynomial method. More importantly, the automated solution allows us and future users to evaluate its analytical reproducibility while minimizing reducible user bias. We anticipate the protocol will enable FT-IR researchers and data analysts to quickly and reliably analyze a large amount of data and connect them to a variety of available statistical learning methods to be applied to analyte absorbances isolated in atmospheric aerosol samples.
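The core of the smoothing-spline idea can be sketched briefly: fit a spline to the background subregions only and predict, then subtract, the baseline across the analyte subregion. scipy's UnivariateSpline stands in for the authors' implementation; the band limits and smoothing factor are illustrative assumptions.

```python
# Baseline-correct an FT-IR spectrum by learning the PTFE background with a
# smoothing spline fitted outside the analyte bands. Illustrative sketch only.
import numpy as np
from scipy.interpolate import UnivariateSpline

def baseline_correct(wavenumber, absorbance, analyte_bands, smoothing=1e-4):
    """analyte_bands: list of (lo, hi) wavenumber ranges excluded from the fit."""
    in_analyte = np.zeros_like(wavenumber, dtype=bool)
    for lo, hi in analyte_bands:
        in_analyte |= (wavenumber >= lo) & (wavenumber <= hi)
    background = ~in_analyte
    spline = UnivariateSpline(wavenumber[background], absorbance[background], s=smoothing)
    return absorbance - spline(wavenumber)  # baseline-corrected spectrum

# e.g. exclude the aliphatic C-H stretch region when learning the PTFE baseline:
# corrected = baseline_correct(wn, ab, analyte_bands=[(2800, 3000)])
```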
2015-05-01
application,1 while the simulated PLC software is the open-source ModbusPal Java application. When queried using the Modbus TCP protocol, ModbusPal reports ... and programmable logic controller (PLC) components. The HMI and PLC components were instantiated with software and installed in multiple virtual ... creating and capturing HMI–PLC network traffic over a 24-h period in the virtualized network and inspect the packets for errors. Test the
Ermacora, Alessia; Hrnčiřík, Karel
2014-01-01
Substantial progress has been recently made in the development and optimisation of analytical methods for the quantification of 2-MCPD, 3-MCPD and glycidyl esters in oils and fats, and there are a few methods currently available that allow a reliable quantification of these contaminants in bulk oils and fats. On the other hand, no standard method for the analysis of foodstuffs has yet been established. The aim of this study was the development and validation of a new method for the simultaneous quantification of 2-MCPD, 3-MCPD and glycidyl esters in oil-based food products. The developed protocol includes a first step of liquid-liquid extraction and purification of the lipophilic substances of the sample, followed by the application of a previously developed procedure based on acid transesterification, for the indirect quantification of these contaminants in oils and fats. The method validation was carried out on food products (fat-based spreads, creams, margarine, mayonnaise) manufactured in-house, in order to control the manufacturing process and account for any food matrix-analyte interactions (the sample spiking was carried out on the single components used for the formulations rather than the final products). The method showed good accuracy (the recoveries ranged from 97% to 106% for bound 3-MCPD and 2-MCPD and from 88% to 115% for bound glycidol) and sensitivity (the LOD was 0.04 and 0.05 mg kg(-1) for bound MCPD and glycidol, respectively). Repeatability and reproducibility were satisfactory (RSD below 2% and 5%, respectively) for all analytes. The levels of salts and surface-active compounds in the formulation were found to have no impact on the accuracy and the other parameters of the method.
Dried Blood Spots - Preparing and Processing for Use in Immunoassays and in Molecular Techniques
Grüner, Nico; Stambouli, Oumaima; Ross, R. Stefan
2015-01-01
The idea of collecting blood on a paper card and subsequently using the dried blood spots (DBS) for diagnostic purposes originated a century ago. Since then, DBS testing for decades has remained predominantly focused on the diagnosis of infectious diseases, especially in resource-limited settings, or the systematic screening of newborns for inherited metabolic disorders, and only recently have a variety of new and innovative DBS applications begun to emerge. For many years, pre-analytical variables were only inappropriately considered in the field of DBS testing and even today, with the exception of newborn screening, the entire pre-analytical phase, which comprises the preparation and processing of DBS for their final analysis, has not been standardized. Given this background, a comprehensive step-by-step protocol, which covers all the essential phases, is proposed, i.e., collection of blood; preparation of blood spots; drying of blood spots; storage and transportation of DBS; elution of DBS; and, finally, analyses of DBS eluates. The effectiveness of this protocol was first evaluated with 1,762 coupled serum/DBS pairs for detecting markers of hepatitis B virus, hepatitis C virus, and human immunodeficiency virus infections on an automated analytical platform. In a second step, the protocol was utilized during a pilot study, which was conducted on active drug users in the German cities of Berlin and Essen. PMID:25867233
Characterizing Contamination and Assessing Exposure, Risk and Resilience
EPA supports its responders' ability to characterize site contamination by developing sampling protocols, sample preparation methods, and analytical methods for chemicals, biotoxins, microbial pathogens, and radiological agents.
Madry, Milena M; Kraemer, Thomas; Baumgartner, Markus R
2018-01-01
Hair analysis has been established as a prevalent tool for retrospective drug monitoring. In this study, different extraction solvents for the determination of drugs of abuse and pharmaceuticals in hair were evaluated for their efficiency. A pool of authentic hair from drug users was used for extraction experiments. Hair was pulverized and extracted in triplicate with seven different solvents in a one- or two-step extraction. Three one- (methanol, acetonitrile, and acetonitrile/water) and four two-step extractions (methanol two-fold, methanol and methanol/acetonitrile/formate buffer, methanol and methanol/formate buffer, and methanol and methanol/hydrochloric acid) were tested under accurately equal experimental conditions. The extracts were directly analyzed by liquid chromatography-tandem mass spectrometry for opiates/opioids, stimulants, ketamine, selected benzodiazepines, antidepressants, antipsychotics, and antihistamines using deuterated internal standards. For most analytes, a two-step extraction with methanol did not significantly improve the yield compared to a one-step extraction with methanol. Extraction with acetonitrile alone was least efficient for most analytes. Extraction yields of acetonitrile/water, methanol and methanol/acetonitrile/formate buffer, and methanol and methanol/formate buffer were significantly higher compared to methanol. Highest efficiencies were obtained by a two-step extraction with methanol and methanol/hydrochloric acid, particularly for morphine, 6-monoacetylmorphine, codeine, 6-acetylcodeine, MDMA, zopiclone, zolpidem, amitriptyline, nortriptyline, citalopram, and doxylamine. For some analytes (e.g., tramadol, fluoxetine, sertraline), all extraction solvents, except for acetonitrile, were comparably efficient. There was no significant correlation between extraction efficiency with an acidic solvent and the pKa or log P of the analyte. However, there was a significant trend relating the extraction efficiency with acetonitrile to the log P of the analyte. The study demonstrates that the choice of extraction solvent has a strong impact on hair analysis outcomes. Therefore, validation protocols should include the evaluation of extraction efficiency of drugs by using authentic rather than spiked hair. Different extraction procedures may contribute to the scatter of quantitative results in inter-laboratory comparisons. Harmonization of extraction protocols is recommended when interpretation is based on the same cut-off levels. Copyright © 2017 Elsevier B.V. All rights reserved.
IFSA: a microfluidic chip-platform for frit-based immunoassay protocols
NASA Astrophysics Data System (ADS)
Hlawatsch, Nadine; Bangert, Michael; Miethe, Peter; Becker, Holger; Gärtner, Claudia
2013-03-01
Point-of-care diagnostics (POC) is one of the key application fields for lab-on-a-chip devices. While in recent years much of the work has concentrated on integrating complex molecular diagnostic assays onto a microfluidic device, there is a need to also put comparatively simple immunoassay-type protocols on a microfluidic platform. In this paper, we present the development of a microfluidic cartridge using an immunofiltration approach. In this method, the sandwich immunoassay takes place in a porous frit on which the antibodies have been immobilized. The device is designed to handle three samples in parallel and up to four analytical targets per sample. In order to meet the critical cost targets for the diagnostic market, the microfluidic chip has been designed and manufactured with high-volume manufacturing technologies in mind. Validation experiments show sensitivities comparable to those of conventional immunofiltration kits.
Numerical approach for unstructured quantum key distribution
Coles, Patrick J.; Metodiev, Eric M.; Lütkenhaus, Norbert
2016-01-01
Quantum key distribution (QKD) allows for communication with security guaranteed by quantum theory. The main theoretical problem in QKD is to calculate the secret key rate for a given protocol. Analytical formulas are known for protocols with symmetries, since symmetry simplifies the analysis. However, experimental imperfections break symmetries, hence the effect of imperfections on key rates is difficult to estimate. Furthermore, it is an interesting question whether (intentionally) asymmetric protocols could outperform symmetric ones. Here we develop a robust numerical approach for calculating the key rate for arbitrary discrete-variable QKD protocols. Ultimately this will allow researchers to study ‘unstructured' protocols, that is, those that lack symmetry. Our approach relies on transforming the key rate calculation to the dual optimization problem, which markedly reduces the number of parameters and hence the calculation time. We illustrate our method by investigating some unstructured protocols for which the key rate was previously unknown. PMID:27198739
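As an example of the analytical formulas that exist for symmetric protocols, the asymptotic key rate of BB84 with one-way post-processing takes the well-known closed form below; it is quoted only to contrast with the unstructured protocols the paper targets numerically, and is not the paper's own result.

```latex
% Shor-Preskill asymptotic key rate for symmetric BB84, with e_x and e_z the
% error rates in the two conjugate bases and h the binary entropy.
\begin{equation}
  r \;\ge\; 1 - h(e_{x}) - h(e_{z}),
  \qquad h(p) = -p\log_{2}p - (1-p)\log_{2}(1-p).
\end{equation}
% Breaking the protocol's symmetry invalidates such formulas, which is where the
% numerical dual-optimization approach takes over.
```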
Mass Spectrometry Based Lipidomics: An Overview of Technological Platforms
Köfeler, Harald C.; Fauland, Alexander; Rechberger, Gerald N.; Trötzmüller, Martin
2012-01-01
One decade after the genomic and the proteomic life science revolution, new 'omics' fields are emerging. The metabolome encompasses the entity of small molecules (most often end products of a catalytic process regulated by genes and proteins), with the lipidome being its fat-soluble subdivision. Within recent years, lipids are more and more regarded not only as energy storage compounds but also as interactive players in various cellular regulation cycles and thus attain rising interest in the bio-medical community. The field of lipidomics is, on one hand, fuelled by analytical technology advances, particularly mass spectrometry and chromatography, but on the other hand new biological questions also drive analytical technology developments. Compared to fairly standardized genomic or proteomic high-throughput protocols, the high degree of molecular heterogeneity adds a special analytical challenge to lipidomic analysis. In this review, we will take a closer look at various mass spectrometric platforms for lipidomic analysis. We will focus on the advantages and limitations of various experimental setups like 'shotgun lipidomics', liquid chromatography-mass spectrometry (LC-MS) and matrix-assisted laser desorption ionization-time of flight (MALDI-TOF) based approaches. We will also examine available software packages for data analysis, which nowadays is in fact the rate limiting step for most 'omics' workflows. PMID:24957366
Campi-Azevedo, Ana Carolina; Peruhype-Magalhães, Vanessa; Coelho-Dos-Reis, Jordana Grazziela; Costa-Pereira, Christiane; Yamamura, Anna Yoshida; Lima, Sheila Maria Barbosa de; Simões, Marisol; Campos, Fernanda Magalhães Freire; de Castro Zacche Tonini, Aline; Lemos, Elenice Moreira; Brum, Ricardo Cristiano; de Noronha, Tatiana Guimarães; Freire, Marcos Silva; Maia, Maria de Lourdes Sousa; Camacho, Luiz Antônio Bastos; Rios, Maria; Chancey, Caren; Romano, Alessandro; Domingues, Carla Magda; Teixeira-Carvalho, Andréa; Martins-Filho, Olindo Assis
2017-09-01
Technological innovations in vaccinology have recently contributed to bring about novel insights for the vaccine-induced immune response. While the current protocols that use peripheral blood samples may provide abundant data, a range of distinct components of whole blood samples are required and the different anticoagulant systems employed may impair some properties of the biological sample and interfere with functional assays. Although the interference of heparin in functional assays for viral neutralizing antibodies such as the functional plaque-reduction neutralization test (PRNT), considered the gold-standard method to assess and monitor the protective immunity induced by the Yellow fever virus (YFV) vaccine, has been well characterized, the development of pre-analytical treatments is still required for the establishment of optimized protocols. The present study intended to optimize and evaluate the performance of pre-analytical treatment of heparin-collected blood samples with ecteola-cellulose (ECT) to provide accurate measurement of anti-YFV neutralizing antibodies, by PRNT. The study was designed in three steps, including: I. Problem statement; II. Pre-analytical steps; III. Analytical steps. Data confirmed the interference of heparin on PRNT reactivity in a dose-responsive fashion. Distinct sets of conditions for ECT pre-treatment were tested to optimize the heparin removal. The optimized protocol was pre-validated to determine the effectiveness of heparin plasma:ECT treatment to restore the PRNT titers as compared to serum samples. The validation and comparative performance was carried out by using a large range of serum vs heparin plasma:ECT 1:2 paired samples obtained from unvaccinated and 17DD-YFV primary vaccinated subjects. Altogether, the findings support the use of heparin plasma:ECT samples for accurate measurement of anti-YFV neutralizing antibodies. Copyright © 2017 Elsevier B.V. All rights reserved.
Smart, Kathleen F; Aggio, Raphael B M; Van Houtte, Jeremy R; Villas-Bôas, Silas G
2010-09-01
This protocol describes an analytical platform for the analysis of intra- and extracellular metabolites of microbial cells (yeast, filamentous fungi and bacteria) using gas chromatography-mass spectrometry (GC-MS). The protocol is subdivided into sampling, sample preparation, chemical derivatization of metabolites, GC-MS analysis and data processing and analysis. This protocol uses two robust quenching methods for microbial cultures, the first of which, cold glycerol-saline quenching, causes reduced leakage of intracellular metabolites, thus allowing a more reliable separation of intra- and extracellular metabolites with simultaneous stopping of cell metabolism. The second, fast filtration, is specifically designed for quenching filamentous micro-organisms. These sampling techniques are combined with an easy sample-preparation procedure and a fast chemical derivatization reaction using methyl chloroformate. This reaction takes place at room temperature, in aqueous medium, and is less prone to matrix effect compared with other derivatizations. This protocol takes an average of 10 d to complete and enables the simultaneous analysis of hundreds of metabolites from the central carbon metabolism (amino and nonamino organic acids, phosphorylated organic acids and fatty acid intermediates) using an in-house MS library and a data analysis pipeline consisting of two free software programs (Automated Mass Deconvolution and Identification System (AMDIS) and R).
Borai, Anwar; Ichihara, Kiyoshi; Al Masaud, Abdulaziz; Tamimi, Waleed; Bahijri, Suhad; Armbuster, David; Bawazeer, Ali; Nawajha, Mustafa; Otaibi, Nawaf; Khalil, Haitham; Kawano, Reo; Kaddam, Ibrahim; Abdelaal, Mohamed
2016-05-01
This study is part of the IFCC global study to derive reference intervals (RIs) for 28 chemistry analytes in Saudis. Healthy individuals (n=826) aged ≥18 years were recruited using the global study protocol. All specimens were measured using an Architect analyzer. RIs were derived by both parametric and non-parametric methods for comparative purposes. The need for secondary exclusion of reference values based on the latent abnormal values exclusion (LAVE) method was examined. The magnitude of variation attributable to gender, age and region was calculated as the standard deviation ratio (SDR). Sources of variation (age, BMI, physical exercise and smoking levels) were investigated using multiple regression analysis. SDRs for gender, age and regional differences were significant for 14, 8 and 2 analytes, respectively. BMI-related changes in test results were noted conspicuously for CRP. For some metabolism-related parameters, the RIs obtained by the non-parametric method were wider than those obtained by the parametric method, and RIs derived using the LAVE method differed significantly from those derived without it. RIs were derived with and without gender partitioning (BMI, drugs and supplements were considered). RIs applicable to Saudis were established for the majority of chemistry analytes, whereas gender, regional and age partitioning of RIs was required for some analytes. The elevated upper limits of metabolic analytes reflect the high prevalence of metabolic syndrome in the Saudi population.
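The parametric versus non-parametric comparison mentioned above boils down to how the central 95% reference range is estimated from the healthy sample; a minimal sketch follows, in which the Box-Cox normalizing transform and the data are illustrative assumptions rather than the IFCC protocol.

```python
# Central 95% reference interval from a healthy reference sample, estimated
# either non-parametrically (percentiles) or via a crude parametric route
# (Box-Cox transform toward normality, then mean +/- 1.96 SD). Illustrative only;
# values must be positive for the Box-Cox branch.
import numpy as np
from scipy.stats import boxcox
from scipy.special import inv_boxcox

def reference_interval(values, method="nonparametric"):
    x = np.asarray(values, dtype=float)
    if method == "nonparametric":              # 2.5th and 97.5th percentiles
        return np.percentile(x, [2.5, 97.5])
    y, lam = boxcox(x)                          # transform toward normality
    lo, hi = y.mean() - 1.96 * y.std(ddof=1), y.mean() + 1.96 * y.std(ddof=1)
    return inv_boxcox(np.array([lo, hi]), lam)  # back-transform the limits

# e.g. glucose_ri = reference_interval(healthy_glucose_values, method="parametric")
```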
Hubert, Ph; Nguyen-Huu, J-J; Boulanger, B; Chapuzet, E; Chiap, P; Cohen, N; Compagnon, P-A; Dewé, W; Feinberg, M; Lallier, M; Laurentie, M; Mercier, N; Muzard, G; Nivet, C; Valat, L
2004-11-15
This paper is the first part of a summary report of a new commission of the Société Française des Sciences et Techniques Pharmaceutiques (SFSTP). The main objective of this commission was the harmonization of approaches for the validation of quantitative analytical procedures. Indeed, the principle of validating these procedures is today widely accepted in all domains of activity where measurements are made. Nevertheless, the simple question of whether an analytical procedure is acceptable for a given application remains incompletely resolved in several cases, despite the various regulations relating to good practices (GLP, GMP, ...) and other documents of a normative character (ISO, ICH, FDA, ...). There are many official documents describing the validation criteria to be tested, but they do not propose any experimental protocol and most often limit themselves to general concepts. For those reasons, two previous SFSTP commissions elaborated validation guides to give concrete help to the industrial scientists in charge of drug development in applying these regulatory recommendations. While these first two guides contributed widely to the use and progress of analytical validation, they nevertheless present weaknesses regarding the conclusions of the statistical tests performed and the decisions to be made with respect to the acceptance limits defined by the use of an analytical procedure. The present paper proposes to revisit the very bases of analytical validation in order to develop a harmonized approach, notably by distinguishing diagnosis rules from decision rules. The decision rule is based on the use of the accuracy profile and the notion of total error, and it allows the validation of an analytical procedure to be simplified while controlling the risk associated with its use. Thanks to this novel validation approach, it is possible to unambiguously demonstrate the fitness for purpose of a new method, as stated in all regulatory documents.
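One common way of writing the accuracy-profile decision rule described above (paraphrased, not the commission's exact notation) is as a tolerance-interval inclusion test at each concentration level:

```latex
% Accuracy-profile decision rule, paraphrased: at each concentration level j the
% beta-expectation tolerance interval built from the estimated relative bias and
% the intermediate-precision standard deviation must lie inside the acceptance
% limits fixed in advance by the intended use of the procedure (e.g. 10%).
\begin{equation}
  \Big[\hat{\delta}_{j} - k_{j}\,\hat{\sigma}_{\mathrm{IP},j}\,,\;
       \hat{\delta}_{j} + k_{j}\,\hat{\sigma}_{\mathrm{IP},j}\Big]
  \subset \big[-\lambda,\; +\lambda\big],
\end{equation}
% so bias and precision are controlled jointly through the total error rather
% than through separate null-hypothesis tests.
```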
Delefortrie, Quentin; Schatt, Patricia; Grimmelprez, Alexandre; Gohy, Patrick; Deltour, Didier; Collard, Geneviève; Vankerkhoven, Patrick
2016-02-01
Although colonoscopy associated with histopathological sampling remains the gold standard in the diagnosis and follow-up of inflammatory bowel disease (IBD), calprotectin is becoming an essential biomarker in gastroenterology. The aim of this work is to compare a newly developed kit (Liaison® Calprotectin - Diasorin®) and its two distinct extraction protocols (weighing and extraction device protocol) with a well-established point-of-care test (Quantum Blue® - Bühlmann-Alere®) in terms of analytical performance and ability to detect relapses amongst a Crohn's population in follow-up. Stool specimens were collected over a six month period and were composed of control and Crohn's patients. Amongst the Crohn's population disease activity (active vs quiescent) was evaluated by gastroenterologists. A significant difference was found between all three procedures in terms of calprotectin measurements (weighing protocol=30.3μg/g (median); stool extraction device protocol=36.9μg/g (median); Quantum Blue® (median)=63; Friedman test, P value=0.05). However, a good correlation was found between both extraction methods coupled with the Liaison® analyzer and with the Quantum Blue® (weighing protocol/extraction device protocol Rs=0.844, P=0.01; Quantum Blue®/extraction device protocol Rs=0.708, P=0.01; Quantum Blue®/weighing protocol, Rs=0.808, P=0.01). Finally, optimal cut-offs (and associated negative predictive values - NPV) for detecting relapses were in accordance with the above results (Quantum Blue® 183.5μg/g and NPV of 100%>extraction device protocol+Liaison® analyzer 124.5μg/g and NPV of 93.5%>weighing protocol+Liaison® analyzer 106.5μg/g and NPV of 95%). Although all three methods correlated well and had relatively good NPVs in terms of detecting relapses amongst a Crohn's population in follow-up, the lack of any international standard leads to different optimal cut-offs for the three procedures. Copyright © 2015 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
Large-scale retrieval for medical image analytics: A comprehensive review.
Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting
2018-01-01
Over the past decades, medical image analytics has been greatly facilitated by the explosion of digital imaging techniques, whereby huge amounts of medical images have been produced with ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of tackling such huge amounts of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to the major processes in the pipeline, including feature representation, feature indexing, and searching. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, with a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
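To make the retrieval step of the pipeline concrete, the sketch below answers a query by nearest-neighbour search over precomputed image feature vectors; feature extraction and indexing structures (hashing, trees) are left out, and the dimensions are illustrative.

```python
# Minimal retrieval step: images reduced to feature vectors, queries answered by
# cosine-similarity nearest-neighbour search over the database.
import numpy as np

def retrieve(query_vec, index_vecs, k=5):
    """Return indices of the k most similar database images by cosine similarity."""
    q = query_vec / np.linalg.norm(query_vec)
    db = index_vecs / np.linalg.norm(index_vecs, axis=1, keepdims=True)
    sims = db @ q
    return np.argsort(-sims)[:k]

rng = np.random.default_rng(0)
index = rng.normal(size=(10_000, 128))   # 10k images, 128-D features (illustrative)
print(retrieve(rng.normal(size=128), index, k=3))
```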
A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.
Płotka-Wasylka, J
2018-05-01
A new means for assessing analytical protocols with respect to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) or the Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol composed of five pentagrams is used to evaluate and quantify the environmental impact of each step of an analytical methodology, coloured from green through yellow to red to depict low, medium and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples, and in polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible perspective to the user/reader but also offers exhaustive information on the evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.
Lippi, Giuseppe; Montagnana, Martina; Giavarina, Davide
2006-01-01
Owing to remarkable advances in automation, laboratory technology and informatics, the pre-analytical phase has become the major source of variability in laboratory testing. The present survey investigated the development of several pre-analytical processes within a representative cohort of Italian clinical laboratories. A seven-point questionnaire was designed to investigate the following issues: 1a) the mean outpatient waiting time before check-in and 1b) the mean time from check-in to sample collection; 2) the mean time from sample collection to analysis; 3) the type of specimen collected for clinical chemistry testing; 4) the degree of pre-analytical automation; 5a) the number of samples shipped to other laboratories and 5b) the availability of standardised protocols for transportation; 6) the conditions for specimen storage; and 7) the availability and type of guidelines for management of unsuitable specimens. The questionnaire was administered to 150 laboratory specialists attending the SIMEL (Italian Society of Laboratory Medicine) National Meeting in June 2006. 107 questionnaires (71.3%) were returned. Data analysis revealed a high degree of variability among laboratories for the time required for check-in, outpatient sampling, sample transportation to the referral laboratory and analysis upon the arrival. Only 31% of laboratories have automated some pre-analytical steps. Of the 87% of laboratories that ship specimens to other facilities without sample preparation, 19% have no standardised protocol for transportation. For conventional clinical chemistry testing, 74% of the laboratories use serum evacuated tubes (59% with and 15% without serum separator), whereas the remaining 26% use lithium-heparin evacuated tubes (11% with and 15% without plasma separator). The storage period and conditions for rerun/retest vary widely. Only 63% of laboratories have a codified procedure for the management of unsuitable specimens, which are recognised by visual inspection (69%) or automatic detection (29%). Only 56% of the laboratories have standardised procedures for the management of unsuitable specimens, which vary widely on a local basis. The survey highlights broad heterogeneity in several pre-analytical processes among Italian laboratories. The lack of reliable guidelines encompassing evidence-based practice is a major problem for the standardisation of this crucial part of the testing process and represents a major challenge for laboratory medicine in the 2000s.
How do gut feelings feature in tutorial dialogues on diagnostic reasoning in GP traineeship?
Stolper, C F; Van de Wiel, M W J; Hendriks, R H M; Van Royen, P; Van Bokhoven, M A; Van der Weijden, T; Dinant, G J
2015-05-01
Diagnostic reasoning is considered to be based on the interaction between analytical and non-analytical cognitive processes. Gut feelings, a specific form of non-analytical reasoning, play a substantial role in diagnostic reasoning by general practitioners (GPs) and may activate analytical reasoning. In GP traineeships in the Netherlands, trainees mostly see patients alone but regularly consult with their supervisors to discuss patients and problems, receive feedback, and improve their competencies. In the present study, we examined the discussions of supervisors and their trainees about diagnostic reasoning in these so-called tutorial dialogues and how gut feelings feature in these discussions. 17 tutorial dialogues focussing on diagnostic reasoning were video-recorded and transcribed and the protocols were analysed using a detailed bottom-up and iterative content analysis and coding procedure. The dialogues were segmented into quotes. Each quote received a content code and a participant code. The number of words per code was used as a unit of analysis to quantitatively compare the contributions to the dialogues made by supervisors and trainees, and the attention given to different topics. The dialogues were usually analytical reflections on a trainee's diagnostic reasoning. A hypothetico-deductive strategy was often used, by listing differential diagnoses and discussing what information guided the reasoning process and might confirm or exclude provisional hypotheses. Gut feelings were discussed in seven dialogues. They were used as a tool in diagnostic reasoning, inducing analytical reflection, sometimes on the entire diagnostic reasoning process. The emphasis in these tutorial dialogues was on analytical components of diagnostic reasoning. Discussing gut feelings in tutorial dialogues seems to be a good educational method to familiarize trainees with non-analytical reasoning. Supervisors need specialised knowledge about these aspects of diagnostic reasoning and how to deal with them in medical education.
Escuder-Gilabert, L; Ruiz-Roch, D; Villanueva-Camañas, R M; Medina-Hernández, M J; Sagrado, S
2004-03-12
In the present paper, the simultaneous quantification of two analytes showing strongly overlapped chromatographic peaks (alpha = 1.02) is studied, under the assumption that both the available equipment and the training of the laboratory staff are basic. A pharmaceutical preparation (Mutabase) containing two drugs of similar physicochemical properties (amitriptyline and perphenazine) is selected as a case study. The assays are carried out under realistic working conditions (i.e. routine testing laboratories). Uncertainty considerations are introduced in the study. A partial least squares model is applied directly to the chromatographic data (with no previous signal transformation) to perform quality control of the pharmaceutical formulation. Under the adequate protocol, the relative error in prediction of the analytes is within the tolerances found in the pharmacopeia (10%). For spiked samples simulating formulation mistakes, the errors found have the same magnitude and sign as those deliberately introduced.
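The quantification step described above, a partial least squares (PLS) model applied directly to the raw chromatographic signal of two strongly overlapped peaks, can be illustrated with a generic sketch. This is not the authors' implementation: the simulated Gaussian peaks, concentration ranges and the use of scikit-learn's PLSRegression are assumptions made purely for illustration.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 300)                      # retention-time axis (arbitrary units)

def chromatogram(c1, c2):
    """Two strongly overlapped Gaussian peaks plus noise (illustrative only)."""
    peak1 = c1 * np.exp(-((t - 5.0) ** 2) / 0.5)
    peak2 = c2 * np.exp(-((t - 5.3) ** 2) / 0.5)
    return peak1 + peak2 + rng.normal(0, 0.01, t.size)

# Calibration set: chromatograms with known concentrations of the two analytes.
conc = rng.uniform(0.5, 1.5, size=(20, 2))
X = np.array([chromatogram(c1, c2) for c1, c2 in conc])

pls = PLSRegression(n_components=3)              # small number of latent variables
pls.fit(X, conc)

# Predict an "unknown" formulation from its raw chromatogram (no peak integration).
unknown = chromatogram(1.0, 0.8)
print(pls.predict(unknown.reshape(1, -1)))       # expected to be close to [1.0, 0.8]
```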
Urinary cell-free DNA is a versatile analyte for monitoring infections of the urinary tract.
Burnham, Philip; Dadhania, Darshana; Heyang, Michael; Chen, Fanny; Westblade, Lars F; Suthanthiran, Manikkam; Lee, John Richard; De Vlaminck, Iwijn
2018-06-20
Urinary tract infections are one of the most common infections in humans. Here we tested the utility of urinary cell-free DNA (cfDNA) to comprehensively monitor host and pathogen dynamics in bacterial and viral urinary tract infections. We isolated cfDNA from 141 urine samples from a cohort of 82 kidney transplant recipients and performed next-generation sequencing. We found that urinary cfDNA is highly informative about bacterial and viral composition of the microbiome, antimicrobial susceptibility, bacterial growth dynamics, kidney allograft injury, and host response to infection. These different layers of information are accessible from a single assay and individually agree with corresponding clinical tests based on quantitative PCR, conventional bacterial culture, and urinalysis. In addition, cfDNA reveals the frequent occurrence of pathologies that remain undiagnosed with conventional diagnostic protocols. Our work identifies urinary cfDNA as a highly versatile analyte to monitor infections of the urinary tract.
Voss, Erica A; Makadia, Rupa; Matcho, Amy; Ma, Qianli; Knoll, Chris; Schuemie, Martijn; DeFalco, Frank J; Londhe, Ajit; Zhu, Vivienne; Ryan, Patrick B
2015-05-01
To evaluate the utility of applying the Observational Medical Outcomes Partnership (OMOP) Common Data Model (CDM) across multiple observational databases within an organization and to apply standardized analytics tools for conducting observational research. Six deidentified patient-level datasets were transformed to the OMOP CDM. We evaluated the extent of information loss that occurred through the standardization process. We developed a standardized analytic tool to replicate the cohort construction process from a published epidemiology protocol and applied the analysis to all 6 databases to assess time-to-execution and comparability of results. Transformation to the CDM resulted in minimal information loss across all 6 databases. Patients and observations were excluded only because of data quality issues identified in the source system; 96% to 99% of condition records and 90% to 99% of drug records were successfully mapped into the CDM using the standard vocabulary. The full cohort replication and descriptive baseline summary was executed for 2 cohorts in 6 databases in less than 1 hour. The standardization process improved data quality, increased efficiency, and facilitated cross-database comparisons to support a more systematic approach to observational research. Comparisons across data sources showed consistency in the impact of the protocol's inclusion criteria and identified differences in patient characteristics and coding practices across databases. Standardizing data structure (through a CDM), content (through a standard vocabulary with source code mappings), and analytics can enable an institution to apply a network-based approach to observational research across multiple, disparate observational health databases. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.
CSMA/RN: A universal protocol for gigabit networks
NASA Technical Reports Server (NTRS)
Foudriat, E. C.; Maly, Kurt J.; Overstreet, C. Michael; Khanna, S.; Paterra, Frank
1990-01-01
Networks must provide intelligent access for nodes to share the communications resources. In the range of 100 Mbps to 1 Gbps, the demand access class of protocols has been studied extensively. Many use some form of slot or reservation system, and many use the concept of attempt and defer to determine the presence or absence of incoming information. The random access class of protocols, like shared channel systems (Ethernet), also uses the concept of attempt and defer in the form of carrier sensing to alleviate the damaging effects of collisions. In CSMA/CD, the sensing of interference is on a global basis. All of the systems discussed above have one aspect in common: they examine activity on the network, either locally or globally, and react with some form of attempt-and-X mechanism. Of the attempt-and-X mechanisms discussed, one is obviously missing: attempt and truncate. Attempt and truncate was studied in a ring configuration called the Carrier Sensed Multiple Access Ring Network (CSMA/RN). The system features of CSMA/RN are described, including the node operations for inserting and removing messages and for handling integrated traffic. Performance and operational features, based on analytical and simulation studies, indicate that CSMA/RN is a useful and adaptable protocol over a wide range of network conditions. Finally, the research and development activities necessary to demonstrate and realize the potential of CSMA/RN as a universal, gigabit network protocol are outlined.
Role of optimization in the human dynamics of task execution
NASA Astrophysics Data System (ADS)
Cajueiro, Daniel O.; Maldonado, Wilfredo L.
2008-03-01
In order to explain the empirical evidence that the dynamics of human activity may not be well modeled by Poisson processes, a model based on queuing processes was built in the literature [A. L. Barabasi, Nature (London) 435, 207 (2005)]. The main assumption behind that model is that people execute their tasks based on a protocol that first executes the high priority item. In this context, the purpose of this paper is to analyze the validity of that hypothesis assuming that people are rational agents that make their decisions in order to minimize the cost of keeping nonexecuted tasks on the list. Therefore, we build and analytically solve a dynamic programming model with two priority types of tasks and show that the validity of this hypothesis depends strongly on the structure of the instantaneous costs that a person has to face if a given task is kept on the list for more than one period. Moreover, one interesting finding is that in one of the situations the protocol used to execute the tasks generates complex one-dimensional dynamics.
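The highest-priority-first protocol analysed above is the Barabási queuing model, in which the highest-priority task on a short list is executed with probability p and a randomly chosen one otherwise. The following minimal simulation of that protocol uses illustrative parameter choices (list length, p, number of steps) that are not taken from the paper.

```python
import random
from collections import Counter

def simulate(n_steps=200_000, list_length=2, p=0.9, seed=1):
    """Barabasi-style priority queue: with probability p the highest-priority
    task on the list is executed, otherwise a randomly chosen one.
    Returns the waiting time (in steps) of every executed task."""
    rng = random.Random(seed)
    tasks = [(rng.random(), 0) for _ in range(list_length)]   # (priority, arrival step)
    waits = []
    for step in range(1, n_steps + 1):
        if rng.random() < p:
            idx = max(range(list_length), key=lambda i: tasks[i][0])
        else:
            idx = rng.randrange(list_length)
        _, arrived = tasks[idx]
        waits.append(step - arrived)
        tasks[idx] = (rng.random(), step)                     # a new task enters the list
    return waits

hist = Counter(simulate())
for w in sorted(hist)[:10]:
    # Heavy-tailed waiting times emerge as p approaches 1 (strict priority protocol).
    print(w, hist[w])
```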
Monowar, Muhammad Mostafa; Hassan, Mohammad Mehedi; Bajaber, Fuad; Al-Hussein, Musaed; Alamri, Atif
2012-01-01
The emergence of heterogeneous applications with diverse requirements for resource-constrained Wireless Body Area Networks (WBANs) poses significant challenges for provisioning Quality of Service (QoS) with multi-constraints (delay and reliability) while preserving energy efficiency. To address such challenges, this paper proposes McMAC, a MAC protocol with multi-constrained QoS provisioning for diverse traffic classes in WBANs. McMAC classifies traffic based on their multi-constrained QoS demands and introduces a novel superframe structure based on the “transmit-whenever-appropriate” principle, which allows diverse periods for diverse traffic classes according to their respective QoS requirements. Furthermore, a novel emergency packet handling mechanism is proposed to ensure packet delivery with the least possible delay and the highest reliability. McMAC is also modeled analytically, and extensive simulations were performed to evaluate its performance. The results reveal that McMAC achieves the desired delay and reliability guarantee according to the requirements of a particular traffic class while achieving energy efficiency. PMID:23202224
Márta, Zoltán; Bobály, Balázs; Fekete, Jenő; Magda, Balázs; Imre, Tímea; Mészáros, Katalin Viola; Bálint, Mária; Szabó, Pál Tamás
2018-02-20
Ultratrace analysis of sample components requires excellent analytical performance in terms of limits of quantitation (LOQ). Micro UHPLC coupled to sensitive tandem mass spectrometry provides a state-of-the-art solution for such analytical problems. Using on-line SPE with column switching on a micro UHPLC-MS/MS system made it possible to decrease the LOQ without any complex sample preparation protocol. The presented method is capable of reaching satisfactorily low LOQ values for the analysis of thirteen different steroid molecules from human plasma without the most commonly used off-line SPE or compound derivatization. Steroids were determined using two simple sample preparation methods, targeting lower and higher plasma steroid concentrations. In the first method, higher analyte concentrations were directly determined after protein precipitation with methanol. The organic phase obtained from the precipitation was diluted with water and directly injected into the LC-MS system. In the second method, low steroid levels were determined by concentrating the organic phase after steroid extraction. In this case, analytes were extracted with ethyl acetate and reconstituted in 90/10 water/acetonitrile following evaporation to dryness. This step provided much lower LOQs, outperforming previously published values. The method has been validated and subsequently applied to clinical laboratory measurement. Copyright © 2017 Elsevier B.V. All rights reserved.
Superhydrophobic analyte concentration utilizing colloid-pillar array SERS substrates.
Wallace, Ryan A; Charlton, Jennifer J; Kirchner, Teresa B; Lavrik, Nickolay V; Datskos, Panos G; Sepaniak, Michael J
2014-12-02
The ability to detect a few molecules present in a large sample is of great interest for the detection of trace components in both medicinal and environmental samples. Surface enhanced Raman spectroscopy (SERS) is a technique that can be utilized to detect molecules at very low absolute numbers. However, detection at trace concentration levels in real samples requires properly designed delivery and detection systems. The following work involves superhydrophobic surfaces that have as a framework deterministic or stochastic silicon pillar arrays formed by lithographic or metal dewetting protocols, respectively. In order to generate the necessary plasmonic substrate for SERS detection, a simple and flow-stable Ag colloid was added to the functionalized pillar array system via soaking. Native pillars and pillars with hydrophobic modification are used. The pillars provide a means to concentrate analyte via superhydrophobic droplet evaporation effects. A ≥100-fold concentration of analyte was estimated, with a limit of detection of 2.9 × 10⁻¹² M for mitoxantrone dihydrochloride. Additionally, analytes were delivered to the surface via a multiplex approach in order to demonstrate an ability to control droplet size and placement for scaled-up uses in real world applications. Finally, a concentration process involving transport and sequestration based on surface-treatment-selective wicking is demonstrated.
Quantification of 5'-deoxy-5'-methylthioadenosine in heat-treated natural rubber latex serum.
Pitakpornpreecha, Thanawat; Plubrukarn, Anuchit; Wititsuwannakul, Rapepun
2012-01-01
5'-Deoxy-5'-methylthioadenosine (MTA) is one of the biologically active components found in natural rubber latex (NRL) serum, a common waste product from rubber plantations. In this study the contents of MTA in heat-treated NRL serum were measured in order to assess the potential of the serum as an alternative source of MTA. The objectives were to devise an HPLC/UV-based quantitative analytical protocol for the determination of MTA, and to determine the effect of heat treatment on the content of MTA in NRL serum from various sources. An HPLC/UV-based determination of MTA using an acidic eluant was devised and validated. In the heat treatment, the effect of refluxing times on MTA liberation was evaluated. The quantification protocol was validated with satisfactory linearity, limits of detection and quantitation, precision for peak areas, and recovery percentages from intra- and inter-day operations. The amounts of MTA in the NRL sera from various sources increased with heat treatment to yield 5-12 μg MTA/mL of serum. The devised protocol was found to be readily applicable to the routine determination of MTA in NRL serum. The effect of heat treatment on the content of MTA also indicated another possible use for NRL serum, normally discarded in vast amounts by the rubber industry, as an alternative source of MTA. Copyright © 2011 John Wiley & Sons, Ltd.
Pazo, Daniel Y; Moliere, Fallon; Sampson, Maureen M; Reese, Christopher M; Agnew-Heard, Kimberly A; Walters, Matthew J; Holman, Matthew R; Blount, Benjamin C; Watson, Clifford H; Chambers, David M
2016-09-01
A significant portion of the increased risk of cancer and respiratory disease from exposure to cigarette smoke is attributed to volatile organic compounds (VOCs). In this study, 21 VOCs were quantified in mainstream cigarette smoke from 50 U.S. domestic brand varieties that included high market share brands and 2 Kentucky research cigarettes (3R4F and 1R5F). Mainstream smoke was generated under ISO 3308 and Canadian Intense (CI) smoking protocols with linear smoking machines, with gas sampling bag collection followed by solid phase microextraction/gas chromatography/mass spectrometry (SPME/GC/MS) analysis. For both protocols, mainstream smoke VOC amounts among the different brand varieties were strongly correlated for the majority of the analytes. Overall, Pearson correlation (r) ranged from 0.68 to 0.99 for ISO and 0.36 to 0.95 for CI. However, monoaromatic compounds were found to increase disproportionately compared to unsaturated, nitro, and carbonyl compounds under the CI smoking protocol, where filter ventilation is blocked. Overall, machine-generated "vapor phase" amounts (µg/cigarette) are primarily attributed to smoking protocol (e.g., blocking of vent holes, puff volume, and puff duration) and filter ventilation. A possible cause for the disproportionate increase in monoaromatic compounds could be increased pyrolysis under the low oxygen conditions associated with the CI protocol. This is the most comprehensive assessment of volatile organic compounds (VOCs) in cigarette smoke to date, encompassing 21 toxic VOCs, 50 different cigarette brand varieties, and 2 different machine smoking protocols (ISO and CI). For most analytes, relative proportions remain consistent among U.S. cigarette brand varieties regardless of smoking protocol; however, the CI smoking protocol did cause up to a factor of 6 increase in the proportion of monoaromatic compounds. This study serves as a basis to assess VOC exposure, as cigarette smoke is a principal source of overall population-level VOC exposure in the United States. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco 2016. This work is written by (a) US Government employee(s) and is in the public domain in the US.
Gionfriddo, Emanuela; Naccarato, Attilio; Sindona, Giovanni; Tagarelli, Antonio
2014-07-04
In this work, the capabilities of solid phase microextraction were exploited in a fully optimized SPME-GC-QqQ-MS analytical approach for hydrazine assay. A rapid and easy method was obtained by a simple derivatization reaction with propyl chloroformate and pyridine carried out directly in water samples, followed by automated SPME analysis in the same vial without further sample handling. The affinity of the different derivatized compounds towards five commercially available SPME coatings was evaluated in order to achieve the best extraction efficiency. GC analyses were carried out using a GC-QqQ-MS instrument in selected reaction monitoring (SRM) acquisition mode, which allowed high specificity to be achieved by selecting appropriate precursor-product ion pairs, improving analyte identification. The multivariate approach of experimental design was crucial in order to optimize the derivatization reaction, the SPME process and the tandem mass spectrometry parameters. Accuracy of the proposed protocol, tested at 60, 200 and 800 ng L⁻¹, provided satisfactory values (114.2%, 83.6% and 98.6%, respectively), whereas precision (RSD%) at the same concentration levels was 10.9%, 7.9% and 7.7%, respectively. A limit of detection of 4.4 ng L⁻¹ and a limit of quantification of 8.3 ng L⁻¹ were obtained. The reliable application of the proposed protocol to real drinking water samples confirmed its capability to be used as an analytical tool for routine analyses. Copyright © 2014 Elsevier B.V. All rights reserved.
de Moraes, F M; Espósito, D L A; Klein, T M; da Fonseca, B A L
2018-01-01
Clinical manifestations of Zika, dengue, and chikungunya virus infections are very similar, making it difficult to reach a diagnosis based only on clinical grounds. In addition, there is intense cross-reactivity between antibodies directed against Zika virus and other flaviviruses, and an accurate Zika diagnosis is best achieved by real-time RT-PCR. However, some real-time RT-PCR assays show better performance than others. To reach the best possible Zika diagnosis, the analytical sensitivity of several probe-based real-time RT-PCR assays amplifying Zika virus RNA was evaluated in spiked and clinical samples. We evaluated previously published primers and probes for the detection of Zika virus and tested their sensitivity by real-time RT-PCR using spiked serum and patient samples. When tested against spiked samples, the previously described primers showed different sensitivities, with very similar results when patient samples (serum and urine) were analyzed. The real-time RT-PCR assay designed to amplify Zika virus NS1 showed the best analytical sensitivity for all samples.
Extending the Kerberos Protocol for Distributed Data as a Service
2012-09-20
exported as a UIMA [11] PEAR file for deployment to IBM Content Analytics (ICA). A UIMA PEAR file is a deployable text analytics "pipeline" (analogous... to a web application packaged in a WAR file). ICA is a text analysis and search application that supports UIMA. The key entities targeted by NLP rules... workbench. [Online]. Available: https://www.ibm.com/developerworks/community/alphaworks/lrw/ [11] Apache UIMA. [Online]. Available: http
Method and platform standardization in MRM-based quantitative plasma proteomics.
Percy, Andrew J; Chambers, Andrew G; Yang, Juncong; Jackson, Angela M; Domanski, Dominik; Burkhart, Julia; Sickmann, Albert; Borchers, Christoph H
2013-12-16
There exists a growing demand in the proteomics community to standardize experimental methods and liquid chromatography-mass spectrometry (LC/MS) platforms in order to enable the acquisition of more precise and accurate quantitative data. This necessity is heightened by the evolving trend of verifying and validating candidate disease biomarkers in complex biofluids, such as blood plasma, through targeted multiple reaction monitoring (MRM)-based approaches with stable isotope-labeled standards (SIS). Considering the lack of performance standards for quantitative plasma proteomics, we previously developed two reference kits to evaluate the MRM with SIS peptide approach using undepleted and non-enriched human plasma. The first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). Here, these kits have been refined for practical use and then evaluated through intra- and inter-laboratory testing on 6 common LC/MS platforms. For an identical panel of 22 plasma proteins, similar concentrations were determined, regardless of the kit, instrument platform, and laboratory of analysis. These results demonstrate the value of the kit and reinforce the utility of standardized methods and protocols. The proteomics community needs standardized experimental protocols and quality control methods in order to improve the reproducibility of MS-based quantitative data. This need is heightened by the evolving trend for MRM-based validation of proposed disease biomarkers in complex biofluids such as blood plasma. We have developed two kits to assist in the inter- and intra-laboratory quality control of MRM experiments: the first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). In this paper, we report the use of these kits in intra- and inter-laboratory testing on 6 common LC/MS platforms. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. © 2013.
Deterministic generation of remote entanglement with active quantum feedback
Martin, Leigh; Motzoi, Felix; Li, Hanhan; ...
2015-12-10
We develop and study protocols for deterministic remote entanglement generation using quantum feedback, without relying on an entangling Hamiltonian. In order to formulate the most effective experimentally feasible protocol, we introduce the notion of average-sense locally optimal feedback protocols, which do not require real-time quantum state estimation, a difficult component of real-time quantum feedback control. We use this notion of optimality to construct two protocols that can deterministically create maximal entanglement: a semiclassical feedback protocol for low-efficiency measurements and a quantum feedback protocol for high-efficiency measurements. The latter reduces to direct feedback in the continuous-time limit, whose dynamics can be modeled by a Wiseman-Milburn feedback master equation, which yields an analytic solution in the limit of unit measurement efficiency. Our formalism can smoothly interpolate between continuous-time and discrete-time descriptions of feedback dynamics and we exploit this feature to derive a superior hybrid protocol for arbitrary nonunit measurement efficiency that switches between quantum and semiclassical protocols. Lastly, we show using simulations incorporating experimental imperfections that deterministic entanglement of remote superconducting qubits may be achieved with current technology using the continuous-time feedback protocol alone.
Entanglement distillation protocols and number theory
NASA Astrophysics Data System (ADS)
Bombin, H.; Martin-Delgado, M. A.
2005-09-01
We show that the analysis of entanglement distillation protocols for qudits of arbitrary dimension D benefits from applying basic concepts from number theory, since the set Z_D^n associated with Bell diagonal states is a module rather than a vector space. We find that a partition of Z_D^n into divisor classes characterizes the invariant properties of mixed Bell diagonal states under local permutations. We construct a very general class of recursion protocols by means of unitary operations implementing these local permutations. We study these distillation protocols depending on whether we use twirling operations in the intermediate steps or not, and we study them both analytically and numerically with Monte Carlo methods. In the absence of twirling operations, we construct extensions of the quantum privacy algorithms valid for secure communications with qudits of any dimension D. When D is a prime number, we show that distillation protocols are optimal both qualitatively and quantitatively.
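The "divisor classes" of the module Z_D^n mentioned above can be illustrated with a small enumeration. The sketch below groups tuples by the greatest common divisor of their entries with D; reading divisor classes as gcd classes is an assumption made for illustration and is not necessarily the paper's exact construction.

```python
from math import gcd
from itertools import product
from collections import defaultdict

def divisor_classes(D, n):
    """Group the elements of Z_D^n by gcd(x_1, ..., x_n, D).
    This gcd-based grouping is one natural reading of 'divisor classes'
    for the module Z_D^n (illustrative assumption only)."""
    classes = defaultdict(list)
    for x in product(range(D), repeat=n):
        d = D
        for xi in x:
            d = gcd(d, xi)
        classes[d].append(x)
    return classes

# Example: D = 4 (composite, so Z_4^2 is a module, not a vector space), n = 2.
for d, members in sorted(divisor_classes(4, 2).items()):
    print(f"gcd = {d}: {len(members)} elements")
```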
Benitex, Yulia; McNaney, Colleen A; Luchetti, David; Schaeffer, Eric; Olah, Timothy V; Morgan, Daniel G; Drexler, Dieter M
2013-08-30
Research on disorders of the central nervous system (CNS) has shown that an imbalance in the levels of specific endogenous neurotransmitters may underlie certain CNS diseases. These alterations in neurotransmitter levels may provide insight into pathophysiology, but can also serve as disease and pharmacodynamic biomarkers. To measure these potential biomarkers in vivo, the relevant sample matrix is cerebrospinal fluid (CSF), which is in equilibrium with the brain's interstitial fluid and circulates through the ventricular system of the brain and spinal cord. Accurate analysis of these potential biomarkers can be challenging due to low CSF sample volume, low analyte levels, and potential interferences from other endogenous compounds. A protocol has been established for effective method development of bioanalytical assays for endogenous compounds in CSF. Database searches and standard-addition experiments are employed to qualify sample preparation and specificity of the detection thus evaluating accuracy and precision. This protocol was applied to the study of the histaminergic neurotransmitter system and the analysis of histamine and its metabolite 1-methylhistamine in rat CSF. The protocol resulted in a specific and sensitive novel method utilizing pre-column derivatization ultra high performance liquid chromatography/tandem mass spectrometry (UHPLC/MS/MS), which is also capable of separating an endogenous interfering compound, identified as taurine, from the analytes of interest. Copyright © 2013 John Wiley & Sons, Ltd.
Method 1200: Analytical Protocol for Non-Typhoidal Salmonella in Drinking Water and Surface Water
Method 1200 is used for identification, confirmation and quantitation of non-typhoidal Salmonella in water samples, using selective and non-selective media followed by biochemical and serological confirmation.
Kumar, B Vinodh; Mohan, Thuthi
2018-01-01
Six Sigma is one of the most popular quality management system tools employed for process improvement. Six Sigma methods are usually applied when the outcome of a process can be measured. This study was done to assess the performance of individual biochemical parameters on a sigma scale by calculating the sigma metrics for individual parameters, and to follow the Westgard guidelines for the appropriate Westgard rules and levels of internal quality control (IQC) that need to be processed to improve target analyte performance based on the sigma metrics. This is a retrospective study, and the data required for the study were extracted between July 2015 and June 2016 from a Secondary Care Government Hospital, Chennai. The data used were the IQC coefficient of variation percentage (CV%) and the External Quality Assurance Scheme (EQAS) bias% for 16 biochemical parameters. For the level 1 IQC, four analytes (alkaline phosphatase, magnesium, triglyceride, and high-density lipoprotein-cholesterol) showed an ideal performance of ≥6 sigma and five analytes (urea, total bilirubin, albumin, cholesterol, and potassium) showed an average performance of <3 sigma; for the level 2 IQC, the same four analytes as at level 1 showed a performance of ≥6 sigma, and four analytes (urea, albumin, cholesterol, and potassium) showed an average performance of <3 sigma. For all analytes below 6 sigma, the quality goal index (QGI) was <0.8, indicating that the area requiring improvement was imprecision, except for cholesterol, whose QGI >1.2 indicated inaccuracy. This study shows that sigma metrics are a good quality tool to assess the analytical performance of a clinical chemistry laboratory. Sigma metric analysis thus provides a benchmark for the laboratory to design a protocol for IQC, address poor assay performance, and assess the efficiency of existing laboratory processes.
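The sigma metric and quality goal index referred to above can be computed from the reported CV%, bias% and a total allowable error (TEa). The formulas in the sketch below are the commonly used Westgard-style definitions, and the TEa values and example inputs are illustrative assumptions, not the study's data.

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Commonly used definition: sigma = (TEa - |bias|) / CV, all in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

def quality_goal_index(bias_pct, cv_pct):
    """QGI = |bias| / (1.5 * CV); <0.8 suggests imprecision, >1.2 inaccuracy,
    and 0.8-1.2 both (the interpretation used in the study above)."""
    return abs(bias_pct) / (1.5 * cv_pct)

# Illustrative numbers only (not taken from the paper).
for analyte, tea, bias, cv in [("cholesterol", 9.0, 5.0, 2.5),
                               ("alkaline phosphatase", 30.0, 3.0, 4.0)]:
    s = sigma_metric(tea, bias, cv)
    q = quality_goal_index(bias, cv)
    print(f"{analyte:22s} sigma = {s:4.1f}  QGI = {q:4.2f}")
```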
Bogdanovska-Todorovska, Magdalena; Petrushevska, Gordana; Janevska, Vesna; Spasevska, Liljana; Kostadinova-Kunovska, Slavica
2018-05-20
Accurate assessment of human epidermal growth factor receptor 2 (HER-2) is crucial in selecting patients for targeted therapy. Commonly used methods for HER-2 testing are immunohistochemistry (IHC) and fluorescence in situ hybridization (FISH). Here we presented the implementation, optimization and standardization of two FISH protocols using breast cancer samples and assessed the impact of pre-analytical and analytical factors on HER-2 testing. Formalin fixed paraffin embedded (FFPE) tissue samples from 70 breast cancer patients were tested for HER-2 using PathVysion™ HER-2 DNA Probe Kit and two different paraffin pretreatment kits, Vysis/Abbott Paraffin Pretreatment Reagent Kit (40 samples) and DAKO Histology FISH Accessory Kit (30 samples). The concordance between FISH and IHC results was determined. Pre-analytical and analytical factors (i.e., fixation, baking, digestion, and post-hybridization washing) affected the efficiency and quality of hybridization. The overall hybridization success in our study was 98.6% (69/70); the failure rate was 1.4%. The DAKO pretreatment kit was more time-efficient and resulted in more uniform signals that were easier to interpret, compared to the Vysis/Abbott kit. The overall concordance between IHC and FISH was 84.06%, kappa coefficient 0.5976 (p < 0.0001). The greatest discordance (82%) between IHC and FISH was observed in IHC 2+ group. A standardized FISH protocol for HER-2 assessment, with high hybridization efficiency, is necessary due to variability in tissue processing and individual tissue characteristics. Differences in the pre-analytical and analytical steps can affect the hybridization quality and efficiency. The use of DAKO pretreatment kit is time-saving and cost-effective.
Experimental control in software reliability certification
NASA Technical Reports Server (NTRS)
Trammell, Carmen J.; Poore, Jesse H.
1994-01-01
There is growing interest in software 'certification', i.e., confirmation that software has performed satisfactorily under a defined certification protocol. Regulatory agencies, customers, and prospective reusers all want assurance that a defined product standard has been met. In other industries, products are typically certified under protocols in which random samples of the product are drawn, tests characteristic of operational use are applied, analytical or statistical inferences are made, and products meeting a standard are 'certified' as fit for use. A warranty statement is often issued upon satisfactory completion of a certification protocol. This paper outlines specific engineering practices that must be used to preserve the validity of the statistical certification testing protocol. The assumptions associated with a statistical experiment are given, and their implications for statistical testing of software are described.
Bruno, C; Patin, F; Bocca, C; Nadal-Desbarats, L; Bonnier, F; Reynier, P; Emond, P; Vourc'h, P; Joseph-Delafont, K; Corcia, P; Andres, C R; Blasco, H
2018-01-30
Metabolomics is an emerging science based on diverse high-throughput methods that are rapidly evolving to improve the metabolic coverage of biological fluids and tissues. Technical progress has led researchers to combine several analytical methods without reporting the impact of such a strategy on metabolic coverage. The objective of our study was to develop and validate several analytical techniques (mass spectrometry coupled to gas or liquid chromatography, and nuclear magnetic resonance) for the metabolomic analysis of small muscle samples and to evaluate the impact of combining methods on the exhaustiveness of metabolite coverage. We evaluated the muscle metabolome from the same pool of mouse muscle samples after 2 metabolite extraction protocols. Four analytical methods were used: targeted flow injection analysis coupled with mass spectrometry (FIA-MS/MS), gas chromatography coupled with mass spectrometry (GC-MS), liquid chromatography coupled with high-resolution mass spectrometry (LC-HRMS), and nuclear magnetic resonance (NMR) analysis. We evaluated the global variability of each compound, i.e., analytical variability (from quality controls) and extraction variability (from muscle extracts). We determined the best extraction method, and we report the common and distinct metabolites identified, based on the number and identity of the compounds detected with low analytical variability (coefficient of variation <30%) for each method. Finally, we assessed the coverage of muscle metabolic pathways obtained. Methanol/chloroform/water and water/methanol were the best extraction solvents for muscle metabolome analysis by NMR and MS, respectively. We identified 38 metabolites by nuclear magnetic resonance, 37 by FIA-MS/MS, 18 by GC-MS, and 80 by LC-HRMS. The combination led us to identify a total of 132 metabolites with low variability, partitioned into 58 metabolic pathways, such as amino acid, nitrogen, purine, and pyrimidine metabolism, and the citric acid cycle. This combination also showed that the contribution of GC-MS was low when used in combination with other mass spectrometry methods and nuclear magnetic resonance to explore muscle samples. This study reports the validation of several analytical methods, based on nuclear magnetic resonance and several mass spectrometry methods, to explore the muscle metabolome from a small amount of tissue, comparable to that obtained during a clinical trial. The combination of several techniques may be relevant for the exploration of muscle metabolism, with acceptable analytical variability and overlap between methods. However, the difficult and time-consuming data pre-processing, processing, and statistical analysis steps do not justify systematically combining analytical methods. Copyright © 2017 Elsevier B.V. All rights reserved.
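The "low analytical variability" filter described above (coefficient of variation below 30% across quality-control measurements) is straightforward to apply. The sketch below assumes a metabolite-by-replicate intensity table; the metabolite names and values are illustrative only.

```python
import numpy as np

def low_variability_metabolites(intensities, names, cv_threshold=30.0):
    """intensities: 2D array (replicates x metabolites) of QC measurements.
    Keeps metabolites whose coefficient of variation (SD/mean, in %) is below
    the threshold, as in the 30% criterion described above."""
    means = intensities.mean(axis=0)
    sds = intensities.std(axis=0, ddof=1)
    cvs = 100.0 * sds / means
    return [(name, round(cv, 1)) for name, cv in zip(names, cvs) if cv < cv_threshold]

# Illustrative QC data for three hypothetical metabolites (4 replicate injections).
qc = np.array([[100, 55, 10.0],
               [110, 80, 10.5],
               [ 95, 20,  9.8],
               [105, 60, 10.1]])
print(low_variability_metabolites(qc, ["alanine", "compound_X", "citrate"]))
```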
Bobaly, Balazs; D'Atri, Valentina; Goyon, Alexandre; Colas, Olivier; Beck, Alain; Fekete, Szabolcs; Guillarme, Davy
2017-08-15
The analytical characterization of therapeutic monoclonal antibodies and related proteins usually incorporates various sample preparation methodologies. Indeed, quantitative and qualitative information can be enhanced by simplifying the sample, thanks to the removal of sources of heterogeneity (e.g. N-glycans) and/or by decreasing the molecular size of the tested protein by enzymatic or chemical fragmentation. These approaches make the sample more suitable for chromatographic and mass spectrometric analysis. Structural elucidation and quality control (QC) analysis of biopharmaceutics are usually performed at intact, subunit and peptide levels. In this paper, general sample preparation approaches used to attain peptide, subunit and glycan level analysis are overviewed. Protocols are described to perform tryptic proteolysis, IdeS and papain digestion, reduction as well as deglycosylation by PNGase F and EndoS2 enzymes. Both historical and modern sample preparation methods were compared and evaluated using rituximab and trastuzumab, two reference therapeutic mAb products approved by Food and Drug Administration (FDA) and European Medicines Agency (EMA). The described protocols may help analysts to develop sample preparation methods in the field of therapeutic protein analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
Garcia Hejl, Carine; Ramirez, Jose Manuel; Vest, Philippe; Chianea, Denis; Renard, Christophe
2014-09-01
Laboratories working towards accreditation under the International Organization for Standardization (ISO) 15189 standard are required to demonstrate the validity of their analytical methods. The differing guidelines set by various accreditation organizations make it difficult to provide objective evidence that an in-house method is fit for the intended purpose. Besides, the required performance characteristics tests and acceptance criteria are not always detailed. The laboratory must choose the most suitable validation protocol and set the acceptance criteria. Therefore, we propose a validation protocol to evaluate the performance of an in-house method. As an example, we validated the process for the detection and quantification of lead in whole blood by electrothermal atomic absorption spectrometry. The fundamental parameters tested were selectivity, calibration model, precision, accuracy (and uncertainty of measurement), contamination, stability of the sample, reference interval, and analytical interference. We have developed a protocol that has been applied successfully to quantify lead in whole blood by electrothermal atomic absorption spectrometry (ETAAS). In particular, our method is selective, linear, accurate, and precise, making it suitable for use in routine diagnostics.
A communal catalogue reveals Earth's multiscale microbial diversity.
Thompson, Luke R; Sanders, Jon G; McDonald, Daniel; Amir, Amnon; Ladau, Joshua; Locey, Kenneth J; Prill, Robert J; Tripathi, Anupriya; Gibbons, Sean M; Ackermann, Gail; Navas-Molina, Jose A; Janssen, Stefan; Kopylova, Evguenia; Vázquez-Baeza, Yoshiki; González, Antonio; Morton, James T; Mirarab, Siavash; Zech Xu, Zhenjiang; Jiang, Lingjing; Haroon, Mohamed F; Kanbar, Jad; Zhu, Qiyun; Jin Song, Se; Kosciolek, Tomasz; Bokulich, Nicholas A; Lefler, Joshua; Brislawn, Colin J; Humphrey, Gregory; Owens, Sarah M; Hampton-Marcell, Jarrad; Berg-Lyons, Donna; McKenzie, Valerie; Fierer, Noah; Fuhrman, Jed A; Clauset, Aaron; Stevens, Rick L; Shade, Ashley; Pollard, Katherine S; Goodwin, Kelly D; Jansson, Janet K; Gilbert, Jack A; Knight, Rob
2017-11-23
Our growing awareness of the microbial world's importance and diversity contrasts starkly with our limited understanding of its fundamental structure. Despite recent advances in DNA sequencing, a lack of standardized protocols and common analytical frameworks impedes comparisons among studies, hindering the development of global inferences about microbial life on Earth. Here we present a meta-analysis of microbial community samples collected by hundreds of researchers for the Earth Microbiome Project. Coordinated protocols and new analytical methods, particularly the use of exact sequences instead of clustered operational taxonomic units, enable bacterial and archaeal ribosomal RNA gene sequences to be followed across multiple studies and allow us to explore patterns of diversity at an unprecedented scale. The result is both a reference database giving global context to DNA sequence data and a framework for incorporating data from future studies, fostering increasingly complete characterization of Earth's microbial diversity.
Throughput and delay analysis of IEEE 802.15.6-based CSMA/CA protocol.
Ullah, Sana; Chen, Min; Kwak, Kyung Sup
2012-12-01
The IEEE 802.15.6 is a new communication standard for Wireless Body Area Networks (WBANs) that focuses on a variety of medical, Consumer Electronics (CE) and entertainment applications. In this paper, the throughput and delay performance of the IEEE 802.15.6 is presented. Numerical formulas are derived to determine the maximum throughput and minimum delay limits of the IEEE 802.15.6 for an ideal channel with no transmission errors. These limits are derived for different frequency bands and data rates. Our analysis is validated by extensive simulations using a custom C++ simulator. Based on analytical and simulation results, useful conclusions are derived for network provisioning and packet size optimization for different applications.
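For an ideal, error-free channel, maximum-throughput and minimum-delay limits of the kind derived above follow from the time taken by a single successful frame exchange. The sketch below shows that generic calculation; the frame overhead and timing parameters are placeholders for illustration, not the values specified by IEEE 802.15.6 or the exact formulas of the paper.

```python
def frame_exchange_time(payload_bytes, header_bytes, data_rate_bps,
                        t_csma, t_sifs, t_ack):
    """Seconds for one successful data/ACK exchange on an ideal channel:
    channel access + data frame + inter-frame space + ACK + inter-frame space."""
    t_data = 8 * (payload_bytes + header_bytes) / data_rate_bps
    return t_csma + t_data + t_sifs + t_ack + t_sifs

def max_throughput(payload_bytes, **kwargs):
    """Useful payload bits delivered per second when frames are sent back to back."""
    return 8 * payload_bytes / frame_exchange_time(payload_bytes, **kwargs)

# Placeholder parameters (illustrative only); they show how the limits scale
# with payload size and data rate.
params = dict(header_bytes=9, data_rate_bps=971_400, t_csma=0.0004,
              t_sifs=0.000075, t_ack=0.0001)
for payload in (50, 125, 255):
    t = frame_exchange_time(payload, **params)
    print(f"payload={payload:3d} B  min delay={t * 1e3:5.2f} ms  "
          f"max throughput={max_throughput(payload, **params) / 1e3:6.1f} kb/s")
```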
DOE Office of Scientific and Technical Information (OSTI.GOV)
P. Somasundaran
The aim of the project is to develop a knowledge base to help the design of enhanced processes for mobilizing and extracting untrapped oil. We emphasize evaluation of novel surfactant mixtures and obtaining optimum combinations of the surfactants for efficient chemical flooding EOR processes. In this regard, an understanding of the aggregate shape, size and structure is crucial since these properties govern the crude oil removal efficiency. During the three-year period, the adsorption and aggregation behavior of sugar-based surfactants and their mixtures with other types of surfactants have been studied. Sugar-based surfactants are made from renewable resources and are nontoxic and biodegradable. They are miscible with water and oil. These environmentally benign surfactants feature high surface activity, good salinity, calcium and temperature tolerance, and unique adsorption behavior. They possess the characteristics required for oil flooding surfactants and have the potential to replace currently used surfactants in oil recovery. A novel analytical ultracentrifugation technique has been successfully employed for the first time to characterize the aggregate species present in mixed micellar solution, owing to its powerful ability to separate particles based on their size and shape and to monitor them simultaneously. Analytical ultracentrifugation offers an unprecedented opportunity to obtain important information on mixed micelles, on the structure-performance relationship for different surfactant aggregates in solution, and on their role in interfacial processes. Initial sedimentation velocity investigations were conducted using nonyl phenol ethoxylated decyl ether (NP-10) to choose the best analytical protocol, calculate the partial specific volume, and obtain information on the sedimentation coefficient and aggregation mass of the micelles. Four software packages (OptimaTM XL-A/XL-I data analysis software, DCDT+, Svedberg and SEDFIT) were compared for the analysis of the sedimentation velocity experimental data. The results have been compared to those from light scattering. Based on the tests, Svedberg and SEDFIT analysis were chosen for further studies.
Benefits and Limitations of DNA Barcoding and Metabarcoding in Herbal Product Authentication
Raclariu, Ancuta Cristina; Heinrich, Michael; Ichim, Mihael Cristin
2017-01-01
Introduction: Herbal medicines play an important role globally in the health care sector, and in industrialised countries they are often considered as an alternative to mono-substance medicines. Current quality and authentication assessment methods rely mainly on morphology and analytical phytochemistry-based methods detailed in pharmacopoeias. Herbal products, however, are often highly processed with numerous ingredients, and even if these analytical methods are accurate for quality control of specific lead or marker compounds, they are of limited suitability for the authentication of biological ingredients. Objective: To review the benefits and limitations of DNA barcoding and metabarcoding in complementing current herbal product authentication. Method: Recent literature relating to DNA-based authentication of medicinal plants, herbal medicines and products is summarised to provide a basic understanding of how DNA barcoding and metabarcoding can be applied to this field. Results: Different methods of quality control and authentication have varying resolution and usefulness along the value chain of these products. DNA barcoding can be used for authenticating products based on single herbal ingredients and DNA metabarcoding for assessment of species diversity in processed products, and both methods should be used in combination with appropriate hyphenated chemical methods for quality control. Conclusions: DNA barcoding and metabarcoding have potential in the context of quality control of both well and poorly regulated supply systems. Standardisation of protocols for DNA barcoding and DNA sequence-based identification is necessary before DNA-based biological methods can be implemented as routine analytical approaches and approved by the competent authorities for use in regulated procedures. © 2017 The Authors. Phytochemical Analysis Published by John Wiley & Sons Ltd. PMID:28906059
Benefits and Limitations of DNA Barcoding and Metabarcoding in Herbal Product Authentication.
Raclariu, Ancuta Cristina; Heinrich, Michael; Ichim, Mihael Cristin; de Boer, Hugo
2018-03-01
Herbal medicines play an important role globally in the health care sector, and in industrialised countries they are often considered as an alternative to mono-substance medicines. Current quality and authentication assessment methods rely mainly on morphology and analytical phytochemistry-based methods detailed in pharmacopoeias. Herbal products, however, are often highly processed with numerous ingredients, and even if these analytical methods are accurate for quality control of specific lead or marker compounds, they are of limited suitability for the authentication of biological ingredients. To review the benefits and limitations of DNA barcoding and metabarcoding in complementing current herbal product authentication. Recent literature relating to DNA-based authentication of medicinal plants, herbal medicines and products is summarised to provide a basic understanding of how DNA barcoding and metabarcoding can be applied to this field. Different methods of quality control and authentication have varying resolution and usefulness along the value chain of these products. DNA barcoding can be used for authenticating products based on single herbal ingredients and DNA metabarcoding for assessment of species diversity in processed products, and both methods should be used in combination with appropriate hyphenated chemical methods for quality control. DNA barcoding and metabarcoding have potential in the context of quality control of both well and poorly regulated supply systems. Standardisation of protocols for DNA barcoding and DNA sequence-based identification is necessary before DNA-based biological methods can be implemented as routine analytical approaches and approved by the competent authorities for use in regulated procedures. © 2017 The Authors. Phytochemical Analysis Published by John Wiley & Sons Ltd.
Ivanova, Bojidarka; Spiteller, Michael
2012-10-01
This paper reports the qualitative and quantitative analysis (QA) of mixtures of the hallucinogens N,N-dimethyltryptamine (DMT) (1), 5-methoxy- (1a) and 5-hydroxy-N,N-dimethyltryptamine (1b) in the presence of beta-carbolines (indole alkaloids of type XII) (2), (3) and (5). The validated electronic absorption spectroscopic (EAS) protocol achieved a concentration limit of detection (LOD) of 7.2 × 10⁻⁷ mol/L (concentration limit of quantification (LOQ) of 24 × 10⁻⁷ mol/L) using bands with lambda max within 260 ± 0.23 to 262 ± 0.33 nm. Metrology, including accuracy, measurement repeatability, measurement precision, trueness of measurement, and reproducibility of the measurements, is presented using N,N-dimethyltryptamine (DMT) as standard. The analytical quantities for mixtures of alkaloids 4, 6 and 7 are: lambda max 317 ± 0.45, 338 ± 0.69 and 430 ± 0.09 nm for 4 (LOD, 8.6 × 10⁻⁷ mol/L; LOQ, 28.66(6) × 10⁻⁷ mol/L), and 528 ± 0.75 nm for 6 and 7 (LOD, 8.2 × 10⁻⁷ mol/L; LOQ, 27.33(3) × 10⁻⁷ mol/L), respectively. Partially validated protocols based on high performance liquid chromatography (HPLC) and electrospray ionization (ESI) mass spectrometry (MS), both in single and tandem (MS/MS) operation mode, as well as matrix-assisted laser desorption/ionization (MALDI) MS, are elaborated. A Raman spectroscopic (RS) protocol for the analysis of psychoactive substances characterized by a strongly fluorescent RS profile was developed, and its detection limits are discussed. The known synergistic effect leading to an increase in the psychoactive and hallucinogenic properties, together with the reported cases of acute poisoning with 1-7, makes the present study urgent, since the current lack of analytical data and the metrology obtained herein contribute to the elaboration of highly selective and precise analytical protocols, which would be of interest in the field of criminal forensic analysis.
NASA Astrophysics Data System (ADS)
Lark, R. M.; Rawlins, B. G.; Lark, T. A.
2014-05-01
The LUCAS Topsoil survey is a pan-European Union initiative in which soil data were collected according to standard protocols from 19 967 sites. Any inference about soil variables is subject to uncertainty due to different sources of variability in the data. In this study we examine the likely magnitude of uncertainty due to the field-sampling protocol. The published sampling protocol (LUCAS, 2009) describes a procedure to form a composite soil sample from aliquots collected to a depth of approximately 15-20 cm. A v-shaped hole to the target depth is cut with a spade, then a slice is cut from one of the exposed surfaces. This methodology gives rather less control of the sampling depth than protocols used in other soil and geochemical surveys, and this may be a substantial source of variation in uncultivated soils with strong contrasts between an organic-rich A-horizon and an underlying B-horizon. We extracted all representative profile descriptions from soil series recorded in the memoir of the 1:250 000-scale map of Northern England (Soil Survey of England and Wales, 1984) where the base of the A-horizon is less than 20 cm below the surface. The Soil Associations in which these 14 series are significant members cover approximately 17% of the area of Northern England and are expected to be the mineral soils with the largest organic content. Soil organic carbon (SOC) content and bulk density were extracted for the A- and B-horizons, along with the thickness of the horizons; where bulk density was not recorded, it was predicted by a pedotransfer function. For any proposed angle of the v-shaped hole, the proportions of A- and B-horizon in the resulting sample may be computed by trigonometry. From the bulk density and SOC concentration of the horizons, the SOC concentration of the sample can be computed. For each soil series we drew 1000 random samples from a trapezoidal distribution of angles, with uniform density over the range corresponding to depths of 15-20 cm and zero density for angles corresponding to depths larger than 21 cm or less than 14 cm. We computed the corresponding variance of sample SOC contents. We found that the variance in SOC determinations attributable to variation in sample depth for these uncultivated soils was of the same order of magnitude as the estimate of the subsampling plus analytical variance component (both on a log scale) that we previously computed for soils in the UK (Rawlins et al., 2009). It seems unnecessary to accept this source of uncertainty, given the effort undertaken to reduce the analytical variation, which is no larger (and often smaller) than this variation due to the field protocol. If pan-European soil monitoring is to be based on the LUCAS Topsoil survey, as suggested by an initial report, uncertainty could be reduced if sampling were specified at a single fixed depth rather than the current depth range. LUCAS. 2009. Instructions for Surveyors. Technical reference document C-1: General implementation, Land Cover and Use, Water management, Soil, Transect, Photos. European Commission, Eurostat. Rawlins, B.G., Scheib, A.J., Lark, R.M. & Lister, T.R. 2009. Sampling and analytical plus subsampling variance components for five soil indicators observed at regional scale. European Journal of Soil Science, 60, 740-747.
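The trigonometric calculation described above, partitioning a v-shaped sampling hole between A- and B-horizons and computing the mass-weighted SOC of the composite, can be sketched as below. The horizon properties are illustrative placeholders (not values from the soil memoir), and for simplicity the depth is sampled uniformly over 15-20 cm, i.e. only the flat part of the trapezoidal distribution described in the abstract.

```python
import numpy as np

def composite_soc(depth_cm, a_thick_cm, soc_a, soc_b, bd_a, bd_b):
    """Mass-weighted SOC of a v-shaped (triangular cross-section) sample of the
    given depth when the A-horizon occupies the top a_thick_cm of the profile."""
    if depth_cm <= a_thick_cm:
        return soc_a
    # Similar-triangle argument: the part of the cross-section below the A/B
    # boundary is a smaller triangle scaled by (depth - a_thick) / depth.
    frac_b_area = ((depth_cm - a_thick_cm) / depth_cm) ** 2
    frac_a_area = 1.0 - frac_b_area
    mass_a = frac_a_area * bd_a
    mass_b = frac_b_area * bd_b
    return (mass_a * soc_a + mass_b * soc_b) / (mass_a + mass_b)

# Illustrative horizon properties (bulk density in g/cm3, SOC in %).
rng = np.random.default_rng(42)
depths = rng.uniform(15.0, 20.0, 1000)   # simplified version of the depth distribution
socs = np.array([composite_soc(d, a_thick_cm=12.0, soc_a=8.0, soc_b=1.5,
                               bd_a=0.9, bd_b=1.4) for d in depths])
print(f"mean SOC = {socs.mean():.2f} %, variance due to depth variation = {socs.var():.4f}")
```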
NASA Astrophysics Data System (ADS)
Pleros, N.; Kalfas, G.; Mitsolidou, C.; Vagionas, C.; Tsiokos, D.; Miliou, A.
2017-01-01
Future broadband access networks in the 5G framework will need to be bilateral, exploiting both optical and wireless technologies. This paper deals with new approaches and synergies in radio-over-fiber (RoF) technologies and how these can be leveraged to seamlessly converge wireless technology, for agility and mobility, with passive optical network (PON)-based backhauling. The proposed convergence paradigm is based upon a holistic network architecture mixing mm-wave wireless access with photonic integration, dynamic capacity allocation and network coding schemes to enable high-bandwidth, low-latency fixed and 60 GHz wireless personal area communications at gigabit rates per user, proposing and deploying on top a Medium-Transparent MAC (MT-MAC) protocol as a low-latency bandwidth allocation mechanism. We have evaluated alternative network topologies between the central office (CO) and the access point module (APM) for data rates up to 2.5 Gb/s and SC frequencies up to 60 GHz. Optical network coding is demonstrated for SCM-based signaling to enhance bandwidth utilization and facilitate optical-wireless convergence in 5G applications, reporting medium-transparent network coding directly at the physical layer between end-users communicating over a RoF infrastructure. Towards equipping the physical layer with the appropriate agility to support MT-MAC protocols, a monolithic InP-based Remote Antenna Unit optoelectronic PIC interface is shown that ensures control over optical resource allocation while at the same time supporting broadband wireless service. Finally, the MT-MAC protocol is analysed; simulation and analytical results are presented and found to be in good agreement, confirming latency values lower than 1 ms for small- to mid-load conditions.
Zarzycki, Paweł K; Slączka, Magdalena M; Zarzycka, Magdalena B; Bartoszuk, Małgorzata A; Włodarczyk, Elżbieta; Baran, Michał J
2011-11-01
This paper is a continuation of our previous research focusing on the development of micro-TLC methodology under temperature-controlled conditions. The main goal of the present paper is to demonstrate the separation and detection capability of the micro-TLC technique using simple analytical protocols without multi-step sample pre-purification. One of the advantages of planar chromatography over its column counterpart is that each TLC run can be performed on a previously unused stationary phase. Therefore, it is possible to fractionate or separate complex samples characterized by heavy biological matrix loading. In the present studies components of interest, mainly steroids, were isolated from biological samples like fish bile using single pre-treatment steps involving direct organic liquid extraction and/or deproteinization by a freeze-drying method. Low-molecular-mass compounds with polarity ranging from estetrol to progesterone derived from the environmental samples (lake water, untreated and treated sewage waters) were concentrated using optimized solid-phase extraction (SPE). Specific band patterns for samples derived from surface water of the Middle Pomerania region in northern Poland can be easily observed on the obtained micro-TLC chromatograms. This approach can be useful as a simple and inexpensive complementary method for fast control and screening of treated sewage water discharged by municipal wastewater treatment plants. Moreover, our experimental results show the potential of micro-TLC as an efficient tool for retention measurements of a wide range of steroids under reversed-phase (RP) chromatographic conditions. These data can be used for further optimization of SPE or HPLC systems working under RP conditions. Furthermore, we also demonstrated that a micro-TLC-based analytical approach can be applied as an effective method for the search for internal standard (IS) substances. Generally, the described methodology can be applied for fast fractionation or screening of a whole range of target substances as well as for chemo-taxonomic studies and fingerprinting of complex mixtures present in biological or environmental samples. Due to the low consumption of eluent (usually 0.3-1 mL/run), mainly composed of water-alcohol binary mixtures, this method can be considered an environmentally friendly, green-chemistry-focused analytical tool, supplementary to analytical protocols involving column chromatography or planar micro-fluidic devices. Copyright © 2011 Elsevier Ltd. All rights reserved.
Vandekerckhove, Kristof; Seidl, Andreas; Gutka, Hiten; Kumar, Manish; Gratzl, Gyöngyi; Keire, David; Coffey, Todd; Kuehne, Henriette
2018-05-10
Leading regulatory agencies recommend biosimilar assessment to proceed in a stepwise fashion, starting with a detailed analytical comparison of the structural and functional properties of the proposed biosimilar and reference product. The degree of analytical similarity determines the degree of residual uncertainty that must be addressed through downstream in vivo studies. Substantive evidence of similarity from comprehensive analytical testing may justify a targeted clinical development plan, and thus enable a shorter path to licensing. The importance of a careful design of the analytical similarity study program therefore should not be underestimated. Designing a state-of-the-art analytical similarity study meeting current regulatory requirements in regions such as the USA and EU requires a methodical approach, consisting of specific steps that far precede the work on the actual analytical study protocol. This white paper discusses scientific and methodological considerations on the process of attribute and test method selection, criticality assessment, and subsequent assignment of analytical measures to US FDA's three tiers of analytical similarity assessment. Case examples of selection of critical quality attributes and analytical methods for similarity exercises are provided to illustrate the practical implementation of the principles discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Safigholi, H; Soliman, A; Song, W
Purpose: Brachytherapy treatment planning systems based on TG-43 protocol calculate the dose in water and neglects the heterogeneity effect of seeds in multi-seed implant brachytherapy. In this research, the accuracy of a novel analytical model that we propose for the inter-seed attenuation effect (ISA) for 103-Pd seed model is evaluated. Methods: In the analytical model, dose perturbation due to the ISA effect for each seed in an LDR multi-seed implant for 103-Pd is calculated by assuming that the seed of interest is active and the other surrounding seeds are inactive. The cumulative dosimetric effect of all seeds is then summedmore » using the superposition principle. The model is based on pre Monte Carlo (MC) simulated 3D kernels of the dose perturbations caused by the ISA effect. The cumulative ISA effect due to multiple surrounding seeds is obtained by a simple multiplication of the individual ISA effect by each seed, the effect of which is determined by the distance from the seed of interest. This novel algorithm is then compared with full MC water-based simulations (FMCW). Results: The results show that the dose perturbation model we propose is in excellent agreement with the FMCW values for a case with three seeds separated by 1 cm. The average difference of the model and the FMCW simulations was less than 8%±2%. Conclusion: Using the proposed novel analytical ISA effect model, one could expedite the corrections due to the ISA dose perturbation effects during permanent seed 103-Pd brachytherapy planning with minimal increase in time since the model is based on multiplications and superposition. This model can be applied, in principle, to any other brachytherapy seeds. Further work is necessary to validate this model on a more complicated geometry as well.« less
Wetherbee, Gregory A.; Martin, RoseAnn
2017-02-06
The U.S. Geological Survey Branch of Quality Systems operates the Precipitation Chemistry Quality Assurance Project (PCQA) for the National Atmospheric Deposition Program/National Trends Network (NADP/NTN) and National Atmospheric Deposition Program/Mercury Deposition Network (NADP/MDN). Since 1978, various programs have been implemented by the PCQA to estimate data variability and bias contributed by changing protocols, equipment, and sample submission schemes within NADP networks. These programs independently measure the field and laboratory components which contribute to the overall variability of NADP wet-deposition chemistry and precipitation depth measurements. The PCQA evaluates the quality of analyte-specific chemical analyses from the two, currently (2016) contracted NADP laboratories, Central Analytical Laboratory and Mercury Analytical Laboratory, by comparing laboratory performance among participating national and international laboratories. Sample contamination and stability are evaluated for NTN and MDN by using externally field-processed blank samples provided by the Branch of Quality Systems. A colocated sampler program evaluates the overall variability of NTN measurements and bias between dissimilar precipitation gages and sample collectors.This report documents historical PCQA operations and general procedures for each of the external quality-assurance programs from 2007 to 2016.
High-Resolution Sequence-Function Mapping of Full-Length Proteins
Kowalsky, Caitlin A.; Klesmith, Justin R.; Stapleton, James A.; Kelly, Vince; Reichkitzer, Nolan; Whitehead, Timothy A.
2015-01-01
Comprehensive sequence-function mapping involves detailing the fitness contribution of every possible single mutation to a gene by comparing the abundance of each library variant before and after selection for the phenotype of interest. Deep sequencing of library DNA allows frequency reconstruction for tens of thousands of variants in a single experiment, yet short read lengths of current sequencers makes it challenging to probe genes encoding full-length proteins. Here we extend the scope of sequence-function maps to entire protein sequences with a modular, universal sequence tiling method. We demonstrate the approach with both growth-based selections and FACS screening, offer parameters and best practices that simplify design of experiments, and present analytical solutions to normalize data across independent selections. Using this protocol, sequence-function maps covering full sequences can be obtained in four to six weeks. Best practices introduced in this manuscript are fully compatible with, and complementary to, other recently published sequence-function mapping protocols. PMID:25790064
DOE Office of Scientific and Technical Information (OSTI.GOV)
Handakumbura, Pubudu; Hixson, Kim K.; Purvine, Samuel O.
We present a simple one-pot extraction protocol, which rapidly isolates hydrophyllic metabolites, lipids, and proteins from the same pulverized plant sample. Also detailed is a global plant proteomics sample preparation method utilizing iTRAQ multiplexing reagents that enables deep proteome coverage due to the use of HPLC fractionation of the peptides prior to mass spectrometric analysis. We have successfully used this protocol on several different plant tissues (e.g., roots, stems, leaves) from different plants (e.g., sorghum, poplar, Arabidopsis, soybean), and have been able to successfully detect and quantify thousands of proteins. Multiplexing strategies such as iTRAQ and the bioinformatics strategy outlinedmore » here, ultimately provide insight into which proteins are significantly changed in abundance between two or more groups (e.g., control, perturbation). Our bioinformatics strategy yields z-score values, which normalize the expression data into a format that can easily be cross-compared with other expression data (i.e., metabolomics, transcriptomics) obtained from different analytical methods and instrumentation.« less
Williams, Devin M; Miller, Andy O; Henry, Michael W; Westrich, Geoffrey H; Ghomrawi, Hassan M K
2017-09-01
The risk of prosthetic joint infection increases with Staphylococcus aureus colonization. The cost-effectiveness of decolonization is controversial. We evaluated cost-effectiveness decolonization protocols in high-risk arthroplasty patients. An analytical model evaluated risk under 3 protocols: 4 swabs, 2 swabs, and nasal swab alone. These were compared to no-screening and universal decolonization strategies. Cost-effectiveness was evaluated from the hospital, patient, and societal perspective. Under base case conditions, universal decolonization and 4-swab strategies were most effective. The 2-swab and universal decolonization strategy were most cost-effective from patient and societal perspectives. From the hospital perspective, universal decolonization was the dominant strategy (much less costly and more effective). S aureus decolonization may be cost-effective for reducing prosthetic joint infections in high-risk patients. These results may have important implications for treatment of patients and for cost containment in a bundled payment system. Copyright © 2017 Elsevier Inc. All rights reserved.
Carbon Nanotube Material Quality Assessment
NASA Technical Reports Server (NTRS)
Yowell, Leonard; Arepalli, Sivaram; Sosa, Edward; Niolaev, Pavel; Gorelik, Olga
2006-01-01
The nanomaterial activities at NASA Johnson Space Center focus on carbon nanotube production, characterization and their applications for aerospace systems. Single wall carbon nanotubes are produced by arc and laser methods. Characterization of the nanotube material is performed using the NASA JSC protocol developed by combining analytical techniques of SEM, TEM, UV-VIS-NIR absorption, Raman, and TGA. A possible addition of other techniques such as XPS, and ICP to the existing protocol will be discussed. Changes in the quality of the material collected in different regions of the arc and laser production chambers is assessed using the original JSC protocol. The observed variations indicate different growth conditions in different regions of the production chambers.
Heavy vehicle driver workload assessment. Task 3, task analysis data collection
DOT National Transportation Integrated Search
This technical report consists of a collection of task analytic data to support heavy vehicle driver workload assessment and protocol development. Data were collected from professional drivers to provide insights into the following issues: the meanin...
42 CFR 493.1251 - Standard: Procedure manual.
Code of Federal Regulations, 2013 CFR
2013-10-01
... (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY REQUIREMENTS Quality System for Nonwaived Testing Analytic... intervals (normal values). (11) Imminently life-threatening test results, or panic or alert values. (12... reporting patient results including, when appropriate, the protocol for reporting imminently life...
42 CFR 493.1251 - Standard: Procedure manual.
Code of Federal Regulations, 2014 CFR
2014-10-01
... (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY REQUIREMENTS Quality System for Nonwaived Testing Analytic... intervals (normal values). (11) Imminently life-threatening test results, or panic or alert values. (12... reporting patient results including, when appropriate, the protocol for reporting imminently life...
42 CFR 493.1251 - Standard: Procedure manual.
Code of Federal Regulations, 2010 CFR
2010-10-01
... (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY REQUIREMENTS Quality System for Nonwaived Testing Analytic... intervals (normal values). (11) Imminently life-threatening test results, or panic or alert values. (12... reporting patient results including, when appropriate, the protocol for reporting imminently life...
Bova, G Steven; Eltoum, Isam A; Kiernan, John A; Siegal, Gene P; Frost, Andra R; Best, Carolyn J M; Gillespie, John W; Su, Gloria H; Emmert-Buck, Michael R
2005-02-01
Isolation of well-preserved pure cell populations is a prerequisite for sound studies of the molecular basis of any tissue-based biological phenomenon. This article reviews current methods for obtaining anatomically specific signals from molecules isolated from tissues, a basic requirement for productive linking of phenotype and genotype. The quality of samples isolated from tissue and used for molecular analysis is often glossed over or omitted from publications, making interpretation and replication of data difficult or impossible. Fortunately, recently developed techniques allow life scientists to better document and control the quality of samples used for a given assay, creating a foundation for improvement in this area. Tissue processing for molecular studies usually involves some or all of the following steps: tissue collection, gross dissection/identification, fixation, processing/embedding, storage/archiving, sectioning, staining, microdissection/annotation, and pure analyte labeling/identification and quantification. We provide a detailed comparison of some current tissue microdissection technologies, and provide detailed example protocols for tissue component handling upstream and downstream from microdissection. We also discuss some of the physical and chemical issues related to optimal tissue processing, and include methods specific to cytology specimens. We encourage each laboratory to use these as a starting point for optimization of their overall process of moving from collected tissue to high quality, appropriately anatomically tagged scientific results. In optimized protocols is a source of inefficiency in current life science research. Improvement in this area will significantly increase life science quality and productivity. The article is divided into introduction, materials, protocols, and notes sections. Because many protocols are covered in each of these sections, information relating to a single protocol is not contiguous. To get the greatest benefit from this article, readers are advised to read through the entire article first, identify protocols appropriate to their laboratory for each step in their workflow, and then reread entries in each section pertaining to each of these single protocols.
Klein, Marguerite A.; Nahin, Richard L.; Messina, Mark J.; Rader, Jeanne I.; Thompson, Lilian U.; Badger, Thomas M.; Dwyer, Johanna T.; Kim, Young S.; Pontzer, Carol H.; Starke-Reed, Pamela E.; Weaver, Connie M.
2010-01-01
The NIH sponsored a scientific workshop, “Soy Protein/Isoflavone Research: Challenges in Designing and Evaluating Intervention Studies,” July 28–29, 2009. The workshop goal was to provide guidance for the next generation of soy protein/isoflavone human research. Session topics included population exposure to soy; the variability of the human response to soy; product composition; methods, tools, and resources available to estimate exposure and protocol adherence; and analytical methods to assess soy in foods and supplements and analytes in biologic fluids and other tissues. The intent of the workshop was to address the quality of soy studies, not the efficacy or safety of soy. Prior NIH workshops and an evidence-based review questioned the quality of data from human soy studies. If clinical studies are pursued, investigators need to ensure that the experimental designs are optimal and the studies properly executed. The workshop participants identified methodological issues that may confound study results and interpretation. Scientifically sound and useful options for dealing with these issues were discussed. The resulting guidance is presented in this document with a brief rationale. The guidance is specific to soy clinical research and does not address nonsoy-related factors that should also be considered in designing and reporting clinical studies. This guidance may be used by investigators, journal editors, study sponsors, and protocol reviewers for a variety of purposes, including designing and implementing trials, reporting results, and interpreting published epidemiological and clinical studies. PMID:20392880
Protocol for HER2 FISH determination on PAXgene-fixed and paraffin-embedded tissue in breast cancer.
Oberauner-Wappis, Lisa; Loibner, Martina; Viertler, Christian; Groelz, Daniel; Wyrich, Ralf; Zatloukal, Kurt
2016-04-01
Molecular diagnostics in personalized medicine increasingly relies on the combination of a variety of analytical technologies to characterize individual diseases and to select patients for targeted therapies. The gold standard for tissue-based diagnostics is fixation in formalin and embedding in paraffin, which results in excellent preservation of morphology but negatively impacts on a variety of molecular assays. The formalin-free, non-cross-linking PAXgene tissue system preserves morphology in a similar way to formalin, but also preserves biomolecules essentially in a similar way to cryopreservation, which markedly widens the spectrum, sensitivity and accuracy of molecular analytics. In this study, we have developed and tested a protocol for PAXgene-fixed and paraffin-embedded tissues for fluorescent in situ hybridization (FISH). The implementation of a 24-h formalin postfixation step of slides from PAXgene-fixed and paraffin-embedded tissues allowed us to use the assays approved for formalin-fixed and paraffin-embedded tissues. The equivalence of the methodologies was demonstrated by FISH analysis of HER2 amplification in breast cancer cases. The 24-h postfixation step of the slides used for FISH can be well integrated in the routine diagnostic workflow and allows the remaining PAXgene-fixed and paraffin-embedded tissue to be used for further molecular testing. © 2016 The Authors. International Journal of Experimental Pathology published by John Wiley & Sons Ltd on behalf of Company of the International Journal of Experimental Pathology (CIJEP).
Design and analysis issues for economic analysis alongside clinical trials.
Marshall, Deborah A; Hux, Margaret
2009-07-01
Clinical trials can offer a valuable and efficient opportunity to collect the health resource use and outcomes data for economic evaluation. However, economic and clinical studies differ fundamentally in the question they seek to answer. The design and analysis of trial-based cost-effectiveness studies require special consideration, which are reviewed in this article. Traditional randomized controlled trials, using an experimental design with a controlled protocol, are designed to measure safety and efficacy for product registration. Cost-effectiveness analysis seeks to measure effectiveness in the context of routine clinical practice, and requires collection of health care resources to allow estimation of cost over an equal timeframe for each treatment alternative. In assessing suitability of a trial for economic data collection, the comparator treatment and other protocol factors need to reflect current clinical practice and the trial follow-up must be sufficiently long to capture important costs and effects. The broadest available population and a measure of effectiveness reflecting important benefits for patients are preferred for economic analyses. Special analytical issues include dealing with missing and censored cost data, assessing uncertainty of the incremental cost-effectiveness ratio, and accounting for the underlying heterogeneity in patient subgroups. Careful consideration also needs to be given to data from multinational studies since practice patterns can differ across countries. Although clinical trials can be an efficient opportunity to collect data for economic evaluation, careful consideration of the suitability of the study design, and appropriate analytical methods must be applied to obtain rigorous results.
A CAD system and quality assurance protocol for bone age assessment utilizing digital hand atlas
NASA Astrophysics Data System (ADS)
Gertych, Arakadiusz; Zhang, Aifeng; Ferrara, Benjamin; Liu, Brent J.
2007-03-01
Determination of bone age assessment (BAA) in pediatric radiology is a task based on detailed analysis of patient's left hand X-ray. The current standard utilized in clinical practice relies on a subjective comparison of the hand with patterns in the book atlas. The computerized approach to BAA (CBAA) utilizes automatic analysis of the regions of interest in the hand image. This procedure is followed by extraction of quantitative features sensitive to skeletal development that are further converted to a bone age value utilizing knowledge from the digital hand atlas (DHA). This also allows providing BAA results resembling current clinical approach. All developed methodologies have been combined into one CAD module with a graphical user interface (GUI). CBAA can also improve the statistical and analytical accuracy based on a clinical work-flow analysis. For this purpose a quality assurance protocol (QAP) has been developed. Implementation of the QAP helped to make the CAD more robust and find images that cannot meet conditions required by DHA standards. Moreover, the entire CAD-DHA system may gain further benefits if clinical acquisition protocol is modified. The goal of this study is to present the performance improvement of the overall CAD-DHA system with QAP and the comparison of the CAD results with chronological age of 1390 normal subjects from the DHA. The CAD workstation can process images from local image database or from a PACS server.
The Structured Inventory of Malingered Symptomatology (SIMS): a systematic review and meta-analysis.
van Impelen, Alfons; Merckelbach, Harald; Jelicic, Marko; Merten, Thomas
2014-01-01
We meta-analytically reviewed studies that used the Structured Inventory of Malingered Symptomatology (SIMS) to detect feigned psychopathology. We present weighted mean diagnostic accuracy and predictive power indices in various populations, based on 31 studies, including 61 subsamples and 4009 SIMS protocols. In addition, we provide normative data of patients, claimants, defendants, nonclinical adults, and various experimental feigners, based on 41 studies, including 125 subsamples and 4810 SIMS protocols. We conclude that the SIMS (1) is able to differentiate well between instructed feigners and honest responders; (2) generates heightened scores in groups that are known to have a raised prevalence of feigning (e.g., offenders who claim crime-related amnesia); (3) may overestimate feigning in patients who suffer from schizophrenia, intellectual disability, or psychogenic non-epileptic seizures; and (4) is fairly robust against coaching. The diagnostic power of the traditional cut scores of the SIMS (i.e., > 14 and > 16) is not so much limited by their sensitivity—which is satisfactory—but rather by their substandard specificity. This, however, can be worked around by combining the SIMS with other symptom validity measures and by raising the cut score, although the latter solution sacrifices sensitivity for specificity.
Benjamin, Richard J; McDonald, Carl P
2014-04-01
The BacT/ALERT microbial detection system (bioMerieux, Inc, Durham, NC) is in routine use in many blood centers as a prerelease test for platelet collections. Published reports document wide variation in practices and outcomes. A systematic review of the English literature was performed to describe publications assessing the use of the BacT/ALERT culture system on platelet collections as a routine screen test of more than 10000 platelet components. Sixteen publications report the use of confirmatory testing to substantiate initial positive culture results but use varying nomenclature to classify the results. Preanalytical and analytical variables that may affect the outcomes differ widely between centers. Incomplete description of protocol details complicates comparison between sites. Initial positive culture results range from 539 to 10606 per million (0.054%-1.061%) and confirmed positive from 127 to 1035 per million (0.013%-0.104%) donations. False-negative results determined by outdate culture range from 662 to 2173 per million (0.066%-0.217%) and by septic reactions from 0 to 66 per million (0%-0.007%) collections. Current culture protocols represent pragmatic compromises between optimizing analytical sensitivity and ensuring the timely availability of platelets for clinical needs. Insights into the effect of protocol variations on outcomes are generally restricted to individual sites that implement limited changes to their protocols over time. Platelet manufacturers should reassess the adequacy of their BacT/ALERT screening protocols in light of the growing international experience and provide detailed documentation of all variables that may affect culture outcomes when reporting results. We propose a framework for a standardized nomenclature for reporting of the results of BacT/ALERT screening. Copyright © 2014 Elsevier Inc. All rights reserved.
Law, Martin; Ma, Wang-Kei; Lau, Damian; Cheung, Kenneth; Ip, Janice; Yip, Lawrance; Lam, Wendy
2018-04-01
To evaluate and to obtain analytic formulation for the calculation of the effective dose and associated cancer risk using the EOS microdose protocol for scoliotic pediatric patients undergoing full spine imaging at different age of exposure; to demonstrate the microdose protocol capable of delivering lesser radiation dose and hence of further reducing cancer risk induction when compared with the EOS low dose protocol; to obtain cumulative effective dose and cancer risk for both genders scoliotic pediatrics of US and Hong Kong population using the microdose protocol. Organ absorbed doses of full spine exposed scoliotic pediatric patients have been simulated with the use of EOS microdose protocol imaging parameters input to the Monte Carlo software PCXMC. Gender and age specific effective dose has been calculated with the simulated organ absorbed dose using the ICRP-103 approach. The associated radiation induced cancer risk, expressed as lifetime attributable risk (LAR), has been estimated according to the method introduced in the Biological Effects of Ionizing Radiation VII report. Values of LAR have been estimated for scoliotic patients exposed repetitively during their follow up period at different age for US and Hong Kong population. The effective doses of full spine imaging with simultaneous posteroanterior and lateral projection for patients exposed at the age between 5 and 18 years using the EOS microdose protocol have been calculated within the range of 2.54-14.75 μSv. The corresponding LAR for US and Hong Kong population was ranged between 0.04 × 10 -6 and 0.84 × 10 -6 . Cumulative effective dose and cancer risk during follow-up period can be estimated using the results and are of information to patients and their parents. With the use of computer simulation and analytic formulation, we obtained the cumulative effective dose and cancer risk at any age of exposure for pediatric patients of US and Hong Kong population undergoing repetitive microdose protocol full spine imaging. Girls would be at a statistically significant higher cumulative cancer risk than boys undergoing the same microdose full spine imaging protocol and the same follow-up schedule. Copyright © 2018 Elsevier B.V. All rights reserved.
Bellik, Yuva; Iguer-Ouada, Mokrane
2016-01-01
In past decades, a multitude of analytical methods for measuring antioxidant activity of plant extracts has been developed. However, when using methods to determine hemoglobin released from human erythrocytes treated with ginger extracts, we found hemoglobin concentrations were significantly higher than in untreated control samples. This suggests in the presence of antioxidants that measuring hemoglobin alone is not sufficient to determine hemolysis. We show concurrent measurement of erythrocyte concentration and hemoglobin is essential in such assays, and describe a new protocol based on simultaneous measurement of cellular turbidity and hemoglobin. Copyright © 2015 Elsevier Ltd. All rights reserved.
Derivatization in gas chromatographic determination of phenol and aniline traces in aqueous media
NASA Astrophysics Data System (ADS)
Gruzdev, I. V.; Zenkevich, I. G.; Kondratenok, B. M.
2015-06-01
Substituted anilines and phenols are the most common hydrophilic organic environmental toxicants. The principles of gas chromatographic determination of trace amounts of these compounds in aqueous media at concentrations <=0.1 μg litre-1 based on synthesis of their derivatives (derivatization) directly in the aqueous phase are considered. Conversion of relatively hydrophilic analytes into more hydrophobic derivatives makes it possible to achieve such low detection limits and optimize the protocols of extractive preconcentration and selective chromatographic detection. Among the known reactions, this condition is best met by electrophilic halogenation of compounds at the aromatic moiety. The bibliography includes 177 references.
Eckard, Anahita D; Dupont, David R; Young, Johnie K
2018-01-01
N -lined glycosylation is one of the critical quality attributes (CQA) for biotherapeutics impacting the safety and activity of drug product. Changes in pattern and level of glycosylation can significantly alter the intrinsic properties of the product and, therefore, have to be monitored throughout its lifecycle. Therefore fast, precise, and unbiased N -glycan mapping assay is desired. To ensure these qualities, using analytical methods that evaluate completeness of deglycosylation is necessary. For quantification of deglycosylation yield, methods such as reduced liquid chromatography-mass spectrometry (LC-MS) and reduced capillary gel electrophoresis (CGE) have been commonly used. Here we present development of two additional methods to evaluate deglycosylation yield: one based on LC using reverse phase (RP) column and one based on reduced sodium dodecyl sulphate-polyacrylamide gel electrophoresis (SDS-PAGE gel) with offline software (GelAnalyzer). With the advent of rapid deglycosylation workflows in the market for N -glycan profiling replacing overnight incubation, we have aimed to quantify the level of deglycosylation in a selected rapid deglycosylation workflow. Our results have shown well resolved peaks of glycosylated and deglycosylated protein species with RP-LC method allowing simple quantification of deglycosylation yield of protein with high confidence. Additionally a good correlation, ≥0.94, was found between deglycosylation yields estimated by RP-LC method and that of reduced SDS-PAGE gel method with offline software. Evaluation of rapid deglycosylation protocol from GlycanAssure™ HyPerformance assay kit performed on fetuin and RNase B has shown complete deglycosylation within the recommended protocol time when evaluated with these techniques. Using this kit, N -glycans from NIST mAb were prepared in 1.4 hr and analyzed by hydrophilic interaction chromatography (HILIC) ultrahigh performance LC (UHPLC) equipped with a fluorescence detector (FLD). 37 peaks were resolved with good resolution. Excellent sample preparation repeatability was found with relative standard deviation (RSD) of <5% for peaks with >0.5% relative area.
Kuczynska, Paulina; Jemiola-Rzeminska, Malgorzata
2017-01-01
Two diatom-specific carotenoids are engaged in the diadinoxanthin cycle, an important mechanism which protects these organisms against photoinhibition caused by absorption of excessive light energy. A high-performance and economical procedure of isolation and purification of diadinoxanthin and diatoxanthin from the marine diatom Phaeodactylum tricornutum using a four-step procedure has been developed. It is based on the use of commonly available materials and does not require advanced technology. Extraction of pigments, saponification, separation by partition and then open column chromatography, which comprise the complete experimental procedure, can be performed within 2 days. This method allows HPLC grade diadinoxanthin and diatoxanthin of a purity of 99 % or more to be obtained, and the efficiency was estimated to be 63 % for diadinoxanthin and 73 % for diatoxanthin. Carefully selected diatom culture conditions as well as analytical ones ensure highly reproducible performance. A protocol can be used to isolate and purify the diadinoxanthin cycle pigments both on analytical and preparative scale.
RapidIO as a multi-purpose interconnect
NASA Astrophysics Data System (ADS)
Baymani, Simaolhoda; Alexopoulos, Konstantinos; Valat, Sébastien
2017-10-01
RapidIO (http://rapidio.org/) technology is a packet-switched high-performance fabric, which has been under active development since 1997. Originally meant to be a front side bus, it developed into a system level interconnect which is today used in all 4G/LTE base stations world wide. RapidIO is often used in embedded systems that require high reliability, low latency and scalability in a heterogeneous environment - features that are highly interesting for several use cases, such as data analytics and data acquisition (DAQ) networks. We will present the results of evaluating RapidIO in a data analytics environment, from setup to benchmark. Specifically, we will share the experience of running ROOT and Hadoop on top of RapidIO. To demonstrate the multi-purpose characteristics of RapidIO, we will also present the results of investigating RapidIO as a technology for high-speed DAQ networks using a generic multi-protocol event-building emulation tool. In addition we will present lessons learned from implementing native ports of CERN applications to RapidIO.
Thinking graphically: Connecting vision and cognition during graph comprehension.
Ratwani, Raj M; Trafton, J Gregory; Boehm-Davis, Deborah A
2008-03-01
Task analytic theories of graph comprehension account for the perceptual and conceptual processes required to extract specific information from graphs. Comparatively, the processes underlying information integration have received less attention. We propose a new framework for information integration that highlights visual integration and cognitive integration. During visual integration, pattern recognition processes are used to form visual clusters of information; these visual clusters are then used to reason about the graph during cognitive integration. In 3 experiments, the processes required to extract specific information and to integrate information were examined by collecting verbal protocol and eye movement data. Results supported the task analytic theories for specific information extraction and the processes of visual and cognitive integration for integrative questions. Further, the integrative processes scaled up as graph complexity increased, highlighting the importance of these processes for integration in more complex graphs. Finally, based on this framework, design principles to improve both visual and cognitive integration are described. PsycINFO Database Record (c) 2008 APA, all rights reserved
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, J.; Giam, C.S.
Polynuclear azaarenes in a creosote-pentachlorophenol wood preservative wastewater were analyzed. The total concentration of azaarenes was determined to be 1300 mg kg/sup -1/. Potential adverse effects of these compounds on environmental quality and health suggest a need to develop analytical protocols for measuing azaarenes in hazardous wastes.
It's Time to Develop a New "Draft Test Protocol" for a Mars Sample Return Mission (or Two…).
Rummel, John D; Kminek, Gerhard
2018-04-01
The last time NASA envisioned a sample return mission from Mars, the development of a protocol to support the analysis of the samples in a containment facility resulted in a "Draft Test Protocol" that outlined required preparations "for the safe receiving, handling, testing, distributing, and archiving of martian materials here on Earth" (Rummel et al., 2002 ). This document comprised a specific protocol to be used to conduct a biohazard test for a returned martian sample, following the recommendations of the Space Studies Board of the US National Academy of Sciences. Given the planned launch of a sample-collecting and sample-caching rover (Mars 2020) in 2 years' time, and with a sample return planned for the end of the next decade, it is time to revisit the Draft Test Protocol to develop a sample analysis and biohazard test plan to meet the needs of these future missions. Key Words: Biohazard detection-Mars sample analysis-Sample receiving facility-Protocol-New analytical techniques-Robotic sample handling. Astrobiology 18, 377-380.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garlapati, Shravan; Kuruganti, Teja; Buehrer, Michael R.
The utilization of state-of-the-art 3G cellular CDMA technologies in a utility owned AMI network results in a large amount of control traffic relative to data traffic, increases the average packet delay and hence are not an appropriate choice for smart grid distribution applications. Like the CDG, we consider a utility owned cellular like CDMA network for smart grid distribution applications and classify the distribution smart grid data as scheduled data and random data. Also, we propose SMAC protocol, which changes its mode of operation based on the type of the data being collected to reduce the data collection latency andmore » control overhead when compared to 3G cellular CDMA2000 MAC. The reduction in the data collection latency and control overhead aids in increasing the number of smart meters served by a base station within the periodic data collection interval, which further reduces the number of base stations needed by a utility or reduces the bandwidth needed to collect data from all the smart meters. The reduction in the number of base stations and/or the reduction in the data transmission bandwidth reduces the CAPital EXpenditure (CAPEX) and OPerational EXpenditure (OPEX) of the AMI network. Finally, the proposed SMAC protocol is analyzed using markov chain, analytical expressions for average throughput and average packet delay are derived, and simulation results are also provided to verify the analysis.« less
A surface plasmon resonance based biochip for the detection of patulin toxin
NASA Astrophysics Data System (ADS)
Pennacchio, Anna; Ruggiero, Giuseppe; Staiano, Maria; Piccialli, Gennaro; Oliviero, Giorgia; Lewkowicz, Aneta; Synak, Anna; Bojarski, Piotr; D'Auria, Sabato
2014-08-01
Patulin is a toxic secondary metabolite of a number of fungal species belonging to the genera Penicillium and Aspergillus. One important aspect of the patulin toxicity in vivo is an injury of the gastrointestinal tract including ulceration and inflammation of the stomach and intestine. Recently, patulin has been shown to be genotoxic by causing oxidative damage to the DNA, and oxidative DNA base modifications have been considered to play a role in mutagenesis and cancer initiation. Conventional analytical methods for patulin detection involve chromatographic analyses, such as HPLC, GC, and, more recently, techniques such as LC/MS and GC/MS. All of these methods require the use of extensive protocols and the use of expensive analytical instrumentation. In this work, the conjugation of a new derivative of patulin to the bovine serum albumin for the production of polyclonal antibodies is described, and an innovative competitive immune-assay for detection of patulin is presented. Experimentally, an important part of the detection method is based on the optical technique called surface plasmon resonance (SPR). Laser beam induced interactions between probe and target molecules in the vicinity of gold surface of the biochip lead to the shift in resonance conditions and consequently to slight but easily detectable change of reflectivity.
Naccarato, Attilio; Elliani, Rosangela; Cavaliere, Brunella; Sindona, Giovanni; Tagarelli, Antonio
2018-05-11
Polyamines are aliphatic amines with low molecular weight that are widely recognized as one of the most important cancer biomarkers for early diagnosis and treatment. The goal of the work herein presented is the development of a rapid and simple method for the quantification of free polyamines (i.e., putrescine, cadaverine, spermidine, spermine) and N-monoacetylated polyamines (i.e., N 1 -Acetylspermidine, N 8 -Acetylspermidine, and N 1 -Acetylspermine) in human urine. A preliminary derivatization with propyl chloroformate combined with the use of solid phase microextraction (SPME) allowed for an easy and automatable protocol involving minimal sample handling and no consumption of organic solvents. The affinity of the analytes toward five commercial SPME coatings was evaluated in univariate mode, and the best result in terms of analyte extraction was achieved using the divinylbenzene/carboxen/polydimethylsiloxane fiber. The variables affecting the performance of SPME analysis were optimized by the multivariate approach of experimental design and, in particular, using a central composite design (CCD). The optimal working conditions in terms of response values are the following: extraction temperature 40 °C, extraction time of 15 min and no addition of NaCl. Analyses were carried out by gas chromatography-triple quadrupole mass spectrometry (GC-QqQ-MS) in selected reaction monitoring (SRM) acquisition mode. The developed method was validated according to the guidelines issued by the Food and Drug Administration (FDA). The satisfactory performances reached in terms of linearity, sensitivity (LOQs between 0.01 and 0.1 μg/mL), matrix effect (68-121%), accuracy, and precision (inter-day values between -24% and +16% and in the range 3.3-28.4%, respectively) make the proposed protocol suitable to be adopted for quantification of these important biomarkers in urine samples. Copyright © 2018 Elsevier B.V. All rights reserved.
Ichihara, Kiyoshi; Ozarda, Yesim; Barth, Julian H; Klee, George; Qiu, Ling; Erasmus, Rajiv; Borai, Anwar; Evgina, Svetlana; Ashavaid, Tester; Khan, Dilshad; Schreier, Laura; Rolle, Reynan; Shimizu, Yoshihisa; Kimura, Shogo; Kawano, Reo; Armbruster, David; Mori, Kazuo; Yadav, Binod K
2017-04-01
The IFCC Committee on Reference Intervals and Decision Limits coordinated a global multicenter study on reference values (RVs) to explore rational and harmonizable procedures for derivation of reference intervals (RIs) and investigate the feasibility of sharing RIs through evaluation of sources of variation of RVs on a global scale. For the common protocol, rather lenient criteria for reference individuals were adopted to facilitate harmonized recruitment with planned use of the latent abnormal values exclusion (LAVE) method. As of July 2015, 12 countries had completed their study with total recruitment of 13,386 healthy adults. 25 analytes were measured chemically and 25 immunologically. A serum panel with assigned values was measured by all laboratories. RIs were derived by parametric and nonparametric methods. The effect of LAVE methods is prominent in analytes which reflect nutritional status, inflammation and muscular exertion, indicating that inappropriate results are frequent in any country. The validity of the parametric method was confirmed by the presence of analyte-specific distribution patterns and successful Gaussian transformation using the modified Box-Cox formula in all countries. After successful alignment of RVs based on the panel test results, nearly half the analytes showed variable degrees of between-country differences. This finding, however, requires confirmation after adjusting for BMI and other sources of variation. The results are reported in the second part of this paper. The collaborative study enabled us to evaluate rational methods for deriving RIs and comparing the RVs based on real-world datasets obtained in a harmonized manner. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
Barton, Zachary J; Rodríguez-López, Joaquín
2017-03-07
Scanning electrochemical microscopy (SECM) is a rising technique for the study of energy storage materials. Hg-based probes allow the extension of SECM investigations to ionic processes, but the risk of irreversible Hg amalgam saturation limits their operation to rapid timescales and dilute analyte solutions. Here, we report a novel fabrication protocol for Hg disc-well ultramicroelectrodes (UMEs), which retain access to stripping information but are less susceptible to amalgam saturation than traditional Hg sphere-caps or thin-films. The amalgamation and stripping behaviors of Hg disc-well UMEs are compared to those of traditional Hg sphere-cap UMEs and corroborated with data from finite element simulations. The improved protection against amalgam saturation allows Hg disc-wells to operate safely in highly concentrated environments at long timescales. The utility of the probes for bulk measurements extends also to SECM studies, where the disc geometry facilitates small tip-substrate gaps and improves both spatial and temporal resolution. Because they can carry out slow, high-resolution anodic stripping voltammetry approaches and imaging in concentrated solutions, Hg disc-well electrodes fill a new analytical niche for studies of ionic reactivity and are a valuable addition to the electrochemical toolbox.
Vanderford, Brett J; Mawhinney, Douglas B; Trenholm, Rebecca A; Zeigler-Holady, Janie C; Snyder, Shane A
2011-02-01
Proper collection and preservation techniques are necessary to ensure sample integrity and maintain the stability of analytes until analysis. Data from improperly collected and preserved samples could lead to faulty conclusions and misinterpretation of the occurrence and fate of the compounds being studied. Because contaminants of emerging concern, such as pharmaceuticals and personal care products (PPCPs) and steroids, generally occur in surface and drinking water at ng/L levels, these compounds in particular require such protocols to accurately assess their concentrations. In this study, sample bottle types, residual oxidant quenching agents, preservation agents, and hold times were assessed for 21 PPCPs and steroids in surface water and finished drinking water. Amber glass bottles were found to have the least effect on target analyte concentrations, while high-density polyethylene bottles had the most impact. Ascorbic acid, sodium thiosulfate, and sodium sulfite were determined to be acceptable quenching agents and preservation with sodium azide at 4 °C led to the stability of the most target compounds. A combination of amber glass bottles, ascorbic acid, and sodium azide preserved analyte concentrations for 28 days in the tested matrices when held at 4 °C. Samples without a preservation agent were determined to be stable for all but two of the analytes when stored in amber glass bottles at 4 °C for 72 h. Results suggest that if improper protocols are utilized, reported concentrations of target PPCPs and steroids may be inaccurate.
Song, Yuelin; Song, Qingqing; Li, Jun; Zheng, Jiao; Li, Chun; Zhang, Yuan; Zhang, Lingling; Jiang, Yong; Tu, Pengfei
2016-07-08
Direct analysis is of great importance to understand the real chemical profile of a given sample, notably biological materials, because either chemical degradation or diverse errors and uncertainties might be resulted from sophisticated protocols. In comparison with biofluids, it is still challenging for direct analysis of solid biological samples using high performance liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS). Herein, a new analytical platform was configured by online hyphenating pressurized liquid extraction (PLE), turbulent flow chromatography (TFC), and LC-MS/MS. A facile, but robust PLE module was constructed based on the phenomenon that noticeable back-pressure can be generated during rapid fluid passing through a narrow tube. TFC column that is advantageous at extracting low molecular analytes from rushing fluid was employed to link at the outlet of the PLE module to capture constituents-of-interest. An electronic 6-port/2-position valve was introduced between TFC column and LC-MS/MS to fragment each measurement into extraction and elution phases, whereas LC-MS/MS took the charge of analyte separation and monitoring. As a proof of concept, simultaneous determination of 24 endogenous substances including eighteen steroids, five eicosanoids, and one porphyrin in feces was carried out in this paper. Method validation assays demonstrated the analytical platform to be qualified for directly simultaneous measurement of diverse endogenous analytes in fecal matrices. Application of this integrated platform on homolog-focused profiling of feces is discussed in a companion paper. Copyright © 2016 Elsevier B.V. All rights reserved.
Tan, Joel Ming Rui; Ruan, Justina Jiexin; Lee, Hiang Kwee; Phang, In Yee; Ling, Xing Yi
2014-12-28
An analytical platform with an ultratrace detection limit in the atto-molar (aM) concentration range is vital for forensic, industrial and environmental sectors that handle scarce/highly toxic samples. Superhydrophobic surface-enhanced Raman scattering (SERS) platforms serve as ideal platforms to enhance detection sensitivity by reducing the random spreading of aqueous solution. However, the fabrication of superhydrophobic SERS platforms is generally limited due to the use of sophisticated and expensive protocols and/or suffers structural and signal inconsistency. Herein, we demonstrate a high-throughput fabrication of a stable and uniform superhydrophobic SERS platform for ultratrace molecular sensing. Large-area box-like micropatterns of the polymeric surface are first fabricated using capillary force lithography (CFL). Subsequently, plasmonic properties are incorporated into the patterned surfaces by decorating with Ag nanocubes using the Langmuir-Schaefer technique. To create a stable superhydrophobic SERS platform, an additional 25 nm Ag film is coated over the Ag nanocube-decorated patterned template followed by chemical functionalization with perfluorodecanethiol. Our resulting superhydrophobic SERS platform demonstrates excellent water-repellency with a static contact angle of 165° ± 9° and a consequent analyte concentration factor of 59-fold, as compared to its hydrophilic counterpart. By combining the analyte concentration effect of superhydrophobic surfaces with the intense electromagnetic "hot spots" of Ag nanocubes, our superhydrophobic SERS platform achieves an ultra-low detection limit of 10(-17) M (10 aM) for rhodamine 6G using just 4 μL of analyte solutions, corresponding to an analytical SERS enhancement factor of 10(13). Our fabrication protocol demonstrates a simple, cost- and time-effective approach for the large-scale fabrication of a superhydrophobic SERS platform for ultratrace molecular detection.
Considerations in detecting CDC select agents under field conditions
NASA Astrophysics Data System (ADS)
Spinelli, Charles; Soelberg, Scott; Swanson, Nathaneal; Furlong, Clement; Baker, Paul
2008-04-01
Surface Plasmon Resonance (SPR) has become a widely accepted technique for real-time detection of interactions between receptor molecules and ligands. Antibody may serve as receptor and can be attached to the gold surface of the SPR device, while candidate analyte fluids contact the detecting antibody. Minute, but detectable, changes in refractive indices (RI) indicate that analyte has bound to the antibody. A decade ago, an inexpensive, robust, miniature and fully integrated SPR chip, called SPREETA, was developed. University of Washington (UW) researchers subsequently developed a portable, temperature-regulated instrument, called SPIRIT, to simultaneously use eight of these three-channel SPREETA chips. A SPIRIT prototype instrument was tested in the field, coupled to a remote reporting system on a surrogate unmanned aerial vehicle (UAV). Two target protein analytes were released sequentially as aerosols with low analyte concentration during each of three flights and were successfully detected and verified. Laboratory experimentation with a more advanced SPIRIT instrument demonstrated detection of very low levels of several select biological agents that might be employed by bioterrorists. Agent detection under field-like conditions is more challenging, especially as analyte concentrations are reduced and complex matricies are introduced. Two different sample preconditioning protocols have been developed for select agents in complex matrices. Use of these preconditioning techniques has allowed laboratory detection in spiked heavy mud of Francisella tularensis at 10 3 CFU/ml, Bacillus anthracis spores at 10 3 CFU/ml, Staphylococcal enterotoxin B (SEB) at 1 ng/ml, and Vaccinia virus (a smallpox simulant) at 10 5 PFU/ml. Ongoing experiments are aimed at simultaneous detection of multiple agents in spiked heavy mud, using a multiplex preconditioning protocol.
Analytical Models of Cross-Layer Protocol Optimization in Real-Time Wireless Sensor Ad Hoc Networks
NASA Astrophysics Data System (ADS)
Hortos, William S.
The real-time interactions among the nodes of a wireless sensor network (WSN) to cooperatively process data from multiple sensors are modeled. Quality-of-service (QoS) metrics are associated with the quality of fused information: throughput, delay, packet error rate, etc. Multivariate point process (MVPP) models of discrete random events in WSNs establish stochastic characteristics of optimal cross-layer protocols. Discrete-event, cross-layer interactions in mobile ad hoc network (MANET) protocols have been modeled using a set of concatenated design parameters and associated resource levels by the MVPPs. Characterization of the "best" cross-layer designs for a MANET is formulated by applying the general theory of martingale representations to controlled MVPPs. Performance is described in terms of concatenated protocol parameters and controlled through conditional rates of the MVPPs. Modeling limitations to determination of closed-form solutions versus explicit iterative solutions for ad hoc WSN controls are examined.
Precise determination of triple Sr isotopes (δ⁸⁷Sr and δ⁸⁸Sr) using MC-ICP-MS.
Liu, Hou-Chun; You, Chen-Feng; Huang, Kuo-Fang; Chung, Chuan-Hsiung
2012-01-15
The non-traditional stable strontium (Sr) isotopes have received increasing attention recently as new geochemical tracers for studying Sr isotopic fractionation and source identification. This has been attributed to the advancement in multiple-collector inductively coupled plasma mass spectrometry (MC-ICP-MS), allows to determine precisely and simultaneously of the triple Sr isotopes. In this study, we applied a modified empirical external normalization (EEN) MC-ICPMS procedure for mass bias correction in Sr isotopic measurement using (92)Zr/(90)Zr. High-purity Zr Standard was spiked into sample solutions and the degree of fractionation was calculated off-line using an exponential law. The long-term external reproducibility for NIST SRM 987 δ(87)Sr and δ(88)Sr was better than 0.040‰ and 0.018‰ (2SD), respectively. The IAPSO standard seawater was used as a secondary standard to validate the analytical protocol and the absolute ratios measured were 0.709161±0.000018 for (87)Sr/(86)Sr, 0.177±0.021‰ for δ(87)Sr, and 0.370±0.026‰ for δ(88)Sr (2SD, n=7). These values are in good agreement with the literature data analyzed by thermal ionization mass spectrometry (TIMS) double spike technique. Rock standards, BHVO-2, BCR-2 and AGV-2 were also analyzed to validate the robustness of the methodology and showed identical results with literature data. Compared to previous (91)Zr/(90)Zr correction, we obtained improved results based on (92)Zr/(90)Zr, probably due to similar mass difference between (92)Zr/(90)Zr and measured Sr isotopes. The new analytical protocol presented in this study not only improves the analytical precision but also increases sample efficiency by omitting the use of the standard-sample bracketing (SSB) procedure. Copyright © 2011 Elsevier B.V. All rights reserved.
One-sided measurement-device-independent quantum key distribution
NASA Astrophysics Data System (ADS)
Cao, Wen-Fei; Zhen, Yi-Zheng; Zheng, Yu-Lin; Li, Li; Chen, Zeng-Bing; Liu, Nai-Le; Chen, Kai
2018-01-01
Measurement-device-independent quantum key distribution (MDI-QKD) protocol was proposed to remove all the detector side channel attacks, while its security relies on the trusted encoding systems. Here we propose a one-sided MDI-QKD (1SMDI-QKD) protocol, which enjoys detection loophole-free advantage, and at the same time weakens the state preparation assumption in MDI-QKD. The 1SMDI-QKD can be regarded as a modified MDI-QKD, in which Bob's encoding system is trusted, while Alice's is uncharacterized. For the practical implementation, we also provide a scheme by utilizing coherent light source with an analytical two decoy state estimation method. Simulation with realistic experimental parameters shows that the protocol has a promising performance, and thus can be applied to practical QKD applications.
Boiano, J M; Wallace, M E; Sieber, W K; Groff, J H; Wang, J; Ashley, K
2000-08-01
A field study was conducted with the goal of comparing the performance of three recently developed or modified sampling and analytical methods for the determination of airborne hexavalent chromium (Cr(VI)). The study was carried out in a hard chrome electroplating facility and in a jet engine manufacturing facility where airborne Cr(VI) was expected to be present. The analytical methods evaluated included two laboratory-based procedures (OSHA Method ID-215 and NIOSH Method 7605) and a field-portable method (NIOSH Method 7703). These three methods employ an identical sampling methodology: collection of Cr(VI)-containing aerosol on a polyvinyl chloride (PVC) filter housed in a sampling cassette, which is connected to a personal sampling pump calibrated at an appropriate flow rate. The basis of the analytical methods for all three methods involves extraction of the PVC filter in alkaline buffer solution, chemical isolation of the Cr(VI) ion, complexation of the Cr(VI) ion with 1,5-diphenylcarbazide, and spectrometric measurement of the violet chromium diphenylcarbazone complex at 540 nm. However, there are notable specific differences within the sample preparation procedures used in three methods. To assess the comparability of the three measurement protocols, a total of 20 side-by-side air samples were collected, equally divided between a chromic acid electroplating operation and a spray paint operation where water soluble forms of Cr(VI) were used. A range of Cr(VI) concentrations from 0.6 to 960 microg m(-3), with Cr(VI) mass loadings ranging from 0.4 to 32 microg, was measured at the two operations. The equivalence of the means of the log-transformed Cr(VI) concentrations obtained from the different analytical methods was compared. Based on analysis of variance (ANOVA) results, no statistically significant differences were observed between mean values measured using each of the three methods. Small but statistically significant differences were observed between results obtained from performance evaluation samples for the NIOSH field method and the OSHA laboratory method.
An Organic Decontamination Method for Sampling Devices used in Life-detection Studies
NASA Technical Reports Server (NTRS)
Eigenbrode, Jennifer; Maule, Jake; Wainwright, Norm; Steele, Andrew; Amundsen, Hans E.F.
2008-01-01
Organic decontamination of sampling and storage devices is a crucial step for life-detection, habitability, and ecological investigations of extremophiles living in the most inhospitable niches of Earth, Mars and elsewhere. However, one of the main stumbling blocks for Mars-analogue life-detection studies in terrestrial remote field sites is the capability to clean instruments and sampling devices to organic levels consistent with null values. Here we present a new seven-step, multi-reagent cleaning and decontamination protocol that was adapted and tested on a glacial ice-coring device and on a rover-guided scoop used for sediment sampling, both deployed multiple times during two field seasons of the Arctic Mars Analog Svalbard Expedition (AMASE). The effectiveness of the protocols for both devices was tested by (1) in situ metabolic measurements via ATP assays, (2) in situ lipopolysaccharide (LPS) quantification via low-level endotoxin assays, and (3) laboratory-based molecular detection via gas chromatography-mass spectrometry. Our results show that the combination and step-wise application of disinfectants with oxidative and solvation properties for sterilization is effective at removing cellular remnants and other organic traces to levels necessary for molecular organic- and life-detection studies. The validation of this seven-step protocol, specifically for ice sampling, allows us to proceed with confidence in Mars-analogue investigations of icy environments. Results from the rover scoop test further showed that this protocol is also suitable for null-level decontamination of sample acquisition devices. Thus, this protocol may be applicable to a variety of sampling devices and analytical instrumentation used for future astrobiology missions to Enceladus and Europa, as well as for sample-return missions.
Rodriguez, Estrella Sanz; Poynter, Sam; Curran, Mark; Haddad, Paul R; Shellie, Robert A; Nesterenko, Pavel N; Paull, Brett
2015-08-28
Preservation of ionic species within Antarctic ice yields a unique proxy record of the Earth's climate history. Studies have until now focused on two proxies: the ionic components of sea salt aerosol and methanesulfonic acid. Measurement of all of the major ionic species in ice core samples is typically carried out by ion chromatography. Previous methods, whilst providing suitable detection limits, have been based upon off-column preconcentration techniques, requiring larger sample volumes, with potential for sample contamination and/or carryover. Here, a new capillary ion chromatography based analytical method has been developed for quantitative analysis of limited-volume Antarctic ice core samples. The developed analytical protocol applies capillary ion chromatography (with suppressed conductivity detection) and direct on-column sample injection and focusing, thus eliminating the requirement for off-column sample preconcentration. This limits the total sample volume needed to 300 μL per analysis, allowing for triplicate sample analysis with <1 mL of sample. This new approach provides a reliable and robust analytical method for the simultaneous determination of organic and inorganic anions, including fluoride, methanesulfonate, chloride, sulfate and nitrate. Application to composite ice-core samples is demonstrated, with coupling of the capillary ion chromatograph to high-resolution mass spectrometry used to confirm the presence and purity of the observed methanesulfonate peak. Copyright © 2015 Elsevier B.V. All rights reserved.
Methods for CT automatic exposure control protocol translation between scanner platforms.
McKenney, Sarah E; Seibert, J Anthony; Lamba, Ramit; Boone, John M
2014-03-01
An imaging facility with a diverse fleet of CT scanners faces considerable challenges when propagating CT protocols with consistent image quality and patient dose across scanner makes and models. Although some protocol parameters can comfortably remain constant among scanners (e.g., tube voltage, gantry rotation time), the automatic exposure control (AEC) parameter, which selects the overall mA level during tube current modulation, is difficult to match among scanners, especially from different CT manufacturers. Objective methods for converting tube current modulation protocols among CT scanners were developed. Three CT scanners were investigated: a GE LightSpeed 16 scanner, a GE VCT scanner, and a Siemens Definition AS+ scanner. Translation of the AEC parameters such as noise index and quality reference mAs across CT scanners was specifically investigated. A variable-diameter poly(methyl methacrylate) phantom was imaged on the 3 scanners using a range of AEC parameters for each scanner. The phantom consisted of 5 cylindrical sections with diameters of 13, 16, 20, 25, and 32 cm. The protocol translation scheme was based on matching either the volumetric CT dose index or image noise (in Hounsfield units) between two different CT scanners. A series of analytic fit functions, corresponding to different patient sizes (phantom diameters), were developed from the measured CT data. These functions relate the AEC metric of the reference scanner, the GE LightSpeed 16 in this case, to the AEC metric of a secondary scanner. When translating protocols between different models of CT scanners (from the GE LightSpeed 16 reference scanner to the GE VCT system), the translation functions were linear. However, a power-law function was necessary to convert the AEC functions of the GE LightSpeed 16 reference scanner to the Siemens Definition AS+ secondary scanner, because of differences in the AEC functionality designed by these two companies. Protocol translation on the basis of quantitative metrics (volumetric CT dose index or measured image noise) is feasible. Protocol translation has a dependency on patient size, especially between the GE and Siemens systems. Translation schemes that preserve dose levels may not produce identical image quality. Copyright © 2014 American College of Radiology. Published by Elsevier Inc. All rights reserved.
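A sketch (Python) of how such a translation function might be fitted from phantom data is given below; the AEC values and the resulting fit coefficients are placeholders, not the published results.

import numpy as np
from scipy.optimize import curve_fit

noise_index = np.array([8.0, 12.0, 16.0, 20.0, 24.0])          # reference scanner AEC setting
quality_ref_mas = np.array([420.0, 190.0, 105.0, 68.0, 47.0])  # secondary scanner setting giving matched CTDIvol, 32 cm phantom

def power_law(x, a, b):
    return a * np.power(x, b)

(a, b), _ = curve_fit(power_law, noise_index, quality_ref_mas, p0=[1e4, -2.0])
print(f"quality reference mAs ~ {a:.0f} * NI^({b:.2f})")       # translation function for this phantom size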
Handling nonnormality and variance heterogeneity for quantitative sublethal toxicity tests.
Ritz, Christian; Van der Vliet, Leana
2009-09-01
The advantages of using regression-based techniques to derive endpoints from environmental toxicity data are clear, and slowly, this superior analytical technique is gaining acceptance. As use of regression-based analysis becomes more widespread, some of the associated nuances and potential problems come into sharper focus. Looking at data sets that cover a broad spectrum of standard test species, we noticed that some model fits to data failed to meet two key assumptions-variance homogeneity and normality-that are necessary for correct statistical analysis via regression-based techniques. Failure to meet these assumptions often is caused by reduced variance at the concentrations showing severe adverse effects. Although commonly used with linear regression analysis, transformation of the response variable only is not appropriate when fitting data using nonlinear regression techniques. Through analysis of sample data sets, including Lemna minor, Eisenia andrei (terrestrial earthworm), and algae, we show that both the so-called Box-Cox transformation and use of the Poisson distribution can help to correct variance heterogeneity and nonnormality and so allow nonlinear regression analysis to be implemented. Both the Box-Cox transformation and the Poisson distribution can be readily implemented into existing protocols for statistical analysis. By correcting for nonnormality and variance heterogeneity, these two statistical tools can be used to encourage the transition to regression-based analysis and the depreciation of less-desirable and less-flexible analytical techniques, such as linear interpolation.
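A minimal sketch (Python) of the Box-Cox step is shown below; the response values are synthetic, and in a full transform-both-sides analysis the same transformation would also be applied to the fitted nonlinear model.

import numpy as np
from scipy import stats

response = np.array([105.0, 98.0, 84.0, 51.0, 14.0, 3.0])   # e.g. Lemna frond counts across increasing concentrations
transformed, lam = stats.boxcox(response)                    # maximum-likelihood estimate of lambda
print(f"estimated Box-Cox lambda = {lam:.2f}")               # lambda near 0 ~ log transform, near 1 ~ no transform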
NASA Astrophysics Data System (ADS)
Xu, Jingyan; Fuld, Matthew K.; Fung, George S. K.; Tsui, Benjamin M. W.
2015-04-01
Iterative reconstruction (IR) methods for x-ray CT are a promising approach to improve image quality or reduce radiation dose to patients. The goal of this work was to use task-based image quality measures and the channelized Hotelling observer (CHO) to evaluate both analytic and IR methods for clinical x-ray CT applications. We performed realistic computer simulations at five radiation dose levels, from a clinical reference low dose D0 to 25% D0. A fixed size and contrast lesion was inserted at different locations into the liver of the XCAT phantom to simulate a weak signal. The simulated data were reconstructed on a commercial CT scanner (SOMATOM Definition Flash; Siemens, Forchheim, Germany) using the vendor-provided analytic (WFBP) and IR (SAFIRE) methods. The reconstructed images were analyzed by CHOs with both rotationally symmetric (RS) and rotationally oriented (RO) channels, and with different numbers of lesion locations (5, 10, and 20) in a signal known exactly (SKE), background known exactly but variable (BKEV) detection task. The area under the receiver operating characteristic curve (AUC) was used as a summary measure to compare the IR and analytic methods; the AUC was also used as the equal performance criterion to derive the potential dose reduction factor of IR. In general, there was a good agreement in the relative AUC values of different reconstruction methods using CHOs with RS and RO channels, although the CHO with RO channels achieved higher AUCs than RS channels. The improvement of IR over analytic methods depends on the dose level. The reference dose level D0 was based on a clinical low dose protocol, lower than the standard dose due to the use of IR methods. At 75% D0, the performance improvement was statistically significant (p < 0.05). The potential dose reduction factor also depended on the detection task. For the SKE/BKEV task involving 10 lesion locations, a dose reduction of at least 25% from D0 was achieved.
SMAC: A soft MAC to reduce control overhead and latency in CDMA-based AMI networks
Garlapati, Shravan; Kuruganti, Teja; Buehrer, Michael R.; ...
2015-10-26
The utilization of state-of-the-art 3G cellular CDMA technologies in a utility-owned AMI network results in a large amount of control traffic relative to data traffic and increases the average packet delay, and hence is not an appropriate choice for smart grid distribution applications. Like the CDG, we consider a utility-owned cellular-like CDMA network for smart grid distribution applications and classify the distribution smart grid data as scheduled data and random data. We also propose the SMAC protocol, which changes its mode of operation based on the type of data being collected to reduce the data collection latency and control overhead compared to the 3G cellular CDMA2000 MAC. The reduction in data collection latency and control overhead helps increase the number of smart meters served by a base station within the periodic data collection interval, which further reduces the number of base stations needed by a utility or reduces the bandwidth needed to collect data from all the smart meters. The reduction in the number of base stations and/or the data transmission bandwidth reduces the CAPital EXpenditure (CAPEX) and OPerational EXpenditure (OPEX) of the AMI network. Finally, the proposed SMAC protocol is analyzed using a Markov chain, analytical expressions for average throughput and average packet delay are derived, and simulation results are provided to verify the analysis.
40 CFR 61.356 - Recordkeeping requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... test protocol and the means by which sampling variability and analytical variability were accounted for... also establish the design minimum and average temperature in the combustion zone and the combustion... the design minimum and average temperatures across the catalyst bed inlet and outlet. (C) For a boiler...
HYDROLYSIS OF MTBE IN GROUND WATER SAMPLES PRESERVED WITH HYDROCHLORIC ACID
Conventional sampling and analytical protocols have poor sensitivity for fuel oxygenates that are alcohols, such as TBA. Because alcohols tend to stay with the water samples, they are not efficiently transferred to the gas chromatograph for separation and analysis. A common tec...
HYDROLYSIS OF MTBE TO TBA IN GROUND WATER SAMPLES WITH HYDROCHLORIC ACID
Conventional sampling and analytical protocols have poor sensitivity for fuel oxygenates that are alcohols, such as tert-butyl alcohol (TBA). Because alcohols are miscible or highly soluble in water, alcohols are not efficiently transferred to the gas chromatograph for analysis....
Studies of ectomycorrhizal community structure have used a variety of analytical regimens including sole or partial reliance on gross morphological characterization of colonized root tips. Depending on the rigor of the classification protocol, this technique can incorrectly assig...
ANALYTICAL METHODS FOR FUEL OXYGENATES
MTBE (and potentially any other oxygenate) may be present at any petroleum UST site, whether the release is new or old, virtually anywhere in the United States. Consequently, it is prudent to analyze samples for the entire suite of oxygenates as identified in this protocol (i.e....
Noble, Stephen R; Horstwood, Matthew S A; Davy, Pamela; Pashley, Vanessa; Spiro, Baruch; Smith, Steve
2008-07-01
Pb isotope compositions of biologically significant PM(10) atmospheric particulates from a busy roadside location in London, UK were measured using solution- and laser ablation-mode MC-ICP-MS. The solution-mode data for PM(10) sampled between 1998 and 2001 document a dramatic shift to increasingly radiogenic compositions as leaded petrol was phased out. LA-MC-ICP-MS isotope analysis, piloted on a subset of the available samples, is shown to be a potential reconnaissance analytical technique. PM(10) particles trapped on quartz filters were liberated from the filter surface, without ablating the filter substrate, using a 266 nm UV laser and a dynamic, large-diameter, low-fluence ablation protocol. The Pb isotope evolution noted in the London data set obtained by both analytical protocols is similar to that observed elsewhere in Western Europe following leaded petrol elimination. The data therefore provide important baseline isotope composition information useful for continued UK atmospheric monitoring through the early 21st century.
Protocol for Tier 2 Evaluation of Vapor Intrusion at Corrective Action Sites
2012-07-01
[Table excerpts from the report: sulfur hexafluoride (SF6) tracer results by NIOSH Method 6602 (modified); the accompanying notes state that VOC and SF6 samples were analyzed by Columbia Analytical Services, Inc.]
Larson, S.J.; Capel, P.D.; VanderLoop, A.G.
1996-01-01
Laboratory and quality assurance procedures for the analysis of ground-water samples for herbicides at the Management Systems Evaluation Area near Princeton, Minnesota are described. The target herbicides include atrazine, de-ethylatrazine, de-isopropylatrazine, metribuzin, alachlor, 2,6-diethylaniline, and metolachlor. The analytical techniques used are solid-phase extraction, and analysis by gas chromatography with mass-selective detection. Descriptions of cleaning procedures, preparation of standard solutions, isolation of analytes from water, sample transfer methods, instrumental analysis, and data analysis are included.
CT protocol management: simplifying the process by using a master protocol concept.
Szczykutowicz, Timothy P; Bour, Robert K; Rubert, Nicholas; Wendt, Gary; Pozniak, Myron; Ranallo, Frank N
2015-07-08
This article explains a method for creating CT protocols for a wide range of patient body sizes and clinical indications, using detailed tube current information from a small set of commonly used protocols. Analytical expressions were created relating CT technical acquisition parameters which can be used to create new CT protocols on a given scanner or customize protocols from one scanner to another. Plots of mA as a function of patient size for specific anatomical regions were generated and used to identify the tube output needs for patients as a function of size for a single master protocol. Tube output data were obtained from the DICOM header of clinical images from our PACS and patient size was measured from CT localizer radiographs under IRB approval. This master protocol was then used to create 11 additional master protocols. The 12 master protocols were further combined to create 39 single and multiphase clinical protocols. Radiologist acceptance rate of exams scanned using the clinical protocols was monitored for 12,857 patients to analyze the effectiveness of the presented protocol management methods using a two-tailed Fisher's exact test. A single routine adult abdominal protocol was used as the master protocol to create 11 additional master abdominal protocols of varying dose and beam energy. Situations in which the maximum tube current would have been exceeded are presented, and the trade-offs between increasing the effective tube output via 1) decreasing pitch, 2) increasing the scan time, or 3) increasing the kV are discussed. Out of 12 master protocols customized across three different scanners, only one had a statistically significant acceptance rate that differed from the scanner it was customized from. The difference, however, was only 1% and was judged to be negligible. All other master protocols differed in acceptance rate insignificantly between scanners. The methodology described in this paper allows a small set of master protocols to be adapted among different clinical indications on a single scanner and among different CT scanners.
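A sketch (Python) of how a master-protocol tube-output curve might be derived from such data is given below; the (patient size, mA) pairs and the exponential form are illustrative assumptions, not the authors' fitted values.

import numpy as np
from scipy.optimize import curve_fit

patient_width_cm = np.array([20.0, 25.0, 30.0, 35.0, 40.0])   # from CT localizer radiographs
mean_ma          = np.array([60.0, 95.0, 150.0, 240.0, 380.0])  # from DICOM headers of accepted exams

def exp_model(w, a, b):
    return a * np.exp(b * w)

(a, b), _ = curve_fit(exp_model, patient_width_cm, mean_ma, p0=[10.0, 0.09])
print(f"mA(width) ~ {a:.1f} * exp({b:.3f} * width)")

# e.g. scale the whole curve by 1.25 to build a higher-dose master protocol,
# checking that the scanner's maximum tube current is not exceeded
print(exp_model(45.0, a, b) * 1.25)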
Davatzikos, Christos; Rathore, Saima; Bakas, Spyridon; Pati, Sarthak; Bergman, Mark; Kalarot, Ratheesh; Sridharan, Patmaa; Gastounioti, Aimilia; Jahani, Nariman; Cohen, Eric; Akbari, Hamed; Tunc, Birkan; Doshi, Jimit; Parker, Drew; Hsieh, Michael; Sotiras, Aristeidis; Li, Hongming; Ou, Yangming; Doot, Robert K; Bilello, Michel; Fan, Yong; Shinohara, Russell T; Yushkevich, Paul; Verma, Ragini; Kontos, Despina
2018-01-01
The growth of multiparametric imaging protocols has paved the way for quantitative imaging phenotypes that predict treatment response and clinical outcome, reflect underlying cancer molecular characteristics and spatiotemporal heterogeneity, and can guide personalized treatment planning. This growth has underlined the need for efficient quantitative analytics to derive high-dimensional imaging signatures of diagnostic and predictive value in this emerging era of integrated precision diagnostics. This paper presents cancer imaging phenomics toolkit (CaPTk), a new and dynamically growing software platform for analysis of radiographic images of cancer, currently focusing on brain, breast, and lung cancer. CaPTk leverages the value of quantitative imaging analytics along with machine learning to derive phenotypic imaging signatures, based on two-level functionality. First, image analysis algorithms are used to extract comprehensive panels of diverse and complementary features, such as multiparametric intensity histogram distributions, texture, shape, kinetics, connectomics, and spatial patterns. At the second level, these quantitative imaging signatures are fed into multivariate machine learning models to produce diagnostic, prognostic, and predictive biomarkers. Results from clinical studies in three areas are shown: (i) computational neuro-oncology of brain gliomas for precision diagnostics, prediction of outcome, and treatment planning; (ii) prediction of treatment response for breast and lung cancer, and (iii) risk assessment for breast cancer.
Importance of implementing an analytical quality control system in a core laboratory.
Marques-Garcia, F; Garcia-Codesal, M F; Caro-Narros, M R; Contreras-SanFeliciano, T
2015-01-01
The aim of the clinical laboratory is to provide useful information for screening, diagnosis and monitoring of disease. The laboratory should ensure the quality of extra-analytical and analytical process, based on set criteria. To do this, it develops and implements a system of internal quality control, designed to detect errors, and compare its data with other laboratories, through external quality control. In this way it has a tool to detect the fulfillment of the objectives set, and in case of errors, allowing corrective actions to be made, and ensure the reliability of the results. This article sets out to describe the design and implementation of an internal quality control protocol, as well as its periodical assessment intervals (6 months) to determine compliance with pre-determined specifications (Stockholm Consensus(1)). A total of 40 biochemical and 15 immunochemical methods were evaluated using three different control materials. Next, a standard operation procedure was planned to develop a system of internal quality control that included calculating the error of the analytical process, setting quality specifications, and verifying compliance. The quality control data were then statistically depicted as means, standard deviations, and coefficients of variation, as well as systematic, random, and total errors. The quality specifications were then fixed and the operational rules to apply in the analytical process were calculated. Finally, our data were compared with those of other laboratories through an external quality assurance program. The development of an analytical quality control system is a highly structured process. This should be designed to detect errors that compromise the stability of the analytical process. The laboratory should review its quality indicators, systematic, random and total error at regular intervals, in order to ensure that they are meeting pre-determined specifications, and if not, apply the appropriate corrective actions. Copyright © 2015 SECA. Published by Elsevier Espana. All rights reserved.
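The basic calculations behind such an internal quality control protocol can be sketched as follows (Python); the control values are illustrative, and the total-error convention (bias + 1.65 CV) is an assumption rather than the laboratory's documented choice.

import statistics

control_results = [5.1, 5.3, 5.0, 5.2, 5.4, 5.1, 5.2, 5.3]   # measured values of one control level
target_value = 5.0                                            # assigned / peer-group mean

mean = statistics.mean(control_results)
sd = statistics.stdev(control_results)
cv_pct = 100.0 * sd / mean                                 # random error (imprecision)
bias_pct = 100.0 * (mean - target_value) / target_value    # systematic error
total_error_pct = bias_pct + 1.65 * cv_pct                 # total analytical error estimate

print(f"CV = {cv_pct:.2f}%, bias = {bias_pct:.2f}%, TE = {total_error_pct:.2f}%")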
Kumar, B. Vinodh; Mohan, Thuthi
2018-01-01
OBJECTIVE: Six Sigma is one of the most popular quality management system tools employed for process improvement. The Six Sigma methods are usually applied when the outcome of the process can be measured. This study was done to assess the performance of individual biochemical parameters on a Sigma Scale by calculating the sigma metrics for individual parameters and to follow the Westgard guidelines for appropriate Westgard rules and levels of internal quality control (IQC) that needs to be processed to improve target analyte performance based on the sigma metrics. MATERIALS AND METHODS: This is a retrospective study, and data required for the study were extracted between July 2015 and June 2016 from a Secondary Care Government Hospital, Chennai. The data obtained for the study are IQC - coefficient of variation percentage and External Quality Assurance Scheme (EQAS) - Bias% for 16 biochemical parameters. RESULTS: For the level 1 IQC, four analytes (alkaline phosphatase, magnesium, triglyceride, and high-density lipoprotein-cholesterol) showed an ideal performance of ≥6 sigma level, five analytes (urea, total bilirubin, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level and for level 2 IQCs, same four analytes of level 1 showed a performance of ≥6 sigma level, and four analytes (urea, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level. For all analytes <6 sigma level, the quality goal index (QGI) was <0.8 indicating the area requiring improvement to be imprecision except cholesterol whose QGI >1.2 indicated inaccuracy. CONCLUSION: This study shows that sigma metrics is a good quality tool to assess the analytical performance of a clinical chemistry laboratory. Thus, sigma metric analysis provides a benchmark for the laboratory to design a protocol for IQC, address poor assay performance, and assess the efficiency of existing laboratory processes. PMID:29692587
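The sigma-metric and quality goal index (QGI) calculations referred to above can be sketched as follows (Python); the allowable total error, bias and CV values are illustrative.

def sigma_metric(tea_pct, bias_pct, cv_pct):
    # sigma = (allowable total error - bias) / imprecision
    return (tea_pct - bias_pct) / cv_pct

def quality_goal_index(bias_pct, cv_pct):
    # QGI < 0.8 -> imprecision problem, > 1.2 -> inaccuracy, in between -> both
    return bias_pct / (1.5 * cv_pct)

tea, bias, cv = 10.0, 2.0, 1.2        # e.g. an allowable total error of 10%
print(f"sigma = {sigma_metric(tea, bias, cv):.1f}")
print(f"QGI   = {quality_goal_index(bias, cv):.2f}")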
Operator assistant systems - An experimental approach using a telerobotics application
NASA Technical Reports Server (NTRS)
Boy, Guy A.; Mathe, Nathalie
1993-01-01
This article presents a knowledge-based system methodology for developing operator assistant (OA) systems in dynamic and interactive environments. This is a problem both of training and design, which is the subject of this article. Design includes both design of the system to be controlled and design of procedures for operating this system. A specific knowledge representation is proposed for representing the corresponding system and operational knowledge. This representation is based on the situation recognition and analytical reasoning paradigm. It tries to make explicit common factors involved in both human and machine intelligence, including perception and reasoning. An OA system based on this representation has been developed for space telerobotics. Simulations have been carried out with astronauts and the resulting protocols have been analyzed. Results show the relevance of the approach and have been used for improving the knowledge representation and the OA architecture.
Rodriguez-Canales, Jaime; Hanson, Jeffrey C; Hipp, Jason D; Balis, Ulysses J; Tangrea, Michael A; Emmert-Buck, Michael R; Bova, G Steven
2013-01-01
Isolation of well-preserved pure cell populations is a prerequisite for sound studies of the molecular basis of any tissue-based biological phenomenon. This updated chapter reviews current methods for obtaining anatomically specific signals from molecules isolated from tissues, a basic requirement for productive linking of phenotype and genotype. The quality of samples isolated from tissue and used for molecular analysis is often glossed over or omitted from publications, making interpretation and replication of data difficult or impossible. Fortunately, recently developed techniques allow life scientists to better document and control the quality of samples used for a given assay, creating a foundation for improvement in this area. Tissue processing for molecular studies usually involves some or all of the following steps: tissue collection, gross dissection/identification, fixation, processing/embedding, storage/archiving, sectioning, staining, microdissection/annotation, and pure analyte labeling/identification and quantification. We provide a detailed comparison of some current tissue microdissection technologies and provide detailed example protocols for tissue component handling upstream and downstream from microdissection. We also discuss some of the physical and chemical issues related to optimal tissue processing and include methods specific to cytology specimens. We encourage each laboratory to use these as a starting point for optimization of their overall process of moving from collected tissue to high-quality, appropriately anatomically tagged scientific results. Improvement in this area will significantly increase life science quality and productivity. The chapter is divided into introduction, materials, protocols, and notes subheadings. Because many protocols are covered in each of these sections, information relating to a single protocol is not contiguous. To get the greatest benefit from this chapter, readers are advised to read through the entire chapter first, identify protocols appropriate to their laboratory for each step in their workflow, and then reread entries in each section pertaining to each of these single protocols.
Algorithms and software for U-Pb geochronology by LA-ICPMS
NASA Astrophysics Data System (ADS)
McLean, Noah M.; Bowring, James F.; Gehrels, George
2016-07-01
The past 15 years have produced numerous innovations in geochronology, including experimental methods, instrumentation, and software that are revolutionizing the acquisition and application of geochronological data. For example, exciting advances are being driven by Laser-Ablation ICP Mass Spectrometry (LA-ICPMS), which allows for rapid determination of U-Th-Pb ages with 10s of micrometer-scale spatial resolution. This method has become the most commonly applied tool for dating zircons, constraining a host of geological problems. The LA-ICPMS community is now faced with archiving these data with associated analytical results and, more importantly, ensuring that data meet the highest standards for precision and accuracy and that interlaboratory biases are minimized. However, there is little consensus with regard to analytical strategies and data reduction protocols for LA-ICPMS geochronology. The result is systematic interlaboratory bias and both underestimation and overestimation of uncertainties on calculated dates that, in turn, decrease the value of data in repositories such as EarthChem, which archives data and analytical results from participating laboratories. We present free open-source software that implements new algorithms for evaluating and resolving many of these discrepancies. This solution is the result of a collaborative effort to extend the U-Pb_Redux software for the ID-TIMS community to the LA-ICPMS community. Now named ET_Redux, our new software automates the analytical and scientific workflows of data acquisition, statistical filtering, data analysis and interpretation, publication, community-based archiving, and the compilation and comparison of data from different laboratories to support collaborative science.
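The core date calculation that such software automates can be illustrated as follows (Python); the decay constant is the standard Jaffey et al. value, but the corrections and uncertainty propagation performed by ET_Redux are not reproduced here.

import math

LAMBDA_238 = 1.55125e-10          # 238U decay constant, 1/year

def pb206_u238_date_ma(ratio_206_238):
    # radiogenic 206Pb*/238U converted to a date via the decay equation, in Ma
    return math.log(1.0 + ratio_206_238) / LAMBDA_238 / 1.0e6

print(pb206_u238_date_ma(0.0165))  # ~105 Ma for this illustrative ratio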
Polling, Saskia; Hatters, Danny M; Mok, Yee-Foong
2013-01-01
Defining the aggregation process of proteins formed by poly-amino acid repeats in cells remains a challenging task due to a lack of robust techniques for their isolation and quantitation. Sedimentation velocity methodology using fluorescence detected analytical ultracentrifugation is one approach that can offer significant insight into aggregation formation and kinetics. While this technique has traditionally been used with purified proteins, it is now possible for substantial information to be collected with studies using cell lysates expressing a GFP-tagged protein of interest. In this chapter, we describe protocols for sample preparation and setting up the fluorescence detection system in an analytical ultracentrifuge to perform sedimentation velocity experiments on cell lysates containing aggregates formed by poly-amino acid repeat proteins.
Qin, Guoxin; Zhao, Shulin; Huang, Yong; Jiang, Jing; Ye, Fanggui
2012-03-20
A competitive immunoassay based on chemiluminescence resonance energy transfer (CRET) on magnetic beads (MBs) is developed for the detection of human immunoglobulin G (IgG). In this protocol, carboxyl-modified MBs were conjugated with horseradish peroxidase (HRP)-labeled goat antihuman IgG (HRP-anti-IgG) and incubated with a limited amount of fluorescein isothiocyanate (FITC)-labeled human IgG to immobilize the antibody-antigen immune complex on the surface of the MBs, which was further incubated with the target analyte (human IgG) for competitive immunoreaction and separated magnetically to remove the supernatant. The chemiluminescence (CL) buffer (containing luminol and H(2)O(2)) was then added, and the CRET from donor luminol to acceptor FITC in the immunocomplex on the surface of MBs occurred immediately. The present protocol was evaluated for the competitive immunoassay of human IgG, and a linear relationship between CL intensity ratio (R = I(425)/I(525)) and human IgG concentration in the range of 0.2-4.0 nM was obtained with a correlation coefficient of 0.9965. The regression equation was expressed as R = 1.9871C + 2.4616, and a detection limit of 2.9 × 10(-11) M was obtained. The present method was successfully applied for the detection of IgG in human serum. The results indicate that the present protocol is quite promising for the application of CRET in immunoassays. It could also be developed for detection of other antigen-antibody immune complexes by using the corresponding antigens and respective antibodies.
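Back-calculation of an IgG concentration from the reported calibration equation can be sketched as follows (Python); the measured ratio used here is a made-up example.

def igg_concentration_nM(ratio):
    # reported calibration: R = 1.9871*C + 2.4616, C in nM, linear over 0.2-4.0 nM
    slope, intercept = 1.9871, 2.4616
    c = (ratio - intercept) / slope
    if not (0.2 <= c <= 4.0):
        raise ValueError("outside the validated linear range (0.2-4.0 nM)")
    return c

print(f"{igg_concentration_nM(6.5):.2f} nM")   # ~2.03 nM for an example ratio of 6.5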
Infinite horizon optimal impulsive control with applications to Internet congestion control
NASA Astrophysics Data System (ADS)
Avrachenkov, Konstantin; Habachi, Oussama; Piunovskiy, Alexey; Zhang, Yi
2015-04-01
We investigate infinite-horizon deterministic optimal control problems with both gradual and impulsive controls, where any finitely many impulses are allowed simultaneously. Both discounted and long-run time-average criteria are considered. We establish very general and at the same time natural conditions, under which the dynamic programming approach results in an optimal feedback policy. The established theoretical results are applied to the Internet congestion control, and by solving analytically and nontrivially the underlying optimal control problems, we obtain a simple threshold-based active queue management scheme, which takes into account the main parameters of the transmission control protocols, and improves the fairness among the connections in a given network.
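A toy simulation (Python) of such a threshold-based queue management rule is sketched below; the arrival and service probabilities and the threshold value are illustrative and do not reproduce the paper's model.

from collections import deque
import random

THRESHOLD = 50          # threshold on queue length (packets) above which arrivals are dropped

queue, dropped = deque(), 0
random.seed(1)
for t in range(10_000):
    if random.random() < 0.55:                  # packet arrival
        if len(queue) < THRESHOLD:
            queue.append(t)
        else:
            dropped += 1                        # AQM drop signals TCP senders to back off
    if queue and random.random() < 0.5:         # packet departure (service)
        queue.popleft()

print(f"final queue = {len(queue)}, dropped = {dropped}")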
NASA Astrophysics Data System (ADS)
Vermersch, B.; Elben, A.; Dalmonte, M.; Cirac, J. I.; Zoller, P.
2018-02-01
We present a general framework for the generation of random unitaries based on random quenches in atomic Hubbard and spin models, forming approximate unitary n -designs, and their application to the measurement of Rényi entropies. We generalize our protocol presented in Elben et al. [Phys. Rev. Lett. 120, 050406 (2018), 10.1103/PhysRevLett.120.050406] to a broad class of atomic and spin-lattice models. We further present an in-depth numerical and analytical study of experimental imperfections, including the effect of decoherence and statistical errors, and discuss connections of our approach with many-body quantum chaos.
Bioreactor Steroid Production and Analysis of Date Palm Embryogenic Callus.
El-Sharabasy, Sherif; El-Dawayati, Maiada
2017-01-01
Several compounds and families of compounds of date palm secondary metabolites have been investigated. The analysis of date palm tissue has shown the abundance of secondary metabolites including phytosterols, e.g., steroids, an important group of pharmaceutical compounds. Biotechnology offers the opportunity to utilize cells, tissues, and organs grown in vitro and manipulated to obtain desired compounds. This chapter presents a protocol for the production, determination, and identification of steroids in date palm callus tissue. The addition of 0.01 mg/L pyruvic acid as a precursor to MS liquid culture medium enhances steroid production. In addition, the chapter describes the sterol analytical techniques based on gas-liquid chromatography and gas chromatography-mass spectrometry.
Error recovery in shared memory multiprocessors using private caches
NASA Technical Reports Server (NTRS)
Wu, Kun-Lung; Fuchs, W. Kent; Patel, Janak H.
1990-01-01
The problem of recovering from processor transient faults in shared-memory multiprocessor systems is examined. A user-transparent checkpointing and recovery scheme using private caches is presented. Processes can recover from errors due to faulty processors by restarting from the checkpointed computation state. Implementation techniques using checkpoint identifiers and recovery stacks are examined as a means of reducing performance degradation in processor utilization during normal execution. This cache-based checkpointing technique prevents rollback propagation, provides rapid recovery, and can be integrated into standard cache coherence protocols. An analytical model is used to estimate the relative performance of the scheme during normal execution. Extensions to take error latency into account are presented.
Systems Biology Approach in Hypertension Research.
Delles, Christian; Husi, Holger
2017-01-01
Systems biology is an approach to study all genes, gene transcripts, proteins, metabolites, and their interactions in specific cells, tissues, organs, or the whole organism. It is based on data derived from high-throughput analytical technologies and bioinformatics tools to analyze these data, and aims to understand the whole system rather than individual aspects of it. Systems biology can be applied to virtually all conditions and diseases and therefore also to hypertension and its underlying vascular disorders. Unlike other methods in this book there is no clear-cut protocol to explain a systems biology approach. We will instead outline some of the most important and common steps in the generation and analysis of systems biology data.
Time- and cost-saving apparatus for analytical sample filtration
William R. Kenealy; Joseph C. Destree
2005-01-01
Simple and cost-effective protocols were developed for removing particulates from samples prior to analysis by high performance liquid chromatography and gas chromatography. A filter and vial holder were developed for use with a 96-well filtration plate. The device saves preparation time and costs.
Xu, Chang; Nezami Ranjbar, Mohammad R; Wu, Zhong; DiCarlo, John; Wang, Yexun
2017-01-03
Detection of DNA mutations at very low allele fractions with high accuracy will significantly improve the effectiveness of precision medicine for cancer patients. To achieve this goal through next generation sequencing, researchers need a detection method that 1) captures rare mutation-containing DNA fragments efficiently in the mix of abundant wild-type DNA; 2) sequences the DNA library extensively to deep coverage; and 3) distinguishes low level true variants from amplification and sequencing errors with high accuracy. Targeted enrichment using PCR primers provides researchers with a convenient way to achieve deep sequencing for a small, yet most relevant region using benchtop sequencers. Molecular barcoding (or indexing) provides a unique solution for reducing sequencing artifacts analytically. Although different molecular barcoding schemes have been reported in recent literature, most variant calling has been done on limited targets, using simple custom scripts. The analytical performance of barcode-aware variant calling can be significantly improved by incorporating advanced statistical models. We present here a highly efficient, simple and scalable enrichment protocol that integrates molecular barcodes in multiplex PCR amplification. In addition, we developed smCounter, an open source, generic, barcode-aware variant caller based on a Bayesian probabilistic model. smCounter was optimized and benchmarked on two independent read sets with SNVs and indels at 5 and 1% allele fractions. Variants were called with very good sensitivity and specificity within coding regions. We demonstrated that we can accurately detect somatic mutations with allele fractions as low as 1% in coding regions using our enrichment protocol and variant caller.
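The barcode-aware counting idea can be illustrated with a toy example (Python): reads are grouped by molecular barcode (UMI), each barcode votes with its consensus base, and a variant is supported only by barcodes whose reads agree. This grouping-and-consensus step is a simplified stand-in and does not reproduce the smCounter Bayesian model.

from collections import Counter, defaultdict

# (barcode, base observed at the position of interest)
reads = [("AACGT", "A"), ("AACGT", "A"), ("AACGT", "A"),
         ("TTGCA", "A"), ("TTGCA", "G"),          # disagreement within a barcode -> discarded
         ("GGATC", "G"), ("GGATC", "G")]

by_barcode = defaultdict(list)
for umi, base in reads:
    by_barcode[umi].append(base)

consensus = Counter()
for umi, bases in by_barcode.items():
    base, count = Counter(bases).most_common(1)[0]
    if count / len(bases) >= 0.8:                 # require near-unanimous agreement within the barcode
        consensus[base] += 1

print(dict(consensus))   # {'A': 1, 'G': 1}; allele fraction is estimated from barcode counts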
Martelat, Benoit; Isnard, Helene; Vio, Laurent; Dupuis, Erwan; Cornet, Terence; Nonell, Anthony; Chartier, Frederic
2018-06-22
Precise isotopic and elemental characterization of spent nuclear fuel is a major concern for the validation of the neutronic calculation codes and waste management strategy in the nuclear industry. Generally, the elements of interest, particularly U and Pu which are the two major elements present in spent fuel, are purified by ion exchange or extractant resins before off-line measurements by thermal ionization mass spectrometry (TIMS). The aim of the present work was to develop a new analytical approach based on capillary electrophoresis (CE) hyphenated to a multicollector inductively coupled plasma mass spectrometer (MC-ICPMS) for online isotope ratio measurements. An electrophoretic separation protocol of U, Pu and the fraction containing fission products and minor actinides (Am and Cm) was developed using acetic acid as the electrolyte and complexing agent. The instrumentation for CE was designed to be used in a glove box and a laboratory-built interface was developed for hyphenation with MC-ICPMS. The separation was realized with only a few nL of a solution of spent nuclear fuel and the reproducibilities obtained on the U and Pu isotope ratios were on the order of a few ‰, which is comparable to those obtained by TIMS. This innovative protocol allowed a tremendous reduction of the analyte masses from μg to ng and also a drastic reduction of the liquid waste production from mL to μL. In addition, the time of analysis was shortened by at least a factor of three. All of these improved parameters are of major interest for nuclear applications.
Spectrophotometric Analysis of Phenolic Compounds in Grapes and Wines.
Aleixandre-Tudo, Jose Luis; Buica, Astrid; Nieuwoudt, Helene; Aleixandre, Jose Luis; du Toit, Wessel
2017-05-24
Phenolic compounds are of crucial importance for red wine color and mouthfeel attributes. A large number of enzymatic and chemical reactions involving phenolic compounds take place during winemaking and aging. Despite the large number of published analytical methods for phenolic analyses, the values obtained may vary considerably. In addition, the existing scientific knowledge needs to be updated, but also critically evaluated and simplified for newcomers and wine industry partners. The most used and widely cited spectrophotometric methods for grape and wine phenolic analysis were identified through a bibliometric search using the Science Citation Index-Expanded (SCIE) database accessed through the Web of Science (WOS) platform from Thompson Reuters. The selection of spectrophotometry was based on its ease of use as a routine analytical technique. On the basis of the number of citations, as well as the advantages and disadvantages reported, the modified Somers assay appears as a multistep, simple, and robust procedure that provides a good estimation of the state of the anthocyanins equilibria. Precipitation methods for total tannin levels have also been identified as preferred protocols for these types of compounds. Good reported correlations between methods (methylcellulose precipitable vs bovine serum albumin) and between these and perceived red wine astringency, in combination with the adaptation to high-throughput format, make them suitable for routine analysis. The bovine serum albumin tannin assay also allows for the estimation of the anthocyanins content with the measurement of small and large polymeric pigments. Finally, the measurement of wine color using the CIELab space approach is also suggested as the protocol of choice as it provides good insight into the wine's color properties.
Tanase, Maya; Zolla, Valerio; Clement, Cristina C; Borghi, Francesco; Urbanska, Aleksandra M; Rodriguez-Navarro, Jose Antonio; Roda, Barbara; Zattoni, Andrea; Reschiglian, Pierluigi; Cuervo, Ana Maria; Santambrogio, Laura
2016-01-01
Herein we describe a protocol that uses hollow-fiber flow field-flow fractionation (FFF) coupled with multiangle light scattering (MALS) for hydrodynamic size-based separation and characterization of complex protein aggregates. The fractionation method, which requires 1.5 h to run, was successfully modified from the analysis of protein aggregates, as found in simple protein mixtures, to complex aggregates, as found in total cell lysates. In contrast to other related methods (filter assay, analytical ultracentrifugation, gel electrophoresis and size-exclusion chromatography), hollow-fiber flow FFF coupled with MALS allows a flow-based fractionation of highly purified protein aggregates and simultaneous measurement of their molecular weight, r.m.s. radius and molecular conformation (e.g., round, rod-shaped, compact or relaxed). The polyethersulfone hollow fibers used, which have a 0.8-mm inner diameter, allow separation of as little as 20 μg of total cell lysates. In addition, the ability to run the samples in different denaturing and nondenaturing buffer allows defining true aggregates from artifacts, which can form during sample preparation. The protocol was set up using Paraquat-induced carbonylation, a model that induces protein aggregation in cultured cells. This technique will advance the biochemical, proteomic and biophysical characterization of molecular-weight aggregates associated with protein mutations, as found in many CNS degenerative diseases, or chronic oxidative stress, as found in aging, and chronic metabolic and inflammatory conditions. PMID:25521790
McDade, Thomas W; Williams, Sharon; Snodgrass, J Josh
2007-11-01
Logistical constraints associated with the collection and analysis of biological samples in community-based settings have been a significant impediment to integrative, multilevel bio-demographic and biobehavioral research. However, recent methodological developments have overcome many of these constraints and have also expanded the options for incorporating biomarkers into population-based health research in international as well as domestic contexts. In particular, using dried blood spot (DBS) samples (drops of whole blood collected on filter paper from a simple finger prick) provides a minimally invasive method for collecting blood samples in nonclinical settings. After a brief discussion of biomarkers more generally, we review procedures for collecting, handling, and analyzing DBS samples. Advantages of using DBS samples compared with venipuncture include the relative ease and low cost of sample collection, transport, and storage. Disadvantages include requirements for assay development and validation as well as the relatively small volumes of sample. We present the results of a comprehensive literature review of published protocols for analysis of DBS samples, and we provide more detailed analysis of protocols for 45 analytes likely to be of particular relevance to population-level health research. Our objective is to provide investigators with the information they need to make informed decisions regarding the appropriateness of blood spot methods for their research interests.
Nie, Shuai; Benito-Peña, Elena; Zhang, Huaibin; Wu, Yue; Walt, David R
2013-10-10
Herein, we describe a protocol for simultaneously measuring six proteins in saliva using a fiber-optic microsphere-based antibody array. The immuno-array technology employed combines the advantages of microsphere-based suspension array fabrication with the use of fluorescence microscopy. As described in the video protocol, commercially available 4.5 μm polymer microspheres were encoded into seven different types, differentiated by the concentration of two fluorescent dyes physically trapped inside the microspheres. The encoded microspheres containing surface carboxyl groups were modified with monoclonal capture antibodies through EDC/NHS coupling chemistry. To assemble the protein microarray, the different types of encoded and functionalized microspheres were mixed and randomly deposited in 4.5 μm microwells, which were chemically etched at the proximal end of a fiber-optic bundle. The fiber-optic bundle was used as both a carrier and for imaging the microspheres. Once assembled, the microarray was used to capture proteins in the saliva supernatant collected from the clinic. The detection was based on a sandwich immunoassay using a mixture of biotinylated detection antibodies for different analytes with a streptavidin-conjugated fluorescent probe, R-phycoerythrin. The microarray was imaged by fluorescence microscopy in three different channels, two for microsphere registration and one for the assay signal. The fluorescence micrographs were then decoded and analyzed using a homemade algorithm in MATLAB.
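The bead-decoding step can be illustrated with a toy nearest-centroid classifier (Python) in the two encoding-dye channels; the centroids, bead readings and analyte names are invented, and this is not the original MATLAB algorithm.

import math

centroids = {            # (dye1, dye2) mean intensities for three hypothetical bead types
    "analyte_A": (200.0, 50.0),
    "analyte_B": (200.0, 400.0),
    "analyte_C": (800.0, 50.0),
}

def classify(dye1, dye2):
    # assign a bead to the encoding centroid nearest to its two registration-channel intensities
    return min(centroids, key=lambda k: math.dist((dye1, dye2), centroids[k]))

beads = [(190.0, 60.0, 1200.0), (810.0, 45.0, 300.0)]   # (dye1, dye2, assay-channel signal)
for d1, d2, signal in beads:
    print(classify(d1, d2), signal)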
Gated Silica Mesoporous Materials in Sensing Applications.
Sancenón, Félix; Pascual, Lluís; Oroval, Mar; Aznar, Elena; Martínez-Máñez, Ramón
2015-08-01
Silica mesoporous supports (SMSs) have a large specific surface area and volume and are particularly exciting vehicles for delivery applications. Such container-like structures can be loaded with numerous different chemical substances, such as drugs and reporters. Gated systems also contain addressable functions at openings of voids, and cargo delivery can be controlled on-command using chemical, biochemical or physical stimuli. Many of these gated SMSs have been applied for drug delivery. However, fewer examples of their use in sensing protocols have been reported. The approach of applying SMSs in sensing uses another concept-that of loading pores with a reporter and designing a capping mechanism that is selectively opened in the presence of a target analyte, which results in the delivery of the reporter. According to this concept, we provide herein a complete compilation of published examples of probes based on the use of capped SMSs for sensing. Examples for the detection of anions, cations, small molecules and biomolecules are provided. The diverse range of gated silica mesoporous materials presented here highlights their usefulness in recognition protocols.
Unice, Kenneth M; Kreider, Marisa L; Panko, Julie M
2012-11-08
Pyrolysis(pyr)-GC/MS analysis of characteristic thermal decomposition fragments has been previously used for qualitative fingerprinting of organic sources in environmental samples. A quantitative pyr-GC/MS method based on characteristic tire polymer pyrolysis products was developed for tread particle quantification in environmental matrices including soil, sediment, and air. The feasibility of quantitative pyr-GC/MS analysis of tread was confirmed in a method evaluation study using artificial soil spiked with known amounts of cryogenically generated tread. Tread concentration determined by blinded analyses was highly correlated (r2 ≥ 0.88) with the known tread spike concentration. Two critical refinements to the initial pyrolysis protocol were identified including use of an internal standard and quantification by the dimeric markers vinylcyclohexene and dipentene, which have good specificity for rubber polymer with no other appreciable environmental sources. A novel use of deuterated internal standards of similar polymeric structure was developed to correct the variable analyte recovery caused by sample size, matrix effects, and ion source variability. The resultant quantitative pyr-GC/MS protocol is reliable and transferable between laboratories.
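Quantification with a deuterated internal standard can be sketched as follows (Python); the peak areas, spike mass and response factor are placeholders.

def tread_mass_ug(analyte_area, istd_area, istd_mass_ug, response_factor):
    # analyte response normalized to the internal standard, converted to mass via a calibration response factor
    normalized = analyte_area / istd_area
    return normalized * istd_mass_ug / response_factor

# response_factor would be determined from calibration standards of cryogenically
# generated tread spiked into clean matrix
print(tread_mass_ug(analyte_area=5.2e5, istd_area=2.6e5, istd_mass_ug=10.0,
                    response_factor=0.85))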
Surinova, Silvia; Hüttenhain, Ruth; Chang, Ching-Yun; Espona, Lucia; Vitek, Olga; Aebersold, Ruedi
2013-08-01
Targeted proteomics based on selected reaction monitoring (SRM) mass spectrometry is commonly used for accurate and reproducible quantification of protein analytes in complex biological mixtures. Strictly hypothesis-driven, SRM assays quantify each targeted protein by collecting measurements on its peptide fragment ions, called transitions. To achieve sensitive and accurate quantitative results, experimental design and data analysis must consistently account for the variability of the quantified transitions. This consistency is especially important in large experiments, which increasingly require profiling up to hundreds of proteins over hundreds of samples. Here we describe a robust and automated workflow for the analysis of large quantitative SRM data sets that integrates data processing, statistical protein identification and quantification, and dissemination of the results. The integrated workflow combines three software tools: mProphet for peptide identification via probabilistic scoring; SRMstats for protein significance analysis with linear mixed-effect models; and PASSEL, a public repository for storage, retrieval and query of SRM data. The input requirements for the protocol are files with SRM traces in mzXML format, and a file with a list of transitions in a text tab-separated format. The protocol is especially suited for data with heavy isotope-labeled peptide internal standards. We demonstrate the protocol on a clinical data set in which the abundances of 35 biomarker candidates were profiled in 83 blood plasma samples of subjects with ovarian cancer or benign ovarian tumors. The time frame to realize the protocol is 1-2 weeks, depending on the number of replicates used in the experiment.
Zou, Lili; Shen, Kaini; Zhong, Dingrong; Zhou, Daobin; Sun, Wei; Li, Jian
2015-01-01
Laser microdissection followed by mass spectrometry has been successfully used for amyloid typing. However, sample contamination can interfere with proteomic analysis, and overnight digestion limits the analytical throughput. Moreover, current quantitative analysis methods are based on the spectrum count, which ignores differences in protein length and may lead to misdiagnoses. Here, we developed a microwave-assisted filter-aided sample preparation (maFASP) method that can efficiently remove contaminants with a 10-kDa cutoff ultrafiltration unit and can accelerate the digestion process with the assistance of a microwave. Additionally, two parameters (P- and D-scores) based on the exponentially modified protein abundance index were developed to define the existence of amyloid deposits and those causative proteins with the greatest abundance. Using our protocol, twenty cases of systemic amyloidosis that were well-typed according to clinical diagnostic standards (training group) and another twenty-four cases without subtype diagnoses (validation group) were analyzed. Using this approach, sample preparation could be completed within four hours. We successfully subtyped 100% of the cases in the training group, and the diagnostic success rate in the validation group was 91.7%. This maFASP-aided proteomic protocol represents an efficient approach for amyloid diagnosis and subtyping, particularly for serum-contaminated samples. PMID:25984759
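The exponentially modified protein abundance index (emPAI) underlying the P- and D-scores can be computed as follows (Python); the peptide counts are illustrative, and the scores themselves are the authors' parameters and are not reproduced here.

def empai(observed_peptides, observable_peptides):
    # emPAI = 10^(N_observed / N_observable) - 1 (Ishihama et al.)
    return 10 ** (observed_peptides / observable_peptides) - 1

# relative molar fraction of candidate amyloid proteins among detected proteins
proteins = {"SAA1": (12, 20), "ALB": (30, 60), "APOE": (5, 18)}   # (observed, observable) peptides
empai_values = {name: empai(o, t) for name, (o, t) in proteins.items()}
total = sum(empai_values.values())
for name, value in empai_values.items():
    print(f"{name}: emPAI = {value:.2f}, molar fraction = {value / total:.2f}")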
NASA Astrophysics Data System (ADS)
Morschheuser, Lena; Wessels, Hauke; Pille, Christina; Fischer, Judith; Hünniger, Tim; Fischer, Markus; Paschke-Kratzin, Angelika; Rohn, Sascha
2016-05-01
Protein analysis using high-performance thin-layer chromatography (HPTLC) is not common but can complement traditional electrophoretic and mass spectrometric approaches in a unique way. Due to various detection protocols and possibilities for hyphenation, HPTLC protein analysis is a promising alternative for investigating, e.g., posttranslational modifications. As an example, this study focused on the investigation of lysozyme, an enzyme occurring in eggs and technologically added to foods and beverages such as wine. The detection of lysozyme is mandatory, as it might trigger allergenic reactions in sensitive individuals. To underline the advantages of HPTLC in protein analysis, the development of innovative, highly specific staining protocols leads to improved sensitivity for protein detection on HPTLC plates in comparison to universal protein derivatization reagents. This study aimed at developing a detection methodology for HPTLC-separated proteins using aptamers. Due to their affinity and specificity towards a wide range of targets, an aptamer-based staining procedure on HPTLC (HPTLC-aptastaining) will enable manifold analytical possibilities. Besides providing the first proof of its applicability, we show that (i) aptamer-based staining of proteins works on different stationary phase materials and (ii) it can be used as an approach for a semi-quantitative estimation of protein concentrations.
Dolegowska, B; Ostapowicz, A; Stanczyk-Dunaj, M; Blogowski, W
2012-08-01
5-Fluorouracil (5-FU) is one of the most commonly used chemotherapeutics in the treatment of malignancies originating from breast, prostate, ovarian, skin and gastrointestinal tissues. Around 80% of the administered dose of 5-FU is catabolized by dihydropyrimidine dehydrogenase (DPD). Patients with a deficiency or insufficient activity of this enzyme are at great risk of developing severe, even lethal, 5-FU toxicity. According to recent studies, over 30 mutations of the DPYD gene associated with DPD deficiency/insufficiency have already been discovered. Several analytical methods are currently used to measure DPD activity. In this paper we report a novel, simple, economical and more accessible spectrophotometric method for measuring DPD activity in peripheral blood mononuclear cells (PBMCs), developed and validated on 200 generally healthy volunteers aged 22-63. We present two spectrophotometric protocols in this study and used a previously described reverse-phase high-performance liquid chromatography (RP-HPLC) analysis as the reference method. Based on our findings, we conclude that spectrophotometric methods may be used as a screening protocol preceding 5-FU-based chemotherapy. Nevertheless, before introduction into clinical practice, our results should be confirmed in further, larger studies.
Impact of the Injection Protocol on an Impurity's Stationary State
NASA Astrophysics Data System (ADS)
Gamayun, Oleksandr; Lychkovskiy, Oleg; Burovski, Evgeni; Malcomson, Matthew; Cheianov, Vadim V.; Zvonarev, Mikhail B.
2018-06-01
We examine stationary-state properties of an impurity particle injected into a one-dimensional quantum gas. We show that the value of the impurity's end velocity lies between zero and the speed of sound in the gas and is determined by the injection protocol. This way, the impurity's constant motion is a dynamically emergent phenomenon whose description goes beyond accounting for the kinematic constraints of the Landau approach to superfluidity. We provide exact analytic results in the thermodynamic limit and perform finite-size numerical simulations to demonstrate that the predicted phenomena are within the reach of the ultracold gas experiments.
Health-Enabled Smart Sensor Fusion Technology
NASA Technical Reports Server (NTRS)
Wang, Ray
2012-01-01
A process was designed to fuse data from multiple sensors in order to make a more accurate estimation of the environment and overall health in an intelligent rocket test facility (IRTF), and to provide reliable, high-confidence measurements for a variety of propulsion test articles. The objective of the technology is to provide sensor fusion based on a distributed architecture. Specifically, the fusion technology is intended to provide health-condition monitoring capability at the intelligent transceiver, such as RF signal strength, battery reading, computing resource monitoring, and sensor data reading. The technology also provides analytic and diagnostic intelligence at the intelligent transceiver, enhancing the IEEE 1451.x-based standard for sensor data management and distribution, and supplying appropriate communications protocols to enable complex interactions and support a timely, high-quality flow of information among the system elements.
77 FR 15722 - Southern California Hook and Line Survey; Public Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-16
... meeting to evaluate the Southern California Shelf Rockfish Hook and Line Survey which was designed to... and Line survey design and protocols; (2) examine the analytical methods used to generate rockfish... California Hook and Line Survey; Public Meeting AGENCY: National Marine Fisheries Service (NMFS), National...
VALIDATION OF STANDARD ANALYTICAL PROTOCOL FOR SEMI-VOLATILE ORGANIC COMPOUNDS
There is a growing concern with the potential for terrorist use of chemical weapons to cause civilian harm. In the event of an actual or suspected outdoor release of chemically hazardous material in a large area, the extent of contamination must be determined. This requires a s...
An Evaluative Methodology for Virtual Communities Using Web Analytics
ERIC Educational Resources Information Center
Phippen, A. D.
2004-01-01
The evaluation of virtual community usage and user behaviour has its roots in social science approaches such as interview, document analysis and survey. Little evaluation is carried out using traffic or protocol analysis. Business approaches to evaluating customer/business web site usage are more advanced, in particular using advanced web…
A Protocol-Analytic Study of Metacognition in Mathematical Problem Solving.
ERIC Educational Resources Information Center
Cai, Jinfa
1994-01-01
Metacognitive behaviors of subjects having high (n=2) and low (n=2) levels of mathematical experience were compared across four cognitive processes in mathematical problem solving: orientation, organization, execution, and verification. High-experience subjects engaged in self-regulation and spent more time on orientation and organization. (36…
The purpose of this presentation is to teach a course on analytical techniques, quality assurance, environmental research protocols, and basic soil environmental chemistry at the Environmental Health Center and Babes Bolyai University in Cluj, Romania. FOR FURTHER INFORMATI...
A Trio of Human Molecular Genetics PCR Assays
ERIC Educational Resources Information Center
Reinking, Jeffrey L.; Waldo, Jennifer T.; Dinsmore, Jannett
2013-01-01
This laboratory exercise demonstrates three different analytical forms of the polymerase chain reaction (PCR) that allow students to genotype themselves at four different loci. Here, we present protocols to allow students to a) genotype a non-coding polymorphic Variable Number of Tandem Repeat (VNTR) locus on human chromosome 5 using conventional…
Where Young People See Science: Everyday Activities Connected to Science
ERIC Educational Resources Information Center
Zimmerman, Heather Toomey; Bell, Philip
2014-01-01
This project analyses the prevalence and social construction of science in the everyday activities of multicultural, multilingual children in one urban community. Using cross-setting ethnographic fieldwork (i.e. home, museum, school, community), we developed an ecologically grounded interview protocol and analytical scheme for gauging students'…
A Very Low Power MAC (VLPM) Protocol for Wireless Body Area Networks
Ullah, Niamat; Khan, Pervez; Kwak, Kyung Sup
2011-01-01
Wireless Body Area Networks (WBANs) consist of a limited number of battery-operated nodes that are used to monitor the vital signs of a patient over long periods of time without restricting the patient’s movements. They are an easy and fast way to diagnose the patient’s status and to consult the doctor. Device as well as network lifetime are among the most important factors in a WBAN. Prolonging the lifetime of the WBAN strongly depends on controlling the energy consumption of sensor nodes. To achieve energy efficiency, low duty cycle MAC protocols are used, but for medical applications, especially in the case of pacemakers where data have time-limited relevance, these protocols increase latency, which is highly undesirable and leads to system instability. In this paper, we propose a low power MAC protocol (VLPM) based on existing wake-up radio approaches, which reduces energy consumption as well as improving the response time of a node. We categorize the traffic into uplink and downlink traffic. The nodes are equipped with both a low power wake-up transmitter and receiver. The low power wake-up receiver monitors the activity on the channel continuously at very low power and keeps the MCU (Micro Controller Unit) and the main radio in sleep mode. When a node [BN or BNC (BAN Coordinator)] wants to communicate with another node, it uses the low-power radio to send a wakeup packet, which will prompt the receiver to power up its primary radio to listen for the message that follows shortly. The wake-up packet contains the desired node’s ID along with some other information to let the targeted node wake up and take part in communication while all other nodes quickly return to sleep mode. The VLPM protocol is proposed for applications with low-traffic conditions. For high traffic rates, optimization is needed. Analytical results show that the proposed protocol outperforms both synchronized and unsynchronized MAC protocols like T-MAC, SCP-MAC, B-MAC and X-MAC in terms of energy consumption and response time. PMID:22163818
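The energy argument behind wake-up-radio schemes such as VLPM can be illustrated with a crude model: an always-on, microwatt wake-up receiver replaces periodic listening by the milliwatt main radio. The sketch below uses hypothetical power figures and a deliberately simplified model; it is not the analytical model from the paper.

```python
# Back-of-the-envelope energy model contrasting a duty-cycled MAC with a
# wake-up-radio scheme. All power figures and the model itself are illustrative
# assumptions, not values from the paper.

T = 3600.0            # observation window [s]
events = 10           # data exchanges triggered in the window

# Hypothetical radio characteristics
P_main_rx  = 50e-3    # main radio receive power [W]
P_wur      = 50e-6    # always-on wake-up receiver power [W]
E_exchange = 2e-3     # energy per data exchange (wake-up + TX/RX) [J]

def duty_cycled_energy(duty_cycle: float) -> float:
    # Main radio listens for a fraction of the time plus handles the traffic.
    return duty_cycle * P_main_rx * T + events * E_exchange

def wakeup_radio_energy() -> float:
    # Wake-up receiver listens continuously; main radio only wakes on demand.
    return P_wur * T + events * E_exchange

for dc in (0.01, 0.05, 0.10):
    print(f"duty cycle {dc:4.0%}: {duty_cycled_energy(dc) * 1e3:7.1f} mJ")
print(f"wake-up radio    : {wakeup_radio_energy() * 1e3:7.1f} mJ")
```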
Breast dosimetry in clinical mammography
NASA Astrophysics Data System (ADS)
Benevides, Luis Alberto Do Rego
The objective of this study was to show that a clinical dosimetry protocol that utilizes a dosimetric breast phantom series based on population anthropometric measurements can reliably predict the average glandular dose (AGD) imparted to the patient during a routine screening mammogram. In the study, AGD was calculated using entrance skin exposure and dose conversion factors based on fibroglandular content, compressed breast thickness, mammography unit parameters and modifying parameters for homogeneous phantom (phantom factor), compressed breast lateral dimensions (volume factor) and anatomical features (anatomical factor). The protocol proposes the use of a fiber-optic coupled (FOCD) or Metal Oxide Semiconductor Field Effect Transistor (MOSFET) dosimeter to measure the entrance skin exposure at the time of the mammogram without interfering with the diagnostic information of the mammogram. The study showed that the FOCD sensitivity had less than 7% energy dependence, was linear across all tube current-time product stations, and was reproducible within 2%. The FOCD was superior to the MOSFET dosimeter in sensitivity, reusability, and reproducibility. The patient fibroglandular content was evaluated using a calibrated modified breast tissue equivalent homogeneous phantom series (BRTES-MOD) designed from anthropomorphic measurements of a screening mammography population and whose elemental composition was referenced to International Commission on Radiation Units and Measurements Report 44 tissues. The patient fibroglandular content and compressed breast thickness, along with unit parameters and spectrum half-value layer, were used to derive the currently used dose conversion factor (DgN). The study showed that the use of a homogeneous phantom, patient compressed breast lateral dimensions and patient anatomical features can affect AGD by as much as 12%, 3% and 1%, respectively. The protocol was found to be superior to existing methodologies. In addition, the study population anthropometric measurements enabled the development of analytical equations to calculate the whole breast area, estimate the skin layer thickness, and determine the optimal location for the automatic exposure control ionization chamber. The clinical dosimetry protocol developed in this study can reliably predict the AGD imparted to an individual patient during a routine screening mammogram.
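The way the described correction factors would enter an AGD estimate can be sketched as a simple product of the entrance skin exposure, a conversion factor, and the three modifying factors; the multiplicative form and every numerical value below are illustrative placeholders rather than the study's fitted results.

```python
# Illustrative AGD calculation in the spirit of the protocol described above:
# entrance skin exposure times a conversion factor, modified by phantom, volume
# and anatomical factors. Every number below is a hypothetical placeholder.

entrance_skin_exposure_mGy = 8.0   # measured air kerma at skin entrance (FOCD/MOSFET)
DgN_mGy_per_mGy            = 0.20  # conversion factor for the given fibroglandular
                                   # content, compressed thickness, kVp/HVL
phantom_factor             = 1.05  # correction for the homogeneous-phantom assumption
volume_factor              = 0.98  # correction for compressed breast lateral dimensions
anatomical_factor          = 1.01  # correction for anatomical features

agd_mGy = (entrance_skin_exposure_mGy * DgN_mGy_per_mGy
           * phantom_factor * volume_factor * anatomical_factor)
print(f"Estimated AGD: {agd_mGy:.2f} mGy")
```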
Jordan, Joanne; Rose, Louise; Dainty, Katie N; Noyes, Jane; Blackwood, Bronagh
2016-10-04
Prolonged mechanical ventilation is associated with a longer intensive care unit (ICU) length of stay and higher mortality. Consequently, methods to improve ventilator weaning processes have been sought. Two recent Cochrane systematic reviews in ICU adult and paediatric populations concluded that protocols can be effective in reducing the duration of mechanical ventilation, but there was significant heterogeneity in study findings. Growing awareness of the benefits of understanding the contextual factors impacting on effectiveness has encouraged the integration of qualitative evidence syntheses with effectiveness reviews, which has delivered important insights into the reasons underpinning (differential) effectiveness of healthcare interventions. 1. To locate, appraise and synthesize qualitative evidence concerning the barriers and facilitators of the use of protocols for weaning critically-ill adults and children from mechanical ventilation;2. To integrate this synthesis with two Cochrane effectiveness reviews of protocolized weaning to help explain observed heterogeneity by identifying contextual factors that impact on the use of protocols for weaning critically-ill adults and children from mechanical ventilation;3. To use the integrated body of evidence to suggest the circumstances in which weaning protocols are most likely to be used. We used a range of search terms identified with the help of the SPICE (Setting, Perspective, Intervention, Comparison, Evaluation) mnemonic. Where available, we used appropriate methodological filters for specific databases. We searched the following databases: Ovid MEDLINE, Embase, OVID, PsycINFO, CINAHL Plus, EBSCOHost, Web of Science Core Collection, ASSIA, IBSS, Sociological Abstracts, ProQuest and LILACS on the 26th February 2015. In addition, we searched: the grey literature; the websites of professional associations for relevant publications; and the reference lists of all publications reviewed. We also contacted authors of the trials included in the effectiveness reviews as well as of studies (potentially) included in the qualitative synthesis, conducted citation searches of the publications reporting these studies, and contacted content experts.We reran the search on 3rd July 2016 and found three studies, which are awaiting classification. We included qualitative studies that described: the circumstances in which protocols are designed, implemented or used, or both, and the views and experiences of healthcare professionals either involved in the design, implementation or use of weaning protocols or involved in the weaning of critically-ill adults and children from mechanical ventilation not using protocols. We included studies that: reflected on any aspect of the use of protocols, explored contextual factors relevant to the development, implementation or use of weaning protocols, and reported contextual phenomena and outcomes identified as relevant to the effectiveness of protocolized weaning from mechanical ventilation. At each stage, two review authors undertook designated tasks, with the results shared amongst the wider team for discussion and final development. We independently reviewed all retrieved titles, abstracts and full papers for inclusion, and independently extracted selected data from included studies. We used the findings of the included studies to develop a new set of analytic themes focused on the barriers and facilitators to the use of protocols, and further refined them to produce a set of summary statements. 
We used the Confidence in the Evidence from Reviews of Qualitative Research (CERQual) framework to arrive at a final assessment of the overall confidence of the evidence used in the synthesis. We included all studies but undertook two sensitivity analyses to determine how the removal of certain bodies of evidence impacted on the content and confidence of the synthesis. We deployed a logic model to integrate the findings of the qualitative evidence synthesis with those of the Cochrane effectiveness reviews. We included 11 studies in our synthesis, involving 267 participants (one study did not report the number of participants). Five more studies are awaiting classification and will be dealt with when we update the review.The quality of the evidence was mixed; of the 35 summary statements, we assessed 17 as 'low', 13 as 'moderate' and five as 'high' confidence. Our synthesis produced nine analytical themes, which report potential barriers and facilitators to the use of protocols. The themes are: the need for continual staff training and development; clinical experience as this promotes felt and perceived competence and confidence to wean; the vulnerability of weaning to disparate interprofessional working; an understanding of protocols as militating against a necessary proactivity in clinical practice; perceived nursing scope of practice and professional risk; ICU structure and processes of care; the ability of protocols to act as a prompt for shared care and consistency in weaning practice; maximizing the use of protocols through visibility and ease of implementation; and the ability of protocols to act as a framework for communication with parents. There is a clear need for weaning protocols to take account of the social and cultural environment in which they are to be implemented. Irrespective of its inherent strengths, a protocol will not be used if it does not accommodate these complexities. In terms of protocol development, comprehensive interprofessional input will help to ensure broad-based understanding and a sense of 'ownership'. In terms of implementation, all relevant ICU staff will benefit from general weaning as well as protocol-specific training; not only will this help secure a relevant clinical knowledge base and operational understanding, but will also demonstrate to others that this knowledge and understanding is in place. In order to maximize relevance and acceptability, protocols should be designed with the patient profile and requirements of the target ICU in mind. Predictably, an under-resourced ICU will impact adversely on protocol implementation, as staff will prioritize management of acutely deteriorating and critically-ill patients.
Lo, Andy; Tang, Yanan; Chen, Lu; Li, Liang
2013-07-25
Isotope labeling liquid chromatography-mass spectrometry (LC-MS) is a major analytical platform for quantitative proteome analysis. Incorporation of isotopes used to distinguish samples plays a critical role in the success of this strategy. In this work, we optimized and automated a chemical derivatization protocol (dimethylation after guanidination, 2MEGA) to increase the labeling reproducibility and reduce human intervention. We also evaluated the reagent compatibility of this protocol to handle biological samples in different types of buffers and surfactants. A commercially available liquid handler was used for reagent dispensation to minimize analyst intervention and at least twenty protein digest samples could be prepared in a single run. Different front-end sample preparation methods for protein solubilization (SDS, urea, Rapigest™, and ProteaseMAX™) and two commercially available cell lysis buffers were evaluated for compatibility with the automated protocol. It was found that better than 94% desired labeling could be obtained in all conditions studied except urea, where the rate was reduced to about 92% due to carbamylation on the peptide amines. This work illustrates the automated 2MEGA labeling process can be used to handle a wide range of protein samples containing various reagents that are often encountered in protein sample preparation for quantitative proteome analysis. Copyright © 2013 Elsevier B.V. All rights reserved.
Quantification of trace elements and speciation of iron in atmospheric particulate matter
NASA Astrophysics Data System (ADS)
Upadhyay, Nabin
Trace metal species play important roles in atmospheric redox processes and in the generation of oxidants in cloud systems. The chemical impact of these elements on atmospheric and cloud chemistry is dependent on their occurrence, solubility and speciation. First, analytical protocols have been developed to determine trace elements in particulate matter samples collected for carbonaceous analysis. The validated novel protocols were applied to the determination of trace elements in particulate samples collected in the remote marine atmosphere and in urban areas in Arizona to study air pollution issues. The second part of this work investigates solubility and speciation in environmental samples. A detailed study of the impact of the nature and strength of buffer solutions on the solubility and speciation of iron led to a robust protocol, allowing for comparative measurements in matrices representative of cloud water conditions. Application of this protocol to samples from different environments showed low iron solubility (less than 1%) in dust-impacted events and higher solubility (5%) in anthropogenically impacted urban samples. In most cases, Fe(II) was the dominant oxidation state in the soluble fraction of iron. The analytical protocol was then applied to investigate iron processing by fogs. Field observations showed that only a small fraction (1%) of iron was scavenged by fog droplets, for which the soluble and insoluble fractions were similar. A coarse time resolution limited detailed insights into redox cycling within the fog system. Overall, results suggested that the major iron species in the droplets was Fe(II) (80% of soluble iron). Finally, the occurrence and sources of emerging organic pollutants in the urban atmosphere were investigated. Synthetic musk species are ubiquitous in the urban environment (less than 5 ng m^-3), and investigations at wastewater treatment plants showed that wastewater aeration basins emit a substantial amount of these species to the atmosphere.
Exact results for the Floquet coin toss for driven integrable models
NASA Astrophysics Data System (ADS)
Bhattacharya, Utso; Maity, Somnath; Banik, Uddipan; Dutta, Amit
2018-05-01
We study an integrable Hamiltonian reducible to free fermions, which is subjected to an imperfect periodic driving with the amplitude of driving (or kicking), randomly chosen from a binary distribution like a coin-toss problem. The randomness present in the driving protocol destabilizes the periodic steady state reached in the limit of perfectly periodic driving, leading to a monotonic rise of the stroboscopic residual energy with the number of periods (N ) for such Hamiltonians. We establish that a minimal deviation from the perfectly periodic driving in the present case using such protocols would always result in a bounded heating up of the system with N to an asymptotic finite value. Exploiting the completely uncorrelated nature of the randomness and the knowledge of the stroboscopic Floquet operator in the perfectly periodic situation, we provide an exact analytical formalism to derive the disorder averaged expectation value of the residual energy through a disorder operator. This formalism not only leads to an immense numerical simplification, but also enables us to derive an exact analytical form for the residual energy in the asymptotic limit which is universal, i.e., independent of the bias of coin-toss and the protocol chosen. Furthermore, this formalism clearly establishes the nature of the monotonic growth of the residual energy at intermediate N while clearly revealing the possible nonuniversal behavior of the same.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cutler, Dylan; Frank, Stephen; Slovensky, Michelle
Rich, well-organized building performance and energy consumption data enable a host of analytic capabilities for building owners and operators, from basic energy benchmarking to detailed fault detection and system optimization. Unfortunately, data integration for building control systems is challenging and costly in any setting. Large portfolios of buildings--campuses, cities, and corporate portfolios--experience these integration challenges most acutely. These large portfolios often have a wide array of control systems, including multiple vendors and nonstandard communication protocols. They typically have complex information technology (IT) networks and cybersecurity requirements and may integrate distributed energy resources into their infrastructure. Although the challenges are significant, the integration of control system data has the potential to provide proportionally greater value for these organizations through portfolio-scale analytics, comprehensive demand management, and asset performance visibility. As a large research campus, the National Renewable Energy Laboratory (NREL) experiences significant data integration challenges. To meet them, NREL has developed an architecture for effective data collection, integration, and analysis, providing a comprehensive view of data integration based on functional layers. The architecture is being evaluated on the NREL campus through deployment of three pilot implementations.
DNA barcode-based delineation of putative species: efficient start for taxonomic workflows
Kekkonen, Mari; Hebert, Paul D N
2014-01-01
The analysis of DNA barcode sequences with varying techniques for cluster recognition provides an efficient approach for recognizing putative species (operational taxonomic units, OTUs). This approach accelerates and improves taxonomic workflows by exposing cryptic species and decreasing the risk of synonymy. This study tested the congruence of OTUs resulting from the application of three analytical methods (ABGD, BIN, GMYC) to sequence data for Australian hypertrophine moths. OTUs supported by all three approaches were viewed as robust, but 20% of the OTUs were only recognized by one or two of the methods. These OTUs were examined for three criteria to clarify their status. Monophyly and diagnostic nucleotides were both uninformative, but information on ranges was useful as sympatric sister OTUs were viewed as distinct, while allopatric OTUs were merged. This approach revealed 124 OTUs of Hypertrophinae, a more than twofold increase from the currently recognized 51 species. Because this analytical protocol is both fast and repeatable, it provides a valuable tool for establishing a basic understanding of species boundaries that can be validated with subsequent studies. PMID:24479435
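A toy version of the distance-threshold clustering idea shared by methods such as ABGD can be sketched as single-linkage grouping of sequences whose pairwise p-distance falls below a cutoff; the sequences and the 2% threshold below are invented for illustration and do not reproduce any of the three methods exactly.

```python
# Toy sketch of distance-threshold OTU delineation: single-linkage grouping of
# barcode sequences whose pairwise p-distance is below a cutoff. Sequences and
# the 2% threshold are illustrative only.
from itertools import combinations

seqs = {
    "specimen1": "ACGTACGTAG" * 5,
    "specimen2": "ACGTACGTAG" * 4 + "ACGTACGTAA",   # one substitution vs specimen1
    "specimen3": "TTGTACGAAG" * 5,                   # many substitutions
}

def p_distance(a: str, b: str) -> float:
    return sum(x != y for x, y in zip(a, b)) / len(a)

THRESHOLD = 0.02  # 2% divergence

# Union-find over specimens: merge any pair below the threshold (single linkage)
parent = {name: name for name in seqs}
def find(x):
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

for a, b in combinations(seqs, 2):
    if p_distance(seqs[a], seqs[b]) <= THRESHOLD:
        parent[find(a)] = find(b)

otus = {}
for name in seqs:
    otus.setdefault(find(name), []).append(name)
print(f"{len(otus)} putative OTUs:", list(otus.values()))
```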
Mensah, Mavis; Borzi, Cristina; Verri, Carla; Suatoni, Paola; Conte, Davide; Pastorino, Ugo; Orazio, Fortunato; Sozzi, Gabriella; Boeri, Mattia
2017-10-26
The development of a minimally invasive test, such as liquid biopsy, for early lung cancer detection in its preclinical phase is crucial to improve the outcome of this deadly disease. MicroRNAs (miRNAs) are tissue specific, small, non-coding RNAs regulating gene expression, which may act as extracellular messengers of biological signals derived from the cross-talk between the tumor and its surrounding microenvironment. They could thus represent ideal candidates for early detection of lung cancer. In this work, a methodological workflow for the prospective validation of a circulating miRNA test using custom made microfluidic cards and quantitative Real-Time PCR in plasma samples of volunteers enrolled in a lung cancer screening trial is proposed. In addition, since the release of hemolysis-related miRNAs and more general technical issues may affect the analysis, the quality control steps included in the standard operating procedures are also presented. The protocol is reproducible and gives reliable quantitative results; however, when using large clinical series, both pre-analytical and analytical features should be cautiously evaluated.
Detection of Salmonella enterica Serovar Typhimurium by Using a Rapid, Array-Based Immunosensor
Taitt, Chris Rowe; Shubin, Yura S.; Angel, Roselina; Ligler, Frances S.
2004-01-01
The multianalyte array biosensor (MAAB) is a rapid analysis instrument capable of detecting multiple analytes simultaneously. Rapid (15-min), single-analyte sandwich immunoassays were developed for the detection of Salmonella enterica serovar Typhimurium, with a detection limit of 8 × 10^4 CFU/ml; the limit of detection was improved 10-fold by lengthening the assay protocol to 1 h. S. enterica serovar Typhimurium was also detected in the following spiked foodstuffs, with minimal sample preparation: sausage, cantaloupe, whole liquid egg, alfalfa sprouts, and chicken carcass rinse. Cross-reactivity tests were performed with Escherichia coli and Campylobacter jejuni. To determine whether the MAAB has potential as a screening tool for the diagnosis of asymptomatic Salmonella infection of poultry, chicken excretal samples from a private, noncommercial farm and from university poultry facilities were tested. While the private farm excreta gave rise to signals significantly above the buffer blanks, none of the university samples tested positive for S. enterica serovar Typhimurium without spiking; dose-response curves of spiked excretal samples from university-raised poultry gave limits of detection of 8 × 10^3 CFU/g. PMID:14711637
Llorente Ballesteros, M T; Navarro Serrano, I; López Colón, J L
2015-01-01
The aim of this report is to propose a scheme for validation of an analytical technique according to ISO 17025. According to ISO 17025, the fundamental parameters tested were: selectivity, calibration model, precision, accuracy, uncertainty of measurement, and analytical interference. A protocol has been developed that has been applied successfully to quantify zinc in serum by atomic absorption spectrometry. It is demonstrated that our method is selective, linear, accurate, and precise, making it suitable for use in routine diagnostics. Copyright © 2015 SECA. Published by Elsevier Espana. All rights reserved.
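For illustration, the sketch below computes three of the validation figures listed above (calibration linearity, precision as CV, and accuracy as spike recovery) from made-up replicate data for zinc in serum; it is an assumption-laden toy example, not the authors' validated procedure.

```python
# Hedged sketch of ISO 17025-style validation figures (linearity, precision,
# accuracy/recovery) computed from invented data for zinc in serum.
import numpy as np

# Hypothetical calibration: absorbance vs zinc concentration (umol/L)
conc = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
absorbance = np.array([0.002, 0.101, 0.198, 0.305, 0.399])
slope, intercept = np.polyfit(conc, absorbance, 1)
r2 = np.corrcoef(conc, absorbance)[0, 1] ** 2
print(f"calibration: slope={slope:.4f}, intercept={intercept:.4f}, R^2={r2:.4f}")

# Precision: repeatability of a control serum (coefficient of variation)
control_runs = np.array([12.1, 11.8, 12.3, 12.0, 11.9, 12.2])  # umol/L
cv_percent = 100 * control_runs.std(ddof=1) / control_runs.mean()
print(f"precision: CV = {cv_percent:.1f} %")

# Accuracy: recovery of a spiked sample (nominal spike 10 umol/L)
baseline, spiked, nominal_spike = 11.9, 21.6, 10.0
recovery = 100 * (spiked - baseline) / nominal_spike
print(f"accuracy: spike recovery = {recovery:.0f} %")
```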
Münker, Carsten; Strub, Erik
2017-01-01
The 138La–138Ce decay system (half-life 1.02 × 10^11 years) is a potentially highly useful tool to unravel information about the timing of geological processes and about the interaction of geological reservoirs on earth, complementing information from the more popular 147Sm–143Nd and 176Lu–176Hf isotope systems. Previously published analytical protocols were limited to TIMS. Here we present for the first time an analytical protocol that employs MC-ICPMS, with an improved precision and sensitivity. To perform sufficiently accurate La–Ce measurements, an efficient ion-chromatographic procedure is required to separate Ce from the other rare earth elements (REE) and Ba quantitatively. This study presents an improved ion-chromatographic procedure that separates La and Ce from rock samples using a three-step column separation. After REE separation by cation exchange, Ce is separated employing an Ln Spec column and selective oxidation. In the last step, a cation clean-up chemistry is performed to remove all remaining interferences. Our MC-ICPMS measurement protocol includes all stable Ce isotopes (136Ce, 138Ce, 140Ce and 142Ce), by employing a 10^10 ohm amplifier for the most abundant isotope 140Ce. An external reproducibility of ±0.25 ε-units (2 r.s.d.) has been routinely achieved for 138Ce measurements for as little as 150–600 ng Ce, depending on the sample–skimmer cone combinations being used. Because the traditionally used JMC-304 Ce reference material is not commercially available anymore, a new reference material was prepared from AMES laboratory Ce metal (Cologne-AMES). In order to compare the new material with the previously reported isotopic composition of AMES material prepared at Mainz (Mainz-AMES), Cologne-AMES and JMC-304 were measured relative to each other in the same analytical session, demonstrating isotope heterogeneity between the two AMES and different JMC-304 batches used in the literature. To enable sufficiently precise age correction of radiogenic 138Ce and to perform isochron dating, a protocol was developed where La and Ce concentrations are determined by isotope dilution (ID), using an isotope tracer enriched in 138La and 142Ce. The new protocols were applied to determine the variations of Ce isotope compositions and La–Ce concentrations of certified geochemical reference materials (CRMs): BCR-2, BCR-1, BHVO-2, JR-1, JA-2, JB-3, JG-1, JR-1, JB-1b, AGV-1 and one in-house La Palma standard. PMID:29456283
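The isotope-dilution step can be summarized by the standard two-ratio mass-balance relation sketched below in generic notation (reference isotope B, spike-enriched isotope A); this is the textbook form of the calculation, not necessarily the authors' exact formulation.

```latex
% Standard isotope-dilution relation (generic notation, not the authors' exact
% formulation). For a reference isotope B and a spike-enriched isotope A, with
% R = A/B measured in the sample (s), the tracer (t), and the sample-tracer
% mixture (m), mass balance gives
\[
  R_m \;=\; \frac{A_s + A_t}{B_s + B_t}
        \;=\; \frac{R_s B_s + R_t B_t}{B_s + B_t}
  \qquad\Longrightarrow\qquad
  \frac{B_s}{B_t} \;=\; \frac{R_t - R_m}{R_m - R_s},
\]
% from which the Ce (or La) concentration of the sample follows from the known
% amount of tracer and the isotopic abundances of B in sample and tracer.
```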
Guo, Xiangyu; Bai, Hua; Lv, Yueguang; Xi, Guangcheng; Li, Junfang; Ma, Xiaoxiao; Ren, Yue; Ouyang, Zheng; Ma, Qiang
2018-04-01
Rapid, on-site analysis was achieved through significantly simplified operation procedures for a wide variety of toy samples (crayon, temporary tattoo sticker, finger paint, modeling clay, and bubble solution) using a miniature mass spectrometry system with ambient ionization capability. The labor-intensive analytical protocols involving sample workup and chemical separation, traditionally required for MS-based analysis, were replaced by direct sampling analysis using ambient ionization methods. A Mini β ion trap miniature mass spectrometer was coupled with versatile ambient ionization methods, e.g. paper spray, extraction spray and slug-flow microextraction nanoESI, for direct identification of prohibited colorants, carcinogenic primary aromatic amines, allergenic fragrances, preservatives and plasticizers from raw toy samples. The use of paper substrates coated with Co3O4 nanoparticles allowed a great increase in sensitivity for paper spray. Limits of detection as low as 5 μg kg^-1 were obtained for target analytes. The methods being developed based on the integration of ambient ionization with a miniature mass spectrometer represent alternatives to current in-lab MS analysis operation, and would enable fast, outside-the-lab screening of toy products to ensure children's safety and health. Copyright © 2017 Elsevier B.V. All rights reserved.
Direct Analysis in Real Time Mass Spectrometry for Characterization of Large Saccharides.
Ma, Huiying; Jiang, Qing; Dai, Diya; Li, Hongli; Bi, Wentao; Da Yong Chen, David
2018-03-06
Polysaccharide characterization poses the most difficult challenge to available analytical technologies compared to other types of biomolecules. Plant polysaccharides are reported to have numerous medicinal values, but their effects can differ based on the type of plant, and even the region of production and conditions of cultivation. However, the molecular basis of the differences among these polysaccharides is largely unknown. In this study, direct analysis in real time mass spectrometry (DART-MS) was used to generate polysaccharide fingerprints. Large saccharides can break down into characteristic small fragments in the DART source via pyrolysis, and the products are then detected by high resolution MS. Temperature was shown to be a crucial parameter for the decomposition of large polysaccharides. The general behavior of carbohydrates in DART-MS was also studied through the investigation of a number of mono- and oligosaccharide standards. The chemical formulas and putative ionic forms of the fragments were proposed based on accurate mass with less than 10 ppm mass errors. Multivariate data analysis shows clear differentiation of different plant species. Intensities of marker ions compared among samples also showed obvious differences. The combination of DART-MS analysis and the mechanochemical extraction method used in this work demonstrates a simple, fast, and high-throughput analytical protocol for the efficient evaluation of molecular features in plant polysaccharides.
VALIDATION OF STANDARD ANALYTICAL PROTOCOL FOR ...
There is a growing concern with the potential for terrorist use of chemical weapons to cause civilian harm. In the event of an actual or suspected outdoor release of chemically hazardous material in a large area, the extent of contamination must be determined. This requires a system with the ability to prepare and quickly analyze a large number of contaminated samples for the traditional chemical agents, as well as numerous toxic industrial chemicals. Liquid samples (both aqueous and organic), solid samples (e.g., soil), vapor samples (e.g., air) and mixed state samples, all ranging from household items to deceased animals, may require some level of analyses. To meet this challenge, the U.S. Environmental Protection Agency (U.S. EPA) National Homeland Security Research Center, in collaboration with experts from across U.S. EPA and other Federal Agencies, initiated an effort to identify analytical methods for the chemical and biological agents that could be used to respond to a terrorist attack or a homeland security incident. U.S. EPA began development of standard analytical protocols (SAPs) for laboratory identification and measurement of target agents in case of a contamination threat. These methods will be used to help assist in the identification of existing contamination, the effectiveness of decontamination, as well as clearance for the affected population to reoccupy previously contaminated areas. One of the first SAPs developed was for the determin
Reddy, Alavalapati Goutham; Das, Ashok Kumar; Odelu, Vanga; Yoo, Kee-Young
2016-01-01
Biometric-based authentication protocols for multi-server architectures have gained momentum in recent times due to advancements in wireless technologies and associated constraints. Lu et al. recently proposed a robust biometric-based authentication with key agreement protocol for a multi-server environment using smart cards. They claimed that their protocol is efficient and resistant to prominent security attacks. Careful investigation in this paper proves that Lu et al.'s protocol does not provide user anonymity or perfect forward secrecy and is susceptible to server and user impersonation attacks, man-in-the-middle attacks and clock synchronization problems. In addition, this paper proposes an enhanced biometric-based authentication with key-agreement protocol for multi-server architectures based on elliptic curve cryptography using smartcards. We prove that the proposed protocol achieves mutual authentication using Burrows-Abadi-Needham (BAN) logic. The formal security of the proposed protocol is verified using the AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to show that our protocol can withstand active and passive attacks. The formal and informal security analyses and the performance analysis demonstrate that the proposed protocol is robust and efficient compared to Lu et al.'s protocol and existing similar protocols. PMID:27163786
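As background to the elliptic-curve building block such protocols rely on, the sketch below shows a generic ephemeral ECDH key agreement followed by HKDF key derivation using the Python cryptography package; it deliberately omits the smartcard, biometric and registration steps and is not the protocol proposed in the paper.

```python
# Generic elliptic-curve Diffie-Hellman key agreement followed by HKDF key
# derivation, using the "cryptography" package. This only illustrates the ECC
# building block that such multi-server protocols rest on; it is NOT the
# authentication protocol proposed in the paper (no smartcard, biometric or
# registration-centre logic is modelled here).
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# User and server each generate an ephemeral EC key pair
user_priv = ec.generate_private_key(ec.SECP256R1())
server_priv = ec.generate_private_key(ec.SECP256R1())

# Each side combines its private key with the peer's public key
user_shared = user_priv.exchange(ec.ECDH(), server_priv.public_key())
server_shared = server_priv.exchange(ec.ECDH(), user_priv.public_key())
assert user_shared == server_shared

# Derive the session key from the shared secret
session_key = HKDF(algorithm=hashes.SHA256(), length=32,
                   salt=None, info=b"session key").derive(user_shared)
print(session_key.hex())
```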
Geochemical and mineralogical data for soils of the conterminous United States
Smith, David B.; Cannon, William F.; Woodruff, Laurel G.; Solano, Federico; Kilburn, James E.; Fey, David L.
2013-01-01
In 2007, the U.S. Geological Survey initiated a low-density (1 site per 1,600 square kilometers, 4,857 sites) geochemical and mineralogical survey of soils of the conterminous United States as part of the North American Soil Geochemical Landscapes Project. Sampling and analytical protocols were developed at a workshop in 2003, and pilot studies were conducted from 2004 to 2007 to test and refine these recommended protocols. The final sampling protocol for the national-scale survey included, at each site, a sample from a depth of 0 to 5 centimeters, a composite of the soil A horizon, and a deeper sample from the soil C horizon or, if the top of the C horizon was at a depth greater than 1 meter, from a depth of approximately 80–100 centimeters. The <2-millimeter fraction of each sample was analyzed for a suite of 45 major and trace elements by methods that yield the total or near-total elemental content. The major mineralogical components in the samples from the soil A and C horizons were determined by a quantitative X-ray diffraction method using Rietveld refinement. Sampling in the conterminous United States was completed in 2010, with chemical and mineralogical analyses completed in May 2013. The resulting dataset provides an estimate of the abundance and spatial distribution of chemical elements and minerals in soils of the conterminous United States and represents a baseline for soil geochemistry and mineralogy against which future changes may be recognized and quantified. This report (1) describes the sampling, sample preparation, and analytical methods used; (2) gives details of the quality control protocols used to monitor the quality of chemical and mineralogical analyses over approximately six years; and (3) makes available the soil geochemical and mineralogical data in downloadable tables.
CT protocol management: simplifying the process by using a master protocol concept
Bour, Robert K.; Rubert, Nicholas; Wendt, Gary; Pozniak, Myron; Ranallo, Frank N.
2015-01-01
This article explains a method for creating CT protocols for a wide range of patient body sizes and clinical indications, using detailed tube current information from a small set of commonly used protocols. Analytical expressions were created relating CT technical acquisition parameters which can be used to create new CT protocols on a given scanner or customize protocols from one scanner to another. Plots of mA as a function of patient size for specific anatomical regions were generated and used to identify the tube output needs for patients as a function of size for a single master protocol. Tube output data were obtained from the DICOM header of clinical images from our PACS and patient size was measured from CT localizer radiographs under IRB approval. This master protocol was then used to create 11 additional master protocols. The 12 master protocols were further combined to create 39 single and multiphase clinical protocols. Radiologist acceptance rate of exams scanned using the clinical protocols was monitored for 12,857 patients to analyze the effectiveness of the presented protocol management methods using a two‐tailed Fisher's exact test. A single routine adult abdominal protocol was used as the master protocol to create 11 additional master abdominal protocols of varying dose and beam energy. Situations in which the maximum tube current would have been exceeded are presented, and the trade‐offs between increasing the effective tube output via 1) decreasing pitch, 2) increasing the scan time, or 3) increasing the kV are discussed. Out of 12 master protocols customized across three different scanners, only one had a statistically significant acceptance rate that differed from the scanner it was customized from. The difference, however, was only 1% and was judged to be negligible. All other master protocols differed in acceptance rate insignificantly between scanners. The methodology described in this paper allows a small set of master protocols to be adapted among different clinical indications on a single scanner and among different CT scanners. PACS number: 87.57.Q PMID:26219005
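The pitch/rotation-time/kV trade-off described above can be illustrated with a simple effective-mAs relation (effective mAs = mA × rotation time / pitch); the tube-current limit, the protocol numbers, and the kV-to-dose exponent in the sketch below are all assumptions for illustration, not values from the study.

```python
# Rough sketch of the trade-offs discussed above when a size-based mA request
# exceeds the scanner's tube-current limit. Effective mAs = mA * rotation time /
# pitch; the kV-to-dose exponent (~2.5) is a commonly quoted approximation and
# is an assumption, not a value from the paper.

MA_LIMIT = 800.0  # hypothetical maximum tube current [mA]

def required_ma(effective_mas: float, rotation_time_s: float, pitch: float) -> float:
    return effective_mas * pitch / rotation_time_s

target_eff_mas = 450.0           # what the size-based master protocol asks for
rot_time, pitch, kv = 0.5, 1.375, 120

ma = required_ma(target_eff_mas, rot_time, pitch)
print(f"requested mA: {ma:.0f} (limit {MA_LIMIT:.0f})")

if ma > MA_LIMIT:
    # Option 1: lower the pitch
    print("pitch 0.984   ->", f"{required_ma(target_eff_mas, rot_time, 0.984):.0f} mA")
    # Option 2: lengthen the rotation time
    print("rotation 1.0 s ->", f"{required_ma(target_eff_mas, 1.0, pitch):.0f} mA")
    # Option 3: raise kV, reducing the effective mAs needed for similar dose
    kv_new = 140
    eff_mas_new = target_eff_mas * (kv / kv_new) ** 2.5   # assumed exponent
    print(f"kV {kv_new}        -> {required_ma(eff_mas_new, rot_time, pitch):.0f} mA")
```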
NASA Astrophysics Data System (ADS)
Duchoslav, Jiri; Kehrer, Matthias; Hinterreiter, Andreas; Duchoslav, Vojtech; Unterweger, Christoph; Fürst, Christian; Steinberger, Roland; Stifter, David
2018-06-01
In the current work, chemical derivatization of amine (NH2) groups with trifluoroacetic anhydride (TFAA) is investigated as an analytical method to improve the information scope of X-ray photoelectron spectroscopy (XPS). TFAA is known to successfully label hydroxyl (OH) groups. With the introduction of a newly developed gas-phase derivatization protocol, conducted at ambient pressure and using a catalyst, NH2 groups can now also be labelled efficiently, with a high yield and without the formation of unwanted by-products. By establishing a comprehensive and self-consistent database of reference binding energies for XPS, a promising approach for distinguishing hydroxyl from amine groups is presented. The protocol was verified on different polymers, including poly(allylamine), poly(ethyleneimine), poly(vinylalcohol) and chitosan, the latter containing both types of the addressed chemical groups.
Nindl, Bradley C; Alemany, Joseph A; Rarick, Kevin R; Eagle, Shawn R; Darnell, Mathew E; Allison, Katelyn F; Harman, Everett A
2017-02-01
The purpose of this study was to: 1) evaluate differential responses of the IGF-I system to either a calisthenic- or resistance exercise-based program and 2) determine if this chronic training altered the IGF-I system during an acute resistance exercise protocol. Thirty-two volunteers were randomly assigned into a resistance exercise-based training (RT) group (n=15, 27±5y, 174±6cm, 81±12kg) or a calisthenic-based training group (CT) (n=17, 29±5y, 179±8cm, 85±10kg) and all underwent 8 weeks of exercise training (1.5h/d, 5d/wk). Basal blood was sampled pre- (Week 0), mid- (Week 4) and post-training (Week 8) and assayed for IGF-I system analytes. An acute resistance exercise protocol (AREP) was conducted pre- and post-training, consisting of 6 sets of 10 repetitions of the squat with two minutes of rest between sets, and the IGF-I system analytes were measured. A repeated measures ANOVA (p≤0.05) was used for statistical analysis. No interaction or within-subject effects were observed for basal total IGF-I, free IGF-I, or IGFBP-1. IGFBP-2 (pre; 578.6±295.7
Outgassing and dimensional changes of polymer matrix composites in space
NASA Technical Reports Server (NTRS)
Tennyson, R. C.; Matthews, R.
1993-01-01
A thermal-vacuum outgassing model and test protocol for predicting outgassing times and dimensional changes for polymer matrix composites is described. Experimental results derived from a 'control' sample are used to provide the basis for analytical predictions to compare with the outgassing response of Long Duration Exposure Facility (LDEF) flight samples.
A new method for toxaphene and toxaphene congener determination has been proposed by OSW as the response to an internal report from the OIG relative to toxaphene determination. In the course of this development, ORD was asked to prepare a new GC/NIMS protocol for 8081 analytes t...
Ozarda, Yesim; Ichihara, Kiyoshi; Barth, Julian H; Klee, George
2013-05-01
The reference intervals (RIs) given in laboratory reports have an important role in aiding clinicians in interpreting test results in reference to values of healthy populations. In this report, we present a proposed protocol and standard operating procedures (SOPs) for common use in conducting multicenter RI studies on a national or international scale. The protocols and consensus on their contents were refined through discussions in recent C-RIDL meetings. The protocol describes in detail (1) the scheme and organization of the study, (2) the target population, inclusion/exclusion criteria, ethnicity, and sample size, (3) health status questionnaire, (4) target analytes, (5) blood collection, (6) sample processing and storage, (7) assays, (8) cross-check testing, (9) ethics, (10) data analyses, and (11) reporting of results. In addition, the protocol proposes the common measurement of a panel of sera when no standard materials exist for harmonization of test results. It also describes the requirements of the central laboratory, including the method of cross-check testing between the central laboratory of each country and local laboratories. This protocol and the SOPs remain largely exploratory and may require a reevaluation from the practical point of view after their implementation in the ongoing worldwide study. The paper is mainly intended to be a basis for discussion in the scientific community.
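Once such harmonized data are collected, a reference interval is conventionally estimated nonparametrically as the central 95% of values from qualifying reference individuals; the sketch below shows that final computation on simulated placeholder data, not any analysis prescribed by the protocol itself.

```python
# Small sketch of the conventional nonparametric reference-interval estimate
# (central 95%, i.e. the 2.5th and 97.5th percentiles). Simulated values are
# placeholders, not data from the multicenter study.
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical results for one analyte from qualifying reference individuals
values = rng.normal(loc=4.8, scale=0.6, size=520)

lower, upper = np.percentile(values, [2.5, 97.5])
print(f"n = {values.size}, reference interval: {lower:.2f} - {upper:.2f}")
```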
Garcia, Jessica; Dusserre, Eric; Cheynet, Valérie; Bringuier, Pierre Paul; Brengle-Pesce, Karen; Wozny, Anne-Sophie; Rodriguez-Lafrasse, Claire; Freyer, Gilles; Brevet, Marie; Payen, Léa; Couraud, Sébastien
2017-01-01
Non-invasive somatic detection assays are suitable for repetitive tumor characterization or for detecting the appearance of somatic resistance during lung cancer. Molecular diagnosis based on circulating free DNA (cfDNA) offers the opportunity to track the genomic evolution of the tumor, and was chosen to assess the molecular profile of several EGFR alterations, including deletions in exon 19 (delEX19), the L858R substitution on exon 21 and the EGFR resistance mutation T790M on exon 20. Our study aimed at determining optimal pre-analytical conditions and EGFR mutation detection assays for analyzing cfDNA using the picoliter-droplet digital polymerase chain reaction (ddPCR) assay. Within the framework of the CIRCAN project set up at the Lyon University Hospital, plasma samples were collected to establish a pre-analytical and analytical workflow for cfDNA analysis. We evaluated all of the steps from blood sampling to mutation detection output, including shipping conditions (4H versus 24H in EDTA tubes), the reproducibility of cfDNA extraction, the specificity/sensitivity of ddPCR (using external controls), and the comparison of different PCR assays for the detection of the three most important EGFR hotspots, which highlighted the increased sensitivity of our in-house primers/probes. Hence, we have described a new protocol facilitating the molecular detection of somatic mutations in cancer patients from liquid biopsies, improving their diagnosis and introducing a less traumatic monitoring system during tumor progression. PMID:29152135
Pereira, Jorge; Câmara, José S; Colmsjö, Anders; Abdel-Rehim, Mohamed
2014-06-01
Sample preparation is an important analytical step for the isolation and concentration of desired components from complex matrices, and it greatly influences their reliable and accurate analysis and data quality. It is the most labor-intensive and error-prone process in analytical methodology and, therefore, may influence the analytical performance of target analyte quantification. Many conventional sample preparation methods are relatively complicated, involving time-consuming procedures and requiring large volumes of organic solvents. Recent trends in sample preparation include miniaturization, automation, high-throughput performance, on-line coupling with analytical instruments and low-cost operation through extremely low volume or no solvent consumption. Micro-extraction techniques, such as micro-extraction by packed sorbent (MEPS), have these advantages over the traditional techniques. This paper gives an overview of the MEPS technique, including the role of sample preparation in bioanalysis, a description of MEPS, namely MEPS formats (on- and off-line), sorbents, experimental procedures and protocols, factors that affect MEPS performance, and the major advantages and limitations of MEPS compared with other sample preparation techniques. We also summarize recent MEPS applications in bioanalysis. Copyright © 2014 John Wiley & Sons, Ltd.
Scalable Multicast Protocols for Overlapped Groups in Broker-Based Sensor Networks
NASA Astrophysics Data System (ADS)
Kim, Chayoung; Ahn, Jinho
In sensor networks, many overlapped multicast groups arise because numerous subscribers, each with potentially varying specific interests, query every event from sensors/publishers. Gossip-based communication protocols are promising as a potential solution for providing scalability in the publish/subscribe (P/S) paradigm in sensor networks. However, despite the importance of both guaranteeing message delivery order and supporting overlapped multicast groups in sensor or P2P networks, little research exists on gossip-based protocols that satisfy all of these requirements. In this paper, we present two versions of protocols guaranteeing causally ordered delivery for overlapped multicast groups. One is based on sensor-brokers as delegates, and the other is based on local views and delegates representing subscriber subgroups. In the sensor-broker-based protocol, the sensor-broker organizes overlapped multicast networks according to subscribers' interests. The message delivery order is guaranteed consistently, and all multicast messages are delivered to overlapped subscribers by the sensor-broker using gossip-based protocols. These features make the sensor-broker-based protocol considerably more scalable than protocols relying on hierarchical membership lists of dedicated groups, such as traditional committee protocols. The subscriber-delegate-based protocol is more robust than fully decentralized protocols that guarantee causally ordered delivery based only on local views, because the message delivery order is guaranteed consistently by all corresponding members of the groups, including the delegates. This makes the subscriber-delegate protocol a hybrid approach that improves the inherent scalability of multicast by applying the gossip-based technique to all communications.
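For readers unfamiliar with causal ordering, the sketch below shows the standard vector-clock delivery test that such protocols enforce at each subscriber; it is the generic textbook rule, not the specific sensor-broker or subscriber-delegate algorithm of the paper.

```python
# Sketch of the standard vector-clock test for causally ordered delivery
# (generic textbook rule, not the paper's algorithm). A message from sender j
# carrying vector clock V can be delivered at a node with local clock L iff
# V[j] == L[j] + 1 and V[k] <= L[k] for every other k.

def can_deliver(local: list[int], msg_clock: list[int], sender: int) -> bool:
    return (msg_clock[sender] == local[sender] + 1 and
            all(msg_clock[k] <= local[k]
                for k in range(len(local)) if k != sender))

def deliver(local: list[int], msg_clock: list[int], sender: int) -> None:
    local[sender] = msg_clock[sender]

local = [0, 0, 0]                # subscriber's view of 3 publishers/delegates
m1 = ([1, 0, 0], 0)              # first message from node 0
m2 = ([1, 1, 0], 1)              # message from node 1 that causally follows m1

# m2 arrives first via gossip: it must be buffered until m1 has been delivered
print(can_deliver(local, *m2))   # False -> buffer
print(can_deliver(local, *m1))   # True
deliver(local, *m1)
print(can_deliver(local, *m2))   # True now
```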
Fourier, Anthony; Portelius, Erik; Zetterberg, Henrik; Blennow, Kaj; Quadrio, Isabelle; Perret-Liaudet, Armand
2015-09-20
A panel of cerebrospinal fluid (CSF) biomarkers including total Tau (t-Tau), phosphorylated Tau protein at residue 181 (p-Tau) and β-amyloid peptides (Aβ42 and Aβ40), is frequently used as an aid in Alzheimer's disease (AD) diagnosis for young patients with cognitive impairment, for predicting prodromal AD in mild cognitive impairment (MCI) subjects, for AD discrimination in atypical clinical phenotypes and for inclusion/exclusion and stratification of patients in clinical trials. Due to variability in absolute levels between laboratories, there is no consensus on medical cut-off value for the CSF AD signature. Thus, for full implementation of this core AD biomarker panel in clinical routine, this issue has to be solved. Variability can be explained both by pre-analytical and analytical factors. For example, the plastic tubes used for CSF collection and storage, the lack of reference material and the variability of the analytical protocols were identified as important sources of variability. The aim of this review is to highlight these pre-analytical and analytical factors and describe efforts done to counteract them in order to establish cut-off values for core CSF AD biomarkers. This review will give the current state of recommendations. Copyright © 2015. Published by Elsevier B.V.
Willingham, D.; Brenes, D. A.; Winograd, N.; Wucher, A.
2010-01-01
Molecular depth profiles of model organic thin films were performed using a 40 keV C60+ cluster ion source in concert with TOF-SIMS. Strong-field photoionization of intact neutral molecules sputtered by 40 keV C60+ primary ions was used to analyze changes in the chemical environment of the guanine thin films as a function of ion fluence. Direct comparison of the secondary ion and neutral components of the molecular depth profiles yields valuable information about chemical damage accumulation as well as changes in the molecular ionization probability. An analytical protocol based on the erosion dynamics model is developed and evaluated using guanine and trehalose molecular secondary ion signals with and without comparable laser photoionization data. PMID:26269660
Kallem, Venkat Reddy; Pandita, Aakash; Gupta, Girish
2017-01-01
Hypoglycemia is the most common metabolic disorder encountered in neonates. The definition of hypoglycemia, as well as its clinical significance and management, remains controversial. Most cases of neonatal hypoglycemia are transient, respond readily to treatment, and are associated with an excellent prognosis. Persistent hypoglycemia is more likely to be associated with abnormal endocrine conditions, such as hyperinsulinemia, as well as possible neurologic sequelae. Manifestations of hypoglycemia include seizures, which can result in significant long-term neurologic morbidity. Thus, hypoglycemia constitutes a neonatal emergency that requires urgent diagnostic assessment and prompt treatment. In this review, we cover the pathophysiology, the screening protocol for high-risk babies, management, and the long-term neurologic sequelae associated with neonatal hypoglycemia, with evidence-based answers wherever possible, together with our own practices. PMID:29276423
Entanglement and Wigner Function Negativity of Multimode Non-Gaussian States
NASA Astrophysics Data System (ADS)
Walschaers, Mattia; Fabre, Claude; Parigi, Valentina; Treps, Nicolas
2017-11-01
Non-Gaussian operations are essential to exploit the quantum advantages in optical continuous variable quantum information protocols. We focus on mode-selective photon addition and subtraction as experimentally promising processes to create multimode non-Gaussian states. Our approach is based on correlation functions, as is common in quantum statistical mechanics and condensed matter physics, mixed with quantum optics tools. We formulate an analytical expression of the Wigner function after the subtraction or addition of a single photon, for arbitrarily many modes. It is used to demonstrate entanglement properties specific to non-Gaussian states and also leads to a practical and elegant condition for Wigner function negativity. Finally, we analyze the potential of photon addition and subtraction for an experimentally generated multimode Gaussian state.
Reddy, Alavalapati Goutham; Das, Ashok Kumar; Odelu, Vanga; Yoo, Kee-Young
2016-01-01
Biometric-based authentication protocols for multi-server architectures have gained momentum in recent times due to advancements in wireless technologies and associated constraints. Lu et al. recently proposed a robust biometric-based authentication with key agreement protocol for a multi-server environment using smart cards. They claimed that their protocol is efficient and resistant to prominent security attacks. The careful investigation in this paper shows that Lu et al.'s protocol does not provide user anonymity or perfect forward secrecy and is susceptible to server and user impersonation attacks, man-in-the-middle attacks and clock synchronization problems. In addition, this paper proposes an enhanced biometric-based authentication with key-agreement protocol for multi-server architecture based on elliptic curve cryptography using smartcards. We proved that the proposed protocol achieves mutual authentication using Burrows-Abadi-Needham (BAN) logic. The formal security of the proposed protocol is verified using the AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to show that our protocol can withstand active and passive attacks. The formal and informal security analyses and the performance analysis demonstrate that the proposed protocol is robust and efficient compared to Lu et al.'s protocol and existing similar protocols. PMID:27163786
Hellmuth, Christian; Weber, Martina; Koletzko, Berthold; Peissner, Wolfgang
2012-02-07
Despite their central importance for lipid metabolism, straightforward quantitative methods for determination of nonesterified fatty acid (NEFA) species are still missing. The protocol presented here provides unbiased quantitation of plasma NEFA species by liquid chromatography-tandem mass spectrometry (LC-MS/MS). Simple deproteination of plasma in organic solvent solution yields high accuracy, including both the unbound and initially protein-bound fractions, while avoiding interferences from hydrolysis of esterified fatty acids from other lipid classes. Sample preparation is fast and inexpensive, hence well suited for automation and high-throughput applications. Separation of isotopologic NEFA is achieved using ultrahigh-performance liquid chromatography (UPLC) coupled to triple quadrupole LC-MS/MS detection. In combination with automated liquid handling, total assay time per sample is less than 15 min. The analytical spectrum extends beyond readily available NEFA standard compounds by a regression model predicting all the relevant analytical parameters (retention time, ion path settings, and response factor) of NEFA species based on chain length and number of double bonds. Detection of 50 NEFA species and accurate quantification of 36 NEFA species in human plasma is described, the highest numbers ever reported for an LC-MS application. Accuracy and precision are within widely accepted limits. The use of qualifier ions supports unequivocal analyte verification. © 2012 American Chemical Society
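The regression model mentioned above (predicting retention time, ion path settings, and response factor from chain length and number of double bonds) is not spelled out in the abstract; the sketch below shows one plausible minimal form, a linear fit of retention time against chain length and double-bond count, using entirely hypothetical calibration values.

```python
import numpy as np

# Hypothetical calibration data: (carbon chain length, number of double bonds, observed RT in min).
# Values are illustrative only; the published model form and coefficients may differ.
calib = np.array([
    [14, 0, 4.1],
    [16, 0, 5.0],
    [16, 1, 4.6],
    [18, 0, 5.9],
    [18, 1, 5.5],
    [18, 2, 5.1],
    [20, 4, 4.9],
])

# Design matrix for RT ~ a*chain_length + b*double_bonds + intercept
X = np.column_stack([calib[:, 0], calib[:, 1], np.ones(len(calib))])
y = calib[:, 2]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
a, b, c = coef

def predict_rt(chain_length, double_bonds):
    """Predict retention time for a NEFA species not covered by available standards."""
    return a * chain_length + b * double_bonds + c

print(f"RT(22:6) predicted at {predict_rt(22, 6):.2f} min")
```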
Li, Xinchun; Chen, Zuanguang; Yang, Fan; Pan, Jianbin; Li, Yinbao
2013-05-01
L-3,4-dihydroxyphenylalanine (L-DOPA) is a well-recognized therapeutic compound for Parkinson's disease. Tyrosine is a precursor for the biosynthesis of L-DOPA, and both are widely found in the traditional medicinal material Mucuna pruriens. In this paper, we describe a validated novel analytical method based on microchip capillary electrophoresis with pulsed electrochemical detection for the simultaneous measurement of L-DOPA and tyrosine in M. pruriens. This protocol adopted end-channel amperometric detection using a platinum disk electrode on a homemade glass/polydimethylsiloxane electrophoresis microchip. The background buffer consisted of 10 mM borate (pH 9.5) and 0.02 mM cetyltrimethylammonium bromide, which produced an effective resolution of the two analytes. Under the optimal conditions, sufficient electrophoretic separation and sensitive detection of the target analytes can be achieved within 60 s. Both tyrosine and L-DOPA yielded linear responses in the concentration range of 5.0-400 μM (R(2) > 0.99), and the LODs were 0.79 and 1.1 μM, respectively. The accuracy and precision of the established method were favorable. The present method offers several merits, such as simple apparatus, high speed, low cost and minimal pollution, and provides a means for the assay of pharmacologically active ingredients in M. pruriens. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
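LODs like those reported above are conventionally derived from the calibration slope and the baseline noise; the sketch below illustrates the common 3.3·sigma/slope estimate with hypothetical peak currents and an assumed blank standard deviation, not the authors' actual calibration data.

```python
import numpy as np

# Hypothetical calibration of peak current (nA) vs. concentration (uM); values are illustrative.
conc = np.array([5.0, 25.0, 50.0, 100.0, 200.0, 400.0])
peak = np.array([0.9, 4.4, 8.7, 17.9, 35.2, 71.0])

slope, intercept = np.polyfit(conc, peak, 1)
r2 = np.corrcoef(conc, peak)[0, 1] ** 2

sigma_blank = 0.06  # standard deviation of the blank/baseline signal (nA), assumed
lod = 3.3 * sigma_blank / slope  # common 3.3*sigma/slope convention

print(f"slope = {slope:.3f} nA/uM, R^2 = {r2:.4f}, estimated LOD = {lod:.2f} uM")
```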
Butler, Owen; Forder, James; Saunders, John
2015-03-15
Workers in the pharmaceutical industry can potentially be exposed to airborne dusts and powders that can contain potent active pharmaceutical ingredients (API). Occupational hygienists and health and safety professionals need to assess and ultimately minimise such inhalation and dermal exposure risks. Containment of dusts at source is the first line of defence, but the performance of such technologies needs to be verified, for which purpose the good practice guide Assessing the Particulate Containment Performance of Pharmaceutical Equipment, produced by the International Society for Pharmaceutical Engineering (ISPE), is a widely used reference document. This guide recommends the use of surrogate powders to challenge the performance of such containment systems. Materials such as lactose and mannitol are recommended because their physical properties (adhesion, compactability, dustiness, flow characteristics and particle sizes) mimic those of the API-containing materials typically handled. Furthermore, they are safe materials to use, are available in high purity and can be procured at a reasonable cost. The aim of this work was to develop and validate a sensitive ion-chromatography-based analytical procedure for the determination of surrogate powders collected on filter samples so as to meet the analytical requirements set out in this ISPE guide. Crown Copyright © 2014. Published by Elsevier B.V. All rights reserved.
Abi-Jaoude, Alexxa; Johnson, Andrew; Ferguson, Genevieve; Sanches, Marcos; Levinson, Andrea; Robb, Janine; Heffernan, Olivia; Herzog, Tyson; Chaim, Gloria; Cleverley, Kristin; Eysenbach, Gunther; Henderson, Joanna; S Hoch, Jeffrey; Hollenberg, Elisa; Jiang, Huan; Isaranuwatchai, Wanrudee; Law, Marcus; Sharpe, Sarah; Tripp, Tim; Voineskos, Aristotle
2016-01-01
Background Seventy percent of lifetime cases of mental illness emerge prior to age 24. While early detection and intervention can address approximately 70% of child and youth cases of mental health concerns, the majority of youth with mental health concerns do not receive the services they need. Objective The objective of this paper is to describe the protocol for optimizing and evaluating Thought Spot, a Web- and mobile-based platform cocreated with end users that is designed to improve the ability of students to access mental health and substance use services. Methods This project will be conducted in 2 distinct phases, which will aim to (1) optimize the existing Thought Spot electronic health/mobile health intervention through youth engagement, and (2) evaluate the impact of Thought Spot on self-efficacy for mental health help-seeking and health literacy among university and college students. Phase 1 will utilize participatory action research and participatory design research to cocreate and coproduce solutions with members of our target audience. Phase 2 will consist of a randomized controlled trial to test the hypothesis that the Thought Spot intervention will show improvements in intentions for, and self-efficacy in, help-seeking for mental health concerns. Results We anticipate that enhancements will include (1) user analytics and feedback mechanisms, (2) peer mentorship and/or coaching functionality, (3) crowd-sourcing and data hygiene, and (4) integration of evidence-based consumer health and research information. Conclusions This protocol outlines the important next steps in understanding the impact of the Thought Spot platform on the behavior of postsecondary, transition-aged youth students when they seek information and services related to mental health and substance use. PMID:27815232
Abbatiello, Susan E.; Mani, D. R.; Schilling, Birgit; MacLean, Brendan; Zimmerman, Lisa J.; Feng, Xingdong; Cusack, Michael P.; Sedransk, Nell; Hall, Steven C.; Addona, Terri; Allen, Simon; Dodder, Nathan G.; Ghosh, Mousumi; Held, Jason M.; Hedrick, Victoria; Inerowicz, H. Dorota; Jackson, Angela; Keshishian, Hasmik; Kim, Jong Won; Lyssand, John S.; Riley, C. Paige; Rudnick, Paul; Sadowski, Pawel; Shaddox, Kent; Smith, Derek; Tomazela, Daniela; Wahlander, Asa; Waldemarson, Sofia; Whitwell, Corbin A.; You, Jinsam; Zhang, Shucha; Kinsinger, Christopher R.; Mesri, Mehdi; Rodriguez, Henry; Borchers, Christoph H.; Buck, Charles; Fisher, Susan J.; Gibson, Bradford W.; Liebler, Daniel; MacCoss, Michael; Neubert, Thomas A.; Paulovich, Amanda; Regnier, Fred; Skates, Steven J.; Tempst, Paul; Wang, Mu; Carr, Steven A.
2013-01-01
Multiple reaction monitoring (MRM) mass spectrometry coupled with stable isotope dilution (SID) and liquid chromatography (LC) is increasingly used in biological and clinical studies for precise and reproducible quantification of peptides and proteins in complex sample matrices. Robust LC-SID-MRM-MS-based assays that can be replicated across laboratories and ultimately in clinical laboratory settings require standardized protocols to demonstrate that the analysis platforms are performing adequately. We developed a system suitability protocol (SSP), which employs a predigested mixture of six proteins, to facilitate performance evaluation of LC-SID-MRM-MS instrument platforms, configured with nanoflow-LC systems interfaced to triple quadrupole mass spectrometers. The SSP was designed for use with low multiplex analyses as well as high multiplex approaches when software-driven scheduling of data acquisition is required. Performance was assessed by monitoring of a range of chromatographic and mass spectrometric metrics including peak width, chromatographic resolution, peak capacity, and the variability in peak area and analyte retention time (RT) stability. The SSP, which was evaluated in 11 laboratories on a total of 15 different instruments, enabled early diagnoses of LC and MS anomalies that indicated suboptimal LC-MRM-MS performance. The observed range in variation of each of the metrics scrutinized serves to define the criteria for optimized LC-SID-MRM-MS platforms for routine use, with pass/fail criteria for system suitability performance measures defined as peak area coefficient of variation <0.15, peak width coefficient of variation <0.15, standard deviation of RT <0.15 min (9 s), and the RT drift <0.5min (30 s). The deleterious effect of a marginally performing LC-SID-MRM-MS system on the limit of quantification (LOQ) in targeted quantitative assays illustrates the use and need for a SSP to establish robust and reliable system performance. Use of a SSP helps to ensure that analyte quantification measurements can be replicated with good precision within and across multiple laboratories and should facilitate more widespread use of MRM-MS technology by the basic biomedical and clinical laboratory research communities. PMID:23689285
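The numeric pass/fail thresholds quoted above can be applied mechanically to replicate injections of the SSP mixture; a minimal sketch follows, using hypothetical replicate values and approximating RT drift as the max-minus-min retention time, which may differ from the SSP's exact drift definition.

```python
import numpy as np

# Replicate measurements for one SSP peptide across repeated injections (hypothetical values).
peak_area  = np.array([1.02e6, 0.97e6, 1.05e6, 0.99e6, 1.01e6])
peak_width = np.array([0.21, 0.22, 0.20, 0.23, 0.21])      # minutes
rt         = np.array([17.42, 17.45, 17.40, 17.48, 17.44]) # minutes

def cv(x):
    """Coefficient of variation (sample standard deviation over mean)."""
    return np.std(x, ddof=1) / np.mean(x)

metrics = {
    "peak area CV < 0.15":   cv(peak_area) < 0.15,
    "peak width CV < 0.15":  cv(peak_width) < 0.15,
    "RT std dev < 0.15 min": np.std(rt, ddof=1) < 0.15,
    "RT drift < 0.5 min":    (rt.max() - rt.min()) < 0.5,
}

for criterion, passed in metrics.items():
    print(f"{criterion}: {'PASS' if passed else 'FAIL'}")
print("system suitability:", "PASS" if all(metrics.values()) else "FAIL")
```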
A randomized trial of protocol-based care for early septic shock.
Yealy, Donald M; Kellum, John A; Huang, David T; Barnato, Amber E; Weissfeld, Lisa A; Pike, Francis; Terndrup, Thomas; Wang, Henry E; Hou, Peter C; LoVecchio, Frank; Filbin, Michael R; Shapiro, Nathan I; Angus, Derek C
2014-05-01
In a single-center study published more than a decade ago involving patients presenting to the emergency department with severe sepsis and septic shock, mortality was markedly lower among those who were treated according to a 6-hour protocol of early goal-directed therapy (EGDT), in which intravenous fluids, vasopressors, inotropes, and blood transfusions were adjusted to reach central hemodynamic targets, than among those receiving usual care. We conducted a trial to determine whether these findings were generalizable and whether all aspects of the protocol were necessary. In 31 emergency departments in the United States, we randomly assigned patients with septic shock to one of three groups for 6 hours of resuscitation: protocol-based EGDT; protocol-based standard therapy that did not require the placement of a central venous catheter, administration of inotropes, or blood transfusions; or usual care. The primary end point was 60-day in-hospital mortality. We tested sequentially whether protocol-based care (EGDT and standard-therapy groups combined) was superior to usual care and whether protocol-based EGDT was superior to protocol-based standard therapy. Secondary outcomes included longer-term mortality and the need for organ support. We enrolled 1341 patients, of whom 439 were randomly assigned to protocol-based EGDT, 446 to protocol-based standard therapy, and 456 to usual care. Resuscitation strategies differed significantly with respect to the monitoring of central venous pressure and oxygen and the use of intravenous fluids, vasopressors, inotropes, and blood transfusions. By 60 days, there were 92 deaths in the protocol-based EGDT group (21.0%), 81 in the protocol-based standard-therapy group (18.2%), and 86 in the usual-care group (18.9%) (relative risk with protocol-based therapy vs. usual care, 1.04; 95% confidence interval [CI], 0.82 to 1.31; P=0.83; relative risk with protocol-based EGDT vs. protocol-based standard therapy, 1.15; 95% CI, 0.88 to 1.51; P=0.31). There were no significant differences in 90-day mortality, 1-year mortality, or the need for organ support. In a multicenter trial conducted in the tertiary care setting, protocol-based resuscitation of patients in whom septic shock was diagnosed in the emergency department did not improve outcomes. (Funded by the National Institute of General Medical Sciences; ProCESS ClinicalTrials.gov number, NCT00510835.).
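The reported risk ratios and confidence intervals can be reproduced from the group sizes and 60-day death counts given in the abstract, assuming the standard log-normal (Katz) approximation for the CI of a risk ratio; a quick check follows, and the rounded output matches the reported 1.04 (0.82 to 1.31) and 1.15 (0.88 to 1.51).

```python
import math

def risk_ratio(events_a, n_a, events_b, n_b, z=1.96):
    """Risk ratio of group A vs. group B with a 95% CI (log-normal approximation)."""
    rr = (events_a / n_a) / (events_b / n_b)
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo, hi = (math.exp(math.log(rr) + s * z * se) for s in (-1, 1))
    return rr, lo, hi

# 60-day in-hospital deaths and group sizes reported in the abstract.
egdt, std, usual = (92, 439), (81, 446), (86, 456)
protocol_events = egdt[0] + std[0]
protocol_n = egdt[1] + std[1]

print("protocol-based vs usual care: RR = %.2f (%.2f-%.2f)"
      % risk_ratio(protocol_events, protocol_n, *usual))
print("EGDT vs protocol-based standard: RR = %.2f (%.2f-%.2f)"
      % risk_ratio(*egdt, *std))
```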
Go, Young-Mi; Walker, Douglas I.; Liang, Yongliang; Uppal, Karan; Soltow, Quinlyn A.; Tran, ViLinh; Strobel, Frederick; Quyyumi, Arshed A.; Ziegler, Thomas R.; Pennell, Kurt D.; Miller, Gary W.; Jones, Dean P.
2015-01-01
The exposome is the cumulative measure of environmental influences and associated biological responses throughout the lifespan, including exposures from the environment, diet, behavior, and endogenous processes. A major challenge for exposome research lies in the development of robust and affordable analytic procedures to measure the broad range of exposures and associated biologic impacts occurring over a lifetime. Biomonitoring is an established approach to evaluate internal body burden of environmental exposures, but use of biomonitoring for exposome research is often limited by the high costs associated with quantification of individual chemicals. High-resolution metabolomics (HRM) uses ultra-high resolution mass spectrometry with minimal sample preparation to support high-throughput relative quantification of thousands of environmental, dietary, and microbial chemicals. HRM also measures metabolites in most endogenous metabolic pathways, thereby providing simultaneous measurement of biologic responses to environmental exposures. The present research examined quantification strategies to enhance the usefulness of HRM data for cumulative exposome research. The results provide a simple reference standardization protocol in which individual chemical concentrations in unknown samples are estimated by comparison to a concurrently analyzed, pooled reference sample with known chemical concentrations. The approach was tested using blinded analyses of amino acids in human samples and was found to be comparable to independent laboratory results based on surrogate standardization or internal standardization. Quantification was reproducible over a 13-month period and extrapolated to thousands of chemicals. The results show that reference standardization protocol provides an effective strategy that will enhance data collection for cumulative exposome research. In principle, the approach can be extended to other types of mass spectrometry and other analytical methods. PMID:26358001
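A minimal sketch of the single-point reference standardization idea follows: an analyte concentration in an unknown sample is estimated by scaling its measured intensity to that of a concurrently analyzed pooled reference with a known assigned concentration. The proportional-response assumption and all numeric values are illustrative, not taken from the study.

```python
def reference_standardize(intensity_sample, intensity_reference, conc_reference):
    """Estimate an analyte concentration in an unknown sample by single-point
    comparison to a pooled reference sample with a known concentration,
    assuming a proportional (through-zero) response in this intensity range."""
    return intensity_sample / intensity_reference * conc_reference

# Illustrative values: one analyte's peak intensities (arbitrary units) and the
# concentration assigned to the reference pool; numbers are hypothetical.
ref_intensity = 8.4e5       # mean intensity of the pooled reference in the batch
ref_conc_uM = 25.0          # concentration assigned to the reference pool
sample_intensity = 6.1e5

conc = reference_standardize(sample_intensity, ref_intensity, ref_conc_uM)
print(f"estimated concentration: {conc:.1f} uM")
```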
Insights from Smart Meters: The Potential for Peak-Hour Savings from Behavior-Based Programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Todd, Annika; Perry, Michael; Smith, Brian
The rollout of smart meters in the last several years has opened up new forms of previously unavailable energy data. Many utilities are now able to capture, in real time, granular, household-level interval usage data at very high frequency for a large proportion of their residential and small commercial customer population. This can be linked to other time- and location-specific information, providing vast, constantly growing streams of rich data (sometimes referred to by the recently popular buzzword, "big data"). Within the energy industry there is increasing interest in tapping into the opportunities that these data can provide. What can we do with all of these data? The richness and granularity of these data enable many types of creative and cutting-edge analytics. Technically sophisticated and rigorous statistical techniques can be used to pull interesting insights out of this high-frequency, human-focused data. We at LBNL are calling this "behavior analytics". This kind of analytics has the potential to provide tremendous value to a wide range of energy programs. For example, highly disaggregated and heterogeneous information about actual energy use would allow energy efficiency (EE) and/or demand response (DR) program implementers to target specific programs to specific households; would enable evaluation, measurement and verification (EM&V) of energy efficiency programs to be performed on a much shorter time horizon than was previously possible; and would provide better insights into the energy and peak-hour savings associated with specific types of EE and DR programs (e.g., behavior-based (BB) programs). In this series, "Insights from Smart Meters", we will present concrete, illustrative examples of the type of value that insights from behavior analytics of these data can provide (as well as pointing out its limitations). We will supply several types of key findings, including: • Novel results, which answer questions the industry previously was unable to answer; • Proof-of-concept analytics tools that can be adapted and used by others; and • Guidelines and protocols that summarize analytical best practices. This report focuses on one example of the kind of value that analysis of these data can provide: insights into whether behavior-based (BB) efficiency programs have the potential to provide peak-hour energy savings.
Ribes, Àngela; Santiago‐Felipe, Sara; Bernardos, Andrea; Marcos, M. Dolores; Pardo, Teresa; Sancenón, Félix; Aznar, Elena
2017-01-01
Aptamers have been used as recognition elements for several molecules due to their great affinity and selectivity. Additionally, mesoporous nanomaterials have demonstrated great potential in sensing applications. Based on these concepts, we report herein the use of two aptamer-capped mesoporous silica materials for the selective detection of ochratoxin A (OTA). A specific aptamer for OTA was used to block the pores of rhodamine B-loaded mesoporous silica nanoparticles. Two solids were prepared in which the aptamer capped the porous scaffolds by using a covalent or electrostatic approach. Whereas the prepared materials remained capped in water, dye delivery was selectively observed in the presence of OTA. The protocol showed excellent analytical performance in terms of sensitivity (limit of detection: 0.5–0.05 nM), reproducibility, and selectivity. Moreover, the aptasensors were tested for OTA detection in commercial foodstuff matrices, which demonstrated their potential applicability in real samples. PMID:29046860
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hertzler, C.L.; Poloski, J.P.; Bates, R.A.
1988-01-01
The Compliance Program Data Management System (DMS) developed at the Idaho National Engineering Laboratory (INEL) validates and maintains the integrity of data collected to support the Consent Order and Compliance Agreement (COCA) between the INEL and the Environmental Protection Agency (EPA). The system uses dBase III Plus programs and dBase III Plus in an interactive mode to enter, store, validate, manage, and retrieve analytical information provided on EPA Contract Laboratory Program (CLP) forms and CLP forms modified to accommodate 40 CFR 264 Appendix IX constituent analyses. Data analysis and presentation is performed utilizing SAS, a statistical analysis software program. Archiving of data and results is performed at appropriate stages of data management. The DMS is useful for sampling and analysis programs where adherence to EPA CLP protocol, along with maintenance and retrieval of waste site investigation sampling results is desired or requested. 3 refs.
Semiautomated Device for Batch Extraction of Metabolites from Tissue Samples
2012-01-01
Metabolomics has become a mainstream analytical strategy for investigating metabolism. The quality of data derived from these studies is proportional to the consistency of the sample preparation. Although considerable research has been devoted to finding optimal extraction protocols, most of the established methods require extensive sample handling. Manual sample preparation can be highly effective in the hands of skilled technicians, but an automated tool for purifying metabolites from complex biological tissues would be of obvious utility to the field. Here, we introduce the semiautomated metabolite batch extraction device (SAMBED), a new tool designed to simplify metabolomics sample preparation. We discuss SAMBED’s design and show that SAMBED-based extractions are of comparable quality to extracts produced through traditional methods (13% mean coefficient of variation from SAMBED versus 16% from manual extractions). Moreover, we show that aqueous SAMBED-based methods can be completed in less than a quarter of the time required for manual extractions. PMID:22292466
Gewirtz-Meydan, Ateret; Hafford-Letchfield, Trish; Ayalon, Liat; Benyamini, Yael; Biermann, Violetta; Coffey, Alice; Jackson, Jeanne; Phelan, Amanda; Voß, Peggy; Geiger Zeman, Marija; Zeman, Zdenko
2018-06-04
This study captured older people's attitudes and concerns about sex and sexuality in later life by synthesising qualitative research published on this issue. The systematic review was conducted between November 2015 and June 2016 based on a pre-determined protocol. Key words were used to ensure a precise search strategy. Empirically based, qualitative literature from 18 databases was found. Twenty studies met the inclusion criteria. Thomas and Harden's thematic synthesis was used to generate 'analytical themes' which summarise this body of literature. Three main themes were identified: (a) social legitimacy for sexuality in later life; (b) health, not age, is what truly impacts sexuality, and (c) the hegemony of penetrative sex. The themes illustrate the complex and delicate relation between ageing and sexuality. Older adults facing health issues that affect sexual function adopt broader definitions of sexuality and sexual activity.
Reanalysis of a 15-year Archive of IMPROVE Samples
NASA Astrophysics Data System (ADS)
Hyslop, N. P.; White, W. H.; Trzepla, K.
2013-12-01
The IMPROVE (Interagency Monitoring of PROtected Visual Environments) network monitors aerosol concentrations at 170 remote sites throughout the United States. Twenty-four-hour filter samples of particulate matter are collected every third day and analyzed for chemical composition. About 30 of the sites have operated continuously since 1988, and the sustained data record (http://views.cira.colostate.edu/web/) offers a unique window on regional aerosol trends. All elemental analyses have been performed by Crocker Nuclear Laboratory at the University of California in Davis, and sample filters collected since 1995 are archived on campus. The suite of reported elements has remained constant, but the analytical methods employed for their determination have evolved. For example, the elements Na - Mn were determined by PIXE until November 2001, then by XRF analysis in a He-flushed atmosphere through 2004, and by XRF analysis in vacuum since January 2005. In addition to these fundamental changes, incompletely documented operational factors such as detector performance and calibration details have introduced variations in the measurements. Because the past analytical methods were non-destructive, the archived filters can be re-analyzed with the current analytical systems and protocols. The 15-year sample archives from Great Smoky Mountains, Mount Rainier, and Point Reyes National Parks were selected for reanalysis. The agreement between the new analyses and original determinations varies with element and analytical era (Figure 1). Temporal trends for some elements are affected by these changes in measurement technique while others are not (Figure 2). Figure 1. Repeatability of analyses for sulfur and vanadium at Great Smoky Mountains National Park. Each point shows the ratio of mass loadings determined by the original analysis and recent reanalysis. Major method distinctions are indicated at the top. Figure 2. Trends, based on Theil-Sen regression, in lead concentrations based on the original and reanalysis data.
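Theil-Sen slopes like those behind Figure 2 can be computed directly with scipy; the sketch below uses hypothetical annual lead loadings for a single site, simply to show the mechanics of comparing trends from the original and reanalysis series.

```python
import numpy as np
from scipy import stats

# Hypothetical annual-mean lead loadings (ng/m^3) at one site, original vs. reanalysis.
years = np.arange(1995, 2010)
pb_original = np.array([1.9, 1.8, 1.7, 1.8, 1.6, 1.5, 1.5, 1.4,
                        1.3, 1.3, 1.2, 1.2, 1.1, 1.0, 1.0])
pb_reanalysis = pb_original * 0.92 + np.random.default_rng(0).normal(0, 0.05, years.size)

for label, series in [("original", pb_original), ("reanalysis", pb_reanalysis)]:
    slope, intercept, lo, hi = stats.theilslopes(series, years)
    print(f"{label}: Theil-Sen slope = {slope:.3f} ng/m^3 per year "
          f"(95% CI {lo:.3f} to {hi:.3f})")
```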
de Lecuona, Itziar
2018-05-31
The current model for reviewing research with human beings basically depends on decision-making processes within research ethics committees. These committees must be aware of the importance of the new digital paradigm based on the large-scale exploitation of datasets, including personal data on health. This article offers guidelines, applying the EU's General Data Protection Regulation, for the appropriate evaluation of projects that are based on the use of big data analytics in healthcare. The processes for gathering and using these data constitute a niche in which current research is developed. In this context, the existing protocols for obtaining informed consent from participants are outdated, as they are based on the assumption not only that personal data are anonymized, but that they will continue to be so in the future. As a result, it is essential that research ethics committees take on new capabilities and revisit values such as privacy and freedom, updating protocols, methodologies and working procedures. This change in work culture will provide legal security to the personnel involved in research, will make it possible to guarantee the privacy of data subjects, and will help orient the exploitation of data away from the commodification of personal data in this era of de-identification, so that research meets actual social needs rather than spurious or opportunistic interests disguised as research. Copyright © 2018 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.
Luo, Wei; Davis, Geoff; Li, LiXia; Shriver, M Kathleen; Mei, Joanne; Styer, Linda M; Parker, Monica M; Smith, Amanda; Paz-Bailey, Gabriela; Ethridge, Steve; Wesolowski, Laura; Owen, S Michele; Masciotra, Silvina
2017-06-01
FDA-approved antigen/antibody combo and HIV-1/2 differentiation supplemental tests do not have claims for dried blood spot (DBS) use. We compared two DBS-modified protocols, the Bio-Rad GS HIV Combo Ag/Ab (BRC) EIA and Geenius™ HIV-1/2 (Geenius) Supplemental Assay, to plasma protocols and evaluated them in the CDC/APHL HIV diagnostic algorithm. BRC-DBS p24 analytical sensitivity was calculated from serial dilutions of p24. DBS specimens included 11 HIV-1 seroconverters, 151 HIV-1-positive individuals, including 20 on antiretroviral therapy, 31 HIV-2-positive and one HIV-1/HIV-2-positive individuals. BRC-reactive specimens were tested with Geenius using the same DBS eluate. Matched plasma specimens were tested with BRC, an IgG/IgM immunoassay and Geenius. DBS and plasma results were compared using the McNemar's test. A DBS-algorithm applied to 348 DBS from high-risk individuals who participated in surveillance was compared to HIV status based on local testing algorithms. BRC-DBS detects p24 at a concentration 18 times higher than in plasma. In seroconverters, BRC-DBS detected more infections than the IgG/IgM immunoassay in plasma (p=0.0133), but fewer infections than BRC-plasma (p=0.0133). In addition, the BRC/Geenius-plasma algorithm identified more HIV-1 infections than the BRC/Geenius-DBS algorithm (p=0.0455). The DBS protocols correctly identified HIV status for established HIV-1 infections, including those on therapy, HIV-2 infections, and surveillance specimens. The DBS protocols exhibited promising performance and allowed rapid supplemental testing. Although the DBS algorithm missed some early infections, it showed similar results when applied to specimens from a high-risk population. Implementation of a DBS algorithm would benefit testing programs without capacity for venipuncture. Published by Elsevier B.V.
Shiwa, Yuh; Hachiya, Tsuyoshi; Furukawa, Ryohei; Ohmomo, Hideki; Ono, Kanako; Kudo, Hisaaki; Hata, Jun; Hozawa, Atsushi; Iwasaki, Motoki; Matsuda, Koichi; Minegishi, Naoko; Satoh, Mamoru; Tanno, Kozo; Yamaji, Taiki; Wakai, Kenji; Hitomi, Jiro; Kiyohara, Yutaka; Kubo, Michiaki; Tanaka, Hideo; Tsugane, Shoichiro; Yamamoto, Masayuki; Sobue, Kenji; Shimizu, Atsushi
2016-01-01
Differences in DNA collection protocols may be a potential confounder in epigenome-wide association studies (EWAS) using a large number of blood specimens from multiple biobanks and/or cohorts. Here we show that pre-analytical procedures involved in DNA collection can induce systematic bias in the DNA methylation profiles of blood cells that can be adjusted by cell-type composition variables. In Experiment 1, whole blood from 16 volunteers was collected to examine the effect of a 24 h storage period at 4°C on DNA methylation profiles as measured using the Infinium HumanMethylation450 BeadChip array. Our statistical analysis showed that the P-value distribution of more than 450,000 CpG sites was similar to the theoretical distribution (in quantile-quantile plot, λ = 1.03) when comparing two control replicates, which was remarkably deviated from the theoretical distribution (λ = 1.50) when comparing control and storage conditions. We then considered cell-type composition as a possible cause of the observed bias in DNA methylation profiles and found that the bias associated with the cold storage condition was largely decreased (λadjusted = 1.14) by taking into account a cell-type composition variable. As such, we compared four respective sample collection protocols used in large-scale Japanese biobanks or cohorts as well as two control replicates. Systematic biases in DNA methylation profiles were observed between control and three of four protocols without adjustment of cell-type composition (λ = 1.12–1.45) and no remarkable biases were seen after adjusting for cell-type composition in all four protocols (λadjusted = 1.00–1.17). These results revealed important implications for comparing DNA methylation profiles between blood specimens from different sources and may lead to discovery of disease-associated DNA methylation markers and the development of DNA methylation profile-based predictive risk models. PMID:26799745
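The λ values quoted from the quantile-quantile plots are genomic-inflation-style factors; one common way to compute such a factor from a vector of p-values is the ratio of the median observed 1-df chi-square statistic to the median of the null chi-square distribution. The sketch below shows that computation on simulated null p-values; the study's exact procedure and adjustment for cell-type composition may differ.

```python
import numpy as np
from scipy import stats

def inflation_lambda(p_values):
    """Inflation factor: median observed 1-df chi-square statistic divided by
    the median of the null chi-square distribution (about 0.455)."""
    chi2_obs = stats.chi2.isf(np.asarray(p_values), df=1)
    return np.median(chi2_obs) / stats.chi2.ppf(0.5, df=1)

# Under the null (uniform p-values across ~450,000 CpG sites) lambda should be close to 1.
rng = np.random.default_rng(1)
p_null = rng.uniform(size=450_000)
print(f"lambda under the null: {inflation_lambda(p_null):.3f}")
```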
Annual banned-substance review: analytical approaches in human sports drug testing.
Thevis, Mario; Kuuranne, Tiia; Geyer, Hans; Schänzer, Wilhelm
2014-01-01
Monitoring the misuse of drugs and the abuse of substances and methods potentially or evidently improving athletic performance by analytical chemistry strategies is one of the main pillars of modern anti-doping efforts. Owing to the continuously growing knowledge in medicine, pharmacology, and (bio)chemistry, new chemical entities are frequently established and developed, many of which present a temptation for sportsmen and women due to assumed or attributed beneficial effects of such substances and preparations on, for example, endurance, strength, and regeneration. By means of new technologies, expanded existing test protocols, and new insights into the metabolism, distribution, and elimination of compounds prohibited by the World Anti-Doping Agency (WADA), analytical assays have been further improved in agreement with the content of the 2013 Prohibited List. In this annual banned-substance review, literature concerning human sports drug testing that was published between October 2012 and September 2013 is summarized and reviewed with particular emphasis on analytical approaches and their contribution to enhanced doping controls. Copyright © 2013 John Wiley & Sons, Ltd.
MacGrogan, Gaëtan; Mathieu, Marie-Christine; Poulet, Bruno; Penault-Llorca, Frédérique; Vincent-Salomon, Anne; Roger, Pascal; Treilleux, Isabelle; Valent, Alexander; Antoine, Martine; Becette, Véronique; Bor, Catherine; Brabencova, Eva; Charafe-Jauffret, Emmanuelle; Chenard, Marie-Pierre; Dauplat, Marie-Mélanie; Delrée, Paul; Devouassoux, Mojgan; Fiche, Maryse; Fondrevelle, Marie-Eve; Fridman, Viviana; Garbar, Christian; Genin, Pascal; Ghnassia, Jean-Pierre; Haudebourg, Juliette; Laberge-Le Couteulx, Sophie; Loussouarn, Delphine; Maran-Gonzalez, Aurélie; Marcy, Myriam; Michenet, Patrick; Sagan, Christine; Trassard, Martine; Verriele, Véronique; Arnould, Laurent; Lacroix-Triki, Magali
2014-10-01
Biomarker assessment of breast cancer tumor samples is part of the routine workflow of pathology laboratories. International guidelines have recently been updated, with special regard to the pre-analytical steps that are critical for the quality of immunohistochemical and in situ hybridization procedures, whatever the biomarker analyzed. Fixation and specimen handling protocols must be standardized, validated and carefully tracked. Cooperation and training of the personnel involved in the specimen workflow (e.g. radiologists, surgeons, nurses, technicians and pathologists) are of paramount importance. This GEFPICS update of the recommendations details and comments on the different steps of the pre-analytical process. Application of these guidelines and participation in quality assurance programs are mandatory to ensure the correct evaluation of oncotheranostic biomarkers. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
NASA Astrophysics Data System (ADS)
Wu, Cheng; Huang, X. H. Hilda; Ng, Wai Man; Griffith, Stephen M.; Zhen Yu, Jian
2016-09-01
Organic carbon (OC) and elemental carbon (EC) are operationally defined by analytical methods. As a result, OC and EC measurements are protocol dependent, leading to uncertainties in their quantification. In this study, more than 1300 Hong Kong samples were analyzed using both the National Institute for Occupational Safety and Health (NIOSH) thermal optical transmittance (TOT) and the Interagency Monitoring of Protected Visual Environments (IMPROVE) thermal optical reflectance (TOR) protocols to explore the cause of EC disagreement between the two protocols. The EC discrepancy mainly (83 %) arises from a difference in the peak inert-mode temperature, which determines the allocation of OC4_NSH, while the rest (17 %) is attributed to a difference in the optical method (transmittance vs. reflectance) applied for the charring correction. Evidence shows that the magnitude of the EC discrepancy is positively correlated with the intensity of the biomass burning signal, whereby biomass burning increases the fraction of OC4_NSH and widens the disagreement in the inter-protocol EC determination. It is also found that the EC discrepancy is positively correlated with the abundance of metal oxides in the samples. Two approaches (M1 and M2) that translate NIOSH TOT OC and EC data into IMPROVE TOR OC and EC data are proposed. M1 uses a direct relationship between EC_NSH_TOT and EC_IMP_TOR for reconstruction: M1: EC_IMP_TOR = a × EC_NSH_TOT + b. M2 deconstructs EC_IMP_TOR into several terms based on analysis principles and applies regression only to the unknown terms: M2: EC_IMP_TOR = AEC_NSH + OC4_NSH − (a × PC_NSH_TOR + b), where AEC_NSH, the apparent EC by the NIOSH protocol, is the carbon that evolves in the He-O2 analysis stage, OC4_NSH is the carbon that evolves at the fourth temperature step of the pure-helium analysis stage of NIOSH, and PC_NSH_TOR is the pyrolyzed carbon as determined by the NIOSH protocol. Implementing M1 on all urban site data (without considering seasonal specificity) yields the following equation: M1 (urban data): EC_IMP_TOR = 2.20 × EC_NSH_TOT − 0.05. While both M1 and M2 are acceptable, M2 with site-specific parameters provides the best reconstruction performance. Secondary OC (SOC) estimation using OC and EC by the two protocols is compared. An analysis of the usability of the reconstructed EC_IMP_TOR and OC_IMP_TOR suggests that the reconstructed values are not suitable for SOC estimation due to the poor reconstruction of the OC/EC ratio.
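The two translation approaches can be written as one-line functions; in the sketch below the M1 coefficients are the urban-data values quoted in the abstract, while M2's site-specific a and b are left as user-supplied inputs because they are not given here.

```python
def ec_improve_tor_m1(ec_nsh_tot, a=2.20, b=-0.05):
    """M1: direct regression of IMPROVE TOR EC on NIOSH TOT EC.
    Default a, b are the urban-data values quoted in the abstract."""
    return a * ec_nsh_tot + b

def ec_improve_tor_m2(aec_nsh, oc4_nsh, pc_nsh_tor, a, b):
    """M2: EC_IMP_TOR = AEC_NSH + OC4_NSH - (a * PC_NSH_TOR + b),
    where a and b are site-specific regression parameters (assumed inputs)."""
    return aec_nsh + oc4_nsh - (a * pc_nsh_tor + b)

# Example: 3.5 ug/m^3 of NIOSH TOT EC translated with the urban M1 fit.
print(f"M1 estimate of IMPROVE TOR EC: {ec_improve_tor_m1(3.5):.2f} ug/m^3")
```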
A Group Neighborhood Average Clock Synchronization Protocol for Wireless Sensor Networks
Lin, Lin; Ma, Shiwei; Ma, Maode
2014-01-01
Clock synchronization is a very important issue for the applications of wireless sensor networks. The sensors need to keep a strict clock so that users can know exactly what happens in the monitoring area at the same time. This paper proposes a novel internal distributed clock synchronization solution using group neighborhood averaging. Each sensor node collects the offset and skew rate of its neighbors. Group averages of the offset and skew rate values are calculated instead of using the conventional point-to-point averaging method. The sensor node then returns the compensated value back to the neighbors. The propagation delay is considered and compensated. An analytical treatment of the offset and skew compensation is presented. Simulation results validate the effectiveness of the protocol and reveal that the protocol allows sensor networks to quickly establish a consensus clock and maintain a small deviation from the consensus clock. PMID:25120163
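A minimal sketch of the group-averaging idea follows: each node nudges its offset and skew toward the average over its neighborhood group rather than averaging pairwise. The update gain, message exchange and delay compensation details are assumptions, not the paper's exact equations.

```python
from statistics import mean

def group_average_update(my_offset, my_skew, neighbor_offsets, neighbor_skews, gain=0.5):
    """One synchronization round: move this node's clock offset and skew rate
    toward the average of its neighborhood group (rather than pairwise averaging).
    The gain and update form are illustrative, not the paper's exact equations."""
    group_offset = mean(neighbor_offsets + [my_offset])
    group_skew = mean(neighbor_skews + [my_skew])
    new_offset = my_offset + gain * (group_offset - my_offset)
    new_skew = my_skew + gain * (group_skew - my_skew)
    return new_offset, new_skew

# Example: a node 120 us ahead of its neighbors converges toward the group consensus.
offset, skew = 120e-6, 12e-6
for _ in range(5):
    offset, skew = group_average_update(offset, skew, [10e-6, -5e-6, 0.0], [2e-6, 1e-6, 0.0])
print(f"offset after 5 rounds: {offset * 1e6:.1f} us")
```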
SYTO probes: markers of apoptotic cell demise.
Wlodkowic, Donald; Skommer, Joanna
2007-10-01
As mechanistic studies on tumor cell death advance towards their ultimate translational goal, there is a need for specific, rapid, and high-throughput analytical tools to detect diverse cell demise modes. Patented DNA-binding SYTO probes, for example, are gaining increasing interest as easy-to-use markers of caspase-dependent apoptotic cell death. They are proving convenient for tracking apoptosis in diverse hematopoietic cell lines and primary tumor samples, and, due to their spectral characteristics, appear to be useful for the development of multiparameter flow cytometry assays. Herein, several protocols for multiparametric assessment of apoptotic events using SYTO probes are provided. There are protocols describing the use of green fluorescent SYTO 16 and red fluorescent SYTO 17 dyes in combination with plasma membrane permeability markers. Another protocol highlights the multiparametric use of SYTO 16 dye in conjunction with the mitochondrial membrane potential sensitive probe, tetramethylrhodamine methyl ester (TMRM), and the plasma membrane permeability marker, 7-aminoactinomycin D (7-AAD).
A new paper-based platform technology for point-of-care diagnostics.
Gerbers, Roman; Foellscher, Wilke; Chen, Hong; Anagnostopoulos, Constantine; Faghri, Mohammad
2014-10-21
Currently, lateral flow immunoassays (LFIAs) are not able to perform complex multi-step immunodetection tests because of their inability to introduce multiple reagents in a controlled manner to the detection area autonomously. In this research, a point-of-care (POC) paper-based lateral flow immunosensor was developed incorporating a novel microfluidic valve technology. Layers of paper and tape were used to create a three-dimensional structure to form the fluidic network. Unlike the existing LFIAs, multiple directional valves are embedded in the test strip layers to control the order and the timing of mixing for the sample and multiple reagents. In this paper, we report a four-valve device which autonomously directs three different fluids to flow sequentially over the detection area. As proof of concept, a three-step alkaline phosphatase based Enzyme-Linked ImmunoSorbent Assay (ELISA) protocol with Rabbit IgG as the model analyte was conducted to prove the suitability of the device for immunoassays. A detection limit of about 4.8 fM was obtained.
Imaging, Health Record, and Artificial Intelligence: Hype or Hope?
Mazzanti, Marco; Shirka, Ervina; Gjergo, Hortensia; Hasimi, Endri
2018-05-10
The review is focused on "digital health", which means advanced analytics based on multi-modal data. The "Health Care Internet of Things", which uses sensors, apps, and remote monitoring, could provide continuous clinical information in the cloud that enables clinicians to access the information they need to care for patients everywhere. Greater standardization of acquisition protocols will be needed to maximize the potential gains from automation and machine learning. Recent artificial intelligence applications in cardiac imaging will not be diagnosing patients and replacing doctors but will be augmenting their ability to find key relevant data they need to care for a patient and present it in a concise, easily digestible format. Risk stratification will transition from oversimplified population-based risk scores to machine learning-based metrics incorporating a large number of patient-specific clinical and imaging variables in real time, beyond the limits of human cognition. This will deliver highly accurate and individually personalized risk assessments and facilitate tailored management plans.
Pharmacology-based toxicity assessment: towards quantitative risk prediction in humans.
Sahota, Tarjinder; Danhof, Meindert; Della Pasqua, Oscar
2016-05-01
Despite ongoing efforts to better understand the mechanisms underlying safety and toxicity, ~30% of the attrition in drug discovery and development is still due to safety concerns. Changes in current practice regarding the assessment of safety and toxicity are required to reduce late stage attrition and enable effective development of novel medicines. This review focuses on the implications of empirical evidence generation for the evaluation of safety and toxicity during drug development. A shift in paradigm is needed to (i) ensure that pharmacological concepts are incorporated into the evaluation of safety and toxicity; (ii) facilitate the integration of historical evidence and thereby the translation of findings across species as well as between in vitro and in vivo experiments and (iii) promote the use of experimental protocols tailored to address specific safety and toxicity questions. Based on historical examples, we highlight the challenges for the early characterisation of the safety profile of a new molecule and discuss how model-based methodologies can be applied for the design and analysis of experimental protocols. Issues relative to the scientific rationale are categorised and presented as a hierarchical tree describing the decision-making process. Focus is given to four different areas, namely, optimisation, translation, analytical construct and decision criteria. From a methodological perspective, the relevance of quantitative methods for estimation and extrapolation of risk from toxicology and safety pharmacology experimental protocols, such as points of departure and potency, is discussed in light of advancements in population and Bayesian modelling techniques (e.g. non-linear mixed effects modelling). Their use in the evaluation of pharmacokinetics (PK) and pharmacokinetic-pharmacodynamic relationships (PKPD) has enabled great insight into the dose rationale for medicines in humans, both in terms of efficacy and adverse events. Comparable benefits can be anticipated for the assessment of safety and toxicity profile of novel molecules. © The Author 2016. Published by Oxford University Press on behalf of the UK Environmental Mutagen Society. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
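As a minimal illustration of the kind of nonlinear exposure-response model the review advocates for quantitative risk estimation, the sketch below fits a sigmoid Emax model to hypothetical toxicity-biomarker data with scipy; it is not any specific analysis from the cited examples, and all data and starting values are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid_emax(conc, e0, emax, ec50, hill):
    """Sigmoid Emax exposure-response model."""
    return e0 + emax * conc**hill / (ec50**hill + conc**hill)

# Hypothetical exposure (mg/L) vs. adverse-effect biomarker data; illustrative only.
conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])
resp = np.array([2.1, 2.4, 3.5, 6.0, 9.8, 11.7, 12.3])

params, _ = curve_fit(sigmoid_emax, conc, resp, p0=[2.0, 10.0, 5.0, 1.0], maxfev=10_000)
e0, emax, ec50, hill = params
print(f"E0={e0:.2f}, Emax={emax:.2f}, EC50={ec50:.2f} mg/L, Hill={hill:.2f}")
```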
Enhancing Time Synchronization Support in Wireless Sensor Networks
Tavares Bruscato, Leandro; Heimfarth, Tales; Pignaton de Freitas, Edison
2017-01-01
With the emerging Internet of Things (IoT) technology becoming reality, a number of applications are being proposed. Several of these applications are highly dependent on wireless sensor networks (WSN) to acquire data from the surrounding environment. In order to be really useful for most applications, the acquired data must be coherent in terms of the time in which they are acquired, which implies that the entire sensor network presents a certain level of time synchronization. Moreover, to efficiently exchange and forward data, many communication protocols used in WSN rely also on time synchronization among the sensor nodes. Observing the importance of complying with this need for time synchronization, this work focuses on the second synchronization problem, proposing, implementing and testing a time synchronization service for low-power WSN using low-frequency real-time clocks in each node. To implement this service, three algorithms based on different strategies are proposed: one based on an auto-correction approach, the second based on a prediction mechanism, while the third uses an analytical correction mechanism. Their goal is the same, i.e., to make the clocks of the sensor nodes converge as quickly as possible and then to keep them as similar as possible. This goal comes along with the requirement to keep low energy consumption. Differently from other works in the literature, the proposal here is independent of any specific protocol, i.e., it may be adapted to be used in different protocols. Moreover, it explores the minimum number of synchronization messages by means of a smart clock update strategy, allowing a trade-off between the desired level of synchronization and the associated energy consumption. Experimental results, which include data acquired from simulations and testbed deployments, provide evidence of the success in meeting this goal, as well as providing means to compare these three approaches considering the best synchronization results and their costs in terms of energy consumption. PMID:29261113
ERIC Educational Resources Information Center
Kristian, Kathleen E.; Friedbauer, Scott; Kabashi, Donika; Ferencz, Kristen M.; Barajas, Jennifer C.; O'Brien, Kelly
2015-01-01
Analysis of mercury in fish is an interesting problem with the potential to motivate students in chemistry laboratory courses. The recommended method for mercury analysis in fish is cold vapor atomic absorption spectroscopy (CVAAS), which requires homogeneous analyte solutions, typically prepared by acid digestion. Previously published digestion…
ERIC Educational Resources Information Center
Lavoie, Jean-Michel; Chornet, Esteban; Pelletier, Andre
2008-01-01
This experiment targets undergraduate students in an analytical or organic instructional context. Using a simple extraction, this protocol allows students to quantify and qualify monoterpenes in essential oils from citrus fruit peels. The procedures involve cooling down the peels by immersing them into icy water. After a few minutes, the chilled…
ERIC Educational Resources Information Center
Neebe, Diana Combs
2017-01-01
Learning by example is nothing new to the education landscape. Research into think-aloud protocols, though often used as a form of assessment rather than instruction, provided practical, content-specific literacy strategies for crafting the instructional intervention in this study. Additionally, research into worked examples--from the earliest…
This protocol describes the procedures for weighing, handling, and archiving aerosol filters and for managing the associated analytical and quality assurance data. Filter samples were weighed for aerosol mass at RTI laboratory, with only the automated field sampling data transfer...
Chromatin immunoprecipitation in microfluidic droplets: towards fast and cheap analyses.
Teste, Bruno; Champ, Jerome; Londono-Vallejo, Arturo; Descroix, Stéphanie; Malaquin, Laurent; Viovy, Jean-Louis; Draskovic, Irena; Mottet, Guillaume
2017-01-31
Genetic organization is governed by the interaction of DNA with histone proteins, and differential modifications of these proteins are a fundamental mechanism of gene regulation. Histone modifications are primarily studied through chromatin immunoprecipitation (ChIP) assays; however, conventional ChIP procedures are time-consuming, laborious and require a large number of cells. Here we report for the first time the development of ChIP in droplets based on a microfluidic platform combining nanoliter droplets, magnetic beads (MB) and magnetic tweezers (MT). The droplet approach enabled compartmentalization and improved mixing, while reducing the consumption of samples and reagents in an integrated workflow. Anti-histone antibodies grafted to MB were used as a solid support to capture and transfer the target chromatin from droplet to droplet in order to perform chromatin immunoprecipitation, washing, elution and purification of DNA. We designed a new ChIP protocol to investigate four different types of modified histones with known roles in gene activation or repression. We evaluated the performance of this new ChIP-in-droplets assay in comparison with conventional methods. The proposed technology dramatically reduces analytical time from a few days to 7 hours, simplifies the ChIP protocol and decreases the number of cells required by 100-fold while maintaining a high degree of sensitivity and specificity. Therefore, this droplet-based ChIP assay represents a new, highly advantageous and convenient approach to epigenetic analyses.
Distributed Wireless Power Transfer With Energy Feedback
NASA Astrophysics Data System (ADS)
Lee, Seunghyun; Zhang, Rui
2017-04-01
Energy beamforming (EB) is a key technique for achieving efficient radio-frequency (RF) transmission enabled wireless energy transfer (WET). By optimally designing the waveforms from multiple energy transmitters (ETs) over the wireless channels, they can be constructively combined at the energy receiver (ER) to achieve an EB gain that scales with the number of ETs. However, the optimal design of EB waveforms requires accurate channel state information (CSI) at the ETs, which is challenging to obtain practically, especially in a distributed system with ETs at separate locations. In this paper, we study practical and efficient channel training methods to achieve optimal EB in a distributed WET system. We propose two protocols with and without centralized coordination, respectively, where distributed ETs either sequentially or in parallel adapt their transmit phases based on a low-complexity energy feedback from the ER. The energy feedback only depends on the received power level at the ER, where each feedback indicates one particular transmit phase that results in the maximum harvested power over a set of previously used phases. Simulation results show that the two proposed training protocols converge very fast in practical WET systems even with a large number of distributed ETs, while the protocol with sequential ET phase adaptation is also analytically shown to converge to the optimal EB design with perfect CSI by increasing the training time. Numerical results are also provided to evaluate the performance of the proposed distributed EB and training designs as compared to other benchmark schemes.
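The energy-feedback training idea can be illustrated with a small Python simulation (a sketch under simplifying assumptions: unit-amplitude tones, a random complex channel unknown to the transmitters, and a fixed candidate phase set; it is not the protocol specification from the paper).

import numpy as np

# Distributed ETs sequentially sweep a small set of transmit phases; after each sweep the
# ER feeds back only the index of the phase that maximized its harvested power.
rng = np.random.default_rng(0)
num_ets, num_phases = 8, 4
candidate_phases = 2 * np.pi * np.arange(num_phases) / num_phases
channel = rng.standard_normal(num_ets) + 1j * rng.standard_normal(num_ets)  # unknown to the ETs
phases = np.zeros(num_ets)                                                  # current transmit phases

def received_power(ph):
    """Power at the ER when all ETs transmit unit-amplitude tones with the given phases."""
    return np.abs(np.sum(channel * np.exp(1j * ph))) ** 2

for et in range(num_ets):                      # sequential adaptation, one ET at a time
    powers = []
    for p in candidate_phases:                 # training: ET 'et' tries each candidate phase
        trial = phases.copy()
        trial[et] = p
        powers.append(received_power(trial))
    best = int(np.argmax(powers))              # single feedback value: index of the best phase
    phases[et] = candidate_phases[best]

print("power after training:", received_power(phases))
print("upper bound with perfect CSI:", np.sum(np.abs(channel)) ** 2)

With finer phase quantization and more sweeps, the trained power approaches the perfect-CSI bound, mirroring the convergence behaviour reported in the abstract.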
Optimization of oncological ¹⁸F-FDG PET/CT imaging based on a multiparameter analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Menezes, Vinicius O., E-mail: vinicius@radtec.com.br; Machado, Marcos A. D.; Queiroz, Cleiton C.
2016-02-15
Purpose: This paper describes a method to achieve consistent clinical image quality in ¹⁸F-FDG scans accounting for patient habitus, dose regimen, image acquisition, and processing techniques. Methods: Oncological PET/CT scan data for 58 subjects were evaluated retrospectively to derive analytical curves that predict image quality. Patient noise equivalent count rate and coefficient of variation (CV) were used as metrics in their analysis. Optimized acquisition protocols were identified and prospectively applied to 179 subjects. Results: The adoption of different schemes for three body mass ranges (<60 kg, 60–90 kg, >90 kg) allows improved image quality with both point spread function and ordered-subsets expectation maximization-3D reconstruction methods. The application of this methodology showed that CV improved significantly (p < 0.0001) in clinical practice. Conclusions: Consistent oncological PET/CT image quality on a high-performance scanner was achieved from an analysis of the relations existing between dose regimen, patient habitus, acquisition, and processing techniques. The proposed methodology may be used by PET/CT centers to develop protocols to standardize PET/CT imaging procedures and achieve better patient management and cost-effective operations.
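As a schematic of the body-mass-stratified acquisition scheme (the three mass ranges follow the abstract, but the per-bed acquisition times below are hypothetical placeholders, not the values derived in the study):

def acquisition_seconds_per_bed(body_mass_kg):
    """Pick an acquisition time per bed position from the patient's body-mass range.
    The three ranges follow the paper; the returned times are hypothetical placeholders."""
    if body_mass_kg < 60:
        return 90
    elif body_mass_kg <= 90:
        return 120
    else:
        return 180

# Usage: acquisition_seconds_per_bed(75) -> 120 (placeholder value)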
Polidori, G; Marreiro, A; Pron, H; Lestriez, P; Boyer, F C; Quinart, H; Tourbah, A; Taïar, R
2016-11-01
This article establishes the basics of a theoretical model for the constitutive law that describes the skin temperature and thermolysis heat losses undergone by a subject during a session of whole-body cryotherapy (WBC). This study focuses on the few minutes during which the human body is subjected to a thermal shock. The relationship between skin temperature and thermolysis heat losses during this period is still unknown and has not yet been studied in the context of the whole human body. The analytical approach here is based on the hypothesis that the skin thermal shock during a WBC session can be thermally modelled by the sum of both radiative and free convective heat transfer functions. The validation of this scientific approach and the derivation of temporal evolution thermal laws, both on skin temperature and dissipated thermal power during the thermal shock, open many avenues for large-scale studies with the aim of proposing individualized cryotherapy protocols as well as protocols intended for target populations. Furthermore, this study shows quantitatively the substantial imbalance between human metabolism and thermolysis during WBC, the explanation of which remains an open question. Copyright © 2016 Elsevier Ltd. All rights reserved.
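As a worked illustration of this hypothesis (our notation and symbols, a sketch rather than the fitted constitutive law of the paper), the instantaneous heat flux lost by the skin can be written as the sum of a radiative and a free-convective term:

q''(t) \;=\; \varepsilon \,\sigma \left( T_{\mathrm{skin}}^{4}(t) - T_{\mathrm{air}}^{4} \right) \;+\; h_{c} \left( T_{\mathrm{skin}}(t) - T_{\mathrm{air}} \right), \qquad P_{\mathrm{thermolysis}}(t) \;=\; q''(t)\, A_{\mathrm{skin}}

where \varepsilon is the skin emissivity, \sigma the Stefan-Boltzmann constant, h_{c} a free-convection coefficient, T_{\mathrm{air}} the cryotherapy cabin air temperature, and A_{\mathrm{skin}} the exposed body surface area.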
Unice, Kenneth M.; Kreider, Marisa L.; Panko, Julie M.
2012-01-01
Pyrolysis(pyr)-GC/MS analysis of characteristic thermal decomposition fragments has been previously used for qualitative fingerprinting of organic sources in environmental samples. A quantitative pyr-GC/MS method based on characteristic tire polymer pyrolysis products was developed for tread particle quantification in environmental matrices including soil, sediment, and air. The feasibility of quantitative pyr-GC/MS analysis of tread was confirmed in a method evaluation study using artificial soil spiked with known amounts of cryogenically generated tread. Tread concentration determined by blinded analyses was highly correlated (r2 ≥ 0.88) with the known tread spike concentration. Two critical refinements to the initial pyrolysis protocol were identified including use of an internal standard and quantification by the dimeric markers vinylcyclohexene and dipentene, which have good specificity for rubber polymer with no other appreciable environmental sources. A novel use of deuterated internal standards of similar polymeric structure was developed to correct the variable analyte recovery caused by sample size, matrix effects, and ion source variability. The resultant quantitative pyr-GC/MS protocol is reliable and transferable between laboratories. PMID:23202830
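For orientation, a generic internal-standard quantification step of the kind such a protocol relies on might look as follows (a sketch; the response-factor handling, units and example numbers are illustrative assumptions, not the published calibration):

def tread_concentration_mg_per_g(marker_area, istd_area, istd_amount_ug,
                                 response_factor, sample_mass_g):
    """Generic internal-standard quantification sketch.
    response_factor: (marker_area / istd_area) per (ug marker / ug internal standard),
    determined from calibration standards; values used here are not from the paper."""
    amount_ug = (marker_area / istd_area) / response_factor * istd_amount_ug
    return amount_ug / 1000.0 / sample_mass_g   # mg of marker-equivalent per g of sample

# Usage (illustrative numbers): tread_concentration_mg_per_g(1.2e5, 2.4e5, 10.0, 0.8, 0.5)

Because the deuterated internal standard experiences the same matrix and ion-source effects as the analyte, the area ratio largely cancels the variable recovery described in the abstract.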
Warren, Amy L; Donnon, Tyrone L; Wagg, Catherine R; Priest, Heather; Fernandez, Nicole J
2018-01-18
Visual diagnostic reasoning is the cognitive process by which pathologists reach a diagnosis based on visual stimuli (cytologic, histopathologic, or gross imagery). Currently, there is little to no literature examining visual reasoning in veterinary pathology. The objective of the study was to use eye tracking to establish baseline quantitative and qualitative differences between the visual reasoning processes of novice and expert veterinary pathologists viewing cytology specimens. Novice and expert participants were each shown 10 cytology images and asked to formulate a diagnosis while wearing eye-tracking equipment (10 slides) and while concurrently verbalizing their thought processes using the think-aloud protocol (5 slides). Compared to novices, experts demonstrated significantly higher diagnostic accuracy (p<.017), shorter time to diagnosis (p<.017), and a higher percentage of time spent viewing areas of diagnostic interest (p<.017). Experts elicited more key diagnostic features in the think-aloud protocol and had more efficient patterns of eye movement. These findings suggest that experts' fast time to diagnosis, efficient eye-movement patterns, and preference for viewing areas of interest supports system 1 (pattern-recognition) reasoning and script-inductive knowledge structures with system 2 (analytic) reasoning to verify their diagnosis.
Quality assurance and quality control of geochemical data—A primer for the research scientist
Geboy, Nicholas J.; Engle, Mark A.
2011-01-01
Geochemistry is a constantly expanding science. More and more, scientists are employing geochemical tools to help answer questions about the Earth and earth system processes. Scientists may assume that the responsibility of examining and assessing the quality of the geochemical data they generate is not theirs but rather that of the analytical laboratories to which their samples have been submitted. This assumption may be partially based on knowledge about internal and external quality assurance and quality control (QA/QC) programs in which analytical laboratories typically participate. Or there may be a perceived lack of time or resources to adequately examine data quality. Regardless of the reason, the lack of QA/QC protocols can lead to the generation and publication of erroneous data. Because the interpretations drawn from the data are primary products to U.S. Geological Survey (USGS) stakeholders, the consequences of publishing erroneous results can be significant. The principal investigator of a scientific study ultimately is responsible for the quality and interpretation of the project's findings, and thus must also play a role in the understanding, implementation, and presentation of QA/QC information about the data. Although occasionally ignored, QA/QC protocols apply not only to procedures in the laboratory but also in the initial planning of a research study and throughout the life of the project. Many of the tenets of developing a sound QA/QC program or protocols also parallel the core concepts of developing a good study: What is the main objective of the study? Will the methods selected provide data of enough resolution to answer the hypothesis? How should samples be collected? Are there known or unknown artifacts or contamination sources in the sampling and analysis methods? Assessing data quality requires communication between the scientists responsible for designing the study and those collecting samples, analyzing samples, treating data, and interpreting results. This primer has been developed to provide basic information and guidance about developing QA/QC protocols for geochemical studies. It is not intended to be a comprehensive guide but rather an introduction to key concepts tied to a list of relevant references for further reading. The guidelines are presented in stepwise order beginning with presampling considerations and continuing through final data interpretation. The goal of this primer is to outline basic QA/QC practices that scientists can use before, during, and after chemical analysis to ensure the validity of the data they collect with the goal of providing defendable results and conclusions.
Conflict monitoring in dual process theories of thinking.
De Neys, Wim; Glumicic, Tamara
2008-03-01
Popular dual process theories have characterized human thinking as an interplay between an intuitive-heuristic and demanding-analytic reasoning process. Although monitoring the output of the two systems for conflict is crucial to avoid decision making errors there are some widely different views on the efficiency of the process. Kahneman [Kahneman, D. (2002). Maps of bounded rationality: A perspective on intuitive judgement and choice. Nobel Prize Lecture. Retrieved January 11, 2006, from: http://nobelprize.org/nobel_prizes/economics/laureates/2002/kahnemann-lecture.pdf] and Evans [Evans, J. St. B. T. (1984). Heuristic and analytic processing in reasoning. British Journal of Psychology, 75, 451-468], for example, claim that the monitoring of the heuristic system is typically quite lax whereas others such as Sloman [Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119, 3-22] and Epstein [Epstein, S. (1994). Integration of the cognitive and psychodynamic unconscious. American Psychologists, 49, 709-724] claim it is flawless and people typically experience a struggle between what they "know" and "feel" in case of a conflict. The present study contrasted these views. Participants solved classic base rate neglect problems while thinking aloud. In these problems a stereotypical description cues a response that conflicts with the response based on the analytic base rate information. Verbal protocols showed no direct evidence for an explicitly experienced conflict. As Kahneman and Evans predicted, participants hardly ever mentioned the base rates and seemed to base their judgment exclusively on heuristic reasoning. However, more implicit measures of conflict detection such as participants' retrieval of the base rate information in an unannounced recall test, decision making latencies, and the tendency to review the base rates indicated that the base rates had been thoroughly processed. On control problems where base rates and description did not conflict this was not the case. Results suggest that whereas the popular characterization of conflict detection as an actively experienced struggle can be questioned there is nevertheless evidence for Sloman's and Epstein's basic claim about the flawless operation of the monitoring. Whenever the base rates and description disagree people will detect this conflict and consequently redirect attention towards a deeper processing of the base rates. Implications for the dual process framework and the rationality debate are discussed.
Filter Membrane Effects on Water-Extractable Phosphorus Concentrations from Soil.
Norby, Jessica; Strawn, Daniel; Brooks, Erin
2018-03-01
To accurately assess P concentrations in soil extracts, standard laboratory practices for monitoring P concentrations are needed. Water-extractable P is a common analytical test to determine P availability for leaching from soils, and it is used to determine best management practices. Most P analytical tests require filtration through a filter membrane with 0.45-μm pore size to distinguish between particulate and dissolved P species. However, filter membrane type is rarely specified in method protocols, and many different types of membranes are available. In this study, three common filter membrane materials (polyether sulfone, nylon, and nitrocellulose), all with 0.45-μm pore sizes, were tested for analytical differences in total P concentrations and dissolved reactive P (DRP) concentrations in water extracts from six soils sampled from two regions. Three of the extracts from the six soil samples had different total P concentrations for all three membrane types. The other three soil extracts had significantly different total P results from at least one filter membrane type. Total P concentration differences were as great as 35%. The DRP concentrations in the extracts were dependent on filter type in five of the six soil types. Results from this research show that filter membrane type is an important parameter that affects concentrations of total P and DRP from soil extracts. Thus, membrane type should be specified in soil extraction protocols. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
Rethinking the NTCIP Design and Protocols - Analyzing the Issues
DOT National Transportation Integrated Search
1998-03-03
This working paper discusses the issues involved in changing the current draft NTCIP standard from an X.25-based protocol stack to an Internet-based protocol stack. It contains a methodology which could be used to change NTCIP's base protocols. This ...
NASA Astrophysics Data System (ADS)
Vogel, Matthias; Thomas, Andreas; Schänzer, Wilhelm; Thevis, Mario
2015-09-01
The development of a new class of erythropoietin mimetic agents (EMA) for treating anemic conditions has been initiated with the discovery of oligopeptides capable of dimerizing the erythropoietin (EPO) receptor and thus stimulating erythropoiesis. The most promising amino acid sequences have been mounted on various different polymeric structures or carrier molecules to obtain highly active EPO-like drugs exhibiting beneficial and desirable pharmacokinetic profiles. Concomitant with creating new therapeutic options, erythropoietin mimetic peptide (EMP)-based drug candidates represent means to artificially enhance endurance performance and necessitate coverage by sports drug testing methods. Therefore, the aim of the present study was to develop a strategy for the comprehensive detection of EMPs in doping controls, which can be used complementary to existing protocols. Three model EMPs were used to provide proof-of-concept data. Following EPO receptor-facilitated purification of target analytes from human urine, the common presence of the cysteine-flanked core structure of EMPs was exploited to generate diagnostic peptides with the aid of a nonenzymatic cleavage procedure. Sensitive detection was accomplished by targeted-SIM/data-dependent MS2 analysis. Method characterization was conducted for the EMP-based drug peginesatide concerning specificity, linearity, precision, recovery, stability, ion suppression/enhancement, and limit of detection (LOD, 0.25 ng/mL). Additionally, first data for the identification of the erythropoietin mimetic peptides EMP1 and BB68 were generated, demonstrating the multi-analyte testing capability of the presented approach.
Sriram, Vinay K; Montgomery, Doug
2017-07-01
The Internet is subject to attacks due to vulnerabilities in its routing protocols. One proposed approach to attain greater security is to cryptographically protect network reachability announcements exchanged between Border Gateway Protocol (BGP) routers. This study proposes and evaluates the performance and efficiency of various optimization algorithms for validation of digitally signed BGP updates. In particular, this investigation focuses on the BGPSEC (BGP with SECurity extensions) protocol, currently under consideration for standardization in the Internet Engineering Task Force. We analyze three basic BGPSEC update processing algorithms: Unoptimized, Cache Common Segments (CCS) optimization, and Best Path Only (BPO) optimization. We further propose and study cache management schemes to be used in conjunction with the CCS and BPO algorithms. The performance metrics used in the analyses are: (1) routing table convergence time after BGPSEC peering reset or router reboot events and (2) peak-second signature verification workload. Both analytical modeling and detailed trace-driven simulation were performed. Results show that the BPO algorithm is 330% to 628% faster than the unoptimized algorithm for routing table convergence in a typical Internet core-facing provider edge router.
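The Cache Common Segments idea can be sketched in a few lines (illustrative logic only; verify_signature is a hypothetical placeholder for the actual BGPSEC cryptographic check, and no wire-format details are implied):

import hashlib

signature_cache = {}   # maps a segment fingerprint -> previously computed verification result

def segment_key(signed_segment: bytes) -> str:
    return hashlib.sha256(signed_segment).hexdigest()

def verify_update(signed_segments, verify_signature):
    """Verify a BGPSEC-style update, reusing results for path segments already seen in
    earlier updates (the Cache Common Segments idea). verify_signature(segment) -> bool
    is a hypothetical callback standing in for the real cryptographic verification."""
    for seg in signed_segments:
        key = segment_key(seg)
        if key not in signature_cache:          # only previously unseen segments cost a verification
            signature_cache[key] = verify_signature(seg)
        if not signature_cache[key]:
            return False                        # any invalid segment invalidates the whole update
    return True

During convergence bursts many updates share leading path segments, which is why caching their verification results reduces the peak-second signature workload studied in the paper.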
Khani, Rouhollah; Ghasemi, Jahan B; Shemirani, Farzaneh
2014-10-01
This research reports the first application of β-cyclodextrin (β-CD) complexes as a new method for generation of three way data, combined with second-order calibration methods for quantification of a binary mixture of caffeic (CA) and vanillic (VA) acids, as model compounds in fruit juices samples. At first, the basic experimental parameters affecting the formation of inclusion complexes between target analytes and β-CD were investigated and optimized. Then under the optimum conditions, parallel factor analysis (PARAFAC) and bilinear least squares/residual bilinearization (BLLS/RBL) were applied for deconvolution of trilinear data to get spectral and concentration profiles of CA and VA as a function of β-CD concentrations. Due to severe concentration profile overlapping between CA and VA in β-CD concentration dimension, PARAFAC could not be successfully applied to the studied samples. So, BLLS/RBL performed better than PARAFAC. The resolution of the model compounds was possible due to differences in the spectral absorbance changes of the β-CD complexes signals of the investigated analytes, opening a new approach for second-order data generation. The proposed method was validated by comparison with a reference method based on high-performance liquid chromatography photodiode array detection (HPLC-PDA), and no significant differences were found between the reference values and the ones obtained with the proposed method. Such a chemometrics-based protocol may be a very promising tool for more analytical applications in real samples monitoring, due to its advantages of simplicity, rapidity, accuracy, sufficient spectral resolution and concentration prediction even in the presence of unknown interferents. Copyright © 2014 Elsevier B.V. All rights reserved.
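For reference, the trilinear model underlying PARAFAC-type decompositions of such three-way arrays can be written in the standard form (our notation, with i indexing samples, j wavelengths, k the β-CD concentration levels, F the number of factors, and e the residuals):

x_{ijk} \;=\; \sum_{f=1}^{F} a_{if}\, b_{jf}\, c_{kf} \;+\; e_{ijk}

Severe overlap between the analytes' profiles in the β-CD concentration mode (the c_{kf} vectors) is what degrades this trilinear fit and motivates the BLLS/RBL alternative reported above.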
Ripoll Trujillo, Noelia; Martínez Sánchez, Lidia; Habimana Jordana, Anna; Trenchs Sainz de La Maza, Victoria; Vila Miravet, Víctor; Luaces Cubells, Carles
2018-04-14
The ingestion of a caustic agent is the most common cause of admission after being in contact with a domestic product. A group of patients could be considered low risk and not require aggressive procedures such as corticosteroid administration and endoscopy, especially in the paediatric population. To evaluate the safety and benefit of a less aggressive protocol for patients defined as low risk. An analytical-observational study was conducted on patients who consulted for caustic ingestion between January 2011 and December 2015. Two periods were differentiated according to the current protocol. Period-1: usual protocol (which included admission and parenteral corticosteroid and antibiotic administration) and Period-2: less aggressive protocol for the low-risk patients (oral intake test after 6 hours and discharge if they remained asymptomatic). Low-risk patients were considered those who met the following criteria: unintentional intake, absence of symptoms, and absence of oral lesions. In the rest of the patients the usual protocol was performed. Re-admission with a diagnosis of digestive lesions was considered a complication. Forty-eight patients were included in period 1, and 35 in period 2. In period 2, thirteen patients met low-risk criteria. The adherence to the less aggressive protocol was 100%. None of the low-risk patients required admission or endoscopy after discharge. In period 1 the adherence to the usual protocol was 60.4%. Six patients would have benefited from the application of the less aggressive protocol. Adopting a more conservative attitude in low-risk patients is safe. These patients benefit from clinical observation, without performing more aggressive measures with their possible iatrogenic adverse effects. Copyright © 2018. Published by Elsevier España, S.L.U.
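The period-2 pathway can be summarized as simple decision logic (an illustrative sketch of the criteria as stated in the abstract, not a validated clinical tool):

def caustic_ingestion_pathway(unintentional, symptomatic, oral_lesions,
                              asymptomatic_after_6h_oral_intake=None):
    """Sketch of the period-2 triage logic: low-risk patients (unintentional intake,
    no symptoms, no oral lesions) get an oral intake test after 6 hours and are
    discharged if they remain asymptomatic; all other patients follow the usual protocol."""
    low_risk = unintentional and not symptomatic and not oral_lesions
    if not low_risk:
        return "usual protocol: admission, parenteral corticosteroids and antibiotics"
    if asymptomatic_after_6h_oral_intake:
        return "discharge home"
    return "observe: oral intake test after 6 hours"

# Usage: caustic_ingestion_pathway(True, False, False, asymptomatic_after_6h_oral_intake=True)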
Pazo, Daniel Y.; Moliere, Fallon; Sampson, Maureen M.; Reese, Christopher M.; Agnew-Heard, Kimberly A.; Walters, Matthew J.; Holman, Matthew R.; Blount, Benjamin C.; Watson, Clifford; Chambers, David M.
2017-01-01
Introduction A significant portion of the increased risk of cancer and respiratory disease from exposure to cigarette smoke is attributed to volatile organic compounds (VOCs). In this study, 21 VOCs were quantified in mainstream cigarette smoke from 50 U.S. domestic brand varieties that included high market share brands and two Kentucky research cigarettes (3R4F and 1R5F). Methods Mainstream smoke was generated under ISO 3308 and Canadian Intense (CI) smoking protocols with linear smoking machines with a gas sampling bag collection followed by SPME/GC/MS analysis. Results For both protocols, mainstream smoke VOC amounts among the different brand varieties were strongly correlated between the majority of the analytes. Overall, Pearson correlation (r) ranged from 0.68 to 0.99 for ISO and 0.36 to 0.95 for CI. However, monoaromatic compounds were found to increase disproportionately compared to unsaturated, nitro, and carbonyl compounds under the CI smoking protocol where filter ventilation is blocked. Conclusions Overall, machine generated “vapor phase” amounts (μg/cigarette) are primarily attributed to smoking protocol (e.g., blocking of vent holes, puff volume, and puff duration) and filter ventilation. A possible cause for the disproportionate increase in monoaromatic compounds could be increased pyrolysis under low oxygen conditions associated with the CI protocol. PMID:27113015
Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention
Al-Hajj, Samar; Fisher, Brian; Smith, Jennifer; Pike, Ian
2017-09-12
Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology, group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders’ observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of ‘common ground’ among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders’ verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve ‘common ground’ among diverse stakeholders about health data and their implications. PMID:28895928
Immunoelectron microscopy in embryos.
Sierralta, W D
2001-05-01
Immunogold labeling of proteins in sections of embryos embedded in acrylate media provides an important analytical tool when the resolving power of the electron microscope is required to define sites of protein function. The protocol presented here was established to analyze the role and dynamics of the activated protein kinase C/Rack1 regulatory system in the patterning and outgrowth of limb bud mesenchyme. With minor changes, especially in the composition of the fixative solution, the protocol should be easily adaptable for the postembedding immunogold labeling of any other antigen in tissues of embryos of diverse species. Quantification of the labeling can be achieved by using electron microscope systems capable of supporting digital image analysis. Copyright 2001 Academic Press.
Coherent population trapping with a controlled dissipation: applications in optical metrology
NASA Astrophysics Data System (ADS)
Nicolas, L.; Delord, T.; Jamonneau, P.; Coto, R.; Maze, J.; Jacques, V.; Hétet, G.
2018-03-01
We analyze the properties of a pulsed coherent population trapping protocol that uses a controlled decay from the excited state in a Λ-level scheme. We study this problem analytically and numerically and find regimes where narrow transmission, absorption, or fluorescence spectral lines occur. We then look for optimal frequency measurements using these spectral features by computing the Allan deviation in the presence of ground state decoherence and show that the protocol is on a par with Ramsey-CPT. We discuss possible implementations with ensembles of alkali atoms and single ions and demonstrate that typical pulsed-CPT experiments that are realized on femto-second timescales can be implemented on micro-seconds timescales using this scheme.
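For context, the Allan variance used for such frequency-stability comparisons has the standard definition (textbook form, not specific to the pulsed-CPT protocol studied here):

\sigma_{y}^{2}(\tau) \;=\; \tfrac{1}{2} \left\langle \left( \bar{y}_{n+1} - \bar{y}_{n} \right)^{2} \right\rangle

where \bar{y}_{n} denotes the n-th consecutive fractional-frequency average over an interval \tau; the Allan deviation quoted in such comparisons is \sigma_{y}(\tau).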
SPP: A data base processor data communications protocol
NASA Technical Reports Server (NTRS)
Fishwick, P. A.
1983-01-01
The design and implementation of a data communications protocol for the Intel Data Base Processor (DBP) is defined. The protocol is termed SPP (Service Port Protocol) since it enables data transfer between the host computer and the DBP service port. The protocol implementation is extensible in that it is explicitly layered and the protocol functionality is hierarchically organized. Extensive trace and performance capabilities have been supplied with the protocol software to permit optional efficient monitoring of the data transfer between the host and the Intel data base processor. Machine independence was considered to be an important attribute during the design and implementation of SPP. The protocol source is fully commented and is included in Appendix A of this report.
The NIST Quantitative Infrared Database
Chu, P. M.; Guenther, F. R.; Rhoderick, G. C.; Lafferty, W. J.
1999-01-01
With the recent developments in Fourier transform infrared (FTIR) spectrometers, it is becoming more feasible to place these instruments in field environments. As a result, there has been an enormous increase in the use of FTIR techniques for a variety of qualitative and quantitative chemical measurements. These methods offer the possibility of fully automated real-time quantitation of many analytes; therefore, FTIR has great potential as an analytical tool. Recently, the U.S. Environmental Protection Agency (U.S.EPA) has developed protocol methods for emissions monitoring using both extractive and open-path FTIR measurements. Depending upon the analyte, the experimental conditions and the analyte matrix, approximately 100 of the hazardous air pollutants (HAPs) listed in the 1990 U.S.EPA Clean Air Act amendment (CAAA) can be measured. The National Institute of Standards and Technology (NIST) has initiated a program to provide quality-assured infrared absorption coefficient data based on NIST-prepared primary gas standards. Currently, absorption coefficient data have been acquired for approximately 20 of the HAPs. For each compound, the absorption coefficient spectrum was calculated using nine transmittance spectra at 0.12 cm−1 resolution and the Beer’s law relationship. The uncertainties in the absorption coefficient data were estimated from the linear regressions of the transmittance data and considerations of other error sources such as the nonlinear detector response. For absorption coefficient values greater than 1 × 10−4 (μmol/mol)−1 m−1, the average relative expanded uncertainty is 2.2 %. This quantitative infrared database is currently an ongoing project at NIST. Additional spectra will be added to the database as they are acquired. Our current plans include continued data acquisition of the compounds listed in the CAAA, as well as the compounds that contribute to global warming and ozone depletion.
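A sketch of how an absorption coefficient spectrum can be derived from a set of transmittance spectra via the Beer's law relationship (illustrative only; the NIST procedure additionally propagates uncertainties and accounts for effects such as nonlinear detector response):

import numpy as np

def absorption_coefficients(transmittance_spectra, concentrations_umol_per_mol, path_m):
    """Estimate the absorption coefficient at each wavenumber by regressing absorbance
    A = -log10(T) against concentration * pathlength (Beer's law, A = alpha * c * l).
    transmittance_spectra: array of shape (n_spectra, n_wavenumbers).
    Returns alpha in (umol/mol)^-1 m^-1, the slope of a zero-intercept least-squares fit."""
    A = -np.log10(np.asarray(transmittance_spectra))           # absorbance spectra
    cl = np.asarray(concentrations_umol_per_mol) * path_m      # shape (n_spectra,)
    return (cl @ A) / np.dot(cl, cl)                           # slope through the origin per wavenumber

# Usage: pass the nine measured transmittance spectra, the standards' concentrations, and the cell path length.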
NASA Astrophysics Data System (ADS)
Barnsley, Lester C.; Carugo, Dario; Aron, Miles; Stride, Eleanor
2017-03-01
The aim of this study was to characterize the behaviour of superparamagnetic particles in magnetic drug targeting (MDT) schemes. A 3-dimensional mathematical model was developed, based on the analytical derivation of the trajectory of a magnetized particle suspended inside a fluid channel carrying laminar flow and in the vicinity of an external source of magnetic force. Semi-analytical expressions to quantify the proportion of captured particles, and their relative accumulation (concentration) as a function of distance along the wall of the channel were also derived. These were expressed in terms of a non-dimensional ratio of the relevant physical and physiological parameters corresponding to a given MDT protocol. The ability of the analytical model to assess magnetic targeting schemes was tested against numerical simulations of particle trajectories. The semi-analytical expressions were found to provide good first-order approximations for the performance of MDT systems in which the magnetic force is relatively constant over a large spatial range. The numerical model was then used to test the suitability of a range of different designs of permanent magnet assemblies for MDT. The results indicated that magnetic arrays that emit a strong magnetic force that varies rapidly over a confined spatial range are the most suitable for concentrating magnetic particles in a localized region. By comparison, commonly used magnet geometries such as button magnets and linear Halbach arrays result in distributions of accumulated particles that are less efficient for delivery. The trajectories predicted by the numerical model were verified experimentally by acoustically focusing magnetic microbeads flowing in a glass capillary channel, and optically tracking their path past a high field gradient Halbach array.
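A minimal numerical sketch of a particle-trajectory calculation of this kind (illustrative assumptions throughout: a 2-D plane-Poiseuille profile, Stokes drag, and a user-supplied force field standing in for the magnet assembly; it is not the model derived in the paper):

import numpy as np

def trajectory(y0_m, channel_h_m, length_m, u_max_m_s, radius_m, eta_pa_s,
               magnetic_force_N, steps=10000):
    """Integrate a superparamagnetic particle's path in 2-D plane-Poiseuille flow.
    The particle is advected by the flow in x and drifts toward the wall (y = 0) at the
    magnetophoretic velocity from the Stokes-drag balance v = F_m / (6 * pi * eta * a).
    magnetic_force_N(x, y) -> downward force magnitude; a hypothetical user-supplied field."""
    x, y = 0.0, y0_m
    dt = length_m / u_max_m_s / steps
    drag = 6.0 * np.pi * eta_pa_s * radius_m
    while x < length_m and y > 0.0:
        u = u_max_m_s * 4.0 * (y / channel_h_m) * (1.0 - y / channel_h_m)   # laminar profile
        x += u * dt
        y -= magnetic_force_N(x, y) / drag * dt                             # magnetic drift toward the wall
    return x, y   # y <= 0 means the particle is captured at the wall before exiting the channel

# Usage with a constant 1 pN force over a 200 um channel:
# trajectory(50e-6, 200e-6, 0.02, 0.01, 1e-6, 1e-3, lambda x, y: 1e-12)

Counting how many starting heights y0 lead to capture before the channel exit gives the captured fraction discussed in the abstract.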
Analytical methods for the determination of personal care products in human samples: an overview.
Jiménez-Díaz, I; Zafra-Gómez, A; Ballesteros, O; Navalón, A
2014-11-01
Personal care products (PCPs) are organic chemicals widely used in everyday human life. Nowadays, preservatives, UV-filters, antimicrobials and musk fragrances are widely used PCPs. Different studies have shown that some of these compounds can cause adverse health effects, such as genotoxicity, which could even lead to mutagenic or carcinogenic effects, or estrogenicity because of their endocrine disruption activity. Due to the absence of official monitoring protocols, there is an increasing demand for analytical methods that allow the determination of those compounds in human samples in order to obtain more information regarding their behavior and fate in the human body. The complexity of the biological matrices and the low concentration levels of these compounds make necessary the use of advanced sample treatment procedures that afford both sample clean-up, to remove potentially interfering matrix components, and the concentration of the analytes. In the present work, a review of the more recent analytical methods published in the scientific literature for the determination of PCPs in human fluids and tissue samples is presented. The work focused on sample preparation and the analytical techniques employed. Copyright © 2014 Elsevier B.V. All rights reserved.
Turner, D; Levine, A; Weiss, B; Hirsh, A; Shamir, R; Shaoul, R; Berkowitz, D; Bujanover, Y; Cohen, S; Eshach-Adiv, O; Jamal, Gera; Kori, M; Lerner, A; On, A; Rachman, L; Rosenbach, Y; Shamaly, H; Shteyer, E; Silbermintz, A; Yerushalmi, B
2010-12-01
There are no current recommendations for bowel cleansing before colonoscopy in children. The Israeli Society of Pediatric Gastroenterology and Nutrition (ISPGAN) established an iterative working group to formulate evidence-based guidelines for bowel cleansing in children prior to colonoscopy. Data were collected by systematic review of the literature and via a national-based survey of all endoscopy units in Israel. Based on the strength of evidence, the Committee reached consensus on six recommended protocols in children. Guidelines were finalized after an open audit of ISPGAN members. Data on 900 colonoscopies per year were accrued, which represents all annual pediatric colonoscopies performed in Israel. Based on the literature review, the national survey, and the open audit, several age-stratified pediatric cleansing protocols were proposed: two PEG-ELS protocols (polyethylene-glycol with electrolyte solution); Picolax-based protocol (sodium picosulphate with magnesium citrate); sodium phosphate protocol (only in children over the age of 12 years who are at low risk for renal damage); stimulant laxative-based protocol (e. g. bisacodyl); and a PEG 3350-based protocol. A population-based analysis estimated that the acute toxicity rate of oral sodium phosphate is at most 3/7320 colonoscopies (0.041 %). Recommendations on diet and enema use are provided in relation to each proposed protocol. There is no ideal bowel cleansing regimen and, thus, various protocols are in use. We propose several evidence-based protocols to optimize bowel cleansing in children prior to colonoscopy and minimize adverse events. © Georg Thieme Verlag KG Stuttgart · New York.
DIGE Analysis of Human Tissues.
Gelfi, Cecilia; Capitanio, Daniele
2018-01-01
Two-dimensional difference gel electrophoresis (2-D DIGE) is an advanced and elegant gel electrophoretic analytical tool for comparative protein assessment. It is based on two-dimensional gel electrophoresis (2-DE) separation of fluorescently labeled protein extracts. The tagging procedures are designed to not interfere with the chemical properties of proteins with respect to their pI and electrophoretic mobility, once a proper labeling protocol is followed. The two-dye or three-dye systems can be adopted and their choice depends on specific applications. Furthermore, the use of an internal pooled standard makes 2-D DIGE a highly accurate quantitative method enabling multiple protein samples to be separated on the same two-dimensional gel. The image matching and cross-gel statistical analysis generates robust quantitative results making data validation by independent technologies successful.
Hüffer, Thorsten; Praetorius, Antonia; Wagner, Stephan; von der Kammer, Frank; Hofmann, Thilo
2017-03-07
Microplastics (MPs) have been identified as contaminants of emerging concern in aquatic environments and research into their behavior and fate has been sharply increasing in recent years. Nevertheless, significant gaps remain in our understanding of several crucial aspects of MP exposure and risk assessment, including the quantification of emissions, dominant fate processes, types of analytical tools required for characterization and monitoring, and adequate laboratory protocols for analysis and hazard testing. This Feature aims at identifying transferrable knowledge and experience from engineered nanoparticle (ENP) exposure assessment. This is achieved by comparing ENP and MPs based on their similarities as particulate contaminants, whereas critically discussing specific differences. We also highlight the most pressing research priorities to support an efficient development of tools and methods for MPs environmental risk assessment.
Efficiency and security problems of anonymous key agreement protocol based on chaotic maps
NASA Astrophysics Data System (ADS)
Yoon, Eun-Jun
2012-07-01
In 2011, Niu-Wang proposed an anonymous key agreement protocol based on chaotic maps in [Niu Y, Wang X. An anonymous key agreement protocol based on chaotic maps. Commun Nonlinear Sci Simulat 2011;16(4):1986-92]. Niu-Wang's protocol not only achieves session key agreement between a server and a user, but also allows the user to anonymously interact with the server. Nevertheless, this paper points out that Niu-Wang's protocol has the following efficiency and security problems: (1) The protocol has computational efficiency problem when a trusted third party decrypts the user sending message. (2) The protocol is vulnerable to Denial of Service (DoS) attack based on illegal message modification by an attacker.
Automatically measuring brain ventricular volume within PACS using artificial intelligence.
Yepes-Calderon, Fernando; Nelson, Marvin D; McComb, J Gordon
2018-01-01
The picture archiving and communications system (PACS) is currently the standard platform to manage medical images but lacks analytical capabilities. Staying within PACS, the authors have developed an automatic method to retrieve the medical data and access it at a voxel level, decrypted and uncompressed, which allows analytical capabilities while not perturbing the system's daily operation. Additionally, the strategy is secure and vendor independent. Cerebral ventricular volume is important for the diagnosis and treatment of many neurological disorders. A significant change in ventricular volume is readily recognized, but subtle changes, especially over longer periods of time, may be difficult to discern. Clinical imaging protocols and parameters are often varied, making it difficult to use a general solution with standard segmentation techniques. Presented is a segmentation strategy based on an algorithm that uses four features extracted from the medical images to create a statistical estimator capable of determining ventricular volume. When compared with manual segmentations, the correlation was 94% and holds promise for even better accuracy by incorporating the unlimited data available. The volume of any segmentable structure can be accurately determined utilizing the machine learning strategy presented, which runs fully automatically within the PACS.
CE microchips: an opened gate to food analysis.
Escarpa, Alberto; González, María Cristina; Crevillén, Agustín González; Blasco, Antonio Javier
2007-03-01
CE microchips are the first generation of micro total analysis systems (µ-TAS) emerging in the miniaturization scene of food analysis. CE microchips for food analysis are fabricated in both glass and polymer materials, such as PDMS and poly(methyl methacrylate) (PMMA), and use simple layouts of simple and double T crosses. Nowadays, the preferred detection route is electrochemical, in both amperometry and conductivity modes, using end-channel and contactless configurations, respectively. Food applications using CE microchips are now emerging since food samples present complex matrices, selectivity being a very important challenge because the total integration of analytical steps into microchip format is very difficult. As a consequence, the first contributions that have recently appeared in the relevant literature are based primarily on fast separations of analytes of high food significance. These protocols are combined with different strategies to achieve selectivity using a suitable nonextensive sample preparation and/or strategically choosing detection routes. Polyphenolic compounds, amino acids, preservatives, and organic and inorganic ions have been studied using CE microchips. Thus, new and exciting future expectations arise in the domain of food analysis. However, several drawbacks could easily be found and assumed within the miniaturization map.
Holthoff, Ellen L.; Stratis-Cullum, Dimitra N.; Hankus, Mikella E.
2011-01-01
We report on a new sensor strategy that integrates molecularly imprinted polymers (MIPs) with surface enhanced Raman scattering (SERS). The sensor was developed to detect the explosive, 2,4,6-trinitrotoluene (TNT). Micron thick films of sol gel-derived xerogels were deposited on a SERS-active surface as the sensing layer. Xerogels were molecularly imprinted for TNT using non-covalent interactions with the polymer matrix. Binding of the TNT within the polymer matrix results in unique SERS bands, which allow for detection and identification of the molecule in the MIP. This MIP-SERS sensor exhibits an apparent dissociation constant of (2.3 ± 0.3) × 10−5 M for TNT and a 3 μM detection limit. The response to TNT is reversible and the sensor is stable for at least 6 months. Key challenges, including developing a MIP formulation that is stable and integrated with the SERS substrate, and ensuring the MIP does not mask the spectral features of the target analyte through SERS polymer background, were successfully met. The results also suggest the MIP-SERS protocol can be extended to other target analytes of interest. PMID:22163761
Homogeneous Immunoassays: Historical Perspective and Future Promise
NASA Astrophysics Data System (ADS)
Ullman, Edwin F.
1999-06-01
The founding and growth of Syva Company is examined in the context of its leadership role in the development of homogeneous immunoassays. The simple mix and read protocols of these methods offer advantages in routine analytical and clinical applications. Early homogeneous methods were based on insensitive detection of immunoprecipitation during antigen/antibody binding. The advent of reporter groups in biology provided a means of quantitating immunochemical binding by labeling antibody or antigen and physically separating label incorporated into immune complexes from free label. Although high sensitivity was achieved, quantitative separations were experimentally demanding. Only when it became apparent that reporter groups could provide information, not only about the location of a molecule but also about its microscopic environment, was it possible to design practical non-separation methods. The evolution of early homogenous immunoassays was driven largely by the development of improved detection strategies. The first commercial spin immunoassays, developed by Syva for drug abuse testing during the Vietnam war, were followed by increasingly powerful methods such as immunochemical modulation of enzyme activity, fluorescence, and photo-induced chemiluminescence. Homogeneous methods that quantify analytes at femtomolar concentrations within a few minutes now offer important new opportunities in clinical diagnostics, nucleic acid detection and drug discovery.
Sastrawan, J; Jones, C; Akhalwaya, I; Uys, H; Biercuk, M J
2016-08-01
We introduce concepts from optimal estimation to the stabilization of precision frequency standards limited by noisy local oscillators. We develop a theoretical framework casting various measures for frequency standard variance in terms of frequency-domain transfer functions, capturing the effects of feedback stabilization via a time series of Ramsey measurements. Using this framework, we introduce an optimized hybrid predictive feedforward measurement protocol that employs results from multiple past measurements and transfer-function-based calculations of measurement covariance to improve the accuracy of corrections within the feedback loop. In the presence of common non-Markovian noise processes these measurements will be correlated in a calculable manner, providing a means to capture the stochastic evolution of the local oscillator frequency during the measurement cycle. We present analytic calculations and numerical simulations of oscillator performance under competing feedback schemes and demonstrate benefits in both correction accuracy and long-term oscillator stability using hybrid feedforward. Simulations verify that in the presence of uncompensated dead time and noise with significant spectral weight near the inverse cycle time predictive feedforward outperforms traditional feedback, providing a path towards developing a class of stabilization software routines for frequency standards limited by noisy local oscillators.
NASA Astrophysics Data System (ADS)
Jehlička, J.; Edwards, H. G. M.; Vítek, P.
2009-05-01
Several characteristic geological features found on the surface of Mars by planetary rovers suggest that a possible extinct biosphere could exist based on similar sources of energy as occurred on Earth. For this reason, analytical instrumental protocols for the detection of biomarkers in suitable geological matrices unequivocally have to be elaborated for future unmanned explorations including the forthcoming ESA ExoMars mission. As part of the Pasteur suite of analytical instrumentation on ExoMars, the Raman/LIBS instrument will seek elemental and molecular information about geological, biological and biogeological markers in the Martian record. A key series of experiments on terrestrial Mars analogues, of which this paper addresses a particularly important series of compounds, is required to obtain the Raman spectra of key molecules and crystals, which are characteristic for each biomarker. Here, we present Raman spectra of several examples of organic compounds which have been recorded non-destructively - higher n-alkanes, polycyclic aromatic hydrocarbons, carotenoids, salts of organic acids, pure crystalline terpenes as well as oxygen-containing organic compounds. In addition, the lower limit of β-carotene detection in sulphate matrices using Raman microspectroscopy was estimated.
Comment on "flexible protocol for quantum private query based on B92 protocol"
NASA Astrophysics Data System (ADS)
Chang, Yan; Zhang, Shi-Bin; Zhu, Jing-Min
2017-03-01
In a recent paper (Quantum Inf Process 13:805-813, 2014), a flexible quantum private query (QPQ) protocol based on B92 protocol is presented. Here we point out that the B92-based QPQ protocol is insecure in terms of database security when the channel has loss; that is, the user (Alice) will learn more records in Bob's database than she has bought.
Current Protocols in Pharmacology
2016-01-01
Determination of drug or drug metabolite concentrations in biological samples, particularly in serum or plasma, is fundamental to describing the relationships between administered dose, route of administration, and time after dose to the drug concentrations achieved and to the observed effects of the drug. A well-characterized, accurate analytical method is needed, but it must also be established that the analyte concentration in the sample at the time of analysis is the same as the concentration at sample acquisition. Drugs and metabolites may be susceptible to degradation in samples due to metabolism or to physical and chemical processes, resulting in a lower measured concentration than was in the original sample. Careful examination of analyte stability during processing and storage and adjustment of procedures and conditions to maximize that stability are a critical part of method validation for the analysis, and can ensure the accuracy of the measured concentrations. PMID:27960029
DOE Office of Scientific and Technical Information (OSTI.GOV)
Part, Florian; Zecha, Gudrun; Causon, Tim
Highlights: • First review on detection of nanomaterials in complex waste samples. • Focus on nanoparticles in solid, liquid and gaseous waste samples. • Summary of current applicable methods for nanowaste detection and characterisation. • Limitations and challenges of characterisation of nanoparticles in waste. - Abstract: Engineered nanomaterials (ENMs) are already extensively used in diverse consumer products. Along the life cycle of a nano-enabled product, ENMs can be released and subsequently accumulate in the environment. Material flow models also indicate that a variety of ENMs may accumulate in waste streams. Therefore, a new type of waste, so-called nanowaste, is generated when end-of-life ENMs and nano-enabled products are disposed of. In terms of the precautionary principle, environmental monitoring of end-of-life ENMs is crucial to allow assessment of the potential impact of nanowaste on our ecosystem. Trace analysis and quantification of nanoparticulate species is very challenging because of the variety of ENM types that are used in products and the low concentrations of nanowaste expected in complex environmental media. In the framework of this paper, challenges in nanowaste characterisation and appropriate analytical techniques which can be applied to nanowaste analysis are summarised. Recent case studies focussing on the characterisation of ENMs in waste streams are discussed. Most studies aim to investigate the fate of nanowaste during incineration, particularly considering aerosol measurements, whereas detailed studies focusing on the potential release of nanowaste during waste recycling processes are currently not available. In terms of suitable analytical methods, separation techniques coupled to spectrometry-based methods are promising tools to detect nanowaste and determine particle size distribution in liquid waste samples. Standardised leaching protocols can be applied to generate soluble fractions stemming from solid wastes, while micro- and ultrafiltration can be used to enrich nanoparticulate species. Imaging techniques combined with X-ray-based methods are powerful tools for determining particle size and morphology and screening elemental composition. However, quantification of nanowaste is currently hampered by the difficulty of differentiating engineered from naturally-occurring nanoparticles. A promising approach to face these challenges in nanowaste characterisation might be the application of nanotracers with unique optical properties or elemental or isotopic fingerprints. At present, there is also a need to develop and standardise analytical protocols regarding nanowaste sampling, separation and quantification. In general, more experimental studies are needed to examine the fate and transport of ENMs in waste streams, to deduce transfer coefficients, and to develop reliable material flow models.
A 'smart' tube holder enables real-time sample monitoring in a standard lab centrifuge.
Hoang, Tony; Moskwa, Nicholas; Halvorsen, Ken
2018-01-01
The centrifuge is among the oldest and most widely used pieces of laboratory equipment, with significant applications that include clinical diagnostics and biomedical research. A major limitation of laboratory centrifuges is their "black box" nature, limiting sample observation to before and after centrifugation. Thus, optimized protocols require significant trial and error, while unoptimized protocols waste time by centrifuging longer than necessary or material due to incomplete sedimentation. Here, we developed an instrumented centrifuge tube receptacle compatible with several commercial benchtop centrifuges that can provide real-time sample analysis during centrifugation. We demonstrated the system by monitoring cell separations during centrifugation for different spin speeds, concentrations, buffers, cell types, and temperatures. We show that the collected data are valuable for analytical purposes (e.g. quality control), or as feedback to the user or the instrument. For the latter, we verified an adaptation where complete sedimentation turned off the centrifuge and notified the user by a text message. Our system adds new functionality to existing laboratory centrifuges, saving users time and providing useful feedback. This add-on potentially enables new analytical applications for an instrument that has remained largely unchanged for decades.
A ‘smart’ tube holder enables real-time sample monitoring in a standard lab centrifuge
Hoang, Tony; Moskwa, Nicholas; Halvorsen, Ken
2018-01-01
The centrifuge is among the oldest and most widely used pieces of laboratory equipment, with significant applications that include clinical diagnostics and biomedical research. A major limitation of laboratory centrifuges is their “black box” nature, limiting sample observation to before and after centrifugation. Thus, optimized protocols require significant trial and error, while unoptimized protocols waste time by centrifuging longer than necessary or material due to incomplete sedimentation. Here, we developed an instrumented centrifuge tube receptacle compatible with several commercial benchtop centrifuges that can provide real-time sample analysis during centrifugation. We demonstrated the system by monitoring cell separations during centrifugation for different spin speeds, concentrations, buffers, cell types, and temperatures. We show that the collected data are valuable for analytical purposes (e.g. quality control), or as feedback to the user or the instrument. For the latter, we verified an adaptation where complete sedimentation turned off the centrifuge and notified the user by a text message. Our system adds new functionality to existing laboratory centrifuges, saving users time and providing useful feedback. This add-on potentially enables new analytical applications for an instrument that has remained largely unchanged for decades. PMID:29659624
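To make the feedback idea described above concrete, the sketch below shows a minimal monitoring loop of the kind such an instrumented tube holder could support. The turbidity model, threshold and timing values are invented placeholders for illustration, not the authors' implementation; a real version would read the holder's optical sensor and hook into the centrifuge controls or an SMS gateway.

```python
import math

def read_turbidity(t_s):
    """Simulated supernatant turbidity at time t (s); a real implementation would read the
    holder's optical sensor (the exponential clearing model is an assumption)."""
    return math.exp(-t_s / 60.0)

def monitor_spin(threshold=0.05, step_s=1.0, max_s=600.0):
    """Return the time at which sedimentation looks complete, or None on timeout.
    In the real device this decision could stop the rotor or trigger a text message."""
    t = 0.0
    while t <= max_s:
        if read_turbidity(t) < threshold:
            return t
        t += step_s
    return None

t_done = monitor_spin()
print("stop centrifuge at ~%.0f s" % t_done if t_done is not None else "timeout")
```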
Selective functionalisation of PDMS-based photonic lab on a chip for biosensing.
Ibarlucea, Bergoi; Fernández-Sánchez, César; Demming, Stefanie; Büttgenbach, Stephanus; Llobera, Andreu
2011-09-07
A comparative study of different approaches for the selective immobilisation of biomolecules on the surface of poly(dimethylsiloxane) (PDMS) is reported. The motivation of this work is to establish a robust and reliable protocol for the easy implementation of a biosensor device in a PDMS-based photonic lab-on-a-chip (PhLoC). A hollow prism configuration, previously reported for the colorimetric detection of analytes, was chosen for this study. Here, the inner walls of the hollow prism were initially modified by direct adsorption of either polyethylene glycol (PEG) or polyvinyl alcohol (PVA) linear polymers, as well as by carrying out a light chemical oxidation step. All these processes introduced hydroxyl groups on the PDMS surface to different extents. The hydroxyl groups were further silanised using a silane containing an aldehyde end-group. The interaction between this group and a primary amine moiety enabled the selective covalent attachment of a biomolecule on the PDMS surface. A thorough structural characterisation of the resulting modified-PDMS substrates was carried out by contact angle measurements, X-ray photoelectron spectroscopic (XPS) analysis and atomic force microscopy (AFM) imaging. Using horseradish peroxidase as a model recognition element, different biosensor approaches based on each modification process were developed for the detection of the hydrogen peroxide target analyte in a concentration range from 0.1 µM to 100 µM. The analytical performance was similar in all cases: a linear concentration range between 0.1 µM and 24.2 µM, a sensitivity of 0.02 a.u. µM(-1) and a limit of detection of around 0.1 µM were achieved. However, important differences were observed in the reproducibility of the devices as well as in their operational stability, which was studied over a period of up to two months. Considering all these studies, the PVA-modified approach appeared to be the most suitable one for the simple fabrication of a biosensor device integrated in a PDMS PhLoC.
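As an illustration of how calibration figures such as the reported sensitivity and limit of detection are typically derived from colorimetric readouts, a generic least-squares calibration sketch is given below. The absorbance data are invented (generated from the reported sensitivity of ~0.02 a.u. per µM plus noise), and the 3-sigma detection-limit estimate is a common convention, not the authors' exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
# Invented example data: H2O2 concentrations (µM) within the reported linear range and
# absorbance values generated with an assumed sensitivity of 0.02 a.u./µM plus noise.
conc = np.array([0.1, 1, 5, 10, 15, 20, 24.2])
absorbance = 0.02 * conc + rng.normal(0, 0.005, conc.size)

slope, intercept = np.polyfit(conc, absorbance, 1)      # sensitivity (a.u./µM) and offset
residuals = absorbance - (slope * conc + intercept)
s_resid = residuals.std(ddof=2)                          # stand-in for the blank SD (assumption)
lod = 3 * s_resid / slope                                # common 3-sigma detection-limit estimate

print(f"sensitivity ~ {slope:.3f} a.u./µM, LOD ~ {lod:.2f} µM")
```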
Zhao, Xiaoyan; Qureshi, Ferhan; Eastman, P Scott; Manning, William C; Alexander, Claire; Robinson, William H; Hesterberg, Lyndal K
2012-04-30
Variability in pre-analytical blood sampling and handling can significantly impact results obtained in quantitative immunoassays. Understanding the impact of these variables is critical for accurate quantification and validation of biomarker measurements. Particularly, in the design and execution of large clinical trials, even small differences in sample processing and handling can have dramatic effects in analytical reliability, results interpretation, trial management and outcome. The effects of two common blood sampling methods (serum vs. plasma) and two widely-used serum handling methods (on the clot with ambient temperature shipping, "traditional", vs. centrifuged with cold chain shipping, "protocol") on protein and autoantibody concentrations were examined. Matched serum and plasma samples were collected from 32 rheumatoid arthritis (RA) patients representing a wide range of disease activity status. Additionally, a set of matched serum samples with two sample handling methods was collected. One tube was processed per manufacturer's instructions and shipped overnight on cold packs (protocol). The matched tube, without prior centrifugation, was simultaneously shipped overnight at ambient temperatures (traditional). Upon delivery, the traditional tube was centrifuged. All samples were subsequently aliquoted and frozen prior to analysis of protein and autoantibody biomarkers. Median correlation between paired serum and plasma across all autoantibody assays was 0.99 (0.98-1.00) with a median % difference of -3.3 (-7.5 to 6.0). In contrast, observed protein biomarker concentrations were significantly affected by sample types, with median correlation of 0.99 (0.33-1.00) and a median % difference of -10 (-55 to 23). When the two serum collection/handling methods were compared, the median correlation between paired samples for autoantibodies was 0.99 (0.91-1.00) with a median difference of 4%. In contrast, significant increases were observed in protein biomarker concentrations among certain biomarkers in samples processed with the 'traditional' method. Autoantibody quantification appears robust to both sample type (plasma vs. serum) and pre-analytical sample collection/handling methods (protocol vs. traditional). In contrast, for non-antibody protein biomarker concentrations, sample type had a significant impact; plasma samples generally exhibit decreased protein biomarker concentrations relative to serum. Similarly, sample handling significantly impacted the variability of protein biomarker concentrations. When biomarker concentrations are combined algorithmically into a single test score such as a multi-biomarker disease activity test for rheumatoid arthritis (MBDA), changes in protein biomarker concentrations may result in a bias of the score. These results illustrate the importance of characterizing pre-analytical methodology, sample type, sample processing and handling procedures for clinical testing in order to ensure test accuracy. Copyright © 2012 Elsevier B.V. All rights reserved.
Magnusson, R; Nordlander, T; Östin, A
2016-01-15
Sampling teams performing work at sea in areas where chemical munitions may have been dumped require rapid and reliable analytical methods for verifying sulfur mustard leakage from suspected objects. Here we present such an on-site analysis method based on dynamic headspace GC-MS for analysis of five cyclic sulfur mustard degradation products that have previously been detected in sediments from chemical weapon dumping sites: 1,4-oxathiane, 1,3-dithiolane, 1,4-dithiane, 1,4,5-oxadithiephane, and 1,2,5-trithiephane. An experimental design involving authentic Baltic Sea sediments spiked with the target analytes was used to develop an optimized protocol for sample preparation, headspace extraction and analysis that afforded recoveries of up to 60-90%. The optimized method needs no organic solvents, uses only two grams of sediment on a dry weight basis and involves a unique sample presentation whereby sediment is spread uniformly as a thin layer inside the walls of a glass headspace vial. The method showed good linearity for analyte concentrations of 5-200 ng/g dw, good repeatability, and acceptable carry-over. The method's limits of detection for spiked sediment samples ranged from 2.5 to 11 μg/kg dw, with matrix interference being the main limiting factor. The instrumental detection limits were one to two orders of magnitude lower. Full-scan GC-MS analysis enabled the use of automated mass spectral deconvolution for rapid identification of target analytes. Using this approach, analytes could be identified in spiked sediment samples at concentrations down to 13-65 μg/kg dw. On-site validation experiments conducted aboard the research vessel R/V Oceania demonstrated the method's practical applicability, enabling the successful identification of four cyclic sulfur mustard degradation products at concentrations of 15-308μg/kg in sediments immediately after being collected near a wreck at the Bornholm Deep dumpsite in the Baltic Sea. Copyright © 2015 Elsevier B.V. All rights reserved.
A Novel Quantum Solution to Privacy-Preserving Nearest Neighbor Query in Location-Based Services
NASA Astrophysics Data System (ADS)
Luo, Zhen-yu; Shi, Run-hua; Xu, Min; Zhang, Shun
2018-04-01
We present a cheating-sensitive quantum protocol for Privacy-Preserving Nearest Neighbor Query based on Oblivious Quantum Key Distribution and Quantum Encryption. Compared with the classical related protocols, our proposed protocol has higher security, because the security of our protocol is based on basic physical principles of quantum mechanics, instead of difficulty assumptions. Especially, our protocol takes single photons as quantum resources and only needs to perform single-photon projective measurement. Therefore, it is feasible to implement this protocol with the present technologies.
NASA Astrophysics Data System (ADS)
Newbury, Dale E.; Ritchie, Nicholas W. M.
2012-06-01
Scanning electron microscopy with energy dispersive x-ray spectrometry (SEM/EDS) is a powerful and flexible elemental analysis method that can identify and quantify elements with atomic numbers >= 4 (Be) present as major constituents (where the concentration C > 0.1 mass fraction, or 10 weight percent), minor constituents (0.01 <= C <= 0.1) and trace constituents (C < 0.01, with a minimum detectable limit of ~0.0005 - 0.001 under routine measurement conditions, a level which is analyte and matrix dependent). SEM/EDS can select specimen volumes with linear dimensions from ~ 500 nm to 5 μm depending on composition (masses ranging from ~ 10 pg to 100 pg) and can provide compositional maps that depict lateral elemental distributions. Despite the maturity of SEM/EDS, which has a history of more than 40 years, and the sophistication of modern analytical software, the method is vulnerable to serious shortcomings that can lead to incorrect elemental identifications and quantification errors that significantly exceed reasonable expectations. This paper describes shortcomings in peak identification procedures, limitations on the accuracy of quantitative analysis due to specimen topography or failures in physical models for matrix corrections, and quantitative artifacts encountered in x-ray elemental mapping. Effective solutions to these problems are based on understanding the causes and then establishing appropriate measurement science protocols. NIST DTSA II and Lispix are open source analytical software packages, available free at www.nist.gov, that can aid the analyst in overcoming significant limitations to SEM/EDS.
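The concentration classes quoted above translate directly into a simple classification rule. The helper below only encodes those stated thresholds for illustration; it is not part of the NIST software mentioned in the abstract.

```python
def constituent_class(mass_fraction):
    """Classify a constituent by the concentration ranges quoted in the abstract."""
    if mass_fraction > 0.1:
        return "major"   # C > 0.1 mass fraction (10 weight percent)
    if mass_fraction >= 0.01:
        return "minor"   # 0.01 <= C <= 0.1
    return "trace"       # C < 0.01; detectable down to ~0.0005-0.001 in routine work

for c in (0.25, 0.05, 0.002):
    print(c, "->", constituent_class(c))
```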
NASA Astrophysics Data System (ADS)
Sportelli, M. C.; Picca, R. A.; Manoli, K.; Re, M.; Pesce, E.; Tapfer, L.; Di Franco, C.; Cioffi, N.; Torsi, L.
2017-10-01
The analytical performance of bioelectronic devices is highly influenced by their fabrication methods. In particular, the final architecture of field-effect transistor biosensors combining spin-cast poly(3-hexylthiophene) (P3HT) film and a biomolecule interlayer deposited on a SiO2/Si substrate can lead to the development of highly performing sensing systems, such as for the case of streptavidin (SA) used for biotin sensing. To gain a better understanding of the quality of the interfacial area, critical is the assessment of the morphological features characteristic of the adopted biolayer deposition protocol, namely: the layer-by-layer (LbL) approach and the spin coating technique. The present study relies on a combined surface spectroscopic and morphological characterization. Specifically, X-ray photoelectron spectroscopy operated in the parallel angle-resolved mode allowed the non-destructive investigation of the in-depth chemical composition of the SA film, alone or in the presence of the P3HT overlayer. Spectroscopic data were supported and corroborated by the results obtained with a Scanning Electron and a Helium Ion microscope investigation performed on the SA layer that provided relevant information on the protein structural arrangement or on its surface morphology. Clear differences emerged between the SA layers prepared by the two approaches, with the layer-by-layer deposition resulting in a smoother and better defined bio-electronic interface. Such findings support the superior analytical performance shown by bioelectronic devices based on LbL-deposited protein layers over spin coated ones.
Remote software upload techniques in future vehicles and their performance analysis
NASA Astrophysics Data System (ADS)
Hossain, Irina
Updating software in vehicle Electronic Control Units (ECUs) will become a mandatory requirement for a variety of reasons, for example, to update or fix the functionality of an existing system, to add new functionality, to remove software bugs, and to cope with ITS infrastructure. Software modules of advanced vehicles can be updated using the Remote Software Upload (RSU) technique. RSU employs an infrastructure-based wireless communication technique in which the software supplier sends the software to the targeted vehicle via a roadside Base Station (BS). However, security is critically important in RSU to avoid any disasters due to malfunctions of the vehicle and to protect proprietary algorithms from hackers, competitors, or people with malicious intent. In this thesis, a mechanism for secure software upload in advanced vehicles is presented which employs mutual authentication of the software provider and the vehicle using a pre-shared authentication key before sending the software. The software packets are sent encrypted with a secret key along with a Message Digest (MD). To increase the security level, it is proposed that the vehicle receive more than one copy of the software, each accompanied by its MD. The vehicle installs the new software only when it receives more than one identical copy of the software. To validate the proposition, analytical expressions for the average number of packet transmissions required for a successful software update are derived. Different cases are investigated depending on the vehicle's buffer size and verification method. The analytical and simulation results show that it is sufficient to send two copies of the software to the vehicle to thwart any security attack while uploading the software. The above-mentioned unicast method for RSU is suitable when software needs to be uploaded to a single vehicle. Since multicasting is the most efficient method of group communication, updating software in the ECUs of a large number of vehicles could benefit from it. However, as with unicast RSU, the security requirements of multicast communication, i.e., authenticity, confidentiality and integrity of the transmitted software and access control of the group members, are challenging. In this thesis, an infrastructure-based mobile multicasting scheme for RSU in vehicle ECUs is proposed in which an ECU receives the software from a remote software distribution center using the roadside BSs as gateways. The Vehicular Software Distribution Network (VSDN) is divided into small regions administered by a Regional Group Manager (RGM). Two multicast Group Key Management (GKM) techniques are proposed based on the degree of trust in the BSs, named the Fully-trusted (FT) and Semi-trusted (ST) systems. Analytical models are developed to find the multicast session establishment latency and the handover latency for these two protocols. The average latency for mutual authentication of the software vendor and a vehicle and for delivery of the multicast session key by the software provider during multicast session initialization, as well as the handoff latency during a multicast session, is calculated. Analytical and simulation results show that the per-vehicle link establishment latency of the proposed schemes is in the range of a few seconds, with the ST system requiring a few milliseconds more than the FT system. The handoff latency is also in the range of a few seconds, and in some cases the ST system requires less handoff time than the FT system.
Thus, it is possible to build an efficient GKM protocol without placing too much trust in the BSs.
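The acceptance rule described in the unicast scheme above (install only when more than one received copy is identical and each copy verifies against its message digest) can be sketched as follows. The digest algorithm, data structures and copy count are assumptions chosen for illustration, not the thesis' exact protocol.

```python
import hashlib
from collections import Counter

def digest(software: bytes) -> str:
    """Message digest of a software image (SHA-256 is assumed here for illustration)."""
    return hashlib.sha256(software).hexdigest()

def accept_software(copies, min_identical=2):
    """copies: list of (software_bytes, received_digest) pairs from repeated transmissions.
    Install only if at least `min_identical` received copies are identical and verified."""
    verified = [digest(sw) for sw, md in copies if digest(sw) == md]
    for d, n in Counter(verified).items():
        if n >= min_identical:
            return d          # digest of the image to install
    return None               # reject: not enough identical, verified copies

image = b"new ECU firmware"
ok = accept_software([(image, digest(image)), (image, digest(image))])
print("install" if ok else "reject")
```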
NASA Astrophysics Data System (ADS)
Mednova, Olga; Kirsanov, Dmitry; Rudnitskaya, Alisa; Kilmartin, Paul; Legin, Andrey
2009-05-01
The present study deals with a potentiometric electronic tongue (ET) multisensor system applied for the simultaneous determination of several chemical parameters for white wines produced in New Zealand. Methods in use for wine quality control are often expensive and require considerable time and skilled operation. The ET approach usually offers a simple and fast measurement protocol and allows automation for on-line analysis under industrial conditions. The ET device developed in this research is capable of quantifying the free and total SO2 content, total acids and some polyphenolic compounds in white wines with acceptable analytical errors.
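Electronic tongue data of this kind are usually turned into concentration estimates by multivariate calibration. The abstract does not name the chemometric method, so the partial least squares regression below is an assumption, and all sensor responses and reference values are invented for illustration.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
# Invented example: potentiometric responses of a 12-sensor array for 30 wine samples,
# with synthetic reference free-SO2 values (mg/L) from a standard method.
X = rng.normal(size=(30, 12))
y = 20 + 5 * X[:, 0] + 2 * X[:, 3] + rng.normal(0, 1, 30)

model = PLSRegression(n_components=3)   # number of latent variables is an assumption
model.fit(X[:20], y[:20])               # calibration set
pred = model.predict(X[20:]).ravel()    # prediction set
rmsep = np.sqrt(np.mean((pred - y[20:]) ** 2))
print(f"RMSEP ~ {rmsep:.1f} mg/L")
```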
A Simple Sonication Improves Protein Signal in Matrix-Assisted Laser Desorption Ionization Imaging
NASA Astrophysics Data System (ADS)
Lin, Li-En; Su, Pin-Rui; Wu, Hsin-Yi; Hsu, Cheng-Chih
2018-02-01
Proper matrix application is crucial for obtaining high-quality matrix-assisted laser desorption ionization (MALDI) mass spectrometry imaging (MSI). Solvent-free sublimation was essentially introduced as an approach to homogeneous coating that gives a small crystal size of the organic matrix. However, sublimation has a lower extraction efficiency for analytes. Here, we show that a simple sonication step after the hydration step in the standard sublimation protocol significantly enhances the sensitivity of MALDI MSI. This modified procedure uses a common laboratory ultrasonicator to immobilize the analytes from tissue sections without noticeable delocalization. Improved imaging quality, with additional peaks above 10 kDa in the spectra, was thus obtained upon sonication treatment.
Ornatsky, Olga I; Kinach, Robert; Bandura, Dmitry R; Lou, Xudong; Tanner, Scott D; Baranov, Vladimir I; Nitz, Mark; Winnik, Mitchell A
2008-01-01
Advances in the development of highly multiplexed bio-analytical assays with inductively coupled plasma mass spectrometry (ICP-MS) detection are discussed. Use of novel reagents specifically designed for immunological methods utilizing elemental analysis is presented. The major steps of method development, including selection of elements for tags, validation of tagged reagents, and examples of multiplexed assays, are considered in detail. The paper further describes experimental protocols for elemental tagging of antibodies, immunostaining of live and fixed human leukemia cells, and preparation of samples for ICP-MS analysis. Quantitative analysis of surface antigens on model cell lines using a cocktail of seven lanthanide labeled antibodies demonstrated high specificity and concordance with conventional immunophenotyping.
Relativistic quantum private database queries
NASA Astrophysics Data System (ADS)
Sun, Si-Jia; Yang, Yu-Guang; Zhang, Ming-Ou
2015-04-01
Recently, Jakobi et al. (Phys Rev A 83, 022301, 2011) suggested the first practical private database query protocol (J-protocol) based on the Scarani et al. (Phys Rev Lett 92, 057901, 2004) quantum key distribution protocol. Unfortunately, the J-protocol is just a cheat-sensitive private database query protocol. In this paper, we present an idealized relativistic quantum private database query protocol based on Minkowski causality and the properties of quantum information. Also, we prove that the protocol is secure in terms of the user security and the database security.
Biometrics based authentication scheme for session initiation protocol.
Xie, Qi; Tang, Zhixiong
2016-01-01
Many two-factor challenge-response based session initiation protocol (SIP) authentication schemes have been proposed, but most of them are vulnerable to stolen smart card attacks and password guessing attacks. In this paper, we propose a novel three-factor SIP authentication scheme using biometrics, a password and a smart card, and utilize the pi calculus-based formal verification tool ProVerif to prove that the proposed protocol achieves security and authentication. Furthermore, our protocol is highly efficient when compared to other related protocols.
Analytical control test plan and microbiological methods for the water recovery test
NASA Technical Reports Server (NTRS)
Traweek, M. S. (Editor); Tatara, J. D. (Editor)
1994-01-01
Qualitative and quantitative laboratory results are important to the decision-making process. In some cases, they may represent the only basis for deciding between two or more given options or processes. Therefore, it is essential that the handling of laboratory samples and the analytical operations employed are performed with a deliberate level of conscientious effort. Reporting erroneous results can lead to faulty interpretations and result in misinformed decisions. This document provides the analytical control specifications which will govern future test procedures related to all Water Recovery Test (WRT) Phase 3 activities to be conducted at the National Aeronautics and Space Administration/Marshall Space Flight Center (NASA/MSFC). This document addresses the process that will be used to verify analytical data generated throughout the test period, identifies the responsibilities of key personnel and participating laboratories and the chains of communication to be followed, and ensures that approved methodology and procedures are used during WRT activities. This document does not outline specifics, but provides a minimum guideline by which sampling protocols, analysis methodologies, test site operations, and laboratory operations should be developed.
Quantum key distribution protocol based on contextuality monogamy
NASA Astrophysics Data System (ADS)
Singh, Jaskaran; Bharti, Kishor; Arvind
2017-06-01
The security of quantum key distribution (QKD) protocols hinges upon features of physical systems that are uniquely quantum in nature. We explore the role of quantumness, as qualified by quantum contextuality, in a QKD scheme. A QKD protocol based on the Klyachko-Can-Binicioğlu-Shumovsky (KCBS) contextuality scenario using a three-level quantum system is presented. We explicitly show the unconditional security of the protocol by a generalized contextuality monogamy relationship based on the no-disturbance principle. This protocol provides a new framework for QKD which has conceptual and practical advantages over other protocols.
Ridde, Valéry; Turcotte-Tremblay, Anne-Marie; Souares, Aurélia; Lohmann, Julia; Zombré, David; Koulidiati, Jean Louis; Yaogo, Maurice; Hien, Hervé; Hunt, Matthew; Zongo, Sylvie; De Allegri, Manuela
2014-10-12
The low quality of healthcare and the presence of user fees in Burkina Faso contribute to low utilization of healthcare and elevated levels of mortality. To improve access to high-quality healthcare and equity, national authorities are testing different intervention arms that combine performance-based financing with community-based health insurance and pro-poor targeting. There is a need to evaluate the implementation of these unique approaches. We developed a research protocol to analyze the conditions that led to the emergence of these intervention arms, the fidelity between the activities initially planned and those conducted, the implementation and adaptation processes, the sustainability of the interventions, the possibilities for scaling them up, and their ethical implications. The study adopts a longitudinal multiple case study design with several embedded levels of analyses. To represent the diversity of contexts where the intervention arms are carried out, we will select three districts. Within districts, we will select both primary healthcare centers (n =18) representing different intervention arms and the district or regional hospital (n =3). We will select contrasted cases in relation to their initial performance (good, fair, poor). Over a period of 18 months, we will use quantitative and qualitative data collection and analytical tools to study these cases including in-depth interviews, participatory observation, research diaries, and questionnaires. We will give more weight to qualitative methods compared to quantitative methods. Performance-based financing is expanding rapidly across low- and middle-income countries. The results of this study will enable researchers and decision makers to gain a better understanding of the factors that can influence the implementation and the sustainability of complex interventions aiming to increase healthcare quality as well as equity.
Kestens, Yan; Chaix, Basile; Gerber, Philippe; Desprès, Michel; Gauvin, Lise; Klein, Olivier; Klein, Sylvain; Köppen, Bernhard; Lord, Sébastien; Naud, Alexandre; Payette, Hélène; Richard, Lucie; Rondier, Pierre; Shareck, Martine; Sueur, Cédric; Thierry, Benoit; Vallée, Julie; Wasfi, Rania
2016-05-05
Given the challenges of aging populations, calls have been issued for more sustainable urban re-development and implementation of local solutions to address global environmental and healthy aging issues. However, few studies have considered older adults' daily mobility to better understand how local built and social environments may contribute to healthy aging. Meanwhile, wearable sensors and interactive map-based applications offer novel means for gathering information on people's mobility, levels of physical activity, or social network structure. Combining such data with classical questionnaires on well-being, physical activity, perceived environments and qualitative assessment of experience of places opens new opportunities to assess the complex interplay between individuals and environments. In line with current gaps and novel analytical capabilities, this research proposes an international research agenda to collect and analyse detailed data on daily mobility, social networks and health outcomes among older adults using interactive web-based questionnaires and wearable sensors. Our study resorts to a battery of innovative data collection methods including use of a novel multisensor device for collection of location and physical activity, interactive map-based questionnaires on regular destinations and social networks, and qualitative assessment of experience of places. This rich data will allow advanced quantitative and qualitative analyses in the aim to disentangle the complex people-environment interactions linking urban local contexts to healthy aging, with a focus on active living, social networks and participation, and well-being. This project will generate evidence about what characteristics of urban environments relate to active mobility, social participation, and well-being, three important dimensions of healthy aging. It also sets the basis for an international research agenda on built environment and healthy aging based on a shared and comprehensive data collection protocol.
Synthesizing Existing CSMA and TDMA Based MAC Protocols for VANETs
Huang, Jiawei; Li, Qi; Zhong, Shaohua; Liu, Lianhai; Zhong, Ping; Wang, Jianxin; Ye, Jin
2017-01-01
Many Carrier Sense Multiple Access (CSMA) and Time Division Multiple Access (TDMA) based medium access control (MAC) protocols for vehicular ad hoc networks (VANETs) have been proposed recently. Contrary to the common perception that they are competitors, we argue that the underlying strategies used in these MAC protocols are complementary. Based on this insight, we design CTMAC, a MAC protocol that synthesizes existing strategies; namely, random accessing channel (used in CSMA-style protocols) and arbitral reserving channel (used in TDMA-based protocols). CTMAC swiftly changes its strategy according to the vehicle density, and its performance is better than the state-of-the-art protocols. We evaluate CTMAC using at-scale simulations. Our results show that CTMAC reduces the channel completion time and increases the network goodput by 45% for a wide range of application workloads and network settings. PMID:28208590
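The core idea of synthesizing the two access strategies can be caricatured as a density-triggered switch between contention-based and reservation-based channel access. The threshold value and interface below are illustrative assumptions, not CTMAC's actual decision rule.

```python
def choose_access_strategy(vehicle_density, density_threshold=20):
    """Pick a channel-access strategy from the local vehicle density.
    Low density: contention-based random access (CSMA-style).
    High density: reservation-based slotted access (TDMA-style).
    The threshold is an assumed parameter for illustration only."""
    return "random_access" if vehicle_density < density_threshold else "slot_reservation"

for density in (5, 18, 35):
    print(density, "vehicles ->", choose_access_strategy(density))
```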
Colloidal Mechanisms of Gold Nanoparticle Loss in Asymmetric Flow Field-Flow Fractionation.
Jochem, Aljosha-Rakim; Ankah, Genesis Ngwa; Meyer, Lars-Arne; Elsenberg, Stephan; Johann, Christoph; Kraus, Tobias
2016-10-07
Flow field-flow fractionation is a powerful method for the analysis of nanoparticle size distributions, but its widespread use has been hampered by large analyte losses, especially of metal nanoparticles. Here, we report on the colloidal mechanisms underlying these losses. We studied gold nanoparticles (AuNPs) during asymmetrical flow field-flow fractionation (AF4) by systematic variation of the particle properties and the eluent composition. Recoveries of AuNPs (core diameter 12 nm) stabilized by citrate or polyethylene glycol (PEG) at different ionic strengths were determined. We used online UV-vis detection and off-line elemental analysis to follow particle losses during full analysis runs, runs without cross-flow, and runs with parts of the instrument bypassed. The combination allowed us to calculate relative and absolute analyte losses at different stages of the analytical protocol. We found different loss mechanisms depending on the ligand. Citrate-stabilized particles degraded during analysis and suffered large losses (up to 74%). PEG-stabilized particles had smaller relative losses at moderate ionic strengths (1-20%) that depended on PEG length. Long PEGs at higher ionic strengths (≥5 mM) caused particle loss due to bridging adsorption at the membrane. Bulk agglomeration was not a relevant loss mechanism at low ionic strengths ≤5 mM for any of the studied particles. An unexpectedly large fraction of particles was lost at tubing and other internal surfaces. We propose that the colloidal mechanisms observed here are relevant loss mechanisms in many particle analysis protocols, and we discuss strategies to avoid them.
Interoperability through standardization: Electronic mail, and X Window systems
NASA Technical Reports Server (NTRS)
Amin, Ashok T.
1993-01-01
Since the introduction of computing machines, there have been continual advances in computer and communication technologies, which are now approaching limits. The user interface has evolved from a row of switches, through character-based interfaces using teletype terminals and then video terminals, to the present-day graphical user interface. It is expected that the next significant advances will come in the availability of services, such as electronic mail and directory services, as the standards for applications are developed, and in 'easy to use' interfaces, such as graphical user interfaces, for example Windows and X Window, which are being standardized. Various proprietary electronic mail (email) systems are in use within organizations at each NASA center. Each system provides email services to users within an organization; however, support for email services across organizations and across centers exists only to a varying degree and is often not easy to use. A recent NASA email initiative is intended 'to provide a simple way to send email across organizational boundaries without disruption of installed base.' The initiative calls for integration of existing organizational email systems through gateways connected by a message switch, supporting X.400 and SMTP protocols, to create a NASA-wide email system, and for implementation of NASA-wide email directory services based on the OSI standard X.500. A brief overview of MSFC efforts as a part of this initiative is described. Window-based graphical user interfaces make computers easy to use. The X Window protocol was developed at the Massachusetts Institute of Technology in 1984/1985 to provide a uniform window-based interface in a distributed computing environment with heterogeneous computers. It has since become a standard supported by a number of major manufacturers. X Window systems, terminals and workstations, and X Window applications are becoming available. However, the impact of their use on network traffic in a Local Area Network environment is not well understood. It is expected that the use of X Window systems will increase at MSFC, especially for Unix-based systems. An overview of the X Window protocol is presented and its impact on network traffic is examined. It is proposed that an analytical model of X Window systems in the network environment be developed and validated through the use of measurements to generate application and user profiles.
A Survey on the Taxonomy of Cluster-Based Routing Protocols for Homogeneous Wireless Sensor Networks
Naeimi, Soroush; Ghafghazi, Hamidreza; Chow, Chee-Onn; Ishii, Hiroshi
2012-01-01
The past few years have witnessed increased interest among researchers in cluster-based protocols for homogeneous networks because of their better scalability and higher energy efficiency than other routing protocols. Given the limited capabilities of sensor nodes in terms of energy resources, processing and communication range, the cluster-based protocols should be compatible with these constraints in either the setup state or steady data transmission state. With focus on these constraints, we classify routing protocols according to their objectives and methods towards addressing the shortcomings of clustering process on each stage of cluster head selection, cluster formation, data aggregation and data communication. We summarize the techniques and methods used in these categories, while the weakness and strength of each protocol is pointed out in details. Furthermore, taxonomy of the protocols in each phase is given to provide a deeper understanding of current clustering approaches. Ultimately based on the existing research, a summary of the issues and solutions of the attributes and characteristics of clustering approaches and some open research areas in cluster-based routing protocols that can be further pursued are provided. PMID:22969350
2017-01-01
Background Mobile device-based ecological momentary assessment (mobile-EMA) is increasingly used to collect participants' data in real-time and in context. Although EMA offers methodological advantages, these advantages can be diminished by participant noncompliance. However, evidence on how well participants comply with mobile-EMA protocols and how study design factors associated with participant compliance is limited, especially in the youth literature. Objective To systematically and meta-analytically examine youth’s compliance to mobile-EMA protocols and moderators of participant compliance in clinical and nonclinical settings. Methods Studies using mobile devices to collect EMA data among youth (age ≤18 years old) were identified. A systematic review was conducted to describe the characteristics of mobile-EMA protocols and author-reported factors associated with compliance. Random effects meta-analyses were conducted to estimate the overall compliance across studies and to explore factors associated with differences in youths’ compliance. Results This review included 42 unique studies that assessed behaviors, subjective experiences, and contextual information. Mobile phones were used as the primary mode of EMA data collection in 48% (20/42) of the reviewed studies. In total, 12% (5/42) of the studies used wearable devices in addition to the EMA data collection platforms. About half of the studies (62%, 24/42) recruited youth from nonclinical settings. Most (98%, 41/42) studies used a time-based sampling protocol. Among these studies, most (95%, 39/41) prompted youth 2-9 times daily, for a study length ranging from 2-42 days. Sampling frequency and study length did not differ between studies with participants from clinical versus nonclinical settings. Most (88%, 36/41) studies with a time-based sampling protocol defined compliance as the proportion of prompts to which participants responded. In these studies, the weighted average compliance rate was 78.3%. The average compliance rates were not different between studies with clinical (76.9%) and nonclinical (79.2%; P=.29) and studies that used only a mobile-EMA platform (77.4%) and mobile platform plus additional wearable devices (73.0%, P=.36). Among clinical studies, the mean compliance rate was significantly lower in studies that prompted participants 2-3 times (73.5%) or 4-5 times (66.9%) compared with studies with a higher sampling frequency (6+ times: 89.3%). Among nonclinical studies, a higher average compliance rate was observed in studies that prompted participants 2-3 times daily (91.7%) compared with those that prompted participants more frequently (4-5 times: 77.4%; 6+ times: 75.0%). The reported compliance rates did not differ by duration of EMA period among studies from either clinical or nonclinical settings. Conclusions The compliance rate among mobile-EMA studies in youth is moderate but suboptimal. Study design may affect protocol compliance differently between clinical and nonclinical participants; including additional wearable devices did not affect participant compliance. A more consistent compliance-related result reporting practices can facilitate understanding and improvement of participant compliance with EMA data collection among youth. PMID:28446418
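The pooled compliance figure reported above is conceptually a weighted mean across studies. The review itself used random-effects meta-analysis, which weights studies differently, so the simple sample-size-weighted mean below (with invented numbers) is only an illustrative simplification of the idea.

```python
# Invented example: per-study compliance rates (%) and participant counts.
compliance = [82.0, 71.5, 90.0, 76.0]
n_participants = [40, 120, 25, 60]

weighted_mean = sum(c * n for c, n in zip(compliance, n_participants)) / sum(n_participants)
print(f"weighted average compliance ~ {weighted_mean:.1f}%")
```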
A Comparison of EFL Raters' Essay-Rating Processes across Two Types of Rating Scales
ERIC Educational Resources Information Center
Li, Hang; He, Lianzhen
2015-01-01
This study used think-aloud protocols to compare essay-rating processes across holistic and analytic rating scales in the context of China's College English Test Band 6 (CET-6). A group of 9 experienced CET-6 raters scored the same batch of 10 CET-6 essays produced in an operational CET-6 administration twice, using both the CET-6 holistic…
ERIC Educational Resources Information Center
Smagorinsky, Peter; Daigle, Elizabeth Anne; O'Donnell-Allen, Cindy; Bynum, Susan
2010-01-01
This article reports a study of one high school senior's process of academic bullshitting as she wrote an analytic essay interpreting Shakespeare's "Much Ado about Nothing." The construct of bullshit has received little scholarly attention; although it is known as a common phenomenon in academic speech and writing, it has rarely been the subject…
Donald C. Buso; Gene E. Likens; John S. Eaton
2000-01-01
The Hubbard Brook Ecosystem Study (HBES), begun in 1963, is a long-term effort to understand the structure, function and change in forest watersheds and associated aquatic ecosystems at the Hubbard Brook Experimental Forest in New Hampshire. Chemical analyses of streamwater and precipitation collections began in 1963, and analyses of lakewater collections began in 1967...
Hanousek, Ondrej; Berger, Torsten W; Prohaska, Thomas
2016-01-01
Analysis of ³⁴S/³²S ratios of sulfate in rainwater and soil solutions can be seen as a powerful tool for the study of the sulfur cycle. Therefore, it is considered a useful means, e.g., for the improvement and calibration of ecological or biogeochemical models. Due to several analytical limitations, mainly caused by the low sulfate concentration in rainwater, the complex matrix of soil solutions, limited sample volumes, and the high number of samples in ecosystem studies, a straightforward analytical protocol is required to provide accurate S isotopic data on a large set of diverse samples. Therefore, sulfate separation by anion exchange membrane was combined with precise isotopic measurement by multicollector inductively coupled plasma mass spectrometry (MC ICP-MS). The separation method proved able to quantitatively remove sulfate from matrix cations (Ca, K, Na, or Li), which is a precondition for avoiding a matrix-induced analytical bias in the mass spectrometer. Moreover, sulfate exchange on the resin is capable of preconcentrating sulfate from low-concentration solutions (by a factor of 3 in our protocol). No significant sulfur isotope fractionation was observed during separation and preconcentration. MC ICP-MS operated at edge mass resolution enabled the direct ³⁴S/³²S analysis of sulfate eluted from the membrane, with an expanded uncertainty U (k = 2) down to 0.3 ‰ (for a single measurement). The protocol was optimized and validated using different sulfate solutions and different matrix compositions. The optimized method was applied in a study on solute samples retrieved in a beech (Fagus sylvatica) forest in the Vienna Woods. Both rainwater (precipitation and tree throughfall) and soil solution δ³⁴S_VCDT values ranged between 4 and 6 ‰, the ratio in soil solution being slightly lower. The lower ratio indicates that a considerable portion of the atmospherically deposited sulfate is cycled through the organic S pool before being released to the soil solution. Nearly the same trends and variations were observed in soil solution and rainwater δ³⁴S_VCDT values, showing that sulfate adsorption/desorption are not important processes in the studied soil.
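For readers less familiar with the notation, δ³⁴S values follow the standard delta definition relative to the VCDT reference. The worked example below uses an invented sample ratio, and the VCDT ³⁴S/³²S value given is the commonly cited reference ratio, stated here as an assumption rather than taken from the paper.

```python
R_VCDT = 0.0441626  # commonly cited 34S/32S ratio of the VCDT standard (assumed value)

def delta34S(r_sample, r_standard=R_VCDT):
    """delta(34)S in per mil: (R_sample / R_standard - 1) * 1000."""
    return (r_sample / r_standard - 1.0) * 1000.0

print(f"{delta34S(0.044385):.1f} per mil")  # invented sample ratio, ~5 per mil
```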
Gilbert-López, Bienvenida; García-Reyes, Juan F; Lozano, Ana; Fernández-Alba, Amadeo R; Molina-Díaz, Antonio
2010-09-24
In this work we have evaluated the performance of two sample preparation methodologies for the large-scale multiresidue analysis of pesticides in olives using liquid chromatography-electrospray tandem mass spectrometry (LC-MS/MS). The tested sample treatment methodologies were: (1) liquid-liquid partitioning with acetonitrile followed by dispersive solid-phase extraction clean-up using GCB, PSA and C18 sorbents (QuEChERS method, modified for fatty vegetables) and (2) matrix solid-phase dispersion (MSPD) using aminopropyl as sorbent material with a final clean-up on Florisil performed in the elution step. An LC-MS/MS method covering 104 multiclass pesticides was developed to examine the performance of these two protocols. The separation of the compounds from the olive extracts was achieved using a short C18 column (50 mm x 4.6 mm i.d.) with 1.8 µm particle size. The identification and confirmation of the compounds was based on retention time matching along with the presence (and ratio) of two typical MRM transitions. Limits of detection were lower than 10 µg kg(-1) for 89% of the analytes using both sample treatment protocols. Recovery studies performed on olive samples spiked at two concentration levels (10 and 100 µg kg(-1)) yielded average recoveries in the range 70-120% for most analytes when the QuEChERS procedure was employed. When MSPD was the choice for sample extraction, recoveries were in the range 50-70% for most of the target compounds. The proposed methods were successfully applied to the analysis of real olive samples, revealing the presence of some of the target species in the µg kg(-1) range. Besides the evaluation of the sample preparation approaches, we also discuss the use of advanced software features for MRM method development that overcome several limitations and drawbacks associated with MS/MS methods (time-segment boundaries, tedious method development/manual scheduling, and acquisition limitations). This software feature, recently offered by different vendors, is based on an algorithm that associates retention time data with each individual MS/MS transition, so that the number of transitions traced simultaneously at any point in the chromatographic run is reduced and dwell times and sensitivity are maximized. Copyright 2010 Elsevier B.V. All rights reserved.
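The scheduling feature described at the end of the abstract essentially activates each MRM transition only within a retention-time window around its analyte, so dwell time is shared only among co-eluting compounds. The bookkeeping can be sketched as below; the transition names, retention times and window width are invented for illustration, not vendor or method parameters.

```python
def schedule_transitions(transitions, window_min=1.0):
    """transitions: dict mapping transition name -> expected retention time (min).
    Returns a function giving, for a chromatographic time t, the transitions to monitor;
    the window width is an assumed parameter."""
    def active_at(t):
        return [name for name, rt in transitions.items() if abs(t - rt) <= window_min / 2]
    return active_at

active = schedule_transitions({"analyte_A_202>132": 6.1,    # invented entries
                               "analyte_B_233>72": 7.9,
                               "analyte_C_230>174": 8.0})
for t in (6.0, 7.95, 10.0):
    print(t, "min ->", active(t))
```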
Detection of Banned and Restricted Ozone-Depleting Chemicals in Printed Circuit Boards
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Richard N.; Wright, Bob W.
2008-12-01
A study directed toward the detection of halogenated solvents in the matrix of circuit boards has recently been completed. This work was undertaken to demonstrate the potential for reliable detection of solvents used during the fabrication of printed circuit boards (PCB). Since many of these solvents are now, or soon will be, restricted under the terms of legislation enacted in response to the Montreal Protocol and other international agreements, the work described here, conducted over a period of more than 4 years, has provided guidance for the development of a chromatographic system and analytical protocol to assure compliance with regulations introduced to control, or ban, industrial solvents associated with adverse environmental impact.
Performance Analysis of IEEE 802.15.3 MAC Protocol with Different ACK Policies
NASA Astrophysics Data System (ADS)
Mehta, S.; Kwak, K. S.
The wireless personal area network (WPAN) is an emerging wireless technology for future short-range indoor and outdoor communication applications. The IEEE 802.15.3 medium access control (MAC) protocol is proposed specifically for short-range, high data rate applications, to coordinate access to the wireless medium among the competing devices. This paper uses an analytical model to study the performance of the WPAN (IEEE 802.15.3) MAC in terms of throughput, efficient bandwidth utilization, and delay with various acknowledgment schemes under different parameters. Some important observations are also obtained, which can be very useful to protocol designers. Finally, we identify some important research issues for further investigation of possible improvements to the WPAN MAC.
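To give a feel for the kind of quantity such an analytical model produces, the toy estimate below compares the throughput of an acknowledged frame exchange with a no-acknowledgement policy by dividing useful payload bits by total airtime per frame. All timing and rate parameters are illustrative placeholders, not values from the IEEE 802.15.3 specification or from the paper.

```python
def throughput_mbps(payload_bits, data_rate_mbps, overhead_us, ack_us=0.0, sifs_us=0.0):
    """Toy estimate: useful bits divided by total time per frame exchange."""
    tx_time_us = payload_bits / data_rate_mbps   # µs, since Mbit/s == bit/µs
    total_us = tx_time_us + overhead_us + ack_us + sifs_us
    return payload_bits / total_us               # Mbit/s

payload = 2048 * 8   # 2 KB frame payload (illustrative)
rate = 55.0          # PHY rate in Mbit/s (illustrative)
print("No-ACK:  %.1f Mbit/s" % throughput_mbps(payload, rate, overhead_us=30))
print("Imm-ACK: %.1f Mbit/s" % throughput_mbps(payload, rate, overhead_us=30, ack_us=15, sifs_us=10))
```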
Tzeng, Yan-Kai; Chang, Cheng-Chun; Huang, Chien-Ning; Wu, Chih-Che; Han, Chau-Chung; Chang, Huan-Cheng
2008-09-01
A streamlined protocol has been developed to accelerate, simplify, and enhance matrix-assisted laser desorption/ionization (MALDI) time-of-flight (TOF) mass spectrometry (MS) of neutral underivatized glycans released from glycoproteins. It involved microwave-assisted enzymatic digestion and release of glycans, followed by rapid removal of proteins and peptides with carboxylated/oxidized diamond nanoparticles, and finally treating the analytes with NaOH before mixing them with acidic matrix (such as 2,5-dihydroxybenzoic acid) to suppress the formation of both peptide and potassiated oligosaccharide ions in MS analysis. The advantages of this protocol were demonstrated with MALDI-TOF-MS of N-linked glycans released from ovalbumin and ribonuclease B.
NASA Astrophysics Data System (ADS)
Zdravković, Nemanja; Cvetkovic, Aleksandra; Milic, Dejan; Djordjevic, Goran T.
2017-09-01
This paper analyses end-to-end packet error rate (PER) of a free-space optical decode-and-forward cooperative network over a gamma-gamma atmospheric turbulence channel in the presence of temporary random link blockage. Closed-form analytical expressions for PER are derived for the cases with and without transmission links being prone to blockage. Two cooperation protocols (denoted as 'selfish' and 'pilot-adaptive') are presented and compared, where the latter accounts for the presence of blockage and adapts transmission power. The influence of scintillation, link distance, average transmitted signal power, network topology and probability of an uplink and/or internode link being blocked are discussed when the destination applies equal gain combining. The results show that link blockage caused by obstacles can degrade system performance, causing an unavoidable PER floor. The implementation of the pilot-adaptive protocol improves performance when compared to the selfish protocol, diminishing internode link blockage and lowering the PER floor, especially for larger networks.
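For reference, the gamma-gamma model referred to above describes the normalized irradiance I with the standard probability density below, where α and β are the effective numbers of large- and small-scale turbulence eddies and K_ν is the modified Bessel function of the second kind; the paper's closed-form PER expressions themselves are not reproduced here.

```latex
f_I(I) \;=\; \frac{2\,(\alpha\beta)^{\frac{\alpha+\beta}{2}}}{\Gamma(\alpha)\,\Gamma(\beta)}\;
I^{\frac{\alpha+\beta}{2}-1}\;
K_{\alpha-\beta}\!\left(2\sqrt{\alpha\beta I}\right), \qquad I > 0 .
```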
Simple proof of security of the BB84 quantum key distribution protocol
Shor; Preskill
2000-07-10
We prove that the 1984 protocol of Bennett and Brassard (BB84) for quantum key distribution is secure. We first give a key distribution protocol based on entanglement purification, which can be proven secure using methods from Lo and Chau's proof of security for a similar protocol. We then show that the security of this protocol implies the security of BB84. The entanglement purification based protocol uses Calderbank-Shor-Steane codes, and properties of these codes are used to remove the use of quantum computation from the Lo-Chau protocol.
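A minimal classical simulation of the BB84 sifting step can make the protocol's structure concrete; it deliberately ignores channel noise and eavesdropping, which are precisely what the security proof addresses, and the basis labels and parameters are illustrative only.

```python
import random

def bb84_sift(n_qubits=32, seed=1):
    """Ideal BB84: Alice sends random bits in random bases; Bob measures in random bases;
    they keep only the positions where the bases coincide."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n_qubits)]
    alice_bases = [rng.choice("ZX") for _ in range(n_qubits)]
    bob_bases   = [rng.choice("ZX") for _ in range(n_qubits)]
    # Without noise or an eavesdropper, Bob's outcome equals Alice's bit when bases match
    # and is uniformly random otherwise.
    bob_bits = [b if ab == bb else rng.randint(0, 1)
                for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    return [(a, b) for a, b, ab, bb in zip(alice_bits, bob_bits, alice_bases, bob_bases)
            if ab == bb]

sifted = bb84_sift()
print(len(sifted), "sifted bits, all matching:", all(a == b for a, b in sifted))
```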
Nassar, Ala F; Wisnewski, Adam V; Raddassi, Khadir
2017-03-01
Analysis of multiplexed assays is highly important for clinical diagnostics and other analytical applications. Mass cytometry enables multi-dimensional, single-cell analysis of cell type and state. In mass cytometry, the rare earth metals used as reporters on antibodies allow determination of marker expression in individual cells. Barcode-based bioassays for CyTOF are able to encode and decode for different experimental conditions or samples within the same experiment, facilitating progress in producing straightforward and consistent results. Herein, an integrated protocol for automated sample preparation for barcoding used in conjunction with mass cytometry for clinical bioanalysis samples is described; we offer results of our work with barcoding protocol optimization. In addition, we present some points to be considered in order to minimize the variability of quantitative mass cytometry measurements. For example, we discuss the importance of having multiple populations during titration of the antibodies and effect of storage and shipping of labelled samples on the stability of staining for purposes of CyTOF analysis. Data quality is not affected when labelled samples are stored either frozen or at 4 °C and used within 10 days; we observed that cell loss is greater if cells are washed with deionized water prior to shipment or are shipped in lower concentration. Once the labelled samples for CyTOF are suspended in deionized water, the analysis should be performed expeditiously, preferably within the first hour. Damage can be minimized if the cells are resuspended in phosphate-buffered saline (PBS) rather than deionized water while waiting for data acquisition.
Golestani, Mina; Eshghi, Peyman; Rasekh, Hamid Reza; Cheraghali, Abdoll Majid; Salamzadeh, Jamshid; Naderi, Majid; Managhchi, Mohammad Reza; Hoorfar, Hamid; Toogeh, Gholam Reza; Imani, Ali; Khodayari, Mohammad Taghi; Habibpanah, Behnaz; Hantooshzadeh, Razieh
2016-01-01
Nowadays, bypassing agents such as recombinant activated factor VII (rFVIIa) and activated prothrombin complex concentrates (aPCC) are used to treat bleeding episodes in hemophilia patients with inhibitors. AryoSeven® is an Iranian biogeneric rFVIIa that showed efficacy and characteristics comparable to NovoSeven in a comparative trial. The current clinical trial aimed to evaluate the cost-effectiveness of FEIBA and AryoSeven® using a decision-analytic model from the perspective of the Iranian healthcare system. An open-label, multi-center, cross-over clinical trial was designed. Patients were categorized into 3 groups based on their prior tendency toward one or neither of the products. To determine the superior therapeutic strategy, the incremental cost-effectiveness ratio (ICER) was calculated. Protocol F led to more treatment success in group F than in the other groups (P = 0.03). Also, there was a statistically significant difference between the mean effectiveness scores in the groups using protocol F (P = 0.01). The effectiveness of protocols F and A was 89% and 72%, respectively. The ICER was calculated as the cost in US$ of managing an episode of bleeding to gain one more unit of effectiveness with FEIBA vs. AryoSeven. Although the results showed that AryoSeven was more cost-effective than FEIBA, the two strategies were undominated. In other words, both medicines could be applied in the first line of treatment if the cost of FEIBA were reduced. The present clinical trial was registered at the IRCT website, under ID No. 2013020612380N1.
Measurement Challenges for Carbon Nanotube Material
NASA Technical Reports Server (NTRS)
Sosa, Edward; Arepalli, Sivaram; Nikolaev, Pasha; Gorelik, Olga; Yowell, Leonard
2006-01-01
The advances in large-scale applications of carbon nanotubes demand a reliable supply of raw and processed materials. It is imperative to have consistent quality control of these nanomaterials to distinguish material inconsistency from the modifications induced by processing of nanotubes for any application. NASA Johnson Space Center recognized this need five years ago and started a program to standardize the characterization methods. The JSC team conducted two workshops (2003 and 2005) in collaboration with NIST focusing on purity and dispersion measurement issues of carbon nanotubes [1]. In 2004, the NASA-JSC protocol was developed by combining the analytical techniques of SEM, TEM, UV-VIS-NIR absorption, Raman, and TGA [2]. This protocol is routinely used by several researchers across the world as a first step in characterizing raw and purified carbon nanotubes. A suggested practice guide consisting of detailed chapters on TGA, Raman, electron microscopy and NIR absorption is in the final stages and is undergoing revisions with input from the nanotube community [3]. The possible addition of other techniques such as XPS and ICP to the existing protocol will be presented. Recent activities at ANSI and ISO towards implementing these protocols as nanotube characterization standards will be discussed.
Popping, Bert; Allred, Laura; Bourdichon, François; Brunner, Kurt; Diaz-Amigo, Carmen; Galan-Malo, Patricia; Lacorn, Markus; North, Jennifer; Parisi, Salvatore; Rogers, Adrian; Sealy-Voyksner, Jennifer; Thompson, Tricia; Yeung, Jupiter
2018-01-01
Until recently, analytical tests for food were performed primarily in laboratories, but technical developments now enable consumers to use devices to test their food at home or when dining out. Current consumer devices for food can determine nutritional values, freshness, and, most recently, the presence of food allergens and substances that cause food intolerances. The demand for such products is driven by an increase in the incidence of food allergies, as well as consumer desire for more information about what is in their food. The number and complexity of food matrixes creates an important need for properly validated testing devices with comprehensive user instructions (definitions of technical terms can be found in ISO 5725-1:1994 and the International Vocabulary of Metrology). This is especially important with food allergen determinations that can have life-threatening consequences. Stakeholders-including food regulators, food producers, and food testing kit and equipment manufacturers, as well as representatives from consumer advocacy groups-have worked to outline voluntary guidelines for consumer food allergen- and gluten-testing devices. These guidelines cover areas such as kit validation, user sampling instructions, kit performance, and interpretation of results. The recommendations are based on (1) current known technologies, (2) analytical expertise, and (3) standardized AOAC INTERNATIONAL allergen community guidance and best practices on the analysis of food allergens and gluten. The present guidance document is the first in a series of papers intended to provide general guidelines applicable to consumer devices for all food analytes. Future publications will give specific guidance and validation protocols for devices designed to detect individual allergens and gluten, as statistical analysis and review of any validation data, preferably from an independent third party, are necessary to establish a device's fitness-for-purpose. Following the recommendations of these guidance documents will help ensure that consumers are equipped with sufficient information to make an informed decision based on an analytical result from a consumer device. However, the present guidance document emphasizes that consumer devices should not be used in isolation to make a determination as to whether a food is safe to eat. As advances are made in science and technology, these recommendations will be reevaluated and revised as appropriate.
Smurthwaite, Cameron A; Hilton, Brett J; O'Hanlon, Ryan; Stolp, Zachary D; Hancock, Bryan M; Abbadessa, Darin; Stotland, Aleksandr; Sklar, Larry A; Wolkowicz, Roland
2014-01-01
The discovery of the green fluorescent protein from Aequorea victoria has revolutionized the field of cell and molecular biology. Since its discovery, a growing panel of fluorescent proteins, fluorophores and fluorescent-coupled staining methodologies has expanded the analytical capabilities of flow cytometry. Here, we exploit the power of genetic engineering to barcode individual cells with genes encoding fluorescent proteins. For genetic engineering, we utilize retroviral technology, which allows for the expression of ectopic genetic information in a stable manner in mammalian cells. We have genetically barcoded both adherent and nonadherent cells with different fluorescent proteins. Multiplexing power was increased by combining both the number of distinct fluorescent proteins and the fluorescence intensity in each channel. Moreover, retroviral expression has proven to be stable for at least a 6-month period, which is critical for applications such as biological screens. We have shown the applicability of fluorescent barcoded multiplexing to cell-based assays that rely themselves on genetic barcoding, or on classical staining protocols. Fluorescent genetic barcoding gives the cell an inherited characteristic that distinguishes it from its counterpart. Once cell lines are developed, no further manipulation or staining is required, decreasing time, nonspecific background associated with staining protocols, and cost. The increasing number of discovered and/or engineered fluorescent proteins with unique absorbance/emission spectra, combined with the growing number of detection devices and lasers, increases multiplexing versatility, making fluorescent genetic barcoding a powerful tool for flow cytometry-based analysis. © 2013 International Society for Advancement of Cytometry.
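As an illustration of the multiplexing arithmetic described above, the following minimal Python sketch counts the distinguishable barcodes obtainable when each fluorescent-protein channel can be resolved into a small number of intensity states; the number of resolvable levels per channel is an assumption and depends on the instrument and cell line.

```python
# Sketch: multiplexing capacity of fluorescent genetic barcoding.
# Assumption: each fluorescent-protein channel is resolvable into a few
# discrete states (e.g. off / low / high); the exact count is not specified
# in the abstract and is chosen here for illustration only.

def barcode_capacity(n_proteins: int, levels_per_channel: int) -> int:
    """Number of distinguishable barcodes, excluding the all-off state."""
    return levels_per_channel ** n_proteins - 1

if __name__ == "__main__":
    # e.g. 4 fluorescent proteins, 3 resolvable intensity levels each
    print(barcode_capacity(4, 3))  # 80 distinguishable populations
```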
On the Security of a Simple Three-Party Key Exchange Protocol without Server's Public Keys
Nam, Junghyun; Choo, Kim-Kwang Raymond; Park, Minkyu; Paik, Juryon; Won, Dongho
2014-01-01
Authenticated key exchange protocols are of fundamental importance in securing communications and are now extensively deployed for use in various real-world network applications. In this work, we reveal major previously unpublished security vulnerabilities in the password-based authenticated three-party key exchange protocol of Lee and Hwang (2010): (1) the Lee-Hwang protocol is susceptible to a man-in-the-middle attack and thus fails to achieve implicit key authentication; (2) the protocol cannot protect clients' passwords against an offline dictionary attack; and (3) the indistinguishability-based security of the protocol can be easily broken even in the presence of a passive adversary. We also propose an improved password-based authenticated three-party key exchange protocol that addresses the security vulnerabilities identified in the Lee-Hwang protocol. PMID:25258723
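To make the offline dictionary attack concrete, here is a generic Python sketch of the attack class, not the Lee-Hwang protocol itself: if any recorded protocol message is a deterministic, verifiable function of the password, an eavesdropper can test candidate passwords against the transcript without further interaction. The hash construction and wordlist below are purely illustrative assumptions.

```python
import hashlib

def offline_dictionary_attack(transcript_value: bytes, salt: bytes, wordlist):
    """Illustration only: assumes some captured message equals
    SHA-256(salt || password); the adversary tests candidates offline."""
    for candidate in wordlist:
        guess = hashlib.sha256(salt + candidate.encode()).digest()
        if guess == transcript_value:
            return candidate
    return None

salt = b"session-nonce"
recorded = hashlib.sha256(salt + b"hunter2").digest()  # eavesdropped value
print(offline_dictionary_attack(recorded, salt, ["123456", "password", "hunter2"]))
```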
Ichihara, Kiyoshi; Ceriotti, Ferruccio; Tam, Tran Huu; Sueyoshi, Shigeo; Poon, Priscilla M K; Thong, Mee Ling; Higashiuesato, Yasushi; Wang, Xuejing; Kataoka, Hiromi; Matsubara, Akemi; Shiesh, Shu-Chu; Muliaty, Dewi; Kim, Jeong-Ho; Watanabe, Masakazu; Lam, Christopher W K; Siekmann, Lothar; Lopez, Joseph B; Panteghini, Mauro
2013-07-01
A multicenter study conducted in Southeast Asia to derive reference intervals (RIs) for 72 commonly measured analytes (general chemistry, inflammatory markers, hormones, etc.) featured centralized measurement to clearly detect regionality in test results. The results of 31 standardized analytes are reported, with the remaining analytes presented in the next report. The study included 63 clinical laboratories from South Korea, China, Vietnam, Malaysia, Indonesia, and seven areas in Japan. A total of 3541 healthy individuals aged 20-65 years (Japan 2082, others 1459) were recruited mostly from hospital workers using a well-defined common protocol. All serum specimens were transported to Tokyo at -80°C and collectively measured using reagents from four manufacturers. Three-level nested ANOVA was used to quantitate variation (SD) of test results due to region, sex, and age. A ratio of SD for a given factor over residual SD (representing net between-individual variations) (SDR) exceeding 0.3 was considered significant. Traceability of RIs was ensured by recalibration using value-assigned reference materials. RIs were derived parametrically. SDRs for sex and age were significant for 19 and 16 analytes, respectively. Regional difference was significant for 11 analytes, including high density lipoprotein (HDL)-cholesterol and inflammatory markers. However, when the data were limited to those from Japan, regionality was not observed in any of the analytes. Accordingly, RIs were derived with or without partition by sex and region. RIs applicable to a wide area in Asia were established for the majority of analytes with traceability to reference measuring systems, whereas regional partitioning was required for RIs of the other analytes.
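The partitioning rule and the parametric derivation described above can be sketched in a few lines of Python; the variance components are assumed to have already been estimated by the three-level nested ANOVA, and the Gaussian interval below omits any transformation step the study may have applied.

```python
def sdr(sd_factor: float, sd_residual: float) -> float:
    """Ratio of the SD attributable to a factor (region, sex or age) to the
    residual (net between-individual) SD; values > 0.3 were treated as
    significant in the study."""
    return sd_factor / sd_residual

def parametric_ri(mean: float, sd: float):
    """Central 95% reference interval under a simple Gaussian assumption."""
    return mean - 1.96 * sd, mean + 1.96 * sd

# Hypothetical variance components for one analyte
if sdr(sd_factor=0.35, sd_residual=1.0) > 0.3:
    print("partition the reference interval by this factor")
print(parametric_ri(mean=5.2, sd=0.6))
```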
A secure RFID-based WBAN for healthcare applications.
Ullah, Sana; Alamri, Atif
2013-10-01
A Wireless Body Area Network (WBAN) allows the seamless integration of small and intelligent invasive or non-invasive sensor nodes in, on or around a human body for continuous health monitoring. These nodes are expected to use different power-efficient protocols in order to extend the WBAN lifetime. This paper highlights the power consumption and security issues of WBAN for healthcare applications. Numerous power saving mechanisms are discussed and a secure RFID-based protocol for WBAN is proposed. The performance of the proposed protocol is analyzed and compared with that of IEEE 802.15.6-based CSMA/CA and preamble-based TDMA protocols using extensive simulations. It is shown that the proposed protocol is power-efficient and protects patients' data from adversaries. It is less vulnerable to different attacks compared to that of IEEE 802.15.6-based CSMA/CA and preamble-based TDMA protocols. For a low traffic load and a single alkaline battery of capacity 2.6 Ah, the proposed protocol could extend the WBAN lifetime, when deployed on patients in hospitals or at homes, to approximately five years.
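A back-of-the-envelope Python sketch of the lifetime figure quoted above: the average current draw used here is an assumption chosen to reproduce roughly five years from a 2.6 Ah cell, and the model ignores self-discharge and voltage effects.

```python
def lifetime_years(capacity_ah: float, avg_current_ma: float) -> float:
    """Idealised node lifetime: battery capacity divided by average draw."""
    hours = capacity_ah / (avg_current_ma / 1000.0)
    return hours / (24 * 365)

# 2.6 Ah alkaline cell; ~0.06 mA average draw under low traffic (assumed)
print(round(lifetime_years(2.6, 0.06), 1))  # about 4.9 years
```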
O'Neill-Kerr, Alex; Yassin, Anhar; Rogers, Stephen; Cornish, Janie
2017-09-01
The aim of this study was to test the proposition that adoption of a dose titration protocol may be associated with better patient outcomes, at lower treatment dose, and with comparable cumulative dose to that in patients treated using an age-based stimulus dosing protocol. This was an analysis of data assembled from archived records and based on cohorts of patients treated respectively on an age-based stimulus dosing protocol and on a dose titration protocol in the National Health Service in England. We demonstrated a significantly better response in the patient cohort treated with dose titration than with age-based stimulus dosing. Peak doses were less and the total cumulative dose was less in the dose titration group than in the age-based stimulus dosing group. Our findings are consistent with superior outcomes in patients treated using a dose titration protocol when compared with age-based stimulus dosing in a similar cohort of patients.
Neuroimaging in psychiatric pharmacogenetics research: the promise and pitfalls.
Falcone, Mary; Smith, Ryan M; Chenoweth, Meghan J; Bhattacharjee, Abesh Kumar; Kelsoe, John R; Tyndale, Rachel F; Lerman, Caryn
2013-11-01
The integration of research on neuroimaging and pharmacogenetics holds promise for improving treatment for neuropsychiatric conditions. Neuroimaging may provide a more sensitive early measure of treatment response in genetically defined patient groups, and could facilitate development of novel therapies based on an improved understanding of pathogenic mechanisms underlying pharmacogenetic associations. This review summarizes progress in efforts to incorporate neuroimaging into genetics and treatment research on major psychiatric disorders, such as schizophrenia, major depressive disorder, bipolar disorder, attention-deficit/hyperactivity disorder, and addiction. Methodological challenges include: performing genetic analyses in small study populations used in imaging studies; inclusion of patients with psychiatric comorbidities; and the extensive variability across studies in neuroimaging protocols, neurobehavioral task probes, and analytic strategies. Moreover, few studies use pharmacogenetic designs that permit testing of genotype × drug effects. As a result of these limitations, few findings have been fully replicated. Future studies that pre-screen participants for genetic variants selected a priori based on drug metabolism and targets have the greatest potential to advance the science and practice of psychiatric treatment.
Ki67 and proliferation in breast cancer.
Pathmanathan, Nirmala; Balleine, Rosemary L
2013-06-01
New approaches to the prognostic assessment of breast cancer have come from molecular profiling studies. A major feature of this work has been to emphasise the importance of cancer cell proliferation as a key discriminative indicator of recurrence risk for oestrogen receptor-positive breast cancer in particular. Mitotic count scoring, as a component of histopathological grade, has long formed part of a routine evaluation of breast cancer biology. However, there is an increasingly compelling case to include a specific proliferation score in breast cancer pathology reports based on expression of the cell cycle-regulated protein Ki67. Immunohistochemical staining for Ki67 is a widely available and economical test with good tolerance of pre-analytical variations and staining conditions. However, there is currently no evidence-based protocol established to derive a reliable and informative Ki67 score for routine clinical use. In this circumstance, pathologists must establish a standardised framework for scoring Ki67 and communicating results to a multidisciplinary team.
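The underlying arithmetic of a Ki67 labelling index is simple, as the Python sketch below shows; everything that makes scoring hard in practice (which fields to count, how many cells, hot-spot versus average scoring) is exactly what the abstract notes is not yet standardised, so the counts here are hypothetical.

```python
def ki67_index(positive_cells: int, total_cells: int) -> float:
    """Ki67 labelling index: percentage of tumour cells with nuclear staining."""
    if total_cells == 0:
        raise ValueError("no cells counted")
    return 100.0 * positive_cells / total_cells

print(ki67_index(positive_cells=212, total_cells=1000))  # 21.2 %
```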
Development of an evaporation-based microfluidic sample concentrator
NASA Astrophysics Data System (ADS)
Sharma, Nigel R.; Lukyanov, Anatoly; Bardell, Ron L.; Seifried, Lynn; Shen, Mingchao
2008-02-01
MicroPlumbers Microsciences LLC has developed a relatively simple concentrator device based on isothermal evaporation. The device allows for rapid concentration of dissolved or dispersed substances or microorganisms (e.g. bacteria, viruses, proteins, toxins, enzymes, antibodies, etc.) under conditions gentle enough to preserve their specific activity or viability. It is capable of removing 0.8 ml of water per minute at 37°C, and has dimensions compatible with typical microfluidic devices. The concentrator can be used as a stand-alone device or integrated into various processes and analytical instruments, substantially increasing their sensitivity while decreasing processing time. The evaporative concentrator can find applications in many areas such as biothreat detection, environmental monitoring, forensic medicine, pathogen analysis, and agricultural and industrial monitoring. In our presentation, we describe the design, fabrication, and testing of the concentrator. We discuss multiphysics simulations of the heat and mass transport in the device that we used to select the design of the concentrator and the protocol of performance testing. We present the results of experiments evaluating water removal performance.
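A minimal Python sketch of the concentration factor implied by the stated 0.8 mL/min water removal rate, assuming the analyte itself is non-volatile and is not lost to the device walls; the starting volume and run time are hypothetical.

```python
def concentration_factor(v0_ml: float, rate_ml_min: float, t_min: float) -> float:
    """Fold-concentration after evaporating water at a constant rate from an
    initial volume v0_ml (analyte assumed to remain entirely in the liquid)."""
    v = v0_ml - rate_ml_min * t_min
    if v <= 0:
        raise ValueError("sample would evaporate to dryness")
    return v0_ml / v

# 10 mL sample, 0.8 mL/min removal (from the abstract), 10 minutes
print(round(concentration_factor(10.0, 0.8, 10.0), 1))  # 5.0x
```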
Rowlinson, Steve; Jia, Yunyan Andrea
2014-04-01
Existing heat stress risk management guidelines recommended by international standards are not practical for the construction industry, which needs site supervision staff to make instant managerial decisions to mitigate heat risks. The ability of the predicted heat strain (PHS) model [ISO 7933 (2004). Ergonomics of the thermal environment - analytical determination and interpretation of heat stress using calculation of the predicted heat strain. Geneva: International Organization for Standardization] to predict the maximum allowable exposure time (Dlim) has now enabled development of localized, action-triggering and threshold-based guidelines for implementation by lay frontline staff on construction sites. This article presents a protocol for development of two heat stress management tools by applying the PHS model to its full potential. One of the tools is developed to facilitate managerial decisions on an optimized work-rest regimen for paced work. The other tool is developed to enable workers' self-regulation during self-paced work.
Quantitative electrochemical metalloimmunoassay for TFF3 in urine using a paper analytical device.
DeGregory, Paul R; Tsai, Yi-Ju; Scida, Karen; Richards, Ian; Crooks, Richard M
2016-03-07
We report a paper-based assay platform for the detection of the kidney disease marker Trefoil Factor 3 (TFF3) in human urine. The sensor is based on a quantitative metalloimmunoassay that can determine TFF3 concentrations via electrochemical detection of environmentally stable silver nanoparticle (AgNP) labels attached to magnetic microbeads via a TFF3 immunosandwich. The paper electroanalytical device incorporates two preconcentration steps that make it possible to detect concentrations of TFF3 in human urine at the low end of the target TFF3 concentration range (0.03-7.0 μg/mL). Importantly, the paper device provides a level of accuracy for TFF3 determination in human urine equivalent to that of a commercial kit. The paper sensor has a dynamic range of ∼2.5 orders of magnitude, requires only a simple, one-step incubation protocol, and is fast, taking 10 min to complete. The cost of the materials at the prototypic laboratory scale, excluding reagents, is just US$0.42.
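One simple way to exploit a roughly 2.5-decade dynamic range is a log-log calibration; the Python sketch below fits such a curve and inverts it to report a concentration. The signal values and units are entirely hypothetical and are not the paper's calibration data.

```python
import numpy as np

# Hypothetical calibration of the AgNP stripping signal against TFF3
# concentration (µg/mL); the signal values below are made up for illustration.
conc = np.array([0.03, 0.1, 0.3, 1.0, 3.0, 7.0])        # µg/mL standards
signal = np.array([2.1, 6.8, 19.5, 61.0, 175.0, 390.0])  # arbitrary units

slope, intercept = np.polyfit(np.log10(conc), np.log10(signal), 1)

def tff3_concentration(sample_signal: float) -> float:
    """Invert the log-log calibration to estimate TFF3 in µg/mL."""
    return 10 ** ((np.log10(sample_signal) - intercept) / slope)

print(round(tff3_concentration(50.0), 2))
```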
Marti, Guillaume; Boccard, Julien; Mehl, Florence; Debrus, Benjamin; Marcourt, Laurence; Merle, Philippe; Delort, Estelle; Baroux, Lucie; Sommer, Horst; Rudaz, Serge; Wolfender, Jean-Luc
2014-05-01
The detailed characterization of cold-pressed lemon oils (CPLOs) is of great importance for the flavor and fragrance (F&F) industry. Since a control of authenticity by standard analytical techniques can be bypassed using elaborated adulterated oils to pretend a higher quality, a combination of advanced orthogonal methods has been developed. The present study describes a combined metabolomic approach based on UHPLC-TOF-MS profiling and (1)H NMR fingerprinting to highlight metabolite differences on a set of representative samples used in the F&F industry. A new protocol was set up and adapted to the use of CPLO residues. Multivariate analysis based on both fingerprinting methods showed significant chemical variations between Argentinian and Italian samples. Discriminating markers identified in mixtures belong to furocoumarins, flavonoids, terpenoids and fatty acids. Quantitative NMR revealed low citropten and high bergamottin content in Italian samples. The developed metabolomic approach applied to CPLO residues gives some new perspectives for authenticity assessment. Copyright © 2013 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
McInnes, B.; Danišík, M.; Evans, N.; McDonald, B.; Becker, T.; Vermeesch, P.
2015-12-01
We present a new laser-based technique for rapid, quantitative and automated in situ microanalysis of U, Th, Sm, Pb and He for applications in geochronology, thermochronometry and geochemistry (Evans et al., 2015). This novel capability permits a detailed interrogation of the time-temperature history of rocks containing apatite, zircon and other accessory phases by providing both (U-Th-Sm)/He and U-Pb ages (+trace element analysis) on single crystals. In situ laser microanalysis offers several advantages over conventional bulk crystal methods in terms of safety, cost, productivity and spatial resolution. We developed and integrated a suite of analytical instruments including a 193 nm ArF excimer laser system (RESOlution M-50A-LR), a quadrupole ICP-MS (Agilent 7700s), an Alphachron helium mass spectrometry system and swappable flow-through and ultra-high vacuum analytical chambers. The analytical protocols include the following steps: mounting/polishing in PFA Teflon using methods similar to those adopted for fission track etching; laser He extraction and analysis using a 2 s ablation at 5 Hz and 2-3 J/cm2 fluence; He pit volume measurement using atomic force microscopy; and U-Th-Sm-Pb (plus optional trace element) analysis using traditional laser ablation methods. The major analytical challenges for apatite include the low U, Th and He contents relative to zircon and the elevated common Pb content. On the other hand, apatite typically has less extreme and less complex zoning of parent isotopes (primarily U and Th). A freeware application has been developed for determining (U-Th-Sm)/He ages from the raw analytical data and Iolite software was used for U-Pb age and trace element determination. In situ double-dating has successfully replicated conventional U-Pb and (U-Th)/He age variations in xenocrystic zircon from the diamondiferous Ellendale lamproite pipe, Western Australia, and increased zircon analytical throughput by a factor of 50 over conventional methods. Reference: Evans NJ, McInnes BIA, McDonald B, Becker T, Vermeesch P, Danisik M, Shelley M, Marillo-Sialer E and Patterson D. An in situ technique for (U-Th-Sm)/He and U-Pb double dating. J Analytical Atomic Spectrometry, 30, 1636-1645.
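For readers unfamiliar with the (U-Th-Sm)/He method, the age comes from solving the standard helium ingrowth equation for t. The Python sketch below does this by bisection with published decay constants; it is only an illustration of the arithmetic, not the authors' freeware, and it omits alpha-ejection (FT) corrections and assumes parent and daughter amounts are in consistent units.

```python
import math

# Decay constants (yr^-1) for 238U, 235U, 232Th and 147Sm
L238, L235, L232, L147 = 1.55125e-10, 9.8485e-10, 4.9475e-11, 6.54e-12

def he_produced(t, u238, u235, th232, sm147):
    """Radiogenic 4He produced after time t (same units as parent amounts)."""
    return (8 * u238 * (math.exp(L238 * t) - 1)
            + 7 * u235 * (math.exp(L235 * t) - 1)
            + 6 * th232 * (math.exp(L232 * t) - 1)
            + 1 * sm147 * (math.exp(L147 * t) - 1))

def he_age(he, u238, u235, th232, sm147, t_max=4.6e9):
    """Solve the ingrowth equation for t (years) by bisection."""
    lo, hi = 0.0, t_max
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if he_produced(mid, u238, u235, th232, sm147) < he:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical grain: parent/daughter amounts in arbitrary but consistent units
age_ma = he_age(he=1.0e-7, u238=2.0e-6, u235=2.0e-6 / 137.818,
                th232=4.0e-6, sm147=1.0e-5) / 1e6
print(round(age_ma, 1), "Ma")
```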
Performance Modeling of Network-Attached Storage Device Based Hierarchical Mass Storage Systems
NASA Technical Reports Server (NTRS)
Menasce, Daniel A.; Pentakalos, Odysseas I.
1995-01-01
Network attached storage devices improve I/O performance by separating control and data paths and eliminating host intervention during the data transfer phase. Devices are attached both to a high speed network for data transfer and to a slower network for control messages. Hierarchical mass storage systems use disks to cache the most recently used files and a combination of robotic and manually mounted tapes to store the bulk of the files in the file system. This paper shows how queuing network models can be used to assess the performance of hierarchical mass storage systems that use network attached storage devices as opposed to host attached storage devices. Simulation was used to validate the model. The analytic model presented here can be used, among other things, to evaluate the protocols involved in I/O over network attached devices.
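To convey the flavour of such queuing models, here is a toy Python sketch that treats the disk cache and the tape subsystem as independent M/M/1 servers in an open network: every request visits the disk cache, and misses additionally visit tape. The workload parameters are hypothetical and the model is far simpler than the one in the paper.

```python
def mm1_response(service_time: float, arrival_rate: float) -> float:
    """M/M/1 mean response time; requires utilisation < 1."""
    util = arrival_rate * service_time
    if util >= 1.0:
        raise ValueError("server saturated")
    return service_time / (1.0 - util)

def hsm_response(arrival_rate, hit_ratio, disk_service, tape_service):
    """Mean file-retrieval time for a two-level hierarchy: all requests visit
    the disk cache; cache misses additionally visit the tape subsystem."""
    r_disk = mm1_response(disk_service, arrival_rate)
    r_tape = mm1_response(tape_service, arrival_rate * (1.0 - hit_ratio))
    return r_disk + (1.0 - hit_ratio) * r_tape

# Hypothetical workload: 0.05 requests/s, 80% cache hits,
# 2 s disk service, 60 s robotic-tape mount + read service
print(round(hsm_response(0.05, 0.8, 2.0, 60.0), 1), "s")
```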
Adiabatic topological quantum computing
Cesare, Chris; Landahl, Andrew J.; Bacon, Dave; ...
2015-07-31
Topological quantum computing promises error-resistant quantum computation without active error correction. However, there is a worry that during the process of executing quantum gates by braiding anyons around each other, extra anyonic excitations will be created that will disorder the encoded quantum information. Here, we explore this question in detail by studying adiabatic code deformations on Hamiltonians based on topological codes, notably Kitaev's surface codes and the more recently discovered color codes. We develop protocols that enable universal quantum computing by adiabatic evolution in a way that keeps the energy gap of the system constant with respect to the computation size and introduces only simple local Hamiltonian interactions. This allows one to perform holonomic quantum computing with these topological quantum computing systems. The tools we develop allow one to go beyond numerical simulations and understand these processes analytically.
Barnett, G O; Famiglietti, K T; Kim, R J; Hoffer, E P; Feldman, M J
1998-01-01
DXplain, a computer-based medical education, reference and decision support system, has been used by thousands of physicians and medical students on stand-alone systems and over communications networks. For the past two years, we have made DXplain available over the Internet in order to provide DXplain's knowledge and analytical capabilities as a resource to other applications within Massachusetts General Hospital (MGH) and at outside institutions. We describe two different protocols through which users can access DXplain through the World Wide Web (WWW) and report the user experience with each. The first allows the user to interact directly with all the functionality of DXplain, where the MGH server controls the interaction and the mode of presentation. In the second mode, the MGH server provides the DXplain functionality as a series of services, which can be called independently by the user application program.
Multicasting in Wireless Communications (Ad-Hoc Networks): Comparison against a Tree-Based Approach
NASA Astrophysics Data System (ADS)
Rizos, G. E.; Vasiliadis, D. C.
2007-12-01
We examine on-demand multicasting in ad hoc networks. The Core Assisted Mesh Protocol (CAMP) is a well-known protocol for multicast routing in ad-hoc networks, generalizing the notion of core-based trees employed for internet multicasting into multicast meshes that have much richer connectivity than trees. On the other hand, wireless tree-based multicast routing protocols use much simpler structures for determining route paths, using only parent-child relationships. In this work, we compare the performance of the CAMP protocol against the performance of wireless tree-based multicast routing protocols, in terms of two important factors, namely packet delay and ratio of dropped packets.
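The two comparison metrics named above, packet delay and the ratio of dropped packets, are typically computed from simulation traces; the Python sketch below shows one straightforward way to do this, with a toy trace format assumed purely for illustration.

```python
def multicast_metrics(events):
    """events: iterable of (pkt_id, send_time, recv_time or None for a drop).
    Returns (dropped_packet_ratio, mean_delay_of_delivered_packets)."""
    sent = delivered = 0
    total_delay = 0.0
    for _, t_send, t_recv in events:
        sent += 1
        if t_recv is not None:
            delivered += 1
            total_delay += t_recv - t_send
    drop_ratio = 1.0 - delivered / sent if sent else 0.0
    mean_delay = total_delay / delivered if delivered else float("nan")
    return drop_ratio, mean_delay

trace = [(1, 0.00, 0.12), (2, 0.10, None), (3, 0.20, 0.31)]  # toy receiver trace
print(multicast_metrics(trace))
```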
Mars Sample Handling Protocol Workshop Series: Workshop 4
NASA Technical Reports Server (NTRS)
Race Margaret S. (Editor); DeVincenzi, Donald L. (Editor); Rummel, John D. (Editor); Acevedo, Sara E. (Editor)
2001-01-01
In preparation for missions to Mars that will involve the return of samples to Earth, it will be necessary to prepare for the receiving, handling, testing, distributing, and archiving of martian materials here on Earth. Previous groups and committees have studied selected aspects of sample return activities, but specific detailed protocols for the handling and testing of returned samples must still be developed. To further refine the requirements for sample hazard testing and to develop the criteria for subsequent release of sample materials from quarantine, the NASA Planetary Protection Officer convened a series of workshops in 2000-2001. The overall objective of the Workshop Series was to produce a Draft Protocol by which returned martian sample materials can be assessed for biological hazards and examined for evidence of life (extant or extinct) while safeguarding the purity of the samples from possible terrestrial contamination. This report also provides a record of the proceedings of Workshop 4, the final Workshop of the Series, which was held in Arlington, Virginia, June 5-7, 2001. During Workshop 4, the sub-groups were provided with a draft of the protocol compiled in May 2001 from the work done at prior Workshops in the Series. Then eight sub-groups were formed to discuss the following assigned topics: (1) Review and Assess the Draft Protocol for Physical/Chemical Testing; (2) Review and Assess the Draft Protocol for Life Detection Testing; (3) Review and Assess the Draft Protocol for Biohazard Testing; (4) Environmental and Health/Monitoring and Safety Issues; (5) Requirements of the Draft Protocol for Facilities and Equipment; (6) Contingency Planning for Different Outcomes of the Draft Protocol; (7) Personnel Management Considerations in Implementation of the Draft Protocol; (8) Draft Protocol Implementation Process and Update Concepts. This report provides the first complete presentation of the Draft Protocol for Mars Sample Handling to meet planetary protection needs. This Draft Protocol, which was compiled from deliberations and recommendations from earlier Workshops in the Series, represents a consensus that emerged from the discussions of all the sub-groups assembled over the course of the five Workshops of the Series. These discussions converged on a conceptual approach to sample handling, as well as on specific analytical requirements. Discussions also identified important issues requiring attention, as well as research and development needed for protocol implementation.
A Routine Experimental Protocol for qHNMR Illustrated with Taxol
Pauli, Guido F.; Jaki, Birgit U.; Lankin, David C.
2012-01-01
Quantitative 1H NMR (qHNMR) provides a value-added dimension to the standard spectroscopic data set involved in structure analysis, especially when analyzing bioactive molecules and elucidating new natural products. The qHNMR method can be integrated into any routine qualitative workflow without much additional effort by simply establishing quantitative conditions for the standard solution 1H NMR experiments. Moreover, examination of different chemical lots of taxol and a Taxus brevifolia extract as working examples led to a blueprint for a generic approach to performing a routinely practiced 13C-decoupled qHNMR experiment, and for recognizing its potential and main limitations. The proposed protocol is based on a newly assembled 13C GARP broadband decoupled proton acquisition sequence that reduces spectroscopic complexity by removal of carbon satellites. The method is capable of providing qualitative and quantitative NMR data simultaneously and covers various analytes from pure compounds to complex mixtures such as metabolomes. Due to a routinely achievable dynamic range of 300:1 (0.3%) or better, qHNMR qualifies for applications ranging from reference standards to biologically active compounds to metabolome analysis. Providing a “cookbook” approach to qHNMR, acquisition conditions are described that can be adapted for contemporary NMR spectrometers of all major manufacturers. PMID:17298095
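For orientation, the quantitative step in internal-standard qHNMR reduces to a single relation between integrals, proton counts, molar masses and weighed masses. The Python sketch below implements that standard relation; the numerical values are hypothetical and are not taken from the paper's taxol lots.

```python
def qhnmr_purity(i_analyte, i_std, n_analyte, n_std,
                 mw_analyte, mw_std, m_analyte_mg, m_std_mg, purity_std):
    """Analyte purity (w/w fraction) from the standard internal-calibrant
    relation: integrals (i), proton counts (n) of the integrated signals,
    molar masses (mw) and weighed masses (m) of analyte and calibrant."""
    return ((i_analyte / i_std) * (n_std / n_analyte)
            * (mw_analyte / mw_std) * (m_std_mg / m_analyte_mg) * purity_std)

# Hypothetical taxol check against a certified internal calibrant
print(round(100 * qhnmr_purity(i_analyte=1.00, i_std=1.05, n_analyte=1, n_std=1,
                               mw_analyte=853.9, mw_std=204.2,
                               m_analyte_mg=10.3, m_std_mg=2.5,
                               purity_std=0.999), 1), "% w/w")
```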
Presson, Nora; Krishnaswamy, Deepa; Wagener, Lauren; Bird, William; Jarbo, Kevin; Pathak, Sudhir; Puccio, Ava M; Borasso, Allison; Benso, Steven; Okonkwo, David O; Schneider, Walter
2015-03-01
There is an urgent, unmet demand for definitive biological diagnosis of traumatic brain injury (TBI) to pinpoint the location and extent of damage. We have developed High-Definition Fiber Tracking, a 3 T magnetic resonance imaging-based diffusion spectrum imaging and tractography analysis protocol, to quantify axonal injury in military and civilian TBI patients. A novel analytical methodology quantified white matter integrity in patients with TBI and healthy controls. Forty-one subjects (23 TBI, 18 controls) were scanned with the High-Definition Fiber Tracking diffusion spectrum imaging protocol. After reconstruction, segmentation was used to isolate bilateral hemisphere homologues of eight major tracts. Integrity of segmented tracts was estimated by calculating homologue correlation and tract coverage. Both groups showed high correlations for all tracts. TBI patients showed reduced homologue correlation and tract spread and an increased outlier count (correlations > 2.32 SD below the control mean). On average, 6.5% of tracts in the TBI group were outliers, with substantial variability among patients. The number and summed deviation of outlying tracts correlated with the initial Glasgow Coma Scale score and the 6-month Glasgow Outcome Scale-Extended score. The correlation metric used here can detect heterogeneous damage affecting a low proportion of tracts, presenting a potential mechanism for advancing TBI diagnosis. Reprint & Copyright © 2015 Association of Military Surgeons of the U.S.
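The outlier rule quoted above (a tract correlation more than 2.32 SD below the control mean) can be sketched directly in Python; the correlation values below are hypothetical and serve only to show the thresholding step.

```python
import statistics

def tract_outliers(patient_corrs, control_corrs, z_cut=2.32):
    """Indices of tracts whose homologue correlation falls more than z_cut
    standard deviations below the control-group mean."""
    mu = statistics.mean(control_corrs)
    sd = statistics.stdev(control_corrs)
    threshold = mu - z_cut * sd
    return [i for i, c in enumerate(patient_corrs) if c < threshold]

controls = [0.92, 0.94, 0.91, 0.95, 0.93, 0.92]  # hypothetical control values
patient = [0.90, 0.75, 0.93, 0.88, 0.60, 0.91]   # hypothetical patient values
print(tract_outliers(patient, controls))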
An energy-efficient rate adaptive media access protocol (RA-MAC) for long-lived sensor networks.
Hu, Wen; Chen, Quanjun; Corke, Peter; O'Rourke, Damien
2010-01-01
We introduce an energy-efficient Rate Adaptive Media Access Control (RA-MAC) algorithm for long-lived Wireless Sensor Networks (WSNs). Previous research shows that the dynamic and lossy nature of wireless communications is one of the major challenges to reliable data delivery in WSNs. RA-MAC achieves high link reliability in such situations by dynamically trading off data rate for channel gain. The extra gain that can be achieved reduces the packet loss rate, which contributes to reduced energy expenditure through a reduced number of retransmissions. We achieve this at the expense of raw bit rate, which generally far exceeds the application's link requirement. To minimize communication energy consumption, RA-MAC selects the optimal data rate based on the estimated link quality at each data rate and an analytical model of the energy consumption. Our model shows how the selected data rate depends on different channel conditions in order to minimize energy consumption. We have implemented RA-MAC in TinyOS for an off-the-shelf sensor platform (the TinyNode) on top of a state-of-the-art WSN Media Access Control protocol, SCP-MAC, and evaluated its performance by comparing our implementation with the original SCP-MAC using both simulation and experiment.
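The core rate-selection idea, picking the data rate that minimises expected transmit energy per successfully delivered packet given the estimated link quality at each rate, can be sketched as below. This is not RA-MAC's actual energy model; the radio parameters and packet-reception ratios are assumptions for illustration.

```python
def expected_energy_per_packet(packet_bits, rate_bps, tx_power_mw, prr):
    """Expected radio energy (mJ) to deliver one packet, counting an average
    of 1/PRR transmission attempts (simplified: ignores listening and ACKs)."""
    if prr <= 0.0:
        return float("inf")
    time_on_air_s = packet_bits / rate_bps
    return (tx_power_mw * time_on_air_s) / prr

def select_rate(packet_bits, tx_power_mw, prr_by_rate):
    """prr_by_rate: dict {rate_bps: estimated packet reception ratio}."""
    return min(prr_by_rate,
               key=lambda r: expected_energy_per_packet(
                   packet_bits, r, tx_power_mw, prr_by_rate[r]))

# Hypothetical link estimates: lower rates gain margin but cost airtime
prr = {250_000: 0.40, 125_000: 0.85, 62_500: 0.97}
print(select_rate(packet_bits=1024, tx_power_mw=60.0, prr_by_rate=prr))  # 125000
```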
Gruber, Aurélia; Pacault, Mathilde; El Khattabi, Laila Allach; Vaucouleur, Nicolas; Orhant, Lucie; Bienvenu, Thierry; Girodon, Emmanuelle; Vidaud, Dominique; Leturcq, France; Costa, Catherine; Letourneur, Franck; Anselem, Olivia; Tsatsaris, Vassilis; Goffinet, François; Viot, Géraldine; Vidaud, Michel; Nectoux, Juliette
2018-04-25
To limit the risk of miscarriage associated with invasive procedures in current prenatal diagnosis practice, we aim to develop a personalized medicine-based protocol for non-invasive prenatal diagnosis (NIPD) of monogenic disorders relying on the detection of paternally inherited mutations in maternal blood using droplet digital PCR (ddPCR). This study included four couples at risk of transmitting paternal neurofibromatosis type 1 (NF1) mutations and four couples at risk of transmitting compound heterozygous CFTR mutations. NIPD was performed between 8 and 15 weeks of gestation, in parallel with conventional invasive diagnosis. We designed specific hydrolysis probes to detect the paternal mutation and to assess the presence of cell-free fetal DNA by ddPCR. The analytical performance of each assay was determined from a paternal sample, and the fetal genotype was then inferred from the maternal plasma sample. The presence or absence of the paternal mutant allele was correctly determined in all the studied plasma DNA samples. We report an NIPD protocol suitable for implementation in an experienced molecular genetics laboratory. Our proof-of-principle results point to high accuracy for early detection of paternal NF1 and CFTR mutations in cell-free DNA, and open new perspectives for extending the technology to NIPD of many other monogenic diseases.
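Quantification in ddPCR rests on the standard Poisson correction of droplet counts; the Python sketch below shows that arithmetic for a hypothetical maternal-plasma well. The droplet volume and droplet counts are assumptions (roughly 0.85 nL per droplet is typical of common instruments), and a real paternal-allele call would additionally require background and false-positive thresholds.

```python
import math

def copies_per_ul(positive, total, droplet_vol_nl=0.85):
    """Standard ddPCR Poisson estimate of target copies per µL of reaction,
    from the fraction of negative droplets (droplet volume is an assumed
    instrument constant)."""
    if positive >= total:
        raise ValueError("all droplets positive; sample too concentrated")
    lam = -math.log(1.0 - positive / total)  # mean copies per droplet
    return lam / (droplet_vol_nl * 1e-3)     # copies per µL

# Hypothetical well screened for a paternally inherited mutation
mutant = copies_per_ul(positive=14, total=15000)
wildtype = copies_per_ul(positive=9200, total=15000)
print(round(mutant, 2), "mutant copies/µL;", round(wildtype, 1), "wild-type copies/µL")
```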
Davis, James; Vaughan, D Huw; Stirling, David; Nei, Lembit; Compton, Richard G
2002-07-19
The exploitation of the Ni(III)/Ni(II) transition as a means of quantifying the concentration of nickel within industrial samples was assessed. The methodology relies upon the reagentless electrodeposition of Ni onto a glassy carbon electrode and the subsequent oxidative conversion of the metallic layer to Ni(III). The analytical signal is derived from a cathodic stripping protocol in which the reduction of the Ni(III) layer to Ni(II) is monitored through the use of square wave voltammetry. The procedure was refined through the introduction of an ultrasonic source which served both to enhance the deposition of nickel and to remove the nickel hydroxide layer that results from the measurement process. A well-defined stripping peak was observed at +0.7 V (vs. Ag/AgCl) with the response found to be linear over the range 50 nM to 1 μM (based on a 30 s deposition time). Other metal ions such as Cu(II), Mn(II), Cr(III), Pb(II), Cd(II), Zn(II), Fe(III) and Co(II) did not interfere with the response when present in hundredfold excess. The viability of the technique was evaluated through the determination of nickel within a commercial copper-nickel alloy and validated through an independent comparison with a standard ICP-AES protocol.
EpiK: A Knowledge Base for Epidemiological Modeling and Analytics of Infectious Diseases
Hasan, S. M. Shamimul; Fox, Edward A.; Bisset, Keith; ...
2017-11-06
Computational epidemiology seeks to develop computational methods to study the distribution and determinants of health-related states or events (including disease), and the application of this study to the control of diseases and other health problems. Recent advances in computing and data sciences have led to the development of innovative modeling environments to support this important goal. The datasets used to drive the dynamic models, as well as the data produced by these models, present unique challenges owing to their size, heterogeneity and diversity. These datasets form the basis of effective and easy to use decision support and analytical environments. As a result, it is important to develop scalable data management systems to store, manage and integrate these datasets. In this paper, we develop EpiK, a knowledge base that facilitates the development of decision support and analytical environments to support epidemic science. An important goal is to develop a framework that links the input as well as output datasets to facilitate effective spatio-temporal and social reasoning that is critical in planning and intervention analysis before and during an epidemic. The data management framework links modeling workflow data and its metadata using a controlled vocabulary. The metadata captures information about storage, the mapping between the linked model and the physical layout, and relationships to support services. EpiK is designed to support agent-based modeling and analytics frameworks; aggregate models can be seen as special cases and are thus supported. We use semantic web technologies to create a representation of the datasets that encapsulates both the location and the schema heterogeneity. The choice of RDF as a representation language is motivated by the diversity and growth of the datasets that need to be integrated. A query bank is developed; the queries capture a broad range of questions that can be posed and answered during a typical case study pertaining to disease outbreaks. The queries are constructed using SPARQL Protocol and RDF Query Language (SPARQL) over EpiK. EpiK can hide schema and location heterogeneity while efficiently supporting queries that span the computational epidemiology modeling pipeline: from model construction to simulation output. As a result, we show that the performance of benchmark queries varies significantly with respect to the choice of hardware underlying the database and resource description framework (RDF) engine.
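To give a feel for the query-bank approach, here is a toy RDF graph queried with SPARQL via the rdflib Python library. The vocabulary, predicates and values are hypothetical stand-ins and do not reproduce EpiK's actual controlled vocabulary or schema.

```python
from rdflib import Graph, Literal, Namespace, URIRef

# Hypothetical vocabulary standing in for EpiK's controlled vocabulary
EPI = Namespace("http://example.org/epik#")

g = Graph()
run = URIRef("http://example.org/epik/run/42")
g.add((run, EPI.usesModel, Literal("agent-based")))
g.add((run, EPI.region, Literal("Montgomery County")))
g.add((run, EPI.peakInfected, Literal(10250)))

q = """
PREFIX epi: <http://example.org/epik#>
SELECT ?run ?peak WHERE {
  ?run epi:usesModel "agent-based" ;
       epi:peakInfected ?peak .
}
"""
for row in g.query(q):
    print(row.run, row.peak)
```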
1992-12-21
[Fragmentary record: only reference-list entries and table-of-contents headings survived extraction, covering trace-based protocol analysis (TBPA), a summary of important data features, tools related to process model testing and model building, and requirements for testing process models using trace-based protocol analysis.]
Wang, Hanghang; Muehlbauer, Michael J.; O’Neal, Sara K.; Newgard, Christopher B.; Hauser, Elizabeth R.; Shah, Svati H.
2017-01-01
The field of metabolomics as applied to human disease and health is rapidly expanding. In recent efforts of metabolomics research, greater emphasis has been placed on quality control and method validation. In this study, we report an experience with quality control and a practical application of method validation. Specifically, we sought to identify and modify steps in gas chromatography-mass spectrometry (GC-MS)-based, non-targeted metabolomic profiling of human plasma that could influence metabolite identification and quantification. Our experimental design included two studies: (1) a limiting-dilution study, which investigated the effects of dilution on analyte identification and quantification; and (2) a concentration-specific study, which compared the optimal plasma extract volume established in the first study with the volume used in the current institutional protocol. We confirmed that contaminants, concentration, repeatability and intermediate precision are major factors influencing metabolite identification and quantification. In addition, we established methods for improved metabolite identification and quantification, which were summarized to provide recommendations for experimental design of GC-MS-based non-targeted profiling of human plasma. PMID:28841195
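Repeatability and intermediate precision in such studies are commonly summarised as a coefficient of variation per analyte across replicate injections; the Python sketch below shows that calculation with hypothetical peak areas, not the study's data.

```python
import statistics

def cv_percent(values):
    """Coefficient of variation (%) across replicate measurements; a simple
    repeatability / intermediate-precision summary for one analyte."""
    mean = statistics.mean(values)
    return 100.0 * statistics.stdev(values) / mean

# Hypothetical peak areas for one metabolite across five replicate injections
print(round(cv_percent([10234, 9980, 10410, 10120, 9875]), 1))
```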
Generalized Teleportation and Entanglement Recycling
NASA Astrophysics Data System (ADS)
Strelchuk, Sergii; Horodecki, Michał; Oppenheim, Jonathan
2013-01-01
We introduce new teleportation protocols which are generalizations of the original teleportation protocols that use the Pauli group and the port-based teleportation protocols, introduced by Hiroshima and Ishizaka, that use the symmetric permutation group. We derive sufficient conditions for a set of operations, which in general need not form a group, to give rise to a teleportation protocol and provide examples of such schemes. This generalization leads to protocols with novel properties and is needed to push forward new schemes of computation based on them. Port-based teleportation protocols and our generalizations use a large resource state consisting of N singlets to teleport only a single qubit state reliably. We provide two distinct protocols which recycle the resource state to teleport multiple states with error increasing linearly with their number. The first protocol consists of sequentially teleporting qubit states, and the second teleports them in bulk.