Sample records for event-specific analytical methods

  1. Simplex and duplex event-specific analytical methods for functional biotech maize.

    PubMed

    Lee, Seong-Hun; Kim, Su-Jeong; Yi, Bu-Young

    2009-08-26

    Analytical methods are essential to the control of genetically modified organism (GMO) labeling systems and to living modified organism (LMO) management for biotech crops. Event-specific primers and probes were developed for qualitative and quantitative analysis of the biotech maize events 3272 and LY038 on the basis of their 3' flanking regions. The qualitative primers showed specificity, yielding a single PCR product, and a sensitivity of 0.05% as the limit of detection (LOD). Simplex and duplex quantitative methods were also developed using TaqMan real-time PCR. One synthetic plasmid was constructed from two taxon-specific maize DNA sequences and the two event-specific 3' flanking DNA sequences of events 3272 and LY038, for use as a reference molecule. In-house validation of the quantitative methods was performed using six levels of mixed samples, from 0.1 to 10.0%. The biases from the true value and the relative deviations were all within ±30%. The limits of quantitation (LOQs) were 0.1% for the simplex real-time PCRs of events 3272 and LY038 and 0.5% for the duplex real-time PCR of LY038. These results show that the event-specific analytical methods are applicable to qualitative and quantitative analysis of biotech maize events 3272 and LY038.
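
    The plasmid calibrant described above lends itself to a standard-curve calculation: quantification cycle (Cq) values for serial dilutions of the reference plasmid are regressed against log copy number, and unknowns are interpolated. A minimal sketch of that arithmetic (the dilution series and Cq values below are illustrative, not data from the paper):

    ```python
    import numpy as np

    # Illustrative standard curve: Cq values for serial dilutions of a
    # reference plasmid (copies per reaction).
    std_copies = np.array([1e5, 1e4, 1e3, 1e2, 1e1])
    std_cq = np.array([22.1, 25.5, 28.9, 32.3, 35.7])

    # Linear fit: Cq = slope * log10(copies) + intercept
    slope, intercept = np.polyfit(np.log10(std_copies), std_cq, 1)

    # PCR efficiency implied by the slope (100% corresponds to -3.32).
    efficiency = 10 ** (-1.0 / slope) - 1.0

    def copies_from_cq(cq):
        """Interpolate the copy number of an unknown from its Cq."""
        return 10 ** ((cq - intercept) / slope)

    print(f"efficiency = {efficiency:.1%}")
    print(f"Cq 30.0 -> {copies_from_cq(30.0):.0f} copies")
    ```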

  2. Statistical evaluation of forecasts

    NASA Astrophysics Data System (ADS)

    Mader, Malenka; Mader, Wolfgang; Gluckman, Bruce J.; Timmer, Jens; Schelter, Björn

    2014-08-01

    Reliable forecasts of extreme but rare events, such as earthquakes, financial crashes, and epileptic seizures, would make interventions and precautions possible. Forecasting methods have therefore been developed that aim to raise an alarm if an extreme event is about to occur. To statistically validate the performance of a prediction system, it must be compared with the performance of a random predictor, which raises alarms independently of the events. Such a random predictor can be obtained by bootstrapping or analytically. We propose an analytic statistical framework which, in contrast to conventional methods, allows the sensitivity and specificity of a forecasting method to be validated independently. Moreover, our method accounts for the periods during which an event has to remain absent or to occur after a respective forecast.
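
    The core of such a comparison can be sketched analytically: if alarms are raised independently of the events and cover a fraction q of the observation time, the number of events that fall inside an alarm window by chance is binomial. A minimal sketch (the counts and rates are invented for illustration; the paper's framework is more general):

    ```python
    from scipy.stats import binom

    # Hypothetical numbers, for illustration only.
    n_events = 40    # events in the test period
    q = 0.15         # fraction of time covered by alarms (rate x horizon)
    n_hits = 13      # events that fell inside an alarm window

    # Under a random predictor, hits ~ Binomial(n_events, q); the p-value
    # is the probability of at least n_hits hits arising by chance.
    p_value = binom.sf(n_hits - 1, n_events, q)
    print(f"P(>= {n_hits} chance hits) = {p_value:.4f}")
    ```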

  3. Development and Evaluation of Event-Specific Quantitative PCR Method for Genetically Modified Soybean MON87701.

    PubMed

    Tsukahara, Keita; Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Nishimaki-Mogami, Tomoko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi

    2016-01-01

    A real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) soybean event, MON87701. First, a standard plasmid for MON87701 quantification was constructed. The conversion factor (Cf) required to calculate the amount of genetically modified organism (GMO) was experimentally determined for a real-time PCR instrument; the determined Cf was 1.24. For the evaluation of the developed method, a blind test was carried out in an inter-laboratory trial. The trueness and precision were evaluated as the bias and the relative standard deviation of reproducibility (RSDR), respectively. The determined biases and RSDR values were less than 30% and 13%, respectively, at all evaluated concentrations. The limit of quantitation of the method was 0.5%, and the developed method would thus be applicable to practical analyses for the detection and quantification of MON87701.
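
    The trueness and precision figures above reduce to two simple statistics per concentration level. A sketch of how one level of blind-test results might be evaluated (numbers are illustrative; a true interlaboratory RSDR is computed across laboratory means rather than raw replicates):

    ```python
    import numpy as np

    def trueness_precision(measured, true_value):
        """Bias (%) and relative standard deviation (%) of replicate
        GMO-content measurements at one concentration level."""
        m = np.asarray(measured, dtype=float)
        bias = (m.mean() - true_value) / true_value * 100.0
        rsd = m.std(ddof=1) / m.mean() * 100.0
        return bias, rsd

    # Illustrative blind-test results (% GMO) at a 1.0% spike level.
    bias, rsd = trueness_precision([0.91, 1.12, 0.97, 1.05, 1.18, 0.88], 1.0)
    print(f"bias = {bias:+.1f}%, RSD = {rsd:.1f}%")
    ```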

  4. [Development and validation of event-specific quantitative PCR method for genetically modified maize LY038].

    PubMed

    Mano, Junichi; Masubuchi, Tomoko; Hatano, Shuko; Futo, Satoshi; Koiwa, Tomohiro; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Takabatake, Reona; Kitta, Kazumi

    2013-01-01

    In this article, we report a novel real-time PCR-based analytical method for quantitation of the GM maize event LY038. We designed LY038-specific and maize endogenous reference DNA-specific PCR amplifications. After confirming the specificity and linearity of the LY038-specific PCR amplification, we determined the conversion factor required to calculate the weight-based content of GM organism (GMO) in a multilaboratory evaluation. Finally, to validate the developed method, an interlaboratory collaborative trial was performed according to the internationally harmonized guidelines, with blind DNA samples containing LY038 at mixing levels of 0, 0.5, 1.0, 5.0 and 10.0%. The precision of the method was evaluated as the relative standard deviation of reproducibility (RSDR), and the values obtained were all less than 25%. The limit of quantitation of the method was judged to be 0.5%, based on the definition in the ISO 24276 guideline. The results of the collaborative trial suggest that the developed quantitative method is suitable for practical testing of LY038 maize.

  5. Development and validation of an event-specific quantitative PCR method for genetically modified maize MIR162.

    PubMed

    Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Teshima, Reiko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi

    2014-01-01

    A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) maize event, MIR162. We first prepared a standard plasmid for MIR162 quantification. The conversion factor (Cf) required to calculate the genetically modified organism (GMO) amount was empirically determined for two real-time PCR instruments, the Applied Biosystems 7900HT (ABI7900) and the Applied Biosystems 7500 (ABI7500), for which the determined Cf values were 0.697 and 0.635, respectively. To validate the developed method, a blind test was carried out in an interlaboratory study. The trueness and precision were evaluated as the bias and the relative standard deviation of reproducibility (RSDR), respectively. The determined biases were less than 25% and the RSDR values were less than 20% at all evaluated concentrations. These results suggest that the limit of quantitation of the method is 0.5% and that the developed method is suitable for practical analyses for the detection and quantification of MIR162.
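
    The Cf enters the final calculation as a normalization of the event-to-endogenous copy ratio. A minimal sketch of that step (the copy numbers are invented; only the Cf value 0.697 is taken from the abstract):

    ```python
    def gmo_percent(event_copies, endogenous_copies, cf):
        """Weight-based GMO content (%) from event-specific and
        taxon-specific copy numbers; Cf is the copy-number ratio
        measured for pure GM material."""
        return (event_copies / endogenous_copies) / cf * 100.0

    # Illustrative copy numbers from a hypothetical ABI7900 run.
    print(f"{gmo_percent(150.0, 43000.0, 0.697):.2f}% MIR162")
    ```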

  6. Development and validation of a 48-target analytical method for high-throughput monitoring of genetically modified organisms.

    PubMed

    Li, Xiaofei; Wu, Yuhua; Li, Jun; Li, Yunjing; Long, Likun; Li, Feiwu; Wu, Gang

    2015-01-05

    The rapid increase in the number of genetically modified (GM) varieties has led to a demand for high-throughput methods to detect genetically modified organisms (GMOs). We describe a new dynamic array-based high-throughput method to simultaneously detect 48 targets in 48 samples on a Fluidigm system. The test targets included species-specific genes, common screening elements, most of the Chinese-approved GM events, and several unapproved events. The 48 TaqMan assays successfully amplified products from both single-event samples and complex samples with a GMO DNA amount of 0.05 ng, and displayed high specificity. To improve the sensitivity of detection, a preamplification step for the 48 pooled targets was added to enrich the template before the dynamic chip assays. This dynamic chip-based method allows the synchronous high-throughput detection of multiple targets in multiple samples and thus represents an efficient qualitative method for GMO multi-detection.

  7. Development and Validation of A 48-Target Analytical Method for High-throughput Monitoring of Genetically Modified Organisms

    PubMed Central

    Li, Xiaofei; Wu, Yuhua; Li, Jun; Li, Yunjing; Long, Likun; Li, Feiwu; Wu, Gang

    2015-01-01

    The rapid increase in the number of genetically modified (GM) varieties has led to a demand for high-throughput methods to detect genetically modified organisms (GMOs). We describe a new dynamic array-based high-throughput method to simultaneously detect 48 targets in 48 samples on a Fluidigm system. The test targets included species-specific genes, common screening elements, most of the Chinese-approved GM events, and several unapproved events. The 48 TaqMan assays successfully amplified products from both single-event samples and complex samples with a GMO DNA amount of 0.05 ng, and displayed high specificity. To improve the sensitivity of detection, a preamplification step for the 48 pooled targets was added to enrich the template before the dynamic chip assays. This dynamic chip-based method allows the synchronous high-throughput detection of multiple targets in multiple samples and thus represents an efficient qualitative method for GMO multi-detection. PMID:25556930

  8. A dimensionless approach for the runoff peak assessment: effects of the rainfall event structure

    NASA Astrophysics Data System (ADS)

    Gnecco, Ilaria; Palla, Anna; La Barbera, Paolo

    2018-02-01

    The present paper proposes a dimensionless analytical framework to investigate the impact of the rainfall event structure on the hydrograph peak. To this end, a methodology to describe the rainfall event structure is proposed, based on similarity with the depth-duration-frequency (DDF) curves. The rainfall input consists of a constant hyetograph in which all possible outcomes in the sample space of rainfall structures can be condensed. Soil abstractions are modelled using the Soil Conservation Service method, and instantaneous unit hydrograph theory is used to determine the dimensionless form of the hydrograph; the two-parameter gamma distribution is selected to test the proposed methodology. The dimensionless approach is introduced so that the analytical framework can be applied to any study case (i.e. natural catchment) for which the model assumptions are valid (i.e. a linear, causative, time-invariant system). A set of analytical expressions is derived for a constant-intensity hyetograph to assess the maximum runoff peak with respect to a given rainfall event structure, irrespective of the specific catchment and of the return period associated with the reference rainfall event. The curve of the maximum values of the runoff peak reveals a local minimum corresponding to the design hyetograph derived according to the statistical DDF curve. A specific catchment application is discussed in order to point out the implications of the dimensionless procedure and to provide numerical examples of rainfall structures with respect to observed rainfall events; finally, their effects on the hydrograph peak are examined.
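
    The linear-system backbone of such a framework is the convolution of the effective rainfall with the instantaneous unit hydrograph (IUH). A dimensional sketch using the two-parameter gamma IUH named in the abstract (all parameter values are illustrative, and this omits the paper's dimensionless recasting and the SCS abstraction step):

    ```python
    import numpy as np
    from scipy.stats import gamma

    # Two-parameter gamma IUH (shape n, scale k in hours); values illustrative.
    n_shape, k_scale = 3.0, 1.5
    dt = 0.05                                      # time step (h)
    t = np.arange(0.0, 48.0, dt)
    iuh = gamma.pdf(t, a=n_shape, scale=k_scale)   # unit-area IUH (1/h)

    # Constant-intensity effective hyetograph: 20 mm of effective rain over 6 h.
    duration, depth = 6.0, 20.0
    rain = np.where(t < duration, depth / duration, 0.0)   # mm/h

    # Hydrograph = convolution of effective rainfall with the IUH.
    q = np.convolve(rain, iuh)[: t.size] * dt              # mm/h
    print(f"runoff peak {q.max():.2f} mm/h at t = {t[q.argmax()]:.1f} h")
    ```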

  9. What are Segments in Google Analytics

    EPA Pesticide Factsheets

    A segment finds all sessions that meet a specific condition; you can then apply the segment to any report in Google Analytics (GA). Segments are a way of identifying sessions and users, while filters identify specific hits, such as pageviews.

  10. From observational to analytical morphology of the stratum corneum: progress avoiding hazardous animal and human testings

    PubMed Central

    Piérard, Gérald E; Courtois, Justine; Ritacco, Caroline; Humbert, Philippe; Fanian, Ferial; Piérard-Franchimont, Claudine

    2015-01-01

    Background In cosmetic science, noninvasive sampling of the upper part of the stratum corneum is conveniently performed using strippings with adhesive-coated discs (SACD) and cyanoacrylate skin surface strippings (CSSSs). Methods Under controlled conditions, it is possible to scrutinize SACD and CSSS with objectivity using appropriate methods of analytical morphology. These procedures apply to a series of clinical conditions including xerosis grading, comedometry, corneodynamics, corneomelametry, corneosurfametry, corneoxenometry, and dandruff assessment. Results With any of the analytical evaluations, SACD and CSSS provide specific salient information that is useful in the field of cosmetology. In particular, both methods appear valuable and complementary in assessing the human skin compatibility of personal skincare products. Conclusion A set of quantitative analytical methods applicable to the minimally invasive and low-cost SACD and CSSS procedures allow for a sound assessment of cosmetic effects on the stratum corneum. Under regular conditions, both methods are painless and do not induce adverse events. Globally, CSSS appears more precise and informative than the regular SACD stripping. PMID:25767402

  11. Application of Haddon's matrix in qualitative research methodology: an experience in burns epidemiology.

    PubMed

    Deljavan, Reza; Sadeghi-Bazargani, Homayoun; Fouladi, Nasrin; Arshi, Shahnam; Mohammadi, Reza

    2012-01-01

    Little has been done to investigate the application of injury-specific qualitative research methods in the field of burn injuries. The aim of this study was to use an analytical tool (Haddon's matrix) within a qualitative research methodology to better understand people's perceptions about burn injuries. This study applied Haddon's matrix as a framework and analytical tool for a qualitative research methodology in burn research. Both child and adult burn injury victims were enrolled in a qualitative study conducted using focus group discussion. Haddon's matrix was used to develop the interview guide and was also used during the analysis phase. The main analysis clusters were pre-event phase/human (including risky behaviors, beliefs and cultural factors, and knowledge and education), pre-event phase/object, pre-event phase/environment, and the event and post-event phases (including fire control, emergency scald and burn wound management, traditional remedies, medical consultation, and severity indicators). The research yielded results that are potentially useful both for future injury research and for designing burn injury prevention plans. Haddon's matrix is applicable within a qualitative research methodology at both the data collection and data analysis phases, and its use here yielded substantially rich information regarding burn injuries that may be useful for prevention efforts or future quantitative research.

  12. CoinCalc-A new R package for quantifying simultaneities of event series

    NASA Astrophysics Data System (ADS)

    Siegmund, Jonatan F.; Siegmund, Nicole; Donner, Reik V.

    2017-01-01

    We present the new R package CoinCalc for performing event coincidence analysis (ECA), a novel statistical method to quantify the simultaneity of events contained in two series of observations, counted either as simultaneous or as lagged coincidences within a user-specified temporal tolerance window. The package provides analytical as well as surrogate-based significance tests (valid under different assumptions about the nature of the observed event series), together with an intuitive visualization of the identified coincidences. We demonstrate the usage of CoinCalc on two typical geoscientific example problems, addressing the relationship between meteorological extremes and plant phenology and that between soil properties and land cover.
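
    The basic ECA quantity is easy to state: the fraction of events in one series that have a partner in the other series within the tolerance window, compared with its expectation under independence. A minimal sketch (CoinCalc itself is an R package that also handles lags and several significance tests; this Python fragment with random event times shows only the core count):

    ```python
    import numpy as np

    def coincidence_rate(series_a, series_b, tol):
        """Fraction of events in series_a with at least one event of
        series_b within +/- tol (symmetric tolerance window)."""
        a, b = np.asarray(series_a), np.asarray(series_b)
        return np.mean([np.any(np.abs(b - ta) <= tol) for ta in a])

    rng = np.random.default_rng(1)
    span = 1000.0                                # observation period
    a = np.sort(rng.uniform(0, span, 30))        # e.g. phenological events
    b = np.sort(rng.uniform(0, span, 50))        # e.g. meteorological extremes

    rate = coincidence_rate(a, b, tol=5.0)
    # Analytic expectation for independent (Poisson) b-events:
    expected = 1.0 - np.exp(-(len(b) / span) * (2 * 5.0))
    print(f"observed {rate:.2f} vs. {expected:.2f} under independence")
    ```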

  13. Development and evaluation of event-specific quantitative PCR method for genetically modified soybean A2704-12.

    PubMed

    Takabatake, Reona; Akiyama, Hiroshi; Sakata, Kozue; Onishi, Mari; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Teshima, Reiko; Mano, Junichi; Furui, Satoshi; Kitta, Kazumi

    2011-01-01

    A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) soybean event, A2704-12. During the plant transformation, DNA fragments derived from the pUC19 plasmid were integrated into A2704-12, and this region was found to be specific to A2704-12. The pUC19-derived DNA sequences were used to design primers for the specific detection of A2704-12. We first tried to construct a standard plasmid for A2704-12 quantification using pUC19. However, non-specific signals appeared in both qualitative and quantitative PCR analyses using the specific primers with pUC19 as a template, so we instead constructed a plasmid using pBR322. The conversion factor (Cf), which is required to calculate the amount of the genetically modified organism (GMO), was experimentally determined with two real-time PCR instruments, the Applied Biosystems 7900HT and the Applied Biosystems 7500; the determined Cf values were both 0.98. The quantitative method was evaluated by means of blind tests in multi-laboratory trials using the two real-time PCR instruments. The limit of quantitation for the method was estimated to be 0.1%. The trueness and precision were evaluated as the bias and the relative standard deviation of reproducibility (RSDR), and the determined bias and RSDR values were each less than 20%. These results suggest that the developed method is suitable for practical analyses for the detection and quantification of A2704-12.

  14. Determination of a Limited Scope Network's Lightning Detection Efficiency

    NASA Technical Reports Server (NTRS)

    Rompala, John T.; Blakeslee, R.

    2008-01-01

    This paper outlines a modeling technique to map variations in lightning detection efficiency over a region surveyed by a sparse array of ground-based detectors. A reliable flash peak current distribution (PCD) for the region serves as the technique's base. This distribution is recast as an event probability distribution function. The technique then uses the PCD, together with information on site signal detection thresholds, the type of solution algorithm used, and range attenuation, to formulate the probability that a flash at a specified location will yield a solution. Applying this technique to the full region produces detection efficiency contour maps specific to the parameters employed. These contours facilitate a comparative analysis of each parameter's effect on the network's detection efficiency. In an alternative application, the technique gives an estimate of the number, strength, and distribution of events going undetected; this approach leads to a variety of event density contour maps and is also illustrated. The technique's base PCD can be empirical or analytical. A process for formulating an empirical PCD specific to the region and network being studied is presented, and a new method for producing an analytical representation of the empirical PCD is introduced.
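
    The essential calculation is the probability mass of the PCD that remains detectable once range attenuation is applied to a site's threshold. A deliberately simplified single-site sketch (the lognormal PCD parameters, 1/r attenuation law, and thresholds are all assumptions for illustration; the paper's method combines multiple sites and solution algorithms):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Empirical PCD stand-in: lognormal peak currents in kA (illustrative).
    peak_current = rng.lognormal(mean=np.log(15.0), sigma=0.7, size=100_000)

    def detection_efficiency(range_km, threshold_kA, ref_range_km=100.0):
        """Fraction of flashes at a given range whose attenuated signal
        still exceeds the site threshold (1/r propagation assumed)."""
        effective_threshold = threshold_kA * (range_km / ref_range_km)
        return float(np.mean(peak_current > effective_threshold))

    for r in (50, 100, 200, 400):
        print(f"{r:4d} km: DE = {detection_efficiency(r, 5.0):.2f}")
    ```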

  15. Basic Information for EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM)

    EPA Pesticide Factsheets

    Contains basic information on the role and origins of the Selected Analytical Methods, including the formation of the Homeland Security Laboratory Capacity Work Group and the Environmental Evaluation Analytical Process Roadmap for Homeland Security Events.

  16. Combining functionalised nanoparticles and SERS for the detection of DNA relating to disease.

    PubMed

    Graham, Duncan; Stevenson, Ross; Thompson, David G; Barrett, Lee; Dalton, Colette; Faulds, Karen

    2011-01-01

    DNA-functionalised nanoparticle probes offer new opportunities in analyte detection. Ultrasensitive, molecularly specific targeting of analytes is possible through the use of metallic nanoparticles and their ability to generate a surface-enhanced Raman scattering (SERS) response. This is leading to a new range of clinical diagnostic probes based on SERS detection. Our approaches have shown how such probes can detect specific DNA sequences by using a biomolecular recognition event to 'turn on' a SERS response through a controlled assembly process of the DNA-functionalised nanoparticles. Further, we have prepared DNA aptamer-functionalised SERS probes and demonstrated how introduction of a protein target changes the aggregation state of the nanoparticles in a dose-dependent manner. These approaches are being used to detect biomolecules that indicate the presence of a specific disease, with a view to improving disease management.

  17. RCRA Facility investigation report for Waste Area Grouping 6 at Oak Ridge National Laboratory, Oak Ridge, Tennessee. Volume 5, Technical Memorandums 06-09A, 06-10A, and 06-12A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    This report provides a detailed summary of the activities carried out to sample groundwater at Waste Area Grouping (WAG) 6. The analytical results for samples collected during Phase 1, Activity 2 of the WAG 6 Resource Conservation and Recovery Act (RCRA) Facility Investigation (RFI) are also presented. In addition, analytical results for Phase 1 activity sampling events for which data were not previously reported are included in this TM. A summary of the groundwater sampling activities at WAG 6 to date is given in the Introduction. The Methodology section describes the sampling procedures and analytical parameters. Six attachments are included. Attachments 1 and 2 provide analytical results for selected RFI groundwater samples and an ORNL sampling event. Attachment 3 provides a summary of the contaminants detected in each well sampled, for all sampling events conducted at WAG 6. Bechtel National Inc. (BNI)/IT Corporation Contract Laboratory (IT) RFI analytical methods and detection limits are given in Attachment 4. Attachment 5 provides the Oak Ridge National Laboratory (ORNL)/Analytical Chemistry Division (ACD) analytical methods and detection limits for RCRA quarterly compliance monitoring (1988-1989). Attachment 6 provides ORNL/ACD groundwater analytical methods and detection limits for the 1990 RCRA semi-annual compliance monitoring.

  18. Laboratory Environmental Sample Disposal Information Document - Companion to Standardized Analytical Methods for Environmental Restoration Following Homeland Security Events (SAM) – Revision 5.0

    EPA Pesticide Factsheets

    Document is intended to provide general guidelines for use by EPA and EPA-contracted laboratories when disposing of samples and associated analytical waste following use of the analytical methods listed in SAM.

  19. Development of an algorithm for automatic detection and rating of squeak and rattle events

    NASA Astrophysics Data System (ADS)

    Chandrika, Unnikrishnan Kuttan; Kim, Jay H.

    2010-10-01

    A new algorithm for automatic detection and rating of squeak and rattle (S&R) events was developed. The algorithm utilizes the perceived transient loudness (PTL), which approximates the human perception of a transient noise. First, instantaneous specific loudness time histories are calculated over the 1-24 bark range by applying the analytic wavelet transform and the Zwicker loudness transform to the recorded noise. Transient specific loudness time histories are then obtained by removing estimated contributions of the background noise from the instantaneous specific loudness time histories. These transient specific loudness time histories are summed to obtain the transient loudness time history. Finally, the PTL time history is obtained by applying Glasberg and Moore temporal integration to the transient loudness time history. Detection of S&R events utilizes the PTL time history obtained by summing only the 18-24 bark components, to take advantage of the high signal-to-noise ratio in the high-frequency range. An S&R event is identified when the PTL time history exceeds a detection threshold pre-determined by a jury test. The maximum value of the PTL time history is used for rating S&R events. Another jury test showed that the method performs much better if the PTL time history obtained by summing all frequency components is used; rating of S&R events therefore utilizes this modified PTL time history. Two additional jury tests were conducted to validate the developed detection and rating methods. The algorithm developed in this work will enable automatic detection and rating of S&R events with good accuracy and a minimal possibility of false alarms.
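
    The detect-by-threshold step can be illustrated without the full loudness model. The sketch below substitutes a high-band envelope and a running background estimate for the wavelet/Zwicker/temporal-integration chain, so it is a crude stand-in for PTL, not the paper's algorithm; the band edges and the 10 dB threshold are arbitrary choices:

    ```python
    import numpy as np
    from scipy.signal import butter, sosfilt, sosfiltfilt

    def transient_events(x, fs, band=(4000, 10000), thresh_db=10.0):
        """Flag samples where the high-band envelope exceeds a slow
        background estimate by thresh_db (S&R candidates)."""
        sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
        env = np.abs(sosfilt(sos, x))
        fast = sosfiltfilt(butter(2, 100, fs=fs, output="sos"), env)  # ~10 ms
        slow = sosfiltfilt(butter(2, 1, fs=fs, output="sos"), env)    # ~1 s
        ratio_db = 20 * np.log10(np.maximum(fast, 1e-12) /
                                 np.maximum(slow, 1e-12))
        return ratio_db > thresh_db

    # Synthetic test: low-level noise with a short click at 0.5 s.
    fs = 44100
    t = np.arange(int(fs * 1.0)) / fs
    x = 0.01 * np.random.randn(t.size)
    x[int(0.5 * fs):int(0.5 * fs) + 200] += 0.5 * np.random.randn(200)
    mask = transient_events(x, fs)
    print(f"event detected near {t[mask].mean():.3f} s" if mask.any() else "none")
    ```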

  20. The recalculation of the original pulse produced by a partial discharge

    NASA Technical Reports Server (NTRS)

    Tanasescu, F.

    1978-01-01

    The loads on a dielectric or an insulation arrangement cannot be precisely rated without properly assessing the manner in which a pulse produced by a partial discharge is transmitted from the point of the event to the point where it is recorded. A number of analytical and graphic methods are presented, and computer simulations are used for specific cases of a few measurement circuits. It turns out to be possible to determine the effect of each circuit element and thus make some valid corrections.

  21. An investigation into the two-stage meta-analytic copula modelling approach for evaluating time-to-event surrogate endpoints which comprise of one or more events of interest.

    PubMed

    Dimier, Natalie; Todd, Susan

    2017-09-01

    Clinical trials of experimental treatments must be designed with primary endpoints that directly measure clinical benefit for patients. In many disease areas, the recognised gold-standard primary endpoint can take many years to mature, leading to challenges in the conduct and quality of clinical studies. There is increasing interest in using shorter-term surrogate endpoints as substitutes for costly long-term clinical trial endpoints; such surrogates need to be selected according to biological plausibility, as well as the ability to reliably predict the unobserved treatment effect on the long-term endpoint. A number of statistical methods to evaluate this prediction have been proposed; this paper uses a simulation study to explore one such method in the context of time-to-event surrogates for a time-to-event true endpoint. This two-stage meta-analytic copula method has been extensively studied for time-to-event surrogate endpoints with one event of interest, but thus far has not been explored for the assessment of surrogates which have multiple events of interest, such as those incorporating information directly from the true clinical endpoint. We assess the sensitivity of the method to various factors, including the strength of association between endpoints, the quantity of data available, and the effect of censoring. In particular, we consider scenarios where there exist very little data on which to assess surrogacy. Results show that the two-stage meta-analytic copula method performs well under certain circumstances and could be considered useful in practice, but demonstrates limitations that may prevent universal use.

  22. International development of methods of analysis for the presence of products of modern biotechnology.

    PubMed

    Cantrill, Richard C

    2008-01-01

    Methods of analysis for products of modern biotechnology are required for national and international trade in seeds, grain and food in order to meet the labeling or import/export requirements of different nations and trading blocs. Although many methods were developed by the originators of transgenic events, governments, universities, and testing laboratories, trade is less complicated if there exists a set of international consensus-derived analytical standards. In any analytical situation, multiple methods may exist for testing for the same analyte. These methods may be supported by regional preferences and regulatory requirements. However, tests need to be sensitive enough to determine low levels of these traits in commodity grain for regulatory purposes and also to indicate the purity of seeds containing these traits. The International Organization for Standardization (ISO) and its European counterpart have worked to produce a suite of standards through open, balanced and consensus-driven processes. Presently, these standards are approaching the time for their first review; in fact, ISO 21572, the "protein standard", has already been circulated for systematic review. In order to expedite the review and revision of the nucleic acid standards, an ISO Technical Specification (ISO/TS 21098) was drafted to set the criteria for the inclusion of precision data from collaborative studies into the annexes of these standards.

  23. Systematic review of methods used in meta-analyses where a primary outcome is an adverse or unintended event

    PubMed Central

    2012-01-01

    Background Adverse consequences of medical interventions are a source of concern, but clinical trials may lack the power to detect elevated rates of such events, while observational studies have inherent limitations. Meta-analysis allows the combination of individual studies, which can increase power and provide stronger evidence relating to adverse events. However, meta-analysis of adverse events has associated methodological challenges. The aim of this study was to systematically identify and review the methodology used in meta-analyses where a primary outcome is an adverse or unintended event following a therapeutic intervention. Methods Using a collection of reviews identified previously, 166 references including a meta-analysis were selected for review. At least one of the primary outcomes in each review was an adverse or unintended event. The nature of the intervention, source of funding, number of individual meta-analyses performed, number of primary studies included in the review, and use of meta-analytic methods were all recorded. Specific areas of interest included the choice of outcome metric and the methods used for dealing with sparse events, heterogeneity, publication bias, and individual patient data. Results The 166 included reviews were published between 1994 and 2006. Interventions included drugs and surgery, among others. Many of the references reviewed included multiple meta-analyses, with 44.6% (74/166) including more than ten. Randomised trials only were included in 42.2% of the meta-analyses (70/166), observational studies only in 33.7% (56/166), and a mix of observational studies and trials in 15.7% (26/166). Sparse data, in the form of zero events in one or both arms where the outcome was a count of events, were found in 64 reviews of two-arm studies, of which 41 (64.1%) had zero events in both arms. Conclusions Meta-analyses of adverse event data are common and useful in terms of increasing the power to detect an association with an intervention, especially when the events are infrequent. However, a wide variety of methods have been employed in existing meta-analyses, often with no evident rationale for using a particular approach. More specifically, the approach to dealing with zero events varies, and guidelines on this issue would be desirable. PMID:22553987
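
    The zero-event problem named in the results has a standard, if debated, fix: add a continuity correction to the 2x2 table before computing an effect estimate. A minimal sketch for a single study (the counts are invented; a real meta-analysis would next pool such estimates, e.g. by Mantel-Haenszel or Peto methods):

    ```python
    import numpy as np

    def odds_ratio_cc(a, b, c, d, correction=0.5):
        """Odds ratio with 95% CI for a 2x2 table (a,b = events/non-events
        on treatment; c,d = events/non-events on control), adding a
        continuity correction to every cell when any cell is zero."""
        if 0 in (a, b, c, d):
            a, b, c, d = (x + correction for x in (a, b, c, d))
        or_ = (a * d) / (b * c)
        se_log = np.sqrt(1/a + 1/b + 1/c + 1/d)
        lo, hi = np.exp(np.log(or_) + np.array([-1, 1]) * 1.96 * se_log)
        return or_, (lo, hi)

    # Illustrative trial: 0/100 events on treatment, 3/100 on control.
    or_, ci = odds_ratio_cc(0, 100, 3, 97)
    print(f"OR = {or_:.2f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.2f})")
    ```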

  24. Cross-Disciplinary Consultancy to Bridge Public Health Technical Needs and Analytic Developers: Asyndromic Surveillance Use Case

    PubMed Central

    Faigen, Zachary; Deyneka, Lana; Ising, Amy; Neill, Daniel; Conway, Mike; Fairchild, Geoffrey; Gunn, Julia; Swenson, David; Painter, Ian; Johnson, Lauren; Kiley, Chris; Streichert, Laura

    2015-01-01

    Introduction: We document a funded effort to bridge the gap between constrained scientific challenges of public health surveillance and methodologies from academia and industry. Component tasks are the collection of epidemiologists’ use case problems, multidisciplinary consultancies to refine them, and dissemination of problem requirements and shareable datasets. We describe an initial use case and consultancy as a concrete example and challenge to developers. Materials and Methods: Supported by the Defense Threat Reduction Agency Biosurveillance Ecosystem project, the International Society for Disease Surveillance formed an advisory group to select tractable use case problems and convene inter-disciplinary consultancies to translate analytic needs into well-defined problems and to promote development of applicable solution methods. The initial consultancy’s focus was a problem originated by the North Carolina Department of Health and its NC DETECT surveillance system: Derive a method for detection of patient record clusters worthy of follow-up based on free-text chief complaints and without syndromic classification. Results: Direct communication between public health problem owners and analytic developers was informative to both groups and constructive for the solution development process. The consultancy achieved refinement of the asyndromic detection challenge and of solution requirements. Participants summarized and evaluated solution approaches and discussed dissemination and collaboration strategies. Practice Implications: A solution meeting the specification of the use case described above could improve human monitoring efficiency with expedited warning of events requiring follow-up, including otherwise overlooked events with no syndromic indicators. This approach can remove obstacles to collaboration with efficient, minimal data-sharing and without costly overhead. PMID:26834939

  25. Standardized Analytical Methods for Environmental Restoration Following Homeland Security Events

    USDA-ARS?s Scientific Manuscript database

    Methodology was formulated for use in the event of a terrorist attack involving a variety of chemical, radioactive, biological, and toxic agents. Standardized analysis procedures were determined for use should such events occur. This publication is updated annually.

  26. Single Particle Analysis by Combined Chemical Imaging to Study Episodic Air Pollution Events in Vienna

    NASA Astrophysics Data System (ADS)

    Ofner, Johannes; Eitenberger, Elisabeth; Friedbacher, Gernot; Brenner, Florian; Hutter, Herbert; Schauer, Gerhard; Kistler, Magdalena; Greilinger, Marion; Lohninger, Hans; Lendl, Bernhard; Kasper-Giebl, Anne

    2017-04-01

    The aerosol composition of a city like Vienna is characterized by a complex interaction of local emissions and atmospheric input on regional and continental scales. Identifying the major aerosol constituents for basic source apportionment and air quality assessment requires considerable analytical effort. Exceptional episodic air pollution events strongly change the typical aerosol composition of a city like Vienna on time scales of a few hours to several days. Analyzing the chemistry of particulate matter from these events is often hampered by the sampling time and the sample amount needed to apply the full range of bulk analytical methods required for chemical characterization; morphological and single-particle features are also hardly accessible. Chemical imaging has evolved into a powerful tool for image-based chemical analysis of complex samples. As a complement to bulk analytical methods, chemical imaging offers a new way to study air pollution events, capturing the major aerosol constituents together with single-particle features at high temporal resolution and with small sample volumes. Analysis of the chemical imaging datasets is assisted by multivariate statistics, with the benefit of image-based chemical structure determination for direct aerosol source apportionment. A novel development is combined chemical imaging, or so-called multisensor hyperspectral imaging, which couples elemental imaging (electron microscopy-based energy-dispersive X-ray imaging), vibrational imaging (Raman micro-spectroscopy) and mass spectrometric imaging (time-of-flight secondary ion mass spectrometry) with subsequent combined multivariate analysis. Combined chemical imaging of precipitated aerosol particles is demonstrated with the following examples of air pollution events in Vienna: the transformation of Saharan dust by the impact of the city, compared with samples obtained at a high-alpine background site (Sonnblick Observatory, Saharan dust event of April 2016); imaging of biological aerosol constituents from an autumnal pollen outbreak in Vienna, with background samples from nearby locations from November 2016; and the chemical fingerprint of an exceptional local-emission event caused by the demolition of a building in Vienna, which illustrates the need for multisensor imaging and, in particular, the combined approach. The chemical images are correlated with bulk analytical results, and the benefits of combining bulk analytics with combined chemical imaging of exceptional episodic air pollution events are discussed.
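
    A common way to realize the "combined multivariate analysis" step is to register the sensor maps onto a shared pixel grid, concatenate the per-pixel channels, and cluster. A toy sketch (random arrays stand in for registered EDX, Raman and SIMS maps; real data would need image registration and more careful preprocessing):

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    h, w = 64, 64
    edx   = rng.random((h, w, 5))    # 5 elemental maps (EDX)
    raman = rng.random((h, w, 8))    # 8 integrated Raman bands
    sims  = rng.random((h, w, 4))    # 4 secondary-ion maps (ToF-SIMS)

    # Per-pixel feature vectors across all sensors, scaled channel-wise.
    stack = np.concatenate([edx, raman, sims], axis=-1).reshape(h * w, -1)
    stack = StandardScaler().fit_transform(stack)

    # Unsupervised segmentation: each cluster ~ one particle/phase class.
    labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(stack)
    class_map = labels.reshape(h, w)
    print(np.bincount(labels))       # pixels per chemical class
    ```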

  27. Detection and identification of genetically modified EE-1 brinjal (Solanum melongena) by single, multiplex and SYBR® real-time PCR.

    PubMed

    Ballari, Rajashekhar V; Martin, Asha; Gowda, Lalitha R

    2013-01-01

    Brinjal is an important vegetable crop, and major crop losses in brinjal are due to insect attack. Insect-resistant EE-1 brinjal has been developed and is awaiting approval for commercial release. Consumer health concerns and the implementation of international labelling legislation demand reliable analytical detection methods for genetically modified (GM) varieties. End-point and real-time polymerase chain reaction (PCR) methods were used to detect EE-1 brinjal. In end-point PCR, primer pairs specific to the 35S CaMV promoter, the NOS terminator and the nptII gene common to other GM crops were used. Based on the revealed 3' transgene integration sequence, primers specific for the EE-1 event were designed and used for end-point single, multiplex and SYBR-based real-time PCR. End-point single PCR showed that the designed primers were highly specific to event EE-1, with a sensitivity of 20 pg of genomic DNA, corresponding to 20 copies of the haploid EE-1 brinjal genome. The limits of detection and quantification for the SYBR-based real-time PCR assay were 10 and 100 copies, respectively. The prior development of detection methods for this important vegetable crop will facilitate compliance with any forthcoming labelling regulations.

  28. On the nature of the fragment environment created by the range destruction or random failure of solid rocket motor casings

    NASA Technical Reports Server (NTRS)

    Eck, M.; Mukunda, M.

    1988-01-01

    Predictions are given here of fragment velocities and azimuths resulting from Space Transportation System Solid Rocket Motor range destruction or random failure occurring at any time during the 120 seconds of Solid Rocket Motor burn. Results obtained using the analytical methods described showed good agreement between predictions and observations for two specific events. It was shown that these methods have good potential for predicting the fragmentation process of a number of generically similar casing systems. It was concluded that coupled Eulerian-Lagrangian calculational methods of the type described here provide a powerful tool for predicting Solid Rocket Motor response.

  29. Framework for event-based semidistributed modeling that unifies the SCS-CN method, VIC, PDM, and TOPMODEL

    NASA Astrophysics Data System (ADS)

    Bartlett, M. S.; Parolari, A. J.; McDonnell, J. J.; Porporato, A.

    2016-09-01

    Hydrologists and engineers may choose from a range of semidistributed rainfall-runoff models such as VIC, PDM, and TOPMODEL, all of which predict runoff from a distribution of watershed properties. However, these models are not easily compared with event-based data and lack ready-to-use analytical expressions analogous to the SCS-CN method. The SCS-CN method is an event-based model that describes the runoff response with a rainfall-runoff curve that is a function of the cumulative storm rainfall and the antecedent wetness condition. Here we develop an event-based probabilistic storage framework and distill semidistributed models into analytical, event-based expressions for describing the rainfall-runoff response. The event-based versions, called VICx, PDMx, and TOPMODELx, are also extended with a spatial description of "prethreshold" and "threshold-excess" runoff, which occur, respectively, before and after infiltration exceeds a storage capacity threshold. For given total storm rainfall and antecedent wetness conditions, the resulting ready-to-use analytical expressions define the source areas (the fraction of the watershed) that produce runoff by each mechanism. They also define the probability density function (PDF) representing the spatial variability of runoff depths, which are cumulative values for the storm duration, and the average unit-area runoff, which describes the so-called runoff curve. These new event-based semidistributed models and the traditional SCS-CN method are unified by the same general expression for the runoff curve. Since the general runoff curve may incorporate different model distributions, it may ease the way for relating such distributions to land use, climate, topography, ecology, geology, and other characteristics.
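
    For reference, the classical SCS-CN runoff curve that these models generalize is a one-line computation. A sketch in customary metric form (the CN value and storm depth are illustrative; the paper works with a more general probabilistic storage formulation):

    ```python
    def scs_cn_runoff(p_mm, cn, ia_ratio=0.2):
        """SCS-CN event runoff depth (mm): Q = (P - Ia)^2 / (P - Ia + S),
        with potential retention S = 25400/CN - 254 (mm) and initial
        abstraction Ia = ia_ratio * S."""
        s = 25400.0 / cn - 254.0
        ia = ia_ratio * s
        if p_mm <= ia:
            return 0.0
        return (p_mm - ia) ** 2 / (p_mm - ia + s)

    # A 60 mm storm on a CN = 75 watershed:
    print(f"Q = {scs_cn_runoff(60.0, 75):.1f} mm")
    ```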

  30. Electrochemical detection of a single cytomegalovirus at an ultramicroelectrode and its antibody anchoring

    PubMed Central

    Dick, Jeffrey E.; Hilterbrand, Adam T.; Boika, Aliaksei; Upton, Jason W.; Bard, Allen J.

    2015-01-01

    We report observations of stochastic collisions of murine cytomegalovirus (MCMV) at ultramicroelectrodes (UMEs), extending the observation of discrete collision events on UMEs to biologically relevant analytes. Adsorption of an antibody specific for a virion surface glycoprotein allowed MCMV to be differentiated from antibody-bound MCMV on the basis of the decrease in collision frequency and the current magnitudes in the electrochemical collision experiments, demonstrating the efficacy of the method for sizing viral samples. To add selectivity to the technique, interactions between MCMV, a glycoprotein-specific primary antibody to MCMV, and polystyrene bead "anchors" functionalized with a secondary antibody specific to the Fc region of the primary antibody were used to affect virus mobility. Bead aggregation was observed, and the extent of aggregation was measured using the electrochemical collision technique. Scanning electron microscopy and optical microscopy further supported the aggregate shapes and the extent of aggregation with and without MCMV. This work extends the field of collision electrochemistry to biologically relevant antigens and provides a foundation upon which qualitative sensor technology might be built for the selective detection of viruses and other biologically relevant analytes. PMID:25870261
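
    The collision frequency that carries the signal here is commonly estimated from the textbook diffusion-limited flux to a disk UME, f = 4·D·C·N_A·a. The expression and all parameter values below are generic illustrations, not numbers from the paper:

    ```python
    AVOGADRO = 6.022e23

    def collision_frequency(d_cm2_s, conc_mol_per_l, radius_cm):
        """Diffusion-limited arrival rate (s^-1) of particles at a disk
        UME of radius a: f = 4 * D * C * N_A * a (textbook flux law)."""
        conc_per_cm3 = conc_mol_per_l * AVOGADRO / 1000.0  # particles/cm^3
        return 4.0 * d_cm2_s * conc_per_cm3 * radius_cm

    # Illustrative virus-sized particle: D ~ 5e-8 cm^2/s, 1 pM, 5 um UME.
    print(f"{collision_frequency(5e-8, 1e-12, 5e-4):.3f} collisions/s")
    ```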

  31. Validation of surrogate endpoints in cancer clinical trials via principal stratification with an application to a prostate cancer trial.

    PubMed

    Tanaka, Shiro; Matsuyama, Yutaka; Ohashi, Yasuo

    2017-08-30

    Increasing attention has been focused on the use and validation of surrogate endpoints in cancer clinical trials. The previous literature on validation of surrogate endpoints can be classified into four approaches: the proportion-explained approach, the indirect-effects approach, the meta-analytic approach, and the principal stratification approach. The meta-analytic approach has been the mainstream in cancer research. However, VanderWeele (2013) showed that all four of these approaches potentially suffer from the surrogate paradox. It was also shown that, if a principal surrogate satisfies an additional criterion called one-sided average causal sufficiency, the surrogate cannot exhibit the surrogate paradox. Here, we propose a method for estimating principal effects under a monotonicity assumption. Specifically, we consider cancer clinical trials that compare a binary surrogate endpoint and a time-to-event clinical endpoint under two naturally ordered treatments (e.g. combined therapy vs. monotherapy). Estimation, based on a mean score estimating equation, is implemented with the expectation-maximization algorithm. We also apply the proposed method, as well as other surrogacy criteria, to evaluate the surrogacy of prostate-specific antigen using data from a phase III advanced prostate cancer trial, clarifying the complementary roles of the principal stratification and meta-analytic approaches in the evaluation of surrogate endpoints in cancer.

  32. Valid analytical performance specifications for combined analytical bias and imprecision for the use of common reference intervals.

    PubMed

    Hyltoft Petersen, Per; Lund, Flemming; Fraser, Callum G; Sandberg, Sverre; Sölétormos, György

    2018-01-01

    Background Many clinical decisions are based on comparison of patient results with reference intervals. Therefore, an estimate is needed of the analytical performance specifications for the quality that would be required to allow sharing of common reference intervals. The International Federation of Clinical Chemistry (IFCC) recommends a minimum of 120 reference individuals to establish reference intervals. This number implies a certain level of quality, which can then be used to define analytical performance specifications as the maximum combination of analytical bias and imprecision compatible with sharing common reference intervals; deriving this maximum combination is the aim of this investigation. Methods Two methods were investigated for defining the maximum combination of analytical bias and imprecision that would give the same quality of common reference intervals as the IFCC recommendation. Method 1 is based on a formula for the combination of analytical bias and imprecision; Method 2 is based on the Microsoft Excel function NORMINV, using the fractional probability of reference individuals outside each limit and the Gaussian mean and standard deviation. The combinations of normalized bias and imprecision are illustrated for both methods. The formulae are identical for Gaussian and log-Gaussian distributions. Results Method 2 gives the correct results, with a constant percentage of 4.4% for all combinations of bias and imprecision. Conclusion The Microsoft Excel function NORMINV is useful for estimating analytical performance specifications for both Gaussian and log-Gaussian distributions of reference intervals.
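
    A NORMINV-style calculation is straightforward to reproduce with scipy. The sketch below computes the fraction of results falling outside fixed reference limits for a given combination of normalized bias and analytical imprecision (both in units of the between-subject SD); it shows only the core arithmetic and does not fold in the sampling uncertainty of 120 reference individuals on which the paper's 4.4% criterion rests:

    ```python
    from scipy.stats import norm

    def fraction_outside(bias, ana_sd, z=1.96):
        """Fraction of results outside reference limits -z..+z when the
        assay adds a normalized bias and an analytical SD 'ana_sd'
        (both expressed in units of the between-subject SD)."""
        total_sd = (1.0 + ana_sd ** 2) ** 0.5
        return norm.cdf((-z - bias) / total_sd) + norm.sf((z - bias) / total_sd)

    for b, s in [(0.0, 0.0), (0.25, 0.0), (0.0, 0.5), (0.25, 0.5)]:
        print(f"bias={b:.2f}, SD={s:.2f}: {fraction_outside(b, s):.2%} outside")
    ```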

  33. PCR technology for screening and quantification of genetically modified organisms (GMOs).

    PubMed

    Holst-Jensen, Arne; Rønning, Sissel B; Løvseth, Astrid; Berdal, Knut G

    2003-04-01

    Although PCR technology has obvious limitations, its potentially high sensitivity and specificity explain why it has been the first choice of most analytical laboratories interested in the detection of genetically modified (GM) organisms (GMOs) and derived materials. Because the products that laboratories receive for analysis are often processed and refined, the quality and quantity of the target analyte (e.g. protein or DNA) frequently challenge the sensitivity of any detection method. Among the currently available methods, PCR methods are generally accepted as the most sensitive and reliable for detection of GM-derived material in routine applications. The choice of target sequence motif is the single most important factor controlling the specificity of a PCR method. The target sequence is normally part of the modified gene construct, for example a promoter, a terminator, a gene, or a junction between two of these elements. However, the elements may originate from wild-type organisms, they may be present in more than one GMO, and their copy number may vary from one GMO to another; they may even be combined in a similar way in more than one GMO. Thus, the choice of method should fit the purpose. Recent developments include event-specific methods, which are particularly useful for identification and quantification of GM content. Thresholds for labelling are now in place in many countries, including those of the European Union, and the success of these labelling schemes depends upon the efficiency with which GM-derived material can be detected. We present an overview of currently available PCR methods for screening and quantification of GM-derived DNA and discuss their applicability and limitations. In addition, we discuss some of the major challenges related to determination of the limits of detection (LOD) and quantification (LOQ), and to validation of methods.

  34. Method for detection of selected chemicals in an open environment

    NASA Technical Reports Server (NTRS)

    Duong, Tuan (Inventor); Ryan, Margaret (Inventor)

    2009-01-01

    The present invention relates to a space-invariant independent component analysis and an electronic nose for the detection of selected chemicals in an unknown environment, and more specifically to an approach for analyzing the responses of an electronic nose's sensors to mixtures of unknown chemicals in an open and changing environment. It is intended to fill the gap between an alarm, which has little or no ability to distinguish among the chemical compounds causing a response, and an analytical instrument, which can distinguish all compounds present but has no real-time or continuous event-monitoring capability.

  35. A validation of event-related FMRI comparisons between users of cocaine, nicotine, or cannabis and control subjects.

    PubMed

    Murphy, Kevin; Dixon, Veronica; LaGrave, Kathleen; Kaufman, Jacqueline; Risinger, Robert; Bloom, Alan; Garavan, Hugh

    2006-07-01

    Noninvasive brain imaging techniques are a powerful tool for researching the effects of drug abuse on brain activation measures. However, because many drugs have direct vascular effects, the validity of techniques that depend on blood flow measures as a reflection of neuronal activity may be called into question. This may be of particular concern in event-related functional magnetic resonance imaging (fMRI), where current analytic techniques search for a specific shape in the hemodynamic response to neuronal activity. To investigate possible alterations in task-related activation as a result of drug abuse, fMRI scans were conducted on subjects in four groups as they performed a simple event-related finger-tapping task: users of cocaine, nicotine, or cannabis and control subjects. Activation measures, as determined by two different analytic methods, did not differ between the groups. A comparison between an intravenous saline and an intravenous cocaine condition in cocaine users found a similar null result. Further in-depth analyses of the shape of the hemodynamic responses in each group also showed no differences. This study demonstrates that drug groups may be compared with control subjects using event-related fMRI without the need for any post hoc procedures to correct for possible drug-induced cardiovascular alterations. Thus, fMRI activation differences reported between these drug groups can be more confidently interpreted as reflecting neuronal differences.

  36. Post-seismic relaxation theory on laterally heterogeneous viscoelastic model

    USGS Publications Warehouse

    Pollitz, F.F.

    2003-01-01

    Investigation was carried out into the problem of relaxation of a laterally heterogeneous viscoelastic Earth following an impulsive moment release event. The formal solution utilizes a semi-analytic solution for post-seismic deformation on a laterally homogeneous Earth constructed from viscoelastic normal modes, followed by application of mode coupling theory to derive the response on the aspherical Earth. The solution is constructed in the Laplace transform domain using the correspondence principle and is valid for any linear constitutive relationship between stress and strain. The specific implementation described in this paper is a semi-analytic discretization method which assumes isotropic elastic structure and a Maxwell constitutive relation. It accounts for viscoelastic-gravitational coupling under lateral variations in elastic parameters and viscosity. For a given viscoelastic structure and minimum wavelength scale, the computational effort involved with the numerical algorithm is proportional to the volume of the laterally heterogeneous region. Examples are presented of the calculation of post-seismic relaxation with a shallow, laterally heterogeneous volume following synthetic impulsive seismic events, and they illustrate the potentially large effect of regional 3-D heterogeneities on regional deformation patterns.

  37. Forecasting Significant Societal Events Using the EMBERS Streaming Predictive Analytics System

    PubMed Central

    Katz, Graham; Summers, Kristen; Ackermann, Chris; Zavorin, Ilya; Lim, Zunsik; Muthiah, Sathappan; Butler, Patrick; Self, Nathan; Zhao, Liang; Lu, Chang-Tien; Khandpur, Rupinder Paul; Fayed, Youssef; Ramakrishnan, Naren

    2014-01-01

    Developed under the Intelligence Advanced Research Projects Activity Open Source Indicators program, Early Model Based Event Recognition using Surrogates (EMBERS) is a large-scale big-data analytics system for forecasting significant societal events, such as civil unrest, on the basis of continuous, automated analysis of large volumes of publicly available data. It has been operational since November 2012 and delivers approximately 50 predictions each day for the countries of Latin America. EMBERS is built on a streaming, scalable, loosely coupled, shared-nothing architecture, using ZeroMQ as its messaging backbone and JSON as its wire data format. It is deployed on Amazon Web Services using an entirely automated deployment process. We describe the architecture of the system, some of the design tradeoffs encountered during development, and specifics of the machine learning models underlying EMBERS. We also present a detailed prospective evaluation of EMBERS in forecasting significant societal events over the past 2 years. PMID:25553271

  38. Forecasting Significant Societal Events Using the EMBERS Streaming Predictive Analytics System.

    PubMed

    Doyle, Andy; Katz, Graham; Summers, Kristen; Ackermann, Chris; Zavorin, Ilya; Lim, Zunsik; Muthiah, Sathappan; Butler, Patrick; Self, Nathan; Zhao, Liang; Lu, Chang-Tien; Khandpur, Rupinder Paul; Fayed, Youssef; Ramakrishnan, Naren

    2014-12-01

    Developed under the Intelligence Advanced Research Projects Activity Open Source Indicators program, Early Model Based Event Recognition using Surrogates (EMBERS) is a large-scale big-data analytics system for forecasting significant societal events, such as civil unrest, on the basis of continuous, automated analysis of large volumes of publicly available data. It has been operational since November 2012 and delivers approximately 50 predictions each day for the countries of Latin America. EMBERS is built on a streaming, scalable, loosely coupled, shared-nothing architecture, using ZeroMQ as its messaging backbone and JSON as its wire data format. It is deployed on Amazon Web Services using an entirely automated deployment process. We describe the architecture of the system, some of the design tradeoffs encountered during development, and specifics of the machine learning models underlying EMBERS. We also present a detailed prospective evaluation of EMBERS in forecasting significant societal events over the past 2 years.
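
    The messaging layer described here, ZeroMQ with JSON on the wire, is easy to picture with a few lines of pyzmq. A minimal stand-in (the socket address and message fields are invented for illustration and are not the EMBERS schema):

    ```python
    import json
    import zmq

    # Publisher side: push JSON warnings over a ZeroMQ PUB socket.
    ctx = zmq.Context()
    pub = ctx.socket(zmq.PUB)
    pub.bind("tcp://127.0.0.1:5556")

    warning = {
        "event_type": "civil_unrest",   # hypothetical field names
        "country": "BR",
        "event_date": "2014-06-15",
        "confidence": 0.72,
    }
    pub.send_string(json.dumps(warning))

    # A subscriber elsewhere would connect and filter:
    # sub = ctx.socket(zmq.SUB); sub.connect("tcp://127.0.0.1:5556")
    # sub.setsockopt_string(zmq.SUBSCRIBE, "")
    # msg = json.loads(sub.recv_string())
    ```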

  39. Optimization and Verification of Droplet Digital PCR Event-Specific Methods for the Quantification of GM Maize DAS1507 and NK603.

    PubMed

    Grelewska-Nowotko, Katarzyna; Żurawska-Zajfert, Magdalena; Żmijewska, Ewelina; Sowa, Sławomir

    2018-05-01

    In recent years, digital polymerase chain reaction (dPCR), a new molecular biology technique, has been gaining popularity. Among many other applications, it can be used for the detection and quantification of genetically modified organisms (GMOs) in food and feed. It might replace the currently widely used real-time PCR method (qPCR) by overcoming problems related to PCR inhibition and the requirement for certified reference materials as calibrants. In theory, validated qPCR methods can easily be transferred to the dPCR platform; however, optimization of the PCR conditions might be necessary. In this study, we report the transfer of two validated qPCR methods for quantification of the maize events DAS1507 and NK603 to the droplet dPCR (ddPCR) platform. After some optimization, both methods were verified according to the guidance of the European Network of GMO Laboratories (ENGL) on analytical method verification (ENGL Working Group on Method Verification, 2011: Verification of Analytical Methods for GMO Testing When Implementing Interlaboratory Validated Methods). The ddPCR methods performed as well as or better than the qPCR methods, confirming their suitability for GMO determination in food and feed.
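
    What makes ddPCR calibrant-free is Poisson partition statistics: the fraction of positive droplets alone yields an absolute concentration. A minimal sketch (the droplet counts are invented; 0.85 nL is a commonly cited nominal droplet volume, and the true value is instrument-specific):

    ```python
    import numpy as np

    def copies_per_ul(positive, total, droplet_volume_nl=0.85):
        """Target concentration (copies/uL) from droplet counts via
        Poisson statistics: lambda = -ln(1 - p) targets per droplet."""
        p = positive / total
        lam = -np.log(1.0 - p)
        return lam / (droplet_volume_nl * 1e-3)   # nL -> uL

    # Illustrative counts for event-specific and endogenous assays:
    gm = copies_per_ul(1200, 15000)
    ref = copies_per_ul(14000, 15000)
    print(f"GM/reference copy ratio: {gm / ref:.3%}")
    ```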

  20. A High Throughput Method for Measuring Polycyclic Aromatic Hydrocarbons in Seafood Using QuEChERS Extraction and SBSE.

    PubMed

    Pfannkoch, Edward A; Stuff, John R; Whitecavage, Jacqueline A; Blevins, John M; Seely, Kathryn A; Moran, Jeffery H

    2015-01-01

    National Oceanic and Atmospheric Administration (NOAA) Method NMFS-NWFSC-59 2004 is currently used to quantitatively analyze seafood for polycyclic aromatic hydrocarbon (PAH) contamination, especially following events such as the Deepwater Horizon oil rig explosion that released millions of barrels of crude oil into the Gulf of Mexico. This method has limited throughput capacity; hence, alternative methods are necessary to meet analytical demands after such events. Stir bar sorptive extraction (SBSE) is an effective technique to extract trace PAHs in water, and the quick, easy, cheap, effective, rugged, and safe (QuEChERS) extraction strategy effectively extracts PAHs from complex food matrices. This study uses SBSE to concentrate PAHs and eliminate matrix interference from QuEChERS extracts of seafood, specifically oysters, fish, and shrimp. This method provides acceptable recovery (65-138%) and linear calibrations, is sensitive (LOD = 0.02 ppb, LOQ = 0.06 ppb), and provides higher throughput while maintaining equivalency with the NOAA 2004 method, as determined by analysis of NIST SRM 1974b mussel tissue.

  1. Relative frequencies of constrained events in stochastic processes: An analytical approach.

    PubMed

    Rusconi, S; Akhmatskaya, E; Sokolovski, D; Ballard, N; de la Cal, J C

    2015-10-01

    The stochastic simulation algorithm (SSA) and the corresponding Monte Carlo (MC) method are among the most common approaches for studying stochastic processes. They rely on knowledge of interevent probability density functions (PDFs) and on information about dependencies between all possible events. In many real-life applications, analytical representations of a PDF are difficult to specify in advance. When the shapes of the PDFs are known, different optimization schemes can be applied to experimental data in order to evaluate the probability density functions and, therefore, the properties of the studied system. Such methods, however, are computationally demanding and often not feasible. We show that, in the case where experimentally accessed properties are directly related to the frequencies of the events involved, it may be possible to replace the heavy Monte Carlo core of optimization schemes with an analytical solution. Such a replacement not only provides a more accurate estimation of the properties of the process but also reduces the simulation time by a factor of the order of the sample size (at least ≈10^4). The proposed analytical approach is valid for any choice of PDF. The accuracy, computational efficiency, and advantages of the method over MC procedures are demonstrated in an exactly solvable case and in the evaluation of branching fractions in controlled radical polymerization (CRP) of acrylic monomers. This polymerization can be modeled as a constrained stochastic process. Constrained systems are quite common, which makes the method useful for various applications.
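
    The Monte Carlo core that the paper replaces can be made concrete with a toy sketch: sample interevent waiting times, let the event with the shortest time win, and estimate relative frequencies from counts. The exponential rates below are arbitrary placeholders; for competing exponentials the analytical answer is simply rate_i / sum(rates), which is the kind of closed-form shortcut the abstract advocates.

        import random

        def mc_event_frequencies(rates, n_samples=10_000):
            # Gillespie-style competition: the event with the smallest
            # sampled waiting time occurs; frequencies are empirical counts.
            counts = [0] * len(rates)
            for _ in range(n_samples):
                waits = [random.expovariate(r) for r in rates]
                counts[waits.index(min(waits))] += 1
            return [c / n_samples for c in counts]

        print(mc_event_frequencies([1.0, 2.0, 3.0]))  # ~[1/6, 2/6, 3/6]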

  2. Embedded security system for multi-modal surveillance in a railway carriage

    NASA Astrophysics Data System (ADS)

    Zouaoui, Rhalem; Audigier, Romaric; Ambellouis, Sébastien; Capman, François; Benhadda, Hamid; Joudrier, Stéphanie; Sodoyer, David; Lamarque, Thierry

    2015-10-01

    Public transport security is one of the main priorities of the public authorities when fighting against crime and terrorism. In this context, there is a great demand for autonomous systems able to detect abnormal events such as violent acts aboard passenger cars and intrusions when the train is parked at the depot. To this end, we present an innovative approach which aims at providing efficient automatic event detection by fusing video and audio analytics and reducing the false alarm rate compared to classical stand-alone video detection. The multi-modal system is composed of two microphones and one camera and integrates onboard video and audio analytics and fusion capabilities. On the one hand, for detecting intrusion, the system relies on the fusion of "unusual" audio event detection with intrusion detections from video processing. The audio analysis consists of modeling the normal ambience and detecting deviation from the trained models during testing. This unsupervised approach is based on clustering of automatically extracted segments of acoustic features and statistical Gaussian Mixture Model (GMM) modeling of each cluster. The intrusion detection is based on the three-dimensional (3D) detection and tracking of individuals in the videos. On the other hand, for violent event detection, the system fuses unsupervised and supervised audio algorithms with video event detection. The supervised audio technique detects specific events such as shouts. A GMM is used to capture the formant structure of a shout signal. Video analytics use an original approach for detecting aggressive motion by focusing on erratic motion patterns specific to violent events. As data with violent events are not easily available, a normality model with structured motions from non-violent videos is learned for one-class classification. A fusion algorithm based on Dempster-Shafer theory analyses the asynchronous detection outputs and computes the degree of belief of each probable event.
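
    The fusion step rests on Dempster's rule of combination. A minimal sketch over a two-hypothesis frame of discernment {violent, normal} is given below; the mass assignments are invented for illustration and are not the system's calibrated detector outputs.

        from itertools import product

        def dempster_combine(m1, m2):
            # Mass functions are dicts over frozensets of hypotheses.
            combined, conflict = {}, 0.0
            for (a, ma), (b, mb) in product(m1.items(), m2.items()):
                inter = a & b
                if inter:
                    combined[inter] = combined.get(inter, 0.0) + ma * mb
                else:
                    conflict += ma * mb
            # Normalize by the non-conflicting mass (Dempster's rule).
            return {k: v / (1.0 - conflict) for k, v in combined.items()}

        frame = frozenset({"violent", "normal"})
        audio = {frozenset({"violent"}): 0.6, frame: 0.4}  # illustrative masses
        video = {frozenset({"violent"}): 0.5, frame: 0.5}  # illustrative masses
        print(dempster_combine(audio, video))  # belief concentrates on "violent"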

  3. Peptide interfaces with graphene: an emerging intersection of analytical chemistry, theory, and materials.

    PubMed

    Russell, Shane R; Claridge, Shelley A

    2016-04-01

    Because noncovalent interface functionalization is frequently required in graphene-based devices, biomolecular self-assembly has begun to emerge as a route for controlling substrate electronic structure or binding specificity for soluble analytes. The remarkable diversity of structures that arise in biological self-assembly hints at the possibility of equally diverse and well-controlled surface chemistry at graphene interfaces. However, predicting and analyzing adsorbed monolayer structures at such interfaces raises substantial experimental and theoretical challenges. In contrast with the relatively well-developed monolayer chemistry and characterization methods applied at coinage metal surfaces, monolayers on graphene are both less robust and more structurally complex, levying more stringent requirements on characterization techniques. Theory presents opportunities to understand early binding events that lay the groundwork for full monolayer structure. However, predicting interactions between complex biomolecules, solvent, and substrate necessitates a suite of new force fields and algorithms to assess likely binding configurations, solvent effects, and modulations to substrate electronic properties. This article briefly discusses emerging analytical and theoretical methods used to develop a rigorous chemical understanding of the self-assembly of peptide-graphene interfaces and prospects for future advances in the field.

  4. 40 CFR 141.402 - Ground water source microbial monitoring and analytical methods.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... for the presence of E. coli, enterococci, or coliphage: Analytical Methods for Source Water Monitoring... Microbiology, 62:3881-3884. 10 EPA Method 1601: Male-specific (F+) and Somatic Coliphage in Water by Two-step... 20460. 11 EPA Method 1602: Male-specific (F+) and Somatic Coliphage in Water by Single Agar Layer (SAL...

  5. 40 CFR 141.402 - Ground water source microbial monitoring and analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... for the presence of E. coli, enterococci, or coliphage: Analytical Methods for Source Water Monitoring... Microbiology, 62:3881-3884. 10 EPA Method 1601: Male-specific (F+) and Somatic Coliphage in Water by Two-step... 20460. 11 EPA Method 1602: Male-specific (F+) and Somatic Coliphage in Water by Single Agar Layer (SAL...

  6. Complementary Expertise in a Zoo Educator Professional Development Event Contributes to the Construction of Understandings of Affective Transformation

    ERIC Educational Resources Information Center

    Kelly, Lisa-Anne DeGregoria; Kassing, Sharon

    2013-01-01

    Cultural Historical Activity Theory served as the analytical framework for the study of a professional development event for a zoo's education department, specifically designed to build understandings of "Affective Transformation," an element pertinent to the organization's strategic plan. Three key products--an Affective…

  7. Social Web mining and exploitation for serious applications: Technosocial Predictive Analytics and related technologies for public health, environmental and national security surveillance.

    PubMed

    Kamel Boulos, Maged N; Sanfilippo, Antonio P; Corley, Courtney D; Wheeler, Steve

    2010-10-01

    This paper explores Technosocial Predictive Analytics (TPA) and related methods for Web "data mining" where users' posts and queries are garnered from Social Web ("Web 2.0") tools such as blogs, micro-blogging and social networking sites to form coherent representations of real-time health events. The paper includes a brief introduction to commonly used Social Web tools such as mashups and aggregators, and maps their exponential growth as an open architecture of participation for the masses and an emerging way to gain insight into the collective health status of whole populations. Several health-related tool examples are described and demonstrated as practical means through which health professionals might create clear location specific pictures of epidemiological data such as flu outbreaks. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.

  8. Cross-Disciplinary Consultancy to Bridge Public Health Technical Needs and Analytic Developers: Asyndromic Surveillance Use Case.

    PubMed

    Faigen, Zachary; Deyneka, Lana; Ising, Amy; Neill, Daniel; Conway, Mike; Fairchild, Geoffrey; Gunn, Julia; Swenson, David; Painter, Ian; Johnson, Lauren; Kiley, Chris; Streichert, Laura; Burkom, Howard

    2015-01-01

    We document a funded effort to bridge the gap between constrained scientific challenges of public health surveillance and methodologies from academia and industry. Component tasks are the collection of epidemiologists' use case problems, multidisciplinary consultancies to refine them, and dissemination of problem requirements and shareable datasets. We describe an initial use case and consultancy as a concrete example and challenge to developers. Supported by the Defense Threat Reduction Agency Biosurveillance Ecosystem project, the International Society for Disease Surveillance formed an advisory group to select tractable use case problems and convene inter-disciplinary consultancies to translate analytic needs into well-defined problems and to promote development of applicable solution methods. The initial consultancy's focus was a problem originated by the North Carolina Department of Health and its NC DETECT surveillance system: Derive a method for detection of patient record clusters worthy of follow-up based on free-text chief complaints and without syndromic classification. Direct communication between public health problem owners and analytic developers was informative to both groups and constructive for the solution development process. The consultancy achieved refinement of the asyndromic detection challenge and of solution requirements. Participants summarized and evaluated solution approaches and discussed dissemination and collaboration strategies. A solution meeting the specification of the use case described above could improve human monitoring efficiency with expedited warning of events requiring follow-up, including otherwise overlooked events with no syndromic indicators. This approach can remove obstacles to collaboration with efficient, minimal data-sharing and without costly overhead.

  9. Molecular toolbox for the identification of unknown genetically modified organisms.

    PubMed

    Ruttink, Tom; Demeyer, Rolinde; Van Gulck, Elke; Van Droogenbroeck, Bart; Querci, Maddalena; Taverniers, Isabel; De Loose, Marc

    2010-03-01

    Competent laboratories monitor genetically modified organisms (GMOs) and products derived thereof in the food and feed chain in the framework of labeling and traceability legislation. In addition, screening is performed to detect the unauthorized presence of GMOs including asynchronously authorized GMOs or GMOs that are not officially registered for commercialization (unknown GMOs). Currently, unauthorized or unknown events are detected by screening blind samples for commonly used transgenic elements, such as p35S or t-nos. If (1) positive detection of such screening elements shows the presence of transgenic material and (2) all known GMOs are tested by event-specific methods but are not detected, then the presence of an unknown GMO is inferred. However, such evidence is indirect because it is based on negative observations and inconclusive because the procedure does not identify the causative event per se. In addition, detection of unknown events is hampered in products that also contain known authorized events. Here, we outline alternative approaches for analytical detection and GMO identification and develop new methods to complement the existing routine screening procedure. We developed a fluorescent anchor-polymerase chain reaction (PCR) method for the identification of the sequences flanking the p35S and t-nos screening elements. Thus, anchor-PCR fingerprinting allows the detection of unique discriminative signals per event. In addition, we established a collection of in silico calculated fingerprints of known events to support interpretation of experimentally generated anchor-PCR GM fingerprints of blind samples. Here, we first describe the molecular characterization of a novel GMO, which expresses recombinant human intrinsic factor in Arabidopsis thaliana. Next, we purposefully treated the novel GMO as a blind sample to simulate how the new methods lead to the molecular identification of a novel unknown event without prior knowledge of its transgene sequence. The results demonstrate that the new methods complement routine screening procedures by providing direct conclusive evidence and may also be useful to resolve masking of unknown events by known events.

  10. PESTICIDE ANALYTICAL METHODS TO SUPPORT DUPLICATE-DIET HUMAN EXPOSURE MEASUREMENTS

    EPA Science Inventory

    Historically, analytical methods for determination of pesticides in foods have been developed in support of regulatory programs and are specific to food items or food groups. Most of the available methods have been developed, tested and validated for relatively few analytes an...

  11. Detection and quantification of genetically modified organisms using very short, locked nucleic acid TaqMan probes.

    PubMed

    Salvi, Sergio; D'Orso, Fabio; Morelli, Giorgio

    2008-06-25

    Many countries have introduced mandatory labeling requirements on foods derived from genetically modified organisms (GMOs). Real-time quantitative polymerase chain reaction (PCR) based upon the TaqMan probe chemistry has become the method mostly used to support these regulations; moreover, event-specific PCR is the preferred method in GMO detection because of its high specificity based on the flanking sequence of the exogenous integrant. The aim of this study was to evaluate the use of very short (eight-nucleotide long), locked nucleic acid (LNA) TaqMan probes in 5'-nuclease PCR assays for the detection and quantification of GMOs. Classic TaqMan and LNA TaqMan probes were compared for the analysis of the maize MON810 transgene. The performance of the two types of probes was tested on the maize endogenous reference gene hmga, the CaMV 35S promoter, and the hsp70/cryIA(b) construct as well as for the event-specific 5'-integration junction of MON810, using plasmids as standard reference molecules. The results of our study demonstrate that the LNA 5'-nuclease PCR assays represent a valid and reliable analytical system for the detection and quantification of transgenes. Application of very short LNA TaqMan probes to GMO quantification can simplify the design of 5'-nuclease assays.

  12. IMMUNOCHEMICAL APPLICATIONS IN ENVIRONMENTAL SCIENCE

    EPA Science Inventory

    Immunochemical methods are based on selective antibodies combining with a particular target analyte or analyte group. The specific binding between antibody and analyte can be used to detect environmental contaminants in a variety of sample matrixes. Immunoassay methods provide ...

  13. SociAL Sensor Analytics: Measuring Phenomenology at Scale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Corley, Courtney D.; Dowling, Chase P.; Rose, Stuart J.

    The objective of this paper is to present a system for interrogating immense social media streams through analytical methodologies that characterize topics and events critical to tactical and strategic planning. First, we propose a conceptual framework for interpreting social media as a sensor network. Time-series models and topic clustering algorithms are used to implement this concept into a functioning analytical system. Next, we address two scientific challenges: 1) to understand, quantify, and baseline phenomenology of social media at scale, and 2) to develop analytical methodologies to detect and investigate events of interest. This paper then documents computational methods and reports experimental findings that address these challenges. Ultimately, the ability to process billions of social media posts per week over a period of years enables the identification of patterns and predictors of tactical and strategic concerns at an unprecedented rate through SociAL Sensor Analytics (SALSA).

  14. A field study of selected U.S. Geological Survey analytical methods for measuring pesticides in filtered stream water, June - September 2012

    USGS Publications Warehouse

    Martin, Jeffrey D.; Norman, Julia E.; Sandstrom, Mark W.; Rose, Claire E.

    2017-09-06

    U.S. Geological Survey monitoring programs extensively used two analytical methods, gas chromatography/mass spectrometry and liquid chromatography/mass spectrometry, to measure pesticides in filtered water samples during 1992–2012. In October 2012, the monitoring programs began using direct aqueous-injection liquid chromatography tandem mass spectrometry as a new analytical method for pesticides. The change in analytical methods, however, has the potential to inadvertently introduce bias in analysis of datasets that span the change. A field study was designed to document performance of the new method in a variety of stream-water matrices and to quantify any potential changes in measurement bias or variability that could be attributed to changes in analytical methods. The goals of the field study were to (1) summarize performance (bias and variability of pesticide recovery) of the new method in a variety of stream-water matrices; (2) compare performance of the new method in laboratory blank water (laboratory reagent spikes) to that in a variety of stream-water matrices; (3) compare performance (analytical recovery) of the new method to that of the old methods in a variety of stream-water matrices; (4) compare pesticide detections and concentrations measured by the new method to those of the old methods in a variety of stream-water matrices; (5) compare contamination measured by field blank water samples in old and new methods; (6) summarize the variability of pesticide detections and concentrations measured by the new method in field duplicate water samples; and (7) identify matrix characteristics of environmental water samples that adversely influence the performance of the new method. Stream-water samples and a variety of field quality-control samples were collected at 48 sites in the U.S. Geological Survey monitoring networks during June–September 2012. Stream sites were located across the United States and included sites in agricultural and urban land-use settings, as well as sites on major rivers. The results of the field study identified several challenges for the analysis and interpretation of data analyzed by both old and new methods, particularly when data span the change in methods and are combined for analysis of temporal trends in water quality. The main challenges identified are large (greater than 30 percent), statistically significant differences in analytical recovery, detection capability, and (or) measured concentrations for selected pesticides. These challenges are documented and discussed, but specific guidance or statistical methods to resolve these differences in methods are beyond the scope of the report. The results of the field study indicate that the implications of the change in analytical methods must be assessed individually for each pesticide and method. Understanding the possible causes of the systematic differences in concentrations between methods that remain after recovery adjustment might be necessary to determine how to account for the differences in data analysis. Because recoveries for each method are independently determined from separate reference standards and spiking solutions, the differences might be due to an error in one of the reference standards or solutions or some other basic aspect of standard procedure in the analytical process. Further investigation of the possible causes is needed, which will lead to specific decisions on how to compensate for these differences in concentrations in data analysis.
In the event that further investigations do not provide insight into the causes of systematic differences in concentrations between methods, the authors recommend continuing to collect and analyze paired environmental water samples by both old and new methods. This effort should be targeted to seasons, sites, and expected concentrations to supplement those concentrations already assessed and to compare the ongoing analytical recovery of old and new methods to those observed in the summer and fall of 2012.
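
    A hedged sketch of the core comparison described above: percent recovery of a spiked pesticide under the old and new methods, with the difference flagged when it exceeds the 30 percent threshold the report cites. All numbers are placeholders, not USGS data.

        def percent_recovery(measured, spiked):
            return 100.0 * measured / spiked

        def flag_method_difference(rec_old, rec_new, threshold_pct=30.0):
            # Relative difference in mean recovery, new method vs. old.
            mean_old = sum(rec_old) / len(rec_old)
            mean_new = sum(rec_new) / len(rec_new)
            diff = 100.0 * (mean_new - mean_old) / mean_old
            return diff, abs(diff) > threshold_pct

        # Placeholder recoveries from paired 1.0 ug/L spikes:
        old = [percent_recovery(m, 1.0) for m in (0.82, 0.78, 0.85)]
        new = [percent_recovery(m, 1.0) for m in (1.10, 1.04, 1.12)]
        print(flag_method_difference(old, new))  # ~33 percent -> flagged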

  15. 42 CFR 493.845 - Standard; Toxicology.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... acceptable responses for each analyte in each testing event is unsatisfactory analyte performance for the... testing event. (e)(1) For any unsatisfactory analyte or test performance or testing event for reasons... any unacceptable analyte or testing event score, remedial action must be taken and documented, and the...

  16. 42 CFR 493.851 - Standard; Hematology.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... acceptable responses for each analyte in each testing event is unsatisfactory analyte performance for the... testing event. (e)(1) For any unsatisfactory analyte or test performance or testing event for reasons... any unacceptable analyte or testing event score, remedial action must be taken and documented, and the...

  17. 42 CFR 493.843 - Standard; Endocrinology.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... acceptable responses for each analyte in each testing event is unsatisfactory analyte performance for the... testing event. (e)(1) For any unsatisfactory analyte or test performance or testing event for reasons... any unacceptable analyte or testing event score, remedial action must be taken and documented, and the...

  18. 42 CFR 493.845 - Standard; Toxicology.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... acceptable responses for each analyte in each testing event is unsatisfactory analyte performance for the... testing event. (e)(1) For any unsatisfactory analyte or test performance or testing event for reasons... any unacceptable analyte or testing event score, remedial action must be taken and documented, and the...

  19. 42 CFR 493.845 - Standard; Toxicology.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... acceptable responses for each analyte in each testing event is unsatisfactory analyte performance for the... testing event. (e)(1) For any unsatisfactory analyte or test performance or testing event for reasons... any unacceptable analyte or testing event score, remedial action must be taken and documented, and the...

  20. 42 CFR 493.851 - Standard; Hematology.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... acceptable responses for each analyte in each testing event is unsatisfactory analyte performance for the... testing event. (e)(1) For any unsatisfactory analyte or test performance or testing event for reasons... any unacceptable analyte or testing event score, remedial action must be taken and documented, and the...

  1. 42 CFR 493.843 - Standard; Endocrinology.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... acceptable responses for each analyte in each testing event is unsatisfactory analyte performance for the... testing event. (e)(1) For any unsatisfactory analyte or test performance or testing event for reasons... any unacceptable analyte or testing event score, remedial action must be taken and documented, and the...

  2. 42 CFR 493.843 - Standard; Endocrinology.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... acceptable responses for each analyte in each testing event is unsatisfactory analyte performance for the... testing event. (e)(1) For any unsatisfactory analyte or test performance or testing event for reasons... any unacceptable analyte or testing event score, remedial action must be taken and documented, and the...

  3. 42 CFR 493.851 - Standard; Hematology.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... acceptable responses for each analyte in each testing event is unsatisfactory analyte performance for the... testing event. (e)(1) For any unsatisfactory analyte or test performance or testing event for reasons... any unacceptable analyte or testing event score, remedial action must be taken and documented, and the...

  4. Modeling Heterogeneity in Momentary Interpersonal and Affective Dynamic Processes in Borderline Personality Disorder

    PubMed Central

    Wright, Aidan G. C.; Hallquist, Michael N.; Stepp, Stephanie D.; Scott, Lori N.; Beeney, Joseph E.; Lazarus, Sophie A.; Pilkonis, Paul A.

    2016-01-01

    Borderline personality disorder (BPD) is a diagnosis defined by impairments in several dynamic processes (e.g., interpersonal relating, affect regulation, behavioral control). Theories of BPD emphasize that these impairments appear in specific contexts, and emerging results confirm this view. At the same time, BPD is a complex construct that encompasses individuals with heterogeneous pathology. These features (dynamic processes, situational specificity, and individual heterogeneity) pose significant assessment challenges. In the current study, we demonstrate assessment and analytic methods that capture both between-person differences and within-person changes over time. Twenty-five participants diagnosed with BPD completed event-contingent, ambulatory assessment protocols over 21 days. We used p-technique factor analyses to identify person-specific psychological structures consistent with clinical theories of personality. Five exemplar cases are selected and presented in detail to showcase the potential utility of these methods. The presented cases' factor structures not only reflect heterogeneity but also suggest points of convergence. The factors also demonstrated significant associations with important clinical targets (self-harm, interpersonal violence). PMID:27317561

  5. Development and validation of a multiplex real-time PCR method to simultaneously detect 47 targets for the identification of genetically modified organisms.

    PubMed

    Cottenet, Geoffrey; Blancpain, Carine; Sonnard, Véronique; Chuah, Poh Fong

    2013-08-01

    Considering the increase of the total cultivated land area dedicated to genetically modified organisms (GMOs), consumers' perception of GMOs and the need to comply with various local GMO legislations, efficient and accurate analytical methods are needed for their detection and identification. Considered the gold standard for GMO analysis, the real-time polymerase chain reaction (RTi-PCR) technology was optimised to produce a high-throughput GMO screening method. Based on 24 simultaneous multiplex RTi-PCR assays running on a ready-to-use 384-well plate, this new procedure allows the detection and identification of 47 targets on seven samples in duplicate. To comply with GMO analytical quality requirements, a negative and a positive control were analysed in parallel. In addition, an internal positive control was also included in each reaction well for the detection of potential PCR inhibition. Tested on non-GM materials, on different GM events and on proficiency test samples, the method offered high specificity and sensitivity with an absolute limit of detection between 1 and 16 copies depending on the target. Easy to use, fast and cost efficient, this multiplex approach fits the purpose of GMO testing laboratories.

  6. CREME96 and Related Error Rate Prediction Methods

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.

    2012-01-01

    Predicting the rate of occurrence of single event effects (SEEs) in space requires knowledge of the radiation environment and the response of electronic devices to that environment. Several analytical models have been developed over the past 36 years to predict SEE rates. The first error rate calculations were performed by Binder, Smith and Holman. Bradford, and Pickel and Blandford in their CRIER (Cosmic-Ray-Induced-Error-Rate) analysis code, introduced the basic rectangular parallelepiped (RPP) method for error rate calculations. For the radiation environment at the part, both made use of the cosmic ray LET (Linear Energy Transfer) spectra calculated by Heinrich for various absorber depths. A more detailed model for the space radiation environment within spacecraft was developed by Adams and co-workers. This model, together with a reformulation of the RPP method published by Pickel and Blandford, was used to create the CREME (Cosmic Ray Effects on Micro-Electronics) code. About the same time Shapiro wrote the CRUP (Cosmic Ray Upset Program) based on the RPP method published by Bradford. It was the first code to specifically take into account charge collection from outside the depletion region due to deformation of the electric field caused by the incident cosmic ray. Other early rate prediction methods and codes include the Single Event Figure of Merit, NOVICE, the Space Radiation code, and the effective flux method of Binder, which is the basis of the SEFA (Scott Effective Flux Approximation) model. By the early 1990s it was becoming clear that CREME and the other early models needed revision. This revision, CREME96, was completed and released as a WWW-based tool, one of the first of its kind. The revisions in CREME96 included improved environmental models and improved models for calculating single event effects. The need for a revision of CREME also stimulated the development of the CHIME (CRRES/SPACERAD Heavy Ion Model of the Environment) and MACREE (Modeling and Analysis of Cosmic Ray Effects in Electronics). The Single Event Figure of Merit method was also revised to use the solar minimum galactic cosmic ray spectrum and extended to circular orbits down to 200 km at any inclination. More recently a series of commercial codes was developed by TRAD (Test & Radiations), which includes the OMERE code for calculating single event effects. There are other error rate prediction methods which use Monte Carlo techniques. In this chapter the analytic methods for estimating the environment within spacecraft will be discussed.

  7. The spatial distribution patterns of condensed phase post-blast explosive residues formed during detonation.

    PubMed

    Abdul-Karim, Nadia; Blackman, Christopher S; Gill, Philip P; Karu, Kersti

    2016-10-05

    The continued usage of explosive devices, as well as the ever-growing threat of 'dirty' bombs, necessitates a comprehensive understanding of particle dispersal during detonation events in order to develop effectual methods for targeting explosive and/or additive remediation efforts. Herein, the distribution of explosive analytes from controlled detonations of aluminised ammonium nitrate and an RDX-based explosive composition was established by systematically sampling sites positioned around each firing. This is the first experimental study to produce evidence that the post-blast residue mass can distribute according to an approximate inverse-square law model, while also demonstrating for the first time that distribution trends can vary depending on individual analytes. Furthermore, by incorporating blast-wave overpressure measurements, high-speed imaging for fireball volume recordings, and monitoring of environmental conditions, it was determined that the principal factor affecting all analyte dispersals was the wind direction, with other factors affecting specific analytes to varying degrees. The dispersal mechanism for explosive residue is primarily the smoke cloud, a finding which in itself has wider impacts on the environment and fundamental detonation theory. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
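
    The inverse-square finding amounts to fitting residue mass m(r) = k / r^2 to samples at distance r. Below is a least-squares sketch in log space, where an inverse-square law shows up as a slope near -2; the sample values are invented to follow the trend and are not taken from the paper.

        import math

        def fit_power_law(distances, masses):
            # Fit log(m) = log(k) + n*log(r); inverse-square gives n ~ -2.
            xs = [math.log(r) for r in distances]
            ys = [math.log(m) for m in masses]
            n = len(xs)
            mean_x, mean_y = sum(xs) / n, sum(ys) / n
            slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
                     / sum((x - mean_x) ** 2 for x in xs))
            intercept = mean_y - slope * mean_x
            return slope, math.exp(intercept)  # exponent n and prefactor k

        r = [1.0, 2.0, 4.0, 8.0]       # sampling distances (placeholder)
        m = [98.0, 26.0, 6.1, 1.6]     # recovered residue mass (placeholder)
        print(fit_power_law(r, m))     # exponent close to -2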

  8. Social Web mining and exploitation for serious applications: Technosocial Predictive Analytics and related technologies for public health, environmental and national security surveillance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamel Boulos, Maged; Sanfilippo, Antonio P.; Corley, Courtney D.

    2010-03-17

    This paper explores techno-social predictive analytics (TPA) and related methods for Web “data mining” where users’ posts and queries are garnered from Social Web (“Web 2.0”) tools such as blogs, microblogging and social networking sites to form coherent representations of real-time health events. The paper includes a brief introduction to commonly used Social Web tools such as mashups and aggregators, and maps their exponential growth as an open architecture of participation for the masses and an emerging way to gain insight into the collective health status of whole populations. Several health-related tool examples are described and demonstrated as practical means through which health professionals might create clear location specific pictures of epidemiological data such as flu outbreaks.

  9. Quantitative determination of carcinogenic mycotoxins in human and animal biological matrices and animal-derived foods using multi-mycotoxin and analyte-specific high performance liquid chromatography-tandem mass spectrometric methods.

    PubMed

    Cao, Xiaoqin; Li, Xiaofei; Li, Jian; Niu, Yunhui; Shi, Lu; Fang, Zhenfeng; Zhang, Tao; Ding, Hong

    2018-01-15

    A sensitive and reliable multi-mycotoxin-based method was developed to identify and quantify several carcinogenic mycotoxins in human blood and urine, as well as edible animal tissues, including muscle and liver tissue from swine and chickens, using liquid chromatography-tandem mass spectrometry (LC-MS/MS). For the toxicokinetic studies with individual mycotoxins, highly sensitive analyte-specific LC-MS/MS methods were developed for rat plasma and urine. Sample purification consisted of a rapid 'dilute and shoot' approach in urine samples, a simple 'dilute, evaporate and shoot' approach in plasma samples and a 'QuEChERS' procedure in edible animal tissues. The multi-mycotoxin and analyte-specific methods were validated in-house: the limits of detection (LOD) for the multi-mycotoxin and analyte-specific methods ranged from 0.02 to 0.41 μg/kg (μg/L) and 0.01 to 0.19 μg/L, respectively, and the limits of quantification (LOQ) ranged from 0.10 to 1.02 μg/kg (μg/L) and 0.09 to 0.47 μg/L, respectively. Apparent recoveries of the samples spiked with 0.25 to 4 μg/kg (μg/L) ranged from 60.1% to 109.8% with relative standard deviations below 15%. The methods were successfully applied to real samples. To the best of our knowledge, this is the first study carried out using a small group of patients from the Chinese population with hepatocellular carcinoma to assess their exposure to carcinogenic mycotoxins using biomarkers. Finally, the multi-mycotoxin method is a useful analytical method for assessing exposure to mycotoxins in edible animal tissues. The analyte-specific methods could be useful during toxicokinetic and toxicological studies. Copyright © 2017. Published by Elsevier B.V.
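
    LOD and LOQ figures like those above are commonly derived from calibration statistics; one widespread convention (ICH-style 3.3*sigma/S and 10*sigma/S, an assumption here since the paper may instead have used signal-to-noise) can be sketched as follows. The calibration levels and responses are placeholders.

        def lod_loq_from_calibration(concentrations, responses):
            # Least-squares slope and residual standard deviation, then
            # LOD = 3.3*sigma/slope and LOQ = 10*sigma/slope (ICH Q2 style).
            n = len(concentrations)
            mean_x = sum(concentrations) / n
            mean_y = sum(responses) / n
            sxx = sum((x - mean_x) ** 2 for x in concentrations)
            slope = sum((x - mean_x) * (y - mean_y)
                        for x, y in zip(concentrations, responses)) / sxx
            intercept = mean_y - slope * mean_x
            resid = [y - (slope * x + intercept)
                     for x, y in zip(concentrations, responses)]
            sigma = (sum(e * e for e in resid) / (n - 2)) ** 0.5
            return 3.3 * sigma / slope, 10.0 * sigma / slope

        # Placeholder calibration (ug/kg vs. instrument response):
        print(lod_loq_from_calibration([0.1, 0.5, 1.0, 2.0, 4.0],
                                       [11.0, 52.0, 99.0, 205.0, 398.0]))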

  10. Holistic rubric vs. analytic rubric for measuring clinical performance levels in medical students.

    PubMed

    Yune, So Jung; Lee, Sang Yeoup; Im, Sun Ju; Kam, Bee Sung; Baek, Sun Yong

    2018-06-05

    Task-specific checklists, holistic rubrics, and analytic rubrics are often used for performance assessments. We examined what factors evaluators consider important in holistic scoring of clinical performance assessment, and compared the usefulness of holistic and analytic rubrics applied separately, as well as of analytic rubrics used in addition to task-specific checklists based on traditional standards. We compared the usefulness of a holistic rubric versus an analytic rubric in effectively measuring the clinical skill performances of 126 third-year medical students who participated in a clinical performance assessment conducted by Pusan National University School of Medicine. We conducted a questionnaire survey of 37 evaluators who used all three evaluation methods (holistic rubric, analytic rubric, and task-specific checklist) for each student. The relationship between the scores on the three evaluation methods was analyzed using Pearson's correlation. Inter-rater agreement was analyzed by the Kappa index. The effect of holistic and analytic rubric scores on the task-specific checklist score was analyzed using multiple regression analysis. Evaluators perceived accuracy and proficiency to be major factors in objective structured clinical examination evaluation, and history taking and physical examination to be major factors in clinical performance examination evaluation. Holistic rubric scores were highly related to the scores of the task-specific checklist and analytic rubric. Relatively low agreement was found in clinical performance examinations compared to objective structured clinical examinations. Meanwhile, the holistic and analytic rubric scores explained 59.1% of the task-specific checklist score in objective structured clinical examinations and 51.6% in clinical performance examinations. The results show the usefulness of holistic and analytic rubrics in clinical performance assessment, which can be used in conjunction with task-specific checklists for more efficient evaluation.

  11. Can machine learning complement traditional medical device surveillance? A case study of dual-chamber implantable cardioverter-defibrillators.

    PubMed

    Ross, Joseph S; Bates, Jonathan; Parzynski, Craig S; Akar, Joseph G; Curtis, Jeptha P; Desai, Nihar R; Freeman, James V; Gamble, Ginger M; Kuntz, Richard; Li, Shu-Xia; Marinac-Dabic, Danica; Masoudi, Frederick A; Normand, Sharon-Lise T; Ranasinghe, Isuru; Shaw, Richard E; Krumholz, Harlan M

    2017-01-01

    Machine learning methods may complement traditional analytic methods for medical device surveillance. Using data from the National Cardiovascular Data Registry for implantable cardioverter-defibrillators (ICDs) linked to Medicare administrative claims for longitudinal follow-up, we applied three statistical approaches to safety-signal detection for commonly used dual-chamber ICDs that used two propensity score (PS) models: one specified by subject-matter experts (PS-SME), and the other one by machine learning-based selection (PS-ML). The first approach used PS-SME and cumulative incidence (time-to-event), the second approach used PS-SME and cumulative risk (Data Extraction and Longitudinal Trend Analysis [DELTA]), and the third approach used PS-ML and cumulative risk (embedded feature selection). Safety-signal surveillance was conducted for eleven dual-chamber ICD models implanted at least 2,000 times over 3 years. Between 2006 and 2010, there were 71,948 Medicare fee-for-service beneficiaries who received dual-chamber ICDs. Cumulative device-specific unadjusted 3-year event rates varied for three surveyed safety signals: death from any cause, 12.8%-20.9%; nonfatal ICD-related adverse events, 19.3%-26.3%; and death from any cause or nonfatal ICD-related adverse event, 27.1%-37.6%. Agreement among safety signals detected/not detected between the time-to-event and DELTA approaches was 90.9% (360 of 396, κ = 0.068), between the time-to-event and embedded feature-selection approaches was 91.7% (363 of 396, κ = -0.028), and between the DELTA and embedded feature-selection approaches was 88.1% (349 of 396, κ = -0.042). Three statistical approaches, including one machine learning method, identified important safety signals, but without exact agreement. Ensemble methods may be needed to detect all safety signals for further evaluation during medical device surveillance.
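
    The agreement statistics above pair raw percent agreement with Cohen's kappa, which corrects for chance agreement. A minimal sketch over binary detected/not-detected calls (the example vectors are invented, chosen only to show how high raw agreement can coexist with low kappa when detections are rare):

        def cohens_kappa(calls_a, calls_b):
            # Binary safety-signal calls: 1 = detected, 0 = not detected.
            n = len(calls_a)
            observed = sum(a == b for a, b in zip(calls_a, calls_b)) / n
            p_a1 = sum(calls_a) / n
            p_b1 = sum(calls_b) / n
            expected = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)
            return (observed - expected) / (1 - expected)

        a = [1, 0, 0, 0, 1, 0, 0, 0, 0, 0]  # illustrative calls, method A
        b = [0, 0, 0, 0, 1, 0, 0, 1, 0, 0]  # illustrative calls, method B
        print(cohens_kappa(a, b))  # 80% raw agreement, kappa only 0.375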

  12. From Regional to National Clouds: TV Coverage in the Czech Republic

    PubMed Central

    Sucháček, Jan; Sed’a, Petr; Friedrich, Václav; Wachowiak-Smolíková, Renata; Wachowiak, Mark P.

    2016-01-01

    Media, and particularly TV media, have a great impact on the general public. In recent years, spatial patterns of information and the relevance of intangible geographies have become increasingly important. Gatekeeping plays a critical role in the selection of information that is transformed into media. Therefore, gatekeeping, through national media, also co-forms the generation of mental maps. In this paper, correspondence analysis (a statistical method) combined with cloud lines (a new visual analytics technique) is used to analyze how individual major regional events in one of the post-communist countries, the Czech Republic, penetrate into the media on a national scale. Although national news should minimize distortions about regions, this assumption has not been verified by our research. Impressions presented by the media of selected regions that were markedly influenced by one or several events in those regions demonstrate that gatekeepers, especially news reporters, functioned as a filter by selecting only a few specific, and in many cases, unusual events for dissemination. PMID:27824947

  13. Method for reduction of selected ion intensities in confined ion beams

    DOEpatents

    Eiden, Gregory C.; Barinaga, Charles J.; Koppenaal, David W.

    1998-01-01

    A method for producing an ion beam having an increased proportion of analyte ions compared to carrier gas ions is disclosed. Specifically, the method has the step of addition of a charge transfer gas to the carrier analyte combination that accepts charge from the carrier gas ions yet minimally accepts charge from the analyte ions thereby selectively neutralizing the carrier gas ions. Also disclosed is the method as employed in various analytical instruments including an inductively coupled plasma mass spectrometer.

  14. Method for reduction of selected ion intensities in confined ion beams

    DOEpatents

    Eiden, G.C.; Barinaga, C.J.; Koppenaal, D.W.

    1998-06-16

    A method for producing an ion beam having an increased proportion of analyte ions compared to carrier gas ions is disclosed. Specifically, the method has the step of addition of a charge transfer gas to the carrier analyte combination that accepts charge from the carrier gas ions yet minimally accepts charge from the analyte ions thereby selectively neutralizing the carrier gas ions. Also disclosed is the method as employed in various analytical instruments including an inductively coupled plasma mass spectrometer. 7 figs.

  15. Recent Advances in Macrocyclic Fluorescent Probes for Ion Sensing.

    PubMed

    Wong, Joseph K-H; Todd, Matthew H; Rutledge, Peter J

    2017-01-25

    Small-molecule fluorescent probes play a myriad of important roles in chemical sensing. Many such systems have been developed that incorporate a receptor component designed to recognise and bind a specific analyte and a reporter or transducer component that signals the binding event with a change in fluorescence output. Fluorescent probes use a variety of mechanisms to transmit the binding event to the reporter unit, including photoinduced electron transfer (PET), charge transfer (CT), Förster resonance energy transfer (FRET), excimer formation, and aggregation-induced emission (AIE) or aggregation-caused quenching (ACQ). These systems respond to a wide array of potential analytes including protons, metal cations, anions, carbohydrates, and other biomolecules. This review surveys important new fluorescence-based probes for these and other analytes that have been reported over the past five years, focusing on the most widely exploited macrocyclic recognition components, those based on cyclam, calixarenes, cyclodextrins and crown ethers; other macrocyclic and non-macrocyclic receptors are also discussed.

  16. Method of identifying analyte-binding peptides

    DOEpatents

    Kauvar, Lawrence M.

    1990-01-01

    A method for affinity chromatography or adsorption of a designated analyte utilizes a paralog as the affinity partner. The immobilized paralog can be used in purification or analysis of the analyte; the paralog can also be used as a substitute for antibody in an immunoassay. The paralog is identified by screening candidate peptide sequences of 4-20 amino acids for specific affinity to the analyte.

  17. Quantitative control of poly(ethylene oxide) surface antifouling and biodetection through azimuthally enhanced grating coupled-surface plasmon resonance sensing

    NASA Astrophysics Data System (ADS)

    Sonato, Agnese; Silvestri, Davide; Ruffato, Gianluca; Zacco, Gabriele; Romanato, Filippo; Morpurgo, Margherita

    2013-12-01

    Grating Coupled-Surface Plasmon reflectivity measurements carried out under azimuth and polarization control (GC-SPR φ ≠ 0°) were used to optimize the process of gold surface dressing with poly(ethylene oxide) (PEO) derivatives of different molecular weight, with the final goal of maximizing the discrimination between specific and non-specific binding events occurring at the surface. The kinetics of surface deposition of thiol-ending PEOs (0.3, 2 and 5 kDa), introduced as antifouling layers, was monitored. Non-specific binding events upon immersion of the surfaces into buffers containing either 0.1% bovine serum albumin or 1% goat serum were evaluated as a function of polymer size and density. A biorecognition event between avidin and biotin was then monitored in both buffers at selected low and high polymer surface densities, and the contribution of analyte and fouling elements to the signal was precisely quantified. The 0.3 kDa PEO film was unable to protect the surface from non-specific interactions at any tested density. On the other hand, the 2 and 5 kDa polymers at their highest surface densities guaranteed full protection from non-specific interactions from both buffers. These densities were reached upon a long deposition time (24-30 h). The results pave the way toward the application of this platform for the detection of low-concentration and small-dimension analytes, for which both non-fouling and high instrumental sensitivity are fundamental requirements.

  18. Validation of Rapid Radiochemical Method for Californium ...

    EPA Pesticide Factsheets

    Technical Brief: In the event of a radiological/nuclear contamination event, the response community would need tools and methodologies to rapidly assess the nature and the extent of contamination. To characterize a radiologically contaminated outdoor area and to inform risk assessment, large numbers of environmental samples would be collected and analyzed over a short period of time. To address the challenge of quickly providing analytical results to the field, the U.S. EPA developed a robust analytical method. This method allows response officials to characterize contaminated areas and to assess the effectiveness of remediation efforts, both rapidly and accurately, in the intermediate and late phases of environmental cleanup. Improvement in sample processing and analysis leads to increased laboratory capacity to handle the analysis of a large number of samples following the intentional or unintentional release of a radiological/nuclear contaminant.

  19. An analytical approach for estimating fossil record and diversification events in sharks, skates and rays.

    PubMed

    Guinot, Guillaume; Adnet, Sylvain; Cappetta, Henri

    2012-01-01

    Modern selachians and their supposed sister group (hybodont sharks) have a long and successful evolutionary history. Yet, although selachian remains are considered relatively common in the fossil record in comparison with other marine vertebrates, little is known about the quality of their fossil record. Similarly, only a few works based on specific time intervals have attempted to identify major events that marked the evolutionary history of this group. Phylogenetic hypotheses concerning modern selachians' interrelationships are numerous but differ significantly and no consensus has been found. The aim of the present study is to take advantage of the range of recent phylogenetic hypotheses in order to assess the fit of the selachian fossil record to phylogenies, according to two different branching methods. Compilation of these data allowed the inference of an estimated range of diversity through time and evolutionary events that marked this group over the past 300 Ma are identified. Results indicate that with the exception of high taxonomic ranks (orders), the selachian fossil record is by far imperfect, particularly for generic and post-Triassic data. Timing and amplitude of the various identified events that marked the selachian evolutionary history are discussed. Some identified diversity events were mentioned in previous works using alternative methods (Early Jurassic, mid-Cretaceous, K/T boundary and late Paleogene diversity drops), thus reinforcing the efficiency of the methodology presented here in inferring evolutionary events. Other events (Permian/Triassic, Early and Late Cretaceous diversifications; Triassic/Jurassic extinction) are newly identified. Relationships between these events and paleoenvironmental characteristics and other groups' evolutionary history are proposed.

  20. [Detection of rubella virus RNA in clinical material by real time polymerase chain reaction method].

    PubMed

    Domonova, É A; Shipulina, O Iu; Kuevda, D A; Larichev, V F; Safonova, A P; Burchik, M A; Butenko, A M; Shipulin, G A

    2012-01-01

    The aim of this work was the development of a reagent kit for detection of rubella virus RNA in clinical material by real-time PCR (PCR-RT). During development and determination of analytical specificity and sensitivity, DNA and RNA of 33 different microorganisms, including 4 rubella strains, were used. Comparison of the analytical sensitivity of virological and molecular-biological methods was performed using rubella virus strains Wistar RA 27/3, M-33, "Orlov" and Judith. Evaluation of the diagnostic informativity of rubella virus RNA isolation from various clinical materials by the PCR-RT method was performed in comparison with determination of virus-specific serum antibodies by enzyme immunoassay. A reagent kit for the detection of rubella virus RNA in clinical material by PCR-RT was developed. Analytical specificity was 100%; analytical sensitivity, 400 virus RNA copies per ml. The analytical sensitivity of the developed technique exceeds that of the Vero E6 cell culture infection method by 1 lg and 3 lg for rubella virus strains Wistar RA 27/3 and "Orlov", respectively, and is analogous for the M-33 and Judith strains. Diagnostic specificity was 100%. Diagnostic sensitivity for samples obtained within 5 days of rash onset was 20.9% for peripheral blood sera, 92.5% for saliva, 70.1% for nasopharyngeal swabs, and 97% for saliva and nasopharyngeal swabs combined. Positive and negative predictive values of the results were shown depending on the type of clinical material tested. Application of the reagent kit will increase the effectiveness of rubella diagnostics at early stages of the infectious process and allow timely, high-quality differential diagnostics of exanthematous diseases in support of anti-epidemic measures.

  1. Safety and Waste Management for SAM Chemistry Methods

    EPA Pesticide Factsheets

    The General Safety and Waste Management page offers section-specific safety and waste management details for the chemical analytes included in EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM).

  2. Safety and Waste Management for SAM Radiochemical Methods

    EPA Pesticide Factsheets

    The General Safety and Waste Management page offers section-specific safety and waste management details for the radiochemical analytes included in EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM).

  3. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    PubMed

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for quality by design (QbD) based analytical approaches. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results due to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists about the implementation of AQbD in the pharmaceutical quality system and also correlates it with product quality by design and process analytical technology (PAT).

  4. Analytical Quality by Design: A Tool for Regulatory Flexibility and Robust Analytics

    PubMed Central

    Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for quality by design (QbD) based analytical approaches. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results due to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists about the implementation of AQbD in the pharmaceutical quality system and also correlates it with product quality by design and process analytical technology (PAT). PMID:25722723

  5. Selection of Suitable DNA Extraction Methods for Genetically Modified Maize 3272, and Development and Evaluation of an Event-Specific Quantitative PCR Method for 3272.

    PubMed

    Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Teshima, Reiko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi

    2016-01-01

    A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) maize, 3272. We first attempted to obtain genomic DNA from this maize using a DNeasy Plant Maxi kit and a DNeasy Plant Mini kit, which have been widely utilized in our previous studies, but DNA extraction yields from 3272 were markedly lower than those from non-GM maize seeds. However, this lowering of DNA extraction yields was not observed with GM quicker or Genomic-tip 20/G. We chose GM quicker for evaluation of the quantitative method. We prepared a standard plasmid for 3272 quantification. The conversion factor (Cf), which is required to calculate the amount of a genetically modified organism (GMO), was experimentally determined for two real-time PCR instruments, the Applied Biosystems 7900HT (the ABI 7900) and the Applied Biosystems 7500 (the ABI 7500). The determined Cf values were 0.60 and 0.59 for the ABI 7900 and the ABI 7500, respectively. To evaluate the developed method, a blind test was conducted as part of an interlaboratory study. The trueness and precision were evaluated as the bias and the reproducibility of the relative standard deviation (RSDr), respectively. The determined values were similar to those in our previous validation studies. The limit of quantitation for the method was estimated to be 0.5% or less, and we concluded that the developed method would be suitable and practical for the detection and quantification of 3272.
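
    The conversion-factor logic used here (and in the related records above) can be made concrete: Cf is the event-to-taxon copy ratio expected in a 100% GM sample, so a test sample's GM amount is its own measured ratio divided by Cf. Below is a worked sketch using the Cf of 0.60 reported for the ABI 7900; the sample copy numbers are invented for illustration.

        def gmo_percent(event_copies, taxon_copies, cf):
            # Cf converts the event/taxon copy ratio into GM percentage:
            # a pure (100%) GM sample would show a ratio equal to Cf.
            return 100.0 * (event_copies / taxon_copies) / cf

        # Illustrative real-time PCR copy-number estimates for one sample:
        print(gmo_percent(event_copies=180.0, taxon_copies=30000.0, cf=0.60))
        # -> 1.0 (% GMO)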

  6. Method of identifying analyte-binding peptides

    DOEpatents

    Kauvar, L.M.

    1990-10-16

    A method for affinity chromatography or adsorption of a designated analyte utilizes a paralog as the affinity partner. The immobilized paralog can be used in purification or analysis of the analyte; the paralog can also be used as a substitute for antibody in an immunoassay. The paralog is identified by screening candidate peptide sequences of 4--20 amino acids for specific affinity to the analyte. 5 figs.

  7. Solvent effects in time-dependent self-consistent field methods. II. Variational formulations and analytical gradients

    DOE PAGES

    Bjorgaard, J. A.; Velizhanin, K. A.; Tretiak, S.

    2015-08-06

    This study describes variational energy expressions and analytical excited state energy gradients for time-dependent self-consistent field methods with polarizable solvent effects. Linear response, vertical excitation, and state-specific solvent models are examined. Enforcing a variational ground state energy expression in the state-specific model is found to reduce it to the vertical excitation model. Variational excited state energy expressions are then provided for the linear response and vertical excitation models, and analytical gradients are formulated. Using semiempirical model chemistry, the variational expressions are verified by numerical and analytical differentiation with respect to a static external electric field. Lastly, analytical gradients are further tested by performing microcanonical excited state molecular dynamics with p-nitroaniline.

  8. Screening DNA chip and event-specific multiplex PCR detection methods for biotech crops.

    PubMed

    Lee, Seong-Hun

    2014-11-01

    There are about 80 biotech crop events that have been approved through safety assessment in Korea. They have been controlled by genetically modified organism (GMO) and living modified organism (LMO) labeling systems. DNA-based detection methods have been used as efficient scientific management tools. Recently, multiplex polymerase chain reaction (PCR) and DNA chip methods have been developed for the simultaneous detection of several biotech crop events. The event-specific multiplex PCR method was developed to detect five biotech maize events: MIR604, Event 3272, LY 038, MON 88017 and DAS-59122-7. The specificity was confirmed and the sensitivity was 0.5%. The screening DNA chip was developed from four endogenous genes of soybean, maize, cotton and canola, respectively, along with two regulatory elements and seven genes: P35S, tNOS, pat, bar, epsps1, epsps2, pmi, cry1Ac and cry3B. The specificity was confirmed and the sensitivity was 0.5% for 12 events across the four crops: one soybean, six maize, three cotton and two canola events. The multiplex PCR and DNA chip can be used for screening, gene-specific and event-specific analysis of biotech crops as efficient detection methods that save workload and time. © 2014 Society of Chemical Industry.

  9. Application of capability indices and control charts in the analytical method control strategy.

    PubMed

    Oliva, Alexis; Llabres Martinez, Matías

    2017-08-01

    In this study, we assessed the usefulness of control charts in combination with the process capability indices, Cpm and Cpk, in the control strategy of an analytical method. The traditional X-chart and moving range chart were used to monitor the analytical method over a 2-year period. The results confirmed that the analytical method is in-control and stable. Different criteria were used to establish the specification limits (i.e., analyst requirements) for fixed method performance (i.e., method requirements). If the specification limits and control limits are equal in breadth, the method can be considered "capable" (Cpm = 1), but it does not satisfy the minimum method capability requirements proposed by Pearn and Shu (2003). Similar results were obtained using the Cpk index. The method capability was also assessed as a function of method performance for fixed analyst requirements. The results indicate that the method does not meet the requirements of the analytical target approach. Real data from a size-exclusion chromatography (SEC) method with light-scattering detection were used as a model, whereas previously published data were used to illustrate the applicability of the proposed approach. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
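
    Both indices have standard closed forms, and a short sketch makes the abstract's "Cpm = 1" scenario concrete. This is a generic illustration with made-up numbers, not the study's data:

        import math

        def cpk(mean, sd, lsl, usl):
            # Distance from the mean to the nearer specification limit,
            # in units of 3 standard deviations.
            return min(usl - mean, mean - lsl) / (3.0 * sd)

        def cpm(mean, sd, lsl, usl, target):
            # Like Cp, but the denominator also penalizes deviation of the
            # mean from the target value T.
            return (usl - lsl) / (6.0 * math.sqrt(sd**2 + (mean - target)**2))

        # A centered method whose +/-3 sd control limits coincide exactly
        # with the specification limits gives Cpm = Cpk = 1: "capable", yet
        # below the minimum capability requirements the abstract discusses.
        print(cpk(mean=100.0, sd=1.0, lsl=97.0, usl=103.0))                # 1.0
        print(cpm(mean=100.0, sd=1.0, lsl=97.0, usl=103.0, target=100.0))  # 1.0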

  10. Dissociable meta-analytic brain networks contribute to coordinated emotional processing.

    PubMed

    Riedel, Michael C; Yanes, Julio A; Ray, Kimberly L; Eickhoff, Simon B; Fox, Peter T; Sutherland, Matthew T; Laird, Angela R

    2018-06-01

    Meta-analytic techniques for mining the neuroimaging literature continue to exert an impact on our conceptualization of functional brain networks contributing to human emotion and cognition. Traditional theories regarding the neurobiological substrates contributing to affective processing are shifting from regional- towards more network-based heuristic frameworks. To elucidate differential brain network involvement linked to distinct aspects of emotion processing, we applied an emergent meta-analytic clustering approach to the extensive body of affective neuroimaging results archived in the BrainMap database. Specifically, we performed hierarchical clustering on the modeled activation maps from 1,747 experiments in the affective processing domain, resulting in five meta-analytic groupings of experiments demonstrating whole-brain recruitment. Behavioral inference analyses conducted for each of these groupings suggested dissociable networks supporting: (1) visual perception within primary and associative visual cortices, (2) auditory perception within primary auditory cortices, (3) attention to emotionally salient information within insular, anterior cingulate, and subcortical regions, (4) appraisal and prediction of emotional events within medial prefrontal and posterior cingulate cortices, and (5) induction of emotional responses within amygdala and fusiform gyri. These meta-analytic outcomes are consistent with a contemporary psychological model of affective processing in which emotionally salient information from perceived stimuli is integrated with previous experiences to engender a subjective affective response. This study highlights the utility of using emergent meta-analytic methods to inform and extend psychological theories and suggests that emotions are manifest as the eventual consequence of interactions between large-scale brain networks. © 2018 Wiley Periodicals, Inc.

  11. ELEGANT ENVIRONMENTAL IMMUNOASSAYS

    EPA Science Inventory

    Immunochemical methods are based on selective antibodies directed to a particular target analyte. The specific binding between antibody and analyte can be used for detection and quantitation. Methods such as the enzyme-linked immunosorbent assay (ELISA) can provide a sensitive...

  12. Multiple Time-of-Flight/Time-of-Flight Events in a Single Laser Shot for Improved Matrix-Assisted Laser Desorption/Ionization Tandem Mass Spectrometry Quantification.

    PubMed

    Prentice, Boone M; Chumbley, Chad W; Hachey, Brian C; Norris, Jeremy L; Caprioli, Richard M

    2016-10-04

    Quantitative matrix-assisted laser desorption/ionization time-of-flight (MALDI TOF) approaches have historically suffered from poor accuracy and precision mainly due to the nonuniform distribution of matrix and analyte across the target surface, matrix interferences, and ionization suppression. Tandem mass spectrometry (MS/MS) can be used to ensure chemical specificity as well as improve signal-to-noise ratios by eliminating interferences from chemical noise, alleviating some concerns about dynamic range. However, conventional MALDI TOF/TOF modalities typically only scan for a single MS/MS event per laser shot, and multiplex assays require sequential analyses. We describe here new methodology that allows for multiple TOF/TOF fragmentation events to be performed in a single laser shot. This technology allows the reference of analyte intensity to that of the internal standard in each laser shot, even when the analyte and internal standard are quite disparate in m/z, thereby improving quantification while maintaining chemical specificity and duty cycle. In the quantitative analysis of the drug enalapril in pooled human plasma with ramipril as an internal standard, a greater than 4-fold improvement in relative standard deviation (<10%) was observed as well as improved coefficients of determination (R²) and accuracy (>85% quality controls). Using this approach we have also performed simultaneous quantitative analysis of three drugs (promethazine, enalapril, and verapamil) using deuterated analogues of these drugs as internal standards.
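
    The quantification step itself is simple once analyte and internal standard are read out of the same laser shot: each shot contributes one intensity ratio, and the spread of those ratios is what the reported relative standard deviation summarizes. A small sketch with hypothetical intensities, not the paper's data:

        import numpy as np

        # Per-shot intensities for the analyte and its internal standard,
        # measured in the same laser shot via multiple TOF/TOF events.
        analyte = np.array([1200.0, 950.0, 1420.0, 1100.0])
        istd = np.array([10000.0, 8100.0, 11800.0, 9300.0])

        # Referencing each shot to its internal standard cancels shot-to-shot
        # variation in matrix/analyte distribution and ionization efficiency.
        ratios = analyte / istd
        rsd = 100.0 * ratios.std(ddof=1) / ratios.mean()
        print(f"mean ratio {ratios.mean():.4f}, RSD {rsd:.1f}%")
        # Concentration then follows from a calibration curve of ratio
        # versus spiked concentration (not shown).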

  13. Predicting the velocity and azimuth of fragments generated by the range destruction or random failure of rocket casings and tankage

    NASA Technical Reports Server (NTRS)

    Eck, Marshall; Mukunda, Meera

    1988-01-01

    A calculational method is described which provides a powerful tool for predicting solid rocket motor (SRM) casing and liquid rocket tankage fragmentation response. The approach properly partitions the available impulse to each major system-mass component. It uses the Pisces code developed by Physics International to couple the forces generated by an Eulerian-modeled gas flow field to a Lagrangian-modeled fuel and casing system. The details of the predictive analytical modeling process and the development of normalized relations for momentum partition as a function of SRM burn time and initial geometry are discussed. Methods for applying similar modeling techniques to liquid-tankage-overpressure failures are also discussed. Good agreement between predictions and observations is obtained for five specific events.

  14. Lab-on-chip systems for integrated bioanalyses

    PubMed Central

    Madaboosi, Narayanan; Soares, Ruben R.G.; Fernandes, João Tiago S.; Novo, Pedro; Moulas, Geraud; Chu, Virginia

    2016-01-01

    Biomolecular detection systems based on microfluidics are often called lab-on-chip systems. To fully benefit from the miniaturization resulting from microfluidics, one aims to develop ‘from sample-to-answer’ analytical systems, in which the input is a raw or minimally processed biological, food/feed or environmental sample and the output is a quantitative or qualitative assessment of one or more analytes of interest. In general, such systems will require the integration of several steps or operations to perform their function. This review will discuss these stages of operation, including fluidic handling, which assures that the desired fluid arrives at a specific location at the right time and under the appropriate flow conditions; molecular recognition, which allows the capture of specific analytes at precise locations on the chip; transduction of the molecular recognition event into a measurable signal; sample preparation upstream from analyte capture; and signal amplification procedures to increase sensitivity. Seamless integration of the different stages is required to achieve a point-of-care/point-of-use lab-on-chip device that allows analyte detection at the relevant sensitivity ranges, with a competitive analysis time and cost. PMID:27365042

  15. Anticipating Surprise: Analysis for Strategic Warning

    DTIC Science & Technology

    2002-12-01

    Table-of-contents excerpts: Intentions versus Capabilities; Introduction to the Analytical Method; Specifics of the Analytical Method. Text excerpt: "Why is it that 'no one' (a slight but not great exaggeration) believes in the indications method, despite its demonstrably good record in these..."

  16. A modeling approach to compare ΣPCB concentrations between congener-specific analyses

    USGS Publications Warehouse

    Gibson, Polly P.; Mills, Marc A.; Kraus, Johanna M.; Walters, David M.

    2017-01-01

    Changes in analytical methods over time pose problems for assessing long-term trends in environmental contamination by polychlorinated biphenyls (PCBs). Congener-specific analyses vary widely in the number and identity of the 209 distinct PCB chemical configurations (congeners) that are quantified, leading to inconsistencies among summed PCB concentrations (ΣPCB) reported by different studies. Here we present a modeling approach using linear regression to compare ΣPCB concentrations derived from different congener-specific analyses measuring different co-eluting groups. The approach can be used to develop a specific conversion model between any two sets of congener-specific analytical data from similar samples (similar matrix and geographic origin). We demonstrate the method by developing a conversion model for an example data set that includes data from two different analytical methods, a low resolution method quantifying 119 congeners and a high resolution method quantifying all 209 congeners. We used the model to show that the 119-congener set captured most (93%) of the total PCB concentration (i.e., Σ209PCB) in sediment and biological samples. ΣPCB concentrations estimated using the model closely matched measured values (mean relative percent difference = 9.6). General applications of the modeling approach include (a) generating comparable ΣPCB concentrations for samples that were analyzed for different congener sets; and (b) estimating the proportional contribution of different congener sets to ΣPCB. This approach may be especially valuable for enabling comparison of long-term remediation monitoring results even as analytical methods change over time. 
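
    The conversion model described is ordinary least-squares regression between paired ΣPCB measurements of the same samples. A minimal sketch with invented numbers; the real model would be fit to matched sediment or tissue samples analyzed by both methods:

        import numpy as np

        # Hypothetical paired results (ng/g): x = 119-congener method,
        # y = 209-congener method, same samples analyzed both ways.
        x = np.array([12.0, 35.0, 50.0, 80.0, 120.0])
        y = np.array([13.1, 37.9, 54.0, 86.2, 128.9])

        # Fit y = a + b*x: the study-specific conversion model.
        b, a = np.polyfit(x, y, 1)
        print(f"Sigma209PCB ~ {a:.2f} + {b:.3f} * Sigma119PCB")

        # Proportional contribution of the smaller congener set to the total,
        # estimated as the mean per-sample ratio.
        print(f"fraction captured: {np.mean(x / y):.2f}")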

  17. Reverse transcription-polymerase chain reaction molecular testing of cytology specimens: Pre-analytic and analytic factors.

    PubMed

    Bridge, Julia A

    2017-01-01

    The introduction of molecular testing into cytopathology laboratory practice has expanded the types of samples considered feasible for identifying genetic alterations that play an essential role in cancer diagnosis and treatment. Reverse transcription-polymerase chain reaction (RT-PCR), a sensitive and specific technical approach for amplifying a defined segment of RNA after it has been reverse-transcribed into its DNA complement, is commonly used in clinical practice for the identification of recurrent or tumor-specific fusion gene events. Real-time RT-PCR (quantitative RT-PCR), a technical variation, also permits the quantitation of products generated during each cycle of the polymerase chain reaction process. This review addresses qualitative and quantitative pre-analytic and analytic considerations of RT-PCR as they relate to various cytologic specimens. An understanding of these aspects of genetic testing is central to attaining optimal results in the face of the challenges that cytology specimens may present. Cancer Cytopathol 2017;125:11-19. © 2016 American Cancer Society.

  18. Assessment of an Automated Touchdown Detection Algorithm for the Orion Crew Module

    NASA Technical Reports Server (NTRS)

    Gay, Robert S.

    2011-01-01

    Orion Crew Module (CM) touchdown detection is critical to activating the post-landing sequence that safes the Reaction Control Jets (RCS), ensures that the vehicle remains upright, and establishes communication with recovery forces. In order to accommodate safe landing of an unmanned vehicle or incapacitated crew, an onboard automated detection system is required. An Orion-specific touchdown detection algorithm was developed and evaluated to differentiate landing events from in-flight events. The proposed method will be used to initiate post-landing cutting of the parachute riser lines, to prevent CM rollover, and to terminate RCS jet firing prior to submersion. The RCS jets continue to fire until touchdown to maintain proper CM orientation with respect to the flight path and to limit impact loads, but have potentially hazardous consequences if submerged while firing. The time available after impact to cut risers and initiate the CM Up-righting System (CMUS) is measured in minutes, whereas the time from touchdown to RCS jet submersion is a function of descent velocity and sea state conditions and is often less than one second. Evaluation of the detection algorithms was performed for in-flight events (e.g., descent under chutes) using hi-fidelity rigid body analyses in the Decelerator Systems Simulation (DSS), whereas water impacts were simulated using a rigid finite element model of the Orion CM in LS-DYNA. Two touchdown detection algorithms were evaluated with various thresholds: acceleration-magnitude spike detection and accumulated velocity change (over a given time window) spike detection. Data for both detection methods are acquired from an onboard Inertial Measurement Unit (IMU) sensor. The detection algorithms were tested with analytically generated in-flight and landing IMU data simulations. The acceleration spike detection proved to be faster while maintaining the desired safety margin. Time to RCS jet submersion was predicted analytically across a series of simulated Orion landing conditions. This paper details the touchdown detection method chosen and the analysis used to support the decision.
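
    The two candidate detectors are easy to state in code. A toy sketch of both, assuming a 100 Hz stream of IMU acceleration magnitude in g; the thresholds, window length, and simulated trace are illustrative, not the study's values:

        import numpy as np

        def accel_spike_detect(accel_mag, threshold_g):
            """First sample whose acceleration magnitude exceeds a threshold."""
            hits = np.nonzero(accel_mag > threshold_g)[0]
            return hits[0] if hits.size else None

        def delta_v_detect(accel_mag, dt, window, threshold_dv):
            """First sample at which the accumulated velocity change over a
            sliding window (integral of |a| dt) exceeds a threshold."""
            dv = np.convolve(accel_mag * dt, np.ones(window), mode="valid")
            hits = np.nonzero(dv > threshold_dv)[0]
            return hits[0] + window - 1 if hits.size else None

        dt = 0.01                                  # 100 Hz IMU
        a = 0.1 * np.abs(np.random.randn(200))     # in-flight noise, in g
        a[150:155] = 8.0                           # simulated impact spike
        print(accel_spike_detect(a, threshold_g=5.0))             # 150
        print(delta_v_detect(a, dt, window=5, threshold_dv=0.3))  # ~153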

  19. Can machine learning complement traditional medical device surveillance? A case study of dual-chamber implantable cardioverter–defibrillators

    PubMed Central

    Ross, Joseph S; Bates, Jonathan; Parzynski, Craig S; Akar, Joseph G; Curtis, Jeptha P; Desai, Nihar R; Freeman, James V; Gamble, Ginger M; Kuntz, Richard; Li, Shu-Xia; Marinac-Dabic, Danica; Masoudi, Frederick A; Normand, Sharon-Lise T; Ranasinghe, Isuru; Shaw, Richard E; Krumholz, Harlan M

    2017-01-01

    Background Machine learning methods may complement traditional analytic methods for medical device surveillance. Methods and results Using data from the National Cardiovascular Data Registry for implantable cardioverter–defibrillators (ICDs) linked to Medicare administrative claims for longitudinal follow-up, we applied three statistical approaches to safety-signal detection for commonly used dual-chamber ICDs that used two propensity score (PS) models: one specified by subject-matter experts (PS-SME), and the other one by machine learning-based selection (PS-ML). The first approach used PS-SME and cumulative incidence (time-to-event), the second approach used PS-SME and cumulative risk (Data Extraction and Longitudinal Trend Analysis [DELTA]), and the third approach used PS-ML and cumulative risk (embedded feature selection). Safety-signal surveillance was conducted for eleven dual-chamber ICD models implanted at least 2,000 times over 3 years. Between 2006 and 2010, there were 71,948 Medicare fee-for-service beneficiaries who received dual-chamber ICDs. Cumulative device-specific unadjusted 3-year event rates varied for three surveyed safety signals: death from any cause, 12.8%–20.9%; nonfatal ICD-related adverse events, 19.3%–26.3%; and death from any cause or nonfatal ICD-related adverse event, 27.1%–37.6%. Agreement among safety signals detected/not detected between the time-to-event and DELTA approaches was 90.9% (360 of 396, k=0.068), between the time-to-event and embedded feature-selection approaches was 91.7% (363 of 396, k=−0.028), and between the DELTA and embedded feature selection approaches was 88.1% (349 of 396, k=−0.042). Conclusion Three statistical approaches, including one machine learning method, identified important safety signals, but without exact agreement. Ensemble methods may be needed to detect all safety signals for further evaluation during medical device surveillance. PMID:28860874

  20. [Discussion on Quality Evaluation Method of Medical Device During Life-Cycle in Operation Based on the Analytic Hierarchy Process].

    PubMed

    Zheng, Caixian; Zheng, Kun; Shen, Yunming; Wu, Yunyun

    2016-01-01

    The content related to quality during the life cycle in operation of a medical device includes daily use, repair volume, preventive maintenance, quality control and adverse event monitoring. In view of this, the article discusses a quality evaluation method for medical devices during their life cycle in operation based on the Analytic Hierarchy Process (AHP). The presented method is shown to be effective by evaluating patient monitors as an example. It can promote and guide device quality control work and provide valuable input for decisions about the purchase of new devices.
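
    The AHP core, reducing pairwise comparisons to priority weights via the principal eigenvector and checking consistency, fits in a few lines. A generic sketch; the comparison matrix below is invented, not the paper's:

        import numpy as np

        # Pairwise-comparison matrix for three evaluation criteria on
        # Saaty's 1-9 scale; entries are reciprocal (A[j, i] = 1 / A[i, j]).
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 3.0],
                      [1/5, 1/3, 1.0]])

        # Priority weights = principal eigenvector, normalized to sum to 1.
        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()

        # Consistency index CI = (lambda_max - n) / (n - 1); judgments are
        # usually accepted when the ratio CI/RI stays below about 0.1.
        n = A.shape[0]
        ci = (eigvals[k].real - n) / (n - 1)
        print("weights:", np.round(w, 3), "CI:", round(ci, 3))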

  1. Analytic Shielding Optimization to Reduce Crew Exposure to Ionizing Radiation Inside Space Vehicles

    NASA Technical Reports Server (NTRS)

    Gaza, Razvan; Cooper, Tim P.; Hanzo, Arthur; Hussein, Hesham; Jarvis, Kandy S.; Kimble, Ryan; Lee, Kerry T.; Patel, Chirag; Reddell, Brandon D.; Stoffle, Nicholas

    2009-01-01

    A sustainable lunar architecture provides capabilities for leveraging out-of-service components for alternate uses. Discarded architecture elements may be used to provide ionizing radiation shielding to the crew habitat in case of a Solar Particle Event. The specific location relative to the vehicle where the additional shielding mass is placed, as corroborated with particularities of the vehicle design, has a large influence on protection gain. This effect is caused by the exponential-like decrease of radiation exposure with shielding mass thickness, which in turn determines that the most benefit from a given amount of shielding mass is obtained by placing it so that it preferentially augments protection in under-shielded areas of the vehicle exposed to the radiation environment. A novel analytic technique to derive an optimal shielding configuration was developed by Lockheed Martin during Design Analysis Cycle 3 (DAC-3) of the Orion Crew Exploration Vehicle (CEV). [1] Based on a detailed Computer Aided Design (CAD) model of the vehicle including a specific crew positioning scenario, a set of under-shielded vehicle regions can be identified as candidates for placement of additional shielding. Analytic tools are available to allow capturing an idealized supplemental shielding distribution in the CAD environment, which in turn is used as a reference for deriving a realistic shielding configuration from available vehicle components. While the analysis referenced in this communication applies particularly to the Orion vehicle, the general method can be applied to a large range of space exploration vehicles, including but not limited to lunar and Mars architecture components. In addition, the method can be immediately applied for optimization of radiation shielding provided to sensitive electronic components.

  2. On the performance of voltage stepping for the simulation of adaptive, nonlinear integrate-and-fire neuronal networks.

    PubMed

    Kaabi, Mohamed Ghaith; Tonnelier, Arnaud; Martinez, Dominique

    2011-05-01

    In traditional event-driven strategies, spike timings are analytically given or calculated with arbitrary precision (up to machine precision). Exact computation is possible only for simplified neuron models, mainly the leaky integrate-and-fire model. In a recent paper, Zheng, Tonnelier, and Martinez (2009) introduced an approximate event-driven strategy, named voltage stepping, that allows the generic simulation of nonlinear spiking neurons. Promising results were achieved in the simulation of single quadratic integrate-and-fire neurons. Here, we assess the performance of voltage stepping in network simulations by considering more complex neurons (quadratic integrate-and-fire neurons with adaptation) coupled with multiple synapses. To handle the discrete nature of synaptic interactions, we recast voltage stepping in a general framework, the discrete event system specification. The efficiency of the method is assessed through simulations and comparisons with a modified time-stepping scheme of the Runge-Kutta type. We demonstrated numerically that the original order of voltage stepping is preserved when simulating connected spiking neurons, independent of the network activity and connectivity.
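
    The idea behind voltage stepping is to discretize the voltage axis rather than time, advancing each neuron from one voltage level to the next. A toy illustration on the quadratic integrate-and-fire model, where the level-to-level travel time happens to have a closed form (for models without one, each sub-interval would be approximated locally; the adaptation variables and synaptic coupling handled by the paper are omitted here):

        import math

        def qif_crossing_time(v1, v2, I):
            """Exact travel time of dv/dt = v**2 + I (I > 0) from v1 to v2."""
            s = math.sqrt(I)
            return (math.atan(v2 / s) - math.atan(v1 / s)) / s

        def spike_time_by_voltage_stepping(v0, I, v_peak, n_levels):
            """Sum per-level crossing times over a grid of voltage levels."""
            levels = [v0 + (v_peak - v0) * i / n_levels
                      for i in range(n_levels + 1)]
            return sum(qif_crossing_time(a, b, I)
                       for a, b in zip(levels, levels[1:]))

        # Time from rest (v = 0) to the spike cutoff v_peak with I = 1.
        print(spike_time_by_voltage_stepping(v0=0.0, I=1.0,
                                             v_peak=30.0, n_levels=100))
        # ~ atan(30) = 1.537... time units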

  3. Understanding Personality Development: An Integrative State Process Model

    ERIC Educational Resources Information Center

    Geukes, Katharina; van Zalk, Maarten; Back, Mitja D.

    2018-01-01

    While personality is relatively stable over time, it is also subject to change across the entire lifespan. On a macro-analytical level, empirical research has identified patterns of normative and differential development that are affected by biological and environmental factors, specific life events, and social role investments. On a…

  4. Combining Event- and Variable-Centred Approaches to Institution-Facing Learning Analytics at the Unit of Study Level

    ERIC Educational Resources Information Center

    Kelly, Nick; Montenegro, Maximiliano; Gonzalez, Carlos; Clasing, Paula; Sandoval, Augusto; Jara, Magdalena; Saurina, Elvira; Alarcón, Rosa

    2017-01-01

    Purpose: The purpose of this paper is to demonstrate the utility of combining event-centred and variable-centred approaches when analysing big data for higher education institutions. It uses a large, university-wide data set to demonstrate the methodology for this analysis by using the case study method. It presents empirical findings about…

  5. EPA Science Matters Newsletter: Stand-by Science: EPA Helps the Nation Be Better Prepared for Emergency Response (Published November 2013)

    EPA Pesticide Factsheets

    Learn about the EPA guide (Selected Analytical Methods for Environmental Remediation and Recovery) that helps labs around the country quickly select the appropriate environmental testing and analysis methods to use after a wide-scale chemical event.

  6. Second International Conference on Accelerating Biopharmaceutical Development

    PubMed Central

    2009-01-01

    The Second International Conference on Accelerating Biopharmaceutical Development was held in Coronado, California. The meeting was organized by the Society for Biological Engineering (SBE) and the American Institute of Chemical Engineers (AIChE); SBE is a technological community of the AIChE. Bob Adamson (Wyeth) and Chuck Goochee (Centocor) were co-chairs of the event, which had the theme “Delivering cost-effective, robust processes and methods quickly and efficiently.” The first day focused on emerging disruptive technologies and cutting-edge analytical techniques. Day two featured presentations on accelerated cell culture process development, critical quality attributes, specifications and comparability, and high throughput protein formulation development. The final day was dedicated to discussion of technology options and new analysis methods provided by emerging disruptive technologies; functional interaction, integration and synergy in platform development; and rapid and economic purification process development. PMID:20065637

  7. [Sustainability analysis of an evaluation policy: the case of primary health care in Brazil].

    PubMed

    Felisberto, Eronildo; Freese, Eduardo; Bezerra, Luciana Caroline Albuquerque; Alves, Cinthia Kalyne de Almeida; Samico, Isabella

    2010-06-01

    This study analyzes the sustainability of Brazil's National Policy for the Evaluation of Primary Health Care, based on the identification and categorization of representative critical events in the institutionalization process. This was an evaluative study of two analytical units: Federal Management of Primary Health Care and State Health Secretariats, using multiple case studies with data collected through interviews and institutional documents, using the critical incidents technique. Events that were temporally classified as specific to implementation, sustainability, and mixed were categorized analytically as pertaining to memory, adaptation, values, and rules. Federal management and one of the State Health Secretariats showed medium-level sustainability, while the other State Secretariat showed strong sustainability. The results indicate that the events were concurrent and suggest a weighting process, since the adaptation of activities, adequacy, and stabilization of resources displayed a strong influence on the others. Innovations and the development of technical capability are considered the most important results for sustainability.

  8. The identification of solar wind waves at discrete frequencies and the role of the spectral analysis techniques

    NASA Astrophysics Data System (ADS)

    Di Matteo, S.; Villante, U.

    2017-05-01

    The occurrence of waves at discrete frequencies in the solar wind (SW) parameters has been reported in the scientific literature with some controversial results, mostly concerning the existence (and stability) of favored sets of frequencies. On the other hand, the experimental results might be influenced by the analytical methods adopted for the spectral analysis. We focused attention on the fluctuations of the SW dynamic pressure (PSW) occurring in the leading edges of streams following interplanetary shocks and compared the results of the Welch method (WM) with those of the multitaper method (MTM). The results of a simulation analysis demonstrate that the identification of the wave occurrence and the frequency estimate might be strongly influenced by the signal characteristics and analytical methods, especially in the presence of multicomponent signals. In SW streams, PSW oscillations are routinely detected in the entire range f ≈ 1.2-5.0 mHz; nevertheless, the WM/MTM agreement in the identification and frequency estimate occurs in ≈50% of events and different sets of favored frequencies would be proposed for the same set of events by the WM and MTM analysis. The histogram of the frequency distribution of the events identified by both methods suggests more relevant percentages between f ≈ 1.7-1.9, f ≈ 2.7-3.4, and f ≈ 3.9-4.4 (with a most relevant peak at f ≈ 4.2 mHz). Extremely severe thresholds select a small number (14) of remarkable events, with a one-to-one correspondence between WM and MTM: interestingly, these events reveal a tendency for a favored occurrence in bins centered at f ≈ 2.9 and at f ≈ 4.2 mHz.
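
    For readers who want to reproduce a WM/MTM comparison on their own series, both estimators can be assembled from standard tooling: Welch's method directly, and a minimal multitaper estimate from DPSS (Slepian) tapers. A sketch on a synthetic pressure-like series with tones near 2.9 and 4.2 mHz; all parameters are illustrative, not the study's:

        import numpy as np
        from scipy.signal import welch
        from scipy.signal.windows import dpss

        fs = 1.0 / 60.0                          # one sample per minute
        n = 4096
        t = np.arange(n) / fs                    # time in seconds
        x = (np.sin(2 * np.pi * 2.9e-3 * t) + np.sin(2 * np.pi * 4.2e-3 * t)
             + 0.02 * np.cumsum(np.random.randn(n)))   # red-ish noise floor

        # Welch method: average modified periodograms over segments.
        f_w, p_w = welch(x, fs=fs, nperseg=1024)

        # Minimal multitaper estimate: average the periodograms obtained
        # with K orthogonal DPSS tapers (time-bandwidth NW = 4, K = 7).
        tapers = dpss(n, NW=4, Kmax=7)
        p_m = np.mean([np.abs(np.fft.rfft(tap * x))**2 for tap in tapers],
                      axis=0)
        f_m = np.fft.rfftfreq(n, d=1.0 / fs)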

  9. Reliability computation using fault tree analysis

    NASA Technical Reports Server (NTRS)

    Chelson, P. O.

    1971-01-01

    A method is presented for calculating event probabilities from an arbitrary fault tree. The method includes an analytical derivation of the system equation and is not a simulation program. The method can handle systems that incorporate standby redundancy and it uses conditional probabilities for computing fault trees where the same basic failure appears in more than one fault path.
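
    For the common special case of independent basic events, gate probabilities follow directly from the product rule, which is the arithmetic underneath any such calculation. The paper's method additionally handles standby redundancy and repeated basic events via conditional probabilities, which this toy sketch omits:

        def and_gate(probs):
            # All inputs must fail (independence assumed).
            p = 1.0
            for q in probs:
                p *= q
            return p

        def or_gate(probs):
            # At least one input fails (independence assumed).
            p = 1.0
            for q in probs:
                p *= 1.0 - q
            return 1.0 - p

        # Top event = OR(AND(e1, e2), e3), with illustrative probabilities.
        print(or_gate([and_gate([0.01, 0.02]), 0.005]))   # ~0.0052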

  10. 42 CFR 493.859 - Standard; ABO group and D (Rho) typing.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... attain a score of at least 100 percent of acceptable responses for each analyte or test in each testing event is unsatisfactory analyte performance for the testing event. (b) Failure to attain an overall.... (2) For any unacceptable analyte or unsatisfactory testing event score, remedial action must be taken...

  11. 42 CFR 493.859 - Standard; ABO group and D (Rho) typing.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... attain a score of at least 100 percent of acceptable responses for each analyte or test in each testing event is unsatisfactory analyte performance for the testing event. (b) Failure to attain an overall.... (2) For any unacceptable analyte or unsatisfactory testing event score, remedial action must be taken...

  12. 42 CFR 493.859 - Standard; ABO group and D (Rho) typing.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... attain a score of at least 100 percent of acceptable responses for each analyte or test in each testing event is unsatisfactory analyte performance for the testing event. (b) Failure to attain an overall.... (2) For any unacceptable analyte or unsatisfactory testing event score, remedial action must be taken...

  13. Development and validation of a multi-analyte method for the regulatory control of carotenoids used as feed additives in fish and poultry feed.

    PubMed

    Vincent, Ursula; Serano, Federica; von Holst, Christoph

    2017-08-01

    Carotenoids are used in animal nutrition mainly as sensory additives that favourably affect the colour of fish, birds and food of animal origin. Various analytical methods exist for their quantification in compound feed, reflecting the different physico-chemical characteristics of the carotenoid and the corresponding feed additives. They may be natural products or specific formulations containing the target carotenoids produced by chemical synthesis. In this study a multi-analyte method was developed that can be applied to the determination of all 10 carotenoids currently authorised within the European Union for compound feedingstuffs. The method functions regardless of whether the carotenoids have been added to the compound feed via natural products or specific formulations. It comprises three steps: (1) digestion of the feed sample with an enzyme; (2) pressurised liquid extraction; and (3) quantification of the analytes by reversed-phase HPLC coupled to a photodiode array detector in the visible range. The method was single-laboratory validated for poultry and fish feed covering a mass fraction range of the target analyte from 2.5 to 300 mg kg⁻¹. The following method performance characteristics were obtained: the recovery rate varied from 82% to 129% and precision, expressed as the relative standard deviation of intermediate precision, varied from 1.6% to 15%. Based on the acceptable performance obtained in the validation study, the multi-analyte method is considered fit for the intended purpose.

  14. Design and analysis of tubular permanent magnet linear generator for small-scale wave energy converter

    NASA Astrophysics Data System (ADS)

    Kim, Jeong-Man; Koo, Min-Mo; Jeong, Jae-Hoon; Hong, Keyyong; Cho, Il-Hyoung; Choi, Jang-Young

    2017-05-01

    This paper reports the design and analysis of a tubular permanent magnet linear generator (TPMLG) for a small-scale wave-energy converter. The analytical field computation is performed by applying a magnetic vector potential and a 2-D analytical model to determine design parameters. Based on analytical solutions, parametric analysis is performed to meet the design specifications of a wave-energy converter (WEC). Then, 2-D FEA is employed to validate the analytical method. Finally, the experimental result confirms the predictions of the analytical and finite element analysis (FEA) methods under regular and irregular wave conditions.

  15. Glaucoma progression detection: agreement, sensitivity, and specificity of expert visual field evaluation, event analysis, and trend analysis.

    PubMed

    Antón, Alfonso; Pazos, Marta; Martín, Belén; Navero, José Manuel; Ayala, Miriam Eleonora; Castany, Marta; Martínez, Patricia; Bardavío, Javier

    2013-01-01

    To assess sensitivity, specificity, and agreement among automated event analysis, automated trend analysis, and expert evaluation to detect glaucoma progression. This was a prospective study that included 37 eyes with a follow-up of 36 months. All had glaucomatous disks and fields and performed reliable visual fields every 6 months. Each series of fields was assessed with 3 different methods: subjective assessment by 2 independent teams of glaucoma experts, glaucoma/guided progression analysis (GPA) event analysis, and GPA (visual field index-based) trend analysis. Kappa agreement coefficient between methods and sensitivity and specificity for each method using expert opinion as gold standard were calculated. The incidence of glaucoma progression was 16% to 18% in 3 years but only 3 cases showed progression with all 3 methods. Kappa agreement coefficient was high (k=0.82) between subjective expert assessment and GPA event analysis, and only moderate between these two and GPA trend analysis (k=0.57). Sensitivity and specificity for GPA event and GPA trend analysis were 71% and 96%, and 57% and 93%, respectively. The 3 methods detected similar numbers of progressing cases. The GPA event analysis and expert subjective assessment showed high agreement between them and moderate agreement with GPA trend analysis. In a period of 3 years, both methods of GPA analysis offered high specificity, event analysis showed 83% sensitivity, and trend analysis had a 66% sensitivity.
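
    The agreement statistic quoted above is Cohen's kappa, chance-corrected agreement between two progression/no-progression classifications. A minimal sketch; the 2x2 counts are hypothetical, chosen only to land near the reported k = 0.82:

        def cohens_kappa(n11, n10, n01, n00):
            """n11: both methods call progression; n00: both call stable;
            n10, n01: the two kinds of disagreement."""
            n = n11 + n10 + n01 + n00
            po = (n11 + n00) / n                   # observed agreement
            pe = (((n11 + n10) * (n11 + n01)) +    # agreement expected
                  ((n01 + n00) * (n10 + n00))) / n**2   # by chance
            return (po - pe) / (1 - pe)

        # Illustrative cross-tabulation of 37 eyes.
        print(round(cohens_kappa(5, 1, 1, 30), 2))   # 0.80 for these counts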

  16. Contextual and Analytic Qualities of Research Methods Exemplified in Research on Teaching

    ERIC Educational Resources Information Center

    Svensson, Lennart; Doumas, Kyriaki

    2013-01-01

    The aim of the present article is to discuss contextual and analytic qualities of research methods. The arguments are specified in relation to research on teaching. A specific investigation is used as an example to illustrate the general methodological approach. It is argued that research methods should be carefully grounded in an understanding of…

  17. Study on Failure of Third-Party Damage for Urban Gas Pipeline Based on Fuzzy Comprehensive Evaluation.

    PubMed

    Li, Jun; Zhang, Hong; Han, Yinshan; Wang, Baodong

    2016-01-01

    Focusing on the diversity, complexity and uncertainty of third-party damage accidents, the failure probability of third-party damage to urban gas pipelines was evaluated using the theory of the analytic hierarchy process and fuzzy mathematics. The fault tree of third-party damage, containing 56 basic events, was built by hazard identification of third-party damage. Fuzzy evaluation of the basic event probabilities was conducted by the expert judgment method using membership functions of fuzzy sets. The determination of the weight of each expert and the modification of the evaluation opinions were accomplished using the improved analytic hierarchy process, and the failure probability of third-party damage to the urban gas pipeline was calculated. Taking the gas pipelines of a certain large provincial capital city as an example, the risk assessment structure of the method was shown to conform to the actual situation, which provides a basis for safety risk prevention.

  18. An Overview of Conventional and Emerging Analytical Methods for the Determination of Mycotoxins

    PubMed Central

    Cigić, Irena Kralj; Prosen, Helena

    2009-01-01

    Mycotoxins are a group of compounds produced by various fungi and excreted into the matrices on which they grow, often food intended for human consumption or animal feed. The high toxicity and carcinogenicity of these compounds and their ability to cause various pathological conditions has led to widespread screening of foods and feeds potentially polluted with them. Maximum permissible levels in different matrices have also been established for some toxins. As these are quite low, analytical methods for determination of mycotoxins have to be both sensitive and specific. In addition, an appropriate sample preparation and pre-concentration method is needed to isolate analytes from rather complicated samples. In this article, an overview of methods for analysis and sample preparation published in the last ten years is given for the most often encountered mycotoxins in different samples, mainly in food. Special emphasis is on liquid chromatography with fluorescence and mass spectrometric detection, while in the field of sample preparation various solid-phase extraction approaches are discussed. However, an overview of other analytical and sample preparation methods less often used is also given. Finally, different matrices where mycotoxins have to be determined are discussed with the emphasis on their specific characteristics important for the analysis (human food and beverages, animal feed, biological samples, environmental samples). Various issues important for accurate qualitative and quantitative analyses are critically discussed: sampling and choice of representative sample, sample preparation and possible bias associated with it, specificity of the analytical method and critical evaluation of results. PMID:19333436

  19. Microgenetic Learning Analytics Methods: Workshop Report

    ERIC Educational Resources Information Center

    Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin

    2016-01-01

    Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…

  20. Meeting in Florida: Using Asymmetric Flow Field-Flow Fractionation (AF4) to Determine C60 Colloidal Size Distributions

    EPA Science Inventory

    The study of nanomaterials in environmental systems requires robust and specific analytical methods. Analytical methods which discriminate based on particle size and molecular composition are not widely available. Asymmetric Flow Field-Flow Fractionation (AF4) is a separation...

  1. 21 CFR 530.24 - Procedure for announcing analytical methods for drug residue quantification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) ANIMAL DRUGS, FEEDS, AND RELATED PRODUCTS EXTRALABEL DRUG USE IN ANIMALS Specific Provisions Relating to Extralabel Use of Animal and Human Drugs in Food-Producing Animals § 530.24 Procedure for announcing analytical methods for drug residue quantification. (a...

  2. Magnetoresistive biosensors for quantitative proteomics

    NASA Astrophysics Data System (ADS)

    Zhou, Xiahan; Huang, Chih-Cheng; Hall, Drew A.

    2017-08-01

    Quantitative proteomics, as a developing method for the study of proteins and identification of diseases, reveals more comprehensive and accurate information about an organism than traditional genomics. A variety of platforms, such as mass spectrometry, optical sensors, electrochemical sensors, magnetic sensors, etc., have been developed for detecting proteins quantitatively. The sandwich immunoassay is widely used as a labeled detection method due to its high specificity and flexibility, allowing multiple different types of labels. While optical sensors use enzyme and fluorophore labels to detect proteins with high sensitivity, they often suffer from high background signal and challenges in miniaturization. Magnetic biosensors, including nuclear magnetic resonance sensors, oscillator-based sensors, Hall-effect sensors, and magnetoresistive sensors, use the specific binding events between magnetic nanoparticles (MNPs) and target proteins to measure the analyte concentration. Compared with other biosensing techniques, magnetic sensors take advantage of the intrinsic lack of magnetic signatures in biological samples to achieve high sensitivity and high specificity, and are compatible with semiconductor-based fabrication processes, enabling low cost and small size for point-of-care (POC) applications. Although still in the development stage, magnetic biosensing is a promising technique for in-home testing and portable disease monitoring.

  3. The Extracellular Surface of the GLP-1 Receptor Is a Molecular Trigger for Biased Agonism.

    PubMed

    Wootten, Denise; Reynolds, Christopher A; Smith, Kevin J; Mobarec, Juan C; Koole, Cassandra; Savage, Emilia E; Pabreja, Kavita; Simms, John; Sridhar, Rohan; Furness, Sebastian G B; Liu, Mengjie; Thompson, Philip E; Miller, Laurence J; Christopoulos, Arthur; Sexton, Patrick M

    2016-06-16

    Ligand-directed signal bias offers opportunities for sculpting molecular events, with the promise of better, safer therapeutics. Critical to the exploitation of signal bias is an understanding of the molecular events coupling ligand binding to intracellular signaling. Activation of class B G protein-coupled receptors is driven by interaction of the peptide N terminus with the receptor core. To understand how this drives signaling, we have used advanced analytical methods that enable separation of effects on pathway-specific signaling from those that modify agonist affinity and mapped the functional consequence of receptor modification onto three-dimensional models of a receptor-ligand complex. This yields molecular insights into the initiation of receptor activation and the mechanistic basis for biased agonism. Our data reveal that peptide agonists can engage different elements of the receptor extracellular face to achieve effector coupling and biased signaling providing a foundation for rational design of biased agonists. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  4. Adaptable, high recall, event extraction system with minimal configuration.

    PubMed

    Miwa, Makoto; Ananiadou, Sophia

    2015-01-01

    Biomedical event extraction has been a major focus of biomedical natural language processing (BioNLP) research since the first BioNLP shared task was held in 2009. Accordingly, a large number of event extraction systems have been developed. Most such systems, however, have been developed for specific tasks and/or incorporated task-specific settings, making their application to new corpora and tasks problematic without modification of the systems themselves. There is thus a need for event extraction systems that can achieve high levels of accuracy when applied to corpora in new domains, without the need for exhaustive tuning or modification, whilst retaining competitive levels of performance. We have enhanced our state-of-the-art event extraction system, EventMine, to alleviate the need for task-specific tuning. Task-specific details are specified in a configuration file, while extensive task-specific parameter tuning is avoided through the integration of a weighting method, a covariate shift method, and their combination. The task-specific configuration and weighting method have been employed within the context of two different sub-tasks of BioNLP shared task 2013, i.e. Cancer Genetics (CG) and Pathway Curation (PC), removing the need to modify the system specifically for each task. With minimal task-specific configuration and tuning, EventMine achieved 1st place in the PC task and 2nd place in the CG task, achieving the highest recall for both tasks. The system has been further enhanced following the shared task by incorporating the covariate shift method and entity generalisations based on the task definitions, leading to further performance improvements. We have shown that it is possible to apply a state-of-the-art event extraction system to new tasks with high levels of performance, without having to modify the system internally. Both covariate shift and weighting methods are useful in facilitating the production of high recall systems. These methods and their combination can adapt a model to the target data with no deep tuning and little manual configuration.

  5. Reduction in recurrent cardiovascular events with prasugrel compared with clopidogrel in patients with acute coronary syndromes from the TRITON-TIMI 38 trial

    PubMed Central

    Murphy, Sabina A.; Antman, Elliott M.; Wiviott, Stephen D.; Weerakkody, Govinda; Morocutti, Giorgio; Huber, Kurt; Lopez-Sendon, Jose; McCabe, Carolyn H.; Braunwald, Eugene

    2008-01-01

    Aims In the TRITON-TIMI 38 trial, greater platelet inhibition with prasugrel reduced the first occurrence of the primary endpoint (cardiovascular death, MI, or stroke) compared with clopidogrel in patients with an acute coronary syndrome (ACS) undergoing planned percutaneous coronary intervention. We hypothesized that prasugrel would reduce not only first events but also recurrent primary endpoint events and therefore total events compared with clopidogrel. Methods and results Poisson regression analysis was performed to compare the number of occurrences of the primary endpoint between prasugrel and clopidogrel in TRITON-TIMI 38. Landmark analytic methods were used to evaluate the risk of a recurrent primary endpoint event following an initial non-fatal endpoint event. Among patients with an initial non-fatal event, second events were significantly reduced with prasugrel compared to clopidogrel (10.8 vs. 15.4%, HR 0.65, 95% CI 0.46–0.92; P = 0.016), as was CV death following the non-fatal event (3.7 vs. 7.1%, HR 0.46, 95% CI 0.25–0.82; P = 0.008). Overall there was a reduction of 195 total primary efficacy events with prasugrel vs. clopidogrel (rate ratio 0.79, 95% CI 0.71–0.87; P < 0.001). Recurrent bleeding events occurred infrequently (TIMI major non-CABG bleeds: four with prasugrel and two with clopidogrel). Study drug discontinuation was frequent following the initial major bleeding event (42% of patients discontinued study drug). Conclusion While standard statistical analytic techniques for clinical trials censor patients who experience a component of the primary composite endpoint, total cardiovascular events remain important to both patients and clinicians. Prasugrel, a more potent anti-platelet agent, reduced both first and subsequent cardiovascular events compared with clopidogrel in patients with ACS. PMID:18682445
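
    The headline comparison of total (first plus recurrent) events reduces to a Poisson rate ratio. A back-of-envelope sketch of that calculation with a Wald confidence interval; the event counts and person-time below are invented for illustration and are not the trial's data:

        import math

        def poisson_rate_ratio(events_a, pt_a, events_b, pt_b, z=1.96):
            """Rate ratio A/B for total event counts with person-time
            offsets, with a Wald CI computed on the log scale."""
            rr = (events_a / pt_a) / (events_b / pt_b)
            se = math.sqrt(1.0 / events_a + 1.0 / events_b)
            return rr, (math.exp(math.log(rr) - z * se),
                        math.exp(math.log(rr) + z * se))

        # Hypothetical totals over equal person-time in each arm.
        print(poisson_rate_ratio(events_a=800, pt_a=9000.0,
                                 events_b=995, pt_b=9000.0))
        # -> (0.804, (0.732, 0.883))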

  6. Directivity analysis of meander-line-coil EMATs with a wholly analytical method.

    PubMed

    Xie, Yuedong; Liu, Zenghua; Yin, Liyuan; Wu, Jiande; Deng, Peng; Yin, Wuliang

    2017-01-01

    This paper presents the simulation and experimental study of the radiation pattern of a meander-line-coil EMAT. A wholly analytical method, which involves the coupling of two models: an analytical EM model and an analytical UT model, has been developed to build EMAT models and analyse the Rayleigh waves' beam directivity. For a specific sensor configuration, Lorentz forces are calculated using the EM analytical method, which is adapted from the classic Deeds and Dodd solution. The calculated Lorentz force density are imported to an analytical ultrasonic model as driven point sources, which produce the Rayleigh waves within a layered medium. The effect of the length of the meander-line-coil on the Rayleigh waves' beam directivity is analysed quantitatively and verified experimentally. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. Element analysis: a wavelet-based method for analysing time-localized events in noisy time series.

    PubMed

    Lilly, Jonathan M

    2017-04-01

    A method is derived for the quantitative analysis of signals that are composed of superpositions of isolated, time-localized 'events'. Here, these events are taken to be well represented as rescaled and phase-rotated versions of generalized Morse wavelets, a broad family of continuous analytic functions. Analysing a signal composed of replicates of such a function using another Morse wavelet allows one to directly estimate the properties of events from the values of the wavelet transform at its own maxima. The distribution of events in general power-law noise is determined in order to establish significance based on an expected false detection rate. Finally, an expression for an event's 'region of influence' within the wavelet transform permits the formation of a criterion for rejecting spurious maxima due to numerical artefacts or other unsuitable events. Signals can then be reconstructed based on a small number of isolated points on the time/scale plane. This method, termed element analysis, is applied to the identification of long-lived eddy structures in ocean currents as observed by along-track measurements of sea surface elevation from satellite altimetry.
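
    The mechanics of the method, transforming the series and keeping local maxima of the transform magnitude as candidate events, can be sketched compactly. Here a complex Morlet wavelet stands in for the generalized Morse family, and the paper's noise-based false-detection threshold and region-of-influence test are omitted:

        import numpy as np

        def morlet(n, scale, w0=6.0):
            """Complex Morlet wavelet on n samples at a given scale
            (a stand-in for generalized Morse wavelets)."""
            t = (np.arange(n) - n // 2) / scale
            return np.exp(1j * w0 * t) * np.exp(-t**2 / 2) / np.sqrt(scale)

        def wavelet_maxima(x, scales, support=6):
            """Candidate events: (scale, time index, |W|) at local maxima
            of the wavelet-transform magnitude along time, per scale."""
            events = []
            for s in scales:
                n = int(2 * support * s) | 1     # odd kernel length
                mag = np.abs(np.convolve(x, np.conj(morlet(n, s)),
                                         mode="same"))
                idx = np.nonzero((mag[1:-1] > mag[:-2]) &
                                 (mag[1:-1] > mag[2:]))[0] + 1
                events += [(s, i, mag[i]) for i in idx]
            return events

        # An isolated oscillatory event in noise; the strongest maximum
        # should sit near sample 1000 at the matching scale (~10 here).
        x = np.random.randn(2000)
        x[950:1050] += 3.0 * np.hanning(100) * np.cos(
            2 * np.pi * 0.1 * np.arange(100))
        print(max(wavelet_maxima(x, scales=[5, 10, 20]), key=lambda e: e[2]))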

  8. Methods for determination of radioactive substances in water and fluvial sediments

    USGS Publications Warehouse

    Thatcher, Leland Lincoln; Janzer, Victor J.; Edwards, Kenneth W.

    1977-01-01

    Analytical methods for the determination of some of the more important components of fission or neutron activation product radioactivity and of natural radioactivity found in water are reported. The report for each analytical method includes conditions for application of the method, a summary of the method, interferences, required apparatus and reagents, analytical procedures, calculations, reporting of results, and estimation of precision. The fission product isotopes considered are cesium-137, strontium-90, and ruthenium-106. The natural radioelements and isotopes considered are uranium, lead-210, radium-226, radium-228, tritium, and carbon-14. A gross radioactivity survey method and a uranium isotope ratio method are given. When two analytical methods are in routine use for an individual isotope, both methods are reported with identification of the specific areas of application of each. Techniques for the collection and preservation of water samples to be analyzed for radioactivity are discussed.

  9. Analyzing chromatographic data using multilevel modeling.

    PubMed

    Wiczling, Paweł

    2018-06-01

    It is relatively easy to collect chromatographic measurements for a large number of analytes, especially with gradient chromatographic methods coupled with mass spectrometry detection. Such data often have a hierarchical or clustered structure. For example, analytes with similar hydrophobicity and dissociation constant tend to be more alike in their retention than a randomly chosen set of analytes. Multilevel models recognize the existence of such data structures by assigning a model for each parameter, with its parameters also estimated from data. In this work, a multilevel model is proposed to describe retention time data obtained from a series of wide linear organic modifier gradients of different gradient duration and different mobile phase pH for a large set of acids and bases. The multilevel model consists of (1) the same deterministic equation describing the relationship between retention time and analyte-specific and instrument-specific parameters, (2) covariance relationships relating various physicochemical properties of the analyte to chromatographically specific parameters through quantitative structure-retention relationship based equations, and (3) stochastic components of intra-analyte and interanalyte variability. The model was implemented in Stan, which provides full Bayesian inference for continuous-variable models through Markov chain Monte Carlo methods. Graphical abstract: Relationships between log k and MeOH content for acidic, basic, and neutral compounds with different log P. CI, credible interval; PSA, polar surface area.

  10. Rational Selection, Criticality Assessment, and Tiering of Quality Attributes and Test Methods for Analytical Similarity Evaluation of Biosimilars.

    PubMed

    Vandekerckhove, Kristof; Seidl, Andreas; Gutka, Hiten; Kumar, Manish; Gratzl, Gyöngyi; Keire, David; Coffey, Todd; Kuehne, Henriette

    2018-05-10

    Leading regulatory agencies recommend biosimilar assessment to proceed in a stepwise fashion, starting with a detailed analytical comparison of the structural and functional properties of the proposed biosimilar and reference product. The degree of analytical similarity determines the degree of residual uncertainty that must be addressed through downstream in vivo studies. Substantive evidence of similarity from comprehensive analytical testing may justify a targeted clinical development plan, and thus enable a shorter path to licensing. The importance of a careful design of the analytical similarity study program therefore should not be underestimated. Designing a state-of-the-art analytical similarity study meeting current regulatory requirements in regions such as the USA and EU requires a methodical approach, consisting of specific steps that far precede the work on the actual analytical study protocol. This white paper discusses scientific and methodological considerations on the process of attribute and test method selection, criticality assessment, and subsequent assignment of analytical measures to US FDA's three tiers of analytical similarity assessment. Case examples of selection of critical quality attributes and analytical methods for similarity exercises are provided to illustrate the practical implementation of the principles discussed.

  11. Second International Conference on Accelerating Biopharmaceutical Development: March 9-12, 2009, Coronado, CA USA.

    PubMed

    Reichert, Janice M; Jacob, Nitya; Amanullah, Ashraf

    2009-01-01

    The Second International Conference on Accelerating Biopharmaceutical Development was held in Coronado, California. The meeting was organized by the Society for Biological Engineering (SBE) and the American Institute of Chemical Engineers (AIChE); SBE is a technological community of the AIChE. Bob Adamson (Wyeth) and Chuck Goochee (Centocor) were co-chairs of the event, which had the theme "Delivering cost-effective, robust processes and methods quickly and efficiently." The first day focused on emerging disruptive technologies and cutting-edge analytical techniques. Day two featured presentations on accelerated cell culture process development, critical quality attributes, specifications and comparability, and high throughput protein formulation development. The final day was dedicated to discussion of technology options and new analysis methods provided by emerging disruptive technologies; functional interaction, integration and synergy in platform development; and rapid and economic purification process development.

  13. Analyte-driven switching of DNA charge transport: de novo creation of electronic sensors for an early lung cancer biomarker.

    PubMed

    Thomas, Jason M; Chakraborty, Banani; Sen, Dipankar; Yu, Hua-Zhong

    2012-08-22

    A general approach is described for the de novo design and construction of aptamer-based electrochemical biosensors, for potentially any analyte of interest (ranging from small ligands to biological macromolecules). As a demonstration of the approach, we report the rapid development of a made-to-order electronic sensor for a newly reported early biomarker for lung cancer (CTAP III/NAP2). The steps include the in vitro selection and characterization of DNA aptamer sequences, design and biochemical testing of wholly DNA sensor constructs, and translation to a functional electrode-bound sensor format. The working principle of this distinct class of electronic biosensors is the enhancement of DNA-mediated charge transport in response to analyte binding. We first verify such analyte-responsive charge transport switching in solution, using biochemical methods; successful sensor variants were then immobilized on gold electrodes. We show that using these sensor-modified electrodes, CTAP III/NAP2 can be detected with both high specificity and sensitivity (K_d ≈ 1 nM) through a direct electrochemical reading. To investigate the underlying basis of analyte binding-induced conductivity switching, we carried out Förster Resonance Energy Transfer (FRET) experiments. The FRET data establish that analyte binding-induced conductivity switching in these sensors results from very subtle structural/conformational changes, rather than large-scale, global folding events. The implications of this finding are discussed with respect to possible charge transport switching mechanisms in electrode-bound sensors. Overall, the approach we describe here represents a unique design principle for aptamer-based electrochemical sensors; its application should enable rapid, on-demand access to a class of portable biosensors that offer robust, inexpensive, and operationally simplified alternatives to conventional antibody-based immunoassays.

  14. Sex-specific reference intervals of hematologic and biochemical analytes in Sprague-Dawley rats using the nonparametric rank percentile method.

    PubMed

    He, Qili; Su, Guoming; Liu, Keliang; Zhang, Fangcheng; Jiang, Yong; Gao, Jun; Liu, Lida; Jiang, Zhongren; Jin, Minwu; Xie, Huiping

    2017-01-01

    Hematologic and biochemical analytes of Sprague-Dawley rats are commonly used to determine treatment-induced effects and to evaluate organ dysfunction in toxicological safety assessments, but reference intervals have not been well established for these analytes. Reference intervals as presently defined for these analytes in Sprague-Dawley rats have neither used internationally recommended statistical methods nor been stratified by sex. Thus, we aimed to establish sex-specific reference intervals for hematologic and biochemical parameters in Sprague-Dawley rats according to the Clinical and Laboratory Standards Institute C28-A3 and American Society for Veterinary Clinical Pathology guidelines. Hematology and biochemistry blood samples were collected from 500 healthy Sprague-Dawley rats (250 males and 250 females) in the control groups. We measured 24 hematologic analytes with the Sysmex XT-2100i analyzer and 9 biochemical analytes with the Olympus AU400 analyzer. We then determined statistically relevant sex partitions and calculated reference intervals, including corresponding 90% confidence intervals, using the nonparametric rank percentile method. We observed that most hematologic and biochemical analytes of Sprague-Dawley rats were significantly influenced by sex. Males had higher hemoglobin, hematocrit, red blood cell count, red cell distribution width, mean corpuscular volume, mean corpuscular hemoglobin, white blood cell count, neutrophils, lymphocytes, monocytes, percentage of neutrophils, percentage of monocytes, alanine aminotransferase, aspartate aminotransferase, and triglycerides compared to females. Females had higher mean corpuscular hemoglobin concentration, plateletcrit, platelet count, eosinophils, percentage of lymphocytes, percentage of eosinophils, creatinine, glucose, total cholesterol, and urea compared to males. Sex partition was required for most hematologic and biochemical analytes in Sprague-Dawley rats. We established sex-specific reference intervals, including corresponding 90% confidence intervals, for Sprague-Dawley rats. Understanding the significant discrepancies in hematologic and biochemical analytes between male and female Sprague-Dawley rats provides important insight into physiological effects in test rats. Establishment of locally sex-specific reference intervals allows a more precise evaluation of animal quality and experimental results of Sprague-Dawley rats in our toxicology safety assessment.
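
    The nonparametric rank percentile approach can be sketched in a few lines; the snippet below estimates the central 95% interval and bootstrap 90% confidence intervals for each limit. The exact rank formula and CI construction of CLSI C28-A3 may differ in detail, and the data are synthetic.

    ```python
    import numpy as np

    def reference_interval(values, lower=0.025, upper=0.975, n_boot=2000, seed=1):
        """Nonparametric rank-percentile reference interval with bootstrap
        90% CIs for each limit (a sketch in the CLSI C28-A3 style; the
        paper's exact rank and CI formulas may differ)."""
        x = np.sort(np.asarray(values, float))
        rng = np.random.default_rng(seed)
        limits = np.quantile(x, [lower, upper])
        boots = np.quantile(rng.choice(x, (n_boot, x.size), replace=True),
                            [lower, upper], axis=1)
        ci_low = np.quantile(boots[0], [0.05, 0.95])   # 90% CI, lower limit
        ci_up  = np.quantile(boots[1], [0.05, 0.95])   # 90% CI, upper limit
        return limits, ci_low, ci_up

    # Hypothetical hemoglobin values (g/dL) from 250 male control rats
    rng = np.random.default_rng(0)
    hgb = rng.normal(15.2, 0.9, 250)
    print(reference_interval(hgb))
    ```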

  15. Major advances in testing of dairy products: milk component and dairy product attribute testing.

    PubMed

    Barbano, D M; Lynch, J M

    2006-04-01

    Milk component analysis is relatively unusual in the field of quantitative analytical chemistry because an analytical test result determines the allocation of very large amounts of money between buyers and sellers of milk. Therefore, there is high incentive to develop and refine these methods to achieve a level of analytical performance rarely demanded of most methods or laboratory staff working in analytical chemistry. In the last 25 yr, well-defined statistical methods to characterize and validate analytical method performance combined with significant improvements in both the chemical and instrumental methods have allowed achievement of improved analytical performance for payment testing. A shift from marketing commodity dairy products to the development, manufacture, and marketing of value added dairy foods for specific market segments has created a need for instrumental and sensory approaches and quantitative data to support product development and marketing. Bringing together sensory data from quantitative descriptive analysis and analytical data from gas chromatography olfactometry for identification of odor-active compounds in complex natural dairy foods has enabled the sensory scientist and analytical chemist to work together to improve the consistency and quality of dairy food flavors.

  16. The NIH analytical methods and reference materials program for dietary supplements.

    PubMed

    Betz, Joseph M; Fisher, Kenneth D; Saldanha, Leila G; Coates, Paul M

    2007-09-01

    Quality of botanical products is a great uncertainty that consumers, clinicians, regulators, and researchers face. Definitions of quality abound, and include specifications for sanitation, adventitious agents (pesticides, metals, weeds), and content of natural chemicals. Because dietary supplements (DS) are often complex mixtures, they pose analytical challenges and method validation may be difficult. In response to product quality concerns and the need for validated and publicly available methods for DS analysis, the US Congress directed the Office of Dietary Supplements (ODS) at the National Institutes of Health (NIH) to accelerate an ongoing methods validation process, and the Dietary Supplements Methods and Reference Materials Program was created. The program was constructed from stakeholder input and incorporates several federal procurement and granting mechanisms in a coordinated and interlocking framework. The framework facilitates validation of analytical methods, analytical standards, and reference materials.

  17. A new approach for bioassays based on frequency- and time-domain measurements of magnetic nanoparticles.

    PubMed

    Oisjöen, Fredrik; Schneiderman, Justin F; Astalan, Andrea Prieto; Kalabukhov, Alexey; Johansson, Christer; Winkler, Dag

    2010-01-15

    We demonstrate a one-step wash-free bioassay measurement system capable of tracking biochemical binding events. Our approach combines the high resolution of frequency-domain and the high speed of time-domain measurements in a single device, together with a fast one-step bioassay. The one-step nature of our magnetic nanoparticle (MNP) based assay reduces the time between sample extraction and quantitative results while mitigating the risks of contamination related to washing steps. Our method also enables tracking of binding events, providing the possibility of, for example, investigating how chemical/biological environments affect the rate of a binding process or studying the action of certain drugs. We detect specific biological binding events occurring on the surfaces of fluid-suspended MNPs that modify their magnetic relaxation behavior. Herein, we extrapolate a modest sensitivity to analyte of 100 ng/ml with the present setup using our rapid one-step bioassay. More importantly, we determine the size distributions of the MNP systems with theoretical fits to our data obtained from the two complementary measurement modalities and demonstrate quantitative agreement between them. Copyright 2009 Elsevier B.V. All rights reserved.

  18. Analytical Pitfalls of Therapeutic Drug Monitoring of Thiopurines in Patients With Inflammatory Bowel Disease

    PubMed Central

    Meijer, Berrie; Mulder, Chris J. J.; van Bodegraven, Adriaan A.; de Boer, Nanne K. H.

    2017-01-01

    Abstract: The use of thiopurines in the treatment of inflammatory bowel disease (IBD) can be optimized by the application of therapeutic drug monitoring. In this procedure, 6-thioguanine nucleotides (6-TGN) and 6-methylmercaptopurine (6-MMP) metabolites are monitored and related to therapeutic response and adverse events, respectively. Therapeutic drug monitoring of thiopurines, however, is hampered by several analytical limitations, resulting in an impaired translation of metabolite levels to clinical outcome in IBD. Thiopurine metabolism is cell specific and requires nucleated cells and particular enzymes for 6-TGN formation. In current therapeutic drug monitoring, metabolite levels are assessed in erythrocytes, whereas leukocytes are considered the main target cells of these drugs. Furthermore, currently used methods do not distinguish between active nucleotides and their unwanted residual products. Lastly, there is a lack of a standardized laboratory procedure for metabolite assessment that accounts for the substantial instability of erythrocyte 6-TGN. To improve thiopurine therapy in patients with IBD, it is necessary to understand these limitations and recognize the general misconceptions in this procedure. PMID:29040228

  19. Mining patterns in persistent surveillance systems with smart query and visual analytics

    NASA Astrophysics Data System (ADS)

    Habibi, Mohammad S.; Shirkhodaie, Amir

    2013-05-01

    In Persistent Surveillance Systems (PSS), the ability to detect and characterize events geospatially helps analysts take pre-emptive steps to counter an adversary's actions. An interactive Visual Analytics (VA) model offers a platform for pattern investigation and reasoning to comprehend and/or predict such occurrences. Identifying and offsetting these threats requires collecting information from diverse sources, which brings with it increasingly abstract data. These abstract semantic data have a degree of inherent uncertainty and imprecision and require filtration before further processing. In this paper, we introduce an approach based on the Vector Space Modeling (VSM) technique for classification of spatiotemporal sequential patterns of group activities. The feature vectors consist of an array of attributes extracted from sensor-generated, semantically annotated messages. To facilitate proper similarity matching and detection of time-varying spatiotemporal patterns, a temporal Dynamic Time Warping (DTW) method combined with a Gaussian Mixture Model (GMM) fitted by Expectation Maximization (EM) is introduced. DTW is intended for detection of event patterns from neighborhood-proximity semantic frames derived from an established ontology. GMM with EM, on the other hand, is employed as a Bayesian probabilistic model to estimate the probability of events associated with a detected spatiotemporal pattern. We also present a new visual analytic tool for testing and evaluating group activities detected under this scheme. Experimental results demonstrate the effectiveness of the proposed approach for discovery and matching of subsequences within the sequentially generated pattern space of our experiments.
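
    For readers unfamiliar with DTW, a generic implementation is sketched below; it is not the paper's specific temporal-DTW variant, and the example sequences are invented.

    ```python
    import numpy as np

    def dtw_distance(a, b):
        """Classic dynamic-time-warping distance between two feature
        sequences (rows = time steps), via the standard DP recurrence."""
        a, b = np.atleast_2d(a), np.atleast_2d(b)
        n, m = len(a), len(b)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = np.linalg.norm(a[i - 1] - b[j - 1])   # local distance
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n, m]

    # Two hypothetical spatiotemporal attribute sequences with different pacing
    s1 = np.array([[0, 0], [1, 0], [2, 1], [3, 1]], float)
    s2 = np.array([[0, 0], [0.5, 0], [1, 0], [2, 1], [3, 1]], float)
    print(dtw_distance(s1, s2))   # small despite unequal lengths
    ```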

  20. Parallel Discrete Molecular Dynamics Simulation With Speculation and In-Order Commitment*†

    PubMed Central

    Khan, Md. Ashfaquzzaman; Herbordt, Martin C.

    2011-01-01

    Discrete molecular dynamics simulation (DMD) uses simplified and discretized models enabling simulations to advance by event rather than by timestep. DMD is an instance of discrete event simulation and so is difficult to scale: even in this multi-core era, all reported DMD codes are serial. In this paper we discuss the inherent difficulties of scaling DMD and present our method of parallelizing DMD through event-based decomposition. Our method is microarchitecture inspired: speculative processing of events exposes parallelism, while in-order commitment ensures correctness. We analyze the potential of this parallelization method for shared-memory multiprocessors. Achieving scalability required extensive experimentation with scheduling and synchronization methods to mitigate serialization. The speed-up achieved for a variety of system sizes and complexities is nearly 6× on an 8-core and over 9× on a 12-core processor. We present and verify analytical models that account for the achieved performance as a function of available concurrency and architectural limitations. PMID:21822327
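
    The core idea, speculative processing with in-order commitment, can be caricatured in a few lines of sequential Python. The event names and results below are invented; a real engine would run the speculative phase on many cores and squash mis-speculated events.

    ```python
    import heapq

    # Toy illustration of the paper's two ideas: events are *processed*
    # speculatively (here, simply out of timestamp order) but *committed*
    # strictly in timestamp order, which preserves correctness.
    events = [(0.3, "A-B collision"), (0.1, "B-wall bounce"), (0.2, "C-D collision")]

    # Speculative phase: work is done in arrival order, not time order.
    buffered = [(t, f"result({name})") for t, name in events]

    # Commit phase: results become visible only in timestamp order. A real
    # engine would also squash and re-execute events invalidated by an
    # earlier commit that changed their inputs.
    heapq.heapify(buffered)
    clock = 0.0
    while buffered:
        t, result = heapq.heappop(buffered)
        assert t >= clock          # in-order commitment invariant
        clock = t
        print(f"commit t={t}: {result}")
    ```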

  2. Exploring the Use of Computer Simulations in Unraveling Research and Development Governance Problems

    NASA Technical Reports Server (NTRS)

    Balaban, Mariusz A.; Hester, Patrick T.

    2012-01-01

    Understanding Research and Development (R&D) enterprise relationships and processes at a governance level is not a simple task, but valuable decision-making insight and evaluation capabilities can be gained from their exploration through computer simulations. This paper discusses current Modeling and Simulation (M&S) methods, addressing their applicability to R&D enterprise governance. Specifically, the authors analyze advantages and disadvantages of the four methodologies used most often by M&S practitioners: System Dynamics (SD), Discrete Event Simulation (DES), Agent Based Modeling (ABM), and formal Analytic Methods (AM) for modeling systems at the governance level. Moreover, the paper describes nesting models using a multi-method approach. Guidance is provided to those seeking to employ modeling techniques in an R&D enterprise for the purposes of understanding enterprise governance. Further, an example is modeled and explored for potential insight. The paper concludes with recommendations regarding opportunities for concentration of future work in modeling and simulating R&D governance relationships and processes.

  3. Sex differences in event-related risk for major depression.

    PubMed

    Maciejewski, P K; Prigerson, H G; Mazure, C M

    2001-05-01

    This study sought to determine if women are more likely than men to experience an episode of major depression in response to stressful life events. Sex differences in event-related risk for depression were examined by means of secondary analyses employing data from the Americans' Changing Lives study. The occurrence and time of occurrence of depression onset and instances of stressful life events within a 12-month period preceding a structured interview were documented in a community-based sample of 1024 men and 1800 women. Survival analytical techniques were used to examine sex differences in risk for depression associated with generic and specific stressful life events. Women were approximately three times more likely than men to experience major depression in response to any stressful life event. Women and men did not differ in risk for depression associated with the death of a spouse or child, events affecting their relationship to a spouse/partner (divorce and marital/love problems) or events corresponding to acute financial or legal difficulties. Women were at elevated risk for depression associated with more distant interpersonal losses (death of a close friend or relative) and other types of events (change of residence, physical attack, or life-threatening illness/injury). Stressful life events overall, with some exceptions among specific event types, pose a greater risk for depression among women compared to men.

  4. Preanalytics in lung cancer.

    PubMed

    Warth, Arne; Muley, Thomas; Meister, Michael; Weichert, Wilko

    2015-01-01

    Preanalytic sampling techniques and preparation of tissue specimens strongly influence analytical results in lung tissue diagnostics, both on the morphological and on the molecular level. However, in contrast to analytics, where tremendous achievements in the last decade have led to a whole new portfolio of test methods, developments in preanalytics have been minimal. This is particularly unfortunate in lung cancer, where usually only small amounts of tissue are at hand and optimization of all processing steps is mandatory in order to increase the diagnostic yield. In the following, we provide a comprehensive overview of some aspects of preanalytics in lung cancer, from the method of sampling through tissue processing to its impact on analytical test results. We specifically discuss the role of preanalytics in novel technologies like next-generation sequencing and in state-of-the-art cytology preparations. In addition, we point out specific problems in preanalytics which hamper further developments in the field of lung tissue diagnostics.

  5. Statistical error in simulations of Poisson processes: Example of diffusion in solids

    NASA Astrophysics Data System (ADS)

    Nilsson, Johan O.; Leetmaa, Mikael; Vekilova, Olga Yu.; Simak, Sergei I.; Skorodumova, Natalia V.

    2016-08-01

    Simulations of diffusion in solids often produce poor statistics of diffusion events. We present an analytical expression for the statistical error in ion conductivity obtained in such simulations. The error expression is not restricted to any particular computational method but is valid for simulations of Poisson processes in general. This analytical error expression is verified numerically for the case of Gd-doped ceria by running a large number of kinetic Monte Carlo calculations.
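
    Although the paper's specific error expression for ionic conductivity is not reproduced here, the generic Poisson scaling it builds on is easy to verify numerically: the relative standard error of a rate estimated from roughly N events behaves as 1/sqrt(N). The rate and run time below are hypothetical.

    ```python
    import numpy as np

    # For a Poisson process observed for time T, the rate estimate N/T has
    # relative standard error 1/sqrt(lam*T), i.e. ~1/sqrt(expected events).
    rng = np.random.default_rng(42)
    lam, T, n_runs = 5.0, 20.0, 100_000      # hypothetical hop rate and run time

    counts = rng.poisson(lam * T, n_runs)    # events per simulation run
    rel_err_observed = (counts / T).std() / lam
    rel_err_predicted = 1.0 / np.sqrt(lam * T)
    print(rel_err_observed, rel_err_predicted)   # both ~0.1
    ```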

  6. 21 CFR 530.22 - Safe levels and analytical methods for food-producing animals.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    Title 21, Food and Drugs (2010-04-01): Section 530.22, Safe levels and analytical methods for food-producing animals. Food and Drugs, FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH... ANIMALS, Specific Provisions Relating to Extralabel Use of Animal and Human Drugs in Food-Producing Animals...

  7. Analysis strategies for longitudinal attachment loss data.

    PubMed

    Beck, J D; Elter, J R

    2000-02-01

    The purpose of this invited review is to describe and discuss methods currently in use to quantify the progression of attachment loss in epidemiological studies of periodontal disease, and to make recommendations for specific analytic methods based upon the particular design of the study and the structure of the data. The review concentrates on: the definition of incident attachment loss (ALOSS) and its component parts; measurement issues, including thresholds and regression to the mean; methods of accounting for longitudinal change, including changes in means, changes in proportions of affected sites, incidence density, the effect of tooth loss and reversals, and repeated events; statistical models of longitudinal change, including the incorporation of the time element, the use of linear, logistic, or Poisson regression or survival analysis, and statistical tests; site versus person level of analysis, including statistical adjustment for correlated data; and the strengths and limitations of ALOSS data. Examples from the Piedmont 65+ Dental Study are used to illustrate specific concepts. We conclude that incidence density is the preferred methodology for periodontal studies with more than one period of follow-up, and that analyses not employing methods for dealing with complex samples, correlated data, and repeated measures do not take advantage of our current understanding of the site- and person-level variables important in periodontal disease and may generate biased results.
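
    As a concrete illustration of the incidence density the authors recommend, the sketch below computes events per unit person-time with a simple Poisson-based confidence interval; the counts are hypothetical.

    ```python
    import math

    def incidence_density(events, person_time, alpha=0.10):
        """Incidence density (events per unit person-time at risk) with a
        log-normal Poisson CI. A generic epidemiological calculation, shown
        only to make the quantity discussed in the review concrete."""
        rate = events / person_time
        se_log = 1.0 / math.sqrt(events)          # SE of log(rate)
        z = 1.645 if alpha == 0.10 else 1.96
        lo = rate * math.exp(-z * se_log)
        hi = rate * math.exp(+z * se_log)
        return rate, (lo, hi)

    # Hypothetical: 42 sites showing new attachment loss over 310 site-years
    print(incidence_density(42, 310))
    ```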

  8. An overview of health forecasting.

    PubMed

    Soyiri, Ireneous N; Reidpath, Daniel D

    2013-01-01

    Health forecasting is a novel area of forecasting, and a valuable tool for predicting future health events or situations such as demands for health services and healthcare needs. It facilitates preventive medicine and health care intervention strategies by pre-informing health service providers to take appropriate mitigating actions to minimize risks and manage demand. Health forecasting requires reliable data, information, and appropriate analytical tools for the prediction of specific health conditions or situations. There is no single approach to health forecasting, and so various methods have been adopted to forecast aggregate or specific health conditions. Meanwhile, there are no defined health forecasting horizons (time frames) to match the choices of health forecasting methods/approaches that are often applied, and the key principles of health forecasting have also not been adequately described to guide the process. This paper provides a brief introduction and theoretical analysis of health forecasting. It describes the key issues that are important for health forecasting, including definitions, principles of health forecasting, and the properties of health data, which influence the choices of health forecasting methods. Other matters related to the value of health forecasting, and the general challenges associated with developing and using health forecasting services, are discussed. This overview is a stimulus for further discussions on standardizing health forecasting approaches and methods that will facilitate health care and health services delivery.

  9. Electrochemical lateral flow immunosensor for detection and quantification of dengue NS1 protein.

    PubMed

    Sinawang, Prima Dewi; Rai, Varun; Ionescu, Rodica E; Marks, Robert S

    2016-03-15

    An Electrochemical Lateral Flow Immunosensor (ELFI) is developed, combining screen-printed gold electrodes (SPGE), which enable quantification, with the convenience of a lateral flow test strip. A cellulose glassy fiber paper conjugate pad retains the marker immunoelectroactive nanobeads, which bind the target analyte of interest. The specific immunorecognition event continues to occur along the lateral flow bed until the SPGE capture antibodies at the end of the cellulosic lateral flow strip are reached. In this immunoassay, the analyte antigen (NS1 protein) is captured selectively and specifically by the dengue NS1 antibody conjugated onto the immunonanobeads, forming an immunocomplex. With the aid of a running buffer, the immunocomplexes flow to the immuno-conjugated electrode surface, where specific sandwich-type detection forms through molecular recognition, while unbound beads move past the electrodes. The successful sandwich immunocomplex formation is then recorded electrochemically. Specific detection of NS1 is translated into an electrochemical signal contributed by a redox label present on the bead-immobilized detection dengue NS1 antibody, and a proportional increase of faradaic current is observed with increasing analyte NS1 protein concentration. The first-generation ELFI prototype is simply assembled in a cassette and successfully demonstrates a linear response over a concentration range of 1-25 ng/mL, with an ultrasensitive detection limit of 0.5 ng/mL, for the qualitative and quantitative detection of the dengue NS1 protein. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. Methodological Caveats in the Detection of Coordinated Replay between Place Cells and Grid Cells.

    PubMed

    Trimper, John B; Trettel, Sean G; Hwaun, Ernie; Colgin, Laura Lee

    2017-01-01

    At rest, hippocampal "place cells," neurons with receptive fields corresponding to specific spatial locations, reactivate in a manner that reflects recently traveled trajectories. These "replay" events have been proposed as a mechanism underlying memory consolidation, or the transfer of a memory representation from the hippocampus to neocortical regions associated with the original sensory experience. Accordingly, it has been hypothesized that hippocampal replay of a particular experience should be accompanied by simultaneous reactivation of corresponding representations in the neocortex and in the entorhinal cortex, the primary interface between the hippocampus and the neocortex. Recent studies have reported that coordinated replay may occur between hippocampal place cells and medial entorhinal cortex grid cells, cells with multiple spatial receptive fields. Assessing replay in grid cells is problematic, however, as the cells exhibit regularly spaced spatial receptive fields in all environments and, therefore, coordinated replay between place cells and grid cells may be detected by chance. In the present report, we adapted analytical approaches utilized in recent studies of grid cell and place cell replay to determine the extent to which coordinated replay is spuriously detected between grid cells and place cells recorded from separate rats. For a subset of the employed analytical methods, coordinated replay was detected spuriously in a significant proportion of cases in which place cell replay events were randomly matched with grid cell firing epochs of equal duration. More rigorous replay evaluation procedures and minimum spike count requirements greatly reduced the amount of spurious findings. These results provide insights into aspects of place cell and grid cell activity during rest that contribute to false detection of coordinated replay. The results further emphasize the need for careful controls and rigorous methods when testing the hypothesis that place cells and grid cells exhibit coordinated replay.
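
    A generic version of the shuffle control at issue can be sketched as a permutation test; the replay statistic below (a mean event-pair correlation) and all data are stand-ins, not the specific measures analyzed in the paper.

    ```python
    import numpy as np

    # Score coordination between place-cell replay events and grid-cell
    # epochs, then compare against scores from randomly re-paired events.
    rng = np.random.default_rng(3)
    n_events, n_bins = 200, 30
    place = rng.normal(size=(n_events, n_bins))
    grid = place * 0.3 + rng.normal(size=(n_events, n_bins))  # weak true coupling

    def pair_score(a, b):
        return np.mean([np.corrcoef(x, y)[0, 1] for x, y in zip(a, b)])

    observed = pair_score(place, grid)
    null = np.array([pair_score(place, grid[rng.permutation(n_events)])
                     for _ in range(500)])
    p = (np.sum(null >= observed) + 1) / (null.size + 1)
    print(observed, null.mean(), p)   # spurious "coordination" would show p >> 0
    ```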

  11. Holographic stress-energy tensor near the Cauchy horizon inside a rotating black hole

    NASA Astrophysics Data System (ADS)

    Ishibashi, Akihiro; Maeda, Kengo; Mefford, Eric

    2017-07-01

    We investigate a stress-energy tensor for a conformal field theory (CFT) at strong coupling inside a small five-dimensional rotating Myers-Perry black hole with equal angular momenta by using the holographic method. As a gravitational dual, we perturbatively construct a black droplet solution by applying the "derivative expansion" method, generalizing the work of Haddad [Classical Quantum Gravity 29, 245001 (2012), 10.1088/0264-9381/29/24/245001] and analytically compute the holographic stress-energy tensor for our solution. We find that the stress-energy tensor is finite at both the future and past outer (event) horizons and that the energy density is negative just outside the event horizons due to the Hawking effect. Furthermore, we apply the holographic method to the question of quantum instability of the Cauchy horizon since, by construction, our black droplet solution also admits a Cauchy horizon inside. We analytically show that the null-null component of the holographic stress-energy tensor negatively diverges at the Cauchy horizon, suggesting that a singularity appears there, in favor of strong cosmic censorship.

  12. Adaptable, high recall, event extraction system with minimal configuration

    PubMed Central

    2015-01-01

    Background Biomedical event extraction has been a major focus of biomedical natural language processing (BioNLP) research since the first BioNLP shared task was held in 2009. Accordingly, a large number of event extraction systems have been developed. Most such systems, however, have been developed for specific tasks and/or incorporated task-specific settings, making their application to new corpora and tasks problematic without modification of the systems themselves. There is thus a need for event extraction systems that can achieve high levels of accuracy when applied to corpora in new domains, without the need for exhaustive tuning or modification, whilst retaining competitive levels of performance. Results We have enhanced our state-of-the-art event extraction system, EventMine, to alleviate the need for task-specific tuning. Task-specific details are specified in a configuration file, while extensive task-specific parameter tuning is avoided through the integration of a weighting method, a covariate shift method, and their combination. The task-specific configuration and weighting method have been employed within the context of two different sub-tasks of BioNLP shared task 2013, i.e. Cancer Genetics (CG) and Pathway Curation (PC), removing the need to modify the system specifically for each task. With minimal task-specific configuration and tuning, EventMine achieved 1st place in the PC task and 2nd in the CG task, achieving the highest recall for both tasks. The system has been further enhanced following the shared task by incorporating the covariate shift method and entity generalisations based on the task definitions, leading to further performance improvements. Conclusions We have shown that it is possible to apply a state-of-the-art event extraction system to new tasks with high levels of performance, without having to modify the system internally. Both covariate shift and weighting methods are useful in facilitating the production of high recall systems. These methods and their combination can adapt a model to the target data with no deep tuning and little manual configuration. PMID:26201408

  13. Low-Cost Method for Quantifying Sodium in Coconut Water and Seawater for the Undergraduate Analytical Chemistry Laboratory: Flame Test, a Mobile Phone Camera, and Image Processing

    ERIC Educational Resources Information Center

    Moraes, Edgar P.; da Silva, Nilbert S. A.; de Morais, Camilo de L. M.; das Neves, Luiz S.; de Lima, Kassio M. G.

    2014-01-01

    The flame test is a classical analytical method that is often used to teach students how to identify specific metals. However, some universities in developing countries have difficulties acquiring the sophisticated instrumentation needed to demonstrate how to identify and quantify metals. In this context, a method was developed based on the flame…

  14. Investigation of 2‐stage meta‐analysis methods for joint longitudinal and time‐to‐event data through simulation and real data application

    PubMed Central

    Tudur Smith, Catrin; Gueyffier, François; Kolamunnage‐Dona, Ruwanthi

    2017-01-01

    Background Joint modelling of longitudinal and time‐to‐event data is often preferred over separate longitudinal or time‐to‐event analyses as it can account for study dropout, error in longitudinally measured covariates, and correlation between longitudinal and time‐to‐event outcomes. The joint modelling literature focuses mainly on the analysis of single studies with no methods currently available for the meta‐analysis of joint model estimates from multiple studies. Methods We propose a 2‐stage method for meta‐analysis of joint model estimates. These methods are applied to the INDANA dataset to combine joint model estimates of systolic blood pressure with time to death, time to myocardial infarction, and time to stroke. Results are compared to meta‐analyses of separate longitudinal or time‐to‐event models. A simulation study is conducted to contrast separate versus joint analyses over a range of scenarios. Results Using the real dataset, similar results were obtained by using the separate and joint analyses. However, the simulation study indicated a benefit of use of joint rather than separate methods in a meta‐analytic setting where association exists between the longitudinal and time‐to‐event outcomes. Conclusions Where evidence of association between longitudinal and time‐to‐event outcomes exists, results from joint models over standalone analyses should be pooled in 2‐stage meta‐analyses. PMID:29250814
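
    A minimal stage-2 pooling step might look like the following DerSimonian-Laird random-effects sketch, applied to hypothetical stage-1 estimates (e.g., association parameters from study-specific joint models); the paper's exact pooling choices may differ.

    ```python
    import numpy as np

    def random_effects_pool(theta, se):
        """Inverse-variance pooling with DerSimonian-Laird tau^2 for
        per-study estimates theta with standard errors se."""
        theta, se = np.asarray(theta, float), np.asarray(se, float)
        w = 1.0 / se**2
        theta_fixed = np.sum(w * theta) / np.sum(w)
        Q = np.sum(w * (theta - theta_fixed) ** 2)     # Cochran's Q
        k = len(theta)
        tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
        w_star = 1.0 / (se**2 + tau2)                  # random-effects weights
        pooled = np.sum(w_star * theta) / np.sum(w_star)
        return pooled, np.sqrt(1.0 / np.sum(w_star)), tau2

    # Hypothetical stage-1 estimates from 4 trials
    print(random_effects_pool([0.21, 0.35, 0.18, 0.28], [0.08, 0.10, 0.07, 0.12]))
    ```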

  15. New analytic results for speciation times in neutral models.

    PubMed

    Gernhard, Tanja

    2008-05-01

    In this paper, we investigate the standard Yule model and a recently studied model of speciation and extinction, the "critical branching process." We develop an analytic approach, as opposed to the common simulation approach, for calculating the speciation times in a reconstructed phylogenetic tree. Simple expressions for the density and the moments of the speciation times are obtained. Methods for dating a speciation event become valuable when no time scale is available for the reconstructed phylogenetic tree. A missing time scale could be due to supertree methods, morphological data, or molecular data that violate the molecular clock. Our analytic approach is particularly useful for the model with extinction, since simulations of birth-death processes conditioned on obtaining n extant species today are quite delicate. Further, simulations are very time consuming for large n under both models.

  16. Interim reliability evaluation program, Browns Ferry fault trees

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, M.E.

    1981-01-01

    An abbreviated fault tree method is used to evaluate and model Browns Ferry systems in the Interim Reliability Evaluation Program; it simplifies the recording and display of events while maintaining the system for identifying faults. The level of investigation is not changed, and the analytical thought process inherent in the conventional method is not compromised, but the abbreviated method takes less time and makes the fault modes much more visible.

  17. Sample Collection Information Document for Pathogens and Biotoxins - Companion to Standardized Analytical Methods for Environmental Restoration Following Homeland Security Events (SAM) Revision 5.0

    EPA Pesticide Factsheets

    The Sample Collection Information Document is intended to provide sampling information to be used during site assessment, remediation, and clearance activities following a biological or biotoxin contamination incident.

  18. The importance of quality control in validating concentrations ...

    EPA Pesticide Factsheets

    A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs, which allowed us to assess and compare performances of the methods used. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method comparison analytes that were determined by two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds measured did not consistently meet predetermined quality standards. Methodologies that did not seem suitable for these analytes are discussed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols. This paper compares the method performance of six analytical methods used to measure 174 emerging contaminants.

  19. Study on Failure of Third-Party Damage for Urban Gas Pipeline Based on Fuzzy Comprehensive Evaluation

    PubMed Central

    Li, Jun; Zhang, Hong; Han, Yinshan; Wang, Baodong

    2016-01-01

    Focusing on the diversity, complexity, and uncertainty of third-party damage accidents, the failure probability of third-party damage to urban gas pipelines was evaluated using analytic hierarchy process theory and fuzzy mathematics. A fault tree of third-party damage containing 56 basic events was built by hazard identification of third-party damage. The fuzzy evaluation of basic event probabilities was conducted by the expert judgment method, using membership functions of fuzzy sets. The weight of each expert was determined and the evaluation opinions were modified using the improved analytic hierarchy process, and the failure probability of third-party damage to the urban gas pipeline was calculated. Taking the gas pipelines of a certain large provincial capital city as an example, the method's risk assessment results were shown to conform to the actual situation, which provides a basis for safety risk prevention. PMID:27875545
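
    The crisp (non-fuzzy) core of the calculation, propagating basic-event probabilities through AND/OR gates to the top event, can be sketched as follows; the small tree and probabilities are invented, and the paper additionally fuzzifies the basic-event probabilities using expert-weighted membership functions.

    ```python
    # Minimal fault-tree evaluation with (assumed independent) basic events.
    def evaluate(node):
        if isinstance(node, float):                 # basic event probability
            return node
        gate, children = node
        probs = [evaluate(c) for c in children]
        if gate == "AND":                           # prod(p_i)
            out = 1.0
            for p in probs:
                out *= p
            return out
        if gate == "OR":                            # 1 - prod(1 - p_i)
            out = 1.0
            for p in probs:
                out *= (1.0 - p)
            return 1.0 - out
        raise ValueError(gate)

    # Hypothetical third-party-damage subtree: excavation hits the pipeline
    # when (digging occurs) AND (no patrol detection OR missing markers).
    tree = ("AND", [0.05, ("OR", [0.30, 0.20])])
    print(evaluate(tree))                           # 0.05 * (1 - 0.7*0.8) = 0.022
    ```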

  20. Recent advances in immunosensor for narcotic drug detection

    PubMed Central

    Gandhi, Sonu; Suman, Pankaj; Kumar, Ashok; Sharma, Prince; Capalash, Neena; Suri, C. Raman

    2015-01-01

    Introduction: Immunosensors for illicit drugs have gained immense interest and have found several applications in drug abuse monitoring. This technology offers low-cost detection of narcotics, thereby providing a confirmatory platform to complement the existing analytical methods. Methods: In this minireview, we define the basic concept of transducers for immunosensor development that utilize antibodies and low-molecular-mass hapten (opiate) molecules. Results: This article emphasizes recent advances in immunoanalytical techniques for monitoring of opiate drugs. Our results demonstrate that high-quality antibodies can be used for immunosensor development against a target analyte with greater sensitivity, specificity, and precision than other available analytical methods. Conclusion: In this review we highlight the fundamentals of different transducer technologies and their applications for immunosensor development currently under way in our laboratory, using rapid screening via an immunochromatographic kit and label-free optical detection via enzyme-, fluorescence-, gold nanoparticle-, and carbon nanotube-based immunosensing for sensitive and specific monitoring of opiates. PMID:26929925

  1. Meta-Analysis of Rare Binary Adverse Event Data

    PubMed Central

    Bhaumik, Dulal K.; Amatya, Anup; Normand, Sharon-Lise; Greenhouse, Joel; Kaizar, Eloise; Neelon, Brian; Gibbons, Robert D.

    2013-01-01

    We examine the use of fixed-effects and random-effects moment-based meta-analytic methods for analysis of binary adverse event data. Special attention is paid to the case of rare adverse events, which are commonly encountered in routine practice. We study estimation of model parameters and between-study heterogeneity. In addition, we examine traditional approaches to hypothesis testing of the average treatment effect and detection of the heterogeneity of treatment effect across studies. We derive three new methods: a simple (unweighted) average treatment effect estimator, a new heterogeneity estimator, and a parametric bootstrapping test for heterogeneity. We then study the statistical properties of both the traditional and new methods via simulation. We find that, in general, moment-based estimators of combined treatment effects and heterogeneity are biased, and the degree of bias is proportional to the rarity of the event under study. The new methods eliminate much, but not all, of this bias. The various estimators and hypothesis testing methods are then compared and contrasted using an example dataset on treatment of stable coronary artery disease. PMID:23734068
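
    One simple way to realize the unweighted-average idea for rare events is sketched below, using per-study log odds ratios with a continuity correction; this is an illustrative construction, not necessarily the estimator derived in the paper.

    ```python
    import numpy as np

    def simple_average_log_or(events_t, n_t, events_c, n_c, cc=0.5):
        """Unweighted average of per-study log odds ratios, adding a
        continuity correction cc to all cells of a study with any zero cell."""
        log_ors = []
        for a, n1, c, n0 in zip(events_t, n_t, events_c, n_c):
            b, d = n1 - a, n0 - c
            if 0 in (a, b, c, d):                   # continuity correction
                a, b, c, d = a + cc, b + cc, c + cc, d + cc
            log_ors.append(np.log((a * d) / (b * c)))
        log_ors = np.asarray(log_ors)
        return log_ors.mean(), log_ors.std(ddof=1) / np.sqrt(len(log_ors))

    # Hypothetical rare-AE counts across 5 trials (treatment vs control)
    print(simple_average_log_or([1, 0, 2, 1, 0], [200, 150, 300, 250, 100],
                                [0, 1, 1, 0, 0], [200, 150, 300, 250, 100]))
    ```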

  2. Determination of steroid hormones and related compounds in filtered and unfiltered water by solid-phase extraction, derivatization, and gas chromatography with tandem mass spectrometry

    USGS Publications Warehouse

    Foreman, William T.; Gray, James L.; ReVello, Rhiannon C.; Lindley, Chris E.; Losche, Scott A.; Barber, Larry B.

    2012-01-01

    A new analytical method has been developed and implemented at the U.S. Geological Survey National Water Quality Laboratory that determines a suite of 20 steroid hormones and related compounds in filtered water (using laboratory schedule 2434) and in unfiltered water (using laboratory schedule 4434). This report documents the procedures and initial performance data for the method and provides guidance on application of the method and considerations of data quality in relation to data interpretation. The analytical method determines 6 natural and 3 synthetic estrogen compounds, 6 natural androgens, 1 natural and 1 synthetic progestin compound, and 2 sterols: cholesterol and 3-beta-coprostanol. These two sterols have limited biological activity but typically are abundant in wastewater effluents and serve as useful tracers. Bisphenol A, an industrial chemical used primarily to produce polycarbonate plastic and epoxy resins and that has been shown to have estrogenic activity, also is determined by the method. A technique referred to as isotope-dilution quantification is used to improve quantitative accuracy by accounting for sample-specific procedural losses in the determined analyte concentration. Briefly, deuterium- or carbon-13-labeled isotope-dilution standards (IDSs), all of which are direct or chemically similar isotopic analogs of the method analytes, are added to all environmental and quality-control and quality-assurance samples before extraction. Method analytes and IDS compounds are isolated from filtered or unfiltered water by solid-phase extraction onto an octadecylsilyl disk, overlain with a graded glass-fiber filter to facilitate extraction of unfiltered sample matrices. The disks are eluted with methanol, and the extract is evaporated to dryness, reconstituted in solvent, passed through a Florisil solid-phase extraction column to remove polar organic interferences, and again evaporated to dryness in a reaction vial. The method compounds are reacted with activated N-methyl-N-(trimethylsilyl)trifluoroacetamide at 65 degrees Celsius for 1 hour to form trimethylsilyl or trimethylsilyl-enol ether derivatives that are more amenable to gas chromatographic separation than the underivatized compounds. Analysis is carried out by gas chromatography with tandem mass spectrometry using calibration standards that are derivatized concurrently with the sample extracts. Analyte concentrations are quantified relative to specific IDS compounds in the sample, which directly compensate for procedural losses (incomplete recovery) in the determined and reported analyte concentrations. Thus, reported analyte concentrations (or analyte recoveries for spiked samples) are corrected based on recovery of the corresponding IDS compound during the quantification process. Recovery for each IDS compound is reported for each sample and represents an absolute recovery in a manner comparable to surrogate recoveries for other organic methods used by the National Water Quality Laboratory. Thus, IDS recoveries provide a useful tool for evaluating sample-specific analytical performance from an absolute mass recovery standpoint. IDS absolute recovery will differ and typically be lower than the corresponding analyte's method recovery in spiked samples. However, additional correction of reported analyte concentrations is unnecessary and inappropriate because the analyte concentration (or recovery) already is compensated for by the isotope-dilution quantification procedure.
Method analytes were spiked at 10 and 100 nanograms per liter (ng/L) for most analytes (10 times greater spike levels were used for bisphenol A and 100 times greater spike levels were used for 3-beta-coprostanol and cholesterol) into the following validation-sample matrices: reagent water, wastewater-affected surface water, a secondary-treated wastewater effluent, and a primary (no biological treatment) wastewater effluent. Overall method recovery for all analytes in these matrices averaged 100 percent, with overall relative standard deviation of 28 percent. Mean recoveries of the 20 individual analytes for spiked reagent-water samples prepared along with field samples and analyzed in 2009–2010 ranged from 84–104 percent, with relative standard deviations of 6–36 percent. Concentrations for two analytes, equilin and progesterone, are reported as estimated because these analytes had excessive bias or variability, or both. Additional database coding is applied to other reported analyte data as needed, based on sample-specific IDS recovery performance. Detection levels were derived statistically by fortifying reagent water at six different levels (0.1 to 4 ng/L) and range from about 0.4 to 4 ng/L for 16 analytes. Interim reporting levels applied to analytes in this report range from 0.8 to 8 ng/L. Bisphenol A and the sterols (cholesterol and 3-beta-coprostanol) were consistently detected in laboratory and field blanks. The minimum reporting levels were set at 100 ng/L for bisphenol A and at 200 ng/L for the two sterols to prevent any bias associated with the presence of these compounds in the blanks. A minimum reporting level of 2 ng/L was set for 11-ketotestosterone to minimize false positive risk from an interfering siloxane compound emanating as chromatographic-column bleed, from vial septum material, or from other sources at no more than 1 ng/L.
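
The recovery-cancelling property of isotope-dilution quantification can be seen in a two-line calculation; the generic one-point form below, with invented areas and response factor, is a textbook sketch rather than the laboratory's exact procedure.

```python
def ids_quantify(area_analyte, area_ids, conc_ids_spiked, rrf):
    """One-point isotope-dilution quantification: because analyte and IDS
    are (near-)identical chemically, procedural losses cancel in the area
    ratio, so the reported concentration is automatically recovery-corrected."""
    return (area_analyte / area_ids) * conc_ids_spiked / rrf

# Hypothetical values: a 60% loss of both signals does not bias the result,
# because analyte and IDS suffer the same proportional loss.
full = ids_quantify(area_analyte=8000, area_ids=10000,
                    conc_ids_spiked=50.0, rrf=1.2)     # ng/L
lossy = ids_quantify(area_analyte=3200, area_ids=4000,
                     conc_ids_spiked=50.0, rrf=1.2)
print(full, lossy)                                     # identical: 33.3 ng/L
```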

  3. High Throughput Determination of Ricinine, Abrine, and Alpha ...

    EPA Pesticide Factsheets

    Analytical Method: This document provides the standard operating procedure for determination of ricinine (RIC), abrine (ABR), and α-amanitin (AMAN) in drinking water by isotope dilution liquid chromatography tandem mass spectrometry (LC/MS/MS). This method is designed to support site-specific cleanup goals of environmental remediation activities following a homeland security incident involving one or a combination of these analytes.

  4. Freeze-thaw approach: A practical sample preparation strategy for residue analysis of multi-class veterinary drugs in chicken muscle.

    PubMed

    Zhang, Meiyu; Li, Erfen; Su, Yijuan; Song, Xuqin; Xie, Jingmeng; Zhang, Yingxia; He, Limin

    2018-06-01

    Seven drugs from different classes, namely fluoroquinolones (enrofloxacin, ciprofloxacin, sarafloxacin), sulfonamides (sulfadimidine, sulfamonomethoxine), and macrolides (tilmicosin, tylosin), were administered orally to chickens as test compounds. A simple extraction step after cryogenic freezing allowed the effective extraction of multi-class veterinary drug residues from minced chicken muscle by vortex mixing. On the basis of the optimized freeze-thaw approach, a convenient, selective, and reproducible liquid chromatography with tandem mass spectrometry method was developed. At three spiking levels in blank and medicated chicken muscles, average recoveries of the analytes were in the range of 71-106 and 63-119%, respectively. All relative standard deviations were <20%. The limits of quantification of the analytes were 0.2-5.0 ng/g. Regardless of residue level, there were no significant differences (P > 0.05) in the average contents of almost any of the analytes in medicated chickens between this method and literature methods developed for specific analytes. Finally, the developed method was successfully extended to the monitoring of residues of 55 common veterinary drugs in food animal muscles. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Chloride analysis of concrete by ion-specific potentiometry : its implementation in Virginia.

    DOT National Transportation Integrated Search

    1974-01-01

    In response to an urgent request from the Materials Division, a literature search was conducted to find a suitable analytical method for the determination of chloride in hardened concrete. It was found that an ion-specific potentiometric method emplo...

  6. An Overview of Grain Growth Theories for Pure Single Phase Systems,

    DTIC Science & Technology

    1986-10-01

    the fundamental causes for these distributions. This is what Blanc and Mocellin (1979) and Carnal and Mocellin (1981) set out to do. 7.1 Monte-Carlo Simulations ... termed event B) (in 2-D) of 3-sided grains. (2) Neighbour-switching (termed event C). Blanc and Mocellin (1979) dealt with 2-D sections through ... Kurtz and Carpay (1980a). 7.2 Analytical Method to Obtain fn: Carnal and Mocellin (1981) obtained the distribution of grain coordination numbers in

  7. Fuzzy Analytic Hierarchy Process-based Chinese Resident Best Fitness Behavior Method Research.

    PubMed

    Wang, Dapeng; Zhang, Lan

    2015-01-01

    With the explosive development of the Chinese economy and of science and technology, people's pursuit of health has become more and more intense, and sports fitness activities among Chinese residents have developed rapidly. However, fitness events differ in their popularity and in their effects on body energy consumption. On this basis, the paper studies fitness behaviors and derives an exercise guide for Chinese residents' sports fitness behaviors, which provides guidance for implementing the national fitness plan and for making resident fitness more scientific. Starting from the perspective of energy consumption, the paper mainly adopts an empirical method: it determines the energy consumption of Chinese residents' favorite sports fitness events by observing the energy consumption of various fitness behaviors, and applies the fuzzy analytic hierarchy process to evaluate seven fitness events: bicycle riding, shadowboxing, swimming, rope skipping, jogging, running, and aerobics. By calculating the memberships of the fuzzy rating model and comparing their sizes, it identifies the fitness behaviors that are more beneficial to resident health, more effective, and more popular. It concludes that swimming is the best exercise mode, with the highest membership; the memberships of running, rope skipping, and shadowboxing are also relatively high. Residents should combine several of these fitness events according to their physical and living conditions, so as to better achieve the purpose of fitness exercise.
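
    The AHP weighting step that underlies such fuzzy evaluations can be sketched as the principal-eigenvector calculation below; the pairwise-comparison matrix and criteria are invented for illustration.

    ```python
    import numpy as np

    # Priority weights from an AHP pairwise-comparison matrix via the
    # principal eigenvector, with Saaty's consistency ratio.
    A = np.array([[1,   3,   5 ],
                  [1/3, 1,   2 ],
                  [1/5, 1/2, 1 ]], float)   # e.g. energy use vs. popularity vs. access

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                            # normalized priority weights

    n = A.shape[0]
    CI = (eigvals.real[k] - n) / (n - 1)    # consistency index
    CR = CI / 0.58                          # random index RI = 0.58 for n = 3
    print(w, CR)                            # CR < 0.1 => acceptably consistent
    ```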

  8. Track Structure Model for Radial Distributions of Electron Spectra and Event Spectra from High-Energy Ions

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; Katz, R.; Wilson, J. W.

    1998-01-01

    An analytic method is described for evaluating the average radial electron spectrum and the radial and total frequency-event spectrum for high-energy ions. For high-energy ions, indirect events make important contributions to frequency-event spectra. The method used for evaluating indirect events is to fold the radial electron spectrum with the measured frequency-event spectrum for photons or electrons. The contribution from direct events is treated using a spatially restricted linear energy transfer (LET). We find that high-energy heavy ions have a significantly reduced frequency-averaged lineal energy (yF) compared to LET, while relativistic protons have significantly increased yF and dose-averaged lineal energy (yD) for typical site sizes used in tissue-equivalent proportional counters. Such differences represent important factors in evaluating event spectra with laboratory beams, in spaceflight, or in atmospheric radiation studies, and in validation of radiation transport codes. The inadequacy of LET as a descriptor, because of deviations in values of physical quantities such as track width, secondary electron spectrum, and yD for ions of identical LET, is also discussed.
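
    The two averages compared here are simple moments of the event-size distribution: for a normalized lineal-energy spectrum f(y), the frequency mean is yF = Σ f_i y_i and the dose mean is yD = Σ f_i y_i² / yF. A toy calculation with an invented spectrum:

    ```python
    import numpy as np

    # Frequency-mean and dose-mean lineal energy from a hypothetical measured
    # event spectrum: f[i] is the probability of an event of size y[i] (keV/um).
    y = np.array([0.3, 1.0, 3.0, 10.0, 30.0])
    f = np.array([0.40, 0.30, 0.17, 0.10, 0.03])
    f = f / f.sum()                  # normalize the frequency distribution

    yF = np.sum(f * y)               # frequency-averaged lineal energy
    yD = np.sum(f * y**2) / yF       # dose-averaged lineal energy
    print(yF, yD)                    # yD > yF: rare large events dominate dose
    ```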

  9. Harnessing Scientific Literature Reports for Pharmacovigilance

    PubMed Central

    Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-01-01

    Summary Objectives We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers’ capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. Methods A prototype, open source, web-based, software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. Results All usability test participants cited the tool’s ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool’s automated literature search relative to a manual ‘all fields’ PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Conclusions Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction. PMID:28326432
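
    Disproportionality scores of the kind the prototype computes can be illustrated with the proportional reporting ratio (PRR); the 2x2 counts below are hypothetical, and the tool's actual statistic and MeSH-based counting are not reproduced here.

    ```python
    import numpy as np

    def prr(a, b, c, d):
        """Proportional reporting ratio from a 2x2 contingency of
        (drug, event) report counts:
            a = drug & event      b = drug, other events
            c = other drugs & event   d = other drugs, other events
        Returns the PRR with an approximate 95% CI."""
        prr_val = (a / (a + b)) / (c / (c + d))
        se_log = np.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
        lo, hi = np.exp(np.log(prr_val) + np.array([-1.96, 1.96]) * se_log)
        return prr_val, (lo, hi)

    # Hypothetical counts from indexed citations
    print(prr(a=24, b=976, c=180, d=45_000))
    ```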

  10. Creative Analytics of Mission Ops Event Messages

    NASA Technical Reports Server (NTRS)

    Smith, Dan

    2017-01-01

    Historically, tremendous effort has been put into processing and displaying mission health and safety telemetry data, and relatively little attention has been paid to extracting information from missions' time-tagged event log messages. Today's missions may log tens of thousands of messages per day, and the numbers are expected to increase dramatically as satellite fleets and constellations are launched, as security monitoring continues to evolve, and as the overall complexity of ground system operations increases. The logs may contain information about orbital events, scheduled and actual observations, device status and anomalies, when operators were logged on, when commands were resent, when there were data dropouts or system failures, and much more. When dealing with distributed space missions or operational fleets, it becomes even more important to systematically analyze this data. Several advanced information systems technologies make it appropriate to now develop analytic capabilities which can increase mission situational awareness, reduce mission risk, enable better event-driven automation and cross-mission collaborations, and lead to improved operations strategies: an industry standard for log messages (the Object Management Group (OMG) Space Domain Task Force (SDTF) standards organization is in the process of creating a formal industry standard for event log messages, with a format based on work at NASA GSFC); open system architectures (the DoD, NASA, and others are moving towards common open system architectures for mission ground data systems, based on work at NASA GSFC, with the full support of the commercial product industry and major integration contractors); and text analytics (a specific area of data analytics which applies statistical, linguistic, and structural techniques to extract and classify information from textual sources). This presentation describes work now underway at NASA to increase situational awareness through the collection of non-telemetry mission operations information into a common log format, and then providing display and analytics tools for in-depth assessment of the log contents. The work includes: common interface formats for acquiring time-tagged text messages; conversion of common files for schedules, orbital events, and stored commands to the common log format; innovative displays to depict thousands of messages on a single display; structured English text queries against the log message data store, extensible to a more mature natural language query capability; and the goal of speech-to-text and text-to-speech additions to create a personal mission operations assistant to aid on-console operations. A wide variety of planned uses identified by the mission operations teams will be discussed.
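
    As a toy illustration of the common-format log aggregation described above (the actual OMG/GSFC message schema is not given in the abstract, so the line format and field names here are invented), a few time-tagged messages can be parsed and summarized as follows.

        import re
        from collections import Counter
        from datetime import datetime

        # Hypothetical common log line: "<ISO timestamp> <SOURCE> <SEVERITY> <free text>"
        LINE = re.compile(r"(?P<ts>\S+)\s+(?P<source>\S+)\s+(?P<sev>\w+)\s+(?P<msg>.*)")

        raw = [
            "2017-03-01T12:00:03Z GS1 INFO operator logged on console A",
            "2017-03-01T12:04:41Z SAT2 WARN command resent after timeout",
            "2017-03-01T12:05:02Z SAT2 ERROR data dropout on downlink",
        ]

        events = []
        for line in raw:
            m = LINE.match(line)
            if m:
                rec = m.groupdict()
                rec["ts"] = datetime.fromisoformat(rec["ts"].replace("Z", "+00:00"))
                events.append(rec)

        # Simple situational-awareness query: message counts by source and severity.
        print(Counter((e["source"], e["sev"]) for e in events))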

  11. New method for probabilistic traffic demand predictions for en route sectors based on uncertain predictions of individual flight events.

    DOT National Transportation Integrated Search

    2011-06-14

    This paper presents a novel analytical approach to and techniques for translating characteristics of uncertainty in predicting sector entry times and times in sector for individual flights into characteristics of uncertainty in predicting one-minute ...

  12. Creating analytically divergence-free velocity fields from grid-based data

    NASA Astrophysics Data System (ADS)

    Ravu, Bharath; Rudman, Murray; Metcalfe, Guy; Lester, Daniel R.; Khakhar, Devang V.

    2016-10-01

    We present a method, based on B-splines, to calculate a C2 continuous analytic vector potential from discrete 3D velocity data on a regular grid. A continuous analytically divergence-free velocity field can then be obtained from the curl of the potential. This field can be used to robustly and accurately integrate particle trajectories in incompressible flow fields. Based on the method of Finn and Chacon (2005) [10], this new method ensures that the analytic velocity field matches the grid values almost everywhere, with errors that are two to four orders of magnitude lower than those of existing methods. We demonstrate its application to three different problems (each in a different coordinate system) and provide details of the specifics required in each case. We show how the additional accuracy of the method results in qualitatively and quantitatively superior trajectories, which in turn result in more accurate identification of Lagrangian coherent structures.
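
    The guarantee at the heart of this approach is the vector-calculus identity div(curl A) = 0: any velocity field obtained as the curl of a sufficiently smooth vector potential is divergence-free by construction. A minimal symbolic check, using an arbitrary smooth potential rather than the paper's B-spline construction:

        import sympy as sp

        x, y, z = sp.symbols("x y z")
        # A smooth, made-up vector potential (stand-in for the paper's B-spline potential).
        A = sp.Matrix([sp.sin(y * z), sp.cos(x * z), x * y * sp.exp(-z**2)])

        # Velocity field u = curl(A).
        u = sp.Matrix([
            sp.diff(A[2], y) - sp.diff(A[1], z),
            sp.diff(A[0], z) - sp.diff(A[2], x),
            sp.diff(A[1], x) - sp.diff(A[0], y),
        ])

        # Its divergence vanishes identically for any smooth potential.
        div_u = sp.simplify(sp.diff(u[0], x) + sp.diff(u[1], y) + sp.diff(u[2], z))
        print(div_u)  # -> 0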

  13. Life cycle management of analytical methods.

    PubMed

    Parr, Maria Kristina; Schmidt, Alexander H

    2018-01-05

    In modern process management, the life cycle concept is gaining more and more importance. It focuses on the total costs of the process, from investment to operation and finally retirement. In recent years, an increasing interest in this concept has also developed for analytical procedures. The life cycle of an analytical method consists of design, development, validation (including instrumental qualification, continuous method performance verification and method transfer) and finally retirement of the method. Regulatory bodies also appear to have increased their awareness of life cycle management for analytical methods. Thus, the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH), as well as the United States Pharmacopeial Forum, are discussing the introduction of new guidelines that include life cycle management of analytical methods. The US Pharmacopeia (USP) Validation and Verification expert panel has already proposed a new General Chapter 〈1220〉 "The Analytical Procedure Lifecycle" for integration into the USP. Furthermore, a growing interest in life cycle management is also seen in the non-regulated environment. Quality-by-design based method development results in increased method robustness. This in turn reduces the effort needed for method performance verification and post-approval changes, and minimizes the risk of method-related out-of-specification results, contributing strongly to reduced costs of the method over its life cycle. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Recovery and normalization of triple coincidences in PET.

    PubMed

    Lage, Eduardo; Parot, Vicente; Moore, Stephen C; Sitek, Arkadiusz; Udías, Jose M; Dave, Shivang R; Park, Mi-Ae; Vaquero, Juan J; Herraiz, Joaquin L

    2015-03-01

    Triple coincidences in positron emission tomography (PET) are events in which three γ-rays are detected simultaneously. These events, though potentially useful for enhancing the sensitivity of PET scanners, are discarded or processed without special consideration in current systems, because there is no clear criterion for assigning them to a unique line-of-response (LOR). Methods proposed for recovering such events usually rely on the use of highly specialized detection systems, hampering general adoption, and/or are based on Compton-scatter kinematics and, consequently, are limited in accuracy by the energy resolution of standard PET detectors. In this work, the authors propose a simple and general solution for recovering triple coincidences, which does not require specialized detectors or additional energy resolution requirements. To recover triple coincidences, the authors' method distributes such events among their possible LORs using the relative proportions of double coincidences in these LORs. The authors show analytically that this assignment scheme represents the maximum-likelihood solution for the triple-coincidence distribution problem. The PET component of a preclinical PET/CT scanner was adapted to enable the acquisition and processing of triple coincidences. Since the efficiencies for detecting double and triple events were found to be different throughout the scanner field-of-view, a normalization procedure specific to triple coincidences was also developed. The effect of including triple coincidences using their method was compared against the cases of equally weighting the triples among their possible LORs and discarding all the triple events. As figures of merit for this comparison, the authors used sensitivity, noise-equivalent count (NEC) rates, and image quality calculated as described in the NEMA NU-4 protocol for the assessment of preclinical PET scanners. The addition of triple-coincidence events with the authors' method increased peak NEC rates of the scanner by 26.6% and 32% for mouse- and rat-sized objects, respectively. This increase in NEC-rate performance was also reflected in the image-quality metrics. Images reconstructed using double and triple coincidences recovered using their method had better signal-to-noise ratio than those obtained using only double coincidences, while preserving spatial resolution and contrast. Distribution of triple coincidences using an equal-weighting scheme increased apparent system sensitivity but degraded image quality. The performance boost provided by the inclusion of triple coincidences using their method made it possible to reduce the acquisition time of standard imaging procedures by up to ∼25%. Recovering triple coincidences with the proposed method can effectively increase the sensitivity of current clinical and preclinical PET systems without compromising other parameters like spatial resolution or contrast.
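
    A minimal sketch of the assignment scheme described above, with toy counts: each triple coincidence is split among its candidate LORs in proportion to the double-coincidence counts already recorded on those LORs (the maximum-likelihood weighting the authors derive). Scanner geometry, normalization, and randoms handling are all omitted.

        # Hypothetical double-coincidence counts on the three candidate LORs
        # of a detector triplet (A, B, C).
        doubles = {"LOR_AB": 900, "LOR_AC": 300, "LOR_BC": 100}

        def distribute_triples(n_triples, candidate_lors, double_counts):
            """Split triples among candidate LORs proportionally to doubles."""
            total = sum(double_counts[l] for l in candidate_lors)
            if total == 0:   # no information: fall back to equal weights
                return {l: n_triples / len(candidate_lors) for l in candidate_lors}
            return {l: n_triples * double_counts[l] / total for l in candidate_lors}

        extra = distribute_triples(50, ["LOR_AB", "LOR_AC", "LOR_BC"], doubles)
        recovered = {l: doubles[l] + extra[l] for l in doubles}
        print(recovered)   # e.g. LOR_AB gains 50 * 900/1300 ~ 34.6 counts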

  15. Modeling ozone episodes in the Baltimore-Washington region

    NASA Technical Reports Server (NTRS)

    Ryan, William F.

    1994-01-01

    Surface ozone (O3) concentrations in excess of the National Ambient Air Quality Standard (NAAQS) continue to occur in metropolitan areas in the United States despite efforts to control emissions of O3 precursors. Future O3 control strategies will be based on results from modeling efforts that have just begun in many areas. Two initial questions that arise are model sensitivity to domain-specific conditions and the selection of episodes for model evaluation and control strategy development. For the Baltimore-Washington region (B-W), the presence of the Chesapeake Bay introduces a number of issues relevant to model sensitivity. In this paper, the specific question of determining model volume (mixing height) for the Urban Airshed Model (UAM) is discussed and various alternative methods are compared. For the latter question, two analytic approaches, cluster analysis and Classification and Regression Tree (CART) analysis, are undertaken to determine the meteorological conditions associated with severe O3 events in the B-W domain.
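
    As a schematic of the CART step (with synthetic data standing in for the B-W meteorology, and a deliberately simple planted rule), a shallow decision tree can recover threshold rules separating high-ozone days from the rest; scikit-learn is used purely for illustration.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier, export_text

        rng = np.random.default_rng(0)
        n = 500
        tmax = rng.normal(30, 5, n)        # daily max temperature (deg C), synthetic
        wind = rng.gamma(2.0, 2.0, n)      # wind speed (m/s), synthetic

        # Planted rule: hot, stagnant days tend to be high-ozone days.
        high_o3 = ((tmax > 32) & (wind < 3)).astype(int)

        X = np.column_stack([tmax, wind])
        tree = DecisionTreeClassifier(max_depth=2).fit(X, high_o3)
        print(export_text(tree, feature_names=["tmax_C", "wind_ms"]))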

  16. Cellular Oxygen and Nutrient Sensing in Microgravity Using Time-Resolved Fluorescence Microscopy

    NASA Technical Reports Server (NTRS)

    Szmacinski, Henryk

    2003-01-01

    Oxygen and nutrient sensing is fundamental to the understanding of cell growth and metabolism. This requires identification of optical probes and suitable detection technology without complex calibration procedures. Under this project, Microcosm developed an experimental technique that allows for simultaneous imaging of intra- and inter-cellular events. The technique consists of frequency-domain Fluorescence Lifetime Imaging Microscopy (FLIM), a set of identified oxygen and pH probes, and methods for fabrication of microsensors. Specifications for the electronic and optical components of the FLIM instrumentation are provided. Hardware and software were developed for data acquisition and analysis. Principles, procedures, and representative images are demonstrated. Suitable lifetime-sensitive oxygen, pH, and glucose probes for intra- and extra-cellular measurements of analyte concentrations have been identified and tested. Lifetime sensing and imaging have been performed using PBS buffer, culture media, and yeast cells as model systems. Spectral specifications, calibration curves, and probe availability are also provided in the report.

  17. Analysis of Ethanolamines: Validation of Semi-Volatile Analysis by HPLC-MS/MS by EPA Method MS888

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owens, J; Vu, A; Koester, C

    The Environmental Protection Agency's (EPA) Region 5 Chicago Regional Laboratory (CRL) developed a method titled 'Analysis of Diethanolamine, Triethanolamine, n-Methyldiethanolamine, and n-Ethyldiethanolamine in Water by Single Reaction Monitoring Liquid Chromatography/Tandem Mass Spectrometry (LC/MS/MS): EPA Method MS888'. This draft standard operating procedure (SOP) was distributed to multiple EPA laboratories and to Lawrence Livermore National Laboratory, which was tasked to serve as a reference laboratory for EPA's Environmental Reference Laboratory Network (ERLN) and to develop and validate analytical procedures. The primary objective of this study was to validate and verify the analytical procedures described in 'EPA Method MS888' for analysis of the listed ethanolamines in aqueous samples. The gathered data from this validation study will be used to: (1) demonstrate analytical method performance; (2) generate quality control acceptance criteria; and (3) revise the SOP to provide a validated method that would be available for use during a homeland security event. The data contained in this report will be compiled, by EPA CRL, with data generated by other EPA Regional laboratories so that performance metrics of 'EPA Method MS888' can be determined.

  18. Argon thermochronology of mineral deposits; a review of analytical methods, formulations, and selected applications

    USGS Publications Warehouse

    Snee, Lawrence W.

    2002-01-01

    40Ar/39Ar geochronology is an experimentally robust and versatile method for constraining time and temperature in geologic processes. The argon method is the most broadly applied in mineral-deposit studies. Standard analytical methods and formulations exist, making the fundamentals of the method well defined. A variety of graphical representations exist for evaluating argon data. A broad range of minerals found in mineral deposits, alteration zones, and host rocks is commonly analyzed to provide the age, temporal duration, and thermal conditions of mineralization events and processes. All are discussed in this report. The usefulness and evolving applicability of the method are demonstrated in studies of the Panasqueira, Portugal, tin-tungsten deposit; the Cornubian batholith and associated mineral deposits, southwest England; the Red Mountain intrusive system and associated Urad-Henderson molybdenum deposits; and the Eastern Goldfields Province, Western Australia.

  19. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    PubMed

    Płotka-Wasylka, J

    2018-05-01

    A new means for assessing analytical protocols with respect to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) or the Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams is used to evaluate and quantify the environmental impact involved in each step of an analytical methodology, coloured from green through yellow to red to depict low, medium, and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples, and polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible perspective for the user/reader but also offers exhaustive information on the evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.

  20. 3D-nanostructured Au electrodes for the event-specific detection of MON810 transgenic maize.

    PubMed

    Fátima Barroso, M; Freitas, Maria; Oliveira, M Beatriz P P; de-Los-Santos-Álvarez, Noemí; Lobo-Castañón, María Jesús; Delerue-Matos, Cristina

    2015-03-01

    In the present work, the development of a genosensor for the event-specific detection of MON810 transgenic maize is proposed. Taking advantage of nanostructuration, a cost-effective three-dimensional electrode was fabricated, and a ternary monolayer containing a dithiol, a monothiol and the thiolated capture probe was optimized to minimize nonspecific signals. A sandwich assay format was selected as a way of precluding the inefficient hybridization associated with stable secondary target structures. A comparison between the analytical performance of the Au nanostructured electrodes and commercially available screen-printed electrodes highlighted the superior performance of the nanostructured ones. Finally, the genosensor was effectively applied to detect the transgenic sequence in real samples, showing its potential for future quantitative analysis. Copyright © 2014 Elsevier B.V. All rights reserved.

  1. Refinement of Fread's Method for improved tracking of stream discharges during unsteady flows

    DOE PAGES

    Lee, Kyutae; Muste, Marian

    2017-02-07

    There is a plethora of analytical approaches to account for the effect of unsteady flow (a.k.a. hysteretic behavior) on the conventionally built steady rating curves (RCs) used to continuously estimate discharges in open channel flow. One of the most complete correction methods is Fread's method (Fread, 1975), which is based on the fully dynamic one-dimensional wave equation. Proposed herein is a modified Fread's method which is adjusted to account for the actual geometry of the cross section. This method improves the accuracy associated with the estimation of the conveyance factor and energy slope, so it is particularly useful for small to mid-size streams/rivers where the original method's assumption does not properly hold. The modified Fread's method is tested for sites in Clear Creek (Iowa, USA) and the Ebro River (Spain) to illustrate the significance of its improvement in discharge estimation. While the degree of improvement is apparent for the conveyance factor because the hydraulic depth is replaced by hydraulic radius, that for the energy slope term specifically depends on the site and event conditions.

  2. Refinement of Fread's Method for improved tracking of stream discharges during unsteady flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Kyutae; Muste, Marian

    There is a plethora of analytical approaches to account for the effect of unsteady flow (a.k.a. hysteretic behavior) on the conventionally built steady rating curves (RCs) used to continuously estimate discharges in open channel flow. One of the most complete correction methods is Fread's method (Fread, 1975), which is based on the fully dynamic one-dimensional wave equation. Proposed herein is a modified Fread's method which is adjusted to account for the actual geometry of the cross section. This method improves the accuracy associated with the estimation of the conveyance factor and energy slope, so it is particularly useful for small to mid-size streams/rivers where the original method's assumption does not properly hold. The modified Fread's method is tested for sites in Clear Creek (Iowa, USA) and the Ebro River (Spain) to illustrate the significance of its improvement in discharge estimation. While the degree of improvement is apparent for the conveyance factor because the hydraulic depth is replaced by hydraulic radius, that for the energy slope term specifically depends on the site and event conditions.

  3. Detection of biological molecules using chemical amplification and optical sensors

    DOEpatents

    Van Antwerp, William Peter; Mastrototaro, John Joseph

    2001-01-01

    Methods are provided for the determination of the concentration of biological levels of polyhydroxylated compounds, particularly glucose. The methods utilize an amplification system that is an analyte transducer immobilized in a polymeric matrix, where the system is implantable and biocompatible. Upon interrogation by an optical system, the amplification system produces a signal capable of detection external to the skin of the patient. Quantitation of the analyte of interest is achieved by measurement of the emitted signal. Specifically, the analyte transducer immobilized in a polymeric matrix can be a boronic acid moiety.

  4. HPV Genotyping of Modified General Primer-Amplicons Is More Analytically Sensitive and Specific by Sequencing than by Hybridization

    PubMed Central

    Meisal, Roger; Rounge, Trine Ballestad; Christiansen, Irene Kraus; Eieland, Alexander Kirkeby; Worren, Merete Molton; Molden, Tor Faksvaag; Kommedal, Øyvind; Hovig, Eivind; Leegaard, Truls Michael

    2017-01-01

    Sensitive and specific genotyping of human papillomaviruses (HPVs) is important for population-based surveillance of carcinogenic HPV types and for monitoring vaccine effectiveness. Here we compare HPV genotyping by Next Generation Sequencing (NGS) to an established DNA hybridization method. In DNA isolated from urine, the overall analytical sensitivity of NGS was found to be 22% higher than that of hybridization. NGS was also found to be the most specific method and expanded the detection repertoire beyond the 37 types of the DNA hybridization assay. Furthermore, NGS provided an increased resolution by identifying genetic variants of individual HPV types. The same Modified General Primers (MGP)-amplicon was used in both methods. The NGS method is described in detail to facilitate implementation in the clinical microbiology laboratory and includes suggestions for new standards for detection and calling of types and variants with improved resolution. PMID:28045981

  5. HPV Genotyping of Modified General Primer-Amplicons Is More Analytically Sensitive and Specific by Sequencing than by Hybridization.

    PubMed

    Meisal, Roger; Rounge, Trine Ballestad; Christiansen, Irene Kraus; Eieland, Alexander Kirkeby; Worren, Merete Molton; Molden, Tor Faksvaag; Kommedal, Øyvind; Hovig, Eivind; Leegaard, Truls Michael; Ambur, Ole Herman

    2017-01-01

    Sensitive and specific genotyping of human papillomaviruses (HPVs) is important for population-based surveillance of carcinogenic HPV types and for monitoring vaccine effectiveness. Here we compare HPV genotyping by Next Generation Sequencing (NGS) to an established DNA hybridization method. In DNA isolated from urine, the overall analytical sensitivity of NGS was found to be 22% higher than that of hybridization. NGS was also found to be the most specific method and expanded the detection repertoire beyond the 37 types of the DNA hybridization assay. Furthermore, NGS provided an increased resolution by identifying genetic variants of individual HPV types. The same Modified General Primers (MGP)-amplicon was used in both methods. The NGS method is described in detail to facilitate implementation in the clinical microbiology laboratory and includes suggestions for new standards for detection and calling of types and variants with improved resolution.

  6. Standardization, evaluation and early-phase method validation of an analytical scheme for batch-consistency N-glycosylation analysis of recombinant produced glycoproteins.

    PubMed

    Zietze, Stefan; Müller, Rainer H; Brecht, René

    2008-03-01

    In order to set up a batch-to-batch-consistency analytical scheme for N-glycosylation analysis, several sample preparation steps, including enzyme digestions and fluorophore labelling, and two HPLC methods were established. The whole method scheme was standardized, evaluated and validated according to the requirements for analytical testing in early clinical drug development, using a recombinantly produced reference glycoprotein (RGP). The standardization of the methods was achieved through clearly defined standard operating procedures. During evaluation of the methods, the major interest was in determining the loss of oligosaccharides within the analytical scheme. Validation of the methods was performed with respect to specificity, linearity, repeatability, LOD and LOQ. Because reference N-glycan standards were not available, a statistical approach was chosen to derive accuracy from the linearity data. After completion of the validation procedure, defined limits for method variability could be calculated, and differences observed in consistency analysis could be separated into significant and incidental ones.
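
    One common way to derive detection and quantification limits from linearity data (an ICH Q2-style approach, shown here with invented calibration points since the paper's data are not reproduced in the abstract) is to estimate LOD and LOQ from the residual standard deviation of the calibration regression and its slope:

        import numpy as np

        # Hypothetical calibration data: concentration vs detector response.
        conc = np.array([1, 2, 5, 10, 20, 50], dtype=float)
        resp = np.array([2.1, 4.0, 10.3, 19.8, 40.5, 99.0])

        slope, intercept = np.polyfit(conc, resp, 1)
        residuals = resp - (slope * conc + intercept)
        sigma = residuals.std(ddof=2)            # residual SD of the regression

        lod = 3.3 * sigma / slope                # ICH Q2-style estimates
        loq = 10.0 * sigma / slope
        print(f"slope={slope:.3f}  LOD={lod:.2f}  LOQ={loq:.2f} (units of conc)")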

  7. Correlation Between the System Capabilities Analytic Process (SCAP) and the Missions and Means Framework (MMF)

    DTIC Science & Technology

    2013-05-01

    ...specifics of the correlation will be explored, followed by discussion of new paradigms, the ordered event list (OEL) and the decision tree, that result from... [Contents fragments: 4.2.1 Brief Overview of the Decision Tree Paradigm; 4.2.2 OEL Explained; Figure 3, a depiction of a notional fault/activation tree.]

  8. Statistical study of mirror mode events in the Earth magnetosheath

    NASA Astrophysics Data System (ADS)

    Genot, V.; Budnik, E.; Jacquey, C.; Sauvaud, J.; Dandouras, I.; Lucek, E.

    2006-12-01

    Using a search and classification tool developed at CDPP (Centre de Données de la Physique des Plasmas, http://cdpp.cesr.fr), we investigate the physics of the mirror instability. Indeed, both analytical and observational recent studies have shown the paramount importance of this instability in the development of magnetosheath turbulence and its potential role in reconnection. Five years of Cluster data have been mined by our tool, which can be intuitively parameterized and set up with specific constraints on the actual data content. The strength of the method is illustrated by our results concerning the efficiency of different identification procedures. Beyond the presentation of the general mirror mode event distribution in the magnetosheath, some of the key questions we address include: the evolution of the wave amplitude with the fractional distance to the boundaries (bow shock/magnetopause); mirror structure behaviour in relation to (1) local parameters (plasma beta, temperature anisotropy) and (2) conditioning parameters (solar wind Mach numbers, IMF orientation); tests of theoretical expressions obtained with different closure equations; ... The implications of these results for mirror mode modeling are discussed.

  9. Development of analytical methods for multiplex bio-assay with inductively coupled plasma mass spectrometry.

    PubMed

    Ornatsky, Olga I; Kinach, Robert; Bandura, Dmitry R; Lou, Xudong; Tanner, Scott D; Baranov, Vladimir I; Nitz, Mark; Winnik, Mitchell A

    2008-01-01

    Advances in the development of highly multiplexed bio-analytical assays with inductively coupled plasma mass spectrometry (ICP-MS) detection are discussed. Use of novel reagents specifically designed for immunological methods utilizing elemental analysis is presented. The major steps of method development, including selection of elements for tags, validation of tagged reagents, and examples of multiplexed assays, are considered in detail. The paper further describes experimental protocols for elemental tagging of antibodies, immunostaining of live and fixed human leukemia cells, and preparation of samples for ICP-MS analysis. Quantitative analysis of surface antigens on model cell lines using a cocktail of seven lanthanide labeled antibodies demonstrated high specificity and concordance with conventional immunophenotyping.

  10. An Interoperable Electronic Medical Record-Based Platform for Personalized Predictive Analytics

    ERIC Educational Resources Information Center

    Abedtash, Hamed

    2017-01-01

    Precision medicine refers to the delivering of customized treatment to patients based on their individual characteristics, and aims to reduce adverse events, improve diagnostic methods, and enhance the efficacy of therapies. Among efforts to achieve the goals of precision medicine, researchers have used observational data for developing predictive…

  11. Methods of Estimating Strategic Intentions

    DTIC Science & Technology

    1982-05-01

    ..."mental images of future events" which might be brought to reality. To assess this step in the intention process the analyst may have to consider the... [Contents fragments: II. Description of the Intention Estimation Process and Related Analytical Aids; ... of Aids and Process Summary; V. Bibliography.]

  12. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    PubMed

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool provides the possibility to find the steps in an analytical procedure with higher impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction, and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events caused by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, this technique is challenging to implement in a quality control laboratory: it is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and to assess and manage method performance risk. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods, using the risk analysis tool hazard analysis and critical control points. This tool makes it possible to find the method procedural steps with higher impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the better alternative analytical methodology for residual cellular DNA analysis. © PDA, Inc. 2018.

  13. To Demonstrate the Specificity of an Enzymatic Method for Plasma Paracetamol Estimation.

    ERIC Educational Resources Information Center

    O'Mullane, John A.

    1987-01-01

    Describes an experiment designed to introduce biochemistry students to the specificity of an analytical method which uses an enzyme to quantitate its substrate. Includes the use of toxicity charts together with the concept of the biological half-life of a drug. (TW)

  14. Analysis of Carbamate Pesticides: Validation of Semi-Volatile Analysis by HPLC-MS/MS by EPA Method MS666

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owens, J; Koester, C

    The Environmental Protection Agency's (EPA) Region 5 Chicago Regional Laboratory (CRL) developed a method for analysis of aldicarb, bromadiolone, carbofuran, oxamyl, and methomyl in water by high performance liquid chromatography tandem mass spectrometry (HPLC-MS/MS), titled Method EPA MS666. This draft standard operating procedure (SOP) was distributed to multiple EPA laboratories and to Lawrence Livermore National Laboratory, which was tasked to serve as a reference laboratory for EPA's Environmental Reference Laboratory Network (ERLN) and to develop and validate analytical procedures. The primary objective of this study was to validate and verify the analytical procedures described in MS666 for analysis of carbamate pesticides in aqueous samples. The gathered data from this validation study will be used to: (1) demonstrate analytical method performance; (2) generate quality control acceptance criteria; and (3) revise the SOP to provide a validated method that would be available for use during a homeland security event. The data contained in this report will be compiled, by EPA CRL, with data generated by other EPA Regional laboratories so that performance metrics of Method EPA MS666 can be determined.

  15. Analysis of Thiodiglycol: Validation of Semi-Volatile Analysis by HPLC-MS/MS by EPA Method MS777

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owens, J; Koester, C

    The Environmental Protection Agency's (EPA) Region 5 Chicago Regional Laboratory (CRL) developed a method for the analysis of thiodiglycol, the breakdown product of the sulfur mustard HD, in water by high performance liquid chromatography tandem mass spectrometry (HPLC-MS/MS), titled Method EPA MS777 (hereafter referred to as EPA CRL SOP MS777). This draft standard operating procedure (SOP) was distributed to multiple EPA laboratories and to Lawrence Livermore National Laboratory, which was tasked to serve as a reference laboratory for EPA's Environmental Reference Laboratory Network (ERLN) and to develop and validate analytical procedures. The primary objective of this study was to verify the analytical procedures described in MS777 for analysis of thiodiglycol in aqueous samples. The gathered data from this study will be used to: (1) demonstrate analytical method performance; (2) generate quality control acceptance criteria; and (3) revise the SOP to provide a validated method that would be available for use during a homeland security event. The data contained in this report will be compiled, by EPA CRL, with data generated by other EPA Regional laboratories so that performance metrics of Method EPA MS777 can be determined.

  16. Measuring target detection performance in paradigms with high event rates.

    PubMed

    Bendixen, Alexandra; Andersen, Søren K

    2013-05-01

    Combining behavioral and neurophysiological measurements inevitably implies mutual constraints, such as when the neurophysiological measurement requires fast-paced stimulus presentation and hence the attribution of a behavioral response to a particular preceding stimulus becomes ambiguous. We develop and test a method for validly assessing behavioral detection performance in spite of this ambiguity. We examine four approaches taken in the literature to treat such situations. We analytically derive a new variant of computing the classical parameters of signal detection theory, hit and false alarm rates, adapted to fast-paced paradigms. Each of the previous approaches shows specific shortcomings (susceptibility towards response window choice, biased estimates of behavioral detection performance). Superior performance of our new approach is demonstrated for both simulated and empirical behavioral data. Further evidence is provided by reliable correspondence between behavioral performance and the N2b component as an electrophysiological indicator of target detection. The appropriateness of our approach is substantiated by both theoretical and empirical arguments. We demonstrate an easy-to-implement solution for measuring target detection performance independent of the rate of event presentation. Thus overcoming the measurement bias of previous approaches, our method will help to clarify the behavioral relevance of different measures of cortical activation. Copyright © 2012 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
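
    The paper's corrected estimator is derived analytically and not reproduced in the abstract, but the bookkeeping problem it solves can be illustrated with a naive window-based attribution on toy data; note that the response at 1.6 s falls inside the window of both a target and a standard, which is exactly the ambiguity the method addresses.

        # Naive window-based attribution in a fast-paced stream (toy data).
        targets   = [1.0, 4.0, 7.5]                  # target onset times (s)
        standards = [0.5, 1.5, 2.0, 3.0, 5.0, 6.0]   # non-target onsets (s)
        responses = [1.6, 7.9]                       # button-press times (s)
        WINDOW = (0.1, 1.0)                          # accepted response latency range (s)

        def attributed(stimuli, responses, window):
            """Count stimuli with at least one response inside the latency window."""
            lo, hi = window
            return sum(any(lo <= r - s <= hi for r in responses) for s in stimuli)

        hits = attributed(targets, responses, WINDOW)
        fas  = attributed(standards, responses, WINDOW)
        print(f"hit rate = {hits/len(targets):.2f}, "
              f"false alarm rate = {fas/len(standards):.2f}")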

  17. Analytical performance specifications for changes in assay bias (Δbias) for data with logarithmic distributions as assessed by effects on reference change values.

    PubMed

    Petersen, Per H; Lund, Flemming; Fraser, Callum G; Sölétormos, György

    2016-11-01

    Background: The distributions of within-subject biological variation are usually described as coefficients of variation, as are analytical performance specifications for bias, imprecision and other characteristics. Estimation of specifications required for reference change values is traditionally done using the relationship between the batch-related changes during routine performance, described as Δbias, and the coefficients of variation for analytical imprecision (CVA): the original theory is based on standard deviations or coefficients of variation calculated as if distributions were Gaussian. Methods: The distribution of between-subject biological variation can generally be described as log-Gaussian. Moreover, recent analyses of within-subject biological variation suggest that many measurands have log-Gaussian distributions. In consequence, we generated a model for the estimation of analytical performance specifications for the reference change value, combining Δbias and CVA based on log-Gaussian distributions of CVI as natural logarithms. The model was tested using plasma prolactin and glucose as examples. Results: Analytical performance specifications for the reference change value generated using the new model based on log-Gaussian distributions were practically identical to those from the traditional model based on Gaussian distributions. Conclusion: The traditional and simple-to-apply model used to generate analytical performance specifications for the reference change value, based on the use of coefficients of variation and assuming Gaussian distributions for both CVI and CVA, is generally useful.
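
    A sketch of the two formulations being compared, assuming the standard reference change value formulas: the classical symmetric Gaussian form, and one published log-normal variant in which the combined CV is converted to a log-scale standard deviation and the limits become asymmetric. The CV values below are hypothetical.

        import math

        def rcv_gaussian(cv_a, cv_i, z=1.96):
            """Classical symmetric reference change value (CVs as fractions)."""
            return z * math.sqrt(2.0) * math.sqrt(cv_a**2 + cv_i**2)

        def rcv_lognormal(cv_a, cv_i, z=1.96):
            """Asymmetric RCV assuming log-Gaussian variation."""
            sigma = math.sqrt(math.log(1.0 + cv_a**2 + cv_i**2))
            up = math.exp(z * math.sqrt(2.0) * sigma) - 1.0
            down = math.exp(-z * math.sqrt(2.0) * sigma) - 1.0
            return up, down

        cv_a, cv_i = 0.05, 0.15   # hypothetical analytical and within-subject CVs
        print(f"symmetric RCV  = +/-{rcv_gaussian(cv_a, cv_i):.1%}")
        up, down = rcv_lognormal(cv_a, cv_i)
        print(f"log-normal RCV = +{up:.1%} / {down:.1%}")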

  18. Bayesian analysis of caustic-crossing microlensing events

    NASA Astrophysics Data System (ADS)

    Cassan, A.; Horne, K.; Kains, N.; Tsapras, Y.; Browne, P.

    2010-06-01

    Aims: Caustic-crossing binary-lens microlensing events are important anomalous events because they are capable of revealing an extrasolar planet companion orbiting the lens star. Fast and robust modelling methods are thus of prime interest in helping to decide whether a planet is detected by an event. Cassan introduced a new set of parameters to model binary-lens events, which are closely related to properties of the light curve. In this work, we explain how Bayesian priors can be added to this framework, and investigate interesting options. Methods: We develop a mathematical formulation that allows us to compute analytically the priors on the new parameters, given some previous knowledge about other physical quantities. We explicitly compute the priors for a number of interesting cases, and show how this can be implemented in a fully Bayesian, Markov chain Monte Carlo algorithm. Results: Using Bayesian priors can accelerate microlens fitting codes by reducing the time spent considering physically implausible models, and helps us to discriminate between alternative models based on the physical plausibility of their parameters.
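
    Schematically, adding a prior to an MCMC fit just adds a log-prior term to the log-posterior used in the accept/reject step. The toy Metropolis sampler below (one parameter, with Gaussian stand-ins for both the prior and the light-curve likelihood, none of it specific to the paper's parameterization) shows the mechanics.

        import numpy as np

        rng = np.random.default_rng(1)

        def log_prior(theta):
            """Hypothetical analytic prior on one model parameter."""
            return -0.5 * ((theta - 1.0) / 0.5) ** 2

        def log_like(theta):
            """Stand-in for the light-curve likelihood."""
            return -0.5 * ((theta - 1.4) / 0.2) ** 2

        # Metropolis sampler: the prior steers the chain away from
        # physically implausible parameter values.
        theta, chain = 0.0, []
        for _ in range(20000):
            prop = theta + rng.normal(0.0, 0.3)
            if np.log(rng.random()) < (log_prior(prop) + log_like(prop)) \
                                    - (log_prior(theta) + log_like(theta)):
                theta = prop
            chain.append(theta)
        print(f"posterior mean ~ {np.mean(chain[5000:]):.2f}")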

  19. Dynamic Analytics-Driven Assessment of Vulnerabilities and Exploitation

    DTIC Science & Technology

    2016-07-15

    ...integration with big data technologies such as Hadoop, nor does it natively support exporting of events to external relational databases. OSSIM supports... power of big data analytics to determine correlations and temporal causality among vulnerabilities and cyber events. The vulnerability dependencies... via the SCAPE (formerly known as LLCySA [6]). This is illustrated as a big data cyber analytic system architecture in...

  20. Bridging the Gap between Human Judgment and Automated Reasoning in Predictive Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanfilippo, Antonio P.; Riensche, Roderick M.; Unwin, Stephen D.

    2010-06-07

    Events occur daily that impact the health, security and sustainable growth of our society. If we are to address the challenges that emerge from these events, anticipatory reasoning has to become an everyday activity. Strong advances have been made in using integrated modeling for analysis and decision making. However, a wider impact of predictive analytics is currently hindered by the lack of systematic methods for integrating predictive inferences from computer models with human judgment. In this paper, we present a predictive analytics approach that supports anticipatory analysis and decision-making through a concerted reasoning effort that interleaves human judgment and automated inferences. We describe a systematic methodology for integrating modeling algorithms within a serious gaming environment in which role-playing by human agents provides updates to model nodes and the ensuing model outcomes in turn influence the behavior of the human players. The approach ensures a strong functional partnership between human players and computer models while maintaining a high degree of independence and greatly facilitating the connection between model and game structures.

  1. Biolayer modeling and optimization for the SPARROW biosensor

    NASA Astrophysics Data System (ADS)

    Feng, Ke

    2007-12-01

    Biosensor direct detection of molecular binding events is of significant interest in applications from molecular screening for cancer drug design to bioagent detection for homeland security and defense. The Stacked Planar Affinity Regulated Resonant Optical Waveguide (SPARROW) structure based on coupled waveguides was recently developed to achieve increased sensitivity within a fieldable biosensor device configuration. Under ideal operating conditions, modification of the effective propagation constant of the structure's sensing waveguide through selective attachment of specific targets to probes on the waveguide surface results in a change in the coupling characteristics of the guide over a specifically designed interaction length with the analyte. Monitoring the relative power in each waveguide after interaction enables 'recognition' of those targets which have selectively bound to the surface. However, fabrication tolerances, waveguide interface roughness, biolayer surface roughness and biolayer partial coverage have an effect on biosensor behavior and achievable limit of detection (LOD). In addition to these influences, which play a role in device optimization, the influence of the spatially random surface loading of molecular binding events has to be considered, especially for low surface coverage. In this dissertation, an analytic model that accounts for these nonidealities is established for the SPARROW biosensor, with which the design of the biosensor can be guided and optimized. For the idealized case of uniform waveguide transducer layers and biolayer, both theoretical simulation (analytical expression) and computer simulation (numerical calculation) are completed. For the nonideal case of an inhomogeneous transducer with nonideal waveguide and biolayer surfaces, device output power is affected by such physical influences as surface scattering, coupling length, absorption, and percent coverage of binding events. Using grating and perturbation techniques, we explore the influence of imperfect surfaces and random surface loading on scattering loss and coupling length. Results provide a range of achievable limits of detection in the SPARROW device for a given target size, surface loading, and detectable optical power.
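
    The detection principle rests on standard two-waveguide coupled-mode theory: under phase matching, the cross-coupled power fraction oscillates as sin^2(kappa*z), and a binding-induced mismatch delta-beta in the sensing guide's propagation constant reduces the transferred fraction. A textbook sketch (not the SPARROW-specific model, and with arbitrary parameter values):

        import numpy as np

        def cross_power(kappa, dbeta, z):
            """Coupled-mode theory: fraction of power in the cross guide at length z.

            kappa: coupling coefficient (1/m); dbeta: propagation-constant mismatch (1/m).
            """
            g = np.sqrt(kappa**2 + (dbeta / 2.0)**2)
            return (kappa / g)**2 * np.sin(g * z)**2

        kappa = 2.0e3                      # hypothetical coupling coefficient, 1/m
        L = np.pi / (2 * kappa)            # full-transfer length when phase matched
        for dbeta in (0.0, 1.0e3, 4.0e3):  # binding-induced detuning (illustrative)
            print(f"dbeta = {dbeta:6.0f} 1/m -> cross power {cross_power(kappa, dbeta, L):.3f}")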

  2. Device and method for enhanced collection and assay of chemicals with high surface area ceramic

    DOEpatents

    Addleman, Raymond S.; Li, Xiaohong Shari; Chouyyok, Wilaiwan; Cinson, Anthony D.; Bays, John T.; Wallace, Krys

    2016-02-16

    A method and device for enhanced capture of target analytes is disclosed. This invention relates to collection of chemicals for separations and analysis. More specifically, this invention relates to a solid phase microextraction (SPME) device having better capability for chemical collection and analysis. This includes better physical stability, capacity for chemical collection, flexible surface chemistry and high affinity for target analyte.

  3. Structure-Based Prediction of Unstable Regions in Proteins: Applications to Protein Misfolding Diseases

    NASA Astrophysics Data System (ADS)

    Guest, Will; Cashman, Neil; Plotkin, Steven

    2009-03-01

    Protein misfolding is a necessary step in the pathogenesis of many diseases, including Creutzfeldt-Jakob disease (CJD) and familial amyotrophic lateral sclerosis (fALS). Identifying unstable structural elements in their causative proteins elucidates the early events of misfolding and presents targets for inhibition of the disease process. An algorithm was developed to calculate the Gibbs free energy of unfolding for all sequence-contiguous regions of a protein using three methods to parameterize energy changes: a modified Gō model, changes in solvent-accessible surface area, and solution of the Poisson-Boltzmann equation. The entropic effects of disulfide bonds and post-translational modifications are treated analytically. It incorporates a novel method for finding local dielectric constants inside a protein to accurately handle charge effects. We have predicted the unstable parts of prion protein and superoxide dismutase 1, the proteins involved in CJD and fALS respectively, and have used these regions as epitopes to prepare antibodies that are specific to the misfolded conformation and show promise as therapeutic agents.
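
    The scan over all sequence-contiguous regions can be sketched as a brute-force search for the window with the lowest summed unfolding free energy; the per-residue values below are invented, and the real algorithm's three energy parameterizations, disulfide terms, and dielectric treatment are not modeled.

        import itertools

        # Hypothetical per-residue unfolding free energies (kcal/mol).
        dG_res = [1.2, 0.8, -0.1, 0.3, 1.5, 0.2, -0.4, 0.1, 0.9, 1.1]

        def least_stable_region(dg, min_len=3):
            """Return (total dG, start, stop) of the least stable contiguous window."""
            best = None
            for i, j in itertools.combinations(range(len(dg) + 1), 2):
                if j - i >= min_len:
                    total = sum(dg[i:j])
                    if best is None or total < best[0]:
                        best = (total, i, j)
            return best

        total, i, j = least_stable_region(dG_res)
        print(f"least stable region: residues {i}..{j-1}, dG = {total:.1f} kcal/mol")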

  4. Identifying Unstable Regions of Proteins Involved in Misfolding Diseases

    NASA Astrophysics Data System (ADS)

    Guest, Will; Cashman, Neil; Plotkin, Steven

    2009-05-01

    Protein misfolding is a necessary step in the pathogenesis of many diseases, including Creutzfeldt-Jakob disease (CJD) and familial amyotrophic lateral sclerosis (fALS). Identifying unstable structural elements in their causative proteins elucidates the early events of misfolding and presents targets for inhibition of the disease process. An algorithm was developed to calculate the Gibbs free energy of unfolding for all sequence-contiguous regions of a protein using three methods to parameterize energy changes: a modified Gō model, changes in solvent-accessible surface area, and all-atom molecular dynamics. The entropic effects of disulfide bonds and post-translational modifications are treated analytically. It incorporates a novel method for finding local dielectric constants inside a protein to accurately handle charge effects. We have predicted the unstable parts of prion protein and superoxide dismutase 1, the proteins involved in CJD and fALS respectively, and have used these regions as epitopes to prepare antibodies that are specific to the misfolded conformation and show promise as therapeutic agents.

  5. Reduced Order Modeling for Rapid Simulations of Blast and Rollover Events of a Ground Vehicle and its Occupants Using Rigid Body Dynamic Models

    DTIC Science & Technology

    2013-03-11

    ...information is available. A 4-point harness system including lap and shoulder belts and center buckle was positioned on the dummy, as shown in Figure 3.1... [Acknowledgment fragments: Lamb, STE/Analytics, US Army TARDEC; Dr. Tom McGrath, US Navy NSWC-IHD; Mr. Kirk Miller, OCP-TECD Standards and Specifications, TARDEC/GSS; Mr. ...]

  6. Development and in-house validation of the event-specific polymerase chain reaction detection methods for genetically modified soybean MON89788 based on the cloned integration flanking sequence.

    PubMed

    Liu, Jia; Guo, Jinchao; Zhang, Haibo; Li, Ning; Yang, Litao; Zhang, Dabing

    2009-11-25

    Various polymerase chain reaction (PCR) methods have been developed for the enforcement of genetically modified organism (GMO) labeling policies, of which event-specific PCR detection based on the flanking sequence of the exogenous integration is the primary trend in GMO detection due to its high specificity. In this study, the 5' and 3' flanking sequences of the exogenous integration of MON89788 soybean were revealed by thermal asymmetric interlaced PCR. Event-specific PCR primers and a TaqMan probe were designed based upon the revealed 5' flanking sequence, and qualitative and quantitative PCR assays were established employing these designed primers and probes. In the qualitative PCR, the limit of detection (LOD) was about 0.01 ng of genomic DNA, corresponding to 10 copies of the haploid soybean genome. In the quantitative PCR assay, the LOD was as low as two haploid genome copies, and the limit of quantification was five haploid genome copies. Furthermore, the developed PCR methods were validated in-house by five researchers, and the validation results indicated that the developed event-specific PCR methods can be used for identification and quantification of MON89788 soybean and its derivatives.

  7. The Fibonacci Life-Chart Method (FLCM) as a foundation for Carl Jung's theory of synchronicity.

    PubMed

    Sacco, Robert G

    2016-04-01

    Since the scientific method requires events to be subject to controlled examination, it would seem that synchronicities are not scientifically investigable. Jung speculated that because these incredible events are like the random sparks of a firefly they cannot be pinned down. However, doubting Jung's doubts, the author proposes a possible method of elucidating these seemingly random and elusive events. The author draws on a new method, designated the Fibonacci Life-Chart Method (FLCM), which categorizes phase transitions and Phi fractal scaling in human development based on the occurrence of Fibonacci numbers in biological cell division and self-organizing systems. The FLCM offers an orientation towards psychological experience that may have relevance to Jung's theory of synchronicity, in which connections are deemed to be intrinsically meaningful rather than demonstrable consequences of cause and effect. In such a model, synchronistic events can be seen, as the self-organizing system enlarges, as manifestations of self-organized critical moments and Phi fractal scaling. Recommendations for future studies include testing the results of the FLCM using case reports of synchronistic and spiritual experiences. © 2016, The Society of Analytical Psychology.

  8. A short history, principles, and types of ELISA, and our laboratory experience with peptide/protein analyses using ELISA.

    PubMed

    Aydin, Suleyman

    2015-10-01

    Peptides and proteins play a critical role in the metabolic homeostasis of living systems, and their circulating concentrations are influenced by a variety of patho-physiological events. These peptide/protein concentrations in biological fluids are measured using various methods, the most common of which is the enzymatic immunoassay (EIA/ELISA); the results guide clinicians in diagnosing and monitoring diseases that afflict biological systems. All techniques in which enzymes are employed to reveal antigen-antibody reactions are generally referred to as enzymatic immunoassay (EIA/ELISA) methods, since the basic principles of EIA and ELISA are the same. The main objective of this review is to present an overview of the historical journey that led to the invention of EIA/ELISA, an indispensable method for medical and research laboratories; the types of ELISA developed after its invention [direct (the first ELISA method invented), indirect, sandwich and competitive methods]; the problems encountered during peptide/protein analyses (pre-analytical, analytical and post-analytical); the rules to be followed to prevent these problems; and our laboratory experience of more than 15 years. Copyright © 2015 Elsevier Inc. All rights reserved.

  9. Development of the Vaccine Analytic Unit's research agenda for investigating potential adverse events associated with anthrax vaccine adsorbed.

    PubMed

    Payne, Daniel C; Franzke, Laura H; Stehr-Green, Paul A; Schwartz, Benjamin; McNeil, Michael M

    2007-01-01

    In 2002, the Centers for Disease Control and Prevention established the Vaccine Analytic Unit (VAU) in collaboration with the Department of Defense (DoD). The focus of this report is to describe the process by which the VAU's anthrax vaccine safety research plan was developed, following a comprehensive review of candidate topics. Public health literature, surveillance data, and clinical sources were reviewed to create a list of adverse events hypothesized to be potentially related to anthrax vaccine adsorbed (AVA). From this list, a consensus process was used to select 11 important research topics. Adverse event background papers were written for each of these topics, addressing predetermined criteria. These were independently reviewed and ranked by a National Vaccine Advisory Committee (NVAC) workgroup. The adverse events included in the final priority list will be the subject of observational or other postmarketing surveillance studies using the Defense Medical Surveillance System (DMSS) database. A review of various information sources identified over 100 potential adverse events, and the review process recommended 11 topics as potentially warranting further study. The NVAC workgroup identified the following adverse event topics for study: arthritis, optic neuritis, and Stevens-Johnson syndrome/toxic epidermal necrolysis. Two additional topics (systemic lupus erythematosus (SLE) and multiple, near-concurrent military vaccinations) were added in response to emerging public health and military concerns. The experience described, while specific to establishing the VAU's research agenda for the safety of the current anthrax vaccine, may be useful and adaptable for research planning in other areas of public health research. Copyright (c) 2006 John Wiley & Sons, Ltd.

  10. Coping with Volume and Variety in Temporal Event Sequences: Strategies for Sharpening Analytic Focus.

    PubMed

    Fan Du; Shneiderman, Ben; Plaisant, Catherine; Malik, Sana; Perer, Adam

    2017-06-01

    The growing volume and variety of data presents both opportunities and challenges for visual analytics. Addressing these challenges is needed for big data to provide valuable insights and novel solutions for business, security, social media, and healthcare. In the case of temporal event sequence analytics, it is the number of events in the data and the variety of temporal sequence patterns that challenge users of visual analytic tools. This paper describes 15 strategies for sharpening analytic focus that analysts can use to reduce the data volume and pattern variety. Four groups of strategies are proposed: (1) extraction strategies, (2) temporal folding, (3) pattern simplification strategies, and (4) iterative strategies. For each strategy, we provide examples of its use and of its impact on volume and/or variety. Examples are selected from 20 case studies gathered from our own work, from the literature, or from email interviews with individuals who conducted the analyses and developers who observed analysts using the tools. Finally, we discuss how these strategies might be combined, and report on the feedback from 10 senior event sequence analysts.

  11. Investigation of 2-stage meta-analysis methods for joint longitudinal and time-to-event data through simulation and real data application.

    PubMed

    Sudell, Maria; Tudur Smith, Catrin; Gueyffier, François; Kolamunnage-Dona, Ruwanthi

    2018-04-15

    Joint modelling of longitudinal and time-to-event data is often preferred over separate longitudinal or time-to-event analyses, as it can account for study dropout, error in longitudinally measured covariates, and correlation between longitudinal and time-to-event outcomes. The joint modelling literature focuses mainly on the analysis of single studies, with no methods currently available for the meta-analysis of joint model estimates from multiple studies. We propose a 2-stage method for meta-analysis of joint model estimates. The method is applied to the INDANA dataset to combine joint model estimates of systolic blood pressure with time to death, time to myocardial infarction, and time to stroke. Results are compared to meta-analyses of separate longitudinal or time-to-event models. A simulation study is conducted to contrast separate versus joint analyses over a range of scenarios. Using the real dataset, similar results were obtained from the separate and joint analyses. However, the simulation study indicated a benefit of joint rather than separate methods in a meta-analytic setting where association exists between the longitudinal and time-to-event outcomes. Where evidence of such association exists, estimates from joint models, rather than from standalone analyses, should be pooled in 2-stage meta-analyses. © 2017 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
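
    As an illustration of the second stage of such a 2-stage approach, the sketch below pools hypothetical study-specific estimates (e.g. the association parameter linking the longitudinal and time-to-event outcomes from each study's joint model) by standard inverse-variance weighting. All numbers are invented, and the paper's actual estimators and any random-effects extensions may differ.

        import numpy as np

        def pool_fixed_effect(estimates, std_errors):
            """Inverse-variance (fixed-effect) pooling of study-specific
            estimates, e.g. association parameters from per-study joint models."""
            estimates = np.asarray(estimates, dtype=float)
            weights = 1.0 / np.asarray(std_errors, dtype=float) ** 2
            pooled = np.sum(weights * estimates) / np.sum(weights)
            pooled_se = np.sqrt(1.0 / np.sum(weights))
            return pooled, pooled_se

        # Hypothetical stage-1 output: log hazard ratios from three studies.
        est = [0.42, 0.35, 0.51]
        se = [0.10, 0.15, 0.12]
        beta, se_beta = pool_fixed_effect(est, se)
        print(f"pooled log-HR = {beta:.3f} (SE {se_beta:.3f})")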

  12. Nuclear and analytical methods for investigation of high quality wines

    NASA Astrophysics Data System (ADS)

    Tonev, D.; Geleva, E.; Grigorov, T.; Goutev, N.; Protohristov, H.; Stoyanov, Ch; Bashev, V.; Tringovska, I.; Kostova, D.

    2018-05-01

    Nuclear and analytical methods can help to determine the year of production (vintage) and the geographical provenance of high quality wines. A complex analytical investigation of Melnik fine wines from the “Artarkata” vineyards, Vinogradi village near Melnik in Southwest Bulgaria, was performed using different methods and equipment. Nuclear methods based on the measured gamma-ray activity of 137Cs and the specific activity of 3H can be used to determine the year of wine production. The specific activity of 137Cs was measured in wines from different vintages using low-background high-resolution gamma spectrometry. Tritium measurements in wine samples were carried out by low-level liquid scintillation counting in a Packard Tri-Carb 2770 TR/SL liquid scintillation analyzer. The identification of the origin of wines using their chemical fingerprints is of great interest to wine consumers and producers. Sixteen chemical elements were determined in samples of soil, vine stems, vine leaves and fine wine of the Shiroka Melnishka variety, grown in a typical Melnik vineyard, using inductively coupled plasma-optical emission spectrometry (ICP-OES).
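
    For intuition only, the following sketch applies the radioactive decay law to tritium. In practice, vintage dating calibrates the measured activity against an environmental (bomb-pulse) reference curve rather than a single assumed initial activity, so the numbers here are purely hypothetical.

        import math

        T_HALF_H3 = 12.32  # tritium half-life in years (literature value)

        def decay_age(activity_now, activity_at_bottling):
            """Years elapsed for a measured tritium activity to decay from the
            activity assumed at bottling: A(t) = A0 * exp(-lambda * t)."""
            lam = math.log(2) / T_HALF_H3
            return math.log(activity_at_bottling / activity_now) / lam

        # Hypothetical activities in TU (tritium units); A0 would in practice
        # be read off a reference curve for the candidate vintage year.
        print(f"estimated age: {decay_age(4.0, 12.0):.1f} years")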

  13. Ion-pairing HPLC methods to determine EDTA and DTPA in small molecule and biological pharmaceutical formulations.

    PubMed

    Wang, George; Tomasella, Frank P

    2016-06-01

    Ion-pairing high-performance liquid chromatography-ultraviolet (HPLC-UV) methods were developed to determine two commonly used chelating agents: ethylenediaminetetraacetic acid (EDTA) in Abilify® (a small-molecule drug with aripiprazole as the active pharmaceutical ingredient) oral solution and diethylenetriaminepentaacetic acid (DTPA) in Yervoy® (a monoclonal antibody drug with ipilimumab as the active pharmaceutical ingredient) intravenous formulation. Since the analytes, EDTA and DTPA, do not contain chromophores, transition metal ions (Cu2+, Fe3+), which form highly stable metallocomplexes with the chelating agents, were added during sample preparation to enhance UV detection. The use of metallocomplexes with ion-pairing chromatography provides the ability to achieve the desired sensitivity and selectivity in method development: the metallocomplex formation during sample preparation allowed sensitive UV detection. Copper was utilized for the determination of EDTA and iron for the determination of DTPA. In the case of EDTA, a gradient mobile phase separated the components of the formulation from the analyte; in the method for DTPA, the active drug substance, ipilimumab, was eluted in the void. In addition, optimization of the ion-pairing reagent concentration is discussed as a means of enhancing both the retention of the aminopolycarboxylic acids (APCAs), including EDTA and DTPA, and the specificity of the method. The analytical method development was designed around the chromatographic properties of the analytes, the nature of the sample matrix and the intended purpose of the method. Validation data are presented for the two methods. Finally, both methods were successfully utilized in determining the fate of the chelates.

  14. Development of analytical methods for multiplex bio-assay with inductively coupled plasma mass spectrometry

    PubMed Central

    Ornatsky, Olga I.; Kinach, Robert; Bandura, Dmitry R.; Lou, Xudong; Tanner, Scott D.; Baranov, Vladimir I.; Nitz, Mark; Winnik, Mitchell A.

    2008-01-01

    Advances in the development of highly multiplexed bio-analytical assays with inductively coupled plasma mass spectrometry (ICP-MS) detection are discussed. Use of novel reagents specifically designed for immunological methods utilizing elemental analysis is presented. The major steps of method development, including selection of elements for tags, validation of tagged reagents, and examples of multiplexed assays, are considered in detail. The paper further describes experimental protocols for elemental tagging of antibodies, immunostaining of live and fixed human leukemia cells, and preparation of samples for ICP-MS analysis. Quantitative analysis of surface antigens on model cell lines using a cocktail of seven lanthanide labeled antibodies demonstrated high specificity and concordance with conventional immunophenotyping. PMID:19122859

  15. Earthquake Forecasting Through Semi-periodicity Analysis of Labeled Point Processes

    NASA Astrophysics Data System (ADS)

    Quinteros Cartaya, C. B. M.; Nava Pichardo, F. A.; Glowacka, E.; Gomez-Trevino, E.

    2015-12-01

    Large earthquakes show semi-periodic behavior as a result of critically self-organized processes of stress accumulation and release in a seismogenic region. Thus, large earthquakes in a region constitute semi-periodic sequences with recurrence times varying slightly from periodicity. Nava et al. (2013) and Quinteros et al. (2013) realized that not all earthquakes in a given region need belong to the same sequence, since there can be more than one process of stress accumulation and release in it; they also proposed a method to identify semi-periodic sequences through analytic Fourier analysis. This work presents improvements on that method: the influence of earthquake size on the spectral analysis and its importance in identifying semi-periodic events, which means that earthquake occurrence times are treated as a labeled point process; the estimation of appropriate upper-limit uncertainties to use in forecasts; and the use of Bayesian analysis to evaluate forecast performance. The improved method is applied to specific regions: the southwestern coast of Mexico, the northeastern Japan Arc, the San Andreas Fault zone at Parkfield, and northeastern Venezuela.
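
    A minimal sketch of the underlying idea, treating the catalogue as a labeled point process and scanning trial periods for a dominant spectral peak; the weighting scheme and uncertainty treatment in the actual method are more sophisticated, and all values below are invented.

        import numpy as np

        def labeled_point_spectrum(times, weights, periods):
            """Fourier-type spectrum of a labeled point process: each event at
            time t_i contributes a phasor weighted by its label w_i (e.g. a
            function of magnitude). Peaks suggest candidate recurrence periods."""
            times = np.asarray(times, dtype=float)
            weights = np.asarray(weights, dtype=float)
            power = []
            for T in periods:
                phasor = np.sum(weights * np.exp(-2j * np.pi * times / T))
                power.append(np.abs(phasor) ** 2 / np.sum(weights) ** 2)
            return np.array(power)

        # Hypothetical catalogue: occurrence years and magnitude-based weights.
        t = np.array([1900.0, 1931.5, 1963.2, 1995.1])
        w = np.array([7.2, 7.5, 7.1, 7.4])
        periods = np.linspace(20.0, 60.0, 401)
        best = periods[np.argmax(labeled_point_spectrum(t, w, periods))]
        print(f"dominant recurrence period ~ {best:.1f} years")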

  16. Comparison of three DNA extraction methods for the detection and quantification of GMO in Ecuadorian manufactured food.

    PubMed

    Pacheco Coello, Ricardo; Pestana Justo, Jorge; Factos Mendoza, Andrés; Santos Ordoñez, Efrén

    2017-12-20

    In Ecuador, food products must be labeled if the transgenic content exceeds 0.9% of the whole product. For the detection of genetically modified organisms (GMOs), three DNA extraction methods were tested on 35 food products commercialized in Ecuador. Samples with positive amplification of endogenous genes were screened for the presence of the Cauliflower mosaic virus 35S promoter (P35S) and the nopaline synthase terminator (Tnos). TaqMan™ probes were used to determine the transgenic content of the GTS 40-3-2 and MON810 events through quantitative PCR (qPCR). Twenty-six processed food samples were positive for P35S alone and eight samples for both Tnos and P35S. Absolute qPCR results indicated that eleven samples were positive for the GTS 40-3-2 event and two for the MON810 event. A total of nine samples exceeded the allowed threshold of transgenic content in the whole food product for the events GTS 40-3-2 and MON810. Different food products may require different DNA extraction protocols for GMO detection through PCR. Among the three methods tested, the DNeasy mericon food kit obtained the highest proportion of endogenous genes amplified through PCR. Finally, event-specific GMOs were detected in food products in Ecuador.
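
    A minimal sketch of the absolute quantification step, assuming hypothetical standard-curve parameters (slope and intercept of Ct versus log10 copy number) for the event-specific and endogenous assays; the laboratory workflow relies on calibrated reference materials not shown here.

        import numpy as np

        def copies_from_ct(ct, slope, intercept):
            """Convert a Ct value to copy number via a standard curve:
            Ct = slope * log10(copies) + intercept."""
            return 10 ** ((ct - intercept) / slope)

        def gmo_percent(ct_event, ct_endogenous, curve_event, curve_endo):
            """Transgenic content as the ratio of event-specific to endogenous
            (taxon-specific) copy numbers, expressed in percent."""
            event_copies = copies_from_ct(ct_event, *curve_event)
            endo_copies = copies_from_ct(ct_endogenous, *curve_endo)
            return 100.0 * event_copies / endo_copies

        # Hypothetical Ct values and standard-curve (slope, intercept) pairs.
        pct = gmo_percent(31.2, 24.8, (-3.32, 40.0), (-3.35, 39.5))
        print(f"GMO content: {pct:.2f} %")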

  17. Recovery and normalization of triple coincidences in PET

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lage, Eduardo, E-mail: elage@mit.edu; Parot, Vicente; Dave, Shivang R.

    2015-03-15

    Purpose: Triple coincidences in positron emission tomography (PET) are events in which three γ-rays are detected simultaneously. These events, though potentially useful for enhancing the sensitivity of PET scanners, are discarded or processed without special consideration in current systems, because there is no clear criterion for assigning them to a unique line-of-response (LOR). Methods proposed for recovering such events usually rely on highly specialized detection systems, hampering general adoption, and/or are based on Compton-scatter kinematics and, consequently, are limited in accuracy by the energy resolution of standard PET detectors. In this work, the authors propose a simple and general solution for recovering triple coincidences, which does not require specialized detectors or additional energy resolution requirements. Methods: To recover triple coincidences, the authors' method distributes such events among their possible LORs using the relative proportions of double coincidences in these LORs. The authors show analytically that this assignment scheme represents the maximum-likelihood solution for the triple-coincidence distribution problem. The PET component of a preclinical PET/CT scanner was adapted to enable the acquisition and processing of triple coincidences. Since the efficiencies for detecting double and triple events were found to differ throughout the scanner field-of-view, a normalization procedure specific to triple coincidences was also developed. The effect of including triple coincidences using their method was compared against equally weighting the triples among their possible LORs and against discarding all triple events. As figures of merit for this comparison, the authors used sensitivity, noise-equivalent count (NEC) rates and image quality calculated as described in the NEMA NU-4 protocol for the assessment of preclinical PET scanners. Results: The addition of triple-coincidence events with the authors' method increased the peak NEC rates of the scanner by 26.6% and 32% for mouse- and rat-sized objects, respectively. This increase in NEC-rate performance was also reflected in the image-quality metrics. Images reconstructed using double and triple coincidences recovered with their method had a better signal-to-noise ratio than those obtained using only double coincidences, while preserving spatial resolution and contrast. Distribution of triple coincidences using an equal-weighting scheme increased apparent system sensitivity but degraded image quality. The performance boost provided by the inclusion of triple coincidences using their method allowed the acquisition time of standard imaging procedures to be reduced by up to ∼25%. Conclusions: Recovering triple coincidences with the proposed method can effectively increase the sensitivity of current clinical and preclinical PET systems without compromising other parameters such as spatial resolution or contrast.
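
    A toy sketch of the assignment rule the abstract describes, distributing each group of triple coincidences among its candidate LORs in proportion to the double-coincidence counts on those LORs; the real implementation operates on full sinograms with the triples-specific normalization mentioned above.

        import numpy as np

        def distribute_triples(triple_counts, candidate_lors, double_counts):
            """Split each triple-coincidence group across its candidate LORs in
            proportion to the doubles already recorded on those LORs (the
            maximum-likelihood assignment described in the abstract)."""
            base = np.asarray(double_counts, dtype=float)
            out = base.copy()
            for n_triples, lors in zip(triple_counts, candidate_lors):
                idx = list(lors)
                w = base[idx]
                if w.sum() == 0:  # no prior evidence on any LOR: split equally
                    w = np.ones(len(idx))
                out[idx] += n_triples * w / w.sum()
            return out

        # Toy example: 5 LORs; one group of 12 triples shared by LORs (0, 3, 4).
        doubles = [100, 40, 60, 20, 80]
        print(distribute_triples([12], [(0, 3, 4)], doubles))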

  18. Simulating recurrent event data with hazard functions defined on a total time scale.

    PubMed

    Jahn-Eimermacher, Antje; Ingel, Katharina; Ozga, Ann-Kathrin; Preussler, Stella; Binder, Harald

    2015-03-08

    In medical studies with recurrent event data, a total time scale perspective is often needed to adequately reflect disease mechanisms. This means that the hazard process is defined on the time since some starting point, e.g. the beginning of a disease, in contrast to a gap time scale, where the hazard process restarts after each event. While techniques such as the Andersen-Gill model have been developed for analyzing data from a total time perspective, techniques for simulating such data, e.g. for sample size planning, have not been investigated so far. We have derived a simulation algorithm covering the Andersen-Gill model that can be used for sample size planning in clinical trials as well as for the investigation of modeling techniques. Specifically, we allow for fixed and/or random covariates and an arbitrary hazard function defined on a total time scale. Furthermore, we take into account that individuals may be temporarily insusceptible to a recurrent incidence of the event. The methods are based on conditional distributions of the inter-event times, conditional on the total time of the preceding event or study start. Closed-form solutions are provided for common distributions. The derived methods have been implemented in a readily accessible R script. The proposed techniques are illustrated by planning the sample size for a clinical trial with complex recurrent event data. The required sample size is shown to be affected not only by censoring and intra-patient correlation, but also by the presence of risk-free intervals. This demonstrates the need for a simulation algorithm that allows for complex study designs for which no analytical sample size formulas exist. The derived simulation algorithm is useful for the simulation of recurrent event data that follow an Andersen-Gill model. Next to the use of a total time scale, it allows for intra-patient correlation and risk-free intervals, as often observed in clinical trial data. Its application therefore allows the simulation of data that closely resemble real settings and can thus improve the use of simulation studies for designing and analysing studies.
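
    The paper's implementation is an R script; the sketch below illustrates the same conditional-distribution idea in Python for a Weibull total-time hazard with a subject-level frailty (all parameter values are hypothetical), drawing the next event time by inversion of Lambda(t) - Lambda(s) = Exp(1).

        import numpy as np

        rng = np.random.default_rng(1)

        def simulate_total_time(shape, scale, frailty, censor_time):
            """Simulate one subject's recurrent events with a Weibull hazard
            defined on the total time scale:
            Lambda(t) = frailty * (t / scale) ** shape.
            Given the previous event (or study start) at time s, the next event
            time solves Lambda(t) - Lambda(s) = E with E ~ Exp(1)."""
            events, s = [], 0.0
            while True:
                e = rng.exponential(1.0)
                t = scale * ((s / scale) ** shape + e / frailty) ** (1.0 / shape)
                if t > censor_time:
                    return events
                events.append(t)
                s = t

        # One subject with a hypothetical gamma frailty, followed for 5 years.
        z = rng.gamma(shape=2.0, scale=0.5)
        print(simulate_total_time(shape=1.5, scale=2.0, frailty=z, censor_time=5.0))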

  19. Safety and Waste Management for SAM Pathogen Methods

    EPA Pesticide Factsheets

    The General Safety and Waste Management page offers section-specific safety and waste management details for the pathogens included in EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM).

  20. Safety and Waste Management for SAM Biotoxin Methods

    EPA Pesticide Factsheets

    The General Safety and Waste Management page offers section-specific safety and waste management details for the biotoxins included in EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM).

  1. HIV cure research community engagement in North Carolina: a mixed-methods evaluation of a crowdsourcing contest.

    PubMed

    Mathews, Allison; Farley, Samantha; Blumberg, Meredith; Knight, Kimberley; Hightow-Weidman, Lisa; Muessig, Kate; Rennie, Stuart; Tucker, Joseph

    2017-10-01

    The purpose of this study was to evaluate the feasibility of using a crowdsourcing contest to promote HIV cure research community engagement. Crowdsourcing contests are open calls for community participation to achieve a task, in this case to engage local communities about HIV cure research. Our contest solicited images and videos of what HIV cure meant to people. Contestants submitted entries to IdeaScale, an encrypted online contest platform. We used a mixed-methods study design to evaluate the contest. Engagement was assessed through attendance at promotional events and social media user analytics. Google Analytics measured contest website user-engagement statistics. Text from contest video entries was transcribed, coded and analysed using MAXQDA. There were 144 attendees at three promotional events and 32 entries from 39 contestants. Most individuals who submitted entries were black (n = 31), had some college education (n = 18) and were aged 18-23 years (n = 23). Social media analytics showed 684 unique page followers, 2233 unique page visits, 585 unique video views and an overall reach of 80,624 unique users. Contest submissions covered themes related to the community's role in shaping the future of HIV cure through education, social justice, creativity and stigma reduction. Crowdsourcing contests are feasible for engaging community members in HIV cure research. Community contributions to crowdsourcing contests provide useful content for culturally relevant and locally responsive research engagement.

  2. VAP/VAT: video analytics platform and test bed for testing and deploying video analytics

    NASA Astrophysics Data System (ADS)

    Gorodnichy, Dmitry O.; Dubrofsky, Elan

    2010-04-01

    Deploying video analytics (VA) in operational environments is extremely challenging. This paper presents a methodological approach developed by the Video Surveillance and Biometrics Section (VSB) of the Science and Engineering Directorate (S&E) of the Canada Border Services Agency (CBSA) to resolve these problems. A three-phase approach to enable VA deployment within an operational agency is presented, and the Video Analytics Platform and Testbed (VAP/VAT) developed by the VSB section is introduced. In addition to allowing the integration of third-party and in-house VA codes into an existing video surveillance infrastructure, VAP/VAT also allows the agency to conduct an unbiased performance evaluation of the cameras and VA software available on the market. VAP/VAT consists of two components: EventCapture, which serves to automatically detect a "Visual Event", and EventBrowser, which serves to display and peruse the "Visual Details" captured at the "Visual Event". To deal with both open-architecture and closed-architecture cameras, two video-feed capture mechanisms have been developed within the EventCapture component: IPCamCapture and ScreenCapture.

  3. The analytical calibration in (bio)imaging/mapping of the metallic elements in biological samples--definitions, nomenclature and strategies: state of the art.

    PubMed

    Jurowski, Kamil; Buszewski, Bogusław; Piekoszewski, Wojciech

    2015-01-01

    Nowadays, studies related to the distribution of metallic elements in biological samples are one of the most important issues. There are many articles dedicated to specific analytical atomic spectrometry techniques used for mapping/(bio)imaging of metallic elements in various kinds of biological samples. However, this literature lacks articles reviewing calibration strategies and their problems, nomenclature, definitions, and the ways and methods used to obtain quantitative distribution maps. The aim of this article was to characterize analytical calibration in the (bio)imaging/mapping of metallic elements in biological samples, including (1) nomenclature, (2) definitions, and (3) selected, sophisticated examples of calibration strategies with analytical calibration procedures applied in the different analytical methods currently used to study an element's distribution in biological samples/materials, such as LA ICP-MS, SIMS, EDS, XRF and others. The main emphasis was placed on the procedures and methodology of the analytical calibration strategy. Additionally, this work aims to systematize the nomenclature for the calibration terms: analytical calibration, analytical calibration method, analytical calibration procedure and analytical calibration strategy. The authors also want to popularize a division of calibration methods different from those hitherto used. This article is the first work in the literature that refers to and emphasizes the many different and complex aspects of analytical calibration problems in studies related to (bio)imaging/mapping of metallic elements in different kinds of biological samples. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Predicting adverse hemodynamic events in critically ill patients.

    PubMed

    Yoon, Joo H; Pinsky, Michael R

    2018-06-01

    The art of predicting future hemodynamic instability in the critically ill has rapidly become a science with the advent of advanced analytical processes based on computer-driven machine learning techniques. How these methods have progressed beyond severity scoring systems to interface with decision support is summarized here. Data mining of large multidimensional clinical time-series databases using a variety of machine learning tools has led to our ability to identify alert artifact and filter it from bedside alarms, to display real-time risk stratification at the bedside to aid clinical decision-making, and to predict the subsequent development of cardiorespiratory insufficiency hours before these events occur. This fast-evolving field is primarily limited by the linkage of high-quality granular data to physiologic rationale across heterogeneous clinical care domains. Using advanced analytic tools to glean knowledge from clinical data streams is rapidly becoming a reality whose potential clinical impact is great.

  5. Isotope-ratio-monitoring gas chromatography-mass spectrometry: methods for isotopic calibration

    NASA Technical Reports Server (NTRS)

    Merritt, D. A.; Brand, W. A.; Hayes, J. M.

    1994-01-01

    In trial analyses of a series of n-alkanes, precise determinations of 13C contents were based on isotopic standards introduced by five different techniques, and the results were compared. Specifically, organic-compound standards were coinjected with the analytes and carried through chromatography and combustion with them; or CO2 was supplied from a conventional inlet and mixed with the analyte in the ion source; or CO2 was supplied from an auxiliary mixing volume and transmitted to the source without interruption of the analyte stream. Additionally, two techniques were investigated in which the analyte stream was diverted and CO2 standards were placed on a near-zero background. All methods provided accurate results. Where applicable, methods not involving interruption of the analyte stream provided the highest performance (σ = 0.00006 at.% 13C, or 0.06‰, for 250 pmol C as CO2 reaching the ion source), but great care was required. Techniques involving diversion of the analyte stream were immune to interference from coeluting sample components and still provided high precision (0.0001 ≤ σ ≤ 0.0002 at.%, or 0.1 ≤ σ ≤ 0.2‰).
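
    As a side note on units, the conversions between delta notation and atom percent that underlie such precision statements follow from the standard ratio definitions; a small sketch, assuming the commonly quoted VPDB reference ratio.

        R_VPDB = 0.0112372  # commonly quoted 13C/12C ratio of the VPDB standard

        def delta_to_atom_percent(delta13C):
            """Convert a delta-13C value (per mil vs. VPDB) to atom percent 13C."""
            r = R_VPDB * (1.0 + delta13C / 1000.0)
            return 100.0 * r / (1.0 + r)

        def atom_percent_to_delta(at_pct):
            """Inverse conversion, useful when comparing sigma quoted in at.%
            with sigma quoted in per mil."""
            r = (at_pct / 100.0) / (1.0 - at_pct / 100.0)
            return 1000.0 * (r / R_VPDB - 1.0)

        print(f"{delta_to_atom_percent(-25.0):.5f} at.%")  # typical organic value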

  6. Post-standardization of routine creatinine assays: are they suitable for clinical applications?

    PubMed

    Jassam, Nuthar; Weykamp, Cas; Thomas, Annette; Secchiero, Sandra; Sciacovelli, Laura; Plebani, Mario; Thelen, Marc; Cobbaert, Christa; Perich, Carmen; Ricós, Carmen; Paula, Faria A; Barth, Julian H

    2017-05-01

    Introduction Reliable serum creatinine measurements are of vital importance for the correct classification of chronic kidney disease and early identification of kidney injury. The National Kidney Disease Education Programme working group and other groups have defined clinically acceptable analytical limits for creatinine methods. The aim of this study was to re-evaluate the performance of routine creatinine methods in the light of these defined limits so as to assess their suitability for clinical practice. Method In collaboration with the Dutch External Quality Assurance scheme, six frozen commutable samples, with creatinine concentrations ranging from 80 to 239 μmol/L and traceable to isotope dilution mass spectrometry, were circulated to 91 laboratories in four European countries for creatinine measurement and estimated glomerular filtration rate calculation. Two of the six samples were spiked with glucose to give high and low final glucose concentrations. Results Results from 89 laboratories were analysed for bias and imprecision (%CV) for each creatinine assay and for total error of the estimated glomerular filtration rate. The participating laboratories used analytical instruments from four manufacturers: Abbott, Beckman, Roche and Siemens. All enzymatic methods in this study complied with the National Kidney Disease Education Programme working group's recommended bias limit of 5% at creatinine concentrations above 100 μmol/L. They also showed no evidence of interference from glucose and complied with the clinically recommended %CV of ≤4% across the analytical range. In contrast, the Jaffe methods showed variable performance with regard to glucose interference and unsatisfactory bias and precision. Conclusion Jaffe-based creatinine methods still exhibit considerable analytical variability in terms of bias, imprecision and lack of specificity, and this variability brings into question their clinical utility. We believe that clinical laboratories and manufacturers should work together to phase out the use of relatively non-specific Jaffe methods and replace them with more specific, enzyme-based methods.

  7. Computer modeling of lung cancer diagnosis-to-treatment process

    PubMed Central

    Ju, Feng; Lee, Hyo Kyung; Osarogiagbon, Raymond U.; Yu, Xinhua; Faris, Nick

    2015-01-01

    We introduce an example of a rigorous, quantitative method for quality improvement in lung cancer care delivery. Computer process modeling methods are introduced for the lung cancer diagnosis, staging and treatment selection process. Two types of process modeling techniques, discrete event simulation (DES) and analytical models, are briefly reviewed. Recent developments in DES are outlined, and the data and procedures necessary to develop a DES model of the lung cancer diagnosis process, leading up to surgical treatment, are summarized. The analytical models include both Markov chain models and closed formulas. Markov chain models and their application in healthcare are introduced, and the approach to deriving a lung cancer diagnosis process model is presented. Similarly, the procedure to derive closed formulas evaluating diagnosis process performance is outlined. Finally, the pros and cons of these methods are discussed. PMID:26380181
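
    To make the Markov chain idea concrete, the sketch below computes the expected number of steps from diagnosis stages to treatment as the row sums of the fundamental matrix N = (I - Q)^-1 of an absorbing chain; the stage definitions and transition probabilities are invented, not taken from the paper.

        import numpy as np

        # Hypothetical stages: 0 = referral, 1 = staging, 2 = multidisciplinary
        # review; the absorbing state (treatment start) is implicit. Q holds
        # per-step transition probabilities among the transient states.
        Q = np.array([[0.2, 0.7, 0.0],
                      [0.0, 0.3, 0.6],
                      [0.0, 0.1, 0.2]])

        # Fundamental matrix N = (I - Q)^-1; row sums give the expected number
        # of steps from each transient state until absorption.
        N = np.linalg.inv(np.eye(3) - Q)
        expected_steps = N.sum(axis=1)
        print(f"expected steps from referral to treatment: {expected_steps[0]:.2f}")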

  8. Development, optimization, and single laboratory validation of an event-specific real-time PCR method for the detection and quantification of Golden Rice 2 using a novel taxon-specific assay.

    PubMed

    Jacchia, Sara; Nardini, Elena; Savini, Christian; Petrillo, Mauro; Angers-Loustau, Alexandre; Shim, Jung-Hyun; Trijatmiko, Kurniawan; Kreysa, Joachim; Mazzara, Marco

    2015-02-18

    In this study, we developed, optimized, and in-house validated a real-time PCR method for the event-specific detection and quantification of Golden Rice 2, a genetically modified rice with provitamin A in the grain. We optimized and evaluated the performance of the taxon (targeting rice Phospholipase D α2 gene)- and event (targeting the 3' insert-to-plant DNA junction)-specific assays that compose the method as independent modules, using haploid genome equivalents as unit of measurement. We verified the specificity of the two real-time PCR assays and determined their dynamic range, limit of quantification, limit of detection, and robustness. We also confirmed that the taxon-specific DNA sequence is present in single copy in the rice genome and verified its stability of amplification across 132 rice varieties. A relative quantification experiment evidenced the correct performance of the two assays when used in combination.

  9. Element analysis: a wavelet-based method for analysing time-localized events in noisy time series

    PubMed Central

    2017-01-01

    A method is derived for the quantitative analysis of signals that are composed of superpositions of isolated, time-localized ‘events’. Here, these events are taken to be well represented as rescaled and phase-rotated versions of generalized Morse wavelets, a broad family of continuous analytic functions. Analysing a signal composed of replicates of such a function using another Morse wavelet allows one to directly estimate the properties of events from the values of the wavelet transform at its own maxima. The distribution of events in general power-law noise is determined in order to establish significance based on an expected false detection rate. Finally, an expression for an event’s ‘region of influence’ within the wavelet transform permits the formation of a criterion for rejecting spurious maxima due to numerical artefacts or other unsuitable events. Signals can then be reconstructed based on a small number of isolated points on the time/scale plane. This method, termed element analysis, is applied to the identification of long-lived eddy structures in ocean currents as observed by along-track measurements of sea surface elevation from satellite altimetry. PMID:28484325
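
    A bare-bones illustration of the transform-maxima idea, using an ordinary Morlet wavelet as a stand-in for the generalized Morse wavelets of the paper, and omitting the noise-distribution significance test and region-of-influence criterion.

        import numpy as np

        def morlet(t, scale, omega0=6.0):
            """Analytic Morlet wavelet (a common stand-in; the paper itself
            works with generalized Morse wavelets)."""
            x = t / scale
            return np.exp(1j * omega0 * x) * np.exp(-0.5 * x**2) / np.sqrt(scale)

        def cwt_maxima(signal, dt, scales):
            """Continuous wavelet transform by direct convolution; returns the
            modulus |W| and the (scale, time) index of its largest maximum."""
            n = len(signal)
            t = (np.arange(n) - n // 2) * dt
            W = np.array([np.convolve(signal, np.conj(morlet(t, s))[::-1], "same")
                          for s in scales])
            k = np.unravel_index(np.argmax(np.abs(W)), W.shape)
            return np.abs(W), k

        # Synthetic record: noise plus one time-localized oscillatory event.
        dt, n = 1.0, 512
        tt = np.arange(n) * dt
        sig = 0.3 * np.random.default_rng(0).standard_normal(n)
        sig += np.exp(-0.5 * ((tt - 300) / 12) ** 2) * np.cos(2 * np.pi * tt / 15)
        mod, (i_scale, i_time) = cwt_maxima(sig, dt, scales=np.arange(5, 40, 1.0))
        print(f"event detected near t = {tt[i_time]:.0f}")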

  10. Optimal Futility Interim Design: A Predictive Probability of Success Approach with Time-to-Event Endpoint.

    PubMed

    Tang, Zhongwen

    2015-01-01

    An analytical way to compute the predictive probability of success (PPOS), together with a credible interval, at interim analysis (IA) is developed for large clinical trials with time-to-event endpoints. The method takes into account the data fixed up to the IA, the amount of uncertainty in future data, and uncertainty about parameters. Predictive power is a special type of PPOS. The result is confirmed by simulation. An optimal design is proposed by finding the optimal combination of analysis time and futility cutoff based on PPOS criteria.
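
    For orientation, the sketch below evaluates the standard flat-prior predictive power formula from an interim z-statistic and information fraction; the paper's time-to-event development, and its credible intervals, go beyond this normal-approximation special case.

        from scipy.stats import norm

        def predictive_power(z_interim, info_fraction, alpha=0.025):
            """Predictive probability of success under a flat prior on the drift:
            PPOS = Phi((z_t / sqrt(t) - z_{1-alpha}) * sqrt(t / (1 - t))),
            where t is the information fraction at the interim look."""
            t = info_fraction
            z_crit = norm.ppf(1 - alpha)
            return norm.cdf((z_interim / t**0.5 - z_crit) * (t / (1 - t)) ** 0.5)

        # Interim look at 40% of the planned events with z = 1.5:
        print(f"PPOS = {predictive_power(1.5, 0.40):.2f}")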

  11. Assessment of Matrix Multiplication Learning with a Rule-Based Analytical Model--"A Bayesian Network Representation"

    ERIC Educational Resources Information Center

    Zhang, Zhidong

    2016-01-01

    This study explored an alternative assessment procedure to examine learning trajectories of matrix multiplication. It applied rule-based analytical and cognitive task analysis methods to break down the operation rules for a given matrix multiplication. Based on the analysis results, a hierarchical Bayesian network, an assessment model,…

  12. Bias Assessment of General Chemistry Analytes using Commutable Samples.

    PubMed

    Koerbin, Gus; Tate, Jillian R; Ryan, Julie; Jones, Graham Rd; Sikaris, Ken A; Kanowski, David; Reed, Maxine; Gill, Janice; Koumantakis, George; Yen, Tina; St John, Andrew; Hickman, Peter E; Simpson, Aaron; Graham, Peter

    2014-11-01

    Harmonisation of reference intervals for routine general chemistry analytes has been a goal for many years. Analytical bias may prevent this harmonisation. To determine whether analytical bias is present when comparing methods, commutable samples, i.e. samples that have the same properties as the clinical samples routinely analysed, should be used as reference samples to eliminate the possibility of matrix effects. The use of commutable samples has improved the identification of unacceptable analytical performance in the Netherlands and Spain. The International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) has undertaken a pilot study using commutable samples in an attempt not only to determine country-specific reference intervals but to make them comparable between countries. Australia and New Zealand, through the Australasian Association of Clinical Biochemists (AACB), have also undertaken an assessment of analytical bias using commutable samples and determined that, of the 27 general chemistry analytes studied, 19 showed between-method biases sufficiently small as not to prevent harmonisation of reference intervals. Application of evidence-based approaches, including the determination of analytical bias using commutable material, is necessary when seeking to harmonise reference intervals.

  13. No Impact of the Analytical Method Used for Determining Cystatin C on Estimating Glomerular Filtration Rate in Children.

    PubMed

    Alberer, Martin; Hoefele, Julia; Benz, Marcus R; Bökenkamp, Arend; Weber, Lutz T

    2017-01-01

    Measurement of inulin clearance is considered to be the gold standard for determining kidney function in children, but this method is time consuming and expensive. The glomerular filtration rate (GFR) is on the other hand easier to calculate by using various creatinine- and/or cystatin C (Cys C)-based formulas. However, for the determination of serum creatinine (Scr) and Cys C, different and non-interchangeable analytical methods exist. Given the fact that different analytical methods for the determination of creatinine and Cys C were used in order to validate existing GFR formulas, clinicians should be aware of the type used in their local laboratory. In this study, we compared GFR results calculated on the basis of different GFR formulas and either used Scr and Cys C values as determined by the analytical method originally employed for validation or values obtained by an alternative analytical method to evaluate any possible effects on the performance. Cys C values determined by means of an immunoturbidimetric assay were used for calculating the GFR using equations in which this analytical method had originally been used for validation. Additionally, these same values were then used in other GFR formulas that had originally been validated using a nephelometric immunoassay for determining Cys C. The effect of using either the compatible or the possibly incompatible analytical method for determining Cys C in the calculation of GFR was assessed in comparison with the GFR measured by creatinine clearance (CrCl). Unexpectedly, using GFR equations that employed Cys C values derived from a possibly incompatible analytical method did not result in a significant difference concerning the classification of patients as having normal or reduced GFR compared to the classification obtained on the basis of CrCl. Sensitivity and specificity were adequate. On the other hand, formulas using Cys C values derived from a compatible analytical method partly showed insufficient performance when compared to CrCl. Although clinicians should be aware of applying a GFR formula that is compatible with the locally used analytical method for determining Cys C and creatinine, other factors might be more crucial for the calculation of correct GFR values.

  14. Analysis of Phosphonic Acids: Validation of Semi-Volatile Analysis by HPLC-MS/MS by EPA Method MS999

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owens, J; Vu, A; Koester, C

    The Environmental Protection Agency's (EPA) Region 5 Chicago Regional Laboratory (CRL) developed a method titled Analysis of Diisopropyl Methylphosphonate, Ethyl Hydrogen Dimethylamidophosphate, Isopropyl Methylphosphonic Acid, Methylphosphonic Acid, and Pinacolyl Methylphosphonic Acid in Water by Multiple Reaction Monitoring Liquid Chromatography/Tandem Mass Spectrometry: EPA Version MS999. This draft standard operating procedure (SOP) was distributed to multiple EPA laboratories and to Lawrence Livermore National Laboratory, which was tasked to serve as a reference laboratory for EPA's Environmental Reference Laboratory Network (ERLN) and to develop and validate analytical procedures. The primary objective of this study was to validate and verify the analytical procedures described in EPA Method MS999 for analysis of the listed phosphonic acids and surrogates in aqueous samples. The gathered data from this validation study will be used to: (1) demonstrate analytical method performance; (2) generate quality control acceptance criteria; and (3) revise the SOP to provide a validated method that would be available for use during a homeland security event. The data contained in this report will be compiled, by EPA CRL, with data generated by other EPA Regional laboratories so that performance metrics of EPA Method MS999 can be determined.

  15. Pre-analytical and analytical variation of drug determination in segmented hair using ultra-performance liquid chromatography-tandem mass spectrometry.

    PubMed

    Nielsen, Marie Katrine Klose; Johansen, Sys Stybe; Linnet, Kristian

    2014-01-01

    Assessment of the total uncertainty of analytical methods for the measurement of drugs in human hair has mainly been derived from the analytical variation. However, in hair analysis several other sources of uncertainty contribute to the total uncertainty. Particularly in segmental hair analysis, pre-analytical variation associated with sampling and segmentation may be a significant factor in the assessment of the total uncertainty budget. The aim of this study was to develop and validate a method for the analysis of 31 common drugs in hair using ultra-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS), with focus on the assessment of both the analytical and pre-analytical sampling variations. The validated method was specific, accurate (80-120%), and precise (CV ≤ 20%) across a wide linear concentration range from 0.025 to 25 ng/mg for most compounds. The analytical variation was estimated to be less than 15% for almost all compounds. The method was successfully applied to 25 segmented hair specimens from deceased drug addicts showing a broad pattern of poly-drug use. The pre-analytical sampling variation was estimated from genuine duplicate measurements of two bundles of hair collected from each subject, after subtraction of the analytical component. For the most frequently detected analytes, the pre-analytical variation was estimated to be 26-69%. Thus, the pre-analytical variation was 3- to 7-fold larger than the analytical variation (7-13%) and hence the dominant component in the total variation (29-70%). The present study demonstrates the importance of including the pre-analytical variation in the assessment of the total uncertainty budget and in the setting of the 95% uncertainty interval (±2CVT). Excluding the pre-analytical sampling variation could significantly affect the interpretation of results from segmental hair analysis. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
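
    The decomposition used here is the usual addition of variance components; a one-line sketch, with figures chosen to fall in the range the abstract reports.

        import math

        def preanalytical_cv(cv_total, cv_analytical):
            """Variance components add: CV_T^2 = CV_A^2 + CV_PA^2, so the
            pre-analytical component is recovered by subtracting the analytical
            variance from the total (duplicate-bundle) variance."""
            return math.sqrt(cv_total**2 - cv_analytical**2)

        # Hypothetical figures in % CV, in the range reported in the abstract:
        cv_t, cv_a = 45.0, 10.0
        print(f"pre-analytical CV ~ {preanalytical_cv(cv_t, cv_a):.1f} %")
        print(f"95% uncertainty interval: +/- {2 * cv_t:.0f} %")  # +/- 2*CV_T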

  16. Analytic strategies to evaluate the association of time-varying exposures to HIV-related outcomes: Alcohol consumption as an example.

    PubMed

    Cook, Robert L; Kelso, Natalie E; Brumback, Babette A; Chen, Xinguang

    2016-01-01

    As persons with HIV are living longer, there is a growing need to investigate factors associated with chronic disease, rate of disease progression and survivorship. Many risk factors for this high-risk population change over time, such as participation in treatment, alcohol consumption and drug abuse. Longitudinal datasets are increasingly available, particularly clinical data that contain multiple observations of health exposures and outcomes over time. Several analytic options are available for assessment of longitudinal data; however, it can be challenging to choose the appropriate analytic method for specific combinations of research questions and types of data. The purpose of this review is to help researchers choose the appropriate methods to analyze longitudinal data, using alcohol consumption as an example of a time-varying exposure variable. When selecting the optimal analytic method, one must consider aspects of exposure (e.g. timing, pattern, and amount) and outcome (fixed or time-varying), while also addressing minimizing bias. In this article, we will describe several analytic approaches for longitudinal data, including developmental trajectory analysis, generalized estimating equations, and mixed effect models. For each analytic strategy, we describe appropriate situations to use the method and provide an example that demonstrates the use of the method. Clinical data related to alcohol consumption and HIV are used to illustrate these methods.
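
    As one concrete instance of the reviewed approaches, the sketch below fits a GEE with an exchangeable working correlation to synthetic repeated-measures data with a time-varying binary alcohol exposure; all variable names and data are fabricated for illustration.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        # Synthetic longitudinal data: repeated visits per person, a
        # time-varying exposure, and a binary outcome (all names hypothetical).
        rng = np.random.default_rng(42)
        n, visits = 200, 4
        pid = np.repeat(np.arange(n), visits)
        visit = np.tile(np.arange(visits), n)
        alcohol = rng.binomial(1, 0.3, size=n * visits)   # time-varying exposure
        subj = np.repeat(rng.normal(0, 0.5, n), visits)   # within-person correlation
        logit = -0.5 + 0.8 * alcohol - 0.1 * visit + subj
        y = rng.binomial(1, 1 / (1 + np.exp(-logit)))
        df = pd.DataFrame(dict(pid=pid, visit=visit, alcohol=alcohol, y=y))

        # GEE with an exchangeable working correlation handles repeated measures.
        res = smf.gee("y ~ alcohol + visit", groups="pid", data=df,
                      family=sm.families.Binomial(),
                      cov_struct=sm.cov_struct.Exchangeable()).fit()
        print(res.params)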

  17. Enabling fluorescent biosensors for the forensic identification of body fluids.

    PubMed

    Frascione, Nunzianda; Gooch, James; Daniel, Barbara

    2013-11-12

    The search for body fluids often forms a crucial element of many forensic investigations. Confirming fluid presence at a scene can not only support or refute the circumstantial claims of a victim, suspect or witness, but may additionally provide a valuable source of DNA for further identification purposes. However, current biological fluid testing techniques are impaired by a number of well-characterised limitations; they often give false positives, cannot be used simultaneously, are sample destructive and lack the ability to visually locate fluid depositions. These disadvantages can negatively affect the outcome of a case through missed or misinterpreted evidence. Biosensors are devices able to transduce a biological recognition event into a measurable signal, resulting in real-time analyte detection. The use of innovative optical sensing technology may enable the highly specific and non-destructive detection of biological fluid depositions through interaction with several fluid-endogenous biomarkers. Despite considerable impact in a variety of analytical disciplines, biosensor application within forensic analyses may be considered extremely limited. This article aims to explore a number of prospective biosensing mechanisms and to outline the challenges associated with their adaptation towards detection of fluid-specific analytes.

  18. Initial analysis of transient power time lag due to heterogeneity within the TREAT fuel matrix.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D.M. Wachs; A.X. Zabriskie, W.R. Marcum

    2014-06-01

    The topic of nuclear safety encompasses a broad spectrum of focal areas within the nuclear industry; one specific aspect centers on the performance and integrity of nuclear fuel during a reactivity insertion accident (RIA). This specific accident has proven fundamentally difficult to characterize theoretically due to the numerous empirically driven characteristics that quantify fuel and reactor performance. The Transient Reactor Test (TREAT) facility was designed and operated to better understand fuel behavior under extreme (i.e. accident) conditions; it was shut down in 1994. Recently, efforts have been underway to restart the TREAT facility to continue testing of advanced accident tolerant fuels (i.e. recently developed fuel concepts). To aid in the restart effort, new simulation tools are being used to investigate the behavior of nuclear fuels during the facility's transient events. This study focuses specifically on characterizing the modeled effects of fuel particles within the fuel matrix of TREAT. The objectives of this study were to (1) identify the impact of modeled heterogeneity within the fuel matrix during a transient event, and (2) demonstrate acceptable modeling processes for the purpose of TREAT safety analyses, specific to fuel matrix and particle size. Hypothetically, a fuel that is dominantly heterogeneous will demonstrate a clearly different temporal heating response from that of a modeled homogeneous fuel. This time difference is a result of the distinct thermal diffusivities of the fuel particle and the fuel matrix. Using MOOSE/BISON to simulate the temperature time-lag effect of fuel particle diameter during a transient event, a comparison of the average graphite moderator temperature surrounding a spherical particle of fuel was made for both types of fuel simulations. This comparison showed that, at a given time and with a specific fuel particle diameter, the fuel particle (heterogeneous) simulation and the homogeneous simulation were related by a multiplier relative to the average moderator temperature. As time increases, the multiplier is comparable to the factor found in a previous analytical study from the literature. The implementation of this multiplier and the method of analysis may be employed to remove assumptions and increase fidelity in future research on the effect of fuel particles during transient events.

  19. Event-by-Event Simulations of Early Gluon Fields in High Energy Nuclear Collisions

    NASA Astrophysics Data System (ADS)

    Nickel, Matthew; Rose, Steven; Fries, Rainer

    2017-09-01

    Collisions of heavy ions are carried out at ultra-relativistic speeds at the Relativistic Heavy Ion Collider and the Large Hadron Collider to create quark-gluon plasma. The earliest stages of such collisions are dominated by the dynamics of classical gluon fields, for which the McLerran-Venugopalan (MV) model of the color glass condensate provides a description. Previous research has provided an analytic solution for event-averaged observables in the MV model. Using the High Performance Research Computing Center (HPRC) at Texas A&M, we have developed a C++ code to explicitly calculate the initial gluon fields and the energy-momentum tensor event by event using the analytic recursive solution. The code has been tested against previously known analytic results up to fourth order. We have also been able to test the convergence of the recursive solution at high orders in time and have studied the time evolution of the color glass condensate.

  20. Molecular detection of Borrelia burgdorferi sensu lato – An analytical comparison of real-time PCR protocols from five different Scandinavian laboratories

    PubMed Central

    Faller, Maximilian; Wilhelmsson, Peter; Kjelland, Vivian; Andreassen, Åshild; Dargis, Rimtas; Quarsten, Hanne; Dessau, Ram; Fingerle, Volker; Margos, Gabriele; Noraas, Sølvi; Ornstein, Katharina; Petersson, Ann-Cathrine; Matussek, Andreas; Lindgren, Per-Eric; Henningsson, Anna J.

    2017-01-01

    Introduction Lyme borreliosis (LB) is the most common tick-transmitted disease in Europe. The diagnosis of LB today is based on the patient's medical history, clinical presentation and laboratory findings. The laboratory diagnostics are mainly based on antibody detection, but in certain conditions molecular detection by polymerase chain reaction (PCR) may serve as a complement. Aim The purpose of this study was to evaluate the analytical sensitivity, analytical specificity and concordance of eight different real-time PCR methods at five laboratories in Sweden, Norway and Denmark. Method Each participating laboratory was asked to analyse three different sets of blinded samples (reference panels): i) cDNA prepared from water spiked with cultured Borrelia strains, ii) cerebrospinal fluid spiked with cultured Borrelia strains, and iii) DNA dilution series extracted from cultured Borrelia and relapsing fever strains. The results and the method descriptions of each laboratory were systematically evaluated. Results and conclusions The analytical sensitivities and the concordance between the eight protocols were in general high. The concordance was especially high between the protocols using 16S rRNA as the target gene; however, this concordance was mainly related to cDNA as the type of template. When comparing cDNA and DNA as the type of template, the analytical sensitivity was in general higher for the protocols using DNA as template, regardless of the target gene used. The analytical specificity of all eight protocols was high. However, some protocols were not able to detect Borrelia spielmanii, Borrelia lusitaniae or Borrelia japonica. PMID:28937997

  1. Simultaneous targeted analysis of trimethylamine-N-oxide, choline, betaine, and carnitine by high performance liquid chromatography tandem mass spectrometry.

    PubMed

    Liu, Jia; Zhao, Mingming; Zhou, Juntuo; Liu, Changjie; Zheng, Lemin; Yin, Yuxin

    2016-11-01

    Trimethylamine-N-oxide (TMAO) is a metabolite generated from choline, betaine and carnitine in a gut microbiota-dependent way. This molecule is associated with the development of atherosclerosis and cardiovascular events. A sensitive liquid chromatography-electrospray ionization tandem mass spectrometry (LC-ESI-MS/MS) method has been developed and validated for the simultaneous determination of TMAO-related molecules, including TMAO, betaine, choline, and carnitine, in mouse plasma. Analytes are extracted after protein precipitation with methanol and subjected to LC-ESI-MS/MS without preliminary derivatization. Separation of the analytes was achieved on an amide column with acetonitrile-water as the mobile phase. The method was fully validated in terms of selectivity, linearity, sensitivity, precision, accuracy, and carryover effect, and the stability of the analytes under various conditions was confirmed. The developed method has been successfully applied to plasma samples from our mouse model. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Radioactive Waste Characterization Strategies; Comparisons Between AK/PK, Dose to Curie Modeling, Gamma Spectroscopy, and Laboratory Analysis Methods- 12194

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singledecker, Steven J.; Jones, Scotty W.; Dorries, Alison M.

    2012-07-01

    In the coming fiscal years of potentially declining budgets, Department of Energy facilities such as the Los Alamos National Laboratory (LANL) will be looking to reduce the cost of radioactive waste characterization, management, and disposal processes. At the core of this cost reduction process will be choosing the most cost-effective, efficient, and accurate methods of radioactive waste characterization. Central to every radioactive waste management program is an effective and accurate waste characterization program. Choosing between methods can determine what is classified as low level radioactive waste (LLRW), transuranic waste (TRU), waste that can be disposed of under an Authorized Release Limit (ARL), industrial waste, and waste that can be disposed of in municipal landfills. The cost benefits of an accurate radioactive waste characterization program cannot be overstated. In addition, inaccurate characterization can result in the incorrect classification of radioactive waste, leading to higher disposal costs, Department of Transportation (DOT) violations, Notices of Violation (NOVs) from Federal and State regulatory agencies, waste rejection from disposal facilities, loss of operational capabilities, and loss of disposal options. Any one of these events could result in the program that mischaracterized the waste losing its ability to perform its primary operational mission. Generators that produce radioactive waste have four characterization strategies at their disposal: acceptable knowledge/process knowledge (AK/PK); indirect characterization using a software application or other dose-to-curie methodologies; non-destructive analysis (NDA) tools such as gamma spectroscopy; and direct sampling (e.g. grab samples or surface-contaminated-object smears) with laboratory analysis. Each method has specific advantages and disadvantages. This paper evaluates each method, detailing those advantages and disadvantages, including: cost-benefit analysis (basic materials costs, overall program operations costs, man-hours per sample analyzed, etc.); radiation exposure As Low As Reasonably Achievable (ALARA) program considerations; industrial health and safety risks; and overall analytical confidence level. The concepts in this paper apply to any organization with significant radioactive waste characterization and management activities working within budget constraints and seeking to optimize their waste characterization strategies while reducing analytical costs. (authors)

  3. Approaching near real-time biosensing: microfluidic microsphere based biosensor for real-time analyte detection.

    PubMed

    Cohen, Noa; Sabhachandani, Pooja; Golberg, Alexander; Konry, Tania

    2015-04-15

    In this study we describe a simple lab-on-a-chip (LOC) biosensor approach utilizing a well-mixed microfluidic device and a microsphere-based assay capable of performing near real-time diagnostics of clinically relevant analytes such as cytokines and antibodies. We were able to overcome the adsorption kinetics reaction rate-limiting mechanism, which is diffusion-controlled in standard immunoassays, by introducing the microsphere-based assay into a well-mixed yet simple microfluidic device with turbulent flow profiles in the reaction regions. The integrated microsphere-based LOC device performs dynamic detection of the analyte in a minimal amount of biological specimen by continuously sampling microliter volumes of sample per minute to detect dynamic changes in target analyte concentration. Furthermore, we developed a mathematical model of the well-mixed reaction to describe the near real-time detection mechanism observed with the developed LOC method. To demonstrate the specificity and sensitivity of the developed real-time monitoring LOC approach, we applied the device to clinically relevant analytes: the cytokine Tumor Necrosis Factor (TNF)-α and its clinically used inhibitor, anti-TNF-α antibody. Based on the results reported herein, the developed LOC device provides a continuous, sensitive and specific near real-time monitoring method for analytes such as cytokines and antibodies, reduces reagent volumes by nearly three orders of magnitude and eliminates the washing steps required by standard immunoassays. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Nonlinear Analyte Concentration Gradients for One-Step Kinetic Analysis Employing Optical Microring Resonators

    PubMed Central

    Marty, Michael T.; Kuhnline Sloan, Courtney D.; Bailey, Ryan C.; Sligar, Stephen G.

    2012-01-01

    Conventional methods to probe the binding kinetics of macromolecules at biosensor surfaces employ a stepwise titration of analyte concentrations and measure the association and dissociation to the immobilized ligand at each concentration level. It has previously been shown that kinetic rates can be measured in a single step by monitoring binding as the analyte concentration increases over time in a linear gradient. We report here the application of nonlinear analyte concentration gradients for determining kinetic rates and equilibrium binding affinities in a single experiment. A versatile nonlinear gradient maker is presented, which is easily applied to microfluidic systems. Simulations validate that accurate kinetic rates can be extracted for a wide range of association and dissociation rates, gradient slopes and curvatures, and with models for mass transport. The nonlinear analyte gradient method is demonstrated with a silicon photonic microring resonator platform to measure prostate specific antigen-antibody binding kinetics. PMID:22686186

  5. Nonlinear analyte concentration gradients for one-step kinetic analysis employing optical microring resonators.

    PubMed

    Marty, Michael T; Sloan, Courtney D Kuhnline; Bailey, Ryan C; Sligar, Stephen G

    2012-07-03

    Conventional methods to probe the binding kinetics of macromolecules at biosensor surfaces employ a stepwise titration of analyte concentrations and measure the association and dissociation to the immobilized ligand at each concentration level. It has previously been shown that kinetic rates can be measured in a single step by monitoring binding as the analyte concentration increases over time in a linear gradient. We report here the application of nonlinear analyte concentration gradients for determining kinetic rates and equilibrium binding affinities in a single experiment. A versatile nonlinear gradient maker is presented, which is easily applied to microfluidic systems. Simulations validate that accurate kinetic rates can be extracted for a wide range of association and dissociation rates, gradient slopes, and curvatures, and with models for mass transport. The nonlinear analyte gradient method is demonstrated with a silicon photonic microring resonator platform to measure prostate specific antigen-antibody binding kinetics.
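
    To see why a single gradient run can identify both rate constants, one can forward-simulate the pseudo-first-order binding model under a time-varying concentration; a minimal sketch with hypothetical antigen-antibody rates (fitting k_on and k_off to such a curve is the inverse problem the papers address).

        import numpy as np
        from scipy.integrate import solve_ivp

        def binding_response(t_eval, k_on, k_off, r_max, conc):
            """Integrate db/dt = k_on * c(t) * (R_max - b) - k_off * b for a
            time-varying analyte concentration c(t) delivered by a gradient."""
            rhs = lambda t, b: k_on * conc(t) * (r_max - b) - k_off * b
            sol = solve_ivp(rhs, (t_eval[0], t_eval[-1]), [0.0], t_eval=t_eval)
            return sol.y[0]

        # Hypothetical nonlinear (quadratic) gradient reaching 10 nM over 600 s,
        # with illustrative rate constants (1/M/s and 1/s).
        t = np.linspace(0.0, 600.0, 301)
        c = lambda tt: 10e-9 * (tt / 600.0) ** 2  # mol/L
        b = binding_response(t, k_on=1e5, k_off=1e-3, r_max=1.0, conc=c)
        print(f"fractional occupancy at t = 600 s: {b[-1]:.3f}")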

  6. Characterizing student navigation in educational multiuser virtual environments: A case study using data from the River City project

    NASA Astrophysics Data System (ADS)

    Dukas, Georg

    Though research in emerging technologies is vital to fulfilling their incredible potential for educational applications, it is often fraught with analytic challenges related to large datasets. This thesis explores these challenges in researching multiuser virtual environments (MUVEs). In a MUVE, users assume a persona and traverse a virtual space often depicted as a physical world, interacting with other users and digital artifacts. As students participate in MUVE-based curricula, detailed records of their paths through the virtual world are typically collected in event logs. Although many studies have demonstrated the instructional power of MUVEs (e.g., Barab, Hay, Barnett, & Squire, 2001; Ketelhut, Dede, Clarke, Nelson, & Bowman, 2008), none have successfully quantified these student paths for analysis in the aggregate. This thesis constructs several frameworks for conducting research involving student navigational choices in MUVEs, based on a case study of data generated from the River City project. After providing a context for the research and an introduction to the River City dataset, the first part of this thesis explores the issues associated with data compression and presents a grounded theory approach (Glaser & Strauss, 1967) to the cleaning, compacting, and coding of MUVE datasets. In summarizing this section, I discuss the implications of preparation choices for further analysis. Second, two conceptually different approaches to analyzing behavioral sequences are investigated. For each approach, a theoretical context, a description of possible exploratory and confirmatory methods, and illustrative examples from River City are provided. The thesis then situates these specific analytic approaches within the constellation of possible research utilizing MUVE event log data. Finally, based on the lessons of River City and the investigation of a spectrum of possible event logs, a set of design heuristics for data collection in MUVEs is constructed and a possible future for research in these environments is envisioned.
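
    As a toy illustration of the cleaning-compacting-coding step described above, consecutive event-log samples can be collapsed into coded region visits; the log format and region names here are hypothetical, and the River City logs are far richer.

        from itertools import groupby

        raw_log = [  # (timestamp_s, region) samples for one student, invented
            (0, "dock"), (2, "dock"), (5, "street"), (7, "street"),
            (9, "hospital"), (15, "hospital"), (21, "street"), (30, "dock"),
        ]

        # Collapse consecutive samples in the same region into single visits.
        visits = [(region, len(list(group)))
                  for region, group in groupby(raw_log, key=lambda e: e[1])]
        path = [region for region, _ in visits]

        print(visits)             # [('dock', 2), ('street', 2), ('hospital', 2), ...]
        print(" -> ".join(path))  # dock -> street -> hospital -> street -> dock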

  7. Methodological Caveats in the Detection of Coordinated Replay between Place Cells and Grid Cells

    PubMed Central

    Trimper, John B.; Trettel, Sean G.; Hwaun, Ernie; Colgin, Laura Lee

    2017-01-01

    At rest, hippocampal “place cells,” neurons with receptive fields corresponding to specific spatial locations, reactivate in a manner that reflects recently traveled trajectories. These “replay” events have been proposed as a mechanism underlying memory consolidation, or the transfer of a memory representation from the hippocampus to neocortical regions associated with the original sensory experience. Accordingly, it has been hypothesized that hippocampal replay of a particular experience should be accompanied by simultaneous reactivation of corresponding representations in the neocortex and in the entorhinal cortex, the primary interface between the hippocampus and the neocortex. Recent studies have reported that coordinated replay may occur between hippocampal place cells and medial entorhinal cortex grid cells, cells with multiple spatial receptive fields. Assessing replay in grid cells is problematic, however, as the cells exhibit regularly spaced spatial receptive fields in all environments and, therefore, coordinated replay between place cells and grid cells may be detected by chance. In the present report, we adapted analytical approaches utilized in recent studies of grid cell and place cell replay to determine the extent to which coordinated replay is spuriously detected between grid cells and place cells recorded from separate rats. For a subset of the employed analytical methods, coordinated replay was detected spuriously in a significant proportion of cases in which place cell replay events were randomly matched with grid cell firing epochs of equal duration. More rigorous replay evaluation procedures and minimum spike count requirements greatly reduced the amount of spurious findings. These results provide insights into aspects of place cell and grid cell activity during rest that contribute to false detection of coordinated replay. The results further emphasize the need for careful controls and rigorous methods when testing the hypothesis that place cells and grid cells exhibit coordinated replay. PMID:28824388
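
    The control analysis the authors describe can be sketched as a shuffle test: pair place-cell replay events with grid-cell epochs that cannot be genuinely coordinated (for example, from a different animal) and measure how often the coordination score clears threshold anyway. The score, data shapes, and threshold below are illustrative assumptions, not the paper's exact pipeline.

        import numpy as np

        rng = np.random.default_rng(1)

        def replay_score(place_traj, grid_traj):
            # Correlation between positions decoded from the two populations.
            return np.corrcoef(place_traj, grid_traj)[0, 1]

        n_events, n_bins = 500, 20
        place = rng.uniform(0, 1, (n_events, n_bins))  # decoded place-cell paths
        grid = rng.uniform(0, 1, (n_events, n_bins))   # unrelated grid-cell paths

        scores = np.array([replay_score(p, g) for p, g in zip(place, grid)])
        false_pos = np.mean(np.abs(scores) > 0.5)      # assumed threshold
        print(f"spurious coordinated-replay rate: {false_pos:.1%}")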

  8. Selected field and analytical methods and analytical results in the Dutch Flats area, western Nebraska, 1995-99

    USGS Publications Warehouse

    Verstraeten, Ingrid M.; Steele, G.V.; Cannia, J.C.; Bohlke, J.K.; Kraemer, T.E.; Hitch, D.E.; Wilson, K.E.; Carnes, A.E.

    2001-01-01

    A study of the water resources of the Dutch Flats area in the western part of the North Platte Natural Resources District, western Nebraska, was conducted from 1995 through 1999 to describe the surface water and hydrogeology, the spatial distribution of selected water-quality constituents in surface and ground water, and the surface-water/ground-water interaction in selected areas. This report describes the selected field and analytical methods used in the study and selected analytical results from the study not previously published. Specifically, dissolved gases, age-dating data, and other isotopes collected as part of an intensive sampling effort in August and November 1998 and all uranium and uranium isotope data collected through the course of this study are included in the report.

  9. Optimization of the solvent-based dissolution method to sample volatile organic compound vapors for compound-specific isotope analysis.

    PubMed

    Bouchard, Daniel; Wanner, Philipp; Luo, Hong; McLoughlin, Patrick W; Henderson, James K; Pirkle, Robert J; Hunkeler, Daniel

    2017-10-20

    The methodology of the solvent-based dissolution method used to sample gas phase volatile organic compounds (VOC) for compound-specific isotope analysis (CSIA) was optimized to lower the method detection limits for TCE and benzene. The sampling methodology previously evaluated by [1] consists of pulling the air through a solvent to dissolve and accumulate the gaseous VOC. After the sampling process, the solvent can be treated similarly to groundwater samples to perform routine CSIA by diluting an aliquot of the solvent into water to reach the required concentration of the targeted contaminant. Among the solvents tested, tetraethylene glycol dimethyl ether (TGDE) showed the best aptitude for the method. TGDE has a high affinity for TCE and benzene, efficiently dissolving the compounds during their transit through the solvent. The method detection limit for TCE (5 ± 1 μg/m³) and benzene (1.7 ± 0.5 μg/m³) is lower when using TGDE compared to methanol, which was used previously (385 μg/m³ for TCE and 130 μg/m³ for benzene) [2]. The method detection limit refers to the minimal gas phase concentration in ambient air required to load sufficient VOC mass into TGDE to perform δ13C analysis. Due to a different analytical procedure, the method detection limit associated with δ37Cl analysis was found to be 156 ± 6 μg/m³ for TCE. Furthermore, the experimental results validated the relationship between the gas phase TCE concentration and the progressive accumulation of dissolved TCE in the solvent during the sampling process. Accordingly, based on the air-solvent partitioning coefficient, the sampling methodology (e.g. sampling rate, sampling duration, amount of solvent) and the final TCE concentration in the solvent, the concentration of TCE in the gas phase prevailing during the sampling event can be determined. Moreover, the possibility to analyse the TCE concentration in the solvent after sampling (or that of other targeted VOCs) allows field deployment of the sampling method without the need to determine the initial gas phase TCE concentration. The simplified field deployment approach of the solvent-based dissolution method, combined with the conventional analytical procedure used for groundwater samples, substantially facilitates the application of CSIA to gas phase studies. Copyright © 2017 Elsevier B.V. All rights reserved.
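
    The back-calculation described at the end of the abstract reduces to a mass balance over the sampled air volume. The sketch below uses invented sampling parameters and folds the partitioning behaviour into a single assumed trapping-efficiency term, whereas the paper derives the real relationship from the measured air-solvent partitioning coefficient.

        q_air = 0.5e-3     # sampling flow rate, m^3/min (assumed)
        t_sample = 120.0   # sampling duration, min (assumed)
        v_solvent = 20e-3  # solvent volume, L (assumed)
        c_solvent = 80.0   # measured TCE in solvent after sampling, ug/L (assumed)
        efficiency = 0.95  # trapping efficiency from partitioning (assumed)

        mass_trapped = c_solvent * v_solvent   # ug of TCE held in the solvent
        air_volume = q_air * t_sample          # m^3 of air pulled through
        c_gas = mass_trapped / (air_volume * efficiency)
        print(f"estimated gas-phase TCE: {c_gas:.0f} ug/m^3")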

  10. AN APPROACH TO METHODS DEVELOPMENT FOR HUMAN EXPOSURE ASSESSMENT STUDIES

    EPA Science Inventory

    Human exposure assessment studies require methods that are rapid, cost-effective and have a high sample through-put. The development of analytical methods for exposure studies should be based on specific information for individual studies. Human exposure studies suggest that di...

  11. Rapid analytical methods for on-site triage for traumatic brain injury.

    PubMed

    North, Stella H; Shriver-Lake, Lisa C; Taitt, Chris R; Ligler, Frances S

    2012-01-01

    Traumatic brain injury (TBI) results from an event that causes rapid acceleration and deceleration of the brain or penetration of the skull with an object. Responses to stimuli and questions, loss of consciousness, and altered behavior are symptoms currently used to justify brain imaging for diagnosis and therapeutic guidance. Tests based on such symptoms are susceptible to false-positive and false-negative results due to stress, fatigue, and medications. Biochemical markers of neuronal damage and the physiological response to that damage are being identified. Biosensors capable of rapid measurement of such markers in the circulation offer a solution for on-site triage, as long as three criteria are met: (a) Recognition reagents can be identified that are sufficiently sensitive and specific, (b) the biosensor can provide quantitative assessment of multiple markers rapidly and simultaneously, and (c) both the sensor and reagents are designed for use outside the laboratory.

  12. Rapid Analytical Methods for On-Site Triage for Traumatic Brain Injury

    NASA Astrophysics Data System (ADS)

    North, Stella H.; Shriver-Lake, Lisa C.; Taitt, Chris R.; Ligler, Frances S.

    2012-07-01

    Traumatic brain injury (TBI) results from an event that causes rapid acceleration and deceleration of the brain or penetration of the skull with an object. Responses to stimuli and questions, loss of consciousness, and altered behavior are symptoms currently used to justify brain imaging for diagnosis and therapeutic guidance. Tests based on such symptoms are susceptible to false-positive and false-negative results due to stress, fatigue, and medications. Biochemical markers of neuronal damage and the physiological response to that damage are being identified. Biosensors capable of rapid measurement of such markers in the circulation offer a solution for on-site triage, as long as three criteria are met: (a) Recognition reagents can be identified that are sufficiently sensitive and specific, (b) the biosensor can provide quantitative assessment of multiple markers rapidly and simultaneously, and (c) both the sensor and reagents are designed for use outside the laboratory.

  13. A multiplex degenerate PCR analytical approach targeting to eight genes for screening GMOs.

    PubMed

    Guo, Jinchao; Chen, Lili; Liu, Xin; Gao, Ying; Zhang, Dabing; Yang, Litao

    2012-06-01

    Detection methods with lower cost and higher throughput are currently the major trend in the screening of genetically modified (GM) food or feed prior to event-specific identification. In this study, we developed a quadruplex degenerate PCR screening approach covering more than 90 approved GMO events. The assay consists of four PCR systems targeting nine DNA sequences from eight trait genes widely introduced into GMOs, such as CP4-EPSPS derived from Agrobacterium tumefaciens sp. strain CP4, the phosphinothricin acetyltransferase genes derived from Streptomyces hygroscopicus (bar) and Streptomyces viridochromogenes (pat), and Cry1Ab, Cry1Ac, Cry1A(b/c), mCry3A, and Cry3Bb1 derived from Bacillus thuringiensis. The quadruplex degenerate PCR assay offers high specificity and sensitivity, with an absolute limit of detection (LOD) of approximately 80 target copies. Furthermore, the applicability of the quadruplex PCR assay was confirmed by screening several artificially prepared samples as well as samples from the Grain Inspection, Packers and Stockyards Administration (GIPSA) proficiency program. Copyright © 2011 Elsevier Ltd. All rights reserved.
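
    The "degenerate" part of such an assay is mechanical enough to show in a few lines: a primer written with IUPAC ambiguity codes stands for every concrete sequence obtained by expanding those codes. The primer below is made up and is not one of the paper's screening primers.

        from itertools import product

        IUPAC = {
            "A": "A", "C": "C", "G": "G", "T": "T",
            "R": "AG", "Y": "CT", "S": "GC", "W": "AT", "K": "GT", "M": "AC",
            "B": "CGT", "D": "AGT", "H": "ACT", "V": "ACG", "N": "ACGT",
        }

        def expand(primer):
            # Enumerate all concrete sequences encoded by a degenerate primer.
            return ["".join(p) for p in product(*(IUPAC[b] for b in primer))]

        variants = expand("GGRTCYAAN")  # hypothetical degenerate primer
        print(len(variants), "variants, e.g.", variants[:3])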

  14. When an event sparks behavior change: an introduction to the sentinel event method of dynamic model building and its application to emergency medicine.

    PubMed

    Boudreaux, Edwin D; Bock, Beth; O'Hea, Erin

    2012-03-01

    Experiencing a negative consequence related to one's health behavior, like a medical problem leading to an emergency department (ED) visit, can promote behavior change, giving rise to the popular concept of the "teachable moment." However, the mechanisms of action underlying this process of change have received scant attention. In particular, most existing health behavior theories are limited in explaining why such events can inspire short-term change in some and long-term change in others. Expanding on recommendations published in the 2009 Academic Emergency Medicine consensus conference on public health in emergency medicine (EM), we propose a new method for developing conceptual models that explain how negative events, like medical emergencies, influence behavior change, called the Sentinel Event Method. The method itself is atheoretical; instead, it defines steps to guide investigations that seek to relate specific consequences or events to specific health behaviors. This method can be used to adapt existing health behavior theories to study the event-behavior change relationship or to guide formulation of completely new conceptual models. This paper presents the tenets underlying the Sentinel Event Method, describes the steps comprising the process, and illustrates its application to EM through an example of a cardiac-related ED visit and tobacco use. © 2012 by the Society for Academic Emergency Medicine.

  15. Visualizing frequent patterns in large multivariate time series

    NASA Astrophysics Data System (ADS)

    Hao, M.; Marwah, M.; Janetzko, H.; Sharma, R.; Keim, D. A.; Dayal, U.; Patnaik, D.; Ramakrishnan, N.

    2011-01-01

    The detection of previously unknown, frequently occurring patterns in time series, often called motifs, has been recognized as an important task. However, it is difficult to discover and visualize these motifs as their numbers increase, especially in large multivariate time series. To find frequent motifs, we use several temporal data mining and event encoding techniques to cluster and convert a multivariate time series to a sequence of events. Then we quantify the efficiency of the discovered motifs by linking them with a performance metric. To visualize frequent patterns in a large time series with potentially hundreds of nested motifs on a single display, we introduce three novel visual analytics methods: (1) motif layout, using colored rectangles for visualizing the occurrences and hierarchical relationships of motifs in a multivariate time series, (2) motif distortion, for enlarging or shrinking motifs as appropriate for easy analysis and (3) motif merging, to combine a number of identical adjacent motif instances without cluttering the display. Analysts can interactively optimize the degree of distortion and merging to get the best possible view. A specific motif (e.g., the most efficient or least efficient motif) can be quickly detected from a large time series for further investigation. We have applied these methods to two real-world data sets: data center cooling and oil well production. The results provide important new insights into the recurring patterns.
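
    A toy version of the encode-then-count step makes the motif idea concrete: once the multivariate series has been converted to a symbol sequence, frequent motifs are simply the most common fixed-length windows. Real motif mining (and the nesting the authors visualize) is more elaborate; the sequence below is invented.

        from collections import Counter

        events = "ABABCABABCABABD"  # encoded event sequence (hypothetical)
        k = 3                       # motif length to count

        windows = Counter(events[i:i + k] for i in range(len(events) - k + 1))
        for motif, count in windows.most_common(3):
            print(f"motif {motif!r} occurs {count} times")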

  16. Fault tree analysis: NiH2 aerospace cells for LEO mission

    NASA Technical Reports Server (NTRS)

    Klein, Glenn C.; Rash, Donald E., Jr.

    1992-01-01

    The Fault Tree Analysis (FTA) is one of several reliability analyses or assessments applied to battery cells to be utilized in typical Electric Power Subsystems for spacecraft in low Earth orbit missions. FTA is generally the process of reviewing and analytically examining a system or equipment in such a way as to emphasize the lower level fault occurrences which directly or indirectly contribute to the major fault or top level event. This qualitative FTA addresses the potential of occurrence for five specific top level events: hydrogen leakage through either discrete leakage paths or through pressure vessel rupture; and four distinct modes of performance degradation - high charge voltage, suppressed discharge voltage, loss of capacity, and high pressure.
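
    Although the cited FTA is qualitative, the arithmetic behind a quantitative top event fits in a few lines: for an OR gate over independent basic events, the top-event probability is one minus the product of the complements. The event names and probabilities below are invented for illustration.

        basic_events = {                 # hypothetical basic-event probabilities
            "seal_leak": 1e-4,
            "vessel_rupture": 1e-6,
            "high_charge_voltage": 5e-5,
        }

        def or_gate(probs):
            # P(any occurs) = 1 - prod(1 - p_i), assuming independence.
            p_none = 1.0
            for p in probs:
                p_none *= 1.0 - p
            return 1.0 - p_none

        print(f"P(top event) = {or_gate(basic_events.values()):.2e}")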

  17. Aberrantly methylated DNA as a biomarker in breast cancer.

    PubMed

    Kristiansen, Søren; Jørgensen, Lars M; Guldberg, Per; Sölétormos, György

    2013-01-01

    Aberrant DNA hypermethylation at gene promoters is a frequent event in human breast cancer. Recent genome-wide studies have identified hundreds of genes that exhibit differential methylation between breast cancer cells and normal breast tissue. Due to the tumor-specific nature of DNA hypermethylation events, their use as tumor biomarkers is usually not hampered by analytical signals from normal cells, which is a general problem for existing protein tumor markers used for clinical assessment of breast cancer. There is accumulating evidence that DNA-methylation changes in breast cancer patients occur early during tumorigenesis. This may open up for effective screening, and analysis of blood or nipple aspirate may later help in diagnosing breast cancer. As a more detailed molecular characterization of different types of breast cancer becomes available, the ability to divide patients into subgroups based on DNA biomarkers may improve prognosis. Serial monitoring of DNA-methylation markers in blood during treatment may be useful, particularly when the cancer burden is below the detection level for standard imaging techniques. Overall, aberrant DNA methylation has a great potential as a versatile biomarker tool for screening, diagnosis, prognosis and monitoring of breast cancer. Standardization of methods and biomarker panels will be required to fully exploit this clinical potential.

  18. Group specific internal standard technology (GSIST) for simultaneous identification and quantification of small molecules

    DOEpatents

    Adamec, Jiri; Yang, Wen-Chu; Regnier, Fred E

    2014-01-14

    Reagents and methods are provided that permit simultaneous analysis of multiple diverse small molecule analytes present in a complex mixture. Samples are labeled with chemically identical but isotopically distinct forms of the labeling reagent and analyzed using mass spectrometry. A single reagent simultaneously derivatizes multiple small molecule analytes having different reactive functional groups.

  19. The importance of quality control in validating concentrations of contaminants of emerging concern in source and treated drinking water samples.

    PubMed

    Batt, Angela L; Furlong, Edward T; Mash, Heath E; Glassmeyer, Susan T; Kolpin, Dana W

    2017-02-01

    A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs, which allowed us to assess and compare the performance of the methods used. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method comparison analytes that were determined by two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds that were measured did not consistently meet predetermined quality standards. Methodologies that did not prove suitable for these analytes are reviewed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols. Published by Elsevier B.V.
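
    The spiked-recovery comparison at the heart of this QA/QC design is a one-line calculation; the concentrations and acceptance window below are hypothetical, since limits vary by analyte and program.

        def percent_recovery(measured_spiked, measured_unspiked, spike_added):
            return 100.0 * (measured_spiked - measured_unspiked) / spike_added

        rec = percent_recovery(measured_spiked=118.0,   # ng/L, spiked sample
                               measured_unspiked=22.0,  # ng/L, background
                               spike_added=100.0)       # ng/L added
        verdict = "pass" if 70.0 <= rec <= 130.0 else "fail"
        print(f"recovery = {rec:.0f}% -> {verdict}")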

  20. The Optimizer Topology Characteristics in Seismic Hazards

    NASA Astrophysics Data System (ADS)

    Sengor, T.

    2015-12-01

    The characteristic data of natural phenomena are examined in a topological space approach to illuminate whether there is an algorithm behind them that brings the physics of the phenomena to optimized states, even when they are hazards. The optimized code designing the hazard on a topological structure meshes with the metric of the phenomena. Deviations in the metric of different phenomena push and/or pull the folds of other suitable phenomena; for example, the metric of a specific phenomenon A may fit the metric of another specific phenomenon B after variation processes generated by the deviation of the metric of phenomenon A. Defining manifold processes covering the metric characteristics of each and every phenomenon is possible for all physical events, i.e., natural hazards. There are suitable folds in those manifold groups such that each subfold fits the metric characteristics of at least one natural hazard category. Some variation algorithms on those metric structures produce a gauge effect that brings about the long-term stability of the Earth over large-scale periods. The realization of that stability depends on some specific conditions, which are called optimized codes. The analytical basics of processes in topological structures are developed in [1]. The codes are generated according to the structures in [2]. Some optimized codes are derived for the seismicity of the NAF, beginning with the earthquakes of 1999. References: 1. Taner SENGOR, "Topological theory and analytical configuration for a universal community model," Procedia - Social and Behavioral Sciences, Vol. 81, pp. 188-194, 28 June 2013. 2. Taner SENGOR, "Seismic-Climatic-Hazardous Events Estimation Processes via the Coupling Structures in Conserving Energy Topologies of the Earth," The 2014 AGU Fall Meeting, Abstract no. 31374, USA.

  1. Dark matter in 3D

    DOE PAGES

    Alves, Daniele S. M.; El Hedri, Sonia; Wacker, Jay G.

    2016-03-21

    We discuss the relevance of directional detection experiments in the post-discovery era and propose a method to extract the local dark matter phase space distribution from directional data. The first feature of this method is a parameterization of the dark matter distribution function in terms of integrals of motion, which can be analytically extended to infer properties of the global distribution if certain equilibrium conditions hold. The second feature of our method is a decomposition of the distribution function in moments of a model independent basis, with minimal reliance on the ansatz for its functional form. We illustrate our method using the Via Lactea II N-body simulation as well as an analytical model for the dark matter halo. Furthermore, we conclude that O(1000) events are necessary to measure deviations from the Standard Halo Model and constrain or measure the presence of anisotropies.

  2. Analysis and synthesis of bianisotropic metasurfaces by using analytical approach based on equivalent parameters

    NASA Astrophysics Data System (ADS)

    Danaeifar, Mohammad; Granpayeh, Nosrat

    2018-03-01

    An analytical method is presented to analyze and synthesize bianisotropic metasurfaces. The equivalent parameters of metasurfaces in terms of meta-atom properties and other specifications of metasurfaces are derived. These parameters are related to electric, magnetic, and electromagnetic/magnetoelectric dipole moments of the bianisotropic media, and they can simplify the analysis of complicated and multilayer structures. A metasurface of split ring resonators is studied as an example demonstrating the proposed method. The optical properties of the meta-atom are explored, and the calculated polarizabilities are applied to find the reflection coefficient and the equivalent parameters of the metasurface. Finally, a structure consisting of two metasurfaces of the split ring resonators is provided, and the proposed analytical method is applied to derive the reflection coefficient. The validity of this analytical approach is verified by full-wave simulations which demonstrate good accuracy of the equivalent parameter method. This method can be used in the analysis and synthesis of bianisotropic metasurfaces with different materials and in different frequency ranges by considering electric, magnetic, and electromagnetic/magnetoelectric dipole moments.

  3. Ammonia Monitor

    NASA Technical Reports Server (NTRS)

    Sauer, Richard L. (Inventor); Akse, James R. (Inventor); Thompson, John O. (Inventor); Atwater, James E. (Inventor)

    1999-01-01

    An ammonia monitor and method of use are disclosed. Continuous, real-time determination of the concentration of ammonia in an aqueous process stream is possible over a wide dynamic range of concentrations. No reagents are required because pH is controlled by an in-line solid-phase base. Ammonia is selectively transported across a membrane from the process stream to an analytical stream under pH control. The specific electrical conductance of the analytical stream is measured and used to determine the concentration of ammonia.

  4. Reactive ion etched substrates and methods of making and using

    DOEpatents

    Rucker, Victor C [San Francisco, CA; Shediac, Rene [Oakland, CA; Simmons, Blake A [San Francisco, CA; Havenstrite, Karen L [New York, NY

    2007-08-07

    Disclosed herein are substrates comprising reactive ion etched surfaces and specific binding agents immobilized thereon. The substrates may be used in methods and devices for assaying or isolating analytes in a sample. Also disclosed are methods of making the reactive ion etched surfaces.

  5. EzyAmp signal amplification cascade enables isothermal detection of nucleic acid and protein targets.

    PubMed

    Linardy, Evelyn M; Erskine, Simon M; Lima, Nicole E; Lonergan, Tina; Mokany, Elisa; Todd, Alison V

    2016-01-15

    Advancements in molecular biology have improved the ability to characterize disease-related nucleic acids and proteins. Recently, there has been an increasing desire for tests that can be performed outside of centralised laboratories. This study describes a novel isothermal signal amplification cascade called EzyAmp (enzymatic signal amplification) that is being developed for detection of targets at the point of care. EzyAmp exploits the ability of some restriction endonucleases to cleave substrates containing nicks within their recognition sites. EzyAmp uses two oligonucleotide duplexes (partial complexes 1 and 2) that are initially cleavage-resistant because they lack a complete recognition site. The recognition site of partial complex 1 can be completed by hybridization of a triggering oligonucleotide (Driver Fragment 1) that is generated by a target-specific initiation event. Binding of Driver Fragment 1 generates a completed complex 1 which, upon cleavage, releases Driver Fragment 2. In turn, binding of Driver Fragment 2 to partial complex 2 creates completed complex 2, which, when cleaved, releases additional Driver Fragment 1. Each cleavage event separates a fluorophore-quencher pair, resulting in an increase in fluorescence. At this stage, the cascade of signal production becomes independent of further target-specific initiation events. This study demonstrated that the EzyAmp cascade can facilitate detection and quantification of nucleic acid targets with sensitivity down to aM concentrations. Further, the same cascade detected VEGF protein with a sensitivity of 20 nM, showing that this universal method for amplifying signal may be linked to the detection of different types of analytes in an isothermal format. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
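
    The feedback loop described above (cleaving complex 1 releases the driver for complex 2, and vice versa) can be caricatured with two coupled rate equations; the rate constant, pool sizes, and trigger amount below are invented, and the real cascade chemistry is more involved.

        import numpy as np
        from scipy.integrate import solve_ivp

        k = 0.001            # cleavage rate constant, 1/(nM*s), assumed
        c1_0 = c2_0 = 100.0  # initial partial complexes 1 and 2, nM, assumed

        def cascade(t, y):
            d1, d2, c1, c2 = y   # driver fragments, intact complexes (nM)
            r1 = k * d1 * c1     # complex 1 cleavage releases Driver Fragment 2
            r2 = k * d2 * c2     # complex 2 cleavage releases Driver Fragment 1
            return [r2, r1, -r1, -r2]

        y0 = [0.1, 0.0, c1_0, c2_0]  # 0.1 nM driver from target-specific initiation
        sol = solve_ivp(cascade, (0.0, 300.0), y0,
                        t_eval=np.linspace(0.0, 300.0, 6))
        for t, c1, c2 in zip(sol.t, sol.y[2], sol.y[3]):
            cleaved = (c1_0 - c1) + (c2_0 - c2)  # proportional to fluorescence
            print(f"t = {t:4.0f} s   cleaved complexes = {cleaved:6.1f} nM")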

  6. Response Time Analysis and Test of Protection System Instrument Channels for APR1400 and OPR1000

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Chang Jae; Han, Seung; Yun, Jae Hee

    2015-07-01

    Safety limits are required to maintain the integrity of the physical barriers designed to prevent the uncontrolled release of radioactive materials in nuclear power plants. The safety analysis establishes two critical constraints: an analytical limit in terms of a measured or calculated variable, and a specific time after the analytical limit is reached by which protective action must begin. In keeping with nuclear regulations and industry standards, satisfying these two requirements ensures that the safety limit will not be exceeded during a design basis event, whether an anticipated operational occurrence or a postulated accident. Various studies on the setpoint determination methodology for safety-related instrumentation have been actively performed to ensure that the requirement of the analytical limit is satisfied. In particular, the protection setpoint methodology for the advanced power reactor 1400 (APR1400) and the optimized power reactor 1000 (OPR1000) has recently been developed to cover both the design basis event and the beyond design basis event. The developed setpoint methodology has also been quantitatively validated using specific computer programs and setpoint calculations. However, the safety of nuclear power plants cannot be fully guaranteed by satisfying the requirement of the analytical limit alone. In spite of the response time verification requirements of nuclear regulations and industry standards, studies on a systematically integrated methodology for response time evaluation are hard to find. In the cases of APR1400 and OPR1000, the response time analysis for the plant protection system is partially included in the setpoint calculation, and the response time test is separately performed via a specific plant procedure. The test technique has the drawback that it is difficult to demonstrate the completeness of the timing test. The analysis technique also has the demerit of yielding extreme times that are not actually possible. Thus, a systematic response time evaluation methodology is needed to justify conformance to the response time requirement used in the safety analysis. This paper proposes a response time evaluation methodology for APR1400 and OPR1000 using a combined analysis and test technique to confirm that the plant protection system can meet the analytical response time assumed in the safety analysis. In addition, the results of the quantitative evaluation performed for APR1400 and OPR1000 are presented. The proposed response time analysis technique consists of defining the response time requirement, determining the critical signal path for the trip parameter, allocating an individual response time to each component on the signal path, and analyzing the total response time for the trip parameter; it demonstrates that the total analyzed response time does not exceed the response time requirement. The proposed response time test technique is composed of defining the response time requirement, determining the critical signal path for the trip parameter, determining the test method for each component on the signal path, and performing the response time test; it demonstrates that the total test result does not exceed the response time requirement. The total response time should be tested in a single test that covers from the sensor to the final actuation device on the instrument channel. When the total channel is not tested in a single test, separate tests on groups of components or single components covering the total instrument channel shall be combined to verify the total channel response. For APR1400 and OPR1000, the ramp test technique is used for the pressure and differential pressure transmitters, and the step function testing technique is applied to the signal processing equipment and final actuation device. As a result, it can be demonstrated that the response time requirement is satisfied by the combined analysis and test technique. Therefore, the methodology proposed in this paper plays a crucial role in guaranteeing the safety of nuclear power plants by systematically satisfying one of the two critical requirements from the safety analysis. (authors)
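
    The analysis half of the proposed methodology amounts to a response time budget along the critical signal path. A minimal sketch, with hypothetical components, allocations, and requirement:

        signal_path_ms = {                # allocated response times (ms), invented
            "pressure_transmitter": 200,
            "signal_processing": 150,
            "bistable_trip_logic": 50,
            "final_actuation_device": 300,
        }
        requirement_ms = 900              # assumed analytical response time limit

        total = sum(signal_path_ms.values())
        margin = requirement_ms - total
        status = "meets" if margin >= 0 else "violates"
        print(f"total analyzed response time: {total} ms "
              f"({status} requirement, margin {margin} ms)")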

  7. Electromigrative separation techniques in forensic science: combining selectivity, sensitivity, and robustness.

    PubMed

    Posch, Tjorben Nils; Pütz, Michael; Martin, Nathalie; Huhn, Carolin

    2015-01-01

    In this review we introduce the advantages and limitations of electromigrative separation techniques in forensic toxicology. We thus present a summary of illustrative studies and our own experience in the field together with established methods from the German Federal Criminal Police Office rather than a complete survey. We focus on the analytical aspects of analytes' physicochemical characteristics (e.g. polarity, stereoisomers) and analytical challenges including matrix tolerance, separation from compounds present in large excess, sample volumes, and orthogonality. For these aspects we want to reveal the specific advantages over more traditional methods. Both detailed studies and profiling and screening studies are taken into account. Care was taken to nearly exclusively document well-validated methods outstanding for the analytical challenge discussed. Special attention was paid to aspects exclusive to electromigrative separation techniques, including the use of the mobility axis, the potential for on-site instrumentation, and the capillary format for immunoassays. The review concludes with an introductory guide to method development for different separation modes, presenting typical buffer systems as starting points for different analyte classes. The objective of this review is to provide an orientation for users in separation science considering using capillary electrophoresis in their laboratory in the future.

  8. Highly Accurate Analytical Approximate Solution to a Nonlinear Pseudo-Oscillator

    NASA Astrophysics Data System (ADS)

    Wu, Baisheng; Liu, Weijia; Lim, C. W.

    2017-07-01

    A second-order Newton method is presented to construct analytical approximate solutions to a nonlinear pseudo-oscillator in which the restoring force is inversely proportional to the dependent variable. The nonlinear equation is first expressed in a specific form, and it is then solved in two steps, a predictor and a corrector step. In each step, the harmonic balance method is used in an appropriate manner to obtain a set of linear algebraic equations. With only one simple second-order Newton iteration step, a short, explicit, and highly accurate analytical approximate solution can be derived. The approximate solutions are valid for all amplitudes of the pseudo-oscillator. Furthermore, the method incorporates second-order Taylor expansion in a natural way, and it is of significant faster convergence rate.

  9. Sampling and physico-chemical analysis of precipitation: a review.

    PubMed

    Krupa, Sagar V

    2002-01-01

    Wet deposition is one of two processes governing the transfer of beneficial and toxic chemicals from the atmosphere onto surfaces. Since the early 1970s, numerous investigators have sampled and analyzed precipitation for its chemical constituents, in the context of "acidic rain" and related atmospheric processes. Since then, significant advances have been made in our understanding of how to sample rain, cloud, and fog water to preserve their physico-chemical integrity prior to analysis. Since the 1970s, large-scale precipitation sampling networks have been in operation to broadly address regional and multi-regional issues. However, in examining the results from such efforts at a site-specific level, concerns have been raised about the accuracy and precision of the information gathered. There is mounting evidence demonstrating the instability of precipitation samples (e.g. with N species) that have been subjected to prolonged ambient or field conditions. At the present time, precipitation sampling procedures allow unrefrigerated or refrigerated collection of wet deposition from individual events, sequential fractions within events, in situ continuous chemical analyses in the field, and even sampling of single or individual rain, cloud, and fog droplets. Similarly, analytical procedures for precipitation composition have advanced from time-consuming methods to rapid and simultaneous analyses of major anions and cations, from bulk samples to single droplets. For example, analytical techniques have evolved from colorimetry to ion chromatography to capillary electrophoresis. Overall, these advances allow a better understanding of heterogeneous reactions and of atmospheric pollutant scavenging processes by precipitation. In addition, from an environmental perspective, these advances allow better quantification of semi-labile species (e.g. NH4+, whose deposition values are frequently underestimated) or labile species [e.g. S (IV)] in precipitation, and measurements of toxic chemicals such as Hg and PCBs (polychlorinated biphenyls). Similarly, methods now exist for source-receptor studies, using, for example, the characterization of reduced elemental states and/or stable isotopes in precipitation as tracers. Future studies on the relationship between atmospheric deposition and environmental impacts must exploit these advances. This review provides a comprehensive and comparative treatment of state-of-the-art sampling methods for precipitation and its physico-chemical analysis.

  10. Reference materials for cellular therapeutics.

    PubMed

    Bravery, Christopher A; French, Anna

    2014-09-01

    The development of cellular therapeutics (CTP) takes place over many years, and, where successful, the developer will anticipate the product to be in clinical use for decades. Successful demonstration of manufacturing and quality consistency is dependent on the use of complex analytical methods; thus, the risk of process and method drift over time is high. The use of reference materials (RM) is an established scientific principle and as such also a regulatory requirement. The various uses of RM in the context of CTP manufacturing and quality are discussed, along with why they are needed for living cell products and the analytical methods applied to them. Relatively few consensus RM exist that are suitable for even common methods used by CTP developers, such as flow cytometry. Others have also identified this need and made proposals; however, great care will be needed to ensure any consensus RM that result are fit for purpose. Such consensus RM probably will need to be applied to specific standardized methods, and the idea that a single RM can have wide applicability is challenged. Written standards, including standardized methods, together with appropriate measurement RM are probably the most appropriate way to define specific starting cell types. The characteristics of a specific CTP will to some degree deviate from those of the starting cells; consequently, a product RM remains the best solution where feasible. Each CTP developer must consider how and what types of RM should be used to ensure the reliability of their own analytical measurements. Copyright © 2014 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.

  11. Comparison of real-time PCR methods for the detection of Naegleria fowleri in surface water and sediment.

    PubMed

    Streby, Ashleigh; Mull, Bonnie J; Levy, Karen; Hill, Vincent R

    2015-05-01

    Naegleria fowleri is a thermophilic free-living ameba found in freshwater environments worldwide. It is the cause of a rare but potentially fatal disease in humans known as primary amebic meningoencephalitis. Established N. fowleri detection methods rely on conventional culture techniques and morphological examination followed by molecular testing. Multiple alternative real-time PCR assays have been published for rapid detection of Naegleria spp. and N. fowleri. Four such assays were evaluated for the detection of N. fowleri from surface water and sediment. The assays were compared for thermodynamic stability, analytical sensitivity and specificity, detection limits, humic acid inhibition effects, and performance with seeded environmental matrices. Twenty-one ameba isolates were included in the DNA panel used for analytical sensitivity and specificity analyses. N. fowleri genotypes I and III were used for method performance testing. Two of the real-time PCR assays were determined to yield similar performance data for specificity and sensitivity for detecting N. fowleri in environmental matrices.

  12. Comparison of real-time PCR methods for the detection of Naegleria fowleri in surface water and sediment

    PubMed Central

    Streby, Ashleigh; Mull, Bonnie J.; Levy, Karen

    2015-01-01

    Naegleria fowleri is a thermophilic free-living ameba found in freshwater environments worldwide. It is the cause of a rare but potentially fatal disease in humans known as primary amebic meningoencephalitis. Established N. fowleri detection methods rely on conventional culture techniques and morphological examination followed by molecular testing. Multiple alternative real-time PCR assays have been published for rapid detection of Naegleria spp. and N. fowleri. Four such assays were evaluated for the detection of N. fowleri from surface water and sediment. The assays were compared for thermodynamic stability, analytical sensitivity and specificity, detection limits, humic acid inhibition effects, and performance with seeded environmental matrices. Twenty-one ameba isolates were included in the DNA panel used for analytical sensitivity and specificity analyses. N. fowleri genotypes I and III were used for method performance testing. Two of the real-time PCR assays were determined to yield similar performance data for specificity and sensitivity for detecting N. fowleri in environmental matrices. PMID:25855343

  13. Isotope dilution liquid chromatography - mass spectrometry methods for fat- and water-soluble vitamins in nutritional formulations.

    PubMed

    Phinney, Karen W; Rimmer, Catherine A; Thomas, Jeanice Brown; Sander, Lane C; Sharpless, Katherine E; Wise, Stephen A

    2011-01-01

    Vitamins are essential to human health, and dietary supplements containing vitamins are widely used by individuals hoping to ensure they have adequate intake of these important nutrients. Measurement of vitamins in nutritional formulations is necessary to monitor regulatory compliance and in studies examining the nutrient intake of specific populations. Liquid chromatographic methods, primarily with UV absorbance detection, are well established for both fat- and water-soluble measurements, but they do have limitations for certain analytes and may suffer from a lack of specificity in complex matrices. Liquid chromatography-mass spectrometry (LC-MS) provides both sensitivity and specificity for the determination of vitamins in these matrices, and simultaneous analysis of multiple vitamins in a single analysis is often possible. In this work, LC-MS methods were developed for both fat- and water-soluble vitamins and applied to the measurement of these analytes in two NIST Standard Reference Materials. When possible, stable isotope labeled internal standards were employed for quantification.
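
    In its simplest single-point form, isotope dilution quantification follows from the analyte/internal-standard peak-area ratio and the known amount of labeled standard spiked into the sample. The numbers below are hypothetical, and real SRM value assignment relies on calibration curves and bracketing rather than one-point arithmetic.

        def isotope_dilution_conc(area_analyte, area_label, conc_label, rrf=1.0):
            # rrf: relative response factor between analyte and labeled
            # standard, determined from calibration mixes (assumed 1.0 here).
            return conc_label * (area_analyte / area_label) / rrf

        c = isotope_dilution_conc(area_analyte=4.2e6, area_label=3.5e6,
                                  conc_label=10.0)  # ug/g labeled vitamin spiked
        print(f"analyte concentration = {c:.1f} ug/g")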

  14. 42 CFR 493.959 - Immunohematology.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... challenges per testing event a program must provide for each analyte or test procedure is five. Analyte or... Compatibility testing Antibody identification (d) Evaluation of a laboratory's analyte or test performance. HHS... program must compare the laboratory's response for each analyte with the response that reflects agreement...

  15. 42 CFR 493.959 - Immunohematology.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... challenges per testing event a program must provide for each analyte or test procedure is five. Analyte or... Compatibility testing Antibody identification (d) Evaluation of a laboratory's analyte or test performance. HHS... program must compare the laboratory's response for each analyte with the response that reflects agreement...

  16. 42 CFR 493.959 - Immunohematology.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... challenges per testing event a program must provide for each analyte or test procedure is five. Analyte or... Compatibility testing Antibody identification (d) Evaluation of a laboratory's analyte or test performance. HHS... program must compare the laboratory's response for each analyte with the response that reflects agreement...

  17. 42 CFR 493.959 - Immunohematology.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... challenges per testing event a program must provide for each analyte or test procedure is five. Analyte or... Compatibility testing Antibody identification (d) Evaluation of a laboratory's analyte or test performance. HHS... program must compare the laboratory's response for each analyte with the response that reflects agreement...

  18. Proposal for a model to assess the effect of seismic activity on the triggering of debris flows

    NASA Astrophysics Data System (ADS)

    Vidar Vangelsten, Bjørn; Liu, Zhongqiang; Eidsvig, Unni; Luna, Byron Quan; Nadim, Farrokh

    2013-04-01

    Landslides triggered by earthquakes are a serious threat for many communities around the world and in some cases are known to have caused 25-50% of earthquake fatalities. Seismic shaking can contribute to the triggering of debris flows either during the seismic event or indirectly, by increasing the susceptibility of the slope to debris flow during intense rainfall in the period after the seismic event. The paper proposes a model to quantify both effects. The model is based on an infinite slope formulation in which precipitation and earthquakes influence slope stability as follows: (1) During the shaking, the factor of safety is reduced due to cyclic pore pressure build-up, where the cyclic pore pressure is modelled as a function of earthquake duration and intensity (measured as the number of equivalent shear stress cycles and the cyclic shear stress magnitude) and in-situ soil conditions (measured as the average normalised shear stress). The model is calibrated using cyclic triaxial and direct simple shear (DSS) test data on clay and sand. (2) After the shaking, the factor of safety is modified using a combined empirical and analytical model that links observed earthquake-induced changes in rainfall thresholds for the triggering of debris flows to an equivalent reduction in soil shear strength. The empirical part uses data from past earthquakes to propose a conceptual model linking a site-specific reduction factor for the rainfall intensity threshold (needed to trigger debris flows) to earthquake magnitude, distance from the epicentre, and time period after the earthquake. The analytical part is a hydrological model for transient rainfall infiltration into an infinite slope, used to translate the change in rainfall intensity threshold into an equivalent reduction in soil shear strength. This is generalised into a functional form giving a site-specific shear strength reduction factor as a function of earthquake history and soil conditions. The model is suitable for hazard and risk assessment at local and regional scales for earthquake- and rainfall-induced landslides. The research leading to these results has received funding from the European Community's Seventh Framework Programme [FP7/2007-2013] under grant agreement No 265138 New Multi-HAzard and MulTi-RIsK Assessment MethodS for Europe (MATRIX).
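
    The building block of such a model is the infinite-slope factor of safety, in which an earthquake-induced pore-pressure increment directly erodes the resisting term. The soil parameters and pore-pressure values below are illustrative assumptions.

        import math

        def factor_of_safety(c_eff, phi_deg, gamma, z, beta_deg, u):
            # c_eff: effective cohesion (kPa); phi: friction angle; gamma:
            # unit weight (kN/m^3); z: slip depth (m); beta: slope angle;
            # u: pore pressure on the slip surface (kPa).
            beta, phi = math.radians(beta_deg), math.radians(phi_deg)
            sigma_n = gamma * z * math.cos(beta) ** 2           # normal stress
            tau = gamma * z * math.sin(beta) * math.cos(beta)   # driving stress
            return (c_eff + (sigma_n - u) * math.tan(phi)) / tau

        fs_static = factor_of_safety(5.0, 32.0, 19.0, 2.0, 35.0, u=0.0)
        fs_shaking = factor_of_safety(5.0, 32.0, 19.0, 2.0, 35.0, u=12.0)
        print(f"FS static = {fs_static:.2f}, "
              f"FS with cyclic pore pressure = {fs_shaking:.2f}")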

  19. [Enzymatic analysis of the quality of foodstuffs].

    PubMed

    Kolesnov, A Iu

    1997-01-01

    Enzymatic analysis is an independent and separate branch of enzymology and analytical chemistry, and it has become one of the most important methodologies used in food analysis. Enzymatic analysis allows the quick, reliable determination of many food ingredients. Often these constituents cannot be determined by conventional methods or, if methods are available, can be determined only with limited accuracy. Today, methods of enzymatic analysis are increasingly used in the investigation of foodstuffs. Enzymatic measurement techniques are used in industrial, scientific, and food inspection laboratories for quality analysis. This article describes the requirements of an optimal analytical method: specificity, sample preparation, assay performance, precision, sensitivity, time requirements, analysis cost, and reagent safety.

  20. Addressing the Analytic Challenges of Cross-Sectional Pediatric Pneumonia Etiology Data.

    PubMed

    Hammitt, Laura L; Feikin, Daniel R; Scott, J Anthony G; Zeger, Scott L; Murdoch, David R; O'Brien, Katherine L; Deloria Knoll, Maria

    2017-06-15

    Despite tremendous advances in diagnostic laboratory technology, identifying the pathogen(s) causing pneumonia remains challenging because the infected lung tissue cannot usually be sampled for testing. Consequently, to obtain information about pneumonia etiology, clinicians and researchers test specimens distant to the site of infection. These tests may lack sensitivity (eg, blood culture, which is only positive in a small proportion of children with pneumonia) and/or specificity (eg, detection of pathogens in upper respiratory tract specimens, which may indicate asymptomatic carriage or a less severe syndrome, such as upper respiratory infection). While highly sensitive nucleic acid detection methods and testing of multiple specimens improve sensitivity, multiple pathogens are often detected and this adds complexity to the interpretation as the etiologic significance of results may be unclear (ie, the pneumonia may be caused by none, one, some, or all of the pathogens detected). Some of these challenges can be addressed by adjusting positivity rates to account for poor sensitivity or incorporating test results from controls without pneumonia to account for poor specificity. However, no classical analytic methods can account for measurement error (ie, sensitivity and specificity) for multiple specimen types and integrate the results of measurements for multiple pathogens to produce an accurate understanding of etiology. We describe the major analytic challenges in determining pneumonia etiology and review how the common analytical approaches (eg, descriptive, case-control, attributable fraction, latent class analysis) address some but not all challenges. We demonstrate how these limitations necessitate a new, integrated analytical approach to pneumonia etiology data. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.
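
    One of the classical adjustments mentioned above, correcting an apparent positivity rate for imperfect sensitivity and specificity, is the Rogan-Gladen estimator; the numbers below are hypothetical.

        def adjusted_prevalence(apparent, sensitivity, specificity):
            est = (apparent + specificity - 1.0) / (sensitivity + specificity - 1.0)
            return min(max(est, 0.0), 1.0)  # clamp to the unit interval

        p = adjusted_prevalence(apparent=0.30, sensitivity=0.85, specificity=0.95)
        print(f"adjusted prevalence = {p:.3f}  (apparent was 0.300)")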

  1. Addressing the Analytic Challenges of Cross-Sectional Pediatric Pneumonia Etiology Data

    PubMed Central

    Feikin, Daniel R.; Scott, J. Anthony G.; Zeger, Scott L.; Murdoch, David R.; O’Brien, Katherine L.; Deloria Knoll, Maria

    2017-01-01

    Despite tremendous advances in diagnostic laboratory technology, identifying the pathogen(s) causing pneumonia remains challenging because the infected lung tissue cannot usually be sampled for testing. Consequently, to obtain information about pneumonia etiology, clinicians and researchers test specimens distant to the site of infection. These tests may lack sensitivity (eg, blood culture, which is only positive in a small proportion of children with pneumonia) and/or specificity (eg, detection of pathogens in upper respiratory tract specimens, which may indicate asymptomatic carriage or a less severe syndrome, such as upper respiratory infection). While highly sensitive nucleic acid detection methods and testing of multiple specimens improve sensitivity, multiple pathogens are often detected and this adds complexity to the interpretation as the etiologic significance of results may be unclear (ie, the pneumonia may be caused by none, one, some, or all of the pathogens detected). Some of these challenges can be addressed by adjusting positivity rates to account for poor sensitivity or incorporating test results from controls without pneumonia to account for poor specificity. However, no classical analytic methods can account for measurement error (ie, sensitivity and specificity) for multiple specimen types and integrate the results of measurements for multiple pathogens to produce an accurate understanding of etiology. We describe the major analytic challenges in determining pneumonia etiology and review how the common analytical approaches (eg, descriptive, case-control, attributable fraction, latent class analysis) address some but not all challenges. We demonstrate how these limitations necessitate a new, integrated analytical approach to pneumonia etiology data. PMID:28575372

  2. A mass spectrometry primer for mass spectrometry imaging

    PubMed Central

    Rubakhin, Stanislav S.; Sweedler, Jonathan V.

    2011-01-01

    Mass spectrometry imaging (MSI), a rapidly growing subfield of chemical imaging, employs mass spectrometry (MS) technologies to create single- and multi-dimensional localization maps for a variety of atoms and molecules. Complementary to other imaging approaches, MSI provides high chemical specificity and broad analyte coverage. This powerful analytical toolset is capable of measuring the distribution of many classes of inorganics, metabolites, proteins and pharmaceuticals in chemically and structurally complex biological specimens in vivo, in vitro, and in situ. The MSI approaches highlighted in this Methods in Molecular Biology volume provide flexibility of detection, characterization, and identification of multiple known and unknown analytes. The goal of this chapter is to introduce investigators who may be unfamiliar with MS to the basic principles of the mass spectrometric approaches as used in MSI. In addition to guidelines for choosing the most suitable MSI method for specific investigations, cross-references are provided to the chapters in this volume that describe the appropriate experimental protocols. PMID:20680583

  3. Processing mode during repetitive thinking in socially anxious individuals: evidence for a maladaptive experiential mode.

    PubMed

    Wong, Quincy J J; Moulds, Michelle L

    2012-12-01

    Evidence from the depression literature suggests that an analytical processing mode adopted during repetitive thinking leads to maladaptive outcomes relative to an experiential processing mode. To date, in socially anxious individuals, the impact of processing mode during repetitive thinking related to an actual social-evaluative situation has not been investigated. We thus tested whether an analytical processing mode would be maladaptive relative to an experiential processing mode during anticipatory processing and post-event rumination. High and low socially anxious participants were induced to engage in either an analytical or experiential processing mode during: (a) anticipatory processing before performing a speech (Experiment 1; N = 94), or (b) post-event rumination after performing a speech (Experiment 2; N = 74). Mood, cognition, and behavioural measures were employed to examine the effects of processing mode. For high socially anxious participants, the modes had a similar effect on self-reported anxiety during both anticipatory processing and post-event rumination. Unexpectedly, relative to the analytical mode, the experiential mode led to stronger high standard and conditional beliefs during anticipatory processing, and stronger unconditional beliefs during post-event rumination. These experiments are the first to investigate processing mode during anticipatory processing and post-event rumination. Hence, these results are novel and will need to be replicated. These findings suggest that an experiential processing mode is maladaptive relative to an analytical processing mode during repetitive thinking characteristic of socially anxious individuals. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. Methods for detection of GMOs in food and feed.

    PubMed

    Marmiroli, Nelson; Maestri, Elena; Gullì, Mariolina; Malcevschi, Alessio; Peano, Clelia; Bordoni, Roberta; De Bellis, Gianluca

    2008-10-01

    This paper reviews aspects relevant to detection and quantification of genetically modified (GM) material within the feed/food chain. The GM crop regulatory framework at the international level is evaluated with reference to traceability and labelling. Current analytical methods for the detection, identification, and quantification of transgenic DNA in food and feed are reviewed. These methods include quantitative real-time PCR, multiplex PCR, and multiplex real-time PCR. Particular attention is paid to methods able to identify multiple GM events in a single reaction and to the development of microdevices and microsensors, though they have not been fully validated for application.

  5. Molecular counting of membrane receptor subunits with single-molecule localization microscopy

    NASA Astrophysics Data System (ADS)

    Krüger, Carmen; Fricke, Franziska; Karathanasis, Christos; Dietz, Marina S.; Malkusch, Sebastian; Hummer, Gerhard; Heilemann, Mike

    2017-02-01

    We report on quantitative single-molecule localization microscopy, a method that, in addition to super-resolved images of cellular structures, provides information on protein copy numbers in protein clusters. This approach is based on the analysis of the blinking cycles of single fluorophores and on a model-free description of the distribution of the number of blinking events. We describe the experimental and analytical procedures, present cellular data for plasma membrane proteins, and discuss the applicability of this method.

  6. GMOMETHODS: the European Union database of reference methods for GMO analysis.

    PubMed

    Bonfini, Laura; Van den Bulcke, Marc H; Mazzara, Marco; Ben, Enrico; Patak, Alexandre

    2012-01-01

    In order to provide reliable and harmonized information on methods for GMO (genetically modified organism) analysis, we have published a database called "GMOMETHODS" that supplies information on PCR assays validated according to the principles and requirements of ISO 5725 and/or the International Union of Pure and Applied Chemistry protocol. In addition, the database contains methods that have been verified by the European Union Reference Laboratory for Genetically Modified Food and Feed in the context of compliance with a European Union legislative act. The web application provides search capabilities to retrieve primer and probe sequence information on the available methods. It further supplies core data required by analytical labs to carry out GM tests and comprises information on the applied reference material and plasmid standards. The GMOMETHODS database currently contains 118 different PCR methods allowing identification of 51 single GM events and 18 taxon-specific genes in a sample. It also provides screening assays for detection of eight different genetic elements commonly used for the development of GMOs. The application is referred to by the Biosafety Clearing House, a global mechanism set up by the Cartagena Protocol on Biosafety to facilitate the exchange of information on Living Modified Organisms. The publication of the GMOMETHODS database can be considered an important step toward worldwide standardization and harmonization in GMO analysis.

  7. Detection methods and performance criteria for genetically modified organisms.

    PubMed

    Bertheau, Yves; Diolez, Annick; Kobilinsky, André; Magin, Kimberly

    2002-01-01

    Detection methods for genetically modified organisms (GMOs) are necessary for many applications, from seed purity assessment to compliance with food labeling in several countries. Numerous analytical methods are currently used or under development to support these needs. The currently used methods are bioassays and protein- and DNA-based detection protocols. To avoid discrepancy of results between such largely different methods and, for instance, the potential resulting legal actions, compatibility of the methods is urgently needed. Performance criteria of methods allow evaluation against a common standard. The more common performance criteria for detection methods are precision, accuracy, sensitivity, and specificity, which together encompass other terms used to describe the performance of a method, such as applicability, selectivity, calibration, trueness, precision, recovery, operating range, limit of quantitation, limit of detection, and ruggedness. Performance criteria should provide objective tools to accept or reject specific methods, to validate them, to ensure compatibility between validated methods, and should be used on a routine basis to reject data outside an acceptable range of variability. When selecting a method of detection, it is also important to consider its applicability, its field of applications, and its limitations, by including factors such as its ability to detect the target analyte in a given matrix, the duration of the analyses, its cost effectiveness, and the necessary sample sizes for testing. Thus, the current GMO detection methods should be evaluated against a common set of performance criteria.
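
    As a concrete illustration of two of the performance criteria listed above, the sketch below computes a limit of detection and limit of quantitation from a calibration curve using the common ICH-style formulas (LOD = 3.3*sigma/slope, LOQ = 10*sigma/slope). The calibration data are invented, and these particular formulas are a widely used convention rather than a prescription from this review.

    ```python
    import numpy as np

    # Hypothetical calibration data: GMO content (%) vs. instrument response.
    conc = np.array([0.1, 0.5, 1.0, 2.0, 5.0])
    resp = np.array([0.21, 1.02, 2.05, 3.98, 10.1])

    slope, intercept = np.polyfit(conc, resp, 1)
    residuals = resp - (slope * conc + intercept)
    sigma = residuals.std(ddof=2)      # residual SD (2 fitted parameters)

    lod = 3.3 * sigma / slope          # ICH-style limit of detection
    loq = 10.0 * sigma / slope         # ICH-style limit of quantitation
    print(f"LOD = {lod:.3f}%, LOQ = {loq:.3f}%")
    ```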

  8. Bayesian Phase II optimization for time-to-event data based on historical information.

    PubMed

    Bertsche, Anja; Fleischer, Frank; Beyersmann, Jan; Nehmiz, Gerhard

    2017-01-01

    After exploratory drug development, companies face the decision whether to initiate confirmatory trials based on limited efficacy information. This proof-of-concept decision is typically performed after a Phase II trial studying a novel treatment versus either placebo or an active comparator. The article aims to optimize the design of such a proof-of-concept trial with respect to decision making. We incorporate historical information and develop pre-specified decision criteria accounting for the uncertainty of the observed treatment effect. We optimize these criteria based on sensitivity and specificity, given the historical information. Specifically, time-to-event data are considered in a randomized 2-arm trial with additional prior information on the control treatment. The proof-of-concept criterion uses treatment effect size, rather than significance. Criteria are defined on the posterior distribution of the hazard ratio given the Phase II data and the historical control information. Event times are exponentially modeled within groups, allowing for group-specific conjugate prior-to-posterior calculation. While a non-informative prior is placed on the investigational treatment, the control prior is constructed via the meta-analytic-predictive approach. The design parameters including sample size and allocation ratio are then optimized, maximizing the probability of taking the right decision. The approach is illustrated with an example in lung cancer.
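
    A minimal sketch of the conjugate prior-to-posterior step described above: with exponentially distributed event times, a Gamma(a, b) prior on each group's hazard updates to Gamma(a + events, b + total exposure), and the posterior of the hazard ratio can be sampled directly. All priors and data below are hypothetical, and the meta-analytic-predictive construction of the control prior is not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def posterior_hazard(a, b, events, exposure, size=100_000):
        # Gamma prior is conjugate for the exponential rate parameter.
        return rng.gamma(a + events, 1.0 / (b + exposure), size)

    # Hypothetical informative control prior (standing in for historical data)
    # and a vague prior on the investigational arm.
    lam_ctrl = posterior_hazard(a=30.0, b=300.0, events=40, exposure=420.0)
    lam_trt = posterior_hazard(a=0.1, b=0.1, events=25, exposure=480.0)

    hr = lam_trt / lam_ctrl
    # Effect-size-based proof-of-concept criterion: P(HR < 0.8 | data).
    print(f"P(HR < 0.8) = {(hr < 0.8).mean():.3f}")
    ```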

  9. Practicable group testing method to evaluate weight/weight GMO content in maize grains.

    PubMed

    Mano, Junichi; Yanaka, Yuka; Ikezu, Yoko; Onishi, Mari; Futo, Satoshi; Minegishi, Yasutaka; Ninomiya, Kenji; Yotsuyanagi, Yuichi; Spiegelhalter, Frank; Akiyama, Hiroshi; Teshima, Reiko; Hino, Akihiro; Naito, Shigehiro; Koiwa, Tomohiro; Takabatake, Reona; Furui, Satoshi; Kitta, Kazumi

    2011-07-13

    Because of the increasing use of maize hybrids with genetically modified (GM) stacked events, the established and commonly used bulk sample methods for PCR quantification of GM maize in non-GM maize are prone to overestimate the GM organism (GMO) content, compared to the actual weight/weight percentage of GM maize in the grain sample. As an alternative method, we designed and assessed a group testing strategy in which the GMO content is statistically evaluated based on qualitative analyses of multiple small pools, consisting of 20 maize kernels each. This approach enables the GMO content evaluation on a weight/weight basis, irrespective of the presence of stacked-event kernels. To enhance the method's user-friendliness in routine application, we devised an easy-to-use PCR-based qualitative analytical method comprising a sample preparation step in which 20 maize kernels are ground in a lysis buffer and a subsequent PCR assay in which the lysate is directly used as a DNA template. This method was validated in a multilaboratory collaborative trial.
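
    The group testing logic described above can be sketched as follows: if x of n pools of k kernels each test positive, the maximum likelihood estimate of the per-kernel GM proportion is p = 1 - (1 - x/n)^(1/k). This is the standard pooled-testing estimator; the pool size of 20 matches the paper, but the counts below are invented.

    ```python
    def gmo_proportion(positive_pools: int, total_pools: int,
                       pool_size: int = 20) -> float:
        """MLE of the per-kernel GM proportion from qualitative pool results."""
        neg_rate = 1.0 - positive_pools / total_pools
        return 1.0 - neg_rate ** (1.0 / pool_size)

    # Example: 3 of 24 pools of 20 kernels test positive (hypothetical counts).
    p = gmo_proportion(3, 24)
    print(f"estimated GM content: {100 * p:.2f}% (kernel basis)")
    ```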

  10. Population-based absolute risk estimation with survey data

    PubMed Central

    Kovalchik, Stephanie A.; Pfeiffer, Ruth M.

    2013-01-01

    Absolute risk is the probability that a cause-specific event occurs in a given time interval in the presence of competing events. We present methods to estimate population-based absolute risk from a complex survey cohort that can accommodate multiple exposure-specific competing risks. The hazard function for each event type consists of an individualized relative risk multiplied by a baseline hazard function, which is modeled nonparametrically or parametrically with a piecewise exponential model. An influence method is used to derive a Taylor-linearized variance estimate for the absolute risk estimates. We introduce novel measures of the cause-specific influences that can guide modeling choices for the competing event components of the model. To illustrate our methodology, we build and validate cause-specific absolute risk models for cardiovascular and cancer deaths using data from the National Health and Nutrition Examination Survey. Our applications demonstrate the usefulness of survey-based risk prediction models for predicting health outcomes and quantifying the potential impact of disease prevention programs at the population level. PMID:23686614
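
    A stripped-down sketch of the absolute-risk calculation described above, using piecewise-constant hazards: the cause-specific absolute risk over an interval is the cause-1 hazard's share of events, discounted by all-cause survival. The hazards below are hypothetical, and the survey weighting and influence-based variance estimation of the paper are omitted.

    ```python
    import numpy as np

    # Piecewise-constant yearly hazards for the event of interest (cause 1)
    # and a competing event (cause 2); hypothetical values, 10 one-year bins.
    h1 = np.full(10, 0.010)
    h2 = np.full(10, 0.030)
    width = 1.0

    # All-cause survival at the start of each interval.
    cum_hazard = np.concatenate([[0.0], np.cumsum((h1 + h2) * width)])[:-1]
    surv = np.exp(-cum_hazard)

    # Probability of a cause-1 event within each interval, summed over bins.
    p_event = surv * h1 / (h1 + h2) * (1.0 - np.exp(-(h1 + h2) * width))
    print(f"10-year absolute risk of cause 1: {p_event.sum():.4f}")
    ```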

  11. Sample normalization methods in quantitative metabolomics.

    PubMed

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has sometimes been ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
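
    The review surveys many normalization methods; the sketch below shows two common post-acquisition options, total-sum normalization and probabilistic quotient (median fold change) normalization, applied to an invented intensity matrix. Whether these specific methods are among those the authors recommend is not asserted here.

    ```python
    import numpy as np

    # Invented data: rows = samples, columns = metabolite intensities.
    X = np.array([[120.0, 300.0, 80.0],
                  [240.0, 610.0, 155.0],   # roughly 2x total sample amount
                  [118.0, 310.0, 160.0]])

    # Total-sum normalization: scale each sample to unit total intensity.
    X_sum = X / X.sum(axis=1, keepdims=True)

    # Probabilistic quotient normalization: divide each sample by its median
    # fold change relative to a reference spectrum (here, the sample mean).
    ref = X.mean(axis=0)
    quotients = np.median(X / ref, axis=1, keepdims=True)
    X_pqn = X / quotients

    print(X_sum.round(3))
    print(X_pqn.round(1))
    ```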

  12. Appendix 3 Summary of Field Sampling and Analytical Methods with Bibliography

    EPA Science Inventory

    Conductivity and Specific conductance are measures of the ability of water to conduct an electric current, and are a general measure of stream-water quality. Conductivity is affected by temperature, with warmer water having a greater conductivity. Specific conductance is the te...

  13. Perverse thought.

    PubMed

    Sánchez-Medina, Alfonso

    2002-12-01

    Based on Bion's work on the 'psychotic and non-psychotic parts of the personality', the author hypothesises the existence of a special type of thought disorder known as 'perverse thought'. First the author presents an overview of the major contributions to the concept of perversion that have a bearing on 'perverse thought'. These include Freud's splitting and disavowal concepts, Klein's projective identification concept, Bion's -K link and Meltzer's transference perversion. Then, by means of a case study and some vignettes, the author illustrates how this thought disorder is configured within the analytic process. The author focuses on three main aspects of this pathology: the specific modality of projective identification in a perverse scheme, the lie and some important clinical events that reveal an attack against knowledge through the formation of the -K link. Perverse thought is an important resistance mechanism in the analytic process. Its clarification is essential, given that its main objective is to attack the knowledge process, and therefore truth, in order to pervert the analytic relationship.

  14. A LITERATURE REVIEW OF WIPE SAMPLING METHODS ...

    EPA Pesticide Factsheets

    Wipe sampling is an important technique for the estimation of contaminant deposition in buildings, homes, or outdoor surfaces as a source of possible human exposure. Numerous methods of wipe sampling exist, and each method has its own specification for the type of wipe, wetting solvent, and determinative step to be used, depending upon the contaminant of concern. The objective of this report is to concisely summarize the findings of a literature review that was conducted to identify the state-of-the-art wipe sampling techniques for a target list of compounds. This report describes the methods used to perform the literature review; a brief review of wipe sampling techniques in general; an analysis of physical and chemical properties of each target analyte; an analysis of wipe sampling techniques for the target analyte list; and a summary of the wipe sampling techniques for the target analyte list, including existing data gaps. In general, no overwhelming consensus can be drawn from the current literature on how to collect a wipe sample for the chemical warfare agents, organophosphate pesticides, and other toxic industrial chemicals of interest to this study. Different methods, media, and wetting solvents have been recommended and used by various groups and different studies. For many of the compounds of interest, no specific wipe sampling methodology has been established for their collection. Before a wipe sampling method (or methods) can be established for the co

  15. Space Trajectories Error Analysis (STEAP) Programs. Volume 1: Analytic manual, update

    NASA Technical Reports Server (NTRS)

    1971-01-01

    Manual revisions are presented for the modified and expanded STEAP series. The STEAP 2 is composed of three independent but related programs: NOMAL for the generation of n-body nominal trajectories performing a number of deterministic guidance events; ERRAN for the linear error analysis and generalized covariance analysis along specific targeted trajectories; and SIMUL for testing the mathematical models used in the navigation and guidance process. The analytic manual provides general problem description, formulation, and solution and the detailed analysis of subroutines. The programmers' manual gives descriptions of the overall structure of the programs as well as the computational flow and analysis of the individual subroutines. The user's manual provides information on the input and output quantities of the programs. These are updates to N69-36472 and N69-36473.

  16. Development of a validated liquid chromatographic method for quantification of sorafenib tosylate in the presence of stress-induced degradation products and in biological matrix employing analytical quality by design approach.

    PubMed

    Sharma, Teenu; Khurana, Rajneet Kaur; Jain, Atul; Katare, O P; Singh, Bhupinder

    2018-05-01

    The current research work envisages an analytical quality by design-enabled development of a simple, rapid, sensitive, specific, robust and cost-effective stability-indicating reversed-phase high-performance liquid chromatographic method for determining stress-induced forced-degradation products of sorafenib tosylate (SFN). An Ishikawa fishbone diagram was constructed to embark upon analytical target profile and critical analytical attributes, i.e. peak area, theoretical plates, retention time and peak tailing. Factor screening using Taguchi orthogonal arrays and quality risk assessment studies carried out using failure mode effect analysis aided the selection of critical method parameters, i.e. mobile phase ratio and flow rate potentially affecting the chosen critical analytical attributes. Systematic optimization using response surface methodology of the chosen critical method parameters was carried out employing a two-factor-three-level-13-run, face-centered cubic design. A method operable design region was earmarked providing optimum method performance using numerical and graphical optimization. The optimum method employed a mobile phase composition consisting of acetonitrile and water (containing orthophosphoric acid, pH 4.1) at 65:35 v/v at a flow rate of 0.8 mL/min with UV detection at 265 nm using a C 18 column. Response surface methodology validation studies confirmed good efficiency and sensitivity of the developed method for analysis of SFN in mobile phase as well as in human plasma matrix. The forced degradation studies were conducted under different recommended stress conditions as per ICH Q1A (R2). Mass spectroscopy studies showed that SFN degrades in strongly acidic, alkaline and oxidative hydrolytic conditions at elevated temperature, while the drug was per se found to be photostable. Oxidative hydrolysis using 30% H 2 O 2 showed maximum degradation with products at retention times of 3.35, 3.65, 4.20 and 5.67 min. The absence of any significant change in the retention time of SFN and degradation products, formed under different stress conditions, ratified selectivity and specificity of the systematically developed method. Copyright © 2017 John Wiley & Sons, Ltd.

  17. Evaporative concentration on a paper-based device to concentrate analytes in a biological fluid.

    PubMed

    Wong, Sharon Y; Cabodi, Mario; Rolland, Jason; Klapperich, Catherine M

    2014-12-16

    We report the first demonstration of using heat on a paper device to rapidly concentrate a clinically relevant analyte of interest from a biological fluid. Our technology relies on the application of localized heat to a paper strip to evaporate off hundreds of microliters of liquid to concentrate the target analyte. This method can be used to enrich for a target analyte that is present at low concentrations within a biological fluid to enhance the sensitivity of downstream detection methods. We demonstrate our method by concentrating the tuberculosis-specific glycolipid, lipoarabinomannan (LAM), a promising urinary biomarker for the detection and diagnosis of tuberculosis. We show that the heat does not compromise the subsequent immunodetectability of LAM, and in 20 min, the tuberculosis biomarker was concentrated by nearly 20-fold in simulated urine. Our method requires only 500 mW of power, and sample flow is self-driven via capillary action. As such, our technology can be readily integrated into portable, battery-powered, instrument-free diagnostic devices intended for use in low-resource settings.

  18. ATLAS EventIndex monitoring system using the Kibana analytics and visualization platform

    NASA Astrophysics Data System (ADS)

    Barberis, D.; Cárdenas Zárate, S. E.; Favareto, A.; Fernandez Casani, A.; Gallas, E. J.; Garcia Montoro, C.; Gonzalez de la Hoz, S.; Hrivnac, J.; Malon, D.; Prokoshin, F.; Salt, J.; Sanchez, J.; Toebbicke, R.; Yuan, R.; ATLAS Collaboration

    2016-10-01

    The ATLAS EventIndex is a data catalogue system that stores event-related metadata for all (real and simulated) ATLAS events, on all processing stages. As it consists of different components that depend on other applications (such as distributed storage and different sources of information), we need to monitor the conditions of many heterogeneous subsystems to make sure everything is working correctly. This paper describes how we gather information about the EventIndex components and related subsystems: the Producer-Consumer architecture for data collection, health parameters from the servers that run EventIndex components, EventIndex web interface status, and the Hadoop infrastructure that stores EventIndex data. This information is collected, processed, and then displayed using CERN service monitoring software based on the Kibana analytics and visualization package, provided by the CERN IT Department. EventIndex monitoring is used both by the EventIndex team and the ATLAS Distributed Computing shift crew.

  19. Event specific qualitative and quantitative polymerase chain reaction detection of genetically modified MON863 maize based on the 5'-transgene integration sequence.

    PubMed

    Yang, Litao; Xu, Songci; Pan, Aihu; Yin, Changsong; Zhang, Kewei; Wang, Zhenying; Zhou, Zhigang; Zhang, Dabing

    2005-11-30

    To support the genetically modified organism (GMO) labeling policies issued in many countries and regions, polymerase chain reaction (PCR) methods - screening, gene-specific, construct-specific, and event-specific - have been developed and have become a mainstay of GMO detection. Event-specific PCR is the leading approach because of its high specificity, which derives from the flanking sequence of the exogenous integrant. The genetically modified maize MON863 contains a Cry3Bb1 coding sequence that produces a protein with enhanced insecticidal activity against the coleopteran pest corn rootworm. In this study, the 5'-integration junction sequence between the host plant DNA and the integrated gene construct of MON863 was determined by means of thermal asymmetric interlaced PCR, specific PCR primers and a TaqMan probe were designed from the revealed junction sequence, and conventional qualitative and quantitative TaqMan real-time PCR detection methods employing these primers and probes were successfully developed. In the conventional qualitative PCR assay, the limit of detection (LOD) was 0.1% for MON863 in 100 ng of maize genomic DNA per reaction. In the quantitative TaqMan real-time PCR assay, the LOD and the limit of quantification were eight and 80 haploid genome copies, respectively. In addition, three mixed maize samples with known MON863 contents were analyzed using the established real-time PCR systems, and the results indicated that the event-specific real-time PCR detection systems were reliable, sensitive, and accurate.
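
    A generic sketch of how copy numbers and GM content are derived in event-specific real-time PCR assays like the one described: Ct values are converted to copies via a standard curve, and GM content is expressed as the ratio of event-specific to endogenous-gene copies. The curve parameters and Ct values below are hypothetical, not those of the MON863 assay.

    ```python
    def ct_to_copies(ct: float, slope: float, intercept: float) -> float:
        """Copies from a standard curve Ct = slope * log10(copies) + intercept."""
        return 10 ** ((ct - intercept) / slope)

    # Hypothetical standard-curve parameters (a slope near -3.32 corresponds
    # to ~100% PCR efficiency) and sample Ct values.
    event_copies = ct_to_copies(ct=31.2, slope=-3.32, intercept=40.0)
    taxon_copies = ct_to_copies(ct=24.6, slope=-3.35, intercept=39.5)

    gm_percent = 100.0 * event_copies / taxon_copies
    print(f"event: {event_copies:.0f} copies, taxon: {taxon_copies:.0f} copies, "
          f"GM content ~ {gm_percent:.2f}%")
    ```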

  20. Workstation Analytics in Distributed Warfighting Experimentation: Results from Coalition Attack Guidance Experiment 3A

    DTIC Science & Technology

    2014-06-01

    central location. Each of the SQLite databases is converted and stored in one MySQL database and the pcap files are parsed to extract call information...from the specific communications applications used during the experiment. This extracted data is then stored in the same MySQL database. With all...rhythm of the event. Figure 3 demonstrates the application usage over the course of the experiment for the EXDIR. As seen, the EXDIR spent the majority

  1. Accurate quantification of PGE2 in the polyposis in rat colon (Pirc) model by surrogate analyte-based UPLC-MS/MS.

    PubMed

    Yun, Changhong; Dashwood, Wan-Mohaiza; Kwong, Lawrence N; Gao, Song; Yin, Taijun; Ling, Qinglan; Singh, Rashim; Dashwood, Roderick H; Hu, Ming

    2018-01-30

    An accurate and reliable UPLC-MS/MS method is reported for the quantification of endogenous prostaglandin E2 (PGE2) in rat colonic mucosa and polyps. This method adopted the "surrogate analyte plus authentic bio-matrix" approach, using two different stable isotope-labeled analogs: PGE2-d9 as the surrogate analyte and PGE2-d4 as the internal standard. A quantitative standard curve was constructed with the surrogate analyte in colonic mucosa homogenate, and the method was successfully validated with the authentic bio-matrix. Concentrations of endogenous PGE2 in both normal and inflammatory tissue homogenates were back-calculated from the regression equation. Because there is no endogenous interference with determination of the surrogate analyte, specificity was particularly good. By using the authentic bio-matrix for validation, the matrix effect and extraction recovery are identical for the quantitative standard curve and the actual samples, which notably increased the assay accuracy. The method is easy, fast, robust, and reliable for colonic PGE2 determination. This "surrogate analyte" approach was applied to measure PGE2, a strong biomarker of colorectal cancer, in the mucosa and polyps of the Pirc rat (an Apc-mutant kindred that models human FAP). A similar concept could be applied to endogenous biomarkers in other tissues. Copyright © 2017 Elsevier B.V. All rights reserved.
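
    The back-calculation step of the surrogate-analyte approach can be sketched as follows: a calibration curve is fitted to the response ratio of the surrogate (PGE2-d9) to the internal standard (PGE2-d4) in the authentic matrix, and endogenous PGE2 is then read off that curve. The calibration points and sample ratio below are invented.

    ```python
    import numpy as np

    # Invented calibration: spiked PGE2-d9 concentration (ng/mL) vs. the
    # PGE2-d9 / PGE2-d4 peak-area ratio measured in mucosa homogenate.
    conc = np.array([0.5, 1.0, 5.0, 10.0, 50.0])
    ratio = np.array([0.052, 0.101, 0.497, 1.010, 4.980])

    slope, intercept = np.polyfit(conc, ratio, 1)

    # Endogenous PGE2 in a study sample, back-calculated from its measured
    # PGE2 / PGE2-d4 area ratio (the surrogate curve stands in for PGE2).
    sample_ratio = 0.74
    pge2 = (sample_ratio - intercept) / slope
    print(f"endogenous PGE2 ~ {pge2:.2f} ng/mL")
    ```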

  2. Mapping lexical-semantic networks and determining hemispheric language dominance: Do task design, sex, age, and language performance make a difference?

    PubMed

    Chang, Yu-Hsuan A; Javadi, Sogol S; Bahrami, Naeim; Uttarwar, Vedang S; Reyes, Anny; McDonald, Carrie R

    2018-04-01

    Blocked and event-related fMRI designs are both commonly used to localize language networks and determine hemispheric dominance in research and clinical settings. We compared activation profiles on a semantic monitoring task using one of the two designs in a total of 43 healthy individuals to determine whether task design or subject-specific factors (i.e., age, sex, or language performance) influence activation patterns. We found high concordance between the two designs within core language regions, including the inferior frontal, posterior temporal, and basal temporal region. However, differences emerged within inferior parietal cortex. Subject-specific factors did not influence activation patterns, nor did they interact with task design. These results suggest that despite high concordance within perisylvian regions that are robust to subject-specific factors, methodological differences between blocked and event-related designs may contribute to parietal activations. These findings provide important information for researchers incorporating fMRI results into meta-analytic studies, as well as for clinicians using fMRI to guide pre-surgical planning. Copyright © 2018 Elsevier Inc. All rights reserved.

  3. Laboratory analytical methods for the determination of the hydrocarbon status of soils (a review)

    NASA Astrophysics Data System (ADS)

    Pikovskii, Yu. I.; Korotkov, L. A.; Smirnova, M. A.; Kovach, R. G.

    2017-10-01

    Laboratory analytical methods suitable for the determination of the hydrocarbon status of soils (a specific soil characteristic involving information on the total content and qualitative features of soluble (bitumoid) carbonaceous substances and individual hydrocarbons (polycyclic aromatic hydrocarbons, alkanes, etc.) in bitumoid, as well as the composition and content of hydrocarbon gases) have been considered. Among the various physicochemical methods of study, attention is focused on those suitable for wide use. Luminescence-bituminological analysis, low-temperature spectrofluorimetry (Shpolskii spectroscopy), infrared (IR) spectroscopy, gas chromatography, chromatography-mass spectrometry, and some other methods have been characterized, as well as sample preparation features. Advantages and limitations of each of these methods are described; their efficiency, instrumental complexity, analysis duration, and accuracy are assessed.

  4. Bringing a transgenic crop to market: where compositional analysis fits.

    PubMed

    Privalle, Laura S; Gillikin, Nancy; Wandelt, Christine

    2013-09-04

    In the process of developing a biotechnology product, thousands of genes and transformation events are evaluated to select the event that will be commercialized. The ideal event is identified on the basis of multiple characteristics including trait efficacy, the molecular characteristics of the insert, and agronomic performance. Once selected, the commercial event is subjected to a rigorous, multipronged safety evaluation that examines the safety of the gene and the gene product (the protein), plant performance, the environmental impact of cultivating the crop, agronomic performance, and - by compositional analysis - the equivalence of the crop/food to conventional crops/foods. The compositional analysis consists of a comparison of the nutrient and antinutrient composition of the crop containing the event, its parental line (variety), and other conventional lines (varieties). Different geographies have different requirements for the compositional analysis studies. Parameters that vary include the number of years (seasons) and locations (environments) to be evaluated, the appropriate comparator(s), analytes to be evaluated, and statistical analysis. Specific examples of compositional analysis results will be presented.

  5. Trajectories for High Specific Impulse High Specific Power Deep Space Exploration

    NASA Technical Reports Server (NTRS)

    Polsgrove, T.; Adams, R. B.; Brady, Hugh J. (Technical Monitor)

    2002-01-01

    Preliminary results are presented for two methods to approximate the mission performance of high specific impulse high specific power vehicles. The first method is based on an analytical approximation derived by Williams and Shepherd and can be used to approximate mission performance to outer planets and interstellar space. The second method is based on a parametric analysis of trajectories created using the well known trajectory optimization code, VARITOP. This parametric analysis allows the reader to approximate payload ratios and optimal power requirements for both one-way and round-trip missions. While this second method only addresses missions to and from Jupiter, future work will encompass all of the outer planet destinations and some interstellar precursor missions.

  6. Advancements in nano-enabled therapeutics for neuroHIV management.

    PubMed

    Kaushik, Ajeet; Jayant, Rahul Dev; Nair, Madhavan

    This viewpoint is a global call to promote fundamental and applied research aimed at designing smart nanocarriers of desired properties; developing novel noninvasive strategies to open the blood-brain barrier (BBB); delivering and releasing single or multiple therapeutic agents across the BBB to eradicate neuroHIV; achieving on-demand, site-specific release of antiretroviral therapy; developing novel nanoformulations capable of recognizing and eradicating latently infected HIV reservoirs; and developing novel smart analytical diagnostic tools to detect and monitor HIV infection. Thus, investigation of novel nanoformulations, methodologies for site-specific delivery/release, analytical methods, and diagnostic tools would be of high significance for eradicating and monitoring neuroAIDS. Overall, these developments will certainly help to develop personalized nanomedicines to cure HIV and smart HIV-monitoring analytical systems for disease management.

  7. Determination of Selected Perfluorinated Alkyl Acids in ...

    EPA Pesticide Factsheets

    The 1996 amendments to the Safe Drinking Water Act (SDWA) required EPA to establish a Contaminant Candidate List (CCL), which lists drinking water contaminants that the Agency will consider for future regulation. EPA must make a regulatory determination on a minimum of five contaminants every five years. The first CCL was published in 1998, and updates were anticipated every five years thereafter. One of the key pieces of information that must be available in order to make a regulatory determination is nationwide occurrence data for the chemical contaminants under consideration. Historically, EPA has collected the necessary occurrence data under its Unregulated Contaminant Monitoring Regulations (UCMR). Under the UCMR, monitoring is conducted at selected drinking water utilities for specific contaminants of interest. The chemical analyses are usually performed by the utilities or by commercial laboratories. To meet the requirements of monitoring under the UCMR program, the analytical methods developed should be specific, sensitive, and practical enough for application in commercial laboratories. This task will focus on the development of analytical methods for chemicals identified on future CCLs or emerging contaminants not yet listed on the CCL. These methods will be used for the collection of occurrence data under future UCMRs. The objective of this research effort is to develop analytical methods to be used to measure the occurrence of

  8. A unified procedure for meta-analytic evaluation of surrogate end points in randomized clinical trials

    PubMed Central

    Dai, James Y.; Hughes, James P.

    2012-01-01

    The meta-analytic approach to evaluating surrogate end points assesses the predictiveness of treatment effect on the surrogate toward treatment effect on the clinical end point based on multiple clinical trials. Definition and estimation of the correlation of treatment effects were developed in linear mixed models and later extended to binary or failure time outcomes on a case-by-case basis. In a general regression setting that covers nonnormal outcomes, we discuss in this paper several metrics that are useful in the meta-analytic evaluation of surrogacy. We propose a unified 3-step procedure to assess these metrics in settings with binary end points, time-to-event outcomes, or repeated measures. First, the joint distribution of estimated treatment effects is ascertained by an estimating equation approach; second, the restricted maximum likelihood method is used to estimate the means and the variance components of the random treatment effects; finally, confidence intervals are constructed by a parametric bootstrap procedure. The proposed method is evaluated by simulations and applications to 2 clinical trials. PMID:22394448

  9. A simultaneous screening and quantitative method for the multiresidue analysis of pesticides in spices using ultra-high performance liquid chromatography-high resolution (Orbitrap) mass spectrometry.

    PubMed

    Goon, Arnab; Khan, Zareen; Oulkar, Dasharath; Shinde, Raviraj; Gaikwad, Suresh; Banerjee, Kaushik

    2018-01-12

    A novel screening and quantitation method is reported for non-target multiresidue analysis of pesticides using ultra-HPLC-quadrupole-Orbitrap mass spectrometry in spice matrices, including black pepper, cardamom, chili, coriander, cumin, and turmeric. The method involved sequential full-scan (resolution = 70,000) and variable data-independent acquisition (vDIA) with nine consecutive fragmentation events (resolution = 17,500). Samples were extracted by the QuEChERS method. The introduction of an SPE-based clean-up step through hydrophilic-lipophilic-balance (HLB) cartridges proved advantageous in minimizing the false negatives. For coriander, cumin, chili, and cardamom, the screening detection limit was largely at 2 ng/g, while it was 5 ng/g for black pepper and turmeric. When the method was quantitatively validated for 199 pesticides, the limit of quantification (LOQ) was mostly at 10 ng/g (excluding black pepper and turmeric, with LOQ = 20 ng/g), with recoveries within 70-120% and precision RSDs <20%. Furthermore, the method allowed the identification of suspected non-target analytes through retrospective search of the accurate mass of the compound-specific precursor and product ions. Compared to LC-MS/MS, the quantitative performance of this Orbitrap-MS method showed agreement in residue values of 78-100%. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Finding the joker among the maize endogenous reference genes for genetically modified organism (GMO) detection.

    PubMed

    Paternò, Annalisa; Marchesi, Ugo; Gatto, Francesco; Verginelli, Daniela; Quarchioni, Cinzia; Fusco, Cristiana; Zepparoni, Alessia; Amaddeo, Demetrio; Ciabatti, Ilaria

    2009-12-09

    The comparison of five real-time polymerase chain reaction (PCR) methods targeted at maize (Zea mays) endogenous sequences is reported. PCR targets were the alcohol dehydrogenase (adh) gene for three methods and the high-mobility group (hmg) gene for the other two. The five real-time PCR methods have been checked under repeatability conditions at several dilution levels on both pooled DNA template from several genetically modified (GM) maize certified reference materials (CRMs) and single CRM DNA extracts. Slopes and R² coefficients of all of the curves obtained from the adopted regression model were compared within the same method and among all five methods, and the limit of detection and limit of quantitation were analyzed for each PCR system. Furthermore, method equivalency was evaluated on the basis of the ability to estimate the target haploid genome copy number at each concentration level. Results indicated that, among the five methods tested, one of the hmg-targeted PCR systems can be considered equivalent to the others but shows the best regression parameters and higher repeatability along the dilution range. It is therefore proposed as a valid module to be coupled to different event-specific real-time PCRs for maize genetically modified organism (GMO) quantitation. The resulting improvement in the practicability of the analytical control of GMOs is discussed.

  11. Validation protocol of analytical procedures for quantification of drugs in polymeric systems for parenteral administration: dexamethasone phosphate disodium microparticles.

    PubMed

    Martín-Sabroso, Cristina; Tavares-Fernandes, Daniel Filipe; Espada-García, Juan Ignacio; Torres-Suárez, Ana Isabel

    2013-12-15

    In this work, a protocol to validate analytical procedures for the quantification of drug substances formulated in polymeric systems that comprise both drug entrapped into the polymeric matrix (assay:content test) and drug released from the systems (assay:dissolution test) is developed. This protocol is applied to the validation of two isocratic HPLC analytical procedures for the analysis of dexamethasone phosphate disodium microparticles for parenteral administration. Preparation of authentic samples and artificially "spiked" and "unspiked" samples is described. Specificity (the ability to quantify dexamethasone phosphate disodium in the presence of constituents of the dissolution medium and other microparticle constituents), linearity, accuracy and precision are evaluated, in the range from 10 to 50 μg mL(-1) in the assay:content test procedure and from 0.25 to 10 μg mL(-1) in the assay:dissolution test procedure. The robustness of the analytical method to extract drug from microparticles is also assessed. The validation protocol developed allows us to conclude that both analytical methods are suitable for their intended purpose, but the lack of proportionality of the assay:dissolution analytical method should be taken into account. The validation protocol designed in this work could be applied to the validation of any analytical procedure for the quantification of drugs formulated in controlled release polymeric microparticles. Copyright © 2013 Elsevier B.V. All rights reserved.
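
    The accuracy and precision evaluations in such a validation protocol reduce to recovery percentages and relative standard deviations over replicate spiked samples; the sketch below shows that computation on invented replicate data for a single concentration level.

    ```python
    import numpy as np

    nominal = 10.0                                           # spiked level, ug/mL
    measured = np.array([9.8, 10.3, 9.9, 10.1, 9.7, 10.2])   # invented replicates

    recovery = 100.0 * measured.mean() / nominal             # accuracy, % of nominal
    rsd = 100.0 * measured.std(ddof=1) / measured.mean()     # precision, %RSD

    print(f"mean recovery = {recovery:.1f}%, RSD = {rsd:.1f}%")
    ```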

  12. Peak flood estimation using gene expression programming

    NASA Astrophysics Data System (ADS)

    Zorn, Conrad R.; Shamseldin, Asaad Y.

    2015-12-01

    As a case study for the Auckland Region of New Zealand, this paper investigates the potential use of gene-expression programming (GEP) in predicting specific return period events in comparison to the established and widely used Regional Flood Estimation (RFE) method. Initially calibrated to 14 gauged sites, the GEP-derived model was further validated against 10- and 100-year flood events, with relative errors of 29% and 18%, respectively, compared with errors of 48% and 44% for the RFE method on the same flood events. While the effectiveness of GEP in predicting specific return period events is made apparent, it is argued that the derived equations should be used in conjunction with existing methodologies rather than as a replacement.

  13. New Material for Surface-Enhanced Raman Spectroscopy

    NASA Technical Reports Server (NTRS)

    Farquharson, Stuart; Nelson, Chad; Lee, Yuan

    2004-01-01

    A chemical method of synthesis and application of coating materials that are especially suitable for surface-enhanced Raman spectroscopy (SERS) has been developed. The purpose of this development is to facilitate the utilization of the inherently high sensitivity of SERS to detect chemicals of interest (analytes) in trace amounts, without need for lengthy sample preparation. Up to now, the use of SERS has not become routine because the methods available have not been able to reproduce sampling conditions and provide quantitative measurements. In contrast, the coating materials of the present method enable analysis with minimum preparation of samples, and SERS measurements made using these materials are reproducible and reversible. Moreover, unlike in methods investigated in prior efforts to implement SERS, sampling is not restricted to such specific environments as electrolytes or specific solvents. The coating materials of this method are porous glasses, formed in sol-gel processes, that contain small particles of gold or silver metal. Materials of this type can be applied to the sample-contact surfaces of a variety of sampling and sensing devices, including glass slides, glass vials, fiber-optic probes, and glass tubes. Glass vials with their insides coated according to this method are particularly convenient for SERS to detect trace chemicals in solutions: One simply puts a sample solution containing the analyte(s) into a vial, then puts the vial into a Raman spectrometer for analysis. The chemical ingredients and the physical conditions of the sol-gel process have been selected so that the porous glass formed incorporates particles of the desired metal with size(s) to match the wavelength(s) of the SERS excitation laser in order to optimize the generation of surface plasmons. The ingredients and processing conditions have further been chosen to tailor the porosity and polarity of the glass to optimize the sample flow and the interaction between the analyte(s) and the plasmon field that generates Raman photons. The porous silica network of a sol-gel glass creates a unique environment for stabilizing SERS-active metal particles. Relative to other material structures that could be considered for SERS, the porous silica network offers higher specific surface area and thus greater interaction between analyte molecules and metal particles. Efforts to perform SERS measurements with the help of sampling devices coated by this method have been successful. In tests, numerous organic and inorganic chemicals were analyzed in several solvents, including water. The results of the tests indicate that the SERS measurements were reproducible within 10 percent and linear over five orders of magnitude. One measure of the limits of detectability of chemicals in these tests was found to be a concentration of 300 parts per billion. Further development may eventually make it possible to realize the full potential sensitivity of SERS for detecting some analytes in quantities as small as a single molecule.

  14. Assessment of catchments' flooding potential: a physically-based analytical tool

    NASA Astrophysics Data System (ADS)

    Botter, G.; Basso, S.; Schirmer, M.

    2016-12-01

    The assessment of the flooding potential of river catchments is critical in many research and applied fields, ranging from river science and geomorphology to urban planning and the insurance industry. Predicting the magnitude and frequency of floods is key to preventing and mitigating the negative effects of high flows, and has therefore long been the focus of hydrologic research. Here, the recurrence intervals of seasonal flow maxima are estimated through a novel physically-based analytic approach, which links the extremal distribution of streamflows to the stochastic dynamics of daily discharge. An analytical expression of the seasonal flood-frequency curve is provided, whose parameters embody climate and landscape attributes of the contributing catchment and can be estimated from daily rainfall and streamflow data. Only one parameter, which expresses catchment saturation prior to rainfall events, needs to be calibrated on the observed maxima. The method has been tested in a set of catchments featuring heterogeneous daily flow regimes. The model is able to reproduce characteristic shapes of flood-frequency curves emerging in erratic and persistent flow regimes and provides good estimates of seasonal flow maxima in different climatic regions. Performance remains steady when estimating the magnitude of events with return periods longer than the available record.

  15. Performance of Traditional and Molecular Methods for Detecting Biological Agents in Drinking Water

    USGS Publications Warehouse

    Francy, Donna S.; Bushon, Rebecca N.; Brady, Amie M.G.; Bertke, Erin E.; Kephart, Christopher M.; Likirdopulos, Christina A.; Mailot, Brian E.; Schaefer, Frank W.; Lindquist, H.D. Alan

    2009-01-01

    To reduce the impact from a possible bioterrorist attack on drinking-water supplies, analytical methods are needed to rapidly detect the presence of biological agents in water. To this end, 13 drinking-water samples were collected at 9 water-treatment plants in Ohio to assess the performance of a molecular method in comparison to traditional analytical methods that take longer to perform. Two 100-liter samples were collected at each site during each sampling event; one was seeded in the laboratory with six biological agents - Bacillus anthracis (B. anthracis), Burkholderia cepacia (as a surrogate for Bu. pseudomallei), Francisella tularensis (F. tularensis), Salmonella Typhi (S. Typhi), Vibrio cholerae (V. cholerae), and Cryptosporidium parvum (C. parvum). The seeded and unseeded samples were processed by ultrafiltration and analyzed by use of quantitative polymerase chain reaction (qPCR), a molecular method, and culture methods for bacterial agents or the immunomagnetic separation/fluorescent antibody (IMS/FA) method for C. parvum as traditional methods. Six replicate seeded samples were also processed and analyzed. For traditional methods, recoveries were highly variable between samples and even between some replicate samples, ranging from below detection to greater than 100 percent. Recoveries were significantly related to water pH, specific conductance, and dissolved organic carbon (DOC) for all bacteria combined by culture methods, but none of the water-quality characteristics tested were related to recoveries of C. parvum by IMS/FA. Recoveries were not determined by qPCR because of problems in quantifying organisms by qPCR in the composite seed. Instead, qPCR results were reported as detected, not detected (no qPCR signal), or +/- detected (Cycle Threshold or 'Ct' values were greater than 40). Several sample results by qPCR were omitted from the dataset because of possible problems with qPCR reagents, primers, and probes. For the remaining 14 qPCR results (including some replicate samples), F. tularensis and V. cholerae were detected in all samples after ultrafiltration, B. anthracis was detected in 13 and +/- detected in 1 sample, and C. parvum was detected in 9 and +/- detected in 4 samples. Bu. cepacia was detected in nine samples, +/- detected in two samples, and not detected in three samples (for two out of three samples not detected, a different strain was used). The qPCR assay for V. cholerae provided two false positive - but late - signals in one unseeded sample. Numbers found by qPCR after ultrafiltration were significantly or nearly significantly related to those found by traditional methods for B. anthracis, F. tularensis, and V. cholerae but not for Bu. cepacia and C. parvum. A qPCR assay for S. Typhi was not available. The qPCR method can be used to rapidly detect B. anthracis, F. tularensis, and V. cholerae with some certainty in drinking-water samples, but additional work would be needed to optimize and test qPCR for Bu. cepacia and C. parvum and establish relations to traditional methods. The specificity for the V. cholerae assay needs to be further investigated. Evidence is provided that ultrafiltration and qPCR are promising methods to rapidly detect biological agents in the Nation's drinking-water supplies and thus reduce the impact and consequences from intentional bioterrorist events. To our knowledge, this is the first study to compare the use of traditional and qPCR methods to detect biological agents in large-volume drinking-water samples.

  16. Genetics-based methods for detection of Salmonella spp. in foods.

    PubMed

    Mozola, Mark A

    2006-01-01

    Genetic methods are now at the forefront of foodborne pathogen testing. The sensitivity, specificity, and inclusivity advantages offered by deoxyribonucleic acid (DNA) probe technology have driven an intense effort in methods development over the past 20 years. DNA probe-based methods for Salmonella spp. and other pathogens have progressed from time-consuming procedures involving the use of radioisotopes to simple, high throughput, automated assays. The analytical sensitivity of nucleic acid amplification technology has facilitated a reduction in analysis time by allowing enriched samples to be tested for previously undetectable quantities of analyte. This article will trace the evolution of the development of genetic methods for detection of Salmonella in foods, review the basic assay formats and their advantages and limitations, and discuss method performance characteristics and considerations for selection of methods.

  17. Event-Specific Cannabis Use and Use-Related Impairment: The Relationship to Campus Traditions

    PubMed Central

    Buckner, Julia D.; Henslee, Amber M.; Jeffries, Emily R.

    2015-01-01

    Objective: Despite high rates of college cannabis use, little work has identified high-risk cannabis use events. For instance, Mardi Gras (MG) and St. Patrick’s Day (SPD) are characterized by more college drinking, yet it is unknown whether they are also related to greater cannabis use. Further, some campuses may have traditions that emphasize substance use during these events, whereas other campuses may not. Such campus differences may affect whether students use cannabis during specific events. The present study tested whether MG and SPD were related to more cannabis use at two campuses with different traditions regarding MG and SPD. Further, given that Campus A has specific traditions regarding MG whereas Campus B has specific traditions regarding SPD, cross-campus differences in event-specific use were examined. Method: Current cannabis-using undergraduates (N = 154) at two campuses completed an online survey of event-specific cannabis use and event-specific cannabis-related problems. Results: Participants used more cannabis during MG and SPD than during a typical weekday, typical day on which the holiday fell, and a holiday unrelated to cannabis use (Presidents’ Day). Among those who engaged in event-specific use, MG and SPD cannabis use was greater than typical weekend use. Campus differences were observed. For example, Campus A reported more cannabis-related problems during MG than SPD, whereas Campus B reported more problems during SPD than MG. Conclusions: Specific holidays were associated with more cannabis use and use-related problems. Observed between-campus differences indicate that campus traditions may affect event-specific cannabis use and use-related problems. PMID:25785793

  18. Assessing Traumatic Event Exposure: Comparing the Traumatic Life Events Questionnaire to the Structured Clinical Interview for "DSM-IV"

    ERIC Educational Resources Information Center

    Peirce, Jessica M.; Burke, Christopher K.; Stoller, Kenneth B.; Neufeld, Karin J.; Brooner, Robert K.

    2009-01-01

    Post-traumatic stress disorder (PTSD) diagnosis requires first identifying a traumatic event, but very few studies have evaluated methods of potential traumatic event assessment and their impact on PTSD diagnosis. The authors compared a behaviorally specific comprehensive multiple-item traumatic event measure with a single-item measure to…

  19. Familiarity Vs Trust: A Comparative Study of Domain Scientists' Trust in Visual Analytics and Conventional Analysis Methods.

    PubMed

    Dasgupta, Aritra; Lee, Joon-Yong; Wilson, Ryan; Lafrance, Robert A; Cramer, Nick; Cook, Kristin; Payne, Samuel

    2017-01-01

    Combining interactive visualization with automated analytical methods like statistics and data mining facilitates data-driven discovery. These visual analytic methods are beginning to be instantiated within mixed-initiative systems, where humans and machines collaboratively influence evidence-gathering and decision-making. An open research question, however, is whether domain experts analyzing their data can completely trust the outputs and operations on the machine side. Visualization potentially leads to a transparent analysis process, but do domain experts always trust what they see? To address these questions, we present results from the design and evaluation of a mixed-initiative, visual analytics system for biologists, focusing on the relationship between the familiarity of an analysis medium and domain experts' trust. We propose a trust-augmented design of the visual analytics system that explicitly takes into account domain-specific tasks, conventions, and preferences. To evaluate the system, we present the results of a controlled user study with 34 biologists in which we compare the variation of the level of trust across conventional and visual analytic mediums and explore the influence of familiarity and task complexity on trust. We find that despite being unfamiliar with a visual analytic medium, scientists seem to have an average level of trust comparable to that in a conventional analysis medium. In fact, for complex sense-making tasks, we find that the visual analytic system is able to inspire greater trust than other mediums. We summarize the implications of our findings with directions for future research on the trustworthiness of visual analytic systems.

  20. MEASUREMENT OF VOLATILE ORGANIC COMPOUNDS IN EXHALED BREATH AS COLLECTED IN EVACUATED ELECTROPOLISHED CANISTERS

    EPA Science Inventory

    A set of three complementary analytical methods was developed specifically for exhaled breath as collected in evacuated stainless steel canisters using gas chromatography - mass spectrometry detection. The first is a screening method to quantify the carbon dioxide component (gen...

  1. Quality in laboratory medicine: 50 years on.

    PubMed

    Plebani, Mario

    2017-02-01

    The last 50 years have seen substantial changes in the landscape of laboratory medicine: its role in modern medicine is in evolution and the quality of laboratory services is changing. The need to control and improve quality in clinical laboratories has grown hand in hand with the growth in technological developments leading to an impressive reduction of analytical errors over time. An essential cause of this impressive improvement has been the introduction and monitoring of quality indicators (QIs) such as the analytical performance specifications (in particular bias and imprecision) based on well-established goals. The evolving landscape of quality and errors in clinical laboratories moved first from analytical errors to all errors performed within the laboratory walls, subsequently to errors in laboratory medicine (including errors in test requesting and result interpretation), and finally, to a focus on errors more frequently associated with adverse events (laboratory-associated errors). After decades in which clinical laboratories have focused on monitoring and improving internal indicators of analytical quality, efficiency and productivity, it is time to shift toward indicators of total quality, clinical effectiveness and patient outcomes. Copyright © 2016 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  2. An isotope-dilution standard GC/MS/MS method for steroid hormones in water

    USGS Publications Warehouse

    Foreman, William T.; Gray, James L.; ReVello, Rhiannon C.; Lindley, Chris E.; Losche, Scott A.

    2013-01-01

    An isotope-dilution quantification method was developed for 20 natural and synthetic steroid hormones and additional compounds in filtered and unfiltered water. Deuterium- or carbon-13-labeled isotope-dilution standards (IDSs) are added to the water sample, which is passed through an octadecylsilyl solid-phase extraction (SPE) disk. Following extract cleanup using Florisil SPE, method compounds are converted to trimethylsilyl derivatives and analyzed by gas chromatography with tandem mass spectrometry. Validation matrices included reagent water, wastewater-affected surface water, and primary (no biological treatment) and secondary wastewater effluent. Overall method recovery for all analytes in these matrices averaged 100%, with an overall relative standard deviation of 28%. Mean recoveries of the 20 individual analytes for spiked reagent-water samples prepared along with field samples analyzed in 2009–2010 ranged from 84 to 104%, with relative standard deviations of 6–36%. Detection levels estimated using ASTM International's D6091-07 procedure range from 0.4 to 4 ng/L for 17 analytes. Higher censoring levels of 100 ng/L for bisphenol A and 200 ng/L for cholesterol and 3-beta-coprostanol are used to prevent bias and false positives associated with the presence of these analytes in blanks. Absolute method recoveries of the IDSs provide sample-specific performance information and guide data reporting. Careful selection of labeled compounds for use as IDSs is important because both inexact IDS-analyte matches and deuterium label loss affect an IDS's ability to emulate analyte performance. Six IDS compounds initially tested and applied in this method exhibited deuterium loss and are not used in the final method.

  3. Utilizing global data to estimate analytical performance on the Sigma scale: A global comparative analysis of methods, instruments, and manufacturers through external quality assurance and proficiency testing programs.

    PubMed

    Westgard, Sten A

    2016-06-01

    To assess the analytical performance of instruments and methods through external quality assessment and proficiency testing data on the Sigma scale. A representative report from five different EQA/PT programs around the world (2 US, 1 Canadian, 1 UK, and 1 Australasian) was accessed. The instrument group standard deviations were used as surrogate estimates of instrument imprecision. Performance specifications from the US CLIA proficiency testing criteria were used to establish a common quality goal. Then Sigma-metrics were calculated to grade the analytical performance. Different methods have different Sigma-metrics for each analyte reviewed. Summary Sigma-metrics estimate the percentage of the chemistry analytes that are expected to perform above Five Sigma, which is where optimized QC design can be implemented. The range of performance varies from 37% to 88%, exhibiting significant differentiation between instruments and manufacturers. Median Sigmas for the different manufacturers in three analytes (albumin, glucose, sodium) showed significant differentiation. Chemistry tests are not commodities. Quality varies significantly from manufacturer to manufacturer, instrument to instrument, and method to method. The Sigma-assessments from multiple EQA/PT programs provide more insight into the performance of methods and instruments than any single program by itself. It is possible to produce a ranking of performance by manufacturer, instrument and individual method. Laboratories seeking optimal instrumentation would do well to consult this data as part of their decision-making process. To confirm that these assessments are stable and reliable, a longer term study should be conducted that examines more results over a longer time period. Copyright © 2016 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
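
    As a worked illustration of the Sigma-metric grading used above (a minimal sketch; the TEa, bias, and CV values are illustrative assumptions, not figures from the study):

        # Sigma-metric from method performance data: Sigma = (TEa - |bias|) / CV.
        def sigma_metric(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
            """All inputs are percentages on a common scale."""
            return (tea_pct - abs(bias_pct)) / cv_pct

        # Example: an analyte with a CLIA-style TEa of 10%, 2% bias, and 1.5% CV.
        print(round(sigma_metric(10.0, 2.0, 1.5), 2))  # 5.33 -> performs above Five Sigma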

  4. Advances in spectroscopic methods for quantifying soil carbon

    USGS Publications Warehouse

    Reeves, James B.; McCarty, Gregory W.; Calderon, Francisco; Hively, W. Dean

    2012-01-01

    The current gold standard for soil carbon (C) determination is elemental C analysis using dry combustion. However, this method requires expensive consumables, is limited by the number of samples that can be processed (~100/d), and is restricted to the determination of total carbon. With increased interest in soil C sequestration, faster methods of analysis are needed, and there is growing interest in methods based on diffuse reflectance spectroscopy in the visible, near-infrared or mid-infrared spectral ranges. These spectral methods can decrease analytical requirements and speed sample processing, be applied to large landscape areas using remote sensing imagery, and be used to predict multiple analytes simultaneously. However, the methods require localized calibrations to establish the relationship between spectral data and reference analytical data, and also have additional, specific problems. For example, remote sensing is capable of scanning entire watersheds for soil carbon content but is limited to the surface layer of tilled soils and may require difficult and extensive field sampling to obtain proper localized calibration reference values. The objective of this chapter is to discuss the present state of spectroscopic methods for determination of soil carbon.

  5. A Comparison of the OSHA Modified NIOSH Physical and Chemical Analytical Method (P and CAM) 304 and the DustTrak Photometric Aerosol Sampler for o-Chlorobenzylidene Malononitrile

    DTIC Science & Technology

    2013-04-02

    This study compared a direct-reading, non-specific, rapid photometric particle counting instrument (the TSI DustTrak) with the established OSHA modified NIOSH P&CAM 304 method to determine the correlation between the two methods for o-chlorobenzylidene malononitrile, the agent used in mask confidence training (27).

  6. Exploring the full natural variability of eruption sizes within probabilistic hazard assessment of tephra dispersal

    NASA Astrophysics Data System (ADS)

    Selva, Jacopo; Sandri, Laura; Costa, Antonio; Tonini, Roberto; Folch, Arnau; Macedonio, Giovanni

    2014-05-01

    The intrinsic uncertainty and variability associated with the size of the next eruption strongly affect short- to long-term tephra hazard assessment. Often, emergency plans are established accounting for the effects of one or a few representative scenarios (meant as a specific combination of eruptive size and vent position), selected with subjective criteria. Probabilistic hazard assessments (PHA), on the other hand, consistently explore the natural variability of such scenarios. PHA for tephra dispersal needs the definition of eruptive scenarios (usually by grouping possible eruption sizes and vent positions in classes) with associated probabilities, a meteorological dataset covering a representative time period, and a tephra dispersal model. PHA results are obtained by combining simulations covering different volcanological and meteorological conditions, each weighted by its specific probability of occurrence. However, volcanological parameters such as erupted mass, eruption column height and duration, bulk granulometry, and fraction of aggregates typically encompass a wide range of values. Because of this variability, single representative scenarios or size classes cannot be adequately defined using single values for the volcanological inputs. Here we propose a method that accounts for this within-size-class variability in the framework of Event Trees. The variability of each parameter is modeled with specific probability density functions, and meteorological and volcanological inputs are chosen using a stratified sampling method. This procedure avoids the bias introduced by selecting single representative scenarios and thus neglecting most of the intrinsic eruptive variability. When considering within-size-class variability, attention must be paid to appropriately weighting events falling within the same size class. While a uniform weight for all the events belonging to a size class is the most straightforward idea, it implies a strong dependence on the thresholds dividing classes: under this choice, the largest event of a size class has a much larger weight than the smallest event of the subsequent size class. To overcome this problem, we propose an innovative solution that smoothly links the weight variability within each size class to the variability among the size classes through a common power law while simultaneously respecting the probability of the different size classes conditional on the occurrence of an eruption. Embedding this procedure into the Bayesian Event Tree scheme enables tephra fall PHA, quantified through hazard curves and maps that provide readable results applicable in planning risk mitigation actions, together with the quantification of its epistemic uncertainties. As examples, we analyze long-term tephra fall PHA at Vesuvius and Campi Flegrei. We integrate two tephra dispersal models (the analytical HAZMAP and the numerical FALL3D) into BET_VH. The ECMWF reanalysis dataset is used to explore different meteorological conditions. The results clearly show that PHA accounting for the whole natural variability differs significantly from that based on representative scenarios, as is common practice in volcanic hazard assessment.
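
    A minimal sketch of the within-size-class weighting idea described above, under simplifying assumptions (log-uniform stratified sampling and per-class renormalization; the mass thresholds, class probabilities, and power-law exponent are illustrative):

        import numpy as np

        rng = np.random.default_rng(0)
        alpha = 1.0                          # illustrative power-law exponent
        bounds = [1e10, 1e11, 1e12, 1e13]    # erupted-mass class thresholds in kg (assumed)
        class_prob = [0.6, 0.3, 0.1]         # conditional probability of each size class (assumed)

        masses, weights = [], []
        for (lo, hi), p in zip(zip(bounds[:-1], bounds[1:]), class_prob):
            m = np.exp(rng.uniform(np.log(lo), np.log(hi), size=100))  # stratified, log-uniform
            w = m ** -alpha              # a common power law spans all classes
            w *= p / w.sum()             # each class keeps its conditional probability
            masses.append(m)
            weights.append(w)

        masses = np.concatenate(masses)
        weights = np.concatenate(weights)
        assert np.isclose(weights.sum(), 1.0)  # weights form a distribution over sampled events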

  7. Towards Seismic Tomography Based Upon Adjoint Methods

    NASA Astrophysics Data System (ADS)

    Tromp, J.; Liu, Q.; Tape, C.; Maggi, A.

    2006-12-01

    We outline the theory behind tomographic inversions based on 3D reference models, fully numerical 3D wave propagation, and adjoint methods. Our approach involves computing the Fréchet derivatives for tomographic inversions via the interaction between a forward wavefield, propagating from the source to the receivers, and an `adjoint' wavefield, propagating from the receivers back to the source. The forward wavefield is computed using a spectral-element method (SEM) and a heterogeneous wave-speed model, and stored as synthetic seismograms at particular receivers for which there are data. We specify an objective or misfit function that defines a measure of misfit between data and synthetics. For a given receiver, the differences between the data and the synthetics are time reversed and used as the source of the adjoint wavefield. For each earthquake, the interaction between the regular and adjoint wavefields is used to construct finite-frequency sensitivity kernels, which we call event kernels. These kernels may be thought of as weighted sums of measurement-specific banana-donut kernels, with weights determined by the measurements. The overall sensitivity is simply the sum of the event kernels, which defines the misfit kernel. The misfit kernel is multiplied by convenient orthonormal basis functions that are embedded in the SEM code, resulting in the gradient of the misfit function, i.e., the Fréchet derivatives. A conjugate gradient algorithm is used to iteratively improve the model while reducing the misfit function. Using 2D examples for Rayleigh wave phase-speed maps of southern California, we illustrate the construction of the gradient and the minimization algorithm, and consider various tomographic experiments, including source inversions, structural inversions, and joint source-structure inversions. We also illustrate the characteristics of these 3D finite-frequency kernels based upon adjoint simulations for a variety of global arrivals, e.g., Pdiff, P'P', and SKS, and we illustrate how the approach may be used to investigate body- and surface-wave anisotropy. In adjoint tomography, any time segment in which the data and synthetics match reasonably well is suitable for measurement, which implies that a much greater number of phases per seismogram can be used compared with classical tomography, in which the sensitivity of the measurements is determined analytically for specific arrivals, e.g., P. We use an automated picking algorithm based upon short-term/long-term averages and strict phase and amplitude anomaly criteria to determine arrivals and time windows suitable for measurement. For shallow global events the algorithm typically identifies of the order of 1000 windows suitable for measurement, whereas for a deep event the number can reach 4000. For southern California earthquakes the number of phases is of the order of 100 for a magnitude 4.0 event and up to 450 for a magnitude 5.0 event. We will show examples of event kernels for both global and regional earthquakes. These event kernels form the basis of adjoint tomography.
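
    For orientation, a least-squares waveform misfit of the kind described above and the corresponding time-reversed adjoint source can be written as follows (notation assumed here for illustration; sign and normalization conventions vary):

        \chi(\mathbf{m}) = \frac{1}{2} \sum_{r} \int_{0}^{T} \left\| \mathbf{s}(\mathbf{x}_r, t; \mathbf{m}) - \mathbf{d}(\mathbf{x}_r, t) \right\|^{2} \, dt

        \mathbf{f}^{\dagger}(\mathbf{x}, t) = \sum_{r} \left[ \mathbf{s}(\mathbf{x}_r, T - t; \mathbf{m}) - \mathbf{d}(\mathbf{x}_r, T - t) \right] \, \delta(\mathbf{x} - \mathbf{x}_r)

    where m is the model, s the synthetics, d the data, and x_r the receiver locations.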

  8. Development and validation of an UHPLC-MS/MS method for β2-agonists quantification in human urine and application to clinical samples.

    PubMed

    Bozzolino, Cristina; Leporati, Marta; Gani, Federica; Ferrero, Cinzia; Vincenti, Marco

    2018-02-20

    A fast analytical method for the simultaneous detection of 24 β2-agonists in human urine was developed and validated. The method covers the therapeutic drugs most commonly administered, but also potentially abused β2-agonists. The procedure is based on enzymatic deconjugation with β-glucuronidase followed by SPE clean-up using mixed-phase cartridges with both ion-exchange and lipophilic properties. Instrumental analysis conducted by UHPLC-MS/MS allowed high peak resolution and rapid chromatographic separation, with reduced time and costs. The method was fully validated according to ISO 17025:2005 principles. The following parameters were determined for each analyte: specificity, selectivity, linearity, limit of detection, limit of quantification, precision, accuracy, matrix effect, recovery and carry-over. The method was tested on real samples obtained from patients subjected to clinical treatment under chronic or acute therapy with either formoterol, indacaterol, salbutamol, or salmeterol. The drugs were administered using pressurized metered dose inhalers. All β2-agonists administered to the patients were detected in the real samples. The method proved adequate to accurately measure the concentration of these analytes in the real samples. The observed analytical data are discussed with reference to the administered dose and the duration of the therapy. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Establishment and application of event-specific polymerase chain reaction methods for two genetically modified soybean events, A2704-12 and A5547-127.

    PubMed

    Li, Xiang; Pan, Liangwen; Li, Junyi; Zhang, Qigang; Zhang, Shuya; Lv, Rong; Yang, Litao

    2011-12-28

    For implementation of the issued regulations and labeling policies for genetically modified organism (GMO) supervision, the polymerase chain reaction (PCR) method has been widely used due to its high specificity and sensitivity. In particular, use of the event-specific PCR method based on the flanking sequence of transgenes has become the primary trend. In this study, both qualitative and quantitative PCR methods were established on the basis of the 5' flanking sequence of transgenic soybean A2704-12 and the 3' flanking sequence of transgenic soybean A5547-127, respectively. In qualitative PCR assays, the limits of detection (LODs) were 10 copies of haploid soybean genomic DNA for both A2704-12 and A5547-127. In quantitative real-time PCR assays, the LODs were 5 copies of haploid soybean genomic DNA for both A2704-12 and A5547-127, and the limits of quantification (LOQs) were 10 copies for both. Low bias and acceptable SD and RSD values were also achieved in quantification of four blind samples using the developed real-time PCR assays. In addition, the developed PCR assays for the two transgenic soybean events were used for routine analysis of soybean samples imported to Shanghai in a 6 month period from October 2010 to March 2011. A total of 27 lots of soybean from the United States and Argentina were analyzed: 8 lots from the United States were found to contain the GM soybean A2704-12 event, and the GM contents were <1.5% in all eight analyzed lots. In contrast, no GM soybean A5547-127 content was found in any of the eight lots. These results demonstrated that the established event-specific qualitative and quantitative PCR methods can be used effectively in routine identification and quantification of GM soybeans A2704-12 and A5547-127 and their derived products.
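
    A minimal sketch of standard-curve quantification of GM content of the kind used in such real-time PCR assays (the curve parameters and Cq values are illustrative assumptions, not the authors' data):

        def copies_from_cq(cq, slope, intercept):
            """Interpolate copy number from a standard curve Cq = slope*log10(copies) + intercept."""
            return 10 ** ((cq - intercept) / slope)

        # Illustrative standard-curve parameters (slope ~ -3.32 corresponds to 100% PCR efficiency).
        event_copies = copies_from_cq(cq=29.1, slope=-3.32, intercept=40.0)  # event-specific assay
        taxon_copies = copies_from_cq(cq=22.4, slope=-3.35, intercept=39.5)  # endogenous-gene assay

        gm_content_pct = 100.0 * event_copies / taxon_copies
        print(f"GM content: {gm_content_pct:.2f}%")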

  10. An Integrated Approach for the Large-Scale Simulation of Sedimentary Basins to Study Seismic Wave Amplification

    NASA Astrophysics Data System (ADS)

    Poursartip, B.

    2015-12-01

    Seismic hazard assessment to predict the behavior of infrastructure subjected to earthquakes relies on ground motion numerical simulation, because analytical solutions for seismic waves are limited to only a few simple geometries. Recent advances in numerical methods and computer architectures make it ever more practical to reliably and quickly obtain the near-surface response to seismic events. The key motivation stems from the need to assess the performance of sensitive components of the civil infrastructure (nuclear power plants, bridges, lifelines, etc.) when subjected to realistic scenarios of seismic events. We discuss an integrated approach that deploys best-practice tools for simulating seismic events in arbitrarily heterogeneous formations, while also accounting for topography. Specifically, we describe an explicit forward wave solver based on a hybrid formulation that couples a single-field formulation for the computational domain with an unsplit mixed-field formulation for the Perfectly-Matched-Layers (PMLs and/or M-PMLs) used to limit the computational domain. Due to the material heterogeneity and the contrasting discretization needs it imposes, an adaptive time solver is adopted: a Runge-Kutta-Fehlberg time-marching scheme optimally adjusts the time step such that the local truncation error rests below a predefined tolerance. We use spectral elements for spatial discretization, and the Domain Reduction Method together with the double-couple method to allow for the efficient prescription of the input seismic motion. Of particular interest to this development is the study of the effects idealized topographic features have on the surface motion when compared against motion results based on a flat-surface assumption. We discuss the components of the integrated approach we followed, and report the results of parametric studies in two and three dimensions for various idealized topographic features, which show motion amplification that depends, as expected, on the relation between the topographic feature's characteristics and the dominant wavelength. Lastly, we report results involving three-dimensional simulations.

  11. Chemoselective synthesis and analysis of naturally occurring phosphorylated cysteine peptides

    PubMed Central

    Bertran-Vicente, Jordi; Penkert, Martin; Nieto-Garcia, Olaia; Jeckelmann, Jean-Marc; Schmieder, Peter; Krause, Eberhard; Hackenberger, Christian P. R.

    2016-01-01

    In contrast to protein O-phosphorylation, studying the function of the less frequent N- and S-phosphorylation events has lagged behind because they have chemical features that prevent their manipulation through standard synthetic and analytical methods. Here we report on the development of a chemoselective synthetic method to phosphorylate Cys side-chains in unprotected peptides. This approach makes use of a reaction between nucleophilic phosphites and electrophilic disulfides accessible by standard methods. We achieve the stereochemically defined phosphorylation of a Cys residue and verify the modification using electron-transfer/higher-energy collisional dissociation (EThcD) mass spectrometry. To demonstrate the use of the approach in resolving biological questions, we identify an endogenous Cys phosphorylation site in IICBGlc, which is known to be involved in carbohydrate uptake from the bacterial phosphotransferase system (PTS). This new chemical and analytical approach finally allows further investigation of the functions and significance of Cys phosphorylation in a wide range of crucial cellular processes. PMID:27586301

  12. The Use of Categorized Time-Trend Reporting of Radiation Oncology Incidents: A Proactive Analytical Approach to Improving Quality and Safety Over Time

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arnold, Anthony, E-mail: anthony.arnold@sesiahs.health.nsw.gov.a; Delaney, Geoff P.; Cassapi, Lynette

    Purpose: Radiotherapy is a common treatment for cancer patients. Although the incidence of error is low, errors can be severe or affect significant numbers of patients. In addition, errors will often not manifest until long periods after treatment. This study describes the development of an incident reporting tool that allows categorical analysis and time-trend reporting, covering the first 3 years of use. Methods and Materials: A radiotherapy-specific incident analysis system was established. Staff members were encouraged to report actual errors and near-miss events detected at the prescription, simulation, planning, or treatment phases of radiotherapy delivery. Trend reporting was reviewed monthly. Results: Reports were analyzed for the first 3 years of operation (May 2004-2007). A total of 688 reports was received during the study period. The actual error rate was 0.2% per treatment episode. During the study period, the actual error rate reduced significantly from 1% per year to 0.3% per year (p < 0.001), as did the total event report rate (p < 0.0001). There were 3.5 times as many near misses reported as actual errors. Conclusions: This system has allowed real-time analysis of events within a radiation oncology department and led to a reduced error rate through a focus on learning and prevention from the near-miss reports. Plans are underway to develop this reporting tool for Australia and New Zealand.

  13. Smart phone: a popular device supports amylase activity assay in fisheries research.

    PubMed

    Thongprajukaew, Karun; Choodum, Aree; Sa-E, Barunee; Hayee, Ummah

    2014-11-15

    Colourimetric determinations of amylase activity were developed based on a standard dinitrosalicylic acid (DNS) staining method, using maltose as the analyte. Intensities and absorbances of red, green and blue (RGB) were obtained with iPhone imaging and Adobe Photoshop image analysis. The correlation between green intensity and analyte concentration was highly significant, and the analytical accuracy of the developed method was excellent. The common iPhone has sufficient imaging ability for accurate quantification of maltose concentrations. Detection limits, sensitivity and linearity were comparable to a spectrophotometric method, with better inter-day precision. In quantifying amylase specific activity from a commercial source (P>0.02) and fish samples (P>0.05), differences compared with spectrophotometric measurements were not significant. We have demonstrated that iPhone imaging with image analysis in Adobe Photoshop has potential for field and laboratory studies of amylase. Copyright © 2014 Elsevier Ltd. All rights reserved.
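
    A minimal sketch of this kind of RGB calibration, using the Pillow library in place of the Photoshop workflow (file names, region coordinates, and standard concentrations are hypothetical):

        import numpy as np
        from PIL import Image

        def mean_green(path, box):
            """Mean green-channel intensity of a region of interest (left, top, right, bottom)."""
            rgb = np.asarray(Image.open(path).convert("RGB").crop(box), dtype=float)
            return rgb[..., 1].mean()

        # Calibration: maltose standards (mg/mL, assumed) vs. measured green intensity.
        conc = np.array([0.5, 1.0, 2.0, 4.0])
        green = np.array([mean_green(f"standard_{c}.jpg", (100, 100, 200, 200)) for c in conc])
        slope, intercept = np.polyfit(green, conc, 1)  # inverse calibration: intensity -> concentration

        sample = mean_green("sample.jpg", (100, 100, 200, 200))
        print(f"Estimated maltose: {slope * sample + intercept:.2f} mg/mL")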

  14. Dynamical Behaviors between the PM10 and the meteorological factor using the detrended cross-correlation analysis method

    NASA Astrophysics Data System (ADS)

    Kim, Kyungsik; Lee, Dong-In

    2013-04-01

    There is considerable interest in cross-correlations in collective modes of real data from atmospheric geophysics, seismology, finance, physiology, genomics, and nanodevices. If two systems interact mutually, that interaction gives rise to collective modes. This phenomenon can be analyzed using the cross-correlation of traditional methods, random matrix theory, and the detrended cross-correlation analysis method. The detrended cross-correlation analysis method has previously been used to analyze several models, such as autoregressive fractionally integrated moving average processes, stock prices and their trading volumes, and taxi accidents. Particulate matter is composed of organic and inorganic mixtures such as natural sea salt, soil particles, vehicle exhaust, construction dust, and soot. PM10 denotes particles with an aerodynamic diameter of less than 10 microns, small enough to enter the human respiratory system. The PM10 concentration affects climate change by unbalancing the global radiative equilibrium: a direct effect blocks plant stomata and cuts off solar radiation, while an indirect effect changes the optical properties, cloudiness, and lifetime of clouds. Various factors contribute to the degree of the PM10 concentration, notably land-use types, surface vegetation coverage, and meteorological factors. In this study, we analyze and simulate cross-correlations across time scales between the PM10 concentration and meteorological factors (temperature, wind speed and humidity) using the detrended cross-correlation analysis method, through the removal of specific trends, at eight cities in the Korean peninsula. We divide the time series data into Asian dust events and non-Asian dust events to analyze how meteorological factors change the fluctuation of the PM10 concentration during Asian dust events. In particular, our results are compared to analytic findings from the international literature. This work was supported by Center for the ASER (CATER 2012-6110) and by the NRFK through a grant provided by the KMEST (No. K1663000201107900).
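
    A compact sketch of the detrended cross-correlation analysis (DCCA) fluctuation computation with linear detrending (the window size and the synthetic stand-in series are illustrative):

        import numpy as np

        def dcca(x, y, n):
            """Detrended cross-correlation fluctuation F^2(n), linear detrending: a sketch."""
            X, Y = np.cumsum(x - np.mean(x)), np.cumsum(y - np.mean(y))  # integrated profiles
            cov = []
            for i in range(len(X) - n):
                t = np.arange(n + 1)
                seg_x, seg_y = X[i:i + n + 1], Y[i:i + n + 1]
                # Residuals from local linear trends in each overlapping box.
                rx = seg_x - np.polyval(np.polyfit(t, seg_x, 1), t)
                ry = seg_y - np.polyval(np.polyfit(t, seg_y, 1), t)
                cov.append(np.mean(rx * ry))
            return np.mean(cov)

        # Synthetic correlated series standing in for PM10 and a meteorological record.
        rng = np.random.default_rng(1)
        z = rng.standard_normal(2000)
        pm10 = z + 0.5 * rng.standard_normal(2000)
        temp = z + 0.5 * rng.standard_normal(2000)
        print(dcca(pm10, temp, n=16))  # positive value indicates positive cross-correlation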

  15. Participant Interaction in Asynchronous Learning Environments: Evaluating Interaction Analysis Methods

    ERIC Educational Resources Information Center

    Blanchette, Judith

    2012-01-01

    The purpose of this empirical study was to determine the extent to which three different objective analytical methods--sequence analysis, surface cohesion analysis, and lexical cohesion analysis--can most accurately identify specific characteristics of online interaction. Statistically significant differences were found in all points of…

  16. Querying and Extracting Timeline Information from Road Traffic Sensor Data

    PubMed Central

    Imawan, Ardi; Indikawati, Fitri Indra; Kwon, Joonho; Rao, Praveen

    2016-01-01

    The escalation of traffic congestion in urban cities has urged many countries to use intelligent transportation system (ITS) centers to collect historical traffic sensor data from multiple heterogeneous sources. By analyzing historical traffic data, we can obtain valuable insights into traffic behavior. Many existing applications have been proposed with limited analysis results because of the inability to cope with several types of analytical queries. In this paper, we propose the QET (querying and extracting timeline information) system—a novel analytical query processing method based on a timeline model for road traffic sensor data. To address query performance, we build a TQ-index (timeline query-index) that exploits spatio-temporal features of timeline modeling. We also propose an intuitive timeline visualization method to display congestion events obtained from specified query parameters. In addition, we demonstrate the benefit of our system through a performance evaluation using a Busan ITS dataset and a Seattle freeway dataset. PMID:27563900

  17. Event-specific qualitative and quantitative PCR detection of the GMO carnation (Dianthus caryophyllus) variety Moonlite based upon the 5'-transgene integration sequence.

    PubMed

    Li, P; Jia, J W; Jiang, L X; Zhu, H; Bai, L; Wang, J B; Tang, X M; Pan, A H

    2012-04-27

    To ensure the implementation of genetically modified organism (GMO)-labeling regulations, an event-specific detection method was developed based on the junction sequence of an exogenous integrant in the transgenic carnation variety Moonlite. The 5'-transgene integration sequence was isolated by thermal asymmetric interlaced PCR. Based upon the 5'-transgene integration sequence, event-specific primers and a TaqMan probe were designed to amplify fragments spanning the exogenous DNA and the carnation genomic DNA. Qualitative and quantitative PCR assays were developed employing the designed primers and probe. The detection limit of the qualitative PCR assay was 0.05% for Moonlite in 100 ng total carnation genomic DNA, corresponding to about 79 copies of the carnation haploid genome; the limits of detection and quantification of the quantitative PCR assay were estimated to be 38 and 190 copies of haploid carnation genomic DNA, respectively. Carnation samples with different contents of genetically modified components were quantified, and the biases between the observed and true values of three samples were lower than the acceptance criterion (<25%) of the GMO detection method. These results indicated that these event-specific methods would be useful for the identification and quantification of the GMO carnation Moonlite.
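
    The copy-number arithmetic behind the quoted qualitative LOD can be reproduced as follows (a sketch assuming a carnation haploid genome mass of about 0.63 pg; the exact 1C value is an assumption here):

        # Haploid genome copies in the GM fraction of a DNA sample.
        GENOME_PG = 0.63                 # assumed carnation 1C genome mass, pg

        total_dna_ng = 100.0
        lod_fraction = 0.0005            # 0.05% GM content
        gm_dna_pg = total_dna_ng * 1e3 * lod_fraction  # ng -> pg
        copies = gm_dna_pg / GENOME_PG
        print(round(copies))             # ~79 haploid genome copies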

  18. Event-specific qualitative and quantitative detection of five genetically modified rice events using a single standard reference molecule.

    PubMed

    Kim, Jae-Hwan; Park, Saet-Byul; Roh, Hyo-Jeong; Shin, Min-Ki; Moon, Gui-Im; Hong, Jin-Hwan; Kim, Hae-Yeong

    2017-07-01

    One novel standard reference plasmid, namely pUC-RICE5, was constructed as a positive control and calibrator for event-specific qualitative and quantitative detection of genetically modified (GM) rice (Bt63, Kemingdao1, Kefeng6, Kefeng8, and LLRice62). pUC-RICE5 contained fragments of a rice-specific endogenous reference gene (sucrose phosphate synthase) as well as the five GM rice events. An existing qualitative PCR assay approach was modified using pUC-RICE5 to create a quantitative method with limits of detection correlating to approximately 1-10 copies of rice haploid genomes. In this quantitative PCR assay, the square regression coefficients ranged from 0.993 to 1.000. The standard deviation and relative standard deviation values for repeatability ranged from 0.02 to 0.22 and 0.10% to 0.67%, respectively. The Ministry of Food and Drug Safety (Korea) validated the method and the results suggest it could be used routinely to identify five GM rice events. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Development and Interlaboratory Validation of a Simple Screening Method for Genetically Modified Maize Using a ΔΔCq-Based Multiplex Real-Time PCR Assay.

    PubMed

    Noguchi, Akio; Nakamura, Kosuke; Sakata, Kozue; Sato-Fukuda, Nozomi; Ishigaki, Takumi; Mano, Junichi; Takabatake, Reona; Kitta, Kazumi; Teshima, Reiko; Kondo, Kazunari; Nishimaki-Mogami, Tomoko

    2016-04-19

    A number of genetically modified (GM) maize events have been developed and approved worldwide for commercial cultivation. A screening method is needed to monitor GM maize approved for commercialization in countries that mandate the labeling of foods containing a specified threshold level of GM crops. In Japan, a screening method has been implemented to monitor approved GM maize since 2001. However, the screening method currently used in Japan is time-consuming and requires generation of a calibration curve and an experimental conversion factor (Cf) value. We developed a simple screening method that avoids the need for a calibration curve and Cf value. In this method, ΔCq values between the target sequences and the endogenous gene are calculated using multiplex real-time PCR, and the ΔΔCq value between the analytical and control samples is used as the criterion for determining analytical samples in which the GM organism content is below the threshold level for labeling of GM crops. An interlaboratory study indicated that the method is applicable independently with at least the two models of PCR instruments used in this study.
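
    A minimal sketch of the ΔΔCq decision logic described above (the Cq values and the sign convention of the cutoff are illustrative, not the validated criteria):

        def delta_cq(cq_target, cq_endogenous):
            return cq_target - cq_endogenous

        dcq_sample = delta_cq(cq_target=28.7, cq_endogenous=21.2)   # analytical sample
        dcq_control = delta_cq(cq_target=26.9, cq_endogenous=21.0)  # control at the labeling threshold

        ddcq = dcq_sample - dcq_control
        # Higher Cq means less target, so positive ddcq implies less GM content than the control.
        print("below threshold" if ddcq > 0 else "at or above threshold", round(ddcq, 2))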

  20. Immobilization of Fab' fragments onto substrate surfaces: A survey of methods and applications.

    PubMed

    Crivianu-Gaita, Victor; Thompson, Michael

    2015-08-15

    Antibody immobilization onto surfaces has widespread applications in many different fields. It is desirable to bind antibodies such that their fragment-antigen-binding (Fab) units are oriented away from the surface in order to maximize analyte binding. The immobilization of only Fab' fragments yields benefits over the more traditional whole-antibody immobilization technique. Bound Fab' fragments display higher surface densities, yielding a higher binding capacity for the analyte. The nucleophilic thiol of the Fab' fragments allows specific orientations to be achieved. For biosensors, this means a higher sensitivity and lower detection limit for a target analyte. The last thirty years have shown tremendous progress in the immobilization of Fab' fragments onto gold, Si-based, polysaccharide-based, plastic-based, magnetic, and inorganic surfaces. This review shows the current scope of Fab' immobilization techniques available and illustrates methods employed to minimize non-specific adsorption of undesirables. Furthermore, a variety of examples are given to show the versatility of immobilized Fab' fragments in different applications, and future directions of the field are addressed, especially regarding biosensors. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. Magnetic ionic liquids in analytical chemistry: A review.

    PubMed

    Clark, Kevin D; Nacham, Omprakash; Purslow, Jeffrey A; Pierson, Stephen A; Anderson, Jared L

    2016-08-31

    Magnetic ionic liquids (MILs) have recently generated a cascade of innovative applications in numerous areas of analytical chemistry. By incorporating a paramagnetic component within the cation or anion, MILs exhibit a strong response toward external magnetic fields. Careful design of the MIL structure has yielded magnetoactive compounds with unique physicochemical properties including high magnetic moments, enhanced hydrophobicity, and the ability to solvate a broad range of molecules. The structural tunability and paramagnetic properties of MILs have enabled magnet-based technologies that can easily be added to the analytical method workflow, complement needed extraction requirements, or target specific analytes. This review highlights the application of MILs in analytical chemistry and examines the important structural features of MILs that largely influence their physicochemical and magnetic properties. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Metabolomics strategy for the mapping of volatile exometabolome from Saccharomyces spp. widely used in the food industry based on comprehensive two-dimensional gas chromatography.

    PubMed

    Martins, Cátia; Brandão, Tiago; Almeida, Adelaide; Rocha, Sílvia M

    2017-05-01

    Saccharomyces spp. are widely used in the food and beverage industries. Their excreted cellular metabolites are important for the general quality of products and can contribute to product differentiation. This exploratory study presents a metabolomics strategy for the comprehensive mapping of cellular metabolites of two yeast species, Saccharomyces cerevisiae and S. pastorianus (both collected in an industrial context), through a multidimensional chromatography platform. Solid-phase microextraction was used as the sample preparation method. The yeast viability, a specific technological quality parameter, was also assessed. This untargeted analysis allowed the putative identification of 525 analytes, distributed over 14 chemical families, the origin of which may be explained through the pathways network associated with yeast metabolism. The expression of the different metabolic pathways was similar for both species, a pattern that seems to be yeast-genus dependent. Nevertheless, these species showed different growth rates, which led to statistically different metabolite contents. This was the first in-depth approach to characterize the headspace content of S. cerevisiae and S. pastorianus cultures. The combination of a sample preparation method capable of providing released volatile metabolites directly from the yeast culture headspace with comprehensive two-dimensional gas chromatography was successful in uncovering a specific metabolomic pattern for each species. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Inverse scattering transform analysis of rogue waves using local periodization procedure

    NASA Astrophysics Data System (ADS)

    Randoux, Stéphane; Suret, Pierre; El, Gennady

    2016-07-01

    The nonlinear Schrödinger equation (NLSE) stands out as the dispersive nonlinear partial differential equation that plays a prominent role in the modeling and understanding of the wave phenomena relevant to many fields of nonlinear physics. The question of random input problems in the one-dimensional and integrable NLSE enters within the framework of integrable turbulence, and the specific question of the formation of rogue waves (RWs) has been recently extensively studied in this context. The determination of exact analytic solutions of the focusing 1D-NLSE prototyping RW events of statistical relevance is now considered as the problem of central importance. Here we address this question from the perspective of the inverse scattering transform (IST) method that relies on the integrable nature of the wave equation. We develop a conceptually new approach to the RW classification in which appropriate, locally coherent structures are specifically isolated from a globally incoherent wave train to be subsequently analyzed by implementing a numerical IST procedure relying on a spatial periodization of the object under consideration. Using this approach we extend the existing classifications of the prototypes of RWs from standard breathers and their collisions to more general nonlinear modes characterized by their nonlinear spectra.
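
    For reference, the focusing one-dimensional NLSE discussed here is commonly written in normalized form as (normalization conventions vary):

        i \frac{\partial \psi}{\partial t} + \frac{1}{2} \frac{\partial^{2} \psi}{\partial x^{2}} + |\psi|^{2} \psi = 0

    where ψ is the complex wave envelope.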

  4. Inverse scattering transform analysis of rogue waves using local periodization procedure

    PubMed Central

    Randoux, Stéphane; Suret, Pierre; El, Gennady

    2016-01-01

    The nonlinear Schrödinger equation (NLSE) stands out as the dispersive nonlinear partial differential equation that plays a prominent role in the modeling and understanding of the wave phenomena relevant to many fields of nonlinear physics. The question of random input problems in the one-dimensional and integrable NLSE enters within the framework of integrable turbulence, and the specific question of the formation of rogue waves (RWs) has been recently extensively studied in this context. The determination of exact analytic solutions of the focusing 1D-NLSE prototyping RW events of statistical relevance is now considered as the problem of central importance. Here we address this question from the perspective of the inverse scattering transform (IST) method that relies on the integrable nature of the wave equation. We develop a conceptually new approach to the RW classification in which appropriate, locally coherent structures are specifically isolated from a globally incoherent wave train to be subsequently analyzed by implementing a numerical IST procedure relying on a spatial periodization of the object under consideration. Using this approach we extend the existing classifications of the prototypes of RWs from standard breathers and their collisions to more general nonlinear modes characterized by their nonlinear spectra. PMID:27385164

  5. International ring trial for the validation of an event-specific Golden Rice 2 quantitative real-time polymerase chain reaction method.

    PubMed

    Jacchia, Sara; Nardini, Elena; Bassani, Niccolò; Savini, Christian; Shim, Jung-Hyun; Trijatmiko, Kurniawan; Kreysa, Joachim; Mazzara, Marco

    2015-05-27

    This article describes the international validation of the quantitative real-time polymerase chain reaction (PCR) detection method for Golden Rice 2. The method consists of a taxon-specific assay amplifying a fragment of the rice Phospholipase D α2 gene, and an event-specific assay designed on the 3' junction between the transgenic insert and plant DNA. We validated the two assays independently, with absolute quantification, and in combination, with relative quantification, on DNA samples prepared in haploid genome equivalents. We assessed the trueness, precision, efficiency, and linearity of the two assays, and the results demonstrate that both the assays assessed independently and the entire method fulfill European and international requirements for genetically modified organism (GMO) testing methods, within the dynamic range tested. The homogeneity of the results of the collaborative trial between Europe and Asia is a good indicator of the robustness of the method.

  6. Reference Intervals of Common Clinical Chemistry Analytes for Adults in Hong Kong.

    PubMed

    Lo, Y C; Armbruster, David A

    2012-04-01

    Defining reference intervals is a major challenge because of the difficulty in recruiting volunteers to participate and testing samples from a significant number of healthy reference individuals. Historical literature citation intervals are often suboptimal because they may be based on obsolete methods and/or only a small number of poorly defined reference samples. Blood donors in Hong Kong gave permission for additional blood to be collected for reference interval testing. The samples were tested for twenty-five routine analytes on the Abbott ARCHITECT clinical chemistry system. Results were analyzed using the Rhoads EP Evaluator software program, which is based on the CLSI/IFCC C28-A guideline and defines the reference interval as the 95% central range. Method-specific reference intervals were established for twenty-five common clinical chemistry analytes for a Chinese ethnic population. The intervals were defined for each gender separately and for the genders combined. Gender-specific or combined-gender intervals were adopted as appropriate for each analyte. A large number of healthy, apparently normal blood donors from a local ethnic population were tested to provide current reference intervals for a new clinical chemistry system. Intervals were determined following an accepted international guideline. Laboratories using the same or similar methodologies may adopt these intervals if validated and deemed suitable for their patient population. Laboratories using different methodologies may be able to successfully adapt the intervals for their facilities using the reference-interval transference technique based on a method comparison study.
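
    A minimal sketch of the nonparametric 95% central range underlying such reference intervals (simulated stand-in data; the CLSI/IFCC guideline adds requirements, e.g. on sample size and outlier handling, that are omitted here):

        import numpy as np

        # Simulated analyte values from healthy reference individuals (illustrative).
        rng = np.random.default_rng(7)
        results = rng.normal(loc=5.1, scale=0.4, size=240)

        # 95% central range: the 2.5th and 97.5th percentiles.
        lower, upper = np.percentile(results, [2.5, 97.5])
        print(f"Reference interval: {lower:.2f} - {upper:.2f}")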

  7. Applications of Aptamers as Sensors

    NASA Astrophysics Data System (ADS)

    Cho, Eun Jeong; Lee, Joo-Woon; Ellington, Andrew D.

    2009-07-01

    Aptamers are ligand-binding nucleic acids whose affinities and selectivities can rival those of antibodies. They have been adapted to analytical applications not only as alternatives to antibodies, but as unique reagents in their own right. In particular, aptamers can be readily site-specifically modified during chemical or enzymatic synthesis to incorporate particular reporters, linkers, or other moieties. Also, aptamer secondary structures can be engineered to undergo analyte-dependent conformational changes, which, in concert with the ability to specifically place chemical agents, opens up a wealth of possible signal transduction schemas, irrespective of whether the detection modality is optical, electrochemical, or mass based. Finally, because aptamers are nucleic acids, they are readily adapted to sequence- (and hence signal-) amplification methods. However, application of aptamers without a basic knowledge of their biochemistry or technical requirements can cause serious analytical difficulties.

  8. Microscale Concentration Measurements Using Laser Light Scattering Methods

    NASA Technical Reports Server (NTRS)

    Niederhaus, Charles; Miller, Fletcher

    2004-01-01

    The development of lab-on-a-chip devices for microscale biochemical assays has led to the need for microscale concentration measurements of specific analytes. While fluorescence methods are the current choice, they require developing fluorophore-tagged conjugates for each analyte of interest. In addition, fluorescent imaging is also a volume-based method, and can be limiting as smaller detection regions are required.

  9. A robust and versatile signal-on fluorescence sensing strategy based on SYBR Green I dye and graphene oxide

    PubMed Central

    Qiu, Huazhang; Wu, Namei; Zheng, Yanjie; Chen, Min; Weng, Shaohuang; Chen, Yuanzhong; Lin, Xinhua

    2015-01-01

    A robust and versatile signal-on fluorescence sensing strategy was developed to provide label-free detection of various target analytes. The strategy used SYBR Green I dye and graphene oxide as signal reporter and signal-to-background ratio enhancer, respectively. Multidrug resistance protein 1 (MDR1) gene and mercury ion (Hg2+) were selected as target analytes to investigate the generality of the method. The linear relationship and specificity of the detections showed that the sensitive and selective analyses of target analytes could be achieved by the proposed strategy with low detection limits of 0.5 and 2.2 nM for MDR1 gene and Hg2+, respectively. Moreover, the strategy was used to detect real samples. Analytical results of MDR1 gene in the serum indicated that the developed method is a promising alternative approach for real applications in complex systems. Furthermore, the recovery of the proposed method for Hg2+ detection was acceptable. Thus, the developed label-free signal-on fluorescence sensing strategy exhibited excellent universality, sensitivity, and handling convenience. PMID:25565810

  10. 21 CFR 809.30 - Restrictions on the sale, distribution and use of analyte specific reagents.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... analyte specific reagents. 809.30 Section 809.30 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT... Requirements for Manufacturers and Producers § 809.30 Restrictions on the sale, distribution and use of analyte specific reagents. (a) Analyte specific reagents (ASR's) (§ 864.4020 of this chapter) are restricted...

  11. 21 CFR 809.30 - Restrictions on the sale, distribution and use of analyte specific reagents.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... analyte specific reagents. 809.30 Section 809.30 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT... Requirements for Manufacturers and Producers § 809.30 Restrictions on the sale, distribution and use of analyte specific reagents. (a) Analyte specific reagents (ASR's) (§ 864.4020 of this chapter) are restricted...

  12. 21 CFR 809.30 - Restrictions on the sale, distribution and use of analyte specific reagents.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... analyte specific reagents. 809.30 Section 809.30 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT... Requirements for Manufacturers and Producers § 809.30 Restrictions on the sale, distribution and use of analyte specific reagents. (a) Analyte specific reagents (ASR's) (§ 864.4020 of this chapter) are restricted...

  13. Quality control for federal clean water act and safe drinking water act regulatory compliance.

    PubMed

    Askew, Ed

    2013-01-01

    QC sample results are required in order to have confidence in the results from analytical tests. Some of the AOAC water methods include specific QC procedures, frequencies, and acceptance criteria. These are considered to be the minimum controls needed to perform the method successfully. Some regulatory programs, such as those in 40 CFR Part 136.7, require additional QC or have alternative acceptance limits. Essential QC measures include method calibration, reagent standardization, assessment of each analyst's capabilities, analysis of blind check samples, determination of the method's sensitivity (method detection level or quantification limit), and daily evaluation of bias, precision, and the presence of laboratory contamination or other analytical interference. The details of these procedures, their performance frequency, and expected ranges of results are set out in this manuscript. The specific regulatory requirements of 40 CFR Part 136.7 for the Clean Water Act, the laboratory certification requirements of 40 CFR Part 141 for the Safe Drinking Water Act, and the ISO 17025 accreditation requirements under The NELAC Institute are listed.
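
    One of the QC measures listed above, the method detection limit, is classically estimated from replicate low-level spikes in the style of 40 CFR Part 136 Appendix B; a sketch with illustrative replicate values:

        import statistics

        # MDL = t * s, where s is the standard deviation of replicate spiked results.
        replicates = [1.02, 0.96, 1.10, 0.98, 1.05, 0.93, 1.07]  # ug/L, illustrative

        s = statistics.stdev(replicates)  # sample standard deviation
        t_99 = 3.143                      # one-sided Student's t at 99%, n-1 = 6 df
        print(f"MDL = {t_99 * s:.3f} ug/L")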

  14. Simultaneous determination of glucose, triglycerides, urea, cholesterol, albumin and total protein in human plasma by Fourier transform infrared spectroscopy: direct clinical biochemistry without reagents.

    PubMed

    Jessen, Torben E; Höskuldsson, Agnar T; Bjerrum, Poul J; Verder, Henrik; Sørensen, Lars; Bratholm, Palle S; Christensen, Bo; Jensen, Lene S; Jensen, Maria A B

    2014-09-01

    Direct measurement of chemical constituents in complex biologic matrices without the use of analyte-specific reagents could be a step toward the simplification of clinical biochemistry. Problems related to reagents, such as production errors, improper handling, and lot-to-lot variations, would be eliminated, as would errors occurring during assay execution. We describe and validate a reagent-free method for the direct measurement of six analytes in human plasma based on Fourier-transform infrared spectroscopy (FTIR). Blood plasma is analyzed without any sample preparation. The FTIR spectrum of the raw plasma is recorded in a sampling cuvette specially designed for the measurement of aqueous solutions. For each analyte, a mathematical calibration process is performed by a stepwise selection of wavelengths giving the optimal least-squares correlation between the measured FTIR signal and the analyte concentration measured by conventional clinical reference methods. The developed calibration algorithms are subsequently evaluated for their capability to predict the concentration of the six analytes in blinded patient samples. The correlations between the six FTIR methods and the corresponding reference methods were 0.87
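
    A simplified greedy variant of stepwise wavelength selection (not the authors' exact algorithm; the synthetic spectra and recovered indices are illustrative):

        import numpy as np

        # X is an (n_samples, n_wavelengths) absorbance matrix, y the reference concentrations.
        def forward_select(X, y, k):
            """Greedily add the wavelength that most reduces the least-squares error."""
            chosen = []
            for _ in range(k):
                best, best_err = None, np.inf
                for j in range(X.shape[1]):
                    if j in chosen:
                        continue
                    A = np.column_stack([X[:, chosen + [j]], np.ones(len(y))])  # with intercept
                    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
                    err = np.sum((A @ coef - y) ** 2)
                    if err < best_err:
                        best, best_err = j, err
                chosen.append(best)
            return chosen

        rng = np.random.default_rng(3)
        X = rng.standard_normal((40, 200))  # synthetic spectra
        y = 2.0 * X[:, 17] - 1.5 * X[:, 64] + 0.1 * rng.standard_normal(40)
        print(forward_select(X, y, k=2))    # recovers the informative wavelengths [17, 64]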

  15. Generalisability in economic evaluation studies in healthcare: a review and case studies.

    PubMed

    Sculpher, M J; Pang, F S; Manca, A; Drummond, M F; Golder, S; Urdahl, H; Davies, L M; Eastwood, A

    2004-12-01

    To review, and to develop further, the methods used to assess and to increase the generalisability of economic evaluation studies. Electronic databases. Methodological studies relating to economic evaluation in healthcare were searched. This included electronic searches of a range of databases, including PREMEDLINE, MEDLINE, EMBASE and EconLit, and manual searches of key journals. The case study of a decision analytic model involved highlighting specific features of previously published economic studies related to generalisability and location-related variability. The case study involving cost-effectiveness analyses was based on the secondary analysis of three economic studies using data from randomised trials.

    The factor most frequently cited as generating variability in economic results between locations was the unit costs associated with particular resources. In the context of studies based on the analysis of patient-level data, regression analysis has been advocated as a means of examining variability in economic results across locations. These methods have generally accepted that some components of resource use and outcomes are exchangeable across locations. Recent studies have also explored, in cost-effectiveness analysis, the use of tests of heterogeneity similar to those used in the clinical evaluation of trials. The decision analytic model has been the main means by which cost-effectiveness has been adapted from trial to non-trial locations. Most models have focused on changes to the cost side of the analysis, but it is clear that the effectiveness side may also need to be adapted between locations. There have been weaknesses in some aspects of the reporting in applied cost-effectiveness studies; these may limit decision-makers' ability to judge the relevance of a study to their specific situations.

    The case study demonstrated the potential value of multilevel modelling (MLM). Where clustering exists by location (e.g. centre or country), MLM can facilitate correct estimates of the uncertainty in cost-effectiveness results, and also a means of estimating location-specific cost-effectiveness. The review of applied economic studies based on decision analytic models showed that few studies were explicit about their target decision-maker(s)/jurisdictions. The studies in the review generally made more effort to ensure that their cost inputs were specific to their target jurisdiction than their effectiveness parameters. Standard sensitivity analysis was the main way of dealing with uncertainty in the models, although few studies looked explicitly at variability between locations. The modelling case study illustrated how effectiveness and cost data can be made location-specific. In particular, on the effectiveness side, the example showed the separation of location-specific baseline events from pooled estimates of relative treatment effect, where the latter are assumed exchangeable across locations.

    A large number of factors mentioned in the literature might be expected to generate variation in the cost-effectiveness of healthcare interventions across locations. Several papers have demonstrated differences in the volume and cost of resource use between locations, but few studies have looked at variability in outcomes. In applied trial-based cost-effectiveness studies, few studies provide sufficient evidence for decision-makers to establish the relevance of the study or to adjust its results to their location of interest. Very few studies formally used statistical methods to assess the variability in results between locations. In applied economic studies based on decision models, most studies either stated their target decision-maker/jurisdiction or provided sufficient information from which this could be inferred. There was a greater tendency to ensure that cost inputs were specific to the target jurisdiction than clinical parameters. Methods to assess generalisability and variability in economic evaluation studies have been discussed extensively in the literature relating to both trial-based and modelling studies. Regression-based methods are likely to offer a systematic approach to quantifying variability in patient-level data. In particular, MLM has the potential to facilitate estimates of cost-effectiveness that both reflect the variation in costs and outcomes between locations and enable the consistency of cost-effectiveness estimates between locations to be assessed directly. Decision analytic models will retain an important role in adapting the results of cost-effectiveness studies between locations.

    Recommendations for further research include: the development of methods of evidence synthesis which model the exchangeability of data across locations and allow for the additional uncertainty in this process; assessment of alternative approaches to specifying multilevel models for the analysis of cost-effectiveness data alongside multilocation randomised trials; identification of a range of appropriate covariates relating to locations (e.g. hospitals) in multilevel models; and further assessment of the role of econometric methods (e.g. selection models) for cost-effectiveness analysis alongside observational datasets and for increasing the generalisability of randomised trials.
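
    A minimal sketch of the random-intercept multilevel model discussed above, with costs clustering by centre (variable names and simulated data are illustrative; the actual analyses model costs and effects jointly):

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        # Simulated multicentre trial data: costs cluster by centre.
        rng = np.random.default_rng(11)
        n_centres, n_per = 10, 30
        centre = np.repeat(np.arange(n_centres), n_per)
        treat = np.tile([0, 1], n_centres * n_per // 2)
        cost = (1000 + 150 * treat
                + rng.normal(0, 80, n_centres)[centre]      # centre-level variation
                + rng.normal(0, 120, n_centres * n_per))    # patient-level noise

        df = pd.DataFrame({"cost": cost, "treat": treat, "centre": centre})

        # Random-intercept model: fixed treatment effect, centres vary in baseline cost.
        fit = smf.mixedlm("cost ~ treat", df, groups=df["centre"]).fit()
        print(fit.summary())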

  16. An evaluation of computer-aided disproportionality analysis for post-marketing signal detection.

    PubMed

    Lehman, H P; Chen, J; Gould, A L; Kassekert, R; Beninger, P R; Carney, R; Goldberg, M; Goss, M A; Kidos, K; Sharrar, R G; Shields, K; Sweet, A; Wiholm, B E; Honig, P K

    2007-08-01

    To understand the value of computer-aided disproportionality analysis (DA) in relation to current pharmacovigilance signal detection methods, four products were retrospectively evaluated by applying an empirical Bayes method to Merck's post-marketing safety database. Findings were compared with the prior detection of labeled post-marketing adverse events. Disproportionality ratios (empirical Bayes geometric mean lower 95% bounds for the posterior distribution (EBGM05)) were generated for product-event pairs. Overall (1993-2004 data, EBGM05 ≥ 2, individual terms) results of signal detection using DA compared to standard methods were: sensitivity, 31.1%; specificity, 95.3%; and positive predictive value, 19.9%. Using groupings of synonymous labeled terms, sensitivity improved (40.9%). More of the adverse events detected by both methods were detected earlier using DA and grouped (versus individual) terms. With 1939-2004 data, diagnostic properties were similar to those from 1993 to 2004. DA methods using Merck's safety database demonstrate sufficient sensitivity and specificity to be considered for use as an adjunct to conventional signal detection methods.
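
    For orientation, the simplest disproportionality statistic, the proportional reporting ratio (PRR), can be computed from a 2x2 table of report counts; this is a simpler relative of the empirical Bayes EBGM used in the study (counts below are illustrative):

        # a = drug & event, b = drug & other events,
        # c = other drugs & event, d = other drugs & other events (illustrative counts).
        a, b, c, d = 40, 1960, 200, 97800

        prr = (a / (a + b)) / (c / (c + d))
        print(f"PRR = {prr:.2f}")  # > 2 is a conventional screening signal threshold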

  17. The potential contributions of geographic information science to the study of social determinants of health in Iran.

    PubMed

    Rabiei-Dastjerdi, Hamidreza; Matthews, Stephen A

    2018-01-01

    Recent interest in the social determinants of health (SDOH) and the effects of neighborhood contexts on individual health and well-being has grown exponentially. In this brief communication, we describe recent developments in both analytical perspectives and methods that have opened up new opportunities for researchers interested in exploring neighborhoods and health research within a SDOH framework. We focus specifically on recent advances in geographic information science, statistical methods, and spatial analytical tools. We close with a discussion of how these recent developments have the potential to enhance SDOH research in Iran.

  18. Accommodating subject and instrument variations in spectroscopic determinations

    DOEpatents

    Haas, Michael J [Albuquerque, NM; Rowe, Robert K [Corrales, NM; Thomas, Edward V [Albuquerque, NM

    2006-08-29

    A method and apparatus for measuring a biological attribute, such as the concentration of an analyte, particularly a blood analyte in tissue, such as glucose. The method utilizes spectrographic techniques in conjunction with an improved instrument-tailored or subject-tailored calibration model. In a calibration phase, calibration model data are modified to reduce or eliminate instrument-specific attributes, resulting in a calibration data set that models intra-instrument or intra-subject variation. In a prediction phase, the prediction process is tailored for each target instrument separately, using a minimal number of spectral measurements from each instrument or subject.

  19. Microbial production of the drugs violacein and deoxyviolacein: analytical development and strain comparison.

    PubMed

    Rodrigues, André L; Göcke, Yvonne; Bolten, Christoph; Brock, Nelson L; Dickschat, Jeroen S; Wittmann, Christoph

    2012-04-01

    Violacein and deoxyviolacein display a broad range of interesting biological properties, but their production is rarely distinguished due to the lack of suitable analytical methods. An HPLC method has been developed for the separation and quantification of violacein and deoxyviolacein and can determine the content of both molecules in microbial cultures. A comparison of different production microorganisms, including recombinant Escherichia coli and the natural producer Janthinobacterium lividum, revealed that the formation of violacein and deoxyviolacein is strain-specific and varies significantly during growth, although the ratio between the two compounds remains constant.

  20. Analytical recursive method to ascertain multisite entanglement in doped quantum spin ladders

    NASA Astrophysics Data System (ADS)

    Roy, Sudipto Singha; Dhar, Himadri Shekhar; Rakshit, Debraj; SenDe, Aditi; Sen, Ujjwal

    2017-08-01

    We formulate an analytical recursive method to generate the wave function of doped short-range resonating valence bond (RVB) states as a tool to efficiently estimate multisite entanglement as well as other physical quantities in doped quantum spin ladders. We prove that doped RVB ladder states are always genuine multipartite entangled. Importantly, our results show that within specific doping concentration and model parameter regimes, the doped RVB state essentially characterizes the trends of genuine multiparty entanglement in the exact ground states of the Hubbard model with large on-site interactions, in the limit that yields the t-J Hamiltonian.

  1. General Safety and Waste Management Related to SAM

    EPA Pesticide Factsheets

    The General Safety and Waste Management page offers section-specific safety and waste management details for chemicals, radiochemicals, pathogens, and biotoxins included in EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM).

  2. In search of integrated specificity: comment on Denson, Spanovic, and Miller (2009).

    PubMed

    Miller, Gregory E

    2009-11-01

    Psychologists have long been interested in the integrated specificity hypothesis, which maintains that stressors elicit fairly distinct behavioral, emotional, and biological responses that are molded by selective pressures to meet specific demands from the environment. This issue of Psychological Bulletin features a meta-analytic review of the evidence for this proposition by T. F. Denson, M. Spanovic, and N. Miller. Their review concluded that the meta-analytic findings support the "core concept behind the integrated specificity model" (p. 845) and reveal that "within the context of a stressful event, organisms produce an integrated and coordinated response at multiple levels (i.e., cognitive, emotional, physiological)" (p. 845). I argue that conclusions such as this are unwarranted, given the data. Aside from some effects for cortisol, little evidence of specificity was presented, and most of the significant findings reported would be expected by chance alone. I also contend that Denson et al. failed to consider some important sources of evidence bearing on the specificity hypothesis, particularly how appraisals and emotions couple with autonomic nervous system endpoints and functional indices of immune response. If selective pressures did give rise to an integrated stress response, such pathways almost certainly would have been involved. By omitting such outcomes from the meta-analysis, Denson et al. overlooked what are probably the most definitive tests of the specificity hypothesis. As a result, the field is back where it started: with a lot of affection for the concept of integrated specificity but little in the way of definitive evidence to refute or accept it.

  3. The current preference for the immuno-analytical ELISA method for quantitation of steroid hormones (endocrine disruptor compounds) in wastewater in South Africa.

    PubMed

    Manickum, Thavrin; John, Wilson

    2015-07-01

    The availability of national test centers to offer a routine service for analysis and quantitation of selected steroid hormones [natural estrogens (17-β-estradiol, E2; estrone, E1; estriol, E3), synthetic estrogen (17-α-ethinylestradiol, EE2), androgen (testosterone), and progestogen (progesterone)] in a wastewater matrix was investigated; corresponding internationally used chemical- and immuno-analytical test methods were reviewed. The enzyme-linked immunosorbent assay (ELISA, an immuno-analytical technique) was also assessed for its suitability as a routine test method to quantitate the levels of these hormones at a sewage/wastewater treatment plant (WTP) (Darvill, Pietermaritzburg, South Africa) over a 2-year period. The method performance and other relevant characteristics of the immuno-analytical ELISA method were compared with conventional chemical-analytical methodology, such as gas/liquid chromatography-mass spectrometry (GC/LC-MS) and GC/LC-tandem mass spectrometry (MS/MS), for quantitation of the steroid hormones in wastewater and environmental waters. The national immuno-analytical ELISA technique was found to be sensitive (LOQ 5 ng/L, LOD 0.2-5 ng/L), accurate (mean recovery 96%), precise (RSD 7-10%), and cost-effective for screening and quantitation of these steroid hormones in wastewater and environmental water matrices. A survey of the most current international literature indicates a fairly equal use of the LC-MS/MS, GC-MS/MS (chemical-analytical), and ELISA (immuno-analytical) test methods for screening and quantitation of the target steroid hormones in both water and wastewater matrices. Internationally, the observed sensitivity, based on LOQ (ng/L), for the steroid estrogens E1, E2, and EE2 is, in decreasing order: LC-MS/MS (0.08-9.54) > GC-MS (1) > ELISA (5) (chemical-analytical > immuno-analytical). At the national level, the routine, unoptimized chemical-analytical LC-MS/MS method was found to lack the required sensitivity for meeting environmental requirements for steroid hormone quantitation. Further optimization of the sensitivity of the chemical-analytical LC-tandem mass spectrometry methods, especially for wastewater screening, is required in South Africa. Risk assessment studies showed that it was not practical to propose standards or allowable limits for the steroid estrogens E1, E2, EE2, and E3; the use of predicted-no-effect concentrations of the steroid estrogens appears to be appropriate for their risk assessment in relation to aquatic organisms. For raw water sources, drinking water, and raw and treated wastewater, the use of bioassays with trigger values is a useful screening tool to decide whether further examination of specific endocrine activity is warranted, or whether concentrations of such activity are of low priority with respect to health concerns in the human population. The achievement of improved quantitation limits for immuno-analytical methods such as ELISA, and standardization of the method for measuring E2 equivalents (EEQs) used to express biological (e.g., estrogenic) activity, are areas for future EDC research.

  4. An analytically based numerical method for computing view factors in real urban environments

    NASA Astrophysics Data System (ADS)

    Lee, Doo-Il; Woo, Ju-Wan; Lee, Sang-Hyun

    2018-01-01

    A view factor is an important morphological parameter used in parameterizing the in-canyon radiative energy exchange process as well as in characterizing local climate over urban environments. For a realistic representation of the in-canyon radiative processes, a complete set of view factors at the horizontal and vertical surfaces of urban facets is required. Various analytical and numerical methods have been suggested to determine view factors for urban environments, but most provide only the sky-view factor at the ground level of a specific location or assume a simplified morphology of complex urban environments. In this study, a numerical method is presented that determines the sky-view factors (ψga and ψwa) and wall-view factors (ψgw and ψww) at horizontal and vertical surfaces of real urban morphologies; it is derived from an analytical formulation of the view factor between two blackbody surfaces of arbitrary geometry. The numerical method is validated against analytical sky-view factor estimates for ideal street canyon geometries, showing good accuracy with errors of less than 0.2%. Using a three-dimensional building database, the numerical method is also demonstrated to be applicable in determining the sky-view factors at the horizontal (roofs and roads) and vertical (walls) surfaces in real urban environments. The results suggest that the analytically based numerical method can be used for the radiative process parameterization of urban numerical models as well as for the characterization of local urban climate.
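
    The double-area integral underlying such methods can be sketched numerically. Assuming the standard definition F = (1/A1) ∫∫ cosθ1 cosθ2 / (π r²) dA2 dA1, the Monte Carlo sketch below estimates the view factor between two directly opposed parallel unit squares; the geometry and the sampling scheme are illustrative assumptions, not the paper's algorithm for real building databases.

    ```python
    import numpy as np

    def parallel_square_vf(h, n=400_000, seed=1):
        """Monte Carlo estimate of F_1->2 between two directly opposed,
        parallel unit squares a distance h apart, from the double-area
        integral F = (1/A1) * integral of cos(t1) cos(t2) / (pi r^2)."""
        rng = np.random.default_rng(seed)
        p1 = rng.random((n, 2))            # sample points on square 1 (z = 0)
        p2 = rng.random((n, 2))            # sample points on square 2 (z = h)
        r2 = ((p2 - p1) ** 2).sum(axis=1) + h * h
        cos1 = cos2 = h / np.sqrt(r2)      # both surface normals along z
        integrand = cos1 * cos2 / (np.pi * r2)
        return integrand.mean()            # times A2 (=1), divided by A1 (=1)

    print(parallel_square_vf(1.0))         # tabulated value is roughly 0.1998
    ```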

  5. Semi-automated solid phase extraction method for the mass spectrometric quantification of 12 specific metabolites of organophosphorus pesticides, synthetic pyrethroids, and select herbicides in human urine.

    PubMed

    Davis, Mark D; Wade, Erin L; Restrepo, Paula R; Roman-Esteva, William; Bravo, Roberto; Kuklenyik, Peter; Calafat, Antonia M

    2013-06-15

    Organophosphate and pyrethroid insecticides and phenoxyacetic acid herbicides represent important classes of pesticides applied in commercial and residential settings. Interest in assessing the extent of human exposure to these pesticides exists because of their widespread use and their potential adverse health effects. An analytical method for measuring 12 biomarkers of several of these pesticides in urine has been developed. The target analytes were extracted from one milliliter of urine by a semi-automated solid phase extraction technique, separated from each other and from other urinary biomolecules by reversed-phase high performance liquid chromatography, and detected using tandem mass spectrometry with isotope dilution quantitation. This method can measure all the target analytes in one injection, with repeatability and detection limits similar to those of previous methods that required more than one injection. Each step of the procedure was optimized to produce a robust, reproducible, accurate, precise, and efficient method. The selectivity and sensitivity required for trace-level analysis (e.g., limits of detection below 0.5 ng/mL) were achieved using a narrow diameter analytical column, higher than unit mass resolution for certain analytes, and stable isotope labeled internal standards. The method was applied to the analysis of 55 samples collected from adult anonymous donors with no known exposure to the target pesticides. This efficient and cost-effective method is adequate to handle the large number of samples required for national biomonitoring surveys.
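
    Isotope dilution quantitation of the kind described above maps the analyte-to-internal-standard peak-area ratio through a calibration curve. The sketch below shows that arithmetic with invented numbers; it does not reproduce the paper's actual calibration design.

    ```python
    import numpy as np

    # illustrative calibration: peak-area ratio (analyte / labeled internal
    # standard) versus spiked concentration; all values are invented
    cal_conc  = np.array([0.1, 0.5, 1.0, 5.0, 10.0])       # ng/mL
    cal_ratio = np.array([0.05, 0.26, 0.51, 2.48, 5.10])   # area ratios

    slope, intercept = np.polyfit(cal_conc, cal_ratio, 1)

    sample_ratio = 1.37                    # measured in an unknown urine extract
    conc = (sample_ratio - intercept) / slope
    print(f"estimated concentration: {conc:.2f} ng/mL")
    ```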

  6. Quantitative microbial faecal source tracking with sampling guided by hydrological catchment dynamics.

    PubMed

    Reischer, G H; Haider, J M; Sommer, R; Stadler, H; Keiblinger, K M; Hornek, R; Zerobin, W; Mach, R L; Farnleitner, A H

    2008-10-01

    The impairment of water quality by faecal pollution is a global public health concern. Microbial source tracking methods help to identify faecal sources but the few recent quantitative microbial source tracking applications disregarded catchment hydrology and pollution dynamics. This quantitative microbial source tracking study, conducted in a large karstic spring catchment potentially influenced by humans and ruminant animals, was based on a tiered sampling approach: a 31-month water quality monitoring (Monitoring) covering seasonal hydrological dynamics and an investigation of flood events (Events) as periods of the strongest pollution. The detection of a ruminant-specific and a human-specific faecal Bacteroidetes marker by quantitative real-time PCR was complemented by standard microbiological and on-line hydrological parameters. Both quantitative microbial source tracking markers were detected in spring water during Monitoring and Events, with preponderance of the ruminant-specific marker. Applying multiparametric analysis of all data allowed linking the ruminant-specific marker to general faecal pollution indicators, especially during Events. Up to 80% of the variation of faecal indicator levels during Events could be explained by ruminant-specific marker levels proving the dominance of ruminant faecal sources in the catchment. Furthermore, soil was ruled out as a source of quantitative microbial source tracking markers. This study demonstrates the applicability of quantitative microbial source tracking methods and highlights the prerequisite of considering hydrological catchment dynamics in source tracking study design.
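
    Quantitative real-time PCR marker levels of the kind used here are commonly derived from a standard curve of quantification cycle (Cq) against log copy number. The sketch below shows that conversion with invented dilution-series values; it is a generic illustration, not this study's calibration.

    ```python
    import numpy as np

    # standard curve: Cq versus log10(copies) from a dilution series (invented)
    log_copies = np.array([2.0, 3.0, 4.0, 5.0, 6.0])
    cq = np.array([33.1, 29.8, 26.4, 23.1, 19.7])

    slope, intercept = np.polyfit(log_copies, cq, 1)   # ~ -3.3 per decade
    efficiency = 10 ** (-1.0 / slope) - 1.0            # amplification efficiency

    def copies_from_cq(c):
        """Invert the standard curve to estimate marker copies in a sample."""
        return 10 ** ((c - intercept) / slope)

    print(f"efficiency = {efficiency:.1%}")
    print(f"sample at Cq 27.5 = {copies_from_cq(27.5):.0f} marker copies")
    ```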

  7. Thinking about thinking and feeling about feeling

    PubMed Central

    Moore, J.

    2000-01-01

    Traditional clinical psychology generally posits “mental” events that differ from “behavioral” events. Mental events are not publicly observable, take place in a different dimension from overt behavior, and are the topic of primary concern. For example, mental events are often taken to be causes of troublesome overt behavior. In addition, the mental events themselves may be regarded as troublesome, independent of their relation to any specific overt behavior. Therapy is usually aimed at fixing these troublesome mental events, under an assumption that improvement in the client's status will follow in due course. Behavior analysis has its own position on the relations among clinical matters, overt behavior, and such private events as thinking and feeling. In a behavior-analytic view, private events are behavioral phenomena rather than mental phenomena. They are not initiating causes of behavior; rather, they are themselves caused by antecedent conditions, but they may contribute to discriminative control over subsequent behavior, both verbal and nonverbal. Verbal processes are viewed as vitally important in understanding troublesome behavior. However, the circumstances that cause both the troublesome private events and the troublesome behavior in the first place still need to be addressed. Finally, clinical behavior analysis will need to market its insights into diagnosis and treatment very adroitly, because it rejects the mentalism upon which most traditional forms of therapy are predicated and the mentalism that most consumers expect to encounter. PMID:22478337

  8. Liquid Metering Centrifuge Sticks (LMCS): A Centrifugal Approach to Metering Known Sample Volumes for Colorimetric Solid Phase Extraction (C-SPE)

    NASA Technical Reports Server (NTRS)

    Gazda, Daniel B.; Schultz, John R.; Clarke, Mark S.

    2007-01-01

    Phase separation is one of the most significant obstacles encountered during the development of analytical methods for water quality monitoring in spacecraft environments. Removing air bubbles from water samples prior to analysis is a routine task on earth; however, in the absence of gravity, this routine task becomes extremely difficult. This paper details the development and initial ground testing of liquid metering centrifuge sticks (LMCS), devices designed to collect and meter a known volume of bubble-free water in microgravity. The LMCS uses centrifugal force to eliminate entrapped air and reproducibly meter liquid sample volumes for analysis with Colorimetric Solid Phase Extraction (C-SPE). C-SPE is a sorption-spectrophotometric platform that is being developed as a potential spacecraft water quality monitoring system. C-SPE utilizes solid phase extraction membranes impregnated with analyte-specific colorimetric reagents to concentrate and complex target analytes in spacecraft water samples. The mass of analyte extracted from the water sample is determined using diffuse reflectance (DR) data collected from the membrane surface and an analyte-specific calibration curve. The analyte concentration can then be calculated from the mass of extracted analyte and the volume of the sample analyzed. Previous flight experiments conducted in microgravity conditions aboard the NASA KC-135 aircraft demonstrated that the inability to collect and meter a known volume of water using a syringe was a limiting factor in the accuracy of C-SPE measurements. Herein, results obtained from ground based C-SPE experiments using ionic silver as a test analyte and either the LMCS or syringes for sample metering are compared to evaluate the performance of the LMCS. These results indicate very good agreement between the two sample metering methods and clearly illustrate the potential of utilizing centrifugal forces to achieve phase separation and metering of water samples in microgravity.

  9. Time-dependent inertia analysis of vehicle mechanisms

    NASA Astrophysics Data System (ADS)

    Salmon, James Lee

    Two methods for performing transient inertia analysis of vehicle hardware systems are developed in this dissertation. The analysis techniques can be used to predict the response of vehicle mechanism systems to the accelerations associated with vehicle impacts. General analytical methods for evaluating translational or rotational system dynamics are generated and evaluated for various system characteristics. The utility of the derived techniques is demonstrated by applying the generalized methods to two vehicle systems. Time-dependent accelerations measured during a vehicle-to-vehicle impact are used as input to perform a dynamic analysis of an automobile liftgate latch and outside door handle. Generalized Lagrange equations for a non-conservative system are used to formulate a second-order nonlinear differential equation defining the response of the components to the transient input. The differential equation is solved by employing the fourth-order Runge-Kutta method. The events are then analyzed using commercially available two-dimensional rigid body dynamic analysis software. The results of the two analytical techniques are compared with experimental data generated by high-speed film analysis of tests of the two components performed on a high-G acceleration sled at Ford Motor Company.
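
    The solution strategy described above, a second-order equation of motion integrated with the fourth-order Runge-Kutta method, can be sketched as follows. The latch model, the pulse shape and every parameter value below are invented for illustration; only the RK4 scheme itself follows the stated approach.

    ```python
    import numpy as np

    def rk4_step(f, t, y, dt):
        """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
        k1 = f(t, y)
        k2 = f(t + dt / 2, y + dt / 2 * k1)
        k3 = f(t + dt / 2, y + dt / 2 * k2)
        k4 = f(t + dt, y + dt * k3)
        return y + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

    # hypothetical latch treated as a damped pendulum driven by a crash pulse
    I, c, m, L = 1e-4, 5e-4, 0.05, 0.02       # inertia, damping, mass, arm length
    a = lambda t: 300.0 * np.exp(-((t - 0.05) / 0.01) ** 2)   # assumed pulse, m/s^2

    def f(t, y):
        theta, omega = y
        return np.array([omega, (m * L * a(t) * np.cos(theta) - c * omega) / I])

    y, dt = np.array([0.0, 0.0]), 1e-4
    for i in range(int(0.2 / dt)):
        y = rk4_step(f, i * dt, y, dt)
    print(f"rotation after 0.2 s: {np.degrees(y[0]):.1f} deg")
    ```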

  10. Non-sticky translocation of bio-molecules through Tween 20-coated solid-state nanopores in a wide pH range

    NASA Astrophysics Data System (ADS)

    Li, Xiaoqing; Hu, Rui; Li, Ji; Tong, Xin; Diao, J. J.; Yu, Dapeng; Zhao, Qing

    2016-10-01

    Nanopore-based sensing technology is considered high-throughput and low-cost for single molecule detection, but solid-state nanopores have suffered from pore clogging issues. A simple Tween 20 coating method is applied to ensure long-term (several hours) non-sticky translocation of various types of bio-molecules through SiN nanopores in a wide pH range (4.0-13.0). We also emphasize the importance of choosing appropriate concentration of Tween 20 coating buffer for desired effect. By coating nanopores with a Tween 20 layer, we are able to differentiate between single-stranded DNA and double-stranded DNA, to identify drift-dominated domain for single-stranded DNA, to estimate BSA volume and to observe the shape of individual nucleosome translocation event without non-specific adsorption. The wide pH endurance from 4.0 to 13.0 and the broad types of detection analytes including nucleic acids, proteins, and biological complexes highlight the great application potential of Tween 20-coated solid-state nanopores.

  11. Modeling gene flow distribution within conventional fields and development of a simplified sampling method to quantify adventitious GM contents in maize

    PubMed Central

    Melé, Enric; Nadal, Anna; Messeguer, Joaquima; Melé-Messeguer, Marina; Palaudelmàs, Montserrat; Peñas, Gisela; Piferrer, Xavier; Capellades, Gemma; Serra, Joan; Pla, Maria

    2015-01-01

    Genetically modified (GM) crops have been commercially grown for two decades. GM maize is one of 3 species with the highest acreage and specific events. Many countries established a mandatory labeling of products containing GM material, with thresholds for adventitious presence, to support consumers’ freedom of choice. In consequence, coexistence systems need to be introduced to facilitate commercial culture of GM and non-GM crops in the same agricultural area. On modeling adventitious GM cross-pollination distribution within maize fields, we deduced a simple equation to estimate overall GM contents (%GM) of conventional fields, irrespective of its shape and size, and with no previous information on possible GM pollen donor fields. A sampling strategy was designed and experimentally validated in 19 agricultural fields. With 9 samples, %GM quantification requires just one analytical GM determination while identification of the pollen source needs 9 additional analyses. A decision support tool is provided. PMID:26596213

  12. Analytical method for the accurate determination of trichothecenes in grains using LC-MS/MS: a comparison between MRM transition and MS3 quantitation.

    PubMed

    Lim, Chee Wei; Tai, Siew Hoon; Lee, Lin Min; Chan, Sheot Harn

    2012-07-01

    The current food crisis demands unambiguous determination of mycotoxin contamination in staple foods to achieve safer food for consumption. This paper describes the first accurate LC-MS/MS method developed to analyze trichothecenes in grains by applying multiple reaction monitoring (MRM) transition and MS3 quantitation strategies in tandem. The trichothecenes are nivalenol, deoxynivalenol, deoxynivalenol-3-glucoside, fusarenon X, 3-acetyl-deoxynivalenol, 15-acetyl-deoxynivalenol, diacetoxyscirpenol, and HT-2 and T-2 toxins. Acetic acid and ammonium acetate were used to convert the analytes into their respective acetate and ammonium adducts under negative and positive MS polarity conditions, respectively. The mycotoxins were separated by reversed-phase LC in a 13.5-min run, ionized using electrospray ionization, and detected by tandem mass spectrometry. Analyte-specific mass-to-charge (m/z) ratios were used to perform quantitation under MRM transition and MS3 (linear ion trap) modes. Three experiments were performed for each quantitation mode and matrix in batches over 6 days for recovery studies. The matrix effect was investigated at concentration levels of 20, 40, 80, 120, 160, and 200 μg/kg (n = 3) in 5 g corn flour and rice flour. Extraction with acetonitrile provided a good overall recovery range of 90-108% (n = 3) at three spiking concentrations of 40, 80, and 120 μg/kg. A quantitation limit of 2-6 μg/kg was achieved by applying the MRM transition quantitation strategy; under MS3 mode, a quantitation limit of 4-10 μg/kg was achieved. Relative standard deviations of 2-10% and 2-11% were reported for MRM transition and MS3 quantitation, respectively. The successful utilization of MS3 enabled accurate analyte fragmentation pattern matching and quantitation, supporting the development of analytical methods in fields that demand both analyte specificity and fragmentation fingerprint-matching capabilities that are unavailable under MRM transition.

  13. Safety and Procedural Success of Left Atrial Appendage Exclusion With the Lariat Device: A Systematic Review of Published Reports and Analytic Review of the FDA MAUDE Database.

    PubMed

    Chatterjee, Saurav; Herrmann, Howard C; Wilensky, Robert L; Hirshfeld, John; McCormick, Daniel; Frankel, David S; Yeh, Robert W; Armstrong, Ehrin J; Kumbhani, Dharam J; Giri, Jay

    2015-07-01

    The Lariat device has received US Food and Drug Administration (FDA) 510(k) clearance for soft-tissue approximation and is being widely used off-label for left atrial appendage (LAA) exclusion. A comprehensive analysis of safety and effectiveness has not been reported. To assess the safety and procedural success of the Lariat device, defined as successful closure of the LAA during the index procedure, we performed a systematic review of the published literature together with a formal analytic review of the FDA MAUDE (Manufacturer and User Facility Device Experience) database, compiling adverse event reports from real-world practice with the Lariat. For the systematic review, PubMed, EMBASE, CINAHL, and the Cochrane Library were searched from January 2007 through August 2014 to identify all studies reporting use of the Lariat device in 3 or more patients. The FDA MAUDE database was queried for adverse event reports related to Lariat use. Data were abstracted in duplicate by 2 physician reviewers. Events from the published literature were pooled using generic inverse variance weighting with a random effects model. Cumulative and individual adverse events were also reported using the FDA MAUDE data set. The outcomes were procedural adverse events and procedural success. In the systematic review, 5 reports of Lariat device use in 309 participants were identified. Specific complications, weighted by the inverse of the variance of the individual studies, were urgent need for cardiac surgery (2.3%; 7 of 309 procedures) and death (0.3%; 1 of 309 procedures). Procedural success was 90.3% (279 of 309 procedures). In the FDA MAUDE database, there were 35 unique reports of adverse events with use of the Lariat device. Among these, we identified 5 adverse event reports that noted pericardial effusion and death, and an additional 23 that reported urgent cardiac surgery without mention of death. This review of published reports and case reports identified risks of adverse events with off-label use of the Lariat device for LAA exclusion. Formal, controlled investigations into the safety and efficacy of the device for this indication are warranted.
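
    Generic inverse variance weighting with a random-effects model, as used for the pooling above, can be sketched as follows. The DerSimonian-Laird estimator and the continuity correction are common choices assumed here (the paper does not state its exact estimator), and the study counts are invented.

    ```python
    import numpy as np

    def pool_rate(events, totals):
        """Inverse-variance pooling of per-study complication rates with a
        DerSimonian-Laird random-effects model (one common choice)."""
        events, totals = np.asarray(events, float), np.asarray(totals, float)
        p = (events + 0.5) / (totals + 1.0)        # continuity-corrected proportions
        v = p * (1.0 - p) / totals                 # within-study variances
        w = 1.0 / v
        fixed = (w * p).sum() / w.sum()
        q = (w * (p - fixed) ** 2).sum()           # Cochran's Q
        c = w.sum() - (w ** 2).sum() / w.sum()
        tau2 = max(0.0, (q - (len(p) - 1)) / c)    # between-study variance
        w_re = 1.0 / (v + tau2)
        est = (w_re * p).sum() / w_re.sum()
        return est, np.sqrt(1.0 / w_re.sum())

    # hypothetical 5-study example in the spirit of the review
    est, se = pool_rate([2, 1, 0, 3, 1], [89, 71, 25, 92, 32])
    print(f"pooled rate {est:.3%} (SE {se:.3%})")
    ```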

  14. Patterns and Sequences: Interactive Exploration of Clickstreams to Understand Common Visitor Paths.

    PubMed

    Liu, Zhicheng; Wang, Yang; Dontcheva, Mira; Hoffman, Matthew; Walker, Seth; Wilson, Alan

    2017-01-01

    Modern web clickstream data consists of long, high-dimensional sequences of multivariate events, making it difficult to analyze. Following the overarching principle that the visual interface should provide information about the dataset at multiple levels of granularity and allow users to easily navigate across these levels, we identify four levels of granularity in clickstream analysis: patterns, segments, sequences and events. We present an analytic pipeline consisting of three stages: pattern mining, pattern pruning and coordinated exploration between patterns and sequences. Based on this approach, we discuss properties of maximal sequential patterns, propose methods to reduce the number of patterns and describe design considerations for visualizing the extracted sequential patterns and the corresponding raw sequences. We demonstrate the viability of our approach through an analysis scenario and discuss the strengths and limitations of the methods based on user feedback.
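
    The pattern-mining stage of such a pipeline can be illustrated with a toy miner that counts subsequence support over sessions and keeps only maximal frequent patterns (those with no frequent super-pattern). The enumeration strategy and the example sessions below are simplifying assumptions, not the paper's algorithm.

    ```python
    from itertools import combinations

    def is_subsequence(pattern, sequence):
        """True if pattern's events occur in sequence, in order."""
        it = iter(sequence)
        return all(event in it for event in pattern)

    def support(pattern, sessions):
        return sum(is_subsequence(pattern, s) for s in sessions) / len(sessions)

    def maximal_patterns(sessions, min_support=0.5, max_len=3):
        # candidates: ordered sub-sequences drawn from the observed sessions
        candidates = {p for s in sessions for k in range(1, max_len + 1)
                      for p in combinations(s, k)}
        freq = {p: sp for p in candidates
                if (sp := support(p, sessions)) >= min_support}
        # maximal = no frequent proper super-pattern contains it
        maximal = [p for p in freq
                   if not any(p != q and is_subsequence(p, q) for q in freq)]
        return sorted(maximal, key=freq.get, reverse=True)

    sessions = [("home", "search", "product", "cart"),
                ("home", "product", "cart"),
                ("search", "product", "faq")]
    print(maximal_patterns(sessions))
    ```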

  15. A review of microdialysis coupled to microchip electrophoresis for monitoring biological events

    PubMed Central

    Saylor, Rachel A.; Lunte, Susan M.

    2015-01-01

    Microdialysis is a powerful sampling technique that enables monitoring of dynamic processes in vitro and in vivo. The combination of microdialysis with chromatographic or electrophoretic separations, along with selective detection methods, yields a “separation-based sensor” capable of monitoring multiple analytes in near real time. Analysis of microdialysis samples requires techniques that are fast (<1 min), have low volume requirements (nL–pL), and, ideally, can be employed on-line. Microchip electrophoresis fulfills these requirements and also permits the integration of sample preparation and manipulation with detection strategies directly on-chip. Microdialysis coupled to microchip electrophoresis has been employed for monitoring biological events in vivo and in vitro. This review discusses technical considerations for coupling microdialysis sampling and microchip electrophoresis, including various interface designs, and current applications in the field. PMID:25637011

  16. Fourier Transform Infrared Absorption Spectroscopy for Quantitative Analysis of Gas Mixtures at Low Temperatures for Homeland Security Applications.

    PubMed

    Meier, D C; Benkstein, K D; Hurst, W S; Chu, P M

    2017-05-01

    Performance standard specifications for point chemical vapor detectors are established in ASTM E 2885-13 and ASTM E 2933-13. The performance evaluation of the detectors requires the accurate delivery of known concentrations of the chemical target to the system under test. Referee methods enable the analyte test concentration and associated uncertainties in the analyte test concentration to be validated by independent analysis, which is especially important for reactive analytes. This work extends the capability of a previously demonstrated method for using Fourier transform infrared (FT-IR) absorption spectroscopy for quantitatively evaluating the composition of vapor streams containing hazardous materials at Acute Exposure Guideline Levels (AEGL) to include test conditions colder than laboratory ambient temperatures. The described method covers the use of primary reference spectra to establish analyte concentrations, the generation of secondary reference spectra suitable for measuring analyte concentrations under specified testing environments, and the use of additional reference spectra and spectral profile strategies to mitigate the uncertainties due to impurities and water condensation within the low-temperature (7 °C, -5 °C) test cell. Important benefits of this approach include verification of the test analyte concentration with characterized uncertainties by in situ measurements co-located with the detector under test, near-real-time feedback, and broad applicability to toxic industrial chemicals.
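
    Quantifying a vapor stream from reference spectra, as described above, is at heart a linear inversion: under the Beer-Lambert law, the mixture absorbance is a concentration-weighted sum of reference spectra. The sketch below solves that inversion by classical least squares on synthetic Gaussian bands; the band positions, noise level and solver choice are assumptions of this example.

    ```python
    import numpy as np

    # synthetic "reference spectra" for three components (Gaussian bands);
    # the wavenumber grid and band positions are invented for illustration
    wn = np.linspace(800.0, 1200.0, 400)
    band = lambda centre, width: np.exp(-(((wn - centre) / width) ** 2))
    refs = np.column_stack([band(950, 15), band(1050, 20), band(1110, 10)])

    true_conc = np.array([2.0, 0.5, 1.2])
    noise = np.random.default_rng(2).normal(0.0, 0.01, wn.size)
    measured = refs @ true_conc + noise     # Beer-Lambert: absorbances add

    # classical least squares: recover concentrations from the mixture spectrum
    conc, *_ = np.linalg.lstsq(refs, measured, rcond=None)
    print(np.round(conc, 3))                # approximately [2.0, 0.5, 1.2]
    ```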

  18. Quality specification and status of internal quality control of cardiac biomarkers in China from 2011 to 2016.

    PubMed

    Li, Tingting; Wang, Wei; Zhao, Haijian; He, Falin; Zhong, Kun; Yuan, Shuai; Wang, Zhiguo

    2017-09-07

    This study aimed to investigate the status of internal quality control (IQC) for cardiac biomarkers from 2011 to 2016, in order to obtain an overall picture of the precision of measurements in China and to set appropriate precision specifications. Internal quality control data for cardiac biomarkers, including creatine kinase MB (CK-MB) (μg/L), CK-MB (U/L), myoglobin (Mb), cardiac troponin I (cTnI), cardiac troponin T (cTnT), and homocysteine (HCY), were collected by a web-based external quality assessment (EQA) system. The percentages of laboratories meeting five precision quality specifications for current coefficients of variation (CVs) were calculated, and appropriate precision specifications were then chosen for these six analytes. Finally, the CVs and IQC practice were further analyzed with different grouping methods. The current CVs remained nearly constant over the 6 years. cTnT had the highest pass rates every year against all five specifications, whereas HCY had the lowest. Overall, most analytes had a satisfactory performance (pass rates >80%), except for HCY, if one-third TEa or the minimum specification was employed. When the optimal specification was applied, the performance of most analytes except cTnT was unsatisfactory (pass rates <60%). The appropriate precision specifications for the six analytes (CK-MB (μg/L), CK-MB (U/L), Mb, cTnI, cTnT, and HCY) were set as current CVs of less than 9.20%, 9.90%, 7.50%, 10.54%, 7.63%, and 6.67%, respectively. The IQC practice data indicated wide variation and substantial progress. The precision performance of cTnT was already satisfactory, while that of the other five analytes, especially HCY, was still unsatisfactory; thus, ongoing investigation and continuous improvement of IQC are still needed.

  19. Force and Conductance Spectroscopy of Single Molecule Junctions

    NASA Astrophysics Data System (ADS)

    Frei, Michael

    Investigation of mechanical properties of single molecule junctions is crucial to develop an understanding and enable control of single molecular junctions. This work presents an experimental and analytical approach that enables the statistical evaluation of force and simultaneous conductance data of metallic atomic point contacts and molecular junctions. A conductive atomic force microscope based break junction technique is developed to form single molecular junctions and collect conductance and force data simultaneously. Improvements of the optical components have been achieved through the use of a super-luminescent diode, enabling tremendous increases in force resolution. An experimental procedure to collect data for various molecular junctions has been developed and includes deposition, calibration, and analysis methods. For the statistical analysis of force, novel approaches based on two dimensional histograms and a direct force identification method are presented. The two dimensional method allows for an unbiased evaluation of force events that are identified using corresponding conductance signatures. This is not always possible however, and in these situations, the force based identification of junction rearrangement events is an attractive alternative method. This combined experimental and analytical approach is then applied to three studies: First, the impact of molecular backbones to the mechanical behavior of single molecule junctions is investigated and it is found that junctions formed with identical linkers but different backbone structure result in junctions with varying breaking forces. All molecules used show a clear molecular signature and force data can be evaluated using the 2D method. Second, the effects of the linker group used to attach molecules to gold electrodes are investigated. A study of four alkane molecules with different linkers finds a drastic difference in the evolution of donor-acceptor and covalently bonded molecules respectively. In fact, the covalent bond is found to significantly distort the metal electrode rearrangement such that junction rearrangement events can no longer be identified with a clean and well defined conductance signature. For this case, the force based identification process is used. Third, results for break junction measurements with different metals are presented. It is found that silver and palladium junctions rupture with forces different from those of gold contacts. In the case of silver experiments in ambient conditions, we can also identify oxygen impurities in the silver contact formation process, leading to force and conductance measurements of silver-oxygen structures. For the future, this work provides an experimental and analytical foundation that will enable insights into single molecule systems not previously accessible.

  20. Temperature dependence of single-event burnout in n-channel power MOSFET's

    NASA Astrophysics Data System (ADS)

    Johnson, G. H.; Schrimpf, R. D.; Galloway, K. F.; Koga, R.

    1994-03-01

    The temperature dependence of single-event burnout (SEB) in n-channel power metal-oxide-semiconductor field effect transistors (MOSFET's) is investigated experimentally and analytically. Experimental data are presented which indicate that the SEB susceptibility of the power MOSFET decreases with increasing temperature. A previously reported analytical model that describes the SEB mechanism is updated to include temperature variations. This model is shown to agree with the experimental trends.

  1. Analytical chemistry in water quality monitoring during manned space missions

    NASA Astrophysics Data System (ADS)

    Artemyeva, Anastasia A.

    2016-09-01

    Water quality monitoring during human spaceflights is essential. However, most traditional methods require sample collection with subsequent ground analysis because of limitations in volume, power, safety, and gravity. As space missions become longer-lasting, methods suitable for in-flight monitoring are needed. Since 2009, water quality has been monitored in-flight with colorimetric methods allowing for detection of iodine and ionic silver. Organic compounds in water have been monitored with a second-generation total organic carbon analyzer, which has provided information on the amount of carbon in water at both the U.S. and Russian segments of the International Space Station since 2008. The disadvantage of this approach is the lack of compound-specific information. Recently developed methods and tools may allow more detailed information on water quality to be obtained in-flight. For example, microanalyzers based on potentiometric measurements have been designed for online detection of chloride, potassium, and nitrate ions and ammonia. The recent application of the highly developed air quality monitoring system to water analysis was a logical step, because most of the target analytes are the same in air and water. An electro-thermal vaporizer was designed, manufactured, and coupled with the air quality control system. This development allows the analytes to be liberated from the aqueous matrix for further compound-specific analysis in the gas phase.

  2. Validation of the analytical method for the simultaneous determination of selected polybrominated diphenyl ethers, polychlorinated biphenyls and organochlorine pesticides in human blood serum by gas chromatography with microelectron capture detector.

    PubMed

    Matuszak, Małgorzata; Minorczyk, Maria; Góralczyk, Katarzyna; Hernik, Agnieszka; Struciński, Paweł; Liszewska, Monika; Czaja, Katarzyna; Korcz, Wojciech; Łyczewska, Monika; Ludwicki, Jan K

    2016-01-01

    Polybrominated diphenyl ethers (PBDEs), like other persistent organic pollutants such as polychlorinated biphenyls (PCBs) and organochlorine pesticides (OCPs), pose a significant hazard to human health, mainly due to interference with the endocrine system and carcinogenic effects. Humans are exposed to these substances mainly through food of animal origin. These pollutants are detected globally in human matrices, which calls for a reliable and simple analytical method to enable further studies assessing the exposure of specific human populations to these compounds. The purpose of this study was to modify and validate an analytical procedure for the simultaneous determination of selected PBDEs, PCBs, and OCPs in human blood serum samples. The analytical measurement was performed by GC-µECD following preparation of serum samples (denaturation, multiple extraction, lipid removal). The identity of the compounds was confirmed by GC-MS. The method was characterised by appropriate linearity and good repeatability (CV below 20%). Recoveries ranged from 52.9 to 125.0%, depending on the compound and level of fortification. The limit of quantification was set at 0.03 ng/mL of serum. The modified analytical method proved suitable for the simultaneous determination of selected PBDEs, PCBs, and OCPs in human blood serum by GC-µECD with good precision.

  3. Electrochemical genoassays on gold-coated magnetic nanoparticles to quantify genetically modified organisms (GMOs) in food and feed as GMO percentage.

    PubMed

    Plácido, Alexandra; Pereira, Clara; Guedes, Alexandra; Barroso, M Fátima; Miranda-Castro, Rebeca; de-Los-Santos-Álvarez, Noemí; Delerue-Matos, Cristina

    2018-07-01

    The integration of nanomaterials in the field of (bio)sensors has enabled strategies with improved analytical performance. In this work, ultrasmall core-shell Fe3O4@Au magnetic nanoparticles (MNPs) were used as the platform for the immobilization of event-specific Roundup Ready (RR) soybean and taxon-specific DNA sequences. Firstly, monodisperse Fe3O4 MNPs were synthesized by thermal decomposition and subsequently coated with a gold shell through reduction of an Au(III) precursor on the surface of the MNPs in the presence of an organic capping agent. This nanosupport exhibited high colloidal stability, an average particle size of 10.2 ± 1.3 nm, and spherical shape. The covalent immobilization of the ssDNA probe onto the Au shell of the Fe3O4@Au MNPs was achieved through a self-assembled monolayer (SAM) created from mixtures of alkane thiols (6-mercapto-1-hexanol and mercaptohexanoic acid). The influence of the thiol ratio on the electrochemical performance of the resulting genoassays was studied, and, remarkably, the best analytical performance was achieved for a pure mercaptohexanoic acid SAM. Two quantification assays were designed, one targeting an RR sequence and a second targeting a reference soybean gene, both with a sandwich hybridization format, signaling probes labelled with fluorescein isothiocyanate (FITC), enzymatic amplification, and chronoamperometric detection at screen-printed carbon electrodes (SPCE). The magnetogenoassays exhibited linear ranges from 0.1 to 10.0 nM and from 0.1 to 5.0 nM, with similar detection limits of 0.02 nM and 0.05 nM for the event-specific (RR) and the taxon-specific (lectin) targets, respectively. The usefulness of the approach was demonstrated by its application to the detection of genetically modified organisms (GMOs) in feed and food.
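
    Detection limits like those reported above are often derived from the calibration line, e.g. via the common 3.3·σ/slope convention (the paper may have used a different criterion). A minimal sketch with invented calibration points:

    ```python
    import numpy as np

    # illustrative chronoamperometric calibration (all numbers invented)
    conc    = np.array([0.1, 0.5, 1.0, 2.5, 5.0, 10.0])        # target, nM
    current = np.array([4.1, 19.8, 40.5, 99.0, 201.0, 398.0])  # signal, nA

    slope, intercept = np.polyfit(conc, current, 1)
    resid = current - (slope * conc + intercept)
    s_y = resid.std(ddof=2)                 # residual standard deviation

    lod = 3.3 * s_y / slope                 # common 3.3*sigma/slope convention
    loq = 10.0 * s_y / slope
    print(f"LOD ~ {lod:.2f} nM, LOQ ~ {loq:.2f} nM")
    ```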

  4. Simultaneous Detection of Genetically Modified Organisms in a Mixture by Multiplex PCR-Chip Capillary Electrophoresis.

    PubMed

    Patwardhan, Supriya; Dasari, Srikanth; Bhagavatula, Krishna; Mueller, Steffen; Deepak, Saligrama Adavigowda; Ghosh, Sudip; Basak, Sanjay

    2015-01-01

    An efficient PCR-based method to trace genetically modified food and feed products is in demand due to regulatory requirements and contamination issues in India. However, post-PCR detection with conventional methods has limited sensitivity in amplicon separation, which is crucial in multiplexing. This study aimed to develop a sensitive post-PCR detection method using PCR-chip capillary electrophoresis (PCR-CCE) to detect and identify specific genetically modified organisms in genomic DNA mixtures by targeting event-specific nucleotide sequences. Using the PCR-CCE approach, novel multiplex methods were developed to detect MON531 cotton, EH 92-527-1 potato, Bt176 maize, GT73 canola, or GA21 maize simultaneously when their genomic DNAs in mixtures were amplified using their primer mixture. The repeatability RSD (RSDr) of the peak migration time was 0.06% and 3.88% for MON531 and Bt176, respectively. The reproducibility RSD (RSDR) of the Cry1Ac peak ranged from 0.12 to 0.40% in the multiplex methods. The method was sensitive enough to resolve amplicons differing in size by as little as 4 bp. The PCR-CCE method is suitable for detecting multiple genetically modified events in a composite DNA sample by tagging their event-specific sequences.

  5. Simultaneous determination of sixteen metabolites related to neural tube defects in maternal serum by liquid chromatography coupling with electrospray tandem mass spectrometry.

    PubMed

    Liang, Xiao-Ping; Liang, Qiong-Lin; Xia, Jian-Fei; Wang, Yong; Hu, Ping; Wang, Yi-Ming; Zheng, Xiao-Ying; Zhang, Ting; Luo, Guo-An

    2009-06-15

    Disturbances in maternal folate, homocysteine, and glutathione metabolism have been reported to be associated with neural tube defects (NTDs). However, the role played by specific components in the metabolic pathways leading to NTDs remains unclear. An analytical method was therefore developed for the simultaneous measurement of sixteen compounds involved in these three metabolic pathways by high performance liquid chromatography-tandem mass spectrometry. The use of a hydrophilic chromatography column improved the separation of polar analytes, and the multiple-reaction monitoring (MRM) detection mode enhanced specificity and sensitivity, enabling the simultaneous determination of three classes of metabolites that vary widely in polarity and concentration. The influence of parameters such as temperature, pH, and flow rate on the performance of the analytes was studied to obtain optimal conditions. The method was validated for linearity, accuracy, and precision, and was used to analyze serum samples from NTD-affected pregnancies and normal women. The results showed that the present method is sensitive and reliable for the simultaneous determination of as many as sixteen metabolites of interest, which may provide a new means to study the underlying mechanism of NTDs as well as to discover new potential biomarkers.

  6. Importance of implementing an analytical quality control system in a core laboratory.

    PubMed

    Marques-Garcia, F; Garcia-Codesal, M F; Caro-Narros, M R; Contreras-SanFeliciano, T

    2015-01-01

    The aim of the clinical laboratory is to provide useful information for the screening, diagnosis, and monitoring of disease. The laboratory should ensure the quality of the extra-analytical and analytical processes, based on set criteria. To do this, it develops and implements a system of internal quality control designed to detect errors, and compares its data with those of other laboratories through external quality control. In this way it has a tool to detect whether the set objectives are being fulfilled and, in case of errors, to take corrective actions and ensure the reliability of the results. This article describes the design and implementation of an internal quality control protocol, as well as its periodic assessment (at 6-month intervals) to determine compliance with pre-determined specifications (Stockholm Consensus). A total of 40 biochemical and 15 immunochemical methods were evaluated using three different control materials. Next, a standard operating procedure was developed for the internal quality control system that included calculating the error of the analytical process, setting quality specifications, and verifying compliance. The quality control data were then statistically summarised as means, standard deviations, and coefficients of variation, as well as systematic, random, and total errors. The quality specifications were then fixed and the operational rules to apply in the analytical process were calculated. Finally, our data were compared with those of other laboratories through an external quality assurance program. The development of an analytical quality control system is a highly structured process, which should be designed to detect errors that compromise the stability of the analytical process. The laboratory should review its quality indicators (systematic, random, and total error) at regular intervals, in order to ensure that pre-determined specifications are being met and, if not, apply the appropriate corrective actions.
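
    The error calculations mentioned above can be sketched directly. A common convention, assumed here, takes the random error as the CV, the systematic error as the percent bias against the target, and the total error as |bias| + 1.65·CV; all measurement values below are invented.

    ```python
    import numpy as np

    def iqc_metrics(measurements, target):
        """Random error (CV, %), systematic error (bias, %) and total error (%)
        of a quality-control material against its target value."""
        mean = np.mean(measurements)
        sd = np.std(measurements, ddof=1)
        cv = 100.0 * sd / mean
        bias = 100.0 * (mean - target) / target
        total_error = abs(bias) + 1.65 * cv     # common convention
        return cv, bias, total_error

    qc = np.array([101, 99, 103, 98, 102, 100, 104, 97], float)
    cv, bias, te = iqc_metrics(qc, target=100.0)
    print(f"CV={cv:.1f}%  bias={bias:.1f}%  TE={te:.1f}%")
    ```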

  7. Introduction to Food Analysis

    NASA Astrophysics Data System (ADS)

    Nielsen, S. Suzanne

    Investigations in food science and technology, whether by the food industry, governmental agencies, or universities, often require determination of food composition and characteristics. Trends and demands of consumers, the food industry, and national and international regulations challenge food scientists as they work to monitor food composition and to ensure the quality and safety of the food supply. All food products require analysis as part of a quality management program throughout the development process (including raw ingredients), through production, and after a product is in the market. In addition, analysis is done of problem samples and competitor products. The characteristics of foods (i.e., chemical composition, physical properties, sensory properties) are used to answer specific questions for regulatory purposes and typical quality control. The nature of the sample and the specific reason for the analysis commonly dictate the choice of analytical methods. Speed, precision, accuracy, and ruggedness often are key factors in this choice. Validation of the method for the specific food matrix being analyzed is necessary to ensure usefulness of the method. Making an appropriate choice of the analytical technique for a specific application requires a good knowledge of the various techniques (Fig. 1.1). For example, your choice of method to determine the salt content of potato chips would be different if it is for nutrition labeling than for quality control. The success of any analytical method relies on the proper selection and preparation of the food sample, carefully performing the analysis, and doing the appropriate calculations and interpretation of the data. Methods of analysis developed and endorsed by several nonprofit scientific organizations allow for standardized comparisons of results between different laboratories and for evaluation of less standard procedures. Such official methods are critical in the analysis of foods, to ensure that they meet the legal requirements established by governmental agencies. Government regulations and international standards most relevant to the analysis of foods are mentioned here but covered in more detail in Chap. 2, and nutrition labeling regulations in the USA are covered in Chap. 3. Internet addresses for many of the organizations and government agencies discussed are given at the end of this chapter.

  8. New insights into the 2012 Emilia (Italy) seismic sequence through advanced numerical modeling of ground deformation InSAR measurements

    NASA Astrophysics Data System (ADS)

    Tizzani, P.; Castaldo, R.; Solaro, G.; Pepe, S.; Bonano, M.; Casu, F.; Manunta, M.; Manzo, M.; Pepe, A.; Samsonov, S.; Lanari, R.; Sansosti, E.

    2013-05-01

    We provide new insights into the two main seismic events that occurred in 2012 in the Emilia region, Italy. We extend the results from previous studies based on analytical inversion modeling of GPS and RADARSAT-1 InSAR measurements by exploiting RADARSAT-2 data. Moreover, we benefit from the available large amount of geological and geophysical information through finite element method (FEM) modeling implemented in a structural-mechanical context to investigate the impact of known buried structures on the modulation of the ground deformation field. We find that the displacement pattern associated with the 20 May event is consistent with the activation of a single fault segment of the inner Ferrara thrust, in good agreement with the analytical solution. In contrast, the interpretation of the 29 May episode requires the activation of three different fault segments and a block roto-translation of the Mirandola anticline. The proposed FEM-based methodology is applicable to other seismic areas where the complexity of buried structures is known and plays a fundamental role in the modulation of the associated surface deformation pattern.

  9. Toward improved understanding and control in analytical atomic spectrometry

    NASA Astrophysics Data System (ADS)

    Hieftje, Gary M.

    1989-01-01

    As with most papers which attempt to predict the future, this treatment will begin with a coverage of past events. It will be shown that progress in the field of analytical atomic spectrometry has occurred through a series of steps which involve the addition of new techniques and the occasional displacement of established ones. Because it is difficult or impossible to presage true breakthroughs, this manuscript will focus on how such existing methods can be modified or improved to greatest advantage. The thesis will be that rational improvement can be accomplished most effectively by understanding fundamentally the nature of an instrumental system, a measurement process, and a spectrometric technique. In turn, this enhanced understanding can lead to closer control, from which can spring improved performance. Areas where understanding is now lacking and where control is most greatly needed will be identified and a possible scheme for implementing control procedures will be outlined. As we draw toward the new millennium, these novel procedures seem particularly appealing; new high-speed computers, the availability of expert systems, and our enhanced understanding of atomic spectrometric events combine to make future prospects extremely bright.

  10. Dreams In Jungian Psychology: The use of Dreams as an Instrument For Research, Diagnosis and Treatment of Social Phobia

    PubMed Central

    Khodarahimi, Siamak

    2009-01-01

    Background: The significance of dreams has been explained in psychoanalysis, depth psychology and gestalt therapy. There are many guidelines in analytic psychology for dream interpretation and integration in clinical practice. The present study, based on the Jungian analytic model, incorporated dreams as an instrument for assessment of aetiology, the psychotherapy process and the outcome of treatment for social phobia within a clinical case study. Method: This case study describes the use of dream analysis in treating a female youth with social phobia. Results: The present findings supported the three stage paradigm efficiency in the Jungian model for dream working within a clinical setting, i.e. written details, reassembly with amplification and assimilation. It was indicated that childhood and infantile traumatic events, psychosexual development malfunctions, and inefficient coping skills for solving current life events were expressed in the patient’s dreams. Conclusion: Dreams can reflect a patient’s aetiology, needs, illness prognosis and psychotherapy outcome. Dreams are an instrument for the diagnosis, research and treatment of mental disturbances in a clinical setting. PMID:22135511

  11. On the importance of accounting for competing risks in pediatric cancer trials designed to delay or avoid radiotherapy: I. Basic concepts and first analyses.

    PubMed

    Tai, Bee-Choo; Grundy, Richard G; Machin, David

    2010-04-01

    In trials designed to delay or avoid irradiation among children with malignant brain tumors, irradiation after disease progression is an important event, but patients who have disease progression may decline radiotherapy (RT), while those without disease progression may opt for elective RT. To accurately describe the cumulative need for RT in such instances, it is crucial to account for these distinct events and to evaluate how each contributes to the delay or advancement of irradiation via a competing risks analysis. We describe the summary of competing events in such trials using competing risks methods based on cumulative incidence functions and Gray's test. The results obtained are contrasted with standard survival methods based on Kaplan-Meier curves, cause-specific hazard functions and the log-rank test. The Kaplan-Meier method overestimates all event-specific rates. The cause-specific hazard analysis showed a reduction in hazards for all events (A: RT after progression; B: no RT after progression; C: elective RT) among children with ependymoma. For event A, a higher cumulative incidence was reported for ependymoma. Although Gray's test failed to detect any difference (p = 0.331) between histologic subtypes, the log-rank test suggested marginal evidence (p = 0.057). Similarly, for event C, the log-rank test found stronger evidence of a reduction in hazard among those with ependymoma (p = 0.005) than Gray's test (p = 0.086). In evaluating treatment differences, failing to account for competing risks using appropriate methodology may lead to incorrect interpretations.
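
    The contrast between the Kaplan-Meier complement and the cumulative incidence function can be illustrated with a short sketch. The Python fragment below uses invented event times and codes, not the trial data, and handles tied times one observation at a time for simplicity; it shows how 1 - KM overestimates the event-specific rate relative to the Aalen-Johansen-style cumulative incidence:

        import numpy as np

        # Hypothetical data: time to first event, and event code
        # 0 = censored, 1 = RT after progression, 2 = competing events
        time  = np.array([2, 3, 3, 5, 6, 7, 8, 10, 12, 15], dtype=float)
        event = np.array([1, 2, 1, 0, 1, 2, 0,  1,  2,  0])

        order = np.argsort(time)
        time, event = time[order], event[order]

        n_at_risk = len(time)
        surv = 1.0          # all-cause survival just before t
        cif1 = 0.0          # cumulative incidence of event type 1
        km1 = 1.0           # naive KM treating type 2 as censoring

        for t, e in zip(time, event):
            d1 = 1.0 if e == 1 else 0.0
            d_any = 1.0 if e != 0 else 0.0
            cif1 += surv * d1 / n_at_risk        # cumulative incidence increment
            surv *= 1.0 - d_any / n_at_risk      # all-cause survival update
            km1 *= 1.0 - d1 / n_at_risk          # naive KM factor for event 1
            n_at_risk -= 1

        print(f"CIF(type 1)    = {cif1:.3f}")
        print(f"1 - KM(type 1) = {1.0 - km1:.3f}  (overestimate)")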

  12. Generalised synthesis of space-time variability in flood response: Dynamics of flood event types

    NASA Astrophysics Data System (ADS)

    Viglione, Alberto; Battista Chirico, Giovanni; Komma, Jürgen; Woods, Ross; Borga, Marco; Blöschl, Günter

    2010-05-01

    An analytical framework is used to characterise five flood events of different type in the Kamp area in Austria: one long-rain event, two short-rain events, one rain-on-snow event and one snowmelt event. Specifically, the framework quantifies the contributions of the space-time variability of rainfall/snowmelt, runoff coefficient, hillslope and channel routing to the flood runoff volume and the delay and spread of the resulting hydrograph. The results indicate that the components obtained by the framework clearly reflect the individual processes which characterise the event types. For the short-rain events, temporal, spatial and movement components can all be important in runoff generation and routing, which would be expected because of their local nature in time and, particularly, in space. For the long-rain event, the temporal components tend to be more important for runoff generation, because of the more uniform spatial coverage of rainfall, while for routing the spatial distribution of the produced runoff, which is not uniform, is also important. For the rain-on-snow and snowmelt events, the spatio-temporal variability terms typically do not play much role in runoff generation and the spread of the hydrograph is mainly due to the duration of the event. As an outcome of the framework, a dimensionless response number is proposed that represents the joint effect of runoff coefficient and hydrograph peakedness and captures the absolute magnitudes of the observed flood peaks.

  13. Towards Large-scale Twitter Mining for Drug-related Adverse Events.

    PubMed

    Bian, Jiang; Topaloglu, Umit; Yu, Fan

    2012-10-29

    Drug-related adverse events pose substantial risks to patients who consume post-market or investigational drugs. Early detection of adverse events benefits not only the drug regulators, but also the manufacturers for pharmacovigilance. Existing methods rely on patients' "spontaneous" self-reports that attest problems. The increasing popularity of social media platforms like Twitter presents us with a new information source for finding potential adverse events. Given the high frequency of user updates, mining Twitter messages can lead us to real-time pharmacovigilance. In this paper, we describe an approach to find drug users and potential adverse events by analyzing the content of Twitter messages utilizing Natural Language Processing (NLP) and by building Support Vector Machine (SVM) classifiers. Due to the size of the dataset (i.e., 2 billion Tweets), the experiments were conducted on a High Performance Computing (HPC) platform using MapReduce, which exhibits the trend of big data analytics. The results suggest that daily-life social networking data could help early detection of important patient safety issues.
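
    As a rough illustration of the classification step, the sketch below builds a TF-IDF plus linear SVM pipeline of the kind described. It assumes scikit-learn is available; the tweets and labels are invented placeholders, not the study's 2-billion-tweet corpus:

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.pipeline import make_pipeline
        from sklearn.svm import LinearSVC

        # Placeholder tweets: 1 = mentions a potential adverse event, 0 = not
        tweets = [
            "this new med gives me terrible headaches",
            "started the drug last week, feeling dizzy all day",
            "great weather today, going for a run",
            "coffee with friends this morning",
        ]
        labels = [1, 1, 0, 0]

        # Bag-of-ngrams features feeding a linear SVM classifier
        clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
        clf.fit(tweets, labels)

        print(clf.predict(["been so nauseous since starting this medication"]))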

  14. Tutorial in medical decision modeling incorporating waiting lines and queues using discrete event simulation.

    PubMed

    Jahn, Beate; Theurl, Engelbert; Siebert, Uwe; Pfeiffer, Karl-Peter

    2010-01-01

    In most decision-analytic models in health care, it is assumed that there is treatment without delay and availability of all required resources. Therefore, waiting times caused by limited resources and their impact on treatment effects and costs often remain unconsidered. Queuing theory enables mathematical analysis and the derivation of several performance measures of queuing systems. Nevertheless, an analytical approach with closed formulas is not always possible. Therefore, simulation techniques are used to evaluate systems that include queuing or waiting, for example, discrete event simulation. To include queuing in decision-analytic models requires a basic knowledge of queuing theory and of the underlying interrelationships. This tutorial introduces queuing theory. Analysts and decision-makers get an understanding of queue characteristics, modeling features, and its strength. Conceptual issues are covered, but the emphasis is on practical issues like modeling the arrival of patients. The treatment of coronary artery disease with percutaneous coronary intervention including stent placement serves as an illustrative queuing example. Discrete event simulation is applied to explicitly model resource capacities, to incorporate waiting lines and queues in the decision-analytic modeling example.
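
    The queuing logic at the heart of such a model can be sketched in a few lines. The fragment below is illustrative only: the rates are invented and the single-server FIFO discipline is an assumption, far simpler than the stent-placement example. It simulates an M/M/1 queue patient by patient and checks the mean wait against the analytical result W_q = rho / (mu - lambda):

        import random

        random.seed(1)
        lam, mu = 0.8, 1.0              # assumed arrival and service rates (lam < mu)
        n_patients = 100_000

        t = 0.0                         # current arrival time
        server_free_at = 0.0            # when the single server next becomes idle
        total_wait = 0.0
        for _ in range(n_patients):
            t += random.expovariate(lam)          # next patient arrives
            start = max(t, server_free_at)        # wait if the server is busy
            total_wait += start - t
            server_free_at = start + random.expovariate(mu)

        print("simulated mean wait :", total_wait / n_patients)
        print("analytical M/M/1 Wq :", (lam / mu) / (mu - lam))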

  15. Absolute nuclear material assay

    DOEpatents

    Prasad, Manoj K [Pleasanton, CA; Snyderman, Neal J [Berkeley, CA; Rowland, Mark S [Alamo, CA

    2012-05-15

    A method of absolute nuclear material assay of an unknown source comprising counting neutrons from the unknown source and providing an absolute nuclear material assay utilizing a model to optimally compare to the measured count distributions. In one embodiment, the step of providing an absolute nuclear material assay comprises utilizing a random sampling of analytically computed fission chain distributions to generate a continuous time-evolving sequence of event-counts by spreading the fission chain distribution in time.
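
    A heavily simplified sketch of the idea of spreading fission-chain counts in time follows; the chain-size distribution, rates and time constant are invented for illustration and are not the patented model:

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical fission-chain multiplicity distribution (illustrative only)
        chain_sizes = np.arange(1, 6)
        chain_probs = np.array([0.5, 0.25, 0.15, 0.07, 0.03])

        n_chains = 1000
        tau = 50e-6                                   # assumed die-away time constant (s)

        events = []
        chain_starts = np.cumsum(rng.exponential(1e-3, n_chains))  # chain start times
        for t0 in chain_starts:
            k = rng.choice(chain_sizes, p=chain_probs)    # neutrons in this chain
            events.extend(t0 + rng.exponential(tau, k))   # spread the chain in time

        events = np.sort(events)      # time-evolving sequence of neutron event counts
        print(len(events), "neutron events; first five:", events[:5])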

  16. Absolute nuclear material assay

    DOEpatents

    Prasad, Manoj K [Pleasanton, CA; Snyderman, Neal J [Berkeley, CA; Rowland, Mark S [Alamo, CA

    2010-07-13

    A method of absolute nuclear material assay of an unknown source comprising counting neutrons from the unknown source and providing an absolute nuclear material assay utilizing a model to optimally compare to the measured count distributions. In one embodiment, the step of providing an absolute nuclear material assay comprises utilizing a random sampling of analytically computed fission chain distributions to generate a continuous time-evolving sequence of event-counts by spreading the fission chain distribution in time.

  17. Comparison of three sampling and analytical methods for the determination of airborne hexavalent chromium.

    PubMed

    Boiano, J M; Wallace, M E; Sieber, W K; Groff, J H; Wang, J; Ashley, K

    2000-08-01

    A field study was conducted with the goal of comparing the performance of three recently developed or modified sampling and analytical methods for the determination of airborne hexavalent chromium (Cr(VI)). The study was carried out in a hard chrome electroplating facility and in a jet engine manufacturing facility where airborne Cr(VI) was expected to be present. The analytical methods evaluated included two laboratory-based procedures (OSHA Method ID-215 and NIOSH Method 7605) and a field-portable method (NIOSH Method 7703). These three methods employ an identical sampling methodology: collection of Cr(VI)-containing aerosol on a polyvinyl chloride (PVC) filter housed in a sampling cassette, which is connected to a personal sampling pump calibrated at an appropriate flow rate. All three methods are based on extraction of the PVC filter in alkaline buffer solution, chemical isolation of the Cr(VI) ion, complexation of the Cr(VI) ion with 1,5-diphenylcarbazide, and spectrometric measurement of the violet chromium diphenylcarbazone complex at 540 nm. However, there are notable differences within the sample preparation procedures used in the three methods. To assess the comparability of the three measurement protocols, a total of 20 side-by-side air samples were collected, equally divided between a chromic acid electroplating operation and a spray paint operation where water-soluble forms of Cr(VI) were used. A range of Cr(VI) concentrations from 0.6 to 960 microg m(-3), with Cr(VI) mass loadings ranging from 0.4 to 32 microg, was measured at the two operations. The equivalence of the means of the log-transformed Cr(VI) concentrations obtained from the different analytical methods was tested. Based on analysis of variance (ANOVA) results, no statistically significant differences were observed between mean values measured using each of the three methods. Small but statistically significant differences were observed between results obtained from performance evaluation samples for the NIOSH field method and the OSHA laboratory method.
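
    The statistical comparison described, a one-way ANOVA on log-transformed side-by-side concentrations, can be reproduced in outline as follows. The values are invented for illustration and scipy is assumed to be available:

        import numpy as np
        from scipy import stats

        # Hypothetical side-by-side Cr(VI) results (ug/m3) from the three methods
        osha_215   = np.array([12.1, 45.0, 3.2, 150.0, 0.9])
        niosh_7605 = np.array([11.4, 47.2, 3.5, 142.0, 1.1])
        niosh_7703 = np.array([13.0, 43.1, 2.9, 158.0, 0.8])

        # Compare means of log-transformed concentrations, as in the study
        f_stat, p_value = stats.f_oneway(*(np.log(x) for x in
                                           (osha_215, niosh_7605, niosh_7703)))
        print(f"F = {f_stat:.3f}, p = {p_value:.3f}")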

  18. Blade loss transient dynamic analysis of turbomachinery

    NASA Technical Reports Server (NTRS)

    Stallone, M. J.; Gallardo, V.; Storace, A. F.; Bach, L. J.; Black, G.; Gaffney, E. F.

    1982-01-01

    This paper reports on work completed to develop an analytical method for predicting the transient non-linear response of a complete aircraft engine system due to the loss of a fan blade, and to validate the analysis by comparing the results against actual blade loss test data. The solution, which is based on the component element method, accounts for rotor-to-casing rubs, high damping and rapid deceleration rates associated with the blade loss event. A comparison of test results and predicted response shows good agreement except for an initial overshoot spike not observed in test. The method is effective for the analysis of large systems.

  19. Accounting for sensor calibration, data validation, measurement and sampling uncertainties in monitoring urban drainage systems.

    PubMed

    Bertrand-Krajewski, J L; Bardin, J P; Mourad, M; Béranger, Y

    2003-01-01

    Assessing the functioning and the performance of urban drainage systems on both rainfall event and yearly time scales is usually based on online measurements of flow rates and on samples of influent and effluent for some rainfall events per year. In order to draw pertinent scientific and operational conclusions from the measurement results, it is absolutely necessary to use appropriate methods and techniques in order to i) calibrate sensors and analytical methods, ii) validate raw data, iii) evaluate measurement uncertainties, and iv) evaluate the number of rainfall events to sample per year in order to determine performance indicators with a given uncertainty. Based on previous work, the paper gives a synthetic review of the required methods and techniques, and illustrates their application to storage and settling tanks. Experiments show that, under controlled and careful experimental conditions, relative uncertainties are about 20% for flow rates in sewer pipes, 6-10% for volumes, 25-35% for TSS concentrations and loads, and 18-276% for TSS removal rates. In order to evaluate the annual pollutant interception efficiency of storage and settling tanks with a given uncertainty, efforts should first be devoted to decreasing the sampling uncertainty by increasing the number of sampled events.
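
    For first-order propagation of independent relative uncertainties, a pollutant load L = Q * C inherits u(L)/L = sqrt((u(Q)/Q)^2 + (u(C)/C)^2). A two-line check with magnitudes of the order quoted above (20% on flow rate, 30% on TSS concentration) gives roughly 36%, consistent with the reported load uncertainties; the exact inputs here are illustrative assumptions:

        import math

        u_q = 0.20      # assumed relative uncertainty on flow rate
        u_c = 0.30      # assumed relative uncertainty on TSS concentration

        u_load = math.sqrt(u_q**2 + u_c**2)   # first-order, independent inputs
        print(f"relative uncertainty on TSS load ~ {u_load:.0%}")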

  20. Behavior dynamics: One perspective

    PubMed Central

    Marr, M. Jackson

    1992-01-01

    Behavior dynamics is a field devoted to analytic descriptions of behavior change. A principal source of both models and methods for these descriptions is found in physics. This approach is an extension of a long conceptual association between behavior analysis and physics. A theme common to both is the role of molar versus molecular events in description and prediction. Similarities and differences in how these events are treated are discussed. Two examples are presented that illustrate possible correspondence between mechanical and behavioral systems. The first demonstrates the use of a mechanical model to describe the molar properties of behavior under changing reinforcement conditions. The second, dealing with some features of concurrent schedules, focuses on the possible utility of nonlinear dynamical systems for describing both molar and molecular behavioral events as the outcome of a deterministic, but chaotic, process. PMID:16812655

  1. Logic flowgraph methodology - A tool for modeling embedded systems

    NASA Technical Reports Server (NTRS)

    Muthukumar, C. T.; Guarro, S. B.; Apostolakis, G. E.

    1991-01-01

    The logic flowgraph methodology (LFM), a method for modeling hardware in terms of its process parameters, has been extended to form an analytical tool for the analysis of integrated (hardware/software) embedded systems. In the software part of a given embedded system model, timing and the control flow among different software components are modeled by augmenting LFM with modified Petri net structures. The objective of using such an augmented LFM model is to uncover possible errors and the potential for unanticipated software/hardware interactions. This is done by backtracking through the augmented LFM model according to established procedures which allow the semiautomated construction of fault trees for any chosen state of the embedded system (top event). These fault trees, in turn, produce the possible combinations of lower-level states (events) that may lead to the top event.

  2. A Widely Applicable Silver Sol for TLC Detection with Rich and Stable SERS Features.

    PubMed

    Zhu, Qingxia; Li, Hao; Lu, Feng; Chai, Yifeng; Yuan, Yongfang

    2016-12-01

    Thin-layer chromatography (TLC) coupled with surface-enhanced Raman spectroscopy (SERS) has gained tremendous popularity in the study of various complex systems. However, the detection of hydrophobic analytes is difficult, and the specificity still needs to be improved. In this study, a SERS-active non-aqueous silver sol which could activate the analytes to produce rich and stable spectral features was rapidly synthesized. Then, the optimized silver nanoparticles (AgNPs)-DMF sol was employed for TLC-SERS detection of hydrophobic (and also hydrophilic) analytes. SERS performance of this sol was superior to that of traditional Lee-Meisel AgNPs due to its high specificity, acceptable stability, and wide applicability. The non-aqueous AgNPs would be suitable for the TLC-SERS method, which shows great promise for applications in food safety assurance, environmental monitoring, medical diagnoses, and many other fields.

  3. A Widely Applicable Silver Sol for TLC Detection with Rich and Stable SERS Features

    NASA Astrophysics Data System (ADS)

    Zhu, Qingxia; Li, Hao; Lu, Feng; Chai, Yifeng; Yuan, Yongfang

    2016-04-01

    Thin-layer chromatography (TLC) coupled with surface-enhanced Raman spectroscopy (SERS) has gained tremendous popularity in the study of various complex systems. However, the detection of hydrophobic analytes is difficult, and the specificity still needs to be improved. In this study, a SERS-active non-aqueous silver sol which could activate the analytes to produce rich and stable spectral features was rapidly synthesized. Then, the optimized silver nanoparticles (AgNPs)-DMF sol was employed for TLC-SERS detection of hydrophobic (and also hydrophilic) analytes. SERS performance of this sol was superior to that of traditional Lee-Meisel AgNPs due to its high specificity, acceptable stability, and wide applicability. The non-aqueous AgNPs would be suitable for the TLC-SERS method, which shows great promise for applications in food safety assurance, environmental monitoring, medical diagnoses, and many other fields.

  4. Comparison of bipolar vs. tripolar concentric ring electrode Laplacian estimates.

    PubMed

    Besio, W; Aakula, R; Dai, W

    2004-01-01

    Potentials on the body surface arising from the heart are functions of both space and time. The 12-lead electrocardiogram (ECG) provides useful global temporal assessment, but it yields limited spatial information due to the smoothing effect caused by the volume conductor. The smoothing complicates identification of multiple simultaneous bioelectrical events. In an attempt to circumvent the smoothing problem, some researchers used a five-point method (FPM) to numerically estimate the analytical solution of the Laplacian with an array of monopolar electrodes. The FPM was generalized to develop a bipolar concentric ring electrode system. We have developed a new Laplacian ECG sensor, a tri-electrode sensor, based on a nine-point method (NPM) numerical approximation of the analytical Laplacian. For comparison, the NPM, FPM and compact NPM were calculated over a 400 x 400 mesh with 1/400 spacing. Tri- and bi-electrode sensors were also simulated and their Laplacian estimates were compared against the analytical Laplacian. We found that tri-electrode sensors have much-improved accuracy, with significantly smaller relative and maximum errors in estimating the Laplacian operator. Apart from the higher accuracy, our new electrode configuration will allow better localization of the electrical activity of the heart than bi-electrode configurations.
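
    The five-point method referred to is the standard finite-difference stencil for the Laplacian. A minimal numpy sketch on a synthetic potential, for which the analytical Laplacian is known, illustrates the approximation that the electrode arrays implement in hardware (the test function is an assumption for demonstration, not the cardiac data):

        import numpy as np

        h = 1.0 / 400                        # grid spacing, as in the 400 x 400 mesh
        x, y = np.meshgrid(np.linspace(0, 1, 401), np.linspace(0, 1, 401))
        phi = x**2 + y**2                    # synthetic potential; true Laplacian = 4

        # Five-point method (FPM): four neighbours minus four times the centre
        lap = (phi[1:-1, :-2] + phi[1:-1, 2:] + phi[:-2, 1:-1] + phi[2:, 1:-1]
               - 4.0 * phi[1:-1, 1:-1]) / h**2

        print("max |FPM - analytic| =", np.abs(lap - 4.0).max())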

  5. Quantitative determination of BAF312, a S1P-R modulator, in human urine by LC-MS/MS: prevention and recovery of lost analyte due to container surface adsorption.

    PubMed

    Li, Wenkui; Luo, Suyi; Smith, Harold T; Tse, Francis L S

    2010-02-15

    Analyte loss due to non-specific binding, especially container surface adsorption, is not uncommon in the quantitative analysis of urine samples. In developing a sensitive LC-MS/MS method for the determination of a drug candidate, BAF312, in human urine, a simple procedure was outlined for identifying, confirming and preventing non-specific binding of the analyte to container surfaces, and for recovering the 'non-specific loss' of an analyte, provided the original urine samples have not been transferred from their container. Non-specific binding or container surface adsorption can be quickly identified by using freshly spiked urine calibration standards and pre-pooled QC samples during an LC-MS/MS feasibility run. The resulting low recovery of an analyte in urine samples can be prevented through the use of additives, such as the non-ionic surfactant Tween-80, CHAPS and others, added to the container prior to urine sample collection. If the urine samples have not been transferred from the bulk container, the 'non-specific binding' of an analyte to the container surface can be reversed by the addition of a specified amount of CHAPS, Tween-80 or bovine serum albumin, followed by appropriate mixing. Among the above agents, Tween-80 is the most cost-effective. beta-Cyclodextrin may be suitable for stabilizing the analyte of interest in urine via pre-treating the matrix with the agent. However, post-addition of beta-cyclodextrin to untreated urine samples does not recover the 'lost' analyte due to non-specific binding or container surface adsorption. In the case of BAF312, a dynamic range of 0.0200-20.0 ng/ml in human urine was validated with an overall accuracy and precision for QC sample results ranging from -3.2 to 5.1% (bias) and 3.9 to 10.2% (CV), respectively. Pre- and post-addition of 0.5% (v/v) Tween-80 to the container provided excellent overall analyte recovery and minimal MS signal suppression when a liquid-liquid extraction in combination with an isocratic LC separation was employed. The compound was stable in 0.5% Tween-80 treated human urine QC samples for at least 24 h at room temperature, after three freeze/thaw cycles with storage at < or =-60 degrees C, and for at least 3 months when stored at < or =-60 degrees C. The current work could serve as a simple example in troubleshooting non-specific binding or container surface adsorption in the quantitative analysis of urine samples. Copyright 2010. Published by Elsevier B.V.

  6. Behavioral assessment of personality disorders.

    PubMed

    Nelson-Gray, R O; Farmer, R F

    1999-04-01

    This article examines the definition of personality disorders (PDs) from a functional analytical framework and discusses the potential utility of such a framework to account for behavioral tendencies associated with PD pathology. Also reviewed are specific behavioral assessment methods that can be employed in the assessment of PDs, and how information derived from these assessments may be linked to specific intervention strategies.

  7. Multi-analyte method development for analysis of brominated flame retardants (BFRs) and PBDE metabolites in human serum.

    PubMed

    Lu, Dasheng; Jin, Yu'e; Feng, Chao; Wang, Dongli; Lin, Yuanjie; Qiu, Xinlei; Xu, Qian; Wen, Yimin; She, Jianwen; Wang, Guoquan; Zhou, Zhijun

    2017-09-01

    Commonly, analytical methods measuring brominated flame retardants (BFRs) of different chemical polarities in human serum are labor-intensive and tedious. Our study used acidified diatomaceous earth as solid-phase extraction (SPE) adsorbent and defatting material to simultaneously determine the most abundant BFRs and their metabolites with different polarities in human serum samples. The analytes include three types of commercial BFRs, tetrabromobisphenol A (TBBPA), hexabromocyclododecane (HBCD) isomers, and polybrominated biphenyl ethers (PBDEs), as well as the dominant hydroxylated BDE (OH-PBDE) and methoxylated BDE (MeO-PBDE) metabolites of PBDEs. The sample eluents were sequentially analyzed for PBDEs and MeO-BDEs on online gel permeation chromatography/gas chromatography-electron capture-negative ionization mass spectrometry (online GPC GC-ECNI-MS) and for TBBPA, HBCD, and OH-BDEs on liquid chromatography-tandem mass spectrometry (LC-MS/MS). Method recoveries were 67-134% with a relative standard deviation (RSD) of less than 20%. Method detection limits (MDLs) were 0.30-4.20 pg/mL fresh weight (f.w.) for all analytes, except for BDE-209 at 16 pg/mL f.w. The methodology was also applied in a pilot study, which analyzed ten real samples from healthy donors in China; the majority of target analytes were detected, with a detection rate of more than 80%. To our knowledge, this is the first method to effectively determine most types of BFRs in a single aliquot of human serum. This new analytical method is more specific, sensitive, accurate, and time-saving for routine biomonitoring of these BFRs and for integrated assessment of the health risk of BFR exposure.

  8. Particle behavior simulation in thermophoresis phenomena by direct simulation Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Wada, Takao

    2014-07-01

    A particle motion considering thermophoretic force is simulated by using the direct simulation Monte Carlo (DSMC) method. Thermophoresis phenomena, which occur for a particle size of 1 μm, are treated in this paper. The problem with thermophoresis simulation is the computation time, which is proportional to the collision frequency; the time step interval becomes very small for simulations considering the motion of large particles. Thermophoretic forces calculated by the DSMC method have been reported, but the particle motion was not computed because of the small time step interval. In this paper, a molecule-particle collision model, which computes the collision between a particle and multiple molecules in a single collision event, is considered. The momentum transfer to the particle is computed with a collision weight factor, where the collision weight factor is the number of molecules colliding with a particle in one collision event. A large time step interval is made possible by the collision weight factor; for a particle size of 1 μm it is about a million times longer than the conventional time step interval of the DSMC method, so the computation time becomes about one-millionth. We simulate graphite particle motion considering thermophoretic force with DSMC-Neutrals (Particle-PLUS neutral module), commercial software adopting the DSMC method, using the above collision weight factor. The size and the shape of the particle are 1 μm and a sphere, respectively. Particle-particle collisions are ignored. We compute the thermophoretic forces in Ar and H2 gases over a pressure range from 0.1 to 100 mTorr. The results agree well with Gallis' analytical results; note that Gallis' analytical result for the continuum limit is the same as Waldmann's result.
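
    The collision weight factor amounts to replacing W individual molecule-particle collisions per event with one aggregated momentum kick. The following schematic sketch conveys only the scaling idea; all numbers and the crude momentum model are illustrative assumptions, not the paper's DSMC implementation:

        import numpy as np

        k_B = 1.380649e-23
        T = 300.0                                # gas temperature (K), assumed
        p = 0.1 * 0.1333                         # 0.1 mTorr in Pa
        m_mol = 6.63e-26                         # Ar molecule mass (kg)
        r_p = 0.5e-6                             # particle radius (1 um diameter)

        n = p / (k_B * T)                        # molecular number density
        v_mean = np.sqrt(8 * k_B * T / (np.pi * m_mol))
        nu = n * np.pi * r_p**2 * v_mean         # molecule-particle collision rate

        dt = 1e-4                                # large time step enabled by weighting
        W = nu * dt                              # collision weight factor

        # One aggregated event: W molecules each transfer ~ m_mol * v_mean of momentum
        m_p = 2200 * (4 / 3) * np.pi * r_p**3    # graphite-like particle mass
        dv = W * m_mol * v_mean / m_p            # illustrative velocity change per event
        print(f"W = {W:.1f} collisions/event, dv = {dv:.3e} m/s")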

  9. Applicability of rapid and on-site measured enzyme activity for surface water quality monitoring in an agricultural catchment

    NASA Astrophysics Data System (ADS)

    Stadler, Philipp; Farnleitner, Andreas H.; Sommer, Regina; Kumpan, Monika; Zessner, Matthias

    2014-05-01

    For the near real-time and on-site detection of microbiological fecal pollution of water, the measurement of beta-D-glucuronidase (GLUC) enzymatic activity has been suggested as a surrogate parameter and has already been operated successfully for water quality monitoring of groundwater resources (Ryzinska-Paier et al. 2014). With measurement intervals as short as three hours, this method has high potential as a water quality monitoring tool: while cultivation-based standard determination takes more than one working day (Cabral 2010), detecting GLUC activity offers high temporal measuring resolution. Yet there is still a large knowledge gap concerning the fecal indication capacity of GLUC (specificity, sensitivity, persistence, etc.) in relation to potential pollution sources and catchment conditions (Cabral 2010, Ryzinska-Paier et al. 2014). Furthermore, surface waters pose a major technical challenge for automated detection devices because of the high sediment load during event conditions. This presentation shows results gained from two years of monitoring in an experimental catchment (HOAL) dominated by agricultural land use. Two enzymatic measurement devices are operated in parallel at the catchment outlet to test the reproducibility and precision of the method. Data from continuous GLUC monitoring under both base flow and event conditions are compared with reference samples analyzed by standardized laboratory methods for fecal pollution detection (e.g. ISO 16649-1, Colilert-18). It is shown that rapid enzymatic on-site GLUC determination can, from a technical point of view, be operated successfully for surface water quality monitoring under the observed catchment conditions. The comparison of enzyme activity with microbiological standard analytics reveals distinct differences in the dynamics of the signals during event conditions. References: Cabral J. P. S. (2010) "Water Microbiology. Bacterial Pathogens and Water." International Journal of Environmental Research and Public Health 7 (10): 3657-3703. Ryzinska-Paier, G., T. Lendenfeld, K. Correa, P. Stadler, A. P. Blaschke, R. L. Mach, H. Stadler, A. K. T. Kirschner and A. H. Farnleitner (2014) "A sensitive and robust method for automated on-line monitoring of enzymatic activities in water and water resources." Water Sci. Technol., in press.

  10. Application of fuzzy fault tree analysis based on modified fuzzy AHP and fuzzy TOPSIS for fire and explosion in the process industry.

    PubMed

    Yazdi, Mohammad; Korhan, Orhan; Daneshvar, Sahand

    2018-05-09

    This study aimed at establishing fault tree analysis (FTA) using expert opinion to compute the probability of an event. To find the probability of the top event (TE), all probabilities of the basic events (BEs) should be available when the FTA is drawn; where failure data are lacking, expert judgment can be used as an alternative. The fuzzy analytical hierarchy process is used as a standard technique to give a specific weight to each expert, and fuzzy set theory is engaged for aggregating expert opinion. In this way, the probabilities of the BEs are computed and, consequently, the probability of the TE is obtained using Boolean algebra. Additionally, to reduce the probability of the TE in terms of three parameters (safety consequences, cost and benefit), the importance measurement technique and a modified TOPSIS were employed. The effectiveness of the proposed approach is demonstrated with a real-life case study.
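
    Once basic-event probabilities have been aggregated from expert opinion, the top-event probability follows from Boolean algebra over the gates. A minimal sketch for a hypothetical two-gate tree (independent basic events assumed; the fuzzy weighting and aggregation steps themselves are omitted):

        from math import prod

        def and_gate(probs):
            # All inputs must occur: P = product of P_i (independence assumed)
            return prod(probs)

        def or_gate(probs):
            # At least one input occurs: P = 1 - product of (1 - P_i)
            return 1.0 - prod(1.0 - p for p in probs)

        # Hypothetical basic-event probabilities from aggregated expert opinion
        be = {"ignition": 0.01, "leak": 0.05, "sensor_fail": 0.02, "valve_fail": 0.03}

        release = or_gate([be["leak"], be["valve_fail"]])        # intermediate event
        top_event = and_gate([release,
                              or_gate([be["ignition"], be["sensor_fail"]])])
        print(f"P(top event) = {top_event:.5f}")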

  11. Comparative artificial neural network and partial least squares models for analysis of Metronidazole, Diloxanide, Spiramycin and Cliquinol in pharmaceutical preparations.

    PubMed

    Elkhoudary, Mahmoud M; Abdel Salam, Randa A; Hadad, Ghada M

    2014-09-15

    Metronidazole (MNZ) is a widely used antibacterial and amoebicide drug. Therefore, it is important to develop a rapid and specific analytical method for the determination of MNZ in mixture with Spiramycin (SPY), Diloxanide (DIX) and Cliquinol (CLQ) in pharmaceutical preparations. This work describes six simple, sensitive and reliable multivariate calibration methods, namely linear and nonlinear artificial neural networks preceded by genetic algorithm (GA-ANN) and principal component analysis (PCA-ANN), as well as partial least squares (PLS) either alone or preceded by genetic algorithm (GA-PLS), for UV spectrophotometric determination of MNZ, SPY, DIX and CLQ in pharmaceutical preparations with no interference from pharmaceutical additives. The results illustrate the problem of nonlinearity and how models like ANN can handle it. The analytical performance of these methods was statistically validated with respect to linearity, accuracy, precision and specificity. The developed methods demonstrate the ability of the above multivariate calibration models to handle and resolve UV spectra of the four-component mixtures using a simple and widely available UV spectrophotometer. Copyright © 2014 Elsevier B.V. All rights reserved.

  12. Comparative artificial neural network and partial least squares models for analysis of Metronidazole, Diloxanide, Spiramycin and Cliquinol in pharmaceutical preparations

    NASA Astrophysics Data System (ADS)

    Elkhoudary, Mahmoud M.; Abdel Salam, Randa A.; Hadad, Ghada M.

    2014-09-01

    Metronidazole (MNZ) is a widely used antibacterial and amoebicide drug. Therefore, it is important to develop a rapid and specific analytical method for the determination of MNZ in mixture with Spiramycin (SPY), Diloxanide (DIX) and Cliquinol (CLQ) in pharmaceutical preparations. This work describes six simple, sensitive and reliable multivariate calibration methods, namely linear and nonlinear artificial neural networks preceded by genetic algorithm (GA-ANN) and principal component analysis (PCA-ANN), as well as partial least squares (PLS) either alone or preceded by genetic algorithm (GA-PLS), for UV spectrophotometric determination of MNZ, SPY, DIX and CLQ in pharmaceutical preparations with no interference from pharmaceutical additives. The results illustrate the problem of nonlinearity and how models like ANN can handle it. The analytical performance of these methods was statistically validated with respect to linearity, accuracy, precision and specificity. The developed methods demonstrate the ability of the above multivariate calibration models to handle and resolve UV spectra of the four-component mixtures using a simple and widely available UV spectrophotometer.
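
    Of the chemometric models compared, partial least squares is the most compact to sketch. The fragment below assumes scikit-learn and uses synthetic spectra standing in for the UV data; mixture compositions and noise level are invented:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(42)

        # Synthetic stand-in for UV spectra: 40 mixtures x 100 wavelengths,
        # absorbance = concentrations @ pure-component spectra + noise
        pure = rng.random((4, 100))              # 4 components (MNZ, SPY, DIX, CLQ)
        conc = rng.random((40, 4))
        spectra = conc @ pure + rng.normal(0, 0.01, (40, 100))

        pls = PLSRegression(n_components=4)
        pls.fit(spectra[:30], conc[:30])         # calibration set
        pred = pls.predict(spectra[30:])         # validation set
        print("RMSEP per component:",
              np.sqrt(((pred - conc[30:]) ** 2).mean(axis=0)).round(4))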

  13. Fabrication of microfluidic integrated biosensor

    NASA Astrophysics Data System (ADS)

    Adam, Tijjani; Dhahi, Th S.; Mohammed, Mohammed; Hashim, U.; Noriman, N. Z.; Dahham, Omar S.

    2017-09-01

    Miniaturized sensor systems for carrying out biological diagnostics are gaining widespread acceptance. Such a system may contain several different sensor units for the detection of specific analytes, where the analyte to be detected might be any kind of biological molecule (DNA, mRNA or proteins) or chemical substance. In most cases, the detection is based on receptor-ligand binding, such as DNA hybridization or antibody-antigen interaction, achieved on a nanostructure: DNA or protein must be attached to certain locations within the structure, and a robust binding chemistry to the surface in the microstructure is critical. Here we successfully designed and fabricated a microfluidic element for passive fluid delivery into a polysilicon nanowire sensing domain, and we further demonstrated a very simple and effective way of integrating the two devices to give the full functionality of a laboratory on a single chip. The sensing element was successfully surface-modified and tested on real biomedical clinical samples for evaluation and validation.

  14. Rapid and reliable detection and identification of GM events using multiplex PCR coupled with oligonucleotide microarray.

    PubMed

    Xu, Xiaodan; Li, Yingcong; Zhao, Heng; Wen, Si-yuan; Wang, Sheng-qi; Huang, Jian; Huang, Kun-lun; Luo, Yun-bo

    2005-05-18

    To devise a rapid and reliable method for the detection and identification of genetically modified (GM) events, we developed a multiplex polymerase chain reaction (PCR) coupled with a DNA microarray system simultaneously aiming at many targets in a single reaction. The system included probes for screening gene, species reference gene, specific gene, construct-specific gene, event-specific gene, and internal and negative control genes. 18S rRNA was combined with species reference genes as internal controls to assess the efficiency of all reactions and to eliminate false negatives. Two sets of the multiplex PCR system were used to amplify four and five targets, respectively. Eight different structure genes could be detected and identified simultaneously for Roundup Ready soybean in a single microarray. The microarray specificity was validated by its ability to discriminate two GM maizes Bt176 and Bt11. The advantages of this method are its high specificity and greatly reduced false-positives and -negatives. The multiplex PCR coupled with microarray technology presented here is a rapid and reliable tool for the simultaneous detection of GM organism ingredients.

  15. PAUSE: Predictive Analytics Using SPARQL-Endpoints

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sukumar, Sreenivas R; Ainsworth, Keela; Bond, Nathaniel

    2014-07-11

    This invention relates to the medical industry and more specifically to methods of predicting risks. With the impetus towards personalized and evidence-based medicine, the need for a framework to analyze/interpret quantitative measurements (blood work, toxicology, etc.) with qualitative descriptions (specialist reports after reading images, bio-medical knowledgebase, etc.) to predict diagnostic risks is fast emerging. We describe a software solution that leverages hardware for scalable in-memory analytics and applies next-generation semantic query tools on medical data.
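
    Queries of the kind the abstract implies can be issued from Python with the SPARQLWrapper package; the endpoint URL and graph vocabulary below are hypothetical placeholders for illustration, not the patented framework:

        from SPARQLWrapper import SPARQLWrapper, JSON

        # Hypothetical endpoint and schema, for illustration only
        sparql = SPARQLWrapper("http://example.org/medical/sparql")
        sparql.setQuery("""
            PREFIX ex: <http://example.org/schema/>
            SELECT ?patient ?risk WHERE {
                ?patient ex:hasLabResult ?lab .
                ?lab ex:analyte "glucose" ; ex:value ?v .
                FILTER(?v > 200)
                BIND("elevated" AS ?risk)
            } LIMIT 10
        """)
        sparql.setReturnFormat(JSON)

        for row in sparql.query().convert()["results"]["bindings"]:
            print(row["patient"]["value"], row["risk"]["value"])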

  16. Whole genome sequence analysis of unidentified genetically modified papaya for development of a specific detection method.

    PubMed

    Nakamura, Kosuke; Kondo, Kazunari; Akiyama, Hiroshi; Ishigaki, Takumi; Noguchi, Akio; Katsumata, Hiroshi; Takasaki, Kazuto; Futo, Satoshi; Sakata, Kozue; Fukuda, Nozomi; Mano, Junichi; Kitta, Kazumi; Tanaka, Hidenori; Akashi, Ryo; Nishimaki-Mogami, Tomoko

    2016-08-15

    Identification of transgenic sequences in an unknown genetically modified (GM) papaya (Carica papaya L.) by whole genome sequence analysis was demonstrated. Whole genome sequence data were generated for a GM-positive fresh papaya fruit commodity detected in monitoring using real-time polymerase chain reaction (PCR). The sequences obtained were mapped against an open database for papaya genome sequence. Transgenic construct- and event-specific sequences were identified as a GM papaya developed to resist infection from a Papaya ringspot virus. Based on the transgenic sequences, a specific real-time PCR detection method for GM papaya applicable to various food commodities was developed. Whole genome sequence analysis enabled identifying unknown transgenic construct- and event-specific sequences in GM papaya and development of a reliable method for detecting them in papaya food commodities. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Characterization and measurement of natural gas trace constituents. Volume 1. Arsenic. Final report, June 1989-October 1993

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chao, S.S.; Attari, A.

    1995-01-01

    The discovery of arsenic compounds, as alkylarsines, in natural gas prompted this research program to develop reliable measurement techniques needed to assess the efficiency of removal processes for these environmentally sensitive substances. These techniques include sampling, speciation, quantitation and on-line instrumental methods for monitoring the total arsenic concentration. The current program has yielded many products, including calibration standards, arsenic-specific sorbents, sensitive analytical methods and instrumentation. Four laboratory analytical methods have been developed and successfully employed for arsenic determination in natural gas. These methods use GC-AED and GC-MS instruments to speciate alkylarsines, and peroxydisulfate extraction with FIAS, a special carbon sorbent with XRF, and an IGT-developed sorbent with GFAA for total arsenic measurement.

  18. Spectroscopic characterization and quantitative determination of atorvastatin calcium impurities by novel HPLC method

    NASA Astrophysics Data System (ADS)

    Gupta, Lokesh Kumar

    2012-11-01

    Seven process-related impurities were identified by LC-MS in the atorvastatin calcium drug substance. The structures of the impurities were confirmed by modern spectroscopic techniques such as 1H NMR and IR, and by physicochemical studies conducted using synthesized authentic reference compounds. The synthesized reference samples of the impurity compounds were used for quantitative HPLC determination. The impurities were detected by a newly developed gradient, reverse-phase high performance liquid chromatographic (HPLC) method. The system suitability of the HPLC analysis established the validity of the separation. The analytical method was validated according to International Conference on Harmonization (ICH) guidelines with respect to specificity, precision, accuracy, linearity, robustness and stability of analytical solutions, demonstrating the power of the newly developed HPLC method.

  19. Analytic concepts for assessing risk as applied to human space flight

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garrick, B J

    Quantitative risk assessment (QRA) principles provide an effective framework for quantifying individual elements of risk, including the risk to astronauts and spacecraft of the radiation environment of space flight. The concept of QRA is based on a structured set of scenarios that could lead to different damage states initiated by either hardware failure, human error, or external events. In the context of a spacecraft risk assessment, radiation may be considered as an external event and analyzed in the same basic way as any other contributor to risk. It is possible to turn up the microscope on any particular contributor to risk and ask more detailed questions than might be necessary to simply assess safety. The methods of QRA allow for as much fine structure in the analysis as is desired. For the purpose of developing a basis for comprehensive risk management and considering the tendency to "fear anything nuclear," radiation risk is a prime candidate for examination beyond that necessary to answer the basic question of risk. Thus, rather than considering only the customary damage states of fatalities or loss of a spacecraft, it is suggested that the full range of damage be analyzed to quantify radiation risk. Radiation dose levels in the form of a risk curve accomplish such a result. If the risk curve is the complementary cumulative distribution function, then it answers the extended question of what is the likelihood of receiving a specific dose of radiation or greater. Such results can be converted to specific health effects as desired. Knowing the full range of the radiation risk of a space mission and the contributors to that risk provides the information necessary to take risk management actions [operational, design, scheduling of missions around solar particle events (SPE), etc.] that clearly control radiation exposure.
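
    The risk curve described, a complementary cumulative distribution function over radiation dose, is straightforward to construct from scenario results. A sketch with invented dose-frequency pairs:

        import numpy as np

        # Hypothetical scenario results: dose (mSv) and annual frequency
        dose = np.array([10.0, 50.0, 100.0, 500.0, 1000.0])
        freq = np.array([1e-1, 1e-2, 1e-3, 1e-4, 1e-5])

        order = np.argsort(dose)
        dose, freq = dose[order], freq[order]

        # CCDF: frequency of receiving a given dose or greater
        ccdf = freq[::-1].cumsum()[::-1]
        for d, f in zip(dose, ccdf):
            print(f"P(dose >= {d:7.1f} mSv) = {f:.2e} per year")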

  20. Analytical control test plan and microbiological methods for the water recovery test

    NASA Technical Reports Server (NTRS)

    Traweek, M. S. (Editor); Tatara, J. D. (Editor)

    1994-01-01

    Qualitative and quantitative laboratory results are important to the decision-making process. In some cases, they may represent the only basis for deciding between two or more given options or processes. Therefore, it is essential that handling of laboratory samples and analytical operations employed are performed at a deliberate level of conscientious effort. Reporting erroneous results can lead to faulty interpretations and result in misinformed decisions. This document provides analytical control specifications which will govern future test procedures related to all Water Recovery Test (WRT) Phase 3 activities to be conducted at the National Aeronautics and Space Administration/Marshall Space Flight Center (NASA/MSFC). This document addresses the process which will be used to verify analytical data generated throughout the test period, and to identify responsibilities of key personnel and participating laboratories, the chains of communication to be followed, and ensure that approved methodology and procedures are used during WRT activities. This document does not outline specifics, but provides a minimum guideline by which sampling protocols, analysis methodologies, test site operations, and laboratory operations should be developed.

  1. Analytical methods for toxic gases from thermal degradation of polymers

    NASA Technical Reports Server (NTRS)

    Hsu, M.-T. S.

    1977-01-01

    Toxic gases evolved from the thermal oxidative degradation of synthetic or natural polymers in small laboratory chambers or in large scale fire tests are measured by several different analytical methods. Gas detector tubes are used for fast on-site detection of suspect toxic gases. The infrared spectroscopic method is an excellent qualitative and quantitative analysis for some toxic gases. Permanent gases such as carbon monoxide, carbon dioxide, methane and ethylene, can be quantitatively determined by gas chromatography. Highly toxic and corrosive gases such as nitrogen oxides, hydrogen cyanide, hydrogen fluoride, hydrogen chloride and sulfur dioxide should be passed into a scrubbing solution for subsequent analysis by either specific ion electrodes or spectrophotometric methods. Low-concentration toxic organic vapors can be concentrated in a cold trap and then analyzed by gas chromatography and mass spectrometry. The limitations of different methods are discussed.

  2. [The water content reference material of water saturated octanol].

    PubMed

    Wang, Haifeng; Ma, Kang; Zhang, Wei; Li, Zhanyuan

    2011-03-01

    The national standards for biofuels specify technical requirements and analytical methods. A water content certified reference material based on water-saturated octanol was developed in order to satisfy the needs of instrument calibration and method validation, and to assure the accuracy and consistency of results in water content measurements of biofuels. Three analytical methods based on different principles were employed to certify the water content of the reference material: Karl Fischer coulometric titration, Karl Fischer volumetric titration and quantitative nuclear magnetic resonance. The consistency of the coulometric and volumetric titrations was achieved through improvement of the methods. The accuracy of the certified result was improved by the introduction of the new method of quantitative nuclear magnetic resonance. Finally, the certified value of the reference material is 4.76% with an expanded uncertainty of 0.09%.

  3. New techniques for the analysis of manual control systems. [mathematical models of human operator behavior

    NASA Technical Reports Server (NTRS)

    Bekey, G. A.

    1971-01-01

    Studies are summarized on the application of advanced analytical and computational methods to the development of mathematical models of human controllers in multiaxis manual control systems. Specific accomplishments include the following: (1) The development of analytical and computer methods for the measurement of random parameters in linear models of human operators. (2) Discrete models of human operator behavior in a multiple display situation were developed. (3) Sensitivity techniques were developed which make possible the identification of unknown sampling intervals in linear systems. (4) The adaptive behavior of human operators following particular classes of vehicle failures was studied and a model structure proposed.

  4. Event-specific plasmid standards and real-time PCR methods for transgenic Bt11, Bt176, and GA21 maize and transgenic GT73 canola.

    PubMed

    Taverniers, Isabel; Windels, Pieter; Vaïtilingom, Marc; Milcamps, Anne; Van Bockstaele, Erik; Van den Eede, Guy; De Loose, Marc

    2005-04-20

    Since the 18th of April 2004, two new regulations, EC/1829/2003 on genetically modified food and feed products and EC/1830/2003 on traceability and labeling of GMOs, have been in force in the EU. This new, comprehensive regulatory framework emphasizes the need for an adequate tracing system. Unique identifiers, such as the transgene-genome junction region or a specific rearrangement within the transgene DNA, should form the basis of such a tracing system. In this study, we describe the development of event-specific tracing systems for transgenic maize lines Bt11, Bt176, and GA21 and for canola event GT73. Molecular characterization of the transgene loci enabled us to clone an event-specific sequence into a plasmid vector, to be used as a marker, and to develop line-specific primers. Primer specificity was tested through qualitative PCRs and dissociation curve analysis in SYBR Green I real-time PCRs. The primers were then combined with event-specific TaqMan probes in quantitative real-time PCRs. Calibration curves were set up both with genomic DNA samples and with the newly synthesized plasmid DNA markers. It is shown that cloned plasmid GMO target sequences are perfectly suitable as unique identifiers and quantitative calibrators. Together with an event-specific primer pair and a highly specific TaqMan probe, the plasmid markers form crucial components of a unique and straightforward tracing system for Bt11, Bt176, and GA21 maize and GT73 canola events.
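
    The quantitative use of such plasmid calibrators reduces to fitting Ct against log10 copy number. A sketch with an invented dilution series (numpy assumed); the slope also yields the amplification efficiency E = 10^(-1/slope) - 1:

        import numpy as np

        # Hypothetical plasmid dilution series: copy number vs. measured Ct
        copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6])
        ct     = np.array([33.1, 29.8, 26.4, 23.1, 19.7])

        slope, intercept = np.polyfit(np.log10(copies), ct, 1)
        efficiency = 10 ** (-1.0 / slope) - 1.0   # 1.0 = perfect doubling per cycle

        ct_unknown = 27.5                          # illustrative unknown sample
        log_copies = (ct_unknown - intercept) / slope
        print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}")
        print(f"unknown sample ~ {10**log_copies:.0f} copies")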

  5. Re-inventing prevention? - An evaluation of tools for strengthening private preparedness for floods and heavy rains

    NASA Astrophysics Data System (ADS)

    Rohland, Stefanie; Pfurtscheller, Clemens; Seebauer, Sebastian

    2016-04-01

    Keywords: private preparedness, property protection, flood, heavy rains, Transtheoretical Model, evaluation of methods and tools. Experiences in Europe and Austria from coping with numerous floods and heavy rain events in recent decades point to room for improvement in reducing damages and adverse effects. One of the emerging issues is private preparedness, which has so far received only sporadic attention in Austria. Current activities to promote property protection are, however, not underpinned by a long-term strategy, thus minimizing their cumulative effect. While printed brochures and online information are widely available, innovative information services, tailored to and actively addressing specific target groups, are thin on the ground. This project reviews established national and international approaches, with a focus on German-speaking areas, checking their long-term effectiveness with the help of expert workshops and an empirical analysis of survey data. The Transtheoretical Model (Prochaska, 1977) serves as the analytical framework: we assign specific tools to distinct stages of behavioural change. People's openness to absorb risk information and their willingness to engage in private preparedness depend on an incremental process of considering, appraising, introducing and finally maintaining preventive actions. Based on this stage-specific perspective and the workshop results, intervention gaps are identified to define best-practice examples and recommendations that can be realized within the prevailing legislative and organizational framework at national, regional and local level in Austria.

  6. Simultaneous detection of seventeen drugs of abuse and metabolites in hair using solid phase micro extraction (SPME) with GC/MS.

    PubMed

    Aleksa, Katarina; Walasek, Paula; Fulga, Netta; Kapur, Bhushan; Gareri, Joey; Koren, Gideon

    2012-05-10

    The analysis of pediatric and adult hair is a useful non-invasive biomarker to effectively detect long-term exposure to various xenobiotics, specifically drugs of abuse such as cocaine, opiates and amphetamines. Very often individuals are using, or are exposed to, multiple drugs simultaneously, and it is therefore important to be able to detect them in the same analysis. We have developed a sensitive and specific solid phase micro extraction (SPME) method coupled with gas chromatography mass spectrometry (GC/MS) to detect 17 different analytes in hair using a single extraction method. Five milligrams of hair is extracted overnight, subjected to solid phase extraction (SPE) and then to SPME-GC/MS. The target analytes include amphetamine, methamphetamine, MDA, MDMA, cocaine, benzoylecgonine, norcocaine, cocaethylene, methadone, codeine, morphine, 6-AM, oxycodone, oxymorphone, hydrocodone, hydromorphone and meperidine. The limits of detection (LODs) are 0.2 ng/mg hair for amphetamine, methamphetamine, MDA, MDMA, morphine, codeine, 6-AM, oxycodone, oxymorphone, hydromorphone, hydrocodone and meperidine, and 0.13 ng/mg hair for cocaine, benzoylecgonine, cocaethylene, norcocaine and methadone. This GC/MS method is sensitive and specific enough to detect the presence of these 17 analytes in as little as 5 mg of hair and is especially useful for newborn and child hair analysis, where the amount of hair is often very limited. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  7. Modelling the spatial distribution of five natural hazards in the context of the WHO/EMRO Atlas of Disaster Risk as a step towards the reduction of the health impact related to disasters

    PubMed Central

    El Morjani, Zine El Abidine; Ebener, Steeve; Boos, John; Abdel Ghaffar, Eman; Musani, Altaf

    2007-01-01

    Background Reducing the potential for large scale loss of life, large numbers of casualties, and widespread displacement of populations that can result from natural disasters is a difficult challenge for the individuals, communities and governments that need to respond to such events. While it is extremely difficult, if not impossible, to predict the occurrence of most natural hazards; it is possible to take action before emergency events happen to plan for their occurrence when possible and to mitigate their potential effects. In this context, an Atlas of Disaster Risk is under development for the 21 Member States that constitute the World Health Organization's (WHO) Eastern Mediterranean (EM) Region and the West Bank and Gaza Strip territory. Methods and Results This paper describes the Geographic Information System (GIS) based methods that have been used in order to create the first volume of the Atlas which looks at the spatial distribution of 5 natural hazards (flood, landslide, wind speed, heat and seismic hazard). It also presents the results obtained through the application of these methods on a set of countries part of the EM Region before illustrating how this type of information can be aggregated for decision making. Discussion and Conclusion The methods presented in this paper aim at providing a new set of tools for GIS practitioners to refine their analytical capabilities when examining natural hazards, and at the same time allowing users to create more specific and meaningful local analyses. The maps resulting from the application of these methods provides decision makers with information to strengthen their disaster management capacity. It also represents the basis for the reflection that needs to take place regarding populations' vulnerability towards natural hazards from a health perspective. PMID:17343733

  8. Development of an event-specific hydrolysis probe quantitative real-time polymerase chain reaction assay for Embrapa 5.1 genetically modified common bean (Phaseolus vulgaris).

    PubMed

    Treml, Diana; Venturelli, Gustavo L; Brod, Fábio C A; Faria, Josias C; Arisi, Ana C M

    2014-12-10

    A genetically modified (GM) common bean event, namely Embrapa 5.1, resistant to the bean golden mosaic virus (BGMV), was approved for commercialization in Brazil. Brazilian regulation for genetically modified organism (GMO) labeling requires that any food containing more than 1% GMO be labeled. The event-specific polymerase chain reaction (PCR) method has been the primary trend for GMO identification and quantitation because of its high specificity based on the flanking sequence. This work reports the development of an event-specific assay, named FGM, for Embrapa 5.1 detection and quantitation by use of SYBR Green or hydrolysis probe. The FGM assay specificity was tested for Embrapa 2.3 event (a noncommercial GM common bean also resistant to BGMV), 46 non-GM common bean varieties, and other crop species including maize, GM maize, soybean, and GM soybean. The FGM assay showed high specificity to detect the Embrapa 5.1 event. Standard curves for the FGM assay presented a mean efficiency of 95% and a limit of detection (LOD) of 100 genome copies in the presence of background DNA. The primers and probe developed are suitable for the detection and quantitation of Embrapa 5.1.
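
    Quantitation of a GM event like Embrapa 5.1 ultimately reports the ratio of event-specific to taxon-specific copy numbers from the same extract. A schematic calculation (the copy numbers are invented placeholders):

        # Copy numbers interpolated from the event-specific (FGM) and the
        # taxon-specific standard curves for the same DNA extract (hypothetical)
        event_copies = 1.8e3      # Embrapa 5.1 target
        taxon_copies = 9.0e4      # common bean reference gene

        gm_content = 100.0 * event_copies / taxon_copies
        print(f"GM content ~ {gm_content:.1f} %")  # vs. the 1% labeling threshold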

  9. Analytical and pre-analytical performance characteristics of a novel cartridge-type blood gas analyzer for point-of-care and laboratory testing.

    PubMed

    Oyaert, Matthijs; Van Maerken, Tom; Bridts, Silke; Van Loon, Silvi; Laverge, Heleen; Stove, Veronique

    2018-03-01

    Point-of-care blood gas test results may benefit therapeutic decision making by their immediate impact on patient care. We evaluated the (pre-)analytical performance of a novel cartridge-type blood gas analyzer, the GEM Premier 5000 (Werfen), for the determination of pH, partial carbon dioxide pressure (pCO 2 ), partial oxygen pressure (pO 2 ), sodium (Na + ), potassium (K + ), chloride (Cl - ), ionized calcium ( i Ca 2+ ), glucose, lactate, and total hemoglobin (tHb). Total imprecision was estimated according to the CLSI EP5-A2 protocol. The estimated total error was calculated based on the mean of the range claimed by the manufacturer. Based on the CLSI EP9-A2 evaluation protocol, a method comparison with the Siemens RapidPoint 500 and Abbott i-STAT CG8+ was performed. Obtained data were compared against preset quality specifications. Interference of potential pre-analytical confounders on co-oximetry and electrolyte concentrations was studied. The analytical performance was acceptable for all parameters tested. Method comparison demonstrated good agreement with the RapidPoint 500 and i-STAT CG8+, except for some parameters (RapidPoint 500: pCO 2 , K + , lactate and tHb; i-STAT CG8+: pO 2 , Na + , i Ca 2+ and tHb) for which significant differences between analyzers were recorded. No interference of lipemia or methylene blue on CO-oximetry results was found. In contrast, significant interference of benzalkonium and hemolysis on electrolyte measurements was found, for which the user is notified by an interferent-specific flag. Identification of sample errors from pre-analytical sources such as interferences, together with automatic corrective actions, along with the analytical performance, ease of use and low maintenance time of the instrument, makes the evaluated instrument a suitable blood gas analyzer for both POCT and laboratory use. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
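
    Method comparisons of this kind (CLSI EP9) are commonly summarized with difference statistics. A minimal Bland-Altman-style sketch for paired measurements from two analyzers, with invented potassium data:

        import numpy as np

        # Hypothetical paired potassium results (mmol/L) from two analyzers
        a = np.array([3.6, 4.1, 4.5, 5.0, 5.6, 6.2])
        b = np.array([3.5, 4.2, 4.4, 5.2, 5.5, 6.4])

        diff = a - b
        bias = diff.mean()
        loa = 1.96 * diff.std(ddof=1)      # 95% limits of agreement half-width
        print(f"bias = {bias:+.2f} mmol/L, limits of agreement = "
              f"{bias - loa:.2f} to {bias + loa:.2f} mmol/L")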

  10. Analytical assessment of some characteristic ratios for s-wave superconductors

    NASA Astrophysics Data System (ADS)

    Gonczarek, Ryszard; Krzyzosiak, Mateusz; Gonczarek, Adam; Jacak, Lucjan

    2018-04-01

    We evaluate some thermodynamic quantities and characteristic ratios that describe low- and high-temperature s-wave superconducting systems. Based on a set of fundamental equations derived within the conformal transformation method, a simple model is proposed and studied analytically. After including a one-parameter class of fluctuations in the density of states, the mathematical structure of the s-wave superconducting gap, the free energy difference, and the specific heat difference is found and discussed in an analytic manner. Both the zero-temperature limit T = 0 and the subcritical temperature range T ≲ Tc are discussed using the method of successive approximations. The equation for the ratio R1, relating the zero-temperature energy gap and the critical temperature, is formulated and solved numerically for various values of the model parameter. Other thermodynamic quantities are analyzed, including a characteristic ratio R2, quantifying the dynamics of the specific heat jump at the critical temperature. It is shown that the obtained model results coincide with experimental data for low-Tc superconductors. The prospect of application of the presented model in studies of high-Tc superconductors and other superconducting systems of the new generation is also discussed.
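
    For orientation, the two characteristic ratios discussed are conventionally defined as follows; the numerical values are the standard BCS weak-coupling benchmarks, quoted for reference rather than taken from this paper:

    ```latex
    R_1 = \frac{2\,\Delta(0)}{k_B T_c} \approx 3.53, \qquad
    R_2 = \frac{\Delta C}{C_N(T_c)} \approx 1.43
    ```

    Here Δ(0) is the zero-temperature gap, ΔC the specific-heat jump at Tc, and CN(Tc) the normal-state specific heat at the critical temperature.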

  11. Quantification of endocrine disruptors and pesticides in water by gas chromatography-tandem mass spectrometry. Method validation using weighted linear regression schemes.

    PubMed

    Mansilha, C; Melo, A; Rebelo, H; Ferreira, I M P L V O; Pinho, O; Domingues, V; Pinho, C; Gameiro, P

    2010-10-22

    A multi-residue methodology based on solid-phase extraction followed by gas chromatography-tandem mass spectrometry was developed for trace analysis of 32 compounds in water matrices, including estrogens and several pesticides from different chemical families, some of them with endocrine-disrupting properties. Matrix standard calibration solutions were prepared by adding known amounts of the analytes to a residue-free sample to compensate for the matrix-induced chromatographic response enhancement observed for certain pesticides. Validation was done mainly according to the International Conference on Harmonisation recommendations, as well as some European and American validation guidelines with specifications for pesticide analysis and/or GC-MS methodology. As the assumption of homoscedasticity was not met for the analytical data, a weighted least-squares linear regression procedure was applied as a simple and effective way to counteract the greater influence of the higher concentrations on the fitted regression line, improving accuracy at the lower end of the calibration curve. The method was considered validated for 31 compounds after consistent evaluation of the key analytical parameters: specificity, linearity, limit of detection and quantification, range, precision, accuracy, extraction efficiency, stability and robustness. Copyright © 2010 Elsevier B.V. All rights reserved.
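
    Weighted least-squares calibration down-weights the high-concentration standards; 1/x or 1/x² weights are common choices. A minimal sketch (hypothetical calibration data; the paper evaluated several weighting schemes rather than only 1/x²):

    ```python
    import numpy as np

    # Hypothetical calibration standards (concentration, instrument response).
    x = np.array([0.5, 1.0, 5.0, 10.0, 50.0, 100.0])
    y = np.array([0.052, 0.11, 0.49, 1.02, 4.8, 10.3])

    # np.polyfit squares the supplied weights internally, so passing 1/x
    # yields the common 1/x^2 weighted least-squares scheme, reducing the
    # dominance of the high-concentration standards on the fitted line.
    slope, intercept = np.polyfit(x, y, 1, w=1.0 / x)
    print(f"y = {slope:.4f} x + {intercept:.4f}")
    ```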

  12. A transient laboratory method for determining the hydraulic properties of 'tight' rocks-I. Theory

    USGS Publications Warehouse

    Hsieh, P.A.; Tracy, J.V.; Neuzil, C.E.; Bredehoeft, J.D.; Silliman, Stephen E.

    1981-01-01

    Transient pulse testing has been employed increasingly in the laboratory to measure the hydraulic properties of rock samples with low permeability. Several investigators have proposed a mathematical model in terms of an initial-boundary value problem to describe fluid flow in a transient pulse test. However, the solution of this problem has not been available. In analyzing data from the transient pulse test, previous investigators have either employed analytical solutions that are derived with the use of additional, restrictive assumptions, or have resorted to numerical methods. In Part I of this paper, a general, analytical solution for the transient pulse test is presented. This solution is graphically illustrated by plots of dimensionless variables for several cases of interest. The solution is shown to contain, as limiting cases, the more restrictive analytical solutions that the previous investigators have derived. A method of computing both the permeability and specific storage of the test sample from experimental data will be presented in Part II. © 1981.

  13. Empirical Analysis of the Subjective Impressions and Objective Measures of Domain Scientists’ Visual Analytic Judgments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dasgupta, Aritra; Burrows, Susannah M.; Han, Kyungsik

    2017-05-08

    Scientists often use specific data analysis and presentation methods familiar within their domain. But does high familiarity drive better analytical judgment? This question is especially relevant when familiar methods themselves can have shortcomings: many visualizations used conventionally for scientific data analysis and presentation do not follow established best practices. This necessitates new methods that might be unfamiliar yet prove to be more effective. But there is little empirical understanding of the relationships between scientists’ subjective impressions about familiar and unfamiliar visualizations and objective measures of their visual analytic judgments. To address this gap and to study these factors, we focus on visualizations used for comparison of climate model performance. We report on a comprehensive survey-based user study with 47 climate scientists and present an analysis of: i) relationships among scientists’ familiarity, their perceived levels of comfort, confidence, accuracy, and objective measures of accuracy, and ii) relationships among domain experience, visualization familiarity, and post-study preference.

  14. Locating bomb factories by detecting hydrogen peroxide.

    PubMed

    Romolo, Francesco Saverio; Connell, Samantha; Ferrari, Carlotta; Suarez, Guillaume; Sauvain, Jean-Jacques; Hopf, Nancy B

    2016-11-01

    The analytical capability to detect hydrogen peroxide vapour can play a key role in localizing a site where an H2O2-based Improvised Explosive (IE) is manufactured. In security operations it is very important to obtain information quickly, so an analytical method intended for such use requires portable devices. The authors have developed the first analytical method based on a portable luminometer, specifically designed and validated to locate IE manufacturing sites using quantitative on-site vapour analysis for H2O2. The method was tested both indoors and outdoors. The results demonstrate that the detection of H2O2 vapours could allow police forces to locate the site while terrorists are still preparing an attack. The collected data are also very important for developing new sensors able to give an early alarm when located at a suitable distance from a site where an H2O2-based IE is being prepared. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Analysis of Environmental Contamination resulting from ...

    EPA Pesticide Factsheets

    Catastrophic incidents can generate a large number of samples of analytically diverse types, including forensic, clinical, environmental, food, and others. Environmental samples include water, wastewater, soil, air, urban building and infrastructure materials, and surface residue. Such samples may arise not only from contamination from the incident but also from the multitude of activities surrounding the response to the incident, including decontamination. This document summarizes a range of activities to help build laboratory capability in preparation for analysis following a catastrophic incident, including selection and development of fit-for-purpose analytical methods for chemical, biological, and radiological contaminants. Fit-for-purpose methods are those which have been selected to meet project-specific data quality objectives. For example, methods could be fit for screening contamination in the early phases of investigation of contamination incidents because they are rapid and easily implemented, but those same methods may not be fit for the purpose of remediating the environment to safe levels when a more sensitive method is required. While the exact data quality objectives defining fitness-for-purpose can vary with each incident, a governing principle of the method selection and development process for environmental remediation and recovery is based on achieving high throughput while maintaining high-quality analytical results.

  16. [The development and validation of the methods for the quantitative determination of sibutramine derivatives in dietary supplements].

    PubMed

    Stern, K I; Malkova, T L

    The objective of the present study was the development and validation of methods for the quantitative determination of the demethylated derivatives of sibutramine, desmethyl sibutramine and didesmethyl sibutramine. Gas-liquid chromatography with flame ionization detection was used for the quantitative determination of the above substances in dietary supplements. Conditions for the chromatographic determination of the analytes in the presence of the reference standard, methyl stearate, were proposed, allowing efficient separation to be achieved. The method has the necessary sensitivity, specificity, linearity, accuracy, and precision (on both an intra-day and an inter-day basis), indicating good validation characteristics. The proposed method can be employed in analytical laboratories for the quantitative determination of sibutramine derivatives in biologically active dietary supplements.
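
    Quantitation against a reference (internal) standard such as methyl stearate is typically done through a response factor; a minimal sketch (all peak areas and concentrations are hypothetical):

    ```python
    # Internal-standard quantitation: the analyte response is normalized to
    # the internal standard (IS) so injection-volume and detector drift cancel.

    def response_factor(area_analyte, conc_analyte, area_is, conc_is):
        """RF from a calibration standard of known composition."""
        return (area_analyte / area_is) * (conc_is / conc_analyte)

    def quantify(area_analyte, area_is, conc_is, rf):
        """Analyte concentration in an unknown, given the calibrated RF."""
        return (area_analyte / area_is) * conc_is / rf

    # Calibration standard: 10 ug/mL analyte, 20 ug/mL methyl stearate (IS).
    rf = response_factor(area_analyte=15200, conc_analyte=10.0,
                         area_is=30100, conc_is=20.0)
    # Unknown sample spiked with the same IS concentration.
    print(f"{quantify(area_analyte=9800, area_is=29500, conc_is=20.0, rf=rf):.2f} ug/mL")
    ```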

  17. Evaluation of analytical methodology for hydrocarbons in high pressure air and nitrogen systems.

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Samples of liquid oxygen, high pressure nitrogen, low pressure nitrogen, and missile grade air were studied to determine the hydrocarbon concentrations. Concentration of the samples was achieved by adsorption on a molecular sieve and activated charcoal. The trapped hydrocarbons were then desorbed and transferred to an analytical column in a gas chromatograph. The sensitivity of the method depends on the volume of gas passed through the adsorbent tubes. The value of the method was verified through recoverability and reproducibility studies. The use of this method enables LOX, GN2, and missile grade air systems to be routinely monitored to determine low level increases in specific hydrocarbon concentration that could lead to potentially hazardous conditions.

  18. Interplanetary Program to Optimize Simulated Trajectories (IPOST). Volume 2: Analytic manual

    NASA Technical Reports Server (NTRS)

    Hong, P. E.; Kent, P. D.; Olson, D. W.; Vallado, C. A.

    1992-01-01

    The Interplanetary Program to Optimize Simulated Trajectories (IPOST) is intended to support many analysis phases, from early interplanetary feasibility studies through spacecraft development and operations. The IPOST output provides information for sizing and understanding mission impacts related to propulsion, guidance, communications, sensors/actuators, payload, and other dynamic and geometric environments. IPOST models three-degree-of-freedom trajectory events, such as launch/ascent, orbital coast, propulsive maneuvering (impulsive and finite burn), gravity assist, and atmospheric entry. Trajectory propagation is performed using a choice of Cowell, Encke, Multiconic, Onestep, or Conic methods. The user identifies a desired sequence of trajectory events, and selects which parameters are independent (controls) and dependent (targets), as well as other constraints and the cost function. Targeting and optimization are performed using the Stanford NPSOL algorithm. The IPOST structure allows subproblems within a master optimization problem to aid in the general constrained-parameter optimization solution. An alternate optimization method uses implicit simulation and collocation techniques.
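
    Of the listed propagators, Cowell's method is the most direct: numerical integration of the equations of motion r'' = -mu * r / |r|^3. A minimal sketch of unperturbed two-body Cowell propagation (the RK4 scheme, step size, and orbit are illustrative assumptions, not IPOST's implementation):

    ```python
    import numpy as np

    MU_EARTH = 398600.4418  # km^3/s^2

    def accel(r):
        """Two-body gravitational acceleration (Cowell formulation)."""
        return -MU_EARTH * r / np.linalg.norm(r) ** 3

    def propagate(r0, v0, dt, steps):
        """Fixed-step RK4 integration of the equations of motion."""
        r, v = np.array(r0, float), np.array(v0, float)
        for _ in range(steps):
            k1r, k1v = v, accel(r)
            k2r, k2v = v + 0.5 * dt * k1v, accel(r + 0.5 * dt * k1r)
            k3r, k3v = v + 0.5 * dt * k2v, accel(r + 0.5 * dt * k2r)
            k4r, k4v = v + dt * k3v, accel(r + dt * k3r)
            r += dt / 6 * (k1r + 2 * k2r + 2 * k3r + k4r)
            v += dt / 6 * (k1v + 2 * k2v + 2 * k3v + k4v)
        return r, v

    # Circular low Earth orbit, propagated for roughly one period.
    r, v = propagate([6778.0, 0, 0], [0, 7.6686, 0], dt=10.0, steps=555)
    print(r, v)
    ```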

  19. Fault feature analysis of cracked gear based on LOD and analytical-FE method

    NASA Astrophysics Data System (ADS)

    Wu, Jiateng; Yang, Yu; Yang, Xingkai; Cheng, Junsheng

    2018-01-01

    At present, there are two main approaches to gear fault diagnosis. One is model-based gear dynamic analysis; the other is signal-based gear vibration diagnosis. In this paper, a method for fault feature analysis of gear cracks is presented which combines the advantages of dynamic modeling and signal processing. Firstly, a new time-frequency analysis method called local oscillatory-characteristic decomposition (LOD) is proposed, which has the attractive feature of extracting fault characteristics efficiently and accurately. Secondly, an analytical-finite element (analytical-FE) method, called the assist-stress intensity factor (assist-SIF) gear contact model, is put forward to calculate the time-varying mesh stiffness (TVMS) under different crack states. Based on a dynamic model of the gear system with 6 degrees of freedom, the dynamic simulation response was obtained for different tooth crack depths. For the dynamic model, the corresponding relation between the characteristic parameters and the degree of the tooth crack is established under a specific condition. On the basis of the methods mentioned above, a novel gear tooth root crack diagnosis method which combines the LOD with the analytical-FE approach is proposed. Furthermore, empirical mode decomposition (EMD) and ensemble empirical mode decomposition (EEMD) are contrasted with the LOD using gear crack fault vibration signals. The analysis results indicate that the proposed method is effective and feasible for tooth crack stiffness calculation and gear tooth crack fault diagnosis.

  20. A multiplex PCR assay for the rapid and sensitive detection of methicillin-resistant Staphylococcus aureus and simultaneous discrimination of Staphylococcus aureus from coagulase-negative staphylococci.

    PubMed

    Xu, Benjin; Liu, Ling; Liu, Li; Li, Xinping; Li, Xiaofang; Wang, Xin

    2012-11-01

    Methicillin-resistant Staphylococcus aureus (MRSA) is a global health concern and has been detected in food and food production animals. Conventional testing for detection of MRSA takes 3 to 5 days to yield complete information on the organism and its antibiotic sensitivity pattern, so a rapid method is needed to diagnose and treat MRSA infections. The present study focused on the development of a multiplex PCR assay for the rapid and sensitive detection of MRSA. The assay simultaneously detected four genes, namely, 16S rRNA of the Staphylococcus genus, femA of S. aureus, mecA that encodes methicillin resistance, and one internal control. It was rapid and yielded results within 4 h. The analytical sensitivity and specificity of the multiplex PCR assay were evaluated by comparing it with the conventional method. The analytical sensitivity of the multiplex PCR assay at the DNA level was 10 ng DNA. The analytical specificity was evaluated with 10 reference staphylococci strains and was 100%. The diagnostic evaluation of MRSA was carried out using 360 foodborne staphylococci isolates and showed 99.1% specificity, 96.4% sensitivity, a 97.5% positive predictive value, and a 97.3% negative predictive value compared to the conventional method. The inclusion of an internal control in the multiplex PCR assay is important to exclude false-negative cases. This test can be used as an effective diagnostic and surveillance tool to investigate the spread and emergence of MRSA. © 2012 Institute of Food Technologists®
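
    The reported figures (sensitivity, specificity, PPV, NPV) all derive from a 2x2 comparison against the reference method; a minimal sketch (the counts are hypothetical, not the study's 360-isolate data):

    ```python
    def diagnostic_metrics(tp, fp, fn, tn):
        """Standard 2x2 diagnostic performance measures."""
        return {
            "sensitivity": tp / (tp + fn),   # fraction of true positives detected
            "specificity": tn / (tn + fp),   # fraction of true negatives detected
            "ppv": tp / (tp + fp),           # positive predictive value
            "npv": tn / (tn + fn),           # negative predictive value
        }

    # Hypothetical counts versus the conventional culture-based method.
    print(diagnostic_metrics(tp=160, fp=2, fn=6, tn=192))
    ```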

  1. Application of techniques to identify coal-mine and power-generation effects on surface-water quality, San Juan River basin, New Mexico and Colorado

    USGS Publications Warehouse

    Goetz, C.L.; Abeyta, Cynthia G.; Thomas, E.V.

    1987-01-01

    Numerous analytical techniques were applied to determine water quality changes in the San Juan River basin upstream of Shiprock, New Mexico. Eight techniques were used to analyze hydrologic data such as precipitation, water quality, and streamflow: (1) Piper diagram, (2) time-series plot, (3) frequency distribution, (4) box-and-whisker plot, (5) seasonal Kendall test, (6) Wilcoxon rank-sum test, (7) SEASRS procedure, and (8) analysis of flow-adjusted specific-conductance data and smoothing. Post-1963 changes in dissolved solids concentration, dissolved potassium concentration, specific conductance, suspended sediment concentration, or suspended sediment load in the San Juan River downstream from the surface coal mines were examined to determine whether coal mining was having an effect on the quality of surface water. None of the analytical methods used to analyze the data showed any increase in dissolved solids concentration, dissolved potassium concentration, or specific conductance in the river downstream from the mines; some of the methods showed a decrease in dissolved solids concentration and specific conductance. Chaco River, an ephemeral stream tributary to the San Juan River, undergoes changes in water quality due to effluent from a power generation facility. The discharge in the Chaco River contributes about 1.9% of the average annual discharge at the downstream station, San Juan River at Shiprock, NM. The changes in water quality detected at the Chaco River station were not detected at the downstream Shiprock station. It was not possible, with the available data, to identify any effects of the surface coal mines on water quality that were separable from those of urbanization, agriculture, and other cultural and natural changes. In order to determine the specific causes of changes in water quality, it would be necessary to collect additional data at strategically located stations. (Author's abstract)
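
    Several of the listed techniques are standard nonparametric tests; for example, the Wilcoxon rank-sum test compares constituent concentrations between two periods without assuming normality. A minimal scipy sketch (the data are hypothetical):

    ```python
    from scipy.stats import ranksums

    # Hypothetical dissolved-solids concentrations (mg/L) for two periods.
    pre_1963  = [410, 395, 430, 388, 402, 417, 399, 421]
    post_1963 = [405, 392, 418, 380, 398, 411, 390, 407]

    stat, p = ranksums(pre_1963, post_1963)
    print(f"z = {stat:.2f}, p = {p:.3f}")  # large p: no detected shift in location
    ```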

  2. Absolute nuclear material assay using count distribution (LAMBDA) space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prasad, Mano K.; Snyderman, Neal J.; Rowland, Mark S.

    A method of absolute nuclear material assay of an unknown source comprising counting neutrons from the unknown source and providing an absolute nuclear material assay utilizing a model to optimally compare to the measured count distributions. In one embodiment, the step of providing an absolute nuclear material assay comprises utilizing a random sampling of analytically computed fission chain distributions to generate a continuous time-evolving sequence of event-counts by spreading the fission chain distribution in time.

  3. Absolute nuclear material assay using count distribution (LAMBDA) space

    DOEpatents

    Prasad, Manoj K [Pleasanton, CA; Snyderman, Neal J [Berkeley, CA; Rowland, Mark S [Alamo, CA

    2012-06-05

    A method of absolute nuclear material assay of an unknown source comprising counting neutrons from the unknown source and providing an absolute nuclear material assay utilizing a model to optimally compare to the measured count distributions. In one embodiment, the step of providing an absolute nuclear material assay comprises utilizing a random sampling of analytically computed fission chain distributions to generate a continuous time-evolving sequence of event-counts by spreading the fission chain distribution in time.

  4. Groundwater Evapotranspiration from Diurnal Water Table Fluctuation: a Modified White Based Method Using Drainable and Fillable Porosity

    NASA Astrophysics Data System (ADS)

    Acharya, S.; Mylavarapu, R.; Jawitz, J. W.

    2012-12-01

    In shallow unconfined aquifers, the water table usually shows a distinct diurnal fluctuation pattern corresponding to the twenty-four-hour solar radiation cycle. This diurnal water table fluctuation (DWTF) signal can be used to estimate groundwater evapotranspiration (ETg) by vegetation, an approach known as the White [1932] method. Water table fluctuations in shallow phreatic aquifers are controlled by two distinct storage parameters: drainable porosity (or specific yield) and fillable porosity. Yet it is implicitly assumed in most studies that these two parameters are equal, unless the hysteresis effect is considered. The White-based method available in the literature likewise relies on a single drainable-porosity parameter to estimate ETg. In this study, we present a modification of the White-based method to estimate ETg from the DWTF using separate drainable (λd) and fillable (λf) porosity parameters. Separate analytical expressions based on successive steady-state moisture profiles are used to estimate λd and λf, instead of the commonly employed hydrostatic moisture profile approach. The modified method is then applied to estimate ETg using DWTF data observed in a field in northeast Florida, and the results are compared with ET estimates from the standard Penman-Monteith equation. It is found that the modified method yields significantly better estimates of ETg than the previously available method that used only a single, hydrostatic-moisture-profile-based λd. Furthermore, the modified method is also used to estimate ETg during rainfall events, where it likewise produced significantly better estimates of ETg than the single-λd-parameter method.
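
    For context, the classical White [1932] water balance is ETg = Sy(24r + s), with r the pre-dawn recovery rate of the water table and s the net daily decline. The sketch below contrasts it with a two-porosity variant in the spirit of this modification (the exact way λd and λf enter is an illustrative assumption, not the authors' derivation):

    ```python
    def etg_white(recovery_rate, net_decline, sy):
        """Classical White (1932) with a single storage parameter Sy.
        recovery_rate: pre-dawn water-table rise rate (m/h)
        net_decline:   net water-table drop over 24 h (m)"""
        return sy * (24.0 * recovery_rate + net_decline)

    def etg_two_porosity(recovery_rate, net_decline, lam_f, lam_d):
        """Two-porosity variant: inflow refills pore space at the fillable
        porosity lam_f, while the net decline drains at lam_d (assumption)."""
        return lam_f * 24.0 * recovery_rate + lam_d * net_decline

    # Hypothetical day: 0.8 mm/h pre-dawn recovery, 6 mm net decline.
    print(etg_white(0.0008, 0.006, sy=0.12))            # ~0.0030 m/day
    print(etg_two_porosity(0.0008, 0.006, 0.10, 0.14))  # ~0.0028 m/day
    ```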

  5. Big Data Tools as Applied to ATLAS Event Data

    NASA Astrophysics Data System (ADS)

    Vukotic, I.; Gardner, R. W.; Bryant, L. A.

    2017-10-01

    Big Data technologies have proven to be very useful for storage, processing, and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Logfiles, database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide area data access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and the associated analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data. Such modes would simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search, and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of machine learning environments and tools like Spark, Jupyter, R, SciPy, Caffe, TensorFlow, etc. Machine learning challenges such as the Higgs Boson Machine Learning Challenge and the Tracking challenge, event viewers (VP1, ATLANTIS, ATLASrift), and still-to-be-developed educational and outreach tools would be able to access the data through a simple REST API. In this preliminary investigation we focus on derived xAOD data sets. These are much smaller than the primary xAODs, containing only the containers, variables, and events of interest to a particular analysis. Encouraged by the performance of Elasticsearch for the ADC analytics platform, we developed an algorithm for indexing derived xAOD event data. We have made an appropriate document mapping and have imported a full set of Standard Model W/Z datasets. We compare the disk space efficiency of this approach to that of standard ROOT files and its performance in simple cut-flow data analysis, and present preliminary results on its scaling characteristics with different numbers of clients, query complexity, and size of the data retrieved.
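
    As an illustration of the indexing step described, the sketch below bulk-loads flattened per-event records into an Elasticsearch index with the official Python client and runs a cut-flow-style range query (the index name, fields, and local endpoint are assumptions; the real xAOD document mapping is more involved):

    ```python
    from elasticsearch import Elasticsearch, helpers

    es = Elasticsearch("http://localhost:9200")  # assumed local cluster

    # Hypothetical flattened event records from a derived xAOD dataset.
    events = [
        {"run": 284500, "event": 1001, "n_jets": 3, "met": 42.7, "lead_jet_pt": 118.2},
        {"run": 284500, "event": 1002, "n_jets": 2, "met": 15.1, "lead_jet_pt": 64.9},
    ]

    # Bulk-index the events; each document becomes searchable and filterable.
    helpers.bulk(es, ({"_index": "xaod-events", "_source": ev} for ev in events))

    # Cut-flow-style query: events with missing transverse energy above 40 GeV.
    resp = es.search(index="xaod-events", query={"range": {"met": {"gt": 40}}})
    print(resp["hits"]["total"])
    ```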

  6. Analytic calculation of 1-jettiness in DIS at O(αs)

    DOE PAGES

    Kang, Daekyoung; Lee, Christopher; Stewart, Iain W.

    2014-11-01

    We present an analytic O(αs) calculation of cross sections in deep inelastic scattering (DIS) dependent on an event shape, 1-jettiness, that probes final states with one jet plus initial state radiation. This is the first entirely analytic calculation for a DIS event shape cross section at this order. We present results for the differential and cumulative 1-jettiness cross sections, and express both in terms of structure functions dependent not only on the usual DIS variables x, Q2 but also on the 1-jettiness τ. Combined with previous results for log resummation, predictions are obtained over the entire range of the 1-jettiness distribution.
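
    For reference, the 1-jettiness event shape for DIS is conventionally defined by partitioning each final-state momentum between beam-like and jet-like reference vectors (notation is schematic; the paper's precise conventions may differ):

    ```latex
    \tau_1 = \frac{2}{Q^2} \sum_{i \in X} \min\{\, q_B \cdot p_i,\; q_J \cdot p_i \,\}
    ```

    where the sum runs over final-state particles and qB, qJ are the beam and jet reference momenta.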

  7. Distribution of correlated spiking events in a population-based approach for Integrate-and-Fire networks.

    PubMed

    Zhang, Jiwei; Newhall, Katherine; Zhou, Douglas; Rangan, Aaditya

    2014-04-01

    Randomly connected populations of spiking neurons display a rich variety of dynamics. However, much of the current modeling and theoretical work has focused on two dynamical extremes: on one hand homogeneous dynamics characterized by weak correlations between neurons, and on the other hand total synchrony characterized by large populations firing in unison. In this paper we address the conceptual issue of how to mathematically characterize the partially synchronous "multiple firing events" (MFEs) which manifest in between these two dynamical extremes. We further develop a geometric method for obtaining the distribution of magnitudes of these MFEs by recasting the cascading firing event process as a first-passage time problem, and deriving an analytical approximation of the first passage time density valid for large neuron populations. Thus, we establish a direct link between the voltage distributions of excitatory and inhibitory neurons and the number of neurons firing in an MFE that can be easily integrated into population-based computational methods, thereby bridging the gap between homogeneous firing regimes and total synchrony.

  8. Approach to method development and validation in capillary electrophoresis for enantiomeric purity testing of active basic pharmaceutical ingredients.

    PubMed

    Sokoliess, Torsten; Köller, Gerhard

    2005-06-01

    A chiral capillary electrophoresis system allowing the determination of the enantiomeric purity of an investigational new drug was developed using a generic method development approach for basic analytes. The method was optimized in terms of type and concentration of both cyclodextrin (CD) and electrolyte, buffer pH, temperature, voltage, and rinsing procedure. Optimal chiral separation of the analyte was obtained using an electrolyte with 2.5% carboxymethyl-beta-CD in 25 mM NaH2PO4 (pH 4.0). Interchanging the inlet and outlet vials after each run improved the method's precision. To assure the method's suitability for the control of enantiomeric impurities in pharmaceutical quality control, its specificity, linearity, precision, accuracy, and robustness were validated according to the requirements of the International Conference on Harmonization. The usefulness of our generic method development approach for the validation of robustness was demonstrated.

  9. Development of NIRS method for quality control of drug combination artesunate–azithromycin for the treatment of severe malaria

    PubMed Central

    Boyer, Chantal; Gaudin, Karen; Kauss, Tina; Gaubert, Alexandra; Boudis, Abdelhakim; Verschelden, Justine; Franc, Mickaël; Roussille, Julie; Boucher, Jacques; Olliaro, Piero; White, Nicholas J.; Millet, Pascal; Dubost, Jean-Pierre

    2012-01-01

    Near infrared spectroscopy (NIRS) methods were developed for the determination of the analytical content of an antimalarial-antibiotic (artesunate and azithromycin) co-formulation in hard gelatin capsules (HGC). The NIRS approach consists of pre-processing of the spectra (raw spectra and first derivatives of two spectral zones), a single principal component analysis model to ensure specificity, and then two partial least-squares regression models for the content determination of each active pharmaceutical ingredient. The NIRS methods were developed and validated with no reference method, since the manufacturing process of the HGC is essentially the mixing of excipients with the active pharmaceutical ingredients. The accuracy profiles showed β-expectation tolerance limits within the acceptance limits (±5%). The analytical control approach by reversed-phase HPLC required two different methods, involving two different sample preparations and chromatographic conditions. NIRS offers advantages in terms of lower equipment and procedure costs, time savings, and environmental friendliness. PMID:22579599
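
    The two-stage chemometric pipeline described (a PCA model for identity/specificity, then PLS regression for content) maps directly onto standard tooling; a minimal scikit-learn sketch on synthetic stand-in spectra (all data and model settings are illustrative, not the paper's calibration):

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)

    # Synthetic stand-in for NIR spectra: 40 capsules x 200 wavelengths.
    X = rng.normal(size=(40, 200)).cumsum(axis=1)      # smooth-ish fake spectra
    # Fake API content (% label claim) loosely tied to one spectral region.
    y = 100.0 + 0.05 * X[:, 120] + rng.normal(0, 0.2, 40)

    # Step 1: PCA model of conforming spectra; new samples falling far from
    # the calibration scores would fail the identity/specificity check.
    pca = PCA(n_components=3).fit(X)
    print("explained variance:", pca.explained_variance_ratio_.round(2))

    # Step 2: PLS regression calibrates the spectra against API content.
    pls = PLSRegression(n_components=5).fit(X, y)
    print("predicted content:", pls.predict(X[:3]).ravel().round(1))
    ```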

  10. Simultaneous quantification of fentanyl, sufentanil, cefazolin, doxapram and keto-doxapram in plasma using liquid chromatography - tandem mass spectrometry.

    PubMed

    Flint, Robert B; Bahmany, Soma; van der Nagel, Bart C H; Koch, Birgit C P

    2018-05-16

    A simple and specific UPLC-MS/MS method was developed and validated for the simultaneous quantification of fentanyl, sufentanil, cefazolin, doxapram and its active metabolite keto-doxapram. The internal standard was fentanyl-d5 for all analytes. Chromatographic separation was achieved with a reversed-phase Acquity UPLC HSS T3 column with a run time of only 5.0 minutes per injected sample. Gradient elution was performed with a mobile phase consisting of ammonium acetate and formic acid in Milli-Q ultrapure water or in methanol, with a total flow rate of 0.4 mL/min. A plasma volume of only 50 μL was required to achieve both adequate accuracy and precision. Calibration curves of all five analytes were linear. All analytes were stable for at least 48 hours in the autosampler. The method was validated according to US Food and Drug Administration guidelines. This method allows quantification of fentanyl, sufentanil, cefazolin, doxapram and keto-doxapram, which serves purposes for research as well as therapeutic drug monitoring, if applicable. The strength of this method is the combination of a small sample volume, a short run time, a deuterated internal standard, an easy sample preparation method, and the ability to simultaneously quantify all analytes in one run. This article is protected by copyright. All rights reserved.

  11. Quantitative PCR for Genetic Markers of Human Fecal Pollution

    EPA Science Inventory

    Assessment of health risk and fecal bacteria loads associated with human fecal pollution requires reliable host-specific analytical methods and a rapid quantification approach. We report the development of quantitative PCR assays for quantification of two recently described human-...

  12. EPA Region 6 Laboratory Method Specific Analytical Capabilities with Sample Concentration Range

    EPA Pesticide Factsheets

    EPA Region 6 Environmental Services Branch (ESB) Laboratory is capable of analyzing a wide range of samples with concentrations ranging from low part-per-trillion (ppt) to low percent (%) levels, depending on the sample matrix.

  13. Selection and authentication of botanical materials for the development of analytical methods.

    PubMed

    Applequist, Wendy L; Miller, James S

    2013-05-01

    Herbal products, for example botanical dietary supplements, are widely used. Analytical methods are needed to ensure that botanical ingredients used in commercial products are correctly identified and that research materials are of adequate quality and are sufficiently characterized to enable research to be interpreted and replicated. Adulteration of botanical material in commerce is common for some species. The development of analytical methods for specific botanicals, and accurate reporting of research results, depend critically on correct identification of test materials. Conscious efforts must therefore be made to ensure that the botanical identity of test materials is rigorously confirmed and documented through preservation of vouchers, and that their geographic origin and handling are appropriate. Use of material with an associated herbarium voucher that can be botanically identified is always ideal. Indirect methods of authenticating bulk material in commerce, for example use of organoleptic, anatomical, chemical, or molecular characteristics, are not always acceptable for the chemist's purposes. Familiarity with botanical and pharmacognostic literature is necessary to determine what potential adulterants exist and how they may be distinguished.

  14. Dental and dental hygiene students' diagnostic accuracy in oral radiology: effect of diagnostic strategy and instructional method.

    PubMed

    Baghdady, Mariam T; Carnahan, Heather; Lam, Ernest W N; Woods, Nicole N

    2014-09-01

    There has been much debate surrounding diagnostic strategies and the most appropriate training models for novices in oral radiology. It has been argued that an analytic approach, using a step-by-step analysis of the radiographic features of an abnormality, is ideal. Alternative research suggests that novices can successfully employ non-analytic reasoning. Many of these studies do not take instructional methodology into account. This study evaluated the effectiveness of non-analytic and analytic strategies in radiographic interpretation and explored the relationship between instructional methodology and diagnostic strategy. Second-year dental and dental hygiene students were taught four radiographic abnormalities using basic science instruction or a step-by-step algorithm. The students were tested on diagnostic accuracy and memory immediately after learning and one week later. A total of seventy-three students completed both the immediate and delayed sessions and were included in the analysis. Students were randomly divided into two test conditions: one group provided a diagnostic hypothesis for the image and then identified specific features to support it, while the other group first identified features and then provided a diagnosis. Participants in the diagnosis-first condition (non-analytic reasoning) had higher diagnostic accuracy than those in the features-first condition (analytic reasoning), regardless of their learning condition. No main effect of learning condition or interaction with diagnostic strategy was observed. Educators should be mindful of the potential influence of analytic and non-analytic approaches on the effectiveness of the instructional method.

  15. Extension of rezoned Eulerian-Lagrangian method to astrophysical plasma applications

    NASA Technical Reports Server (NTRS)

    Song, M. T.; Wu, S. T.; Dryer, Murray

    1993-01-01

    The rezoned Eulerian-Lagrangian procedure developed by Brackbill and Pracht (1973), which is limited to simple configurations of the magnetic fields, is modified in order to make it applicable to astrophysical plasma. For this purpose, two specific methods are introduced, which make it possible to determine the initial field topology for which no analytical expressions are available. Numerical examples illustrating these methods are presented.

  16. A High-Throughput UHPLC-MS/MS Method for the Quantification of Five Aged Butyrylcholinesterase Biomarkers from Human Exposure to Organophosphorus Nerve Agents

    PubMed Central

    Graham, Leigh Ann; Johnson, Darryl; Carter, Melissa D.; Stout, Emily G.; Erol, Huseyin A.; Isenberg, Samantha L.; Mathews, Thomas P.; Thomas, Jerry D.; Johnson, Rudolph C.

    2017-01-01

    Organophosphorus nerve agents (OPNAs) are toxic compounds that are classified as prohibited Schedule 1 chemical weapons. In the body, OPNAs bind to butyrylcholinesterase (BChE) to form nerve agent adducts (OPNA-BChE). OPNA-BChE adducts can provide a reliable, long-term protein biomarker for assessing human exposure. A major challenge facing OPNA-BChE detection is hydrolysis (aging), which can continue to occur after a clinical specimen has been collected. During aging, the o-alkyl phosphoester bond hydrolyzes, and the specific identity of the nerve agent is lost. To better identify OPNA exposure events, a high-throughput method for the detection of five aged OPNA-BChE adducts was developed. This is the first diagnostic panel to allow for the simultaneous quantification of any Chemical Weapons Convention Schedule 1 OPNA by measuring the aged adducts methyl phosphonate (MeP-BChE), ethyl phosphonate (EtP-BChE), propyl phosphonate (PrP-BChE), ethyl phosphoryl (ExP-BChE), and phosphoryl (P-BChE), as well as unadducted BChE. The calibration range for all analytes is 2.00-250 ng/mL, which is consistent with similar methodologies used to detect unaged OPNA-BChE adducts. Each analytical run is three minutes, making the time to first unknown results, including the calibration curve and quality controls, less than one hour. Analysis of commercially purchased individual serum samples demonstrated no potential interferences with detection of aged OPNA-BChE adducts, and quantitative measurements of endogenous levels of BChE were similar to those previously reported in other OPNA-BChE adduct assays. PMID:27572107

  17. Mixing of two co-directional Rayleigh surface waves in a nonlinear elastic material.

    PubMed

    Morlock, Merlin B; Kim, Jin-Yeon; Jacobs, Laurence J; Qu, Jianmin

    2015-01-01

    The mixing of two co-directional, initially monochromatic Rayleigh surface waves in an isotropic, homogeneous, and nonlinear elastic solid is investigated using analytical, finite element method, and experimental approaches. The analytical investigations show that while the horizontal velocity component can form a shock wave, the vertical velocity component can form a pulse independent of the specific ratios of the fundamental frequencies and amplitudes that are mixed. This analytical model is then used to simulate the development of the fundamentals, second harmonics, and the sum and difference frequency components over the propagation distance. The analytical model is further extended to include diffraction effects in the parabolic approximation. Finally, the frequency and amplitude ratios of the fundamentals are identified which provide maximum amplitudes of the second harmonics as well as of the sum and difference frequency components, to help guide effective material characterization; this approach should make it possible to measure the acoustic nonlinearity of a solid not only with the second harmonics, but also with the sum and difference frequency components. Results of the analytical investigations are then confirmed using the finite element method and the experimental feasibility of the proposed technique is validated for an aluminum specimen.

  18. Effect of Microscopic Damage Events on Static and Ballistic Impact Strength of Triaxial Braid Composites

    NASA Technical Reports Server (NTRS)

    Littell, Justin D.; Binienda, Wieslaw K.; Arnold, William A.; Roberts, Gary D.; Goldberg, Robert K.

    2010-01-01

    The reliability of impact simulations for aircraft components made with triaxial-braided carbon-fiber composites is currently limited by inadequate material property data and lack of validated material models for analysis. Methods to characterize the material properties used in the analytical models from a systematically obtained set of test data are also lacking. A macroscopic finite element based analytical model to analyze the impact response of these materials has been developed. The stiffness and strength properties utilized in the material model are obtained from a set of quasi-static in-plane tension, compression and shear coupon level tests. Full-field optical strain measurement techniques are applied in the testing, and the results are used to help in characterizing the model. The unit cell of the braided composite is modeled as a series of shell elements, where each element is modeled as a laminated composite. The braided architecture can thus be approximated within the analytical model. The transient dynamic finite element code LS-DYNA is utilized to conduct the finite element simulations, and an internal LS-DYNA constitutive model is utilized in the analysis. Methods to obtain the stiffness and strength properties required by the constitutive model from the available test data are developed. Simulations of quasi-static coupon tests and impact tests of a represented braided composite are conducted. Overall, the developed method shows promise, but improvements that are needed in test and analysis methods for better predictive capability are examined.

  19. Near Real-Time Flood Monitoring and Impact Assessment Systems. Chapter 6; [Case Study: 2011 Flooding in Southeast Asia]

    NASA Technical Reports Server (NTRS)

    Ahamed, Aakash; Bolten, John; Doyle, Colin; Fayne, Jessica

    2016-01-01

    Floods are the costliest natural disaster, causing approximately 6.8 million deaths in the twentieth century alone. Worldwide economic flood damage estimates in 2012 exceed $19 billion USD. Extended duration floods also pose longer term threats to food security, water, sanitation, hygiene, and community livelihoods, particularly in developing countries. Projections by the Intergovernmental Panel on Climate Change (IPCC) suggest that precipitation extremes, rainfall intensity, storm intensity, and variability are increasing due to climate change. Increasing hydrologic uncertainty will likely lead to unprecedented extreme flood events. As such, there is a vital need to enhance and further develop traditional techniques used to rapidly assess flooding and extend analytical methods to estimate impacted population and infrastructure. Measuring flood extent in situ is generally impractical, time consuming, and can be inaccurate. Remotely sensed imagery acquired from space-borne and airborne sensors provides a viable platform for consistent and rapid wall-to-wall monitoring of large flood events through time. Terabytes of freely available satellite imagery are made available online each day by NASA, ESA, and other international space research institutions. Advances in cloud computing and data storage technologies allow researchers to leverage these satellite data and apply analytical methods at scale. Repeat-survey earth observations help provide insight about how natural phenomena change through time, including the progression and recession of floodwaters. In recent years, cloud-penetrating radar remote sensing techniques (e.g., Synthetic Aperture Radar) and high temporal resolution imagery platforms (e.g., MODIS and its 1-day revisit period), along with high performance computing infrastructure, have enabled significant advances in software systems that provide flood warning, assessments, and hazard reduction potential. By incorporating social and economic data, researchers can develop systems that automatically quantify the socioeconomic impacts resulting from flood disaster events.

  20. Establishing pediatric reference intervals for 13 biochemical analytes derived from normal subjects in a pediatric endocrinology clinic in Korea.

    PubMed

    Cho, Sun-Mi; Lee, Sang-Guk; Kim, Ho Seong; Kim, Jeong-Ho

    2014-12-01

    Defining pediatric reference intervals is one of the most difficult tasks for laboratory physicians. The continuously changing physiology of growing children makes their laboratory values moving targets. In addition, ethnic and behavioral differences might also cause variations. The aim of this study was to establish age- and sex-specific partitioned reference intervals for 13 serum biochemical analytes in Korean children. A total of 2474 patients, girls aged 2-14 years and boys aged 2-16 years, who underwent a short stature workup but were diagnosed as normal at the Pediatric Endocrinology Clinic of Severance Hospital (Seoul, Korea) between September 2010 and June 2012 were included in this study. The levels of serum calcium, inorganic phosphorus, blood urea nitrogen, creatinine, uric acid, glucose, total cholesterol, total protein, albumin, alkaline phosphatase, aspartate aminotransferase, alanine aminotransferase, and total bilirubin were measured using a Hitachi 7600 analyzer (Hitachi High-Technologies Corporation, Tokyo, Japan). Reference intervals were partitioned according to sex or age subgroups using the Harris and Boyd method. Most analytes except calcium and albumin required partitioning either by sex or age. Age-specific partitioned reference intervals for alkaline phosphatase, creatinine, and total bilirubin were established for both males and females after being partitioned by sex. Additional age-specific partitioning of aspartate aminotransferase in females and total protein and uric acid in males was also required. Inorganic phosphorus, total cholesterol, alanine aminotransferase, blood urea nitrogen, and glucose were partitioned only by sex. This study provided updated age- and sex-specific pediatric reference intervals for 13 basic serum chemistry analytes from a sufficient number of healthy children by using a modern analytical chemistry platform. Copyright © 2014 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
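
    The Harris and Boyd partitioning criterion is a z-test on subgroup means with a sample-size-dependent cutoff; a minimal sketch (the cutoff formula z* = 3*sqrt(n_avg/120) is the commonly cited rule and is an assumption here, as are the data):

    ```python
    import math

    def harris_boyd_z(mean1, sd1, n1, mean2, sd2, n2):
        """z statistic comparing two subgroup means (Harris & Boyd style)."""
        return abs(mean1 - mean2) / math.sqrt(sd1**2 / n1 + sd2**2 / n2)

    def critical_z(n1, n2):
        """Commonly cited sample-size-dependent cutoff z* = 3*sqrt(n_avg/120)."""
        return 3.0 * math.sqrt(((n1 + n2) / 2.0) / 120.0)

    # Hypothetical alkaline phosphatase results (U/L), boys vs. girls.
    z = harris_boyd_z(mean1=285, sd1=70, n1=180, mean2=250, sd2=65, n2=170)
    zc = critical_z(180, 170)
    print(f"z = {z:.2f}, z* = {zc:.2f}, partition: {z > zc}")
    ```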

  1. Deep Sludge Gas Release Event Analytical Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sams, Terry L.

    2013-08-15

    The purpose of the Deep Sludge Gas Release Event Analytical Evaluation (DSGRE-AE) is to evaluate the postulated hypothesis that a hydrogen GRE may occur in Hanford tanks containing waste sludges at levels greater than previously experienced. There is a need to understand gas retention and release hazards in sludge beds which are 200 to 300 inches deep. These sludge beds are deeper than historical Hanford sludge waste beds, and are created when waste is retrieved from older single-shell tanks (SST) and transferred to newer double-shell tanks (DST). Retrieval of waste from SSTs reduces the risk to the environment from leakage or potential leakage of waste into the ground from these tanks. However, the possibility of an energetic event (flammable gas accident) in the retrieval receiver DST is worse than slow leakage. Lines of inquiry, therefore, are (1) can sludge waste be stored safely in deep beds; (2) can gas release events (GRE) be prevented by periodically degassing the sludge (e.g., mixer pump); or (3) does the retrieval strategy need to be altered to limit sludge bed height by retrieving into additional DSTs? The scope of this effort is to provide expert advice on whether or not to move forward with the generation of deep beds of sludge through retrieval of C-Farm tanks. Evaluation of possible mitigation methods (e.g., using mixer pumps to release gas, retrieving into an additional DST) is being performed by a second team and is not discussed in this report. While available data and engineering judgment indicate that increased gas retention (retained gas fraction) in DST sludge at depths resulting from the completion of SST 241-C Tank Farm retrievals is not expected and, even if gas releases were to occur, they would be small and local, a positive USQ was declared (Occurrence Report EM-RP--WRPS-TANKFARM-2012-0014, "Potential Exists for a Large Spontaneous Gas Release Event in Deep Settled Waste Sludge"). The purpose of this technical report is to (1) present and discuss current understandings of gas retention and release mechanisms for deep sludge in U.S. Department of Energy (DOE) complex waste storage tanks; and (2) identify viable methods/criteria for demonstrating safety relative to deep sludge gas release events (DSGRE) in the near term to support the Hanford C-Farm retrieval mission. A secondary purpose is to identify viable methods/criteria for demonstrating safety relative to DSGREs in the longer term to support the mission to retrieve waste from the Hanford Tank Farms and deliver it to the Waste Treatment and Immobilization Plant (WTP). The potential DSGRE issue resulted in the declaration of a positive Unreviewed Safety Question (USQ). C-Farm retrievals are currently proceeding under a Justification for Continued Operation (JCO) that only allows tanks 241-AN-101 and 241-AN-106 sludge levels of 192 inches and 195 inches, respectively. C-Farm retrievals need deeper sludge levels (approximately 310 inches in 241-AN-101 and approximately 250 inches in 241-AN-106). This effort is to provide analytical data and justification to continue retrievals in a safe and efficient manner.

  2. Sign changes as a universal concept in first-passage-time calculations

    NASA Astrophysics Data System (ADS)

    Braun, Wilhelm; Thul, Rüdiger

    2017-01-01

    First-passage-time problems are ubiquitous across many fields of study, including transport processes in semiconductors and biological synapses, evolutionary game theory and percolation. Despite their prominence, first-passage-time calculations have proven to be particularly challenging. Analytical results to date have often been obtained under strong conditions, leaving most of the exploration of first-passage-time problems to direct numerical computations. Here we present an analytical approach that allows the derivation of first-passage-time distributions for the wide class of nondifferentiable Gaussian processes. We demonstrate that the concept of sign changes naturally generalizes the common practice of counting crossings to determine first-passage events. Our method works across a wide range of time-dependent boundaries and noise strengths, thus alleviating common hurdles in first-passage-time calculations.
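
    As a concrete baseline for the direct numerical computations mentioned, the sketch below estimates first-passage statistics by Monte Carlo, reading off a passage as the first sign change of X(t) - b(t) (the Brownian process, boundary, and discretization are illustrative assumptions, not the authors' analytical method):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Discretized Brownian paths tested against a moving boundary b(t); a
    # first-passage event is the first sign change of X(t) - b(t).
    T, dt, n_paths = 5.0, 0.01, 5000
    steps = int(T / dt)
    t = np.arange(1, steps + 1) * dt
    b = 1.0 + 0.2 * t                       # assumed time-dependent boundary

    paths = np.cumsum(rng.normal(0.0, np.sqrt(dt), (n_paths, steps)), axis=1)
    above = (paths - b) >= 0                # sign indicator of X(t) - b(t)

    crossed = above.any(axis=1)
    first_idx = np.argmax(above, axis=1)    # index of the first sign change
    fpt = t[first_idx[crossed]]
    print(f"P(cross by T={T}) ~ {crossed.mean():.3f}, mean FPT ~ {fpt.mean():.2f}")
    ```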

  3. Sponsor relationships, analyte stability in ligand-binding assays and critical reagent management: a bioanalytical CRO perspective.

    PubMed

    Lefor Bradford, Julia

    2015-01-01

    This perspective article discusses key points to address in the establishment of sound partnerships between sponsors and bioanalytical CROs to assure the timeliness, quality and consistency of bioanalysis throughout biological therapeutic development. The performance of ligand-binding assays can be greatly impacted by low-grade reagents, lot-to-lot variability and lack of stability of the analyte in matrix, impacting both timelines and cost. Thorough characterization of the biologic of interest and its assay-enabling critical reagents will lend itself well to conservation of materials and continuity of assay performance. When unplanned events occur, such as performance declines or premature depletion of material, structured procedures are paramount to supplement the current loosely defined regulatory guidance on critical reagent characterization and method bridging.

  4. Temperature distribution of a simplified rotor due to a uniform heat source

    NASA Astrophysics Data System (ADS)

    Welzenbach, Sarah; Fischer, Tim; Meier, Felix; Werner, Ewald; kyzy, Sonun Ulan; Munz, Oliver

    2018-03-01

    In gas turbines, high combustion efficiency as well as operational safety are required. Thus, labyrinth seal systems with honeycomb liners are commonly used. In the case of rubbing events in the seal system, the components can be damaged due to cyclic thermal and mechanical loads. Temperature differences occurring at labyrinth seal fins during rubbing events can be determined by considering a single heat source acting periodically on the surface of a rotating cylinder. Existing literature analysing the temperature distribution on rotating cylindrical bodies due to a stationary heat source is reviewed. The temperature distribution on the circumference of a simplified labyrinth seal fin is calculated using an available and easy-to-implement analytical approach. A finite element model of the simplified labyrinth seal fin is created and the numerical results are compared to the analytical results. The temperature distributions calculated by the analytical and the numerical approaches coincide for low sliding velocities, while there are discrepancies in the calculated maximum temperatures at higher sliding velocities. The use of the analytical approach allows the conservative estimation of the maximum temperatures arising in labyrinth seal fins during rubbing events. At the same time, high calculation costs can be avoided.

  5. Simple and ultra-fast recognition and quantitation of compounded monoclonal antibodies: Application to flow injection analysis combined to UV spectroscopy and matching method.

    PubMed

    Jaccoulet, E; Schweitzer-Chaput, A; Toussaint, B; Prognon, P; Caudron, E

    2018-09-01

    Compounding of monoclonal antibodies (mAbs) is constantly increasing in hospitals. Quality control (QC) of the compounded mAbs, based on quantification and identification, is required to prevent potential errors, and a fast method is needed to manage outpatient chemotherapy administration. A simple and ultra-fast (less than 30 s) method using flow injection analysis combined with a least-squares matching method from the analyzer software was developed and evaluated for the routine hospital QC of three compounded mAbs: bevacizumab, infliximab and rituximab. The method was evaluated through qualitative and quantitative parameters. Preliminary analysis of the UV absorption and second-derivative spectra of the mAbs allowed us to adapt the analytical conditions to the therapeutic range of each mAb. In terms of quantitative QC, linearity, accuracy and precision were assessed as specified in ICH guidelines. Very satisfactory recovery was achieved, and the RSD (%) of the intermediate precision was less than 1.1%. Qualitative analytical parameters were also evaluated in terms of specificity, sensitivity and global precision through a confusion matrix. The results were concentration- and mAb-dependent, and excellent (100%) specificity and sensitivity were reached within a specific concentration range. Finally, routine application to "real life" samples (n = 209) from different batches of the three mAbs complied with the quality control specifications, i.e., excellent identification (100%) and concentrations within ±15% of the target, inside the calibration range. The successful combination of second-derivative spectroscopy and a least-squares matching method demonstrates the value of FIA for the ultra-fast QC of mAbs after compounding. Copyright © 2018 Elsevier B.V. All rights reserved.

  6. On Establishing Big Data Wave Breakwaters with Analytics (Invited)

    NASA Astrophysics Data System (ADS)

    Riedel, M.

    2013-12-01

    The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs for utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. These combinations are complex, since a wide variety of different data analysis algorithms exist (e.g. specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools, ranging from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher-level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put human judgement into the analysis loop, or with new approaches to databases designed to support new forms of unstructured or semi-structured data, as opposed to the rather traditional structured databases (e.g. relational databases). More recently, data analysis and the underpinning analytics frameworks also have to consider the energy footprints of the underlying resources. To sum up, the aim of this talk is to provide pieces of information to understand big data analytics in the context of science and engineering, using the aforementioned classification as the lighthouse and as the frame of reference for a systematic approach. This talk will provide insights about big data analytics methods in the context of science across various communities, and offers different views of how approaches to correlation and causality provide complementary methods to advance science and engineering today. The RDA Big Data Analytics Group seeks to understand which approaches are not only technically feasible, but also scientifically feasible. The lighthouse goal of the RDA Big Data Analytics Group is a classification of effective combinations of various technologies and scientific applications, in order to provide clear recommendations to the scientific community on which approaches are technically and scientifically feasible.

  7. Utility of NIST Whole-Genome Reference Materials for the Technical Validation of a Multigene Next-Generation Sequencing Test.

    PubMed

    Shum, Bennett O V; Henner, Ilya; Belluoccio, Daniele; Hinchcliffe, Marcus J

    2017-07-01

    The sensitivity and specificity of next-generation sequencing laboratory developed tests (LDTs) are typically determined by an analyte-specific approach. Analyte-specific validations use disease-specific controls to assess an LDT's ability to detect known pathogenic variants. Alternatively, a methods-based approach can be used for LDT technical validations. Methods-focused validations do not use disease-specific controls but use benchmark reference DNA that contains known variants (benign, variants of unknown significance, and pathogenic) to assess variant calling accuracy of a next-generation sequencing workflow. Recently, four whole-genome reference materials (RMs) from the National Institute of Standards and Technology (NIST) were released to standardize methods-based validations of next-generation sequencing panels across laboratories. We provide a practical method for using NIST RMs to validate multigene panels. We analyzed the utility of RMs in validating a novel newborn screening test that targets 70 genes, called NEO1. Despite the NIST RM variant truth set originating from multiple sequencing platforms, replicates, and library types, we discovered a 5.2% false-negative variant detection rate in the RM truth set genes that were assessed in our validation. We developed a strategy using complementary non-RM controls to demonstrate 99.6% sensitivity of the NEO1 test in detecting variants. Our findings have implications for laboratories or proficiency testing organizations using whole-genome NIST RMs for testing. Copyright © 2017 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.
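
    At its core, a methods-based validation of this kind reduces to set comparisons between called variants and the reference-material truth set. The sketch below shows that arithmetic with invented variant keys; a real comparison would first normalize variant representation (e.g. left-alignment) before matching.

```python
# Sketch of variant-level sensitivity against a truth set.
# Variant keys (chrom, pos, ref, alt) are illustrative only.
truth = {("1", 1000, "A", "G"), ("2", 2500, "C", "T"), ("7", 140453136, "A", "T")}
called = {("1", 1000, "A", "G"), ("7", 140453136, "A", "T"), ("9", 5073770, "G", "T")}

tp = truth & called          # true positives
fn = truth - called          # false negatives
fp = called - truth          # false positives

sensitivity = len(tp) / len(truth)
fn_rate = len(fn) / len(truth)
print(f"sensitivity={sensitivity:.3f}, false-negative rate={fn_rate:.3f}")
```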

  8. Performance characteristics of an ion chromatographic method for the quantitation of citrate and phosphate in pharmaceutical solutions.

    PubMed

    Jenke, Dennis; Sadain, Salma; Nunez, Karen; Byrne, Frances

    2007-01-01

    The performance of an ion chromatographic method for measuring citrate and phosphate in pharmaceutical solutions is evaluated. Performance characteristics examined include accuracy, precision, specificity, response linearity, robustness, and the ability to meet system suitability criteria. In general, the method is found to be robust within reasonable deviations from its specified operating conditions. Analytical accuracy is typically 100 +/- 3%, and short-term precision is not more than 1.5% relative standard deviation. The instrument response is linear over a range of 50% to 150% of the standard preparation target concentrations (12 mg/L for phosphate and 20 mg/L for citrate), and the results obtained using a single-point standard versus a calibration curve are essentially equivalent. A small analytical bias is observed and ascribed to the relative purity of the differing salts, used as raw materials in tested finished products and as reference standards in the analytical method. The assay is specific in that no phosphate or citrate peaks are observed in a variety of method-related solutions and matrix blanks (with and without autoclaving). The assay with manual preparation of the eluents is sensitive to the composition of the eluent, in the sense that the eluent must be effectively degassed and protected from CO2 ingress during use. In order for the assay to perform effectively, extensive system equilibration and conditioning are required. However, a properly conditioned and equilibrated system can be used to test a number of samples via chromatographic runs that include many (>50) injections.

  9. Non-axisymmetric local magnetostatic equilibrium

    DOE PAGES

    Candy, Jefferey M.; Belli, Emily A.

    2015-03-24

    In this study, we outline an approach to the problem of local equilibrium in non-axisymmetric configurations that adheres closely to Miller's original method for axisymmetric plasmas. Importantly, this method is novel in that it allows not only specification of 3D shape, but also explicit specification of the shear in the 3D shape. A spectrally-accurate method for solution of the resulting nonlinear partial differential equations is also developed. We verify the correctness of the spectral method, in the axisymmetric limit, through comparisons with an independent numerical solution. Some analytic results for the two-dimensional case are given, and the connection to Boozer coordinates is clarified.

  10. Analytical and experimental study of axisymmetric truncated plug nozzle flow fields

    NASA Technical Reports Server (NTRS)

    Muller, T. J.; Sule, W. P.; Fanning, A. E.; Giel, T. V.; Galanga, F. L.

    1972-01-01

    Experimental and analytical investigation of the flow field and base pressure of internal-external-expansion truncated plug nozzles are discussed. Experimental results for two axisymmetric, conical plug-cylindrical shroud, truncated plug nozzles are presented for both open and closed wake operations. These results include extensive optical and pressure data covering nozzle flow field and base pressure characteristics, diffuser effects, lip shock strength, Mach disc behaviour, and the recompression and reverse flow regions. Transonic experiments for a special planar transonic section are presented. An extension of the analytical method of Hall and Mueller to include the internal shock wave from the shroud exit is presented for closed wake operation. Results of this analysis include effects on the flow field and base pressure of ambient pressure ratio, nozzle geometry, and the ratio of specific heats. Static thrust is presented as a function of ambient pressure ratio and nozzle geometry. A new transonic solution method is also presented.

  11. Focused analyte spray emission apparatus and process for mass spectrometric analysis

    DOEpatents

    Roach, Patrick J [Kennewick, WA; Laskin, Julia [Richland, WA; Laskin, Alexander [Richland, WA

    2012-01-17

    An apparatus and process are disclosed that deliver an analyte deposited on a substrate to a mass spectrometer that provides for trace analysis of complex organic analytes. Analytes are probed using a small droplet of solvent that is formed at the junction between two capillaries. A supply capillary maintains the droplet of solvent on the substrate; a collection capillary collects analyte desorbed from the surface and emits analyte ions as a focused spray to the inlet of a mass spectrometer for analysis. The invention enables efficient separation of desorption and ionization events, providing enhanced control over transport and ionization of the analyte.

  12. Content analysis of 150 years of British periodicals.

    PubMed

    Lansdall-Welfare, Thomas; Sudhahar, Saatviga; Thompson, James; Lewis, Justin; Cristianini, Nello

    2017-01-24

    Previous studies have shown that it is possible to detect macroscopic patterns of cultural change over periods of centuries by analyzing large textual time series, specifically digitized books. This method promises to empower scholars with a quantitative and data-driven tool to study culture and society, but its power has been limited by the use of data from books and simple analytics based essentially on word counts. This study addresses these problems by assembling a vast corpus of regional newspapers from the United Kingdom, incorporating very fine-grained geographical and temporal information that is not available for books. The corpus spans 150 years and is formed by millions of articles, representing 14% of all British regional outlets of the period. Simple content analysis of this corpus allowed us to detect specific events, like wars, epidemics, coronations, or conclaves, with high accuracy, whereas the use of more refined techniques from artificial intelligence enabled us to move beyond counting words by detecting references to named entities. These techniques allowed us to observe both a systematic underrepresentation and a steady increase of women in the news during the 20th century and the change of geographic focus for various concepts. We also estimate the dates when electricity overtook steam and trains overtook horses as a means of transportation, both around the year 1900, along with observing other cultural transitions. We believe that these data-driven approaches can complement the traditional method of close reading in detecting trends of continuity and change in historical corpora.

  13. Content analysis of 150 years of British periodicals

    PubMed Central

    Lansdall-Welfare, Thomas; Sudhahar, Saatviga; Thompson, James; Lewis, Justin; Cristianini, Nello

    2017-01-01

    Previous studies have shown that it is possible to detect macroscopic patterns of cultural change over periods of centuries by analyzing large textual time series, specifically digitized books. This method promises to empower scholars with a quantitative and data-driven tool to study culture and society, but its power has been limited by the use of data from books and simple analytics based essentially on word counts. This study addresses these problems by assembling a vast corpus of regional newspapers from the United Kingdom, incorporating very fine-grained geographical and temporal information that is not available for books. The corpus spans 150 years and is formed by millions of articles, representing 14% of all British regional outlets of the period. Simple content analysis of this corpus allowed us to detect specific events, like wars, epidemics, coronations, or conclaves, with high accuracy, whereas the use of more refined techniques from artificial intelligence enabled us to move beyond counting words by detecting references to named entities. These techniques allowed us to observe both a systematic underrepresentation and a steady increase of women in the news during the 20th century and the change of geographic focus for various concepts. We also estimate the dates when electricity overtook steam and trains overtook horses as a means of transportation, both around the year 1900, along with observing other cultural transitions. We believe that these data-driven approaches can complement the traditional method of close reading in detecting trends of continuity and change in historical corpora. PMID:28069962
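
    The "simple analytics based essentially on word counts" can be reproduced in a few lines: compute normalized term frequencies per year and look for crossover points, such as "electricity" overtaking "steam". The toy corpus below stands in for the millions of newspaper articles used in the study.

```python
# Toy word-count time series over a (year, text) corpus; the real study
# used millions of newspaper articles, not these stand-in strings.
from collections import Counter, defaultdict

corpus = [
    (1890, "the steam engine and the steam press"),
    (1900, "electricity arrives as steam declines"),
    (1910, "electricity lights the town electricity works"),
]

freq = defaultdict(Counter)   # year -> term counts
totals = Counter()            # year -> total tokens
for year, text in corpus:
    tokens = text.lower().split()
    freq[year].update(tokens)
    totals[year] += len(tokens)

for year in sorted(freq):
    for term in ("steam", "electricity"):
        per_million = 1e6 * freq[year][term] / totals[year]
        print(year, term, round(per_million))
```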

  14. Hoop conjecture for colliding black holes

    NASA Astrophysics Data System (ADS)

    Ida, Daisuke; Nakao, Ken-Ichi; Siino, Masaru; Hayward, Sean A.

    1998-12-01

    We study the collision of black holes in the Kastor-Traschen space-time, at present the only such analytic solution. We investigate the dynamics of the event horizon in the case of the collision of two equal black holes, using the ray-tracing method. We confirm that the event horizon has trouser topology and show that its set of past end points (where the horizon is nonsmooth) is a spacelike curve resembling a seam of trousers. We show that this seam has a finite length and argue that twice this length should be taken to define the minimal circumference C of the event horizon. Comparing with the asymptotic mass M, we find the inequality C<4πM posited by the hoop conjecture, with both sides being of similar order, C~4πM. This supports the hoop conjecture as a guide to general gravitational collapse, even in the extreme case of head-on black-hole collisions.

  15. Impact of including or excluding both-armed zero-event studies on using standard meta-analysis methods for rare event outcome: a simulation study

    PubMed Central

    Cheng, Ji; Pullenayegum, Eleanor; Marshall, John K; Thabane, Lehana

    2016-01-01

    Objectives There is no consensus on whether studies with no observed events in the treatment and control arms, the so-called both-armed zero-event studies, should be included in a meta-analysis of randomised controlled trials (RCTs). Current analytic approaches handled them differently depending on the choice of effect measures and authors' discretion. Our objective is to evaluate the impact of including or excluding both-armed zero-event (BA0E) studies in meta-analysis of RCTs with rare outcome events through a simulation study. Method We simulated 2500 data sets for different scenarios varying the parameters of baseline event rate, treatment effect and number of patients in each trial, and between-study variance. We evaluated the performance of commonly used pooling methods in classical meta-analysis—namely, Peto, Mantel-Haenszel with fixed-effects and random-effects models, and inverse variance method with fixed-effects and random-effects models—using bias, root mean square error, length of 95% CI and coverage. Results The overall performance of the approaches of including or excluding BA0E studies in meta-analysis varied according to the magnitude of true treatment effect. Including BA0E studies introduced very little bias, decreased mean square error, narrowed the 95% CI and increased the coverage when no true treatment effect existed. However, when a true treatment effect existed, the estimates from the approach of excluding BA0E studies led to smaller bias than including them. Among all evaluated methods, the Peto method excluding BA0E studies gave the least biased results when a true treatment effect existed. Conclusions We recommend including BA0E studies when treatment effects are unlikely, but excluding them when there is a decisive treatment effect. Providing results of including and excluding BA0E studies to assess the robustness of the pooled estimated effect is a sensible way to communicate the results of a meta-analysis when the treatment effects are unclear. PMID:27531725
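
    For intuition about why inclusion matters, the sketch below pools a risk difference with Mantel-Haenszel weights, once excluding and once including a both-armed zero-event trial; under this effect measure the BA0E study carries positive weight and pulls the pooled estimate toward zero, consistent with the behaviour reported under no true treatment effect. The counts are illustrative, and this is not the authors' simulation code.

```python
# Mantel-Haenszel pooled risk difference, including vs excluding a
# both-armed zero-event (BA0E) study. Counts are invented.
def mh_risk_difference(studies):
    """studies: list of (events_trt, n_trt, events_ctl, n_ctl)."""
    num = den = 0.0
    for a, n1, c, n2 in studies:
        w = n1 * n2 / (n1 + n2)      # MH weight
        rd = a / n1 - c / n2         # per-study risk difference
        num += w * rd
        den += w
    return num / den

trials = [(3, 100, 1, 100), (2, 150, 2, 150), (1, 80, 0, 80)]
ba0e = (0, 200, 0, 200)              # zero events in both arms

print("excluding BA0E:", round(mh_risk_difference(trials), 5))
print("including BA0E:", round(mh_risk_difference(trials + [ba0e]), 5))
```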

  16. A New Method for Assessing How Sensitivity and Specificity of Linkage Studies Affects Estimation

    PubMed Central

    Moore, Cecilia L.; Amin, Janaki; Gidding, Heather F.; Law, Matthew G.

    2014-01-01

    Background While the importance of record linkage is widely recognised, few studies have attempted to quantify how linkage errors may have impacted on their own findings and outcomes. Even where authors of linkage studies have attempted to estimate sensitivity and specificity based on subjects with known status, the effects of false negatives and positives on event rates and estimates of effect are not often described. Methods We present quantification of the effect of sensitivity and specificity of the linkage process on event rates and incidence, as well as the resultant effect on relative risks. Formulae to estimate the true number of events and estimated relative risk adjusted for given linkage sensitivity and specificity are then derived and applied to data from a prisoner mortality study. The implications of false positive and false negative matches are also discussed. Discussion Comparisons of the effect of sensitivity and specificity on incidence and relative risks indicate that it is more important for linkages to be highly specific than sensitive, particularly if true incidence rates are low. We would recommend that, where possible, some quantitative estimates of the sensitivity and specificity of the linkage process be performed, allowing the effect of these quantities on observed results to be assessed. PMID:25068293
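
    One standard misclassification correction conveys how such formulae work: if O linked events are observed in a cohort of size N with linkage sensitivity Se and specificity Sp, then O = Se·T + (1 − Sp)(N − T), which can be solved for the true event count T. The Python below implements this generic correction; it is not necessarily the exact formulae derived in the paper.

```python
# Generic misclassification correction for linkage error:
#   O = Se*T + (1 - Sp)*(N - T)  =>  T = (O - (1 - Sp)*N) / (Se + Sp - 1)
def true_events(observed, n, se, sp):
    return (observed - (1.0 - sp) * n) / (se + sp - 1.0)

def adjusted_relative_risk(o1, n1, o0, n0, se, sp):
    """RR comparing exposed (o1/n1) to unexposed (o0/n0) after correcting
    both observed event counts for linkage sensitivity/specificity."""
    t1 = true_events(o1, n1, se, sp)
    t0 = true_events(o0, n0, se, sp)
    return (t1 / n1) / (t0 / n0)

# Example: with low incidence, even Sp = 0.999 noticeably inflates counts.
print(round(true_events(observed=120, n=10000, se=0.95, sp=0.999), 1))
print(round(adjusted_relative_risk(120, 10000, 60, 10000, 0.95, 0.999), 3))
```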

  17. High-speed digital phonoscopy images analyzed by Nyquist plots

    NASA Astrophysics Data System (ADS)

    Yan, Yuling

    2012-02-01

    Vocal-fold vibration is a key dynamic event in voice production, and the vibratory characteristics of the vocal fold correlate closely with voice quality and health condition. Laryngeal imaging provides a direct means to observe vocal fold vibration; in the past, however, available modalities were either too slow or impractical to resolve the actual vocal fold vibrations. This limitation has now been overcome by high-speed digital imaging (HSDI) (or high-speed digital phonoscopy), which records images of the vibrating vocal folds at a rate of 2000 frames per second or higher, fast enough to resolve a specific, sustained phonatory vocal fold vibration. The subsequent image-based functional analysis of voice is essential to better understanding the mechanism underlying voice production, as well as to assisting the clinical diagnosis of voice disorders. Our primary objective is to develop a comprehensive analytical platform for voice analysis using the HSDI recordings. So far, we have developed various analytical approaches for HSDI-based voice analyses. These include Nyquist plots and associated analyses that are used along with the FFT and spectrogram in the analysis of HSDI data representing normal voice and specific voice pathologies.
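
    The abstract does not define its Nyquist-plot construction. One plausible reading, sketched below on a synthetic glottal-area waveform, is a real-versus-imaginary trace of the analytic signal obtained via the Hilbert transform, computed alongside the FFT and spectrogram the abstract mentions; treat the whole construction as an assumption.

```python
# Synthetic glottal-area waveform at an assumed HSDI rate of 2000 fps,
# analyzed by FFT, spectrogram, and a Nyquist-style analytic-signal trace.
import numpy as np
from scipy.signal import hilbert, spectrogram

fs = 2000.0                                  # frames per second (assumed)
t = np.arange(0, 1.0, 1.0 / fs)
f0 = 120.0                                   # assumed phonation frequency (Hz)
area = np.sin(2 * np.pi * f0 * t) + 0.3 * np.sin(2 * np.pi * 2 * f0 * t)

spectrum = np.abs(np.fft.rfft(area))         # FFT magnitude
freqs = np.fft.rfftfreq(area.size, 1.0 / fs)
print("dominant frequency (Hz):", freqs[np.argmax(spectrum)])

f, seg_t, Sxx = spectrogram(area, fs=fs)     # time-frequency view

analytic = hilbert(area)                     # Nyquist-style trace: plotting
x, y = analytic.real, analytic.imag          # x against y shows cycle-to-cycle
                                             # perturbations of the vibration
```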

  18. DAPNe with micro-capillary separatory chemistry-coupled to MALDI-MS for the analysis of polar and non-polar lipid metabolism in one cell

    NASA Astrophysics Data System (ADS)

    Hamilton, Jason S.; Aguilar, Roberto; Petros, Robby A.; Verbeck, Guido F.

    2017-05-01

    The cellular metabolome is considered to be a representation of cellular phenotype and cellular response to changes to internal or external events. Methods to expand the coverage of the expansive physiochemical properties that makeup the metabolome currently utilize multi-step extractions and chromatographic separations prior to chemical detection, leading to lengthy analysis times. In this study, a single-step procedure for the extraction and separation of a sample using a micro-capillary as a separatory funnel to achieve analyte partitioning within an organic/aqueous immiscible solvent system is described. The separated analytes are then spotted for MALDI-MS imaging and distribution ratios are calculated. Initially, the method is applied to standard mixtures for proof of partitioning. The extraction of an individual cell is non-reproducible; therefore, a broad chemical analysis of metabolites is necessary and will be illustrated with the one-cell analysis of a single Snu-5 gastric cancer cell taken from a cellular suspension. The method presented here shows a broad partitioning dynamic range as a single-step method for lipid analysis demonstrating a decrease in ion suppression often present in MALDI analysis of lipids.

  19. Quantitative PCR for genetic markers of human fecal pollution

    EPA Science Inventory

    Assessment of health risk and fecal bacteria loads associated with human fecal pollution requires reliable host-specific analytical methods and a rapid quantification approach. We report the development of quantitative PCR assays for enumeration of two recently described hum...

  20. Method to compute the stress-energy tensor for a quantum field outside a black hole that forms from collapse

    NASA Astrophysics Data System (ADS)

    Anderson, Paul; Evans, Charles

    2017-01-01

    A method to compute the stress-energy tensor for a quantized massless minimally coupled scalar field outside the event horizon of a 4-D black hole that forms from the collapse of a spherically symmetric null shell is given. The method is illustrated in the corresponding 2-D case which is mathematically similar but is simple enough that the calculations can be done analytically. The approach to the Unruh state at late times is discussed. National Science Foundation Grant No. PHY-1505875 to Wake Forest University and National Science Foundation Grant No. PHY-1506182 to the University of North Carolina, Chapel Hill

  1. Simultaneous gas chromatographic determination of chlorpyrifos and its impurity sulfotep in liquid pesticide formulations.

    PubMed

    Płonka, Marlena; Walorczyk, Stanisław; Miszczyk, Marek; Kronenbach-Dylong, Dorota

    2016-11-01

    An analytical method for the simultaneous determination of the active substance (chlorpyrifos) and its relevant impurity (sulfotep) in commercial pesticide formulations has been developed and validated. The proposed method entails extraction of the analytes from samples by sonication with acetone and analysis by gas chromatography-flame ionization detection (GC-FID). The proposed method was characterized by satisfactory accuracy and precision. The repeatability expressed as relative standard deviation (RSD) was lower than the acceptable values calculated from the modified Horwitz equation, whereas individual recoveries were in the range of 98-102% and 80-120% for chlorpyrifos and sulfotep, respectively. The limit of quantification (LOQ) for the impurity (sulfotep) was 0.003 mg/mL, corresponding to the maximum level permitted by the Food and Agriculture Organization of the United Nations (FAO) specifications for the active substance (chlorpyrifos), namely 3 g/kg of the chlorpyrifos content found. The main advantage of the proposed method is a considerable reduction in analysis time, since both analytes are determined from a single injection into the GC-FID. Analysis of real samples of commercial pesticide formulations confirmed the fitness-for-purpose of the proposed method.
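
    The single-injection quantitation is plain external-standard arithmetic, shown below with invented peak areas and standard concentrations, checked against the 3 g/kg FAO limit quoted above.

```python
# External-standard quantitation from one GC-FID injection.
# Peak areas and standard concentrations are invented for illustration.
def concentration(area_sample, area_standard, conc_standard):
    return conc_standard * area_sample / area_standard

chlorpyrifos = concentration(area_sample=15200, area_standard=15000,
                             conc_standard=400.0)     # mg/mL, assumed scale
sulfotep = concentration(area_sample=45, area_standard=50,
                         conc_standard=0.01)

# FAO limit for sulfotep: 3 g/kg relative to the chlorpyrifos content found.
ratio_g_per_kg = 1000.0 * sulfotep / chlorpyrifos
print(f"sulfotep = {ratio_g_per_kg:.3f} g/kg of chlorpyrifos",
      "(PASS)" if ratio_g_per_kg <= 3.0 else "(FAIL)")
```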

  2. Communication: Exact analytical derivatives for the domain-based local pair natural orbital MP2 method (DLPNO-MP2)

    NASA Astrophysics Data System (ADS)

    Pinski, Peter; Neese, Frank

    2018-01-01

    Electron correlation methods based on pair natural orbitals (PNOs) have gained an increasing degree of interest in recent years, as they permit energy calculations to be performed on systems containing up to many hundred atoms, while maintaining chemical accuracy for reaction energies. We present an approach for taking exact analytical first derivatives of the energy contributions in the simplest method of the family of Domain-based Local Pair Natural Orbital (DLPNO) methods, closed-shell DLPNO-MP2. The Lagrangian function contains constraints to account for the relaxation of PNOs. RI-MP2 reference geometries are reproduced accurately, as exemplified for four systems with a substantial degree of nonbonding interactions. By the example of electric field gradients, we demonstrate that omitting PNO-specific constraints can lead to dramatic errors for orbital-relaxed properties.

  3. Fast analytical scatter estimation using graphics processing units.

    PubMed

    Ingleby, Harry; Lippuner, Jonas; Rickey, Daniel W; Li, Yue; Elbakri, Idris

    2015-01-01

    The aim of this work was to develop a fast patient-specific analytical estimator of first-order Compton and Rayleigh scatter in cone-beam computed tomography, implemented using graphics processing units. The authors developed an analytical estimator for first-order Compton and Rayleigh scatter in a cone-beam computed tomography geometry. The estimator was coded using NVIDIA's CUDA environment for execution on an NVIDIA graphics processing unit. Performance of the analytical estimator was validated by comparison with high-count Monte Carlo simulations for two different numerical phantoms. Monoenergetic analytical simulations were compared with monoenergetic and polyenergetic Monte Carlo simulations. Analytical and Monte Carlo scatter estimates were compared both qualitatively, from visual inspection of images and profiles, and quantitatively, using a scaled root-mean-square difference metric. Reconstruction of simulated cone-beam projection data of an anthropomorphic breast phantom illustrated the potential of this method as a component of a scatter correction algorithm. The monoenergetic analytical and Monte Carlo scatter estimates showed very good agreement. The monoenergetic analytical estimates showed good agreement for Compton single scatter and reasonable agreement for Rayleigh single scatter when compared with polyenergetic Monte Carlo estimates. For a voxelized phantom with dimensions 128 × 128 × 128 voxels and a detector with 256 × 256 pixels, the analytical estimator required 669 seconds for a single projection, using a single NVIDIA 9800 GX2 video card. Accounting for first-order scatter in cone-beam image reconstruction improves the contrast-to-noise ratio of the reconstructed images. The analytical scatter estimator, implemented using graphics processing units, provides rapid and accurate estimates of single scatter and, with further acceleration and a method to account for multiple scatter, may be useful for practical scatter correction schemes.
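
    The scaled root-mean-square difference metric is not defined in the abstract; one common convention, assumed in the sketch below, normalizes the RMS difference by the mean of the reference (Monte Carlo) image.

```python
# One plausible scaled RMSD between scatter estimates; the paper may
# normalize differently, so treat this definition as an assumption.
import numpy as np

def scaled_rmsd(estimate, reference):
    diff = np.asarray(estimate, float) - np.asarray(reference, float)
    return np.sqrt(np.mean(diff ** 2)) / np.mean(reference)

mc = np.random.default_rng(1).gamma(5.0, 1.0, size=(256, 256))  # stand-in
analytic = mc * 1.02 + 0.05                                     # stand-in
print(f"scaled RMSD: {scaled_rmsd(analytic, mc):.4f}")
```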

  4. Beam Energy Scan of Specific Heat Through Temperature Fluctuations in Heavy Ion Collisions

    NASA Astrophysics Data System (ADS)

    Basu, Sumit; Nandi, Basanta K.; Chatterjee, Sandeep; Chatterjee, Rupa; Nayak, Tapan

    2016-01-01

    Temperature fluctuations may have two distinct origins: first, quantum fluctuations, which are initial-state fluctuations, and second, thermodynamical fluctuations. We discuss a method of extracting the thermodynamic temperature from the mean transverse momentum of pions, using controllable parameters such as the centrality of the system and the range of transverse momenta. Event-by-event fluctuations in the global temperature over a large phase space provide the specific heat of the system. We present a beam energy scan of the specific heat extracted from data and from AMPT and HRG model predictions. Experimental results from NA49, STAR, PHENIX, PHOBOS and ALICE are combined to obtain the specific heat as a function of beam energy. These results are compared to calculations from the AMPT event generator, the HRG model and lattice calculations, respectively.
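
    The estimator can be made concrete: assign each event an effective temperature from its mean transverse momentum, then use the fluctuation relation 1/C = (⟨T²⟩ − ⟨T⟩²)/⟨T⟩². In the sketch below the ⟨pT⟩-to-T mapping is a crude linear stand-in, not the parametrization used in the analysis.

```python
# Event-by-event specific heat from temperature fluctuations:
#   1/C = (<T^2> - <T>^2) / <T>^2
# The <pT> -> T mapping below is a crude linear stand-in.
import numpy as np

rng = np.random.default_rng(7)
n_events = 5000
mean_pt = rng.normal(0.55, 0.02, n_events)   # GeV/c, synthetic events

def effective_temperature(mpt):
    # Placeholder mapping: T (GeV) taken proportional to event-mean pT.
    return mpt / 2.0

T = effective_temperature(mean_pt)
var_over_mean_sq = T.var() / T.mean() ** 2
heat_capacity = 1.0 / var_over_mean_sq
print(f"<T> = {T.mean():.3f} GeV, C = {heat_capacity:.0f}")
```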

  5. Evaluation of real-time PCR detection methods for detecting rice products contaminated by rice genetically modified with a CpTI-KDEL-T-nos transgenic construct.

    PubMed

    Nakamura, Kosuke; Akiyama, Hiroshi; Kawano, Noriaki; Kobayashi, Tomoko; Yoshimatsu, Kayo; Mano, Junichi; Kitta, Kazumi; Ohmori, Kiyomi; Noguchi, Akio; Kondo, Kazunari; Teshima, Reiko

    2013-12-01

    Genetically modified (GM) rice (Oryza sativa) lines, such as the insecticidal Kefeng and Kemingdao, have been developed, and their unauthorised presence has been found in processed rice products in many countries. Therefore, qualitative detection methods for GM rice are required for GM food regulation. A transgenic construct for expressing cowpea (Vigna unguiculata) trypsin inhibitor (CpTI) was detected in some imported processed rice products contaminated with Kemingdao. The 3' terminal sequence of the identified transgenic construct for expression of CpTI included an endoplasmic reticulum retention signal coding sequence (KDEL) and the nopaline synthase terminator (T-nos). The sequence was identical to that in a report on Kefeng. A novel construct-specific real-time polymerase chain reaction (PCR) detection method targeting the junction region sequence between CpTI-KDEL and T-nos was developed. The imported processed rice products were evaluated for contamination with GM rice using the developed construct-specific real-time PCR method, and the detection frequency was compared with that of five event-specific detection methods. The construct-specific detection method detected the GM rice at a higher frequency than the event-specific detection methods. Therefore, we propose that the construct-specific detection method is a beneficial tool for screening processed rice products for contamination by GM rice lines, such as Kefeng, for GM food regulation. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. Analytical applications of microbial fuel cells. Part II: Toxicity, microbial activity and quantification, single analyte detection and other uses.

    PubMed

    Abrevaya, Ximena C; Sacco, Natalia J; Bonetto, Maria C; Hilding-Ohlsson, Astrid; Cortón, Eduardo

    2015-01-15

    Microbial fuel cells were rediscovered twenty years ago and are now a very active research area. The reasons behind this new activity are the relatively recent discovery of electrogenic or electroactive bacteria and the vision of two important practical applications: wastewater treatment coupled with clean energy production, and power supply systems for isolated low-power sensor devices. Although some analytical applications of MFCs were proposed earlier (such as biochemical oxygen demand sensing), only lately have a myriad of new uses of this technology been presented by research groups around the world, combining both biological-microbiological and electroanalytical expertise. This is the second part of a review of MFC applications in the area of analytical sciences. In Part I, a general introduction to biologically based analytical methods, including bioassays, biosensors, MFC designs and operating principles, as well as perhaps the main and earliest presented application, the use as a BOD sensor, was reviewed. In Part II, other proposed uses are presented and discussed. Like other microbially based analytical systems, MFCs are satisfactory systems for measuring and integrating complex parameters that are difficult or impossible to measure otherwise, such as water toxicity (where the toxic effect on aquatic organisms needs to be integrated). We explore here the methods proposed to measure toxicity, microbial metabolism, and, of special interest to space exploration, life sensors. Also, some methods with higher specificity, proposed to detect a single analyte, are presented. Different possibilities for increasing selectivity and sensitivity by using molecular biology or other modern techniques are also discussed here. Copyright © 2014 Elsevier B.V. All rights reserved.

  7. Inorganic Arsenic Determination in Food: A Review of Analytical Proposals and Quality Assessment Over the Last Six Years.

    PubMed

    Llorente-Mirandes, Toni; Rubio, Roser; López-Sánchez, José Fermín

    2017-01-01

    Here we review recent developments in analytical proposals for the assessment of inorganic arsenic (iAs) content in food products. Interest in the determination of iAs in products for human consumption such as food commodities, wine, and seaweed, among others, is fueled by the wide recognition of its toxic effects on humans, even at low concentrations. Currently, the need for robust and reliable analytical methods is recognized by various international safety and health agencies, and by organizations in charge of establishing acceptable tolerance levels of iAs in food. This review summarizes the state of the art of analytical methods while highlighting tools for quality assessment of the results, such as the production and evaluation of certified reference materials (CRMs) and the availability of specific proficiency testing (PT) programmes. Because the number of studies dedicated to the subject of this review has increased considerably over recent years, the sources consulted and cited here are limited to those from 2010 to the end of 2015.

  8. Reflecting Solutions of High Order Elliptic Differential Equations in Two Independent Variables Across Analytic Arcs. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Carleton, O.

    1972-01-01

    Consideration is given specifically to sixth order elliptic partial differential equations in two independent real variables x, y such that the coefficients of the highest order terms are real constants. It is assumed that the differential operator has distinct characteristics and that it can be factored as a product of second order operators. By analytically continuing into the complex domain and using the complex characteristic coordinates of the differential equation, it is shown that its solutions, u, may be reflected across analytic arcs on which u satisfies certain analytic boundary conditions. Moreover, a method is given whereby one can determine a region into which the solution is extensible. It is seen that this region of reflection is dependent on the original domain of definition of the solution, the arc, and the coefficients of the highest order terms of the equation, and not on any sufficiently small quantities; i.e., the reflection is global in nature. The method employed may be applied to similar differential equations of order 2n.

  9. Approximate analytical relationships for linear optimal aeroelastic flight control laws

    NASA Astrophysics Data System (ADS)

    Kassem, Ayman Hamdy

    1998-09-01

    This dissertation introduces new methods to uncover functional relationships between design parameters of a contemporary control design technique and the resulting closed-loop properties. Three new methods are developed for generating such relationships through analytical expressions: the Direct Eigen-Based Technique, the Order of Magnitude Technique, and the Cost Function Imbedding Technique. Efforts concentrated on the linear-quadratic state-feedback control-design technique applied to an aeroelastic flight control task. For this specific application, simple and accurate analytical expressions for the closed-loop eigenvalues and zeros in terms of basic parameters such as stability and control derivatives, structural vibration damping and natural frequency, and cost function weights are generated. These expressions explicitly indicate how the weights augment the short period and aeroelastic modes, as well as the closed-loop zeros, and by what physical mechanism. The analytical expressions are used to address topics such as damping, nonminimum phase behavior, stability, and performance with robustness considerations, and design modifications. This type of knowledge is invaluable to the flight control designer and would be more difficult to formulate when obtained from numerical-based sensitivity analysis.
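
    To see the kind of weight-to-eigenvalue relationship the dissertation derives analytically, the sketch below sweeps the state weight of an LQR design on a toy two-state short-period model and prints the resulting closed-loop eigenvalues; the model numbers are invented.

```python
# Closed-loop eigenvalues of a toy short-period model under LQR state
# feedback, swept over the state weight. Model numbers are invented.
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[-0.7, 1.0],
              [-4.0, -1.2]])     # toy short-period dynamics
B = np.array([[0.0], [-5.0]])    # elevator effectiveness
R = np.array([[1.0]])

for q in (0.1, 1.0, 10.0):
    Q = np.diag([q, q])
    P = solve_continuous_are(A, B, Q, R)
    K = np.linalg.solve(R, B.T @ P)          # optimal feedback gain
    eig = np.linalg.eigvals(A - B @ K)
    print(f"q={q:5.1f}  closed-loop eigenvalues: {np.round(eig, 3)}")
```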

  10. Projecting adverse event incidence rates using empirical Bayes methodology.

    PubMed

    Ma, Guoguang Julie; Ganju, Jitendra; Huang, Jing

    2016-08-01

    Although there is considerable interest in adverse events observed in clinical trials, projecting adverse event incidence rates in an extended period can be of interest when the trial duration is limited compared to clinical practice. A naïve method for making projections might involve modeling the observed rates into the future for each adverse event. However, such an approach overlooks the information that can be borrowed across all the adverse event data. We propose a method that weights each projection using a shrinkage factor; the adverse event-specific shrinkage is a probability, based on empirical Bayes methodology, estimated from all the adverse event data, reflecting evidence in support of the null or non-null hypotheses. Also proposed is a technique to estimate the proportion of true nulls, called the common area under the density curves, which is a critical step in arriving at the shrinkage factor. The performance of the method is evaluated by projecting from interim data and then comparing the projected results with observed results. The method is illustrated on two data sets. © The Author(s) 2013.
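
    A generic gamma-Poisson empirical Bayes shrinkage conveys the flavour of borrowing strength across all adverse events before projecting; the paper's actual procedure additionally estimates the proportion of true nulls, which this toy version does not attempt.

```python
# Gamma-Poisson empirical Bayes shrinkage of adverse-event rates; a
# generic illustration, not the paper's exact methodology.
import numpy as np

events = np.array([0, 2, 5, 1, 14, 3])          # AE counts observed in-trial
exposure = np.array([80, 85, 90, 75, 88, 92])   # patient-years per AE
raw_rate = events / exposure

# Method-of-moments gamma prior fitted across all AEs (borrowing strength).
m = raw_rate.mean()
v = max(raw_rate.var() - m / exposure.mean(), 1e-9)
alpha, beta = m**2 / v, m / v

# Posterior-mean rates: each AE's rate is shrunk toward the pooled rate.
shrunk_rate = (alpha + events) / (beta + exposure)
extended_exposure = 3 * exposure                # e.g. project to 3x duration
projected_counts = shrunk_rate * extended_exposure
print(np.round(projected_counts, 1))
```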

  11. A new analytical method for characterizing nonlinear visual processes with stimuli of arbitrary distribution: Theory and applications.

    PubMed

    Hayashi, Ryusuke; Watanabe, Osamu; Yokoyama, Hiroki; Nishida, Shin'ya

    2017-06-01

    Characterization of the functional relationship between sensory inputs and neuronal or observers' perceptual responses is one of the fundamental goals of systems neuroscience and psychophysics. Conventional methods, such as reverse correlation and spike-triggered data analyses are limited in their ability to resolve complex and inherently nonlinear neuronal/perceptual processes because these methods require input stimuli to be Gaussian with a zero mean. Recent studies have shown that analyses based on a generalized linear model (GLM) do not require such specific input characteristics and have advantages over conventional methods. GLM, however, relies on iterative optimization algorithms and its calculation costs become very expensive when estimating the nonlinear parameters of a large-scale system using large volumes of data. In this paper, we introduce a new analytical method for identifying a nonlinear system without relying on iterative calculations and yet also not requiring any specific stimulus distribution. We demonstrate the results of numerical simulations, showing that our noniterative method is as accurate as GLM in estimating nonlinear parameters in many cases and outperforms conventional, spike-triggered data analyses. As an example of the application of our method to actual psychophysical data, we investigated how different spatiotemporal frequency channels interact in assessments of motion direction. The nonlinear interaction estimated by our method was consistent with findings from previous vision studies and supports the validity of our method for nonlinear system identification.

  12. Annual Conference on HAN-Based Liquid Propellants. Volume 1

    DTIC Science & Technology

    1989-05-01

    Fischer. This situation is obviously not ideal and effort is being made to find a suitable method. However we have been assured that there has been... CLASSIFICATION OF HAN-BASED LIQUID PROPELLANT LP101, S. Westlake ... 64. POSSIBLE TEST METHODS TO STUDY THE THERMAL STABILITY OF... specifications for LP. The phase of the program which is now in progress has dealt with (1) reviewing, recommending and developing applicable analytical methods

  13. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ekechukwu, A

    Method validation is the process of evaluating whether an analytical method is acceptable for its intended purpose. For pharmaceutical methods, guidelines from the United States Pharmacopeia (USP), the International Conference on Harmonisation (ICH), and the United States Food and Drug Administration (USFDA) provide a framework for performing such validations. In general, methods for regulatory compliance must include studies on specificity, linearity, accuracy, precision, range, detection limit, quantitation limit, and robustness. Elements of these guidelines are readily adapted to the issue of validation for beryllium sampling and analysis. This document provides a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers and books reviewed is given in the Appendix. Available validation documents and guides are listed therein; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of the validation process at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all referenced documents were published in English.

  14. Characterization of spacecraft humidity condensate

    NASA Technical Reports Server (NTRS)

    Muckle, Susan; Schultz, John R.; Sauer, Richard L.

    1994-01-01

    When construction of Space Station Freedom reaches the Permanent Manned Capability (PMC) stage, the Water Recovery and Management Subsystem will be fully operational such that (distilled) urine, spent hygiene water, and humidity condensate will be reclaimed to provide water of potable quality. The reclamation technologies currently baselined to process these waste waters include adsorption, ion exchange, catalytic oxidation, and disinfection. To ensure that the baseline technologies will be able to effectively remove those compounds presenting a health risk to the crew, the National Research Council has recommended that additional information be gathered on specific contaminants in waste waters representative of those to be encountered on the Space Station. With the application of new analytical methods and the analysis of waste water samples more representative of the Space Station environment, advances in the identification of the specific contaminants continue to be made. Efforts by the Water and Food Analytical Laboratory at JSC were successful in enlarging the database of contaminants in humidity condensate. These efforts have not only included the chemical characterization of condensate generated during ground-based studies, but most significantly the characterization of cabin and Spacelab condensate generated during Shuttle missions. The analytical results presented in this paper will be used to show how the composition of condensate varies amongst enclosed environments and thus the importance of collecting condensate from an environment close to that of the proposed Space Station. Although advances were made in the characterization of space condensate, complete characterization, particularly of the organics, requires further development of analytical methods.

  15. On analyticity of linear waves scattered by a layered medium

    NASA Astrophysics Data System (ADS)

    Nicholls, David P.

    2017-10-01

    The scattering of linear waves by periodic structures is a crucial phenomenon in many branches of applied physics and engineering. In this paper we establish rigorous analytic results necessary for the proper numerical analysis of a class of High-Order Perturbation of Surfaces methods for simulating such waves. More specifically, we prove a theorem on existence and uniqueness of solutions to a system of partial differential equations which model the interaction of linear waves with a multiply layered periodic structure in three dimensions. This result provides hypotheses under which a rigorous numerical analysis could be conducted for recent generalizations to the methods of Operator Expansions, Field Expansions, and Transformed Field Expansions.

  16. Analytical solutions for two-dimensional Stokes flow singularities in a no-slip wedge of arbitrary angle

    PubMed Central

    Brzezicki, Samuel J.

    2017-01-01

    An analytical method to find the flow generated by the basic singularities of Stokes flow in a wedge of arbitrary angle is presented. Specifically, we solve a biharmonic equation for the stream function of the flow generated by a point stresslet singularity and satisfying no-slip boundary conditions on the two walls of the wedge. The method, which is readily adapted to any other singularity type, takes full account of any transcendental singularities arising at the corner of the wedge. The approach is also applicable to problems of plane strain/stress of an elastic solid where the biharmonic equation also governs the Airy stress function. PMID:28690412

  17. Analytical solutions for two-dimensional Stokes flow singularities in a no-slip wedge of arbitrary angle.

    PubMed

    Crowdy, Darren G; Brzezicki, Samuel J

    2017-06-01

    An analytical method to find the flow generated by the basic singularities of Stokes flow in a wedge of arbitrary angle is presented. Specifically, we solve a biharmonic equation for the stream function of the flow generated by a point stresslet singularity and satisfying no-slip boundary conditions on the two walls of the wedge. The method, which is readily adapted to any other singularity type, takes full account of any transcendental singularities arising at the corner of the wedge. The approach is also applicable to problems of plane strain/stress of an elastic solid where the biharmonic equation also governs the Airy stress function.
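
    In stream-function form, the boundary-value problem both records describe can be stated compactly; the wedge half-angle α, the singularity location z₀, and the schematic stresslet condition below are notational assumptions for illustration.

```latex
% Stokes flow in a no-slip wedge: biharmonic stream function with
% no-slip walls and a prescribed point-stresslet singularity at z_0
% (schematic statement; notation assumed for illustration).
\nabla^{4}\psi = 0 \quad \text{in } -\alpha < \theta < \alpha, \qquad
\psi = \frac{\partial\psi}{\partial n} = 0 \quad \text{on } \theta = \pm\alpha,
\qquad
\psi \sim \psi_{\mathrm{stresslet}}(z; z_{0}) \ \text{as } z \to z_{0}.
```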

  18. On the Use of Accelerated Test Methods for Characterization of Advanced Composite Materials

    NASA Technical Reports Server (NTRS)

    Gates, Thomas S.

    2003-01-01

    A rational approach to the problem of accelerated testing for material characterization of advanced polymer matrix composites is discussed. The experimental and analytical methods provided should be viewed as a set of tools useful in the screening of material systems for long-term engineering properties in aerospace applications. Consideration is given to long-term exposure in extreme environments that include elevated temperature, reduced temperature, moisture, oxygen, and mechanical load. Analytical formulations useful for predictive models that are based on the principles of time-based superposition are presented. The need for reproducible mechanisms, indicator properties, and real-time data are outlined as well as the methodologies for determining specific aging mechanisms.
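
    The "principles of time-based superposition" are commonly expressed through a temperature shift factor a_T that maps time at temperature T onto an equivalent time at a reference temperature; the WLF form below is one widely used parametrization, given as background rather than taken from this paper.

```latex
% Time-temperature superposition: master-curve property P and the
% WLF shift factor (one common parametrization, stated as background).
P(t, T) = P\!\left(\frac{t}{a_T},\, T_{\mathrm{ref}}\right), \qquad
\log_{10} a_T = \frac{-C_1\,(T - T_{\mathrm{ref}})}{C_2 + (T - T_{\mathrm{ref}})}.
```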

  19. Original analytic solution of a half-bridge modelled as a statically indeterminate system

    NASA Astrophysics Data System (ADS)

    Oanta, Emil M.; Panait, Cornel; Raicu, Alexandra; Barhalescu, Mihaela

    2016-12-01

    The paper presents an original computer-based analytical model of a half-bridge belonging to a circular settling tank. The primary unknown is computed using the force method, the coefficients of the canonical equation being calculated either by discretizing the bending moment diagram into trapezoids or by using the relations specific to polygons. A second algorithm, based on the method of initial parameters, is also presented. Analyzing the new solution, we came to the conclusion that most of the computer code developed for other models may be reused. The results are useful for evaluating the behavior of the structure and for comparison with the results of finite element models.
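
    For a single redundant unknown, the canonical equation of the force method mentioned above takes the standard form below, with the flexibility coefficients evaluated as Mohr (virtual-work) integrals of the bending-moment diagrams; this is the textbook statement, not the paper's specific model.

```latex
% Force method with one redundant unknown X_1:
\delta_{11}\,X_1 + \delta_{1P} = 0
\;\;\Longrightarrow\;\;
X_1 = -\,\frac{\delta_{1P}}{\delta_{11}}, \qquad
\delta_{11} = \int \frac{m_1^{2}}{EI}\,\mathrm{d}x, \quad
\delta_{1P} = \int \frac{m_1 M_P}{EI}\,\mathrm{d}x .
```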

  20. Big Data and Analytics in Healthcare.

    PubMed

    Tan, S S-L; Gao, G; Koch, S

    2015-01-01

    This editorial is part of the Focus Theme of Methods of Information in Medicine on "Big Data and Analytics in Healthcare". The amount of data being generated in the healthcare industry is growing at a rapid rate. This has generated immense interest in leveraging the availability of healthcare data (and "big data") to improve health outcomes and reduce costs. However, the nature of healthcare data, and especially big data, presents unique challenges in processing and analyzing big data in healthcare. This Focus Theme aims to disseminate some novel approaches to address these challenges. More specifically, approaches ranging from efficient methods of processing large clinical data to predictive models that could generate better predictions from healthcare data are presented.
