Sample records for the query "permits quantitative analysis"

  1. QTest: Quantitative Testing of Theories of Binary Choice.

    PubMed

    Regenwetter, Michel; Davis-Stober, Clintin P; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of "Random Cumulative Prospect Theory." A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences.

  2. QTest: Quantitative Testing of Theories of Binary Choice

    PubMed Central

    Regenwetter, Michel; Davis-Stober, Clintin P.; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of “Random Cumulative Prospect Theory.” A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences. PMID:24999495

  3. Quantitative Verse in a Quantity-Insensitive Language: Baïf's "vers mesurés."

    ERIC Educational Resources Information Center

    Bullock, Barbara E.

    1997-01-01

    Analysis of the quantitative metrical verse of French Renaissance poet Jean-Antoine de Baïf finds that the metrics, often seen as unscannable and using an incomprehensible phonetic orthography, derive largely from a system that is accentual, with the orthography permitting the poet to encode quantitative distinctions that coincide with the meter.…

  4. Quantitative Analysis of Radar Returns from Insects

    NASA Technical Reports Server (NTRS)

    Riley, J. R.

    1979-01-01

    When the number of flying insects is low enough to permit their resolution as individual radar targets, quantitative estimates of their aerial density are developed. Accurate measurements of heading distribution using a rotating polarization radar to enhance the wingbeat frequency method of identification are presented.

  5. Quantitative Analysis of Urine Vapor and Breath by Gas-Liquid Partition Chromatography

    PubMed Central

    Pauling, Linus; Robinson, Arthur B.; Teranishi, Roy; Cary, Paul

    1971-01-01

    When a human being is placed for several days on a completely defined diet, consisting almost entirely of small molecules that are absorbed from the stomach into the blood, intestinal flora disappear because of lack of nutrition. By this technique, the composition of body fluids can be made constant (standard deviation about 10%) after a few days, permitting significant quantitative analyses to be performed. A method of temperature-programmed gas-liquid partition chromatography has been developed for this purpose. It permits the quantitative determination of about 250 substances in a sample of breath, and of about 280 substances in a sample of urine vapor. The technique should be useful in the application of the principles of orthomolecular medicine. PMID:5289873

  6. Quantitative analysis of urine vapor and breath by gas-liquid partition chromatography.

    PubMed

    Pauling, L; Robinson, A B; Teranishi, R; Cary, P

    1971-10-01

    When a human being is placed for several days on a completely defined diet, consisting almost entirely of small molecules that are absorbed from the stomach into the blood, intestinal flora disappear because of lack of nutrition. By this technique, the composition of body fluids can be made constant (standard deviation about 10%) after a few days, permitting significant quantitative analyses to be performed. A method of temperature-programmed gas-liquid partition chromatography has been developed for this purpose. It permits the quantitative determination of about 250 substances in a sample of breath, and of about 280 substances in a sample of urine vapor. The technique should be useful in the application of the principles of orthomolecular medicine.

  7. Targeted Quantitation of Proteins by Mass Spectrometry

    PubMed Central

    Liebler, Daniel C.; Zimmerman, Lisa J.

    2013-01-01

    Quantitative measurement of proteins is one of the most fundamental analytical tasks in a biochemistry laboratory, but widely used immunochemical methods often have limited specificity and high measurement variation. In this review, we discuss applications of multiple-reaction monitoring (MRM) mass spectrometry, which allows sensitive, precise quantitative analyses of peptides and the proteins from which they are derived. Systematic development of MRM assays is permitted by databases of peptide mass spectra and sequences, software tools for analysis design and data analysis, and rapid evolution of tandem mass spectrometer technology. Key advantages of MRM assays are the ability to target specific peptide sequences, including variants and modified forms, and the capacity for multiplexing that allows analysis of dozens to hundreds of peptides. Different quantitative standardization methods provide options that balance precision, sensitivity, and assay cost. Targeted protein quantitation by MRM and related mass spectrometry methods can advance biochemistry by transforming approaches to protein measurement. PMID:23517332

  8. Targeted quantitation of proteins by mass spectrometry.

    PubMed

    Liebler, Daniel C; Zimmerman, Lisa J

    2013-06-04

    Quantitative measurement of proteins is one of the most fundamental analytical tasks in a biochemistry laboratory, but widely used immunochemical methods often have limited specificity and high measurement variation. In this review, we discuss applications of multiple-reaction monitoring (MRM) mass spectrometry, which allows sensitive, precise quantitative analyses of peptides and the proteins from which they are derived. Systematic development of MRM assays is permitted by databases of peptide mass spectra and sequences, software tools for analysis design and data analysis, and rapid evolution of tandem mass spectrometer technology. Key advantages of MRM assays are the ability to target specific peptide sequences, including variants and modified forms, and the capacity for multiplexing that allows analysis of dozens to hundreds of peptides. Different quantitative standardization methods provide options that balance precision, sensitivity, and assay cost. Targeted protein quantitation by MRM and related mass spectrometry methods can advance biochemistry by transforming approaches to protein measurement.

  9. MRMer, an interactive open source and cross-platform system for data extraction and visualization of multiple reaction monitoring experiments.

    PubMed

    Martin, Daniel B; Holzman, Ted; May, Damon; Peterson, Amelia; Eastham, Ashley; Eng, Jimmy; McIntosh, Martin

    2008-11-01

    Multiple reaction monitoring (MRM) mass spectrometry identifies and quantifies specific peptides in a complex mixture with very high sensitivity and speed, and thus has promise for the high-throughput screening of clinical samples for candidate biomarkers. We have developed an interactive software platform, called MRMer, for managing highly complex MRM-MS experiments, including quantitative analyses using heavy/light isotopic peptide pairs. MRMer parses and extracts information from MS files encoded in the platform-independent mzXML data format. It extracts and infers precursor-product ion transition pairings, computes integrated ion intensities, and permits rapid visual curation for analyses exceeding 1000 precursor-product pairs. Results can be easily output for quantitative comparison of consecutive runs. Additionally, MRMer incorporates features that permit the quantitative analysis of experiments including heavy and light isotopic peptide pairs. MRMer is open source and provided under the Apache 2.0 license.
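
    The heavy/light quantitation mentioned in this record reduces to comparing integrated ion intensities for an isotope-pair of transitions. A minimal sketch of that idea (not MRMer code; the traces and trapezoidal integration below are illustrative assumptions):

```python
# Illustrative sketch (not MRMer code): integrate a chromatographic
# intensity trace for a light and a heavy transition and report the
# light/heavy ratio commonly used for relative quantitation.

def integrate_trace(times, intensities):
    """Trapezoidal integration of an intensity-vs-time trace."""
    area = 0.0
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        area += 0.5 * (intensities[i] + intensities[i - 1]) * dt
    return area

def light_heavy_ratio(times, light, heavy):
    """Ratio of integrated areas; heavy is the spiked internal standard."""
    return integrate_trace(times, light) / integrate_trace(times, heavy)

# Toy co-eluting peaks sampled at five time points:
times = [0.0, 1.0, 2.0, 3.0, 4.0]
light = [0.0, 40.0, 100.0, 40.0, 0.0]
heavy = [0.0, 20.0, 50.0, 20.0, 0.0]
print(light_heavy_ratio(times, light, heavy))  # 2.0
```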

  10. 21 CFR 170.18 - Tolerances for related food additives.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... available methods that permit quantitative determination of the amount of each food additive present or... present in or on a food and there are available methods that permit quantitative determination of each...

  11. 21 CFR 570.18 - Tolerances for related food additives.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... available methods that permit quantitative determination of the amount of each food additive present or... present in or on a food and there are available methods that permit quantitative determination of each...

  12. 21 CFR 170.18 - Tolerances for related food additives.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... available methods that permit quantitative determination of the amount of each food additive present or... present in or on a food and there are available methods that permit quantitative determination of each...

  13. 21 CFR 570.18 - Tolerances for related food additives.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... available methods that permit quantitative determination of the amount of each food additive present or... present in or on a food and there are available methods that permit quantitative determination of each...

  14. 21 CFR 170.18 - Tolerances for related food additives.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... available methods that permit quantitative determination of the amount of each food additive present or... present in or on a food and there are available methods that permit quantitative determination of each...

  15. 21 CFR 570.18 - Tolerances for related food additives.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... available methods that permit quantitative determination of the amount of each food additive present or... present in or on a food and there are available methods that permit quantitative determination of each...

  16. 21 CFR 570.18 - Tolerances for related food additives.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... available methods that permit quantitative determination of the amount of each food additive present or... present in or on a food and there are available methods that permit quantitative determination of each...

  17. 21 CFR 570.18 - Tolerances for related food additives.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... available methods that permit quantitative determination of the amount of each food additive present or... present in or on a food and there are available methods that permit quantitative determination of each...

  18. 21 CFR 170.18 - Tolerances for related food additives.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... available methods that permit quantitative determination of the amount of each food additive present or... present in or on a food and there are available methods that permit quantitative determination of each...

  19. 3D Material Response Analysis of PICA Pyrolysis Experiments

    NASA Technical Reports Server (NTRS)

    Oliver, A. Brandon

    2017-01-01

    The PICA decomposition experiments of Bessire and Minton are investigated using 3D material response analysis. The steady thermoelectric equations have been added to the CHAR code to enable analysis of the Joule-heated experiments and the DAKOTA optimization code is used to define the voltage boundary condition that yields the experimentally observed temperature response. This analysis has identified a potential spatial non-uniformity in the PICA sample temperature driven by the cooled copper electrodes and thermal radiation from the surface of the test article (Figure 1). The non-uniformity leads to a variable heating rate throughout the sample volume that has an effect on the quantitative results of the experiment. Averaging the results of integrating a kinetic reaction mechanism with the heating rates seen across the sample volume yields a shift of peak species production to lower temperatures that is more significant for higher heating rates (Figure 2) when compared to integrating the same mechanism at the reported heating rate. The analysis supporting these conclusions will be presented along with a proposed analysis procedure that permits quantitative use of the existing data. Time permitting, a status on the in-development kinetic decomposition mechanism based on this data will be presented as well.

  20. Forces on intraocular lens haptics induced by capsular fibrosis. An experimental study.

    PubMed

    Guthoff, R; Abramo, F; Draeger, J; Chumbley, L C; Lang, G K; Neumann, W

    1990-01-01

    Electronic dynamometry measurements, performed upon intraocular lens (IOL) haptics of prototype one-piece three-loop silicone lenses, accurately defined the relationships between elastic force and haptic displacement. Lens implantations in the capsular bag of dogs (loop span equal to capsular bag diameter, loops undeformed immediately after the operation) were evaluated macrophotographically 5-8 months postoperatively. The highly constant elastic property of silicone rubber permitted quantitative correlation of subsequent in vivo haptic displacement with the resultant force vectors responsible for tissue contraction. The lens optics were well centered in 17 (85%) and slightly off-center in 3 (15%) of 20 implanted eyes. Of the 60 supporting loops, 28 could be visualized sufficiently well to permit reliable haptic measurement. Of these 28, 20 (71%) were clearly displaced, ranging from 0.45 mm away from to 1.4 mm towards the lens's optic center. These extremes represented resultant vector forces of 0.20 and 1.23 mN, respectively. Quantitative vector analysis permits better understanding of IOL-capsular interactions.

  1. A Quantitative Study of Oxygen as a Metabolic Regulator

    NASA Technical Reports Server (NTRS)

    Radhakrishnan, Krishnan; LaManna, Joseph C.; Cabrera, Marco E.

    2000-01-01

    An acute reduction in oxygen delivery to a tissue is associated with metabolic changes aimed at maintaining ATP homeostasis. However, given the complexity of the human bioenergetic system, it is difficult to determine quantitatively how cellular metabolic processes interact to maintain ATP homeostasis during stress (e.g., hypoxia, ischemia, and exercise). In particular, we are interested in determining mechanisms relating cellular oxygen concentration to observed metabolic responses at the cellular, tissue, organ, and whole body levels and in quantifying how changes in tissue oxygen availability affect the pathways of ATP synthesis and the metabolites that control these pathways. In this study, we extend a previously developed mathematical model of human bioenergetics to provide a physicochemical framework that permits quantitative understanding of oxygen as a metabolic regulator. Specifically, the enhancement (sensitivity analysis) permits studying the effects of variations in tissue oxygenation and parameters controlling cellular respiration on glycolysis, lactate production, and pyruvate oxidation. The analysis can distinguish between parameters that must be determined accurately and those that require less precision, based on their effects on model predictions. This capability may prove to be important in optimizing experimental design, thus reducing use of animals.
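
    The kind of local sensitivity analysis this record describes can be illustrated with a small finite-difference computation. The toy model, parameter names, and step size below are invented for illustration and are not taken from the record's bioenergetics model:

```python
# Normalized sensitivity coefficient S = (dY/dp) * (p / Y), estimated by
# a central finite difference. Parameters with |S| near zero need less
# experimental precision than those with large |S|.

def sensitivity(model, params, name, rel_step=1e-6):
    """Estimate the normalized sensitivity of model output to params[name]."""
    p = params[name]
    h = p * rel_step
    hi = dict(params, **{name: p + h})
    lo = dict(params, **{name: p - h})
    dy_dp = (model(hi) - model(lo)) / (2.0 * h)
    return dy_dp * p / model(params)

# Toy model: output proportional to a^2 * b, so analytically S_a = 2, S_b = 1.
toy = lambda q: q["a"] ** 2 * q["b"]
print(round(sensitivity(toy, {"a": 3.0, "b": 5.0}, "a"), 3))  # 2.0
print(round(sensitivity(toy, {"a": 3.0, "b": 5.0}, "b"), 3))  # 1.0
```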

  2. Targeting Neuronal-like Metabolism of Metastatic Tumor Cells as a Novel Therapy for Breast Cancer Brain Metastasis

    DTIC Science & Technology

    2017-03-01

    Contribution to Project: Ian primarily focuses on developing a tissue imaging pipeline and performing imaging data analysis. Funding Support: Partially...3D ReconsTruction), a multi-faceted image analysis pipeline, permitting quantitative interrogation of functional implications of heterogeneous... analysis pipeline, to observe and quantify phenotypic metastatic landscape heterogeneity in situ with spatial and molecular resolution. Our implementation

  3. Removal of uranium from soil sample digests for ICP-OES analysis of trace metals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foust, R.D. Jr.; Bidabad, M.

    1996-10-01

    An analytical procedure has been developed to quantitatively remove uranium from soil sample digests, permitting ICP-OES analysis of trace metals. The procedure involves digesting a soil sample with standard procedures (EPA SW-846, Method 3050) and passing the sample digestate through a commercially available resin (U/TEVA·Spec, Eichrom Industries, Inc.) containing diamyl amylphosphonate as the stationary phase. Quantitative removal of uranium was achieved with soil samples containing up to 60% uranium, and percent recoveries averaged better than 85% for 9 of the 10 metals evaluated (Ag, As, Cd, Cr, Cu, Ni, Pb, Se, and Tl). The U/TEVA·Spec column was regenerated by washing with 200 mL of a 0.01 M oxalic acid/0.02 M nitric acid solution, permitting re-use of the column. GFAAS analysis of a sample spiked with 56.5% uranium, after treatment of the digestate with a U/TEVA·Spec resin column, resulted in percent recoveries of 97% or better for all target metals.

  4. Microscopes and computers combined for analysis of chromosomes

    NASA Technical Reports Server (NTRS)

    Butler, J. W.; Butler, M. K.; Stroud, A. N.

    1969-01-01

    Scanning machine CHLOE, developed for photographic use, is combined with a digital computer to obtain quantitative and statistically significant data on chromosome shapes, distribution, density, and pairing. CHLOE permits data acquisition about a chromosome complement to be obtained two times faster than by manual pairing.

  5. 19 CFR 132.1 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... indicated: (a) Absolute (or quantitative) quotas. “Absolute (or quantitative) quotas” are those which permit... for consumption of merchandise subject to quota are permitted. Some absolute quotas limit the entry or... absolute or a tariff-rate quota. (f) Quota priority. “Quota priority” is the precedence granted to one...

  6. 19 CFR 132.1 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... indicated: (a) Absolute (or quantitative) quotas. “Absolute (or quantitative) quotas” are those which permit... for consumption of merchandise subject to quota are permitted. Some absolute quotas limit the entry or... absolute or a tariff-rate quota. (f) Quota priority. “Quota priority” is the precedence granted to one...

  7. Chlorotrimethylsilane, a reagent for the direct quantitative analysis of fats and oils present in vegetable and meat samples.

    PubMed

    Eras, Jordi; Ferran, Javier; Perpiña, Belén; Canela, Ramon

    2004-08-20

    Acylglycerides present in oil seeds and meat can be transformed into volatile fatty esters using chlorotrimethylsilane (CTMS) and 1-pentanol as reagents. The volatile esters can then be analysed by GC. The method is quantitative and involves only minor sample manipulation. It often permits major recoveries of the total saponifiable lipids present in solid samples. A 40 min reaction time is enough to ensure the total conversion of saponifiable lipids to the corresponding FAPEs.

  8. A color prediction model for imagery analysis

    NASA Technical Reports Server (NTRS)

    Skaley, J. E.; Fisher, J. R.; Hardy, E. E.

    1977-01-01

    A simple model has been devised to selectively construct several points within a scene using multispectral imagery. The model correlates black-and-white density values to color components of diazo film so as to maximize the color contrast of two or three points per composite. The CIE (Commission Internationale de l'Eclairage) color coordinate system is used as a quantitative reference to locate these points in color space. Superimposed on this quantitative reference is a perceptional framework which functionally contrasts color values in a psychophysical sense. This methodology permits a more quantitative approach to the manual interpretation of multispectral imagery while resulting in improved accuracy and lower costs.

  9. Development of quantitative exposure data for a pooled exposure-response analysis of 10 silica cohorts.

    PubMed

    Mannetje, Andrea 't; Steenland, Kyle; Checkoway, Harvey; Koskela, Riitta-Sisko; Koponen, Matti; Attfield, Michael; Chen, Jingqiong; Hnizdo, Eva; DeKlerk, Nicholas; Dosemeci, Mustafa

    2002-08-01

    Comprehensive quantitative silica exposure estimates over time, measured in the same units across a number of cohorts, would make possible a pooled exposure-response analysis for lung cancer. Such an analysis would help clarify the continuing controversy regarding whether silica causes lung cancer. Existing quantitative exposure data for 10 silica-exposed cohorts were retrieved from the original investigators. Occupation- and time-specific exposure estimates were either adopted/adapted or developed for each cohort, and converted to milligram per cubic meter (mg/m(3)) respirable crystalline silica. Quantitative exposure assignments were typically based on a large number (thousands) of raw measurements, or otherwise consisted of exposure estimates by experts (for two cohorts). Median exposure level of the cohorts ranged between 0.04 and 0.59 mg/m(3) respirable crystalline silica. Exposure estimates were partially validated via their successful prediction of silicosis in these cohorts. Existing data were successfully adopted or modified to create comparable quantitative exposure estimates over time for 10 silica-exposed cohorts, permitting a pooled exposure-response analysis. The difficulties encountered in deriving common exposure estimates across cohorts are discussed. Copyright 2002 Wiley-Liss, Inc.

  10. 40 CFR 122.21 - Application for a permit (applicable to State programs, see § 123.25).

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... under the Marine Protection Research and Sanctuaries Act. (viii) Dredge or fill permits under section... “quantitative data” for a pollutant are required, the applicant must collect a sample of effluent and analyze it... and report that quantitative data as applying to the substantially identical outfall. The requirements...

  11. 40 CFR 122.21 - Application for a permit (applicable to State programs, see § 123.25).

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... under the Clean Air Act. (vii) Ocean dumping permits under the Marine Protection Research and... is to be provided as specified in § 122.26). When “quantitative data” for a pollutant are required... Director may allow the applicant to test only one outfall and report that quantitative data as applying to...

  12. 40 CFR 122.21 - Application for a permit (applicable to State programs, see § 123.25).

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... under the Clean Air Act. (vii) Ocean dumping permits under the Marine Protection Research and... is to be provided as specified in § 122.26). When “quantitative data” for a pollutant are required... Director may allow the applicant to test only one outfall and report that quantitative data as applying to...

  13. 40 CFR 122.21 - Application for a permit (applicable to State programs, see § 123.25).

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... under the Clean Air Act. (vii) Ocean dumping permits under the Marine Protection Research and... is to be provided as specified in § 122.26). When “quantitative data” for a pollutant are required... Director may allow the applicant to test only one outfall and report that quantitative data as applying to...

  14. 40 CFR 122.21 - Application for a permit (applicable to State programs, see § 123.25).

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... under the Clean Air Act. (vii) Ocean dumping permits under the Marine Protection Research and... is to be provided as specified in § 122.26). When “quantitative data” for a pollutant are required... Director may allow the applicant to test only one outfall and report that quantitative data as applying to...

  15. A method for quantitative analysis of standard and high-throughput qPCR expression data based on input sample quantity.

    PubMed

    Adamski, Mateusz G; Gumann, Patryk; Baird, Alison E

    2014-01-01

    Over the past decade rapid advances have occurred in the understanding of RNA expression and its regulation. Quantitative polymerase chain reaction (qPCR) has become the gold standard for quantifying gene expression. Microfluidic, next-generation high-throughput qPCR now permits the detection of transcript copy number in thousands of reactions simultaneously, dramatically increasing sensitivity over standard qPCR. Here we present a gene expression analysis method applicable to both standard and high-throughput qPCR. This technique is adjusted to the input sample quantity (e.g., the number of cells) and is independent of control gene expression. It is efficiency-corrected and, with the use of a universal reference sample (commercial complementary DNA (cDNA)), permits the normalization of results between different batches and between different instruments, regardless of potential differences in transcript amplification efficiency. Modifications of the input quantity method include (1) the achievement of absolute quantification and (2) a non-efficiency-corrected analysis. When compared to other commonly used algorithms, the input quantity method proved to be valid. This method is of particular value for clinical studies of whole blood and circulating leukocytes, where cell counts are readily available.
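
    One plausible reading of such an efficiency-corrected, input-quantity-adjusted measure is sketched below. The formula, symbol names, and example numbers are assumptions made for illustration, not the authors' published code:

```python
# Hedged sketch: relative expression computed against a universal
# reference sample and divided by the input quantity.
# efficiency: amplification efficiency (1.0 = perfect doubling per cycle).
# cq_ref: quantification cycle of the universal reference cDNA, run in the
# same batch; cq_sample: quantification cycle of the sample.
# n_input: input quantity, e.g. the number of cells in the sample.

def expression_per_input(cq_sample, cq_ref, efficiency, n_input):
    """Efficiency-corrected expression relative to the reference, per input unit."""
    relative_to_ref = (1.0 + efficiency) ** (cq_ref - cq_sample)
    return relative_to_ref / n_input

# A sample crossing threshold 2 cycles before the reference, at perfect
# efficiency, holds ~4x the reference's transcript amount; per 1000 cells:
print(expression_per_input(cq_sample=23.0, cq_ref=25.0,
                           efficiency=1.0, n_input=1000))  # 0.004
```

Because every batch includes the same commercial reference cDNA, values computed this way are comparable across instruments and runs even when raw Cq values drift.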

  16. Multiplex, quantitative cellular analysis in large tissue volumes with clearing-enhanced 3D microscopy (Ce3D)

    PubMed Central

    Li, Weizhe; Germain, Ronald N.

    2017-01-01

    Organ homeostasis, cellular differentiation, signal relay, and in situ function all depend on the spatial organization of cells in complex tissues. For this reason, comprehensive, high-resolution mapping of cell positioning, phenotypic identity, and functional state in the context of macroscale tissue structure is critical to a deeper understanding of diverse biological processes. Here we report an easy-to-use method, clearing-enhanced 3D (Ce3D), which generates excellent tissue transparency for most organs, preserves cellular morphology and protein fluorescence, and is robustly compatible with antibody-based immunolabeling. This enhanced signal quality and capacity for extensive probe multiplexing permits quantitative analysis of distinct, highly intermixed cell populations in intact Ce3D-treated tissues via 3D histo-cytometry. We use this technology to demonstrate large-volume, high-resolution microscopy of diverse cell types in lymphoid and nonlymphoid organs, as well as to perform quantitative analysis of the composition and tissue distribution of multiple cell populations in lymphoid tissues. Combined with histo-cytometry, Ce3D provides a comprehensive strategy for volumetric quantitative imaging and analysis that bridges the gap between conventional section imaging and disassociation-based techniques. PMID:28808033

  17. Quantitative imaging of aggregated emulsions.

    PubMed

    Penfold, Robert; Watson, Andrew D; Mackie, Alan R; Hibberd, David J

    2006-02-28

    Noise reduction, restoration, and segmentation methods are developed for the quantitative structural analysis in three dimensions of aggregated oil-in-water emulsion systems imaged by fluorescence confocal laser scanning microscopy. Mindful of typical industrial formulations, the methods are demonstrated for concentrated (30% volume fraction) and polydisperse emulsions. Following a regularized deconvolution step using an analytic optical transfer function and appropriate binary thresholding, novel application of the Euclidean distance map provides effective discrimination of closely clustered emulsion droplets with size variation over at least 1 order of magnitude. The a priori assumption of spherical nonintersecting objects provides crucial information to combat the ill-posed inverse problem presented by locating individual particles. Position coordinates and size estimates are recovered with sufficient precision to permit quantitative study of static geometrical features. In particular, aggregate morphology is characterized by a novel void distribution measure based on the generalized Apollonius problem. This is also compared with conventional Voronoi/Delaunay analysis.
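
    The Euclidean distance map at the heart of the droplet-discrimination step can be illustrated on a toy binary image. This brute-force version is purely illustrative; the record's pipeline additionally involves deconvolution, thresholding, and peak detection to separate touching droplets:

```python
import math

# For each foreground (1) pixel, compute the Euclidean distance to the
# nearest background (0) pixel. Maxima of this map sit at the centres of
# roughly spherical objects, which is what lets touching droplets be told apart.

def distance_map(grid):
    """Brute-force Euclidean distance map of a small binary image."""
    h, w = len(grid), len(grid[0])
    background = [(r, c) for r in range(h) for c in range(w) if grid[r][c] == 0]
    dist = [[0.0] * w for _ in range(h)]
    for r in range(h):
        for c in range(w):
            if grid[r][c] == 1:
                dist[r][c] = min(math.hypot(r - br, c - bc) for br, bc in background)
    return dist

# A single 3x3 blob inside a 5x5 frame: the centre pixel is deepest.
grid = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]
d = distance_map(grid)
print(d[2][2])  # 2.0 -- centre of the blob is farthest from background
```

Production code would use an O(n) distance transform (e.g. the one in scipy.ndimage) rather than this quadratic scan, but the geometry is the same.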

  18. Illuminator, a desktop program for mutation detection using short-read clonal sequencing.

    PubMed

    Carr, Ian M; Morgan, Joanne E; Diggle, Christine P; Sheridan, Eamonn; Markham, Alexander F; Logan, Clare V; Inglehearn, Chris F; Taylor, Graham R; Bonthron, David T

    2011-10-01

    Current methods for sequencing clonal populations of DNA molecules yield several gigabases of data per day, typically comprising reads of < 100 nt. Such datasets permit widespread genome resequencing and transcriptome analysis or other quantitative tasks. However, this huge capacity can also be harnessed for the resequencing of smaller (gene-sized) target regions, through the simultaneous parallel analysis of multiple subjects, using sample "tagging" or "indexing". These methods promise to have a huge impact on diagnostic mutation analysis and candidate gene testing. Here we describe a software package developed for such studies, offering the ability to resolve pooled samples carrying barcode tags and to align reads to a reference sequence using a mutation-tolerant process. The program, Illuminator, can identify rare sequence variants, including insertions and deletions, and permits interactive data analysis on standard desktop computers. It facilitates the effective analysis of targeted clonal sequencer data without dedicated computational infrastructure or specialized training. Copyright © 2011 Elsevier Inc. All rights reserved.
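
    The "tagging"/"indexing" resolution step this record describes can be sketched as a simple barcode demultiplexer. The barcode sequences, read layout, and names below are invented for illustration; Illuminator's actual mutation-tolerant alignment is not reproduced here:

```python
# Hypothetical demultiplexing sketch: each read begins with a short
# barcode identifying the pooled subject; reads are binned by tag
# before any alignment or variant calling.

BARCODES = {"ACGT": "subject_1", "TTAG": "subject_2"}  # invented tags
TAG_LEN = 4

def demultiplex(reads):
    """Bin reads by barcode prefix; unmatched reads go to 'unassigned'."""
    bins = {subject: [] for subject in BARCODES.values()}
    bins["unassigned"] = []
    for read in reads:
        tag, insert = read[:TAG_LEN], read[TAG_LEN:]
        bins[BARCODES.get(tag, "unassigned")].append(insert)
    return bins

reads = ["ACGTGGGCCC", "TTAGAAATTT", "ACGTCCCGGG", "NNNNACGTAC"]
bins = demultiplex(reads)
print(bins["subject_1"])  # ['GGGCCC', 'CCCGGG']
```

A real implementation would also tolerate sequencing errors within the tag (e.g. allow one mismatch), for the same reason the alignment step is mutation-tolerant.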

  19. 76 FR 72434 - Endangered Species; Receipt of Applications for Permit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-23

    ...) Those supported by quantitative information or studies; and (2) Those that include citations to, and... following applicants each request a permit to import the sport-hunted trophy of one male bontebok...

  20. 75 FR 63196 - Endangered Species; Receipt of Applications for Permit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-14

    ...) Those supported by quantitative information or studies; and (2) Those that include citations to, and... a permit to import a sport-hunted trophy of one male bontebok (Damaliscus pygargus pygargus) culled...

  1. 78 FR 62647 - Endangered Species; Receipt of Applications for Permit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-22

    ...) Those supported by quantitative information or studies; and (2) Those that include citations to, and... permit to import the sport- hunted trophy of one male bontebok (Damaliscus pygargus pygargus) culled from...

  2. Mapping Quantitative Traits in Unselected Families: Algorithms and Examples

    PubMed Central

    Dupuis, Josée; Shi, Jianxin; Manning, Alisa K.; Benjamin, Emelia J.; Meigs, James B.; Cupples, L. Adrienne; Siegmund, David

    2009-01-01

    Linkage analysis has been widely used to identify, from family data, genetic variants influencing quantitative traits. Common approaches have both strengths and limitations. Likelihood ratio tests typically computed in variance component analysis can accommodate large families but are highly sensitive to departures from normality assumptions. Regression-based approaches are more robust, but their use has primarily been restricted to nuclear families. In this paper, we develop methods for mapping quantitative traits in moderately large pedigrees. Our methods are based on the score statistic, which, in contrast to the likelihood ratio statistic, can use nonparametric estimators of variability to achieve robustness of the false positive rate against departures from the hypothesized phenotypic model. Because the score statistic is easier to calculate than the likelihood ratio statistic, our basic mapping methods utilize relatively simple computer code that performs statistical analysis on output from any program that computes estimates of identity-by-descent. This simplicity also permits development and evaluation of methods to deal with multivariate and ordinal phenotypes, and with gene-gene and gene-environment interaction. We demonstrate our methods on simulated data and on fasting insulin, a quantitative trait measured in the Framingham Heart Study. PMID:19278016

  3. Analysis of fecal coliform levels at selected storm water monitoring points at the Oak Ridge Y-12 Plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skaggs, B.E.

    1995-07-01

    The Environmental Protection Agency staff published the final storm water regulation on November 16, 1990. The storm water regulation is included in the National Pollutant Discharge Elimination System (NPDES) regulations. It specifies the permit application requirements for certain storm water discharges, such as those associated with industrial activity or municipal separate storm sewers serving populations of 100,000 or greater. Storm water discharge associated with industrial activity is discharge from any conveyance used for collecting and conveying storm water that is directly related to manufacturing, processing, or raw material storage areas at an industrial plant. Quantitative testing data is required for these discharges. An individual storm water permit application was completed and submitted to Tennessee Department of Environment and Conservation (TDEC) personnel in October 1992. After reviewing the data in the permit application, TDEC personnel expressed concern with the fecal coliform levels at many of the outfalls. The 1995 NPDES Permit (Part III-N, page 44) requires that an investigation be conducted to determine the validity of this data. If the fecal coliform data is valid, the permit requires that a report be submitted indicating possible causes and proposed corrective actions.

  4. 76 FR 65207 - Endangered Species; Receipt of Applications for Permit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-20

    ... quantitative information or studies; and (2) Those that include citations to, and analyses of, the applicable... applicants each request a permit to import the sport- hunted trophy of one male bontebok (Damaliscus pygargus...

  5. Diffraction enhanced x-ray imaging for quantitative phase contrast studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agrawal, A. K.; Singh, B., E-mail: balwants@rrcat.gov.in; Kashyap, Y. S.

    2016-05-23

    Conventional X-ray imaging based on absorption contrast permits limited visibility of features having small density and thickness variations. For imaging of weakly absorbing materials or materials possessing similar densities, a novel phase contrast imaging technique called diffraction enhanced imaging has been designed and developed at the imaging beamline of Indus-2, RRCAT, Indore. The technique provides improved visibility of interfaces and shows high contrast in the image for small density or thickness gradients in the bulk. This paper presents the basic principle, instrumentation and analysis methods for this technique. Initial results of quantitative phase retrieval carried out on various samples are also presented.

  6. A general method for bead-enhanced quantitation by flow cytometry

    PubMed Central

    Montes, Martin; Jaensson, Elin A.; Orozco, Aaron F.; Lewis, Dorothy E.; Corry, David B.

    2009-01-01

    Flow cytometry provides accurate relative cellular quantitation (percent abundance) of cells from diverse samples, but technical limitations of most flow cytometers preclude accurate absolute quantitation. Several quantitation standards are now commercially available which, when added to samples, permit absolute quantitation of CD4+ T cells. However, these reagents are limited by their cost, technical complexity, requirement for additional software and/or limited applicability. Moreover, few studies have validated the use of such reagents in complex biological samples, especially for quantitation of non-T cells. Here we show that addition to samples of known quantities of polystyrene fluorescence standardization beads permits accurate quantitation of CD4+ T cells from complex cell samples. This procedure, here termed single bead-enhanced cytofluorimetry (SBEC), was equally capable of enumerating eosinophils as well as subcellular fragments of apoptotic cells, moieties with very different optical and fluorescent characteristics. Relative to other proprietary products, SBEC is simple, inexpensive and requires no special software, suggesting that the method is suitable for the routine quantitation of most cells and other particles by flow cytometry. PMID:17067632
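The bead-based absolute count described above reduces to a simple proportion: the fraction of the spiked beads actually acquired by the cytometer is taken to equal the fraction of cells acquired from the same tube. A minimal sketch (function name and example numbers are illustrative, not from the paper):

```python
def absolute_count(cell_events, bead_events, beads_added):
    """Scale recorded cell events to an absolute count: the fraction of the
    spiked beads that was actually acquired is assumed to equal the fraction
    of cells acquired from the same tube."""
    return cell_events * beads_added / bead_events

# 25,000 beads spiked into the sample; the cytometer records 5,000 bead
# events alongside 12,000 CD4+ events, so the tube held ~60,000 CD4+ cells.
n_cd4 = absolute_count(cell_events=12_000, bead_events=5_000, beads_added=25_000)
```

Dividing the result by the original sample volume then yields the concentration (cells per microliter), which is the figure usually reported clinically.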

  7. Three-dimensional cardiac architecture determined by two-photon microtomy

    NASA Astrophysics Data System (ADS)

    Huang, Hayden; MacGillivray, Catherine; Kwon, Hyuk-Sang; Lammerding, Jan; Robbins, Jeffrey; Lee, Richard T.; So, Peter

    2009-07-01

    Cardiac architecture is inherently three-dimensional, yet most characterizations rely on two-dimensional histological slices or dissociated cells, which remove the native geometry of the heart. We previously developed a method for labeling intact heart sections without dissociation and imaging large volumes while preserving their three-dimensional structure. We further refine this method to permit quantitative analysis of imaged sections. After data acquisition, these sections are assembled using image-processing tools, and qualitative and quantitative information is extracted. By examining the reconstructed cardiac blocks, one can observe end-to-end adjacent cardiac myocytes (cardiac strands) changing cross-sectional geometries, merging and separating from other strands. Quantitatively, representative cross-sectional areas typically used for determining hypertrophy omit the three-dimensional component; we show that taking orientation into account can significantly alter the analysis. Using fast-Fourier transform analysis, we analyze the gross organization of cardiac strands in three dimensions. By characterizing cardiac structure in three dimensions, we are able to determine that the α-crystallin mutation leads to hypertrophy with cross-sectional area increases, but not necessarily via changes in fiber orientation distribution.

  8. Evolution in Cloud Population Statistics of the MJO: From AMIE Field Observations to Global-Cloud Permitting Models Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kollias, Pavlos

    This is a multi-institutional, collaborative project using a three-tier modeling approach to bridge field observations and global cloud-permitting models, with emphasis on the structural evolution of cloud populations through various large-scale environments. Our contribution was data analysis for the generation of high-value cloud and precipitation products and the derivation of cloud statistics for model validation. We contributed in two areas of data analysis: the development of a synergistic cloud and precipitation classification that identifies different cloud types (e.g., shallow cumulus, cirrus) and precipitation types (shallow, deep, convective, stratiform) using profiling ARM observations, and the development of a quantitative precipitation rate retrieval algorithm using profiling ARM observations. Similar efforts have been developed in the past for precipitation (weather radars), but not for the millimeter-wavelength (cloud) radar deployed at the ARM sites.

  9. Fluorescence excitation-emission matrix (EEM) spectroscopy and cavity ring-down (CRD) absorption spectroscopy of oil-contaminated jet fuel using fiber-optic probes.

    PubMed

    Omrani, Hengameh; Barnes, Jack A; Dudelzak, Alexander E; Loock, Hans-Peter; Waechter, Helen

    2012-06-21

    Excitation emission matrix (EEM) and cavity ring-down (CRD) spectral signatures have been used to detect and quantitatively assess contamination of jet fuels with aero-turbine lubricating oil. The EEM spectrometer has been fiber-coupled to permit in situ measurements of jet turbine oil contamination of jet fuel. Parallel Factor (PARAFAC) analysis as well as Principal Component Analysis and Regression (PCA/PCR) were used to quantify oil contamination in a range from the limit of detection (10 ppm) to 1000 ppm. Fiber-loop cavity ring-down spectroscopy using a pulsed 355 nm laser was used to quantify the oil contamination in the range of 400 ppm to 100,000 ppm. Both methods in combination therefore permit the detection of oil contamination with a linear dynamic range of about 10,000.

  10. Fast Metabolic Response to Drug Intervention through Analysis on a Miniaturized, Highly Integrated Molecular Imaging System

    PubMed Central

    Wang, Jun; Hwang, Kiwook; Braas, Daniel; Dooraghi, Alex; Nathanson, David; Campbell, Dean O.; Gu, Yuchao; Sandberg, Troy; Mischel, Paul; Radu, Caius; Chatziioannou, Arion F.; Phelps, Michael E.; Christofk, Heather; Heath, James R.

    2014-01-01

    We report on a radiopharmaceutical imaging platform designed to capture the kinetics of cellular responses to drugs. Methods: A portable in vitro molecular imaging system, comprising a microchip and a beta-particle imaging camera, permits routine cell-based radioassays on small numbers of either suspension or adherent cells. We investigate the response kinetics of model lymphoma and glioblastoma cancer cell lines to [18F]fluorodeoxyglucose ([18F]FDG) uptake following drug exposure. Those responses are correlated with kinetic changes in the cell cycle, or with changes in receptor-tyrosine kinase signaling. Results: The platform enables radioassays directly on multiple cell types, and yields results comparable to conventional approaches, but uses smaller sample sizes, permits a higher level of quantitation, and doesn't require cell lysis. Conclusion: The kinetic analysis enabled by the platform provides a rapid (~1 hour) drug screening assay. PMID:23978446

  11. Quantitative Analysis of Venus Radar Backscatter Data in ArcGIS

    NASA Technical Reports Server (NTRS)

    Long, S. M.; Grosfils, E. B.

    2005-01-01

    Ongoing mapping of the Ganiki Planitia (V14) quadrangle of Venus and definition of material units has involved an integrated but qualitative analysis of Magellan radar backscatter images and topography using standard geomorphological mapping techniques. However, such analyses do not take full advantage of the quantitative information contained within the images. Analysis of the backscatter coefficient allows a much more rigorous statistical comparison between mapped units, permitting first-order self-similarity tests of geographically separated materials assigned identical geomorphological labels. Such analyses cannot be performed directly on pixel (DN) values from Magellan backscatter images, because the pixels are scaled to the Muhleman law for radar echoes on Venus and are not corrected for latitudinal variations in incidence angle. Therefore, DN values must be converted based on pixel latitude back to their backscatter coefficient values before accurate statistical analysis can occur. Here we present a method for performing the conversions and analysis of Magellan backscatter data using commonly available ArcGIS software and illustrate the advantages of the process for geological mapping.
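The DN-to-backscatter conversion the abstract describes can be sketched as follows. The Muhleman-law constants below are the values commonly quoted for Magellan, but the linear dB scaling of the DNs (`db_per_dn`, `dn_zero`) is a placeholder assumption, and the latitude-to-incidence-angle profile is omitted (the angle is passed in directly); actual values must come from the Magellan product documentation.

```python
import numpy as np

def muhleman_sigma0(theta_rad, k1=0.0118, k2=0.111):
    """Muhleman (1964) scattering law, Magellan's reference backscatter model."""
    c, s = np.cos(theta_rad), np.sin(theta_rad)
    return k1 * c / (s + k2 * c) ** 3

def dn_to_sigma0(dn, incidence_rad, db_per_dn=0.2, dn_zero=101):
    """Recover the backscatter coefficient from an image DN.

    Assumes DN encodes the echo linearly in dB relative to the Muhleman law
    at the pixel's incidence angle; db_per_dn and dn_zero are placeholder
    values to be read from the actual Magellan product label."""
    offset_db = (np.asarray(dn, dtype=float) - dn_zero) * db_per_dn
    return muhleman_sigma0(incidence_rad) * 10.0 ** (offset_db / 10.0)

# A DN of 121 at 35 degrees incidence sits +4 dB above the Muhleman reference.
sigma0 = dn_to_sigma0(121, np.radians(35.0))
```

Because both functions are vectorized, the same call converts whole image tiles once per-pixel incidence angles have been derived from latitude, after which unit statistics (means, variances, distribution tests) are computed on the backscatter coefficient rather than on raw DN.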

  12. The embryonic mouse hindbrain as a qualitative and quantitative model for studying the molecular and cellular mechanisms of angiogenesis.

    PubMed

    Fantin, Alessandro; Vieira, Joaquim M; Plein, Alice; Maden, Charlotte H; Ruhrberg, Christiana

    2013-02-01

    The mouse embryo hindbrain is a robust and adaptable model for studying sprouting angiogenesis. It permits the spatiotemporal analysis of organ vascularization in normal mice and in mouse strains with genetic mutations that result in late embryonic or perinatal lethality. Unlike postnatal models such as retinal angiogenesis or Matrigel implants, there is no requirement for the breeding of conditional knockout mice. The unique architecture of the hindbrain vasculature allows whole-mount immunolabeling of blood vessels and high-resolution imaging, as well as easy quantification of angiogenic sprouting, network density and vessel caliber. The hindbrain model also permits the visualization of ligand binding to blood vessels in situ and the analysis of blood vessel growth within a natural multicellular microenvironment in which endothelial cells (ECs) interact with non-ECs to refine the 3D organ architecture. The entire procedure, from embryo isolation to imaging and through to results analysis, takes approximately 4 d.

  13. On the response of subduction in the South Pacific to an intensification of westerlies and heat flux in an eddy permitting ocean model

    NASA Astrophysics Data System (ADS)

    Liu, Chengyan; Wang, Zhaomin; Li, Bingrui; Cheng, Chen; Xia, Ruibin

    2017-04-01

    Based on an eddy-permitting ocean general circulation model, the response of water masses to two distinct climate scenarios in the South Pacific is assessed in this paper. Under annually repeating atmospheric forcing that is characterized by different westerlies and associated heat flux, the response of Subantarctic Mode Water (SAMW) and Antarctic Intermediate Water (AAIW) is quantitatively estimated. Both SAMW and AAIW are found to be warmer, saltier and denser under intensified westerlies and increased heat loss. The increase in the subduction volume of SAMW and AAIW is about 19.8 Sv (1 Sv = 10⁶ m³ s⁻¹). The lateral induction term plays a dominant role in the changes in the subduction volume due to the deepening of the mixed layer depth (MLD). Furthermore, analysis of the buoyancy budget is used to quantitatively diagnose the reason for the changes in the MLD. The deepening of the MLD is found to be primarily caused by the strengthening of heat loss from the ocean to the atmosphere in the formation region of SAMW and AAIW.

  14. Quantitative analysis of amygdalin and prunasin in Prunus serotina Ehrh. using ¹H-NMR spectroscopy.

    PubMed

    Santos Pimenta, Lúcia P; Schilthuizen, Menno; Verpoorte, Robert; Choi, Young Hae

    2014-01-01

    Prunus serotina is native to North America but has been invasively introduced in Europe since the seventeenth century. This plant contains cyanogenic glycosides that are believed to be related to its success as an invasive plant. For these compounds, chromatographic- or spectrometric-based (targeting HCN hydrolysis) methods of analysis have been employed so far. However, the conventional methods require tedious preparation steps and a long measuring time. The objective was to develop a fast and simple method to quantify the cyanogenic glycosides amygdalin and prunasin in dried Prunus serotina leaves without any pre-purification steps using ¹H-NMR spectroscopy. Extracts of Prunus serotina leaves using CH₃OH-d₄ and KH₂PO₄ buffer in D₂O (1:1) were quantitatively analysed for amygdalin and prunasin using ¹H-NMR spectroscopy. Different internal standards were evaluated for accuracy and stability. The purity of quantitated ¹H-NMR signals was evaluated using several two-dimensional NMR experiments. Trimethylsilylpropionic acid sodium salt-d₄ proved most suitable as the internal standard for quantitative ¹H-NMR analysis. Two-dimensional J-resolved NMR was shown to be a useful tool to confirm the structures and to check for possible signal overlapping with the target signals for the quantitation. Twenty-two samples of P. serotina were subsequently quantitatively analysed for the cyanogenic glycosides prunasin and amygdalin. The NMR method offers a fast, high-throughput analysis of cyanogenic glycosides in dried leaves, permitting simultaneous quantification and identification of prunasin and amygdalin in Prunus serotina. Copyright © 2013 John Wiley & Sons, Ltd.
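Quantitation against an internal standard of this kind rests on the standard qHNMR relation: integrated signal area is proportional to the number of moles of protons contributing to the signal. A sketch with illustrative numbers (the integrals below are invented; the molecular weights for amygdalin and the TSP-d4 standard are approximate):

```python
def qhnmr_mass(area_analyte, area_std, n_h_analyte, n_h_std,
               mw_analyte, mw_std, mass_std):
    """Mass of analyte by quantitative 1H-NMR against an internal standard:
    integrated area is proportional to moles of contributing protons, so
    m_a = (A_a / A_s) * (N_s / N_a) * (M_a / M_s) * m_s."""
    return (area_analyte / area_std) * (n_h_std / n_h_analyte) \
        * (mw_analyte / mw_std) * mass_std

# Illustrative numbers only: a one-proton amygdalin signal integrated against
# the 9-proton TSP-d4 methyl singlet, with 1.0 mg of TSP-d4 added to the tube.
mg_amygdalin = qhnmr_mass(area_analyte=1.5, area_std=1.0,
                          n_h_analyte=1, n_h_std=9,
                          mw_analyte=457.4, mw_std=172.2, mass_std=1.0)
```

The same relation applied to a resolved prunasin signal in the same spectrum is what makes the simultaneous quantification of both glycosides possible without any chromatographic separation.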

  15. A database system to support image algorithm evaluation

    NASA Technical Reports Server (NTRS)

    Lien, Y. E.

    1977-01-01

    The design of an interactive image database system, IMDB, is presented; the system allows the user to create, retrieve, store, display, and manipulate images through the facility of a high-level, interactive image query (IQ) language. The query language IQ permits the user to define false color functions, pixel value transformations, overlay functions, zoom functions, and windows. The user manipulates images through generic functions and can direct images to display devices for visual and qualitative analysis. Image histograms and pixel value distributions can also be computed to obtain a quantitative analysis of images.

  16. Modeling and Analysis of Space Based Transceivers

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.; Liebetreu, John; Moore, Michael S.; Price, Jeremy C.; Abbott, Ben

    2005-01-01

    This paper presents the tool chain, methodology, and initial results of a study to provide a thorough, objective, and quantitative analysis of the design alternatives for space Software Defined Radio (SDR) transceivers. The approach taken was to develop a set of models and tools for describing communications requirements, the algorithm resource requirements, the available hardware, and the alternative software architectures, and generate analysis data necessary to compare alternative designs. The Space Transceiver Analysis Tool (STAT) was developed to help users identify and select representative designs, calculate the analysis data, and perform a comparative analysis of the representative designs. The tool allows the design space to be searched quickly while permitting incremental refinement in regions of higher payoff.

  17. Modeling and Analysis of Space Based Transceivers

    NASA Technical Reports Server (NTRS)

    Moore, Michael S.; Price, Jeremy C.; Abbott, Ben; Liebetreu, John; Reinhart, Richard C.; Kacpura, Thomas J.

    2007-01-01

    This paper presents the tool chain, methodology, and initial results of a study to provide a thorough, objective, and quantitative analysis of the design alternatives for space Software Defined Radio (SDR) transceivers. The approach taken was to develop a set of models and tools for describing communications requirements, the algorithm resource requirements, the available hardware, and the alternative software architectures, and generate analysis data necessary to compare alternative designs. The Space Transceiver Analysis Tool (STAT) was developed to help users identify and select representative designs, calculate the analysis data, and perform a comparative analysis of the representative designs. The tool allows the design space to be searched quickly while permitting incremental refinement in regions of higher payoff.

  18. Corneal topography with high-speed swept source OCT in clinical examination

    PubMed Central

    Karnowski, Karol; Kaluzny, Bartlomiej J.; Szkulmowski, Maciej; Gora, Michalina; Wojtkowski, Maciej

    2011-01-01

    We present the applicability of high-speed swept source (SS) optical coherence tomography (OCT) for quantitative evaluation of corneal topography. A high-speed OCT device operating at 108,000 lines/s permits dense 3D imaging of the anterior segment in less than a quarter of a second, minimizing the influence of motion artifacts on final images and topographic analysis. The swept laser performance was specially adapted to meet imaging depth requirements. For the first time, to our knowledge, results of quantitative corneal analysis based on SS OCT are presented for clinical pathologies such as keratoconus, a cornea with a superficial postinfectious scar, and a cornea 5 months after penetrating keratoplasty. Additionally, a comparison with widely used commercial systems, a Placido-based topographer and a Scheimpflug imaging-based topographer, is demonstrated. PMID:21991558

  19. Structural Image Analysis of the Brain in Neuropsychology Using Magnetic Resonance Imaging (MRI) Techniques.

    PubMed

    Bigler, Erin D

    2015-09-01

    Magnetic resonance imaging (MRI) of the brain provides exceptional image quality for visualization and neuroanatomical classification of brain structure. A variety of image analysis techniques provide both qualitative and quantitative methods to relate brain structure with neuropsychological outcome, and these are reviewed herein. Of particular importance are more automated methods that permit analysis of a broad spectrum of anatomical measures, including volume, thickness and shape. The challenge for neuropsychology is deciding which metric to use, for which disorder, and when image analysis methods should be applied to assess brain structure and pathology. A basic overview is provided of the anatomical and pathoanatomical relations of different MRI sequences in assessing normal and abnormal findings. Some interpretive guidelines are offered, including factors related to similarity and symmetry of typical brain development, along with size-normalcy features of brain anatomy related to function. The review concludes with a detailed example of various quantitative techniques applied to analyzing brain structure for neuropsychological outcome studies in traumatic brain injury.

  20. Tracking Drug-induced Changes in Receptor Post-internalization Trafficking by Colocalizational Analysis.

    PubMed

    Ong, Edmund; Cahill, Catherine

    2015-07-03

    The intracellular trafficking of receptors is a collection of complex and highly controlled processes. Receptor trafficking modulates signaling and overall cell responsiveness to ligands and is, itself, influenced by intra- and extracellular conditions, including ligand-induced signaling. Optimized for use with monolayer-plated cultured cells, but extendable to free-floating tissue slices, this protocol uses immunolabelling and colocalizational analysis to track changes in intracellular receptor trafficking following both chronic/prolonged and acute interventions, including exogenous drug treatment. After drug treatment, cells are double-immunolabelled for the receptor and for markers for the intracellular compartments of interest. Sequential confocal microscopy is then used to capture two-channel photomicrographs of individual cells, which are subjected to computerized colocalizational analysis to yield quantitative colocalization scores. These scores are normalized to permit pooling of independent replicates prior to statistical analysis. Representative photomicrographs may also be processed to generate illustrative figures. Here, we describe a powerful and flexible technique for quantitatively assessing induced receptor trafficking.
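The normalization step that permits pooling of independent replicates can be as simple as expressing each replicate's colocalization scores as fold-of-control before combining them. A minimal sketch (not the authors' exact scoring procedure; all numbers are illustrative):

```python
import numpy as np

def normalize_replicate(scores, control_scores):
    """Express one replicate's colocalization scores as fold-of-control so
    that independent imaging sessions can be pooled despite day-to-day
    differences in staining intensity and microscope settings."""
    return np.asarray(scores, dtype=float) / np.mean(control_scores)

# Two replicates acquired at different absolute scales, same 2x drug effect.
rep1 = normalize_replicate([0.40, 0.44], control_scores=[0.20, 0.22])
rep2 = normalize_replicate([0.80, 0.88], control_scores=[0.40, 0.44])
pooled = np.concatenate([rep1, rep2])  # values cluster around 2.0
```

After normalization the pooled values from both sessions are on a common scale, so a single statistical test can be run across replicates.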

  1. 33 CFR 19.15 - Permits for commercial vessels handling explosives at military installations.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... no quantitative restrictions, based on considerations of isolation and remoteness, shall be required... from the Coast Guard for such operations with respect to quantitative or other restrictions imposed by...

  2. 33 CFR 19.15 - Permits for commercial vessels handling explosives at military installations.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... no quantitative restrictions, based on considerations of isolation and remoteness, shall be required... from the Coast Guard for such operations with respect to quantitative or other restrictions imposed by...

  3. 33 CFR 19.15 - Permits for commercial vessels handling explosives at military installations.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... no quantitative restrictions, based on considerations of isolation and remoteness, shall be required... from the Coast Guard for such operations with respect to quantitative or other restrictions imposed by...

  4. 21 CFR 170.18 - Tolerances for related food additives.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... quantitative determination of the amount of each food additive present or unless it is shown that a higher... methods that permit quantitative determination of each residue, the quantity of combined residues that are...

  5. 33 CFR 19.15 - Permits for commercial vessels handling explosives at military installations.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... no quantitative restrictions, based on considerations of isolation and remoteness, shall be required... from the Coast Guard for such operations with respect to quantitative or other restrictions imposed by...

  6. 33 CFR 19.15 - Permits for commercial vessels handling explosives at military installations.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... no quantitative restrictions, based on considerations of isolation and remoteness, shall be required... from the Coast Guard for such operations with respect to quantitative or other restrictions imposed by...

  7. Gene Profiling Technique to Accelerate Stem Cell Therapies for Eye Diseases

    MedlinePlus

    ... like RPE. They also use a technique called quantitative RT-PCR to measure the expression of genes ... higher in iPS cells than mature RPE. But quantitative RT-PCR only permits the simultaneous measurement of ...

  8. Determining conduction patterns on a sparse electrode grid: Implications for the analysis of clinical arrhythmias

    NASA Astrophysics Data System (ADS)

    Vidmar, David; Narayan, Sanjiv M.; Krummen, David E.; Rappel, Wouter-Jan

    2016-11-01

    We present a general method of utilizing bioelectric recordings from a spatially sparse electrode grid to compute a dynamic vector field describing the underlying propagation of electrical activity. This vector field, termed the wave-front flow field, permits quantitative analysis of the magnitude of rotational activity (vorticity) and focal activity (divergence) at each spatial point. We apply this method to signals recorded during arrhythmias in human atria and ventricles using a multipolar contact catheter and show that the flow fields correlate with corresponding activation maps. Further, regions of elevated vorticity and divergence correspond to sites identified as clinically significant rotors and focal sources where therapeutic intervention can be effective. These flow fields can provide quantitative insights into the dynamics of normal and abnormal conduction in humans and could potentially be used to enhance therapies for cardiac arrhythmias.
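The vorticity and divergence maps described here are pointwise spatial derivatives of the reconstructed flow field. A minimal sketch on a regular grid (finite differences via `numpy.gradient`; the sparse-electrode-to-grid interpolation step that precedes it is omitted):

```python
import numpy as np

def vorticity_divergence(vx, vy, dx=1.0, dy=1.0):
    """Pointwise curl (z-component) and divergence of a 2-D vector field
    sampled on a regular grid; high |vorticity| flags rotational (rotor-like)
    activity, positive divergence flags focal sources."""
    dvx_dy, dvx_dx = np.gradient(vx, dy, dx)  # derivatives along rows, cols
    dvy_dy, dvy_dx = np.gradient(vy, dy, dx)
    vort = dvy_dx - dvx_dy
    div = dvx_dx + dvy_dy
    return vort, div

# Rigid rotation v = (-y, x): vorticity 2 everywhere, divergence 0.
y, x = np.mgrid[-2:3, -2:3].astype(float)
vort, div = vorticity_divergence(-y, x)
```

Scanning the resulting maps for persistent extrema is one way to localize candidate rotor and focal-source sites for targeted intervention.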

  9. A method for the quantitative site-specific study of the biochemistry within dental plaque biofilms formed in vivo.

    PubMed

    Robinson, C; Kirkham, J; Percival, R; Shore, R C; Bonass, W A; Brookes, S J; Kusa, L; Nakagaki, H; Kato, K; Nattress, B

    1997-01-01

    The study of plaque biofilms in the oral cavity is difficult as plaque removal inevitably disrupts biofilm integrity precluding kinetic studies involving the penetration of components and metabolism of substrates in situ. A method is described here in which plaque is formed in vivo under normal (or experimental) conditions using a collection device which can be removed from the mouth after a specified time without physical disturbance to the plaque biofilm, permitting site-specific analysis or exposure of the undisturbed plaque to experimental conditions in vitro. Microbiological analysis revealed plaque flora which was similar to that reported from many natural sources. Analytical data can be related to plaque volume rather than weight. Using this device, plaque fluoride concentrations have been shown to vary with plaque depth and in vitro short-term exposure to radiolabelled components may be carried out, permitting important conclusions to be drawn regarding the site-specific composition and dynamics of dental plaque.

  10. 76 FR 7712 - Clothianidin; Time-Limited Pesticide Tolerances

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-11

    ... registration of this seed treatment. Valent has requested an experimental use permit and this tolerance to... evidence of increased quantitative or qualitative susceptibility of rat or rabbit fetuses following in utero exposure to clothianidin in developmental studies; however, increased quantitative susceptibility...

  11. Customized Molecular Phenotyping by Quantitative Gene Expression and Pattern Recognition Analysis

    PubMed Central

    Akilesh, Shreeram; Shaffer, Daniel J.; Roopenian, Derry

    2003-01-01

    Description of the molecular phenotypes of pathobiological processes in vivo is a pressing need in genomic biology. We have implemented a high-throughput real-time PCR strategy to establish quantitative expression profiles of a customized set of target genes. It enables rapid, reproducible data acquisition from limited quantities of RNA, permitting serial sampling of mouse blood during disease progression. We developed an easy to use statistical algorithm—Global Pattern Recognition—to readily identify genes whose expression has changed significantly from healthy baseline profiles. This approach provides unique molecular signatures for rheumatoid arthritis, systemic lupus erythematosus, and graft versus host disease, and can also be applied to defining the molecular phenotype of a variety of other normal and pathological processes. PMID:12840047

  12. Current status of nuclear cardiology: a limited review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Botvinick, E.H.; Dae, M.; Hattner, R.S.

    1985-11-01

    To summarize the current status of nuclear cardiology, the authors will focus on areas that emphasize the specific advantages of nuclear cardiology methods: (a) their benign, noninvasive nature, (b) their pathophysiologic nature, and (c) the ease of their computer manipulation and analysis, permitting quantitative evaluation. The areas covered include: (a) blood pool scintigraphy and parametric imaging, (b) pharmacologic intervention for the diagnosis of ischemic heart disease, (c) scintigraphic studies for the diagnosis and prognosis of coronary artery disease, and (d) considerations of cost effectiveness.

  13. Corrosion Control through a Better Understanding of the Metallic Substrate/Organic Coating/Interface.

    DTIC Science & Technology

    1982-12-01

    run to run. A Karl Fischer automatic titrimeter has been ordered to enable routine analysis of water in both the inlet and exit streams to determine...Block-Styrene)," M.S. Thesis, Chemical Engineering, June 1982, by D. E. Zurawski. "Electron Optical Methods and the Study of Corrosion," M.S. Thesis...interface as viewed through a thin transparent metal deposited onto glass. The latter method will permit quantitative studies of the corrosion and

  14. Advanced body composition assessment: from body mass index to body composition profiling.

    PubMed

    Borga, Magnus; West, Janne; Bell, Jimmy D; Harvey, Nicholas C; Romu, Thobias; Heymsfield, Steven B; Dahlqvist Leinhard, Olof

    2018-06-01

    This paper gives a brief overview of common non-invasive techniques for body composition analysis and a more in-depth review of a body composition assessment method based on fat-referenced quantitative MRI. Earlier published studies of this method are summarized, and a previously unpublished validation study, based on 4753 subjects from the UK Biobank imaging cohort, comparing the quantitative MRI method with dual-energy X-ray absorptiometry (DXA) is presented. For whole-body measurements of adipose tissue (AT) or fat and lean tissue (LT), DXA and quantitative MRI show excellent agreement, with linear correlations of 0.99 and 0.97, and coefficients of variation (CV) of 4.5 and 4.6 per cent for fat (computed from AT) and LT, respectively, but agreement was significantly lower for visceral adipose tissue, with a CV of >20 per cent. The additional ability of MRI to also measure muscle volumes, muscle AT infiltration and ectopic fat, in combination with rapid scanning protocols and efficient image analysis tools, makes quantitative MRI a powerful tool for advanced body composition assessment. © American Federation for Medical Research (unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
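
    The agreement statistics quoted in the abstract (linear correlation, CV) can be computed from paired whole-body measurements. A minimal sketch, assuming one common CV definition (standard deviation of the paired differences over the grand mean, in per cent) since the paper's exact formula is not given here; the fat-mass readings are invented:

```python
import numpy as np

def agreement_stats(dxa, mri):
    """Pearson correlation and coefficient of variation (per cent) for paired
    body-composition measurements from two modalities."""
    dxa, mri = np.asarray(dxa, float), np.asarray(mri, float)
    r = np.corrcoef(dxa, mri)[0, 1]
    cv = 100.0 * np.std(dxa - mri, ddof=1) / np.mean(np.concatenate([dxa, mri]))
    return r, cv

# Hypothetical fat-mass readings (kg) for five subjects:
r, cv = agreement_stats([20.1, 31.5, 25.2, 40.8, 18.3],
                        [20.6, 30.9, 25.8, 41.5, 18.0])
print(round(r, 3), round(cv, 1))
```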

  15. Development of a Fourier transform infrared spectroscopy coupled to UV-Visible analysis technique for aminosides and glycopeptides quantitation in antibiotic locks.

    PubMed

    Sayet, G; Sinegre, M; Ben Reguiga, M

    2014-01-01

    The antibiotic lock technique maintains catheter sterility in high-risk patients on long-term parenteral nutrition. In our institution, vancomycin, teicoplanin, amikacin and gentamicin locks are prepared in the pharmaceutical department. To ensure patient safety and to comply with regulatory requirements, antibiotic locks are submitted to qualitative and quantitative assays prior to their release. The aim of this study was to develop an alternative quantitation technique for each of these 4 antibiotics, using Fourier transform infrared (FTIR) spectroscopy coupled to UV-Visible spectroscopy, and to compare results to HPLC or immunochemistry assays. Prevalidation studies established the spectroscopic conditions used for antibiotic lock quantitation: FTIR/UV combinations were used for amikacin (1091-1115cm(-1) and 208-224nm), vancomycin (1222-1240cm(-1) and 276-280nm), and teicoplanin (1226-1230cm(-1) and 278-282nm). Gentamicin was quantified by FTIR only (1045-1169cm(-1) and 2715-2850cm(-1)) because parabens, preservatives present in the commercial brand used to prepare the locks, interfere in the UV domain. For all antibiotic locks, the method was linear (R(2)=0.996 to 0.999), accurate, repeatable (intraday RSD%: 2.9 to 7.1%; inter-day RSD%: 2.9 to 5.1%) and precise. Compared to the reference methods, the FTIR/UV method was tightly correlated (Pearson factor: 97.4 to 99.9%) and did not show significant differences in recovery determinations. We developed a new, simple, reliable analysis technique for antibiotic quantitation in locks using an original association of FTIR and UV analysis, allowing short analysis times to identify and quantify the studied antibiotics. Copyright © 2013 Elsevier Masson SAS. All rights reserved.
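
    The linearity and recovery figures above come from calibration-curve validation. A minimal sketch of such a calibration; the standard concentrations and absorbance readings are invented for illustration, not taken from the paper:

```python
import numpy as np

# Hypothetical calibration standards (mg/mL) and their absorbance readings:
conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
signal = np.array([0.11, 0.20, 0.41, 0.79, 1.62])

# Least-squares calibration line: signal = slope * conc + intercept
slope, intercept = np.polyfit(conc, signal, 1)
pred = slope * conc + intercept
r2 = 1 - np.sum((signal - pred) ** 2) / np.sum((signal - np.mean(signal)) ** 2)

def quantify(absorbance):
    """Back-calculate the concentration of an unknown lock from its reading."""
    return (absorbance - intercept) / slope

print(round(r2, 3))   # linearity check; should be close to 1
```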

  16. Digital color analysis of color-ratio composite LANDSAT scenes. [Nevada

    NASA Technical Reports Server (NTRS)

    Raines, G. L.

    1977-01-01

    A method is presented that can be used to calculate approximate Munsell coordinates of the colors produced by making a color composite from three registered images. Applied to the LANDSAT MSS data of the Goldfield, Nevada, area, this method permits precise and quantitative definition of the limonitic areas originally observed in a LANDSAT color ratio composite. In addition, areas of transported limonite can be discriminated from the limonite in the hydrothermally altered areas of the Goldfield mining district. From the analysis, the numerical distinction between limonitic and nonlimonitic ground is generally less than 3% using the LANDSAT bands and as much as 8% in ratios of LANDSAT MSS bands.
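
    The Munsell conversion itself is not detailed in the abstract, but the band-ratio step that precedes it can be sketched. A minimal, hypothetical example of stacking MSS band ratios into the channels of a color-ratio composite (the particular ratios and the small epsilon are illustrative, not the paper's):

```python
import numpy as np

def ratio_composite(b4, b5, b6, b7):
    """Stack three MSS band ratios into the channels of a color-ratio composite."""
    eps = 1e-6                        # guard against division by zero
    return np.dstack([b4 / (b5 + eps),
                      b5 / (b6 + eps),
                      b6 / (b7 + eps)])

# Tiny 2x2 synthetic scene:
b4 = np.full((2, 2), 40.0); b5 = np.full((2, 2), 50.0)
b6 = np.full((2, 2), 60.0); b7 = np.full((2, 2), 80.0)
rgb = ratio_composite(b4, b5, b6, b7)
print(rgb.shape)  # → (2, 2, 3)
```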

  17. Ordered Rape: A Principal-Agent Analysis of Wartime Sexual Violence in the DR Congo.

    PubMed

    Schneider, Gerald; Banholzer, Lilli; Albarracin, Laura

    2015-11-01

    Policy makers and academics often contend that organizational anarchy permits soldiers to perpetrate sexual violence. A recent United Nations report supports this thesis especially with regard to the massive sexual abuse in the Congolese civil war. We challenge the anarchy argument and maintain, based on a principal-agent framework, that opportunistic military commanders can order their soldiers to rape through the use of sanctions and rewards. Our qualitative and quantitative analysis of a survey of 96 Congolese ex-soldiers shows that ordered rape is more likely in organizations where soldiers fear punishment and in which commanders distribute drugs as stimulants. © The Author(s) 2015.

  18. Quantitative Decision Making.

    ERIC Educational Resources Information Center

    Baldwin, Grover H.

    The use of quantitative decision making tools provides the decision maker with a range of alternatives among which to decide, permits acceptance and use of the optimal solution, and decreases risk. Training line administrators in the use of these tools can help school business officials obtain reliable information upon which to base district…

  19. Why physics needs mathematics

    NASA Astrophysics Data System (ADS)

    Rohrlich, Fritz

    2011-12-01

    The classical and quantum mechanical sciences are in essential need of mathematics. Only thus can the laws of nature be formulated quantitatively, permitting quantitative predictions. Mathematics also facilitates extrapolations. But classical and quantum sciences differ in essential ways: they follow different laws of logic, Aristotelian and non-Aristotelian logics, respectively. These are explicated.

  20. Proteomic analysis of hair shafts from monozygotic twins: Expression profiles and genetically variant peptides.

    PubMed

    Wu, Pei-Wen; Mason, Katelyn E; Durbin-Johnson, Blythe P; Salemi, Michelle; Phinney, Brett S; Rocke, David M; Parker, Glendon J; Rice, Robert H

    2017-07-01

    Forensic association of hair shaft evidence with individuals is currently assessed by comparing mitochondrial DNA haplotypes of reference and casework samples, primarily for exclusionary purposes. Present work tests and validates more recent proteomic approaches to extract quantitative transcriptional and genetic information from hair samples of monozygotic twin pairs, which would be predicted to partition away from unrelated individuals if the datasets contain identifying information. Protein expression profiles and polymorphic, genetically variant hair peptides were generated from ten pairs of monozygotic twins. Profiling using the protein tryptic digests revealed that samples from identical twins had typically an order of magnitude fewer protein expression differences than unrelated individuals. The data did not indicate that the degree of difference within twin pairs increased with age. In parallel, data from the digests were used to detect genetically variant peptides that result from common nonsynonymous single nucleotide polymorphisms in genes expressed in the hair follicle. Compilation of the variants permitted sorting of the samples by hierarchical clustering, permitting accurate matching of twin pairs. The results demonstrate that genetic differences are detectable by proteomic methods and provide a framework for developing quantitative statistical estimates of personal identification that increase the value of hair shaft evidence. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
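
    The paper matches twin pairs by hierarchical clustering of variant-peptide profiles. The same idea can be illustrated with an even simpler criterion: in a pairwise-distance matrix, twin samples should emerge as mutual nearest neighbors. The profiles below are synthetic and the mutual-nearest-neighbor rule is a stand-in for the paper's clustering, not its actual method:

```python
import numpy as np

def mutual_nearest_pairs(profiles):
    """Pair samples that are each other's nearest neighbor in profile space."""
    d = np.linalg.norm(profiles[:, None, :] - profiles[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                 # ignore self-distances
    nn = d.argmin(axis=1)
    return sorted({tuple(sorted((i, int(j)))) for i, j in enumerate(nn) if nn[j] == i})

rng = np.random.default_rng(0)
a, b = rng.random(20), rng.random(20)
samples = np.array([a, a + 0.01, b, b + 0.01])  # two synthetic "twin" pairs
print(mutual_nearest_pairs(samples))  # → [(0, 1), (2, 3)]
```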

  1. Shot noise-limited Cramér-Rao bound and algorithmic sensitivity for wavelength shifting interferometry

    NASA Astrophysics Data System (ADS)

    Chen, Shichao; Zhu, Yizheng

    2017-02-01

    Sensitivity is a critical index measuring the temporal fluctuation of the retrieved optical pathlength in a quantitative phase imaging system. However, an accurate and comprehensive analysis for sensitivity evaluation is still lacking in the current literature. In particular, previous theoretical studies of fundamental sensitivity based on Gaussian noise models are not applicable to modern cameras and detectors, which are dominated by shot noise. In this paper, we derive two shot noise-limited theoretical sensitivities, the Cramér-Rao bound and the algorithmic sensitivity, for wavelength shifting interferometry, a major category of on-axis interferometry techniques in quantitative phase imaging. Based on the derivations, we show that the shot noise-limited model permits accurate estimation of theoretical sensitivities directly from measured data. These results provide important insights into fundamental constraints on system performance and can be used to guide system design and optimization. The same concepts can be generalized to other quantitative phase imaging techniques as well.
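
    The paper's exact bound is not reproduced in the abstract; as an orientation, the standard textbook form of the shot-noise-limited phase bound for equally spaced phase steps of a unit-visibility fringe scales as the inverse square root of the total detected photon count. A sketch under that assumption (the paper's derived expression may differ):

```python
import math

def phase_crb(photons_per_frame, n_frames):
    """Shot-noise-limited Cramér-Rao lower bound on phase (radians) for
    equally spaced phase steps of a unit-visibility fringe; standard
    textbook form, assumed here for illustration."""
    return math.sqrt(2.0 / (n_frames * photons_per_frame))

# 4-step acquisition at 10,000 detected photons per frame per pixel:
print(phase_crb(10_000, 4))  # ≈ 7.07e-3 rad
```

    Doubling the total photon budget improves the bound by a factor of sqrt(2), which is why sensitivity estimates can be read directly off measured intensity data.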

  2. Untargeted Metabolic Quantitative Trait Loci Analyses Reveal a Relationship between Primary Metabolism and Potato Tuber Quality

    PubMed Central

    Carreno-Quintero, Natalia; Acharjee, Animesh; Maliepaard, Chris; Bachem, Christian W.B.; Mumm, Roland; Bouwmeester, Harro; Visser, Richard G.F.; Keurentjes, Joost J.B.

    2012-01-01

    Recent advances in -omics technologies such as transcriptomics, metabolomics, and proteomics along with genotypic profiling have permitted dissection of the genetics of complex traits represented by molecular phenotypes in nonmodel species. To identify the genetic factors underlying variation in primary metabolism in potato (Solanum tuberosum), we have profiled primary metabolite content in a diploid potato mapping population, derived from crosses between S. tuberosum and wild relatives, using gas chromatography-time of flight-mass spectrometry. In total, 139 polar metabolites were detected, of which we identified metabolite quantitative trait loci for approximately 72% of the detected compounds. In order to obtain an insight into the relationships between metabolic traits and classical phenotypic traits, we also analyzed statistical associations between them. The combined analysis of genetic information through quantitative trait locus coincidence and the application of statistical learning methods provide information on putative indicators associated with the alterations in metabolic networks that affect complex phenotypic traits. PMID:22223596

  3. Biopharmaceutical production: Applications of surface plasmon resonance biosensors.

    PubMed

    Thillaivinayagalingam, Pranavan; Gommeaux, Julien; McLoughlin, Michael; Collins, David; Newcombe, Anthony R

    2010-01-15

    Surface plasmon resonance (SPR) permits the quantitative analysis of therapeutic antibody concentrations and of impurities including bacteria, Protein A, Protein G and small-molecule ligands leached from chromatography media. SPR has gained popularity within the biopharmaceutical industry due to the automated, label-free, real-time interaction analysis that this method provides. Application areas in assessing protein interactions and developing analytical methods for biopharmaceutical downstream process development, quality control, and in-process monitoring are reviewed. Copyright © 2009 Elsevier B.V. All rights reserved.

  4. Stripline fast faraday cup for measuring GHz structure of ion beams

    DOEpatents

    Bogaty, John M.

    1992-01-01

    The Stripline Fast Faraday Cup is a device used to quantitatively and qualitatively measure gigahertz time-structure characteristics of ion beams with energies up to at least 30 MeV per nucleon. A stripline geometry is employed in conjunction with an electrostatic screen and a Faraday cup to provide for analysis of the structural characteristics of an ion beam. The stripline geometry allows a large reduction in the size of the instrument, while the electrostatic screen permits measurements of the properties associated with low-speed ion beams.

  5. Blackboard architecture for medical image interpretation

    NASA Astrophysics Data System (ADS)

    Davis, Darryl N.; Taylor, Christopher J.

    1991-06-01

    There is a growing interest in using sophisticated knowledge-based systems for biomedical image interpretation. We present a principled attempt to use artificial intelligence methodologies in interpreting lateral skull x-ray images. Such radiographs are routinely used in cephalometric analysis to provide quantitative measurements useful to clinical orthodontists. Manual and interactive methods of analysis are known to be error prone and previous attempts to automate this analysis typically fail to capture the expertise and adaptability required to cope with the variability in biological structure and image quality. An integrated model-based system has been developed which makes use of a blackboard architecture and multiple knowledge sources. A model definition interface allows quantitative models, of feature appearance and location, to be built from examples as well as more qualitative modelling constructs. Visual task definition and blackboard control modules allow task-specific knowledge sources to act on information available to the blackboard in a hypothesise and test reasoning cycle. Further knowledge-based modules include object selection, location hypothesis, intelligent segmentation, and constraint propagation systems. Alternative solutions to given tasks are permitted.

  6. Improved hydrophilic interaction chromatography LC/MS of heparinoids using a chip with postcolumn makeup flow.

    PubMed

    Staples, Gregory O; Naimy, Hicham; Yin, Hongfeng; Kileen, Kevin; Kraiczek, Karsten; Costello, Catherine E; Zaia, Joseph

    2010-01-15

    Heparan sulfate (HS) and heparin are linear, heterogeneous carbohydrates of the glycosaminoglycan (GAG) family that are modified by N-acetylation, N-sulfation, O-sulfation, and uronic acid epimerization. HS interacts with growth factors in the extracellular matrix, thereby modulating signaling pathways that govern cell growth, development, differentiation, proliferation, and adhesion. High-performance liquid chromatography (HPLC)-chip-based hydrophilic interaction liquid chromatography/mass spectrometry has emerged as a method for analyzing the domain structure of GAGs. However, analysis of highly sulfated GAG structures decasaccharide or larger in size has been limited by spray instability in the negative-ion mode. This report demonstrates that addition of postcolumn makeup flow to the amide-HPLC-chip configuration permits robust and reproducible analysis of extended GAG domains (up to degree of polymerization 18) from HS and heparin. This platform provides quantitative information regarding the oligosaccharide profile, degree of sulfation, and nonreducing chain termini. It is expected that this technology will enable quantitative, comparative glycomics profiling of extended GAG oligosaccharide domains of functional interest.

  7. Analysis of Fringe Field Formed Inside LDA Measurement Volume Using Compact Two Hololens Imaging Systems

    NASA Astrophysics Data System (ADS)

    Ghosh, Abhijit; Nirala, A. K.; Yadav, H. L.

    2018-03-01

    We have designed and fabricated four LDA optical setups consisting of four different aberration-compensated compact two-hololens imaging systems. We have experimentally investigated and realized a hololens recording geometry which is the interferogram of a converging spherical wavefront with a mutually coherent planar wavefront. The proposed real-time monitoring and actual fringe-field analysis techniques allow complete characterization of the fringes formed at the measurement volume and permit evaluation of beam quality, alignment and fringe uniformity with greater precision. After experimentally analyzing the fringes formed at the measurement volume by all four imaging systems, we find that the fringes obtained using the compact two-hololens imaging systems are improved both qualitatively and quantitatively compared with those obtained using a conventional imaging system. Results indicate qualitative improvement in the non-uniformity of fringe thickness and in micro intensity variations perpendicular to the fringes, and a quantitative improvement of 39.25% in the overall average normalized standard deviation of fringe width formed by the compact two-hololens imaging systems compared to that of the conventional imaging system.

  8. Evaluating a multi-criteria model for hazard assessment in urban design. The Porto Marghera case study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luria, Paolo; Aspinall, Peter A

    2003-08-01

    The aim of this paper is to describe a new approach to major industrial hazard assessment, which has been recently studied by the authors in conjunction with the Italian Environmental Protection Agency ('ARPAV'). The real opportunity for developing a different approach arose from the need of the Italian EPA to provide the Venice Port Authority with an appropriate estimation of major industrial hazards in Porto Marghera, an industrial estate near Venice (Italy). However, the standard model, the quantitative risk analysis (QRA), only provided a list of individual quantitative risk values, related to single locations. The experimental model is based on a multi-criteria approach--the Analytic Hierarchy Process--which introduces the use of expert opinions, complementary skills and expertise from different disciplines in conjunction with quantitative traditional analysis. This permitted the generation of quantitative data on risk assessment from a series of qualitative assessments, on the present situation and on three other future scenarios, and use of this information as indirect quantitative measures, which could be aggregated for obtaining the global risk rate. This approach is in line with the main concepts proposed by the last European directive on Major Hazard Accidents, which recommends increasing the participation of operators, taking the other players into account and, moreover, paying more attention to the concepts of 'urban control', 'subjective risk' (risk perception) and intangible factors (factors not directly quantifiable).
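
    The Analytic Hierarchy Process mentioned above turns expert pairwise judgments into quantitative weights. A minimal sketch of Saaty's eigenvector method; the 3-criterion comparison matrix is hypothetical, not the Porto Marghera data:

```python
import numpy as np

def ahp_weights(pairwise):
    """Criterion weights from a reciprocal pairwise-comparison matrix via the
    principal eigenvector (Saaty's AHP), plus the consistency index."""
    A = np.asarray(pairwise, float)
    vals, vecs = np.linalg.eig(A)
    k = vals.real.argmax()                      # principal eigenvalue index
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                                # normalize to sum to 1
    n = A.shape[0]
    ci = (vals.real[k] - n) / (n - 1)           # consistency index
    return w, ci

# Illustrative matrix: criterion 1 moderately-to-strongly dominates 2 and 3.
A = [[1, 3, 5],
     [1/3, 1, 3],
     [1/5, 1/3, 1]]
w, ci = ahp_weights(A)
print(w.round(3), round(ci, 3))
```

    A consistency index well below about 0.1 indicates that the expert judgments are acceptably coherent before the weights are used to aggregate scenario scores.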

  9. Qualification of computerized monitoring systems in a cell therapy facility compliant with the good manufacturing practices.

    PubMed

    Del Mazo-Barbara, Anna; Mirabel, Clémentine; Nieto, Valentín; Reyes, Blanca; García-López, Joan; Oliver-Vila, Irene; Vives, Joaquim

    2016-09-01

    Computerized systems (CS) are essential in the development and manufacture of cell-based medicines and must comply with good manufacturing practice, thus pushing academic developers to implement methods that are typically found within pharmaceutical industry environments. Qualitative and quantitative risk analyses were performed by Ishikawa and Failure Mode and Effects Analysis, respectively. A process for qualification of a CS that keeps track of environmental conditions was designed and executed. The simplicity of the Ishikawa analysis permitted identification of critical parameters that were subsequently quantified by Failure Mode and Effects Analysis, resulting in a list of tests included in the qualification protocols. The approach presented here helps simplify and streamline the qualification of CS in compliance with pharmaceutical quality standards.
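
    The Failure Mode and Effects Analysis step quantifies each identified failure mode by scoring severity, occurrence and detectability and ranking by their product, the Risk Priority Number. A minimal sketch; the failure modes and scores below are illustrative, not the facility's actual list:

```python
# Each failure mode gets (severity, occurrence, detection) scores on a 1-10
# scale; their product is the Risk Priority Number (RPN) used to decide which
# tests the qualification protocol must cover first.
failure_modes = {
    "sensor drift":        (7, 4, 3),
    "network outage":      (8, 2, 2),
    "alarm not triggered": (9, 2, 5),
}

rpn = {name: s * o * d for name, (s, o, d) in failure_modes.items()}
for name in sorted(rpn, key=rpn.get, reverse=True):
    print(name, rpn[name])          # highest-risk mode first
```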

  10. Quantitation by Portable Gas Chromatography: Mass Spectrometry of VOCs Associated with Vapor Intrusion

    PubMed Central

    Fair, Justin D.; Bailey, William F.; Felty, Robert A.; Gifford, Amy E.; Shultes, Benjamin; Volles, Leslie H.

    2010-01-01

    Development of a robust, reliable technique that permits rapid quantitation of volatile organic chemicals is an important first step toward remediation associated with vapor intrusion. This paper describes the development of an analytical method that allows rapid and precise identification and quantitation of halogenated and nonhalogenated contaminants commonly found at the ppbv level at sites where vapor intrusion is a concern. PMID:20885969

  11. Automated fluorescent microscopic image analysis of PTBP1 expression in glioma

    PubMed Central

    Becker, Aline; Elder, Brad; Puduvalli, Vinay; Winter, Jessica; Gurcan, Metin

    2017-01-01

    Multiplexed immunofluorescent testing has not entered diagnostic neuropathology due to several technical barriers, among them autofluorescence. This study presents the implementation of a methodology capable of overcoming the visual challenges of fluorescent microscopy for diagnostic neuropathology by using automated digital image analysis, with the long-term goal of providing unbiased quantitative analyses of multiplexed biomarkers for solid tissue neuropathology. In this study, we validated PTBP1, a putative biomarker for glioma, and tested the extent to which immunofluorescent microscopy combined with automated and unbiased image analysis would permit the use of PTBP1 as a biomarker to distinguish diagnostically challenging surgical biopsies. As a paradigm, we utilized second resections from patients diagnosed either with reactive brain changes (pseudoprogression) or recurrent glioblastoma (true progression). Our image analysis workflow was capable of removing background autofluorescence and permitted quantification of DAPI-PTBP1 positive cells, PTBP1-positive nuclei, and the mean intensity value of PTBP1 signal in cells. Traditional pathological interpretation was unable to distinguish between groups due to unacceptably high discordance rates among expert neuropathologists. Our data demonstrated that recurrent glioblastoma showed more DAPI-PTBP1 positive cells and a higher mean intensity value of PTBP1 signal compared to resections from second surgeries that showed only reactive gliosis. Our work demonstrates the potential of automated image analysis to overcome the challenges of implementing fluorescent microscopy in diagnostic neuropathology. PMID:28282372

  12. Visualization and quantitative analysis of extrachromosomal telomere-repeat DNA in individual human cells by Halo-FISH

    PubMed Central

    Komosa, Martin; Root, Heather; Meyn, M. Stephen

    2015-01-01

    Current methods for characterizing extrachromosomal nuclear DNA in mammalian cells do not permit single-cell analysis, are often semi-quantitative and frequently biased toward the detection of circular species. To overcome these limitations, we developed Halo-FISH to visualize and quantitatively analyze extrachromosomal DNA in single cells. We demonstrate Halo-FISH by using it to analyze extrachromosomal telomere-repeat (ECTR) in human cells that use the Alternative Lengthening of Telomeres (ALT) pathway(s) to maintain telomere lengths. We find that GM847 and VA13 ALT cells average ∼80 detectable G/C-strand ECTR DNA molecules/nucleus, while U2OS ALT cells average ∼18 molecules/nucleus. In comparison, human primary and telomerase-positive cells contain <5 ECTR DNA molecules/nucleus. ECTR DNA in ALT cells exhibit striking cell-to-cell variations in number (<20 to >300), range widely in length (<1 to >200 kb) and are composed of primarily G- or C-strand telomere-repeat DNA. Halo-FISH enables, for the first time, the simultaneous analysis of ECTR DNA and chromosomal telomeres in a single cell. We find that ECTR DNA comprises ∼15% of telomere-repeat DNA in GM847 and VA13 cells, but <4% in U2OS cells. In addition to its use in ALT cell analysis, Halo-FISH can facilitate the study of a wide variety of extrachromosomal DNA in mammalian cells. PMID:25662602

  13. A method for evaluating the murine pulmonary vasculature using micro-computed tomography.

    PubMed

    Phillips, Michael R; Moore, Scott M; Shah, Mansi; Lee, Clara; Lee, Yueh Z; Faber, James E; McLean, Sean E

    2017-01-01

    Significant mortality and morbidity are associated with alterations in the pulmonary vasculature. While techniques have been described for quantitative morphometry of whole-lung arterial trees in larger animals, no methods have been described in mice. We report a method for the quantitative assessment of murine pulmonary arterial vasculature using high-resolution computed tomography scanning. Mice were harvested at 2 weeks, 4 weeks, and 3 months of age. The pulmonary artery vascular tree was pressure perfused to maximal dilation with a radio-opaque casting material with viscosity and pressure set to prevent capillary transit and venous filling. The lungs were fixed and scanned on a specimen computed tomography scanner at 8-μm resolution, and the vessels were segmented. Vessels were grouped into categories based on lumen diameter and branch generation. Robust high-resolution segmentation was achieved, permitting detailed quantitation of pulmonary vascular morphometrics. As expected, postnatal lung development was associated with progressive increase in small-vessel number and arterial branching complexity. These methods for quantitative analysis of the pulmonary vasculature in postnatal and adult mice provide a useful tool for the evaluation of mouse models of disease that affect the pulmonary vasculature. Copyright © 2016 Elsevier Inc. All rights reserved.
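
    The grouping of segmented vessels into categories by lumen diameter can be sketched as a simple binning step; the diameters and bin edges below are invented for illustration (the paper also groups by branch generation):

```python
import numpy as np

# Lumen diameters (µm) of segmented vessels from a hypothetical scan:
diameters = np.array([12.0, 18.5, 35.0, 60.0, 140.0, 310.0])
edges = np.array([20.0, 50.0, 100.0, 200.0])        # category boundaries
labels = ["<20", "20-50", "50-100", "100-200", ">200"]

idx = np.digitize(diameters, edges)                 # size-category index per vessel
counts = {lab: int((idx == i).sum()) for i, lab in enumerate(labels)}
print(counts)
```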

  14. Quantitative modelling in cognitive ergonomics: predicting signals passed at danger.

    PubMed

    Moray, Neville; Groeger, John; Stanton, Neville

    2017-02-01

    This paper shows how to combine field observations, experimental data and mathematical modelling to produce quantitative explanations and predictions of complex events in human-machine interaction. As an example, we consider a major railway accident. In 1999, a commuter train passed a red signal near Ladbroke Grove, UK, into the path of an express. We use the Public Inquiry Report, 'black box' data, and accident and engineering reports to construct a case history of the accident. We show how to combine field data with mathematical modelling to estimate the probability that the driver observed and identified the state of the signals, and checked their status. Our methodology can explain the SPAD ('Signal Passed At Danger'), generate recommendations about signal design and placement, and provide quantitative guidance for the design of safer railway systems' speed limits and the location of signals. Practitioner Summary: Detailed ergonomic analysis of railway signals and rail infrastructure reveals problems of signal identification at this location. A record of driver eye movements measures attention, from which a quantitative model for signal placement and permitted speeds can be derived. The paper is an example of how to combine field data, basic research and mathematical modelling to solve ergonomic design problems.

  15. Bio-physical vs. Economic Uncertainty in the Analysis of Climate Change Impacts on World Agriculture

    NASA Astrophysics Data System (ADS)

    Hertel, T. W.; Lobell, D. B.

    2010-12-01

    Accumulating evidence suggests that agricultural production could be greatly affected by climate change, but there remains little quantitative understanding of how these agricultural impacts would affect economic livelihoods in poor countries. The recent paper by Hertel, Burke and Lobell (GEC, 2010) considers three scenarios of agricultural impacts of climate change, corresponding to the fifth, fiftieth, and ninety-fifth percentiles of projected yield distributions for the world’s crops in 2030. They evaluate the resulting changes in global commodity prices, national economic welfare, and the incidence of poverty in a set of 15 developing countries. Although the small price changes under the medium scenario are consistent with previous findings, their low productivity scenario reveals the potential for much larger food price changes than reported in recent studies which have hitherto focused on the most likely outcomes. The poverty impacts of price changes under the extremely adverse scenario are quite heterogeneous and very significant in some population strata. They conclude that it is critical to look beyond central case climate shocks and beyond a simple focus on yields and highly aggregated poverty impacts. In this paper, we conduct a more formal, systematic sensitivity analysis (SSA) with respect to uncertainty in the biophysical impacts of climate change on agriculture, by explicitly specifying joint distributions for global yield changes, this time focusing on 2050. This permits us to place confidence intervals on the resulting price impacts and poverty results, which reflect the uncertainty inherited from the biophysical side of the analysis. We contrast this with the economic uncertainty inherited from the global general equilibrium model (GTAP), by undertaking SSA with respect to the behavioral parameters in that model. This permits us to assess which type of uncertainty is more important for regional price and poverty outcomes.
Finally, we undertake a combined SSA, wherein climate change-induced productivity shocks are permitted to interact with the uncertain economic parameters. This permits us to examine potential interactions between the two sources of uncertainty.
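
    The systematic sensitivity analysis described above can be caricatured with a Monte Carlo toy: sample proportional yield shocks from an assumed distribution and map them through a one-commodity constant-elasticity market to a price-change distribution. Everything below (the elasticities, the shock distribution, the single-market closure) is illustrative and bears no relation to the full GTAP model:

```python
import numpy as np

rng = np.random.default_rng(42)

supply_elast, demand_elast = 0.3, -0.4
yield_shock = rng.normal(loc=-0.05, scale=0.04, size=10_000)   # -5% mean yield

# Single-market clearing: %Δprice ≈ -%Δyield / (supply_elast - demand_elast)
price_change = -yield_shock / (supply_elast - demand_elast)

# The 5th-95th percentile band is the "confidence interval on price impacts":
lo, mid, hi = np.percentile(price_change, [5, 50, 95])
print(f"price change: 5th={lo:.1%}  median={mid:.1%}  95th={hi:.1%}")
```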

  16. A Quantitative Study of Oxygen as a Metabolic Regulator

    NASA Technical Reports Server (NTRS)

    Radhakrishnan, Krishnan; LaManna, Joseph C.; Cabrera, Marco E.

    1999-01-01

    An acute reduction in oxygen (O2) delivery to a tissue is generally associated with a decrease in phosphocreatine, increases in ADP, NADH/NAD, and inorganic phosphate, increased rates of glycolysis and lactate production, and reduced rates of pyruvate and fatty acid oxidation. However, given the complexity of the human bioenergetic system and its components, it is difficult to determine quantitatively how cellular metabolic processes interact to maintain ATP homeostasis during stress (e.g., hypoxia, ischemia, and exercise). Of special interest is the determination of mechanisms relating tissue oxygenation to observed metabolic responses at the tissue, organ, and whole body levels and the quantification of how changes in tissue O2 availability affect the pathways of ATP synthesis and the metabolites that control these pathways. In this study, we extend a previously developed mathematical model of human bioenergetics to provide a physicochemical framework that permits quantitative understanding of O2 as a metabolic regulator. Specifically, the enhancement permits studying the effects of variations in tissue oxygenation and in parameters controlling the rate of cellular respiration on glycolysis, lactate production, and pyruvate oxidation. The whole body is described as a bioenergetic system consisting of metabolically distinct tissue/organ subsystems that exchange materials with the blood. In order to study the dynamic response of each subsystem to stimuli, we solve the ordinary differential equations describing the temporal evolution of metabolite levels, given the initial concentrations. The solver used in the present study is the packaged code LSODE, as implemented in the NASA Lewis kinetics and sensitivity analysis code, LSENS. A major advantage of LSENS is the efficient procedures supporting systematic sensitivity analysis, which provides the basic methods for studying parameter sensitivities (i.e., changes in model behavior due to parameter variation). 
Sensitivity analysis establishes relationships between model predictions and problem parameters (i.e., initial concentrations, rate coefficients, etc). It helps determine the effects of uncertainties or changes in these input parameters on the predictions, which ultimately are compared with experimental observations in order to validate the model. Sensitivity analysis can identify parameters that must be determined accurately because of their large effect on the model predictions and parameters that need not be known with great precision because they have little or no effect on the solution. This capability may prove to be important in optimizing the design of experiments, thereby reducing the use of animals. This approach can be applied to study the metabolic effects of reduced oxygen delivery to cardiac muscle due to local myocardial ischemia and the effects of acute hypoxia on brain metabolism. Other important applications of sensitivity analysis include identification of quantitatively relevant pathways and biochemical species within an overall mechanism, when examining the effects of a genetic anomaly or pathological state on energetic system components and whole system behavior.
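
    The sensitivity-analysis idea described above can be sketched in miniature. The following is a hedged illustration only, using a toy two-term lactate balance rather than the paper's full bioenergetic model or the LSODE/LSENS codes: a normalized sensitivity of a model output to one rate coefficient is estimated by central finite differences around a forward-Euler simulation. All rate constants and concentrations are hypothetical.

```python
# Hedged sketch: finite-difference parameter sensitivity for a toy
# one-metabolite ODE (not the paper's full bioenergetics model).
# d[lac]/dt = k_gly*[glc] - k_ox*[lac]; O2 availability scales k_ox.

def simulate(k_gly, k_ox, glc=5.0, lac0=1.0, t_end=50.0, dt=0.01):
    """Integrate lactate concentration with forward Euler."""
    lac = lac0
    for _ in range(int(t_end / dt)):
        lac += dt * (k_gly * glc - k_ox * lac)
    return lac

def sensitivity(k_gly, k_ox, rel=1e-4):
    """Normalized sensitivity d(ln lac)/d(ln k_ox) by central difference."""
    h = k_ox * rel
    hi = simulate(k_gly, k_ox + h)
    lo = simulate(k_gly, k_ox - h)
    base = simulate(k_gly, k_ox)
    return (hi - lo) / (2 * h) * (k_ox / base)

s = sensitivity(0.2, 0.5)
print(round(s, 3))  # near -1: steady-state lactate varies inversely with k_ox
```

At steady state the toy model gives lac = k_gly·glc/k_ox, so the normalized sensitivity is exactly -1; a large-magnitude sensitivity like this is what flags a parameter as one that "must be determined accurately."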

  17. High-definition optical coherence tomography intrinsic skin ageing assessment in women: a pilot study.

    PubMed

    Boone, M A L M; Suppa, M; Marneffe, A; Miyamoto, M; Jemec, G B E; Del Marmol, V

    2015-10-01

    Several non-invasive two-dimensional techniques with different lateral resolution and measurable depth range have proved useful in assessing and quantifying morphological changes in skin ageing. Among these, only in vivo microscopy techniques permit histometric measurements in vivo. Qualitative and quantitative assessment of chronological (intrinsic) age-related (IAR) morphological changes of the epidermis, dermo-epidermal junction (DEJ), papillary dermis (PD), papillary-reticular dermis junction and reticular dermis (RD) was performed by high-definition optical coherence tomography (HD-OCT) in real-time 3-D. HD-OCT images were taken at the internal site of the right upper arm. Qualitative HD-OCT IAR descriptors were reported at the skin surface, epidermal layer, DEJ, PD and upper RD. Quantitative evaluation of age-related compaction and of the backscattered intensity (brightness) of the different skin layers was performed using the "plot z-axis profile" plugin of ImageJ® software, which permits intensity assessment of HD-OCT (DICOM) 3-D images. Analysis was blinded to all clinical information. Sixty fair-skinned (Fitzpatrick types I-III) healthy females were analysed retrospectively in this study. The subjects belonged to three age groups: twenty in group I aged 20-39, twenty in group II aged 40-59 and twenty in group III aged 60-79. Only intrinsic ageing in women was studied. Significant age-related qualitative and quantitative differences could be observed. IAR changes in dermal matrix fibre morphology/organisation and in the microvasculature were observed. The brightness and compaction of the different skin layers increased significantly with intrinsic skin ageing. The depth of visibility of fibres in the RD increased significantly in the older age group.
In conclusion, HD-OCT allows 3-D, in vivo, real-time qualitative and quantitative assessment of chronological (intrinsic) age-related morphological skin changes at high resolution, from the skin surface to the depth of the superficial reticular dermis.

  18. Modelisation and distribution of neutron flux in radium-beryllium source (226Ra-Be)

    NASA Astrophysics Data System (ADS)

    Didi, Abdessamad; Dadouch, Ahmed; Jai, Otman

    2017-09-01

    The Monte Carlo N-Particle code (MCNP-6) was used to analyze the thermal, epithermal and fast neutron fluxes of a 3-millicurie radium-beryllium source, with the aim of determining the qualitative and quantitative composition of many materials by the method of neutron activation analysis. The radium-beryllium neutron source is well established for practical work and research in the nuclear field. The main objective of this work was to characterize the flux profile of radium-beryllium irradiation; this theoretical study permits discussion of the optimal irradiation design and performance, enhancing the facility for research and education in nuclear physics.

  19. Quantitative polarized light microscopy using spectral multiplexing interferometry.

    PubMed

    Li, Chengshuai; Zhu, Yizheng

    2015-06-01

    We propose an interferometric spectral multiplexing method for measuring birefringent specimens with a simple configuration and high sensitivity. The retardation and orientation of sample birefringence are simultaneously encoded onto two spectral carrier waves, generated interferometrically by a birefringent crystal through polarization mixing. A single interference spectrum hence contains sufficient information for birefringence determination, eliminating the need for mechanical rotation or electrical modulation. The technique is analyzed theoretically and validated experimentally on cellulose film. The system's simplicity also permits mitigation of the system birefringence background. Further analysis demonstrates the technique's exquisite sensitivity, as high as ∼20 pm for retardation measurement.

  20. Quantitation of fixative-induced morphologic and antigenic variation in mouse and human breast cancers

    PubMed Central

    Cardiff, Robert D; Hubbard, Neil E; Engelberg, Jesse A; Munn, Robert J; Miller, Claramae H; Walls, Judith E; Chen, Jane Q; Velásquez-García, Héctor A; Galvez, Jose J; Bell, Katie J; Beckett, Laurel A; Li, Yue-Ju; Borowsky, Alexander D

    2013-01-01

    Quantitative Image Analysis (QIA) of digitized whole slide images for morphometric parameters and immunohistochemistry of breast cancer antigens was used to evaluate the technical reproducibility, biological variability, and intratumoral heterogeneity in three transplantable mouse mammary tumor models of human breast cancer. The relative preservation of structure and immunogenicity of the three mouse models and three human breast cancers was also compared when fixed with representatives of four distinct classes of fixatives. The three mouse mammary tumor cell models were an ER+/PR+ model (SSM2), a Her2+ model (NDL), and a triple negative model (MET1). The four breast cancer antigens were ER, PR, Her2, and Ki67. The fixatives included examples of (1) strong cross-linkers, (2) weak cross-linkers, (3) coagulants, and (4) combination fixatives. Each parameter was quantitatively analyzed using modified Aperio Technologies ImageScope algorithms. Careful pre-analytical adjustments to the algorithms were required to provide accurate results. The QIA permitted rigorous statistical analysis of results and grading by rank order. The analyses suggested excellent technical reproducibility and confirmed biological heterogeneity within each tumor. The strong cross-linker fixatives, such as formalin, consistently ranked higher than weak cross-linker, coagulant and combination fixatives in both the morphometric and immunohistochemical parameters. PMID:23399853

  1. TROSY-based z-exchange spectroscopy: application to the determination of the activation energy for intermolecular protein translocation between specific sites on different DNA molecules.

    PubMed

    Sahu, Debashish; Clore, G Marius; Iwahara, Junji

    2007-10-31

    A two-dimensional TROSY-based z-exchange 1H-15N correlation experiment for the quantitative analysis of kinetic processes in the slow exchange regime is presented. The pulse scheme converts the product operator terms Nz into 2NzHz and 2NzHz into -Nz in the middle of the z-mixing period, thereby suppressing the buildup of spurious semi-TROSY peaks arising from the different relaxation rates for the Nz and 2NzHz terms and simplifying the behavior of longitudinal magnetization for an exchanging system during the mixing period. Theoretical considerations and experimental data demonstrate that the TROSY-based z-exchange experiment permits quantitative determination of rate constants using the same procedure as that for the conventional non-TROSY 15Nz-exchange experiment. Line narrowing as a consequence of the use of the TROSY principle makes the method particularly suitable for kinetic studies at low temperature, thereby permitting activation energies to be extracted from data acquired over a wider temperature range. We applied this method to the investigation of the process whereby the HoxD9 homeodomain translocates between specific target sites on different DNA molecules via a direct transfer mechanism without going through the intermediary of free protein. The activation enthalpy for intermolecular translocation was determined to be 17 kcal/mol.
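
    The activation-energy extraction mentioned above amounts to an Arrhenius-type linear fit of ln k against 1/T over the widened temperature range. The sketch below is illustrative only: the rate constants are synthetic (generated from a 17 kcal/mol barrier to echo the reported value), not the paper's measured HoxD9 data.

```python
# Hedged sketch: extracting an activation energy from exchange rate
# constants measured at several temperatures via a least-squares fit
# of ln k vs 1/T (slope = -Ea/R). Rate values are synthetic.
import math

R = 1.987e-3  # gas constant, kcal/(mol*K)

def activation_energy(temps_K, rates):
    """Least-squares slope of ln k vs 1/T; returns Ea in kcal/mol."""
    xs = [1.0 / T for T in temps_K]
    ys = [math.log(k) for k in rates]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return -slope * R

# Synthetic data generated with Ea = 17 kcal/mol for illustration.
temps = [278.0, 288.0, 298.0, 308.0]
rates = [1e9 * math.exp(-17.0 / (R * T)) for T in temps]
print(round(activation_energy(temps, rates), 2))  # 17.0
```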

  2. 75 FR 73000 - Corporate Credit Unions

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-29

    ... suitable only when it: Is free from bias; Permits reasonably consistent qualitative and quantitative... injections and launched liquidity and share guarantee programs designed to stabilize the corporate system and...

  3. [The quantitative testing of the V617F mutation in the JAK2 gene using the pyrosequencing technique].

    PubMed

    Dunaeva, E A; Mironov, K O; Dribnokhodova, T E; Subbotina, E E; Bashmakova; Ol'hovskiĭ, I A; Shipulin, G A

    2014-11-01

    The somatic V617F mutation in the JAK2 gene is a frequent cause of BCR/ABL-negative chronic myeloproliferative diseases. Quantitative testing of the relative percentage of the mutant allele can be used to establish disease severity and prognosis and to guide prescription of drugs inhibiting JAK2 activity. The pyrosequencing technique was applied for quantitative testing of the mutation. The developed technique permits detection and quantitative measurement of the mutant fraction from 7% upward. The "gray zone" comprises samples with a mutant-allele percentage from 4% to 7%. The dependence of the expected percentage of the mutant fraction in an analyzed sample on the observed signal value is described by a linear equation with regression coefficients y = -0.97 and x = -1.32, with a measurement uncertainty of ±0.7. The technique was validated on clinical material from 192 patients with the main forms of BCR/ABL-negative myeloproliferative diseases. Sixty-four samples were detected with a mutant-fraction percentage from 13% to 91%. The developed technique permits monitoring of therapy for myeloproliferative diseases and helps optimize treatment tactics.
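
    The reporting logic described above (a linear signal-to-fraction calibration plus 4%/7% decision thresholds) can be sketched as follows. This is a hedged illustration only: the slope and intercept are hypothetical placeholders, not the paper's actual regression fit, whose coefficient notation is ambiguous in the abstract.

```python
# Hedged sketch: applying a linear calibration to convert an observed
# pyrosequencing signal into an estimated mutant-allele fraction, then
# classifying it against the assay's reporting thresholds. SLOPE and
# INTERCEPT are hypothetical, not the paper's fitted coefficients.

SLOPE, INTERCEPT = 0.97, -1.32   # hypothetical calibration coefficients

def mutant_fraction(signal_pct):
    """Map an observed signal (%) to an estimated mutant fraction (%)."""
    return SLOPE * signal_pct + INTERCEPT

def classify(fraction_pct):
    """Apply the 4%/7% decision thresholds described in the abstract."""
    if fraction_pct >= 7.0:
        return "mutation detected"
    if fraction_pct >= 4.0:
        return "gray zone"
    return "below detection limit"

est = mutant_fraction(15.0)
print(round(est, 2), classify(est))
```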

  4. Method for quantitative determination and separation of trace amounts of chemical elements in the presence of large quantities of other elements having the same atomic mass

    DOEpatents

    Miller, C.M.; Nogar, N.S.

    1982-09-02

    Photoionization via autoionizing atomic levels, combined with conventional mass spectroscopy, provides a technique for quantitative analysis of trace quantities of chemical elements in the presence of much larger amounts of other elements with substantially the same atomic mass. Ytterbium samples smaller than 10 ng have been detected using an ArF* excimer laser which provides the atomic ions for a time-of-flight mass spectrometer. Elemental selectivity of greater than 5:1 with respect to lutetium impurity has been obtained. Autoionization via a single-photon process permits greater photon-utilization efficiency, because its absorption cross section is greater than that of bound-free transitions, while maintaining sufficient spectroscopic structure to allow significant photoionization selectivity between different atomic species. Separation of atomic species from others of substantially the same atomic mass is also described.

  5. Inertial Sensor-Based Motion Analysis of Lower Limbs for Rehabilitation Treatments

    PubMed Central

    Sun, Tongyang; Duan, Lihong; Wang, Yulong

    2017-01-01

    Diagnosis of the hemiplegic rehabilitation state by therapists can be biased by their subjective experience, which may deteriorate the rehabilitation effect. In order to improve this situation, a quantitative evaluation is proposed. Though many motion analysis systems are available, they are too complicated for practical application by therapists. In this paper, a method for detecting the motion of human lower limbs, including all degrees of freedom (DOFs), via inertial sensors is proposed, which permits analysis of the patient's motion ability. This method is applicable to arbitrary walking directions and tracks of the persons under study, and its results are unbiased compared with therapists' qualitative estimations. Using a simplified mathematical model of the human body, the rotation angles for each lower limb joint are calculated from the input signals acquired by the inertial sensors. Finally, the rotation angle versus joint displacement curves are constructed, and estimated values of joint motion angle and motion ability are obtained. Experimental verification of the proposed motion detection and analysis method was performed, which proved that it can efficiently detect the differences between the motion behaviors of disabled and healthy persons and provide a reliable quantitative evaluation of the rehabilitation state. PMID:29065575
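
    A minimal sketch of the angle-from-inertial-data step described above, under strong simplifying assumptions: a single joint axis, a clean gyroscope trace, and no drift correction or sensor fusion (all of which a real system like the paper's would need). The angular-velocity trace is synthetic.

```python
# Hedged sketch: recovering a joint rotation angle by integrating a
# gyroscope's angular velocity over time; a stand-in for the paper's
# full multi-DOF model. The angular-velocity samples are synthetic.
import math

def integrate_angle(omega, dt):
    """Trapezoidal integration of angular velocity (rad/s) to angle (rad)."""
    angle = 0.0
    for w0, w1 in zip(omega, omega[1:]):
        angle += 0.5 * (w0 + w1) * dt
    return angle

# Synthetic knee swing: half-sine angular-velocity profile over 1 s,
# chosen so the net rotation is exactly 1 rad.
dt = 0.01
omega = [0.5 * math.pi * math.sin(math.pi * k * dt) for k in range(101)]
print(round(integrate_angle(omega, dt), 3))  # ~1.0 rad net rotation
```

In practice drift makes pure integration unusable over long walks, which is why inertial systems combine gyroscope, accelerometer, and magnetometer data.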

  6. Video fluoroscopic techniques for the study of Oral Food Processing

    PubMed Central

    Matsuo, Koichiro; Palmer, Jeffrey B.

    2016-01-01

    Food oral processing and pharyngeal food passage cannot be observed directly from outside the body without instrumental methods. Videofluoroscopy (x-ray video recording) reveals the movement of oropharyngeal anatomical structures in two dimensions. By adding a radiopaque contrast medium, the motion and shape of the food bolus can also be visualized, providing critical information about the mechanisms of eating, drinking, and swallowing. For quantitative analysis of the kinematics of oral food processing, radiopaque markers are attached to the teeth, tongue or soft palate. This approach permits kinematic analysis with a variety of textures and consistencies, both solid and liquid. Fundamental mechanisms of food oral processing are clearly observed with videofluoroscopy in lateral and anteroposterior projections. PMID:27213138

  7. A Taylor weak-statement algorithm for hyperbolic conservation laws

    NASA Technical Reports Server (NTRS)

    Baker, A. J.; Kim, J. W.

    1987-01-01

    Finite element analysis, applied to computational fluid dynamics (CFD) problem classes, presents a formal procedure for establishing the ingredients of a discrete approximation numerical solution algorithm. A classical Galerkin weak-statement formulation, formed on a Taylor series extension of the conservation law system, is developed herein that embeds a set of parameters eligible for constraint according to specification of suitable norms. The derived family of Taylor weak statements is shown to contain, as special cases, over one dozen independently derived CFD algorithms published over the past several decades for the high speed flow problem class. A theoretical analysis is completed that facilitates direct qualitative comparisons. Numerical results for definitive linear and nonlinear test problems permit direct quantitative performance comparisons.

  8. Analysis and Evaluation of Processes and Equipment in Tasks 2 and 4 of the Low-cost Solar Array Project

    NASA Technical Reports Server (NTRS)

    Wolf, M.

    1979-01-01

    To facilitate the task of objectively comparing competing process options, a methodology was needed for the quantitative evaluation of their relative cost effectiveness. Such a methodology was developed and is described, together with three examples for its application. The criterion for the evaluation is the cost of the energy produced by the system. The method permits the evaluation of competing design options for subsystems, based on the differences in cost and efficiency of the subsystems, assuming comparable reliability and service life, or of competing manufacturing process options for such subsystems, which include solar cells or modules. This process option analysis is based on differences in cost, yield, and conversion efficiency contribution of the process steps considered.

  9. Analysis of mutational spectra by denaturant capillary electrophoresis

    PubMed Central

    Ekstrøm, Per O.; Khrapko, Konstantin; Li-Sucholeiki, Xiao-Cheng; Hunter, Ian W.; Thilly, William G.

    2009-01-01

    The numbers and kinds of point mutants within DNA from cells, tissues and human populations may be discovered for nearly any 75-250 bp DNA sequence. High-fidelity DNA amplification incorporating a thermally stable DNA "clamp" is followed by separation by denaturing capillary electrophoresis (DCE). DCE allows for peak collection and verification sequencing. DCE in a cycling-temperature mode, e.g. ±5°C (CyDCE), permits high resolution of mutant sequences using computer-defined analytes without preliminary optimization experiments. DNA sequencers have been modified to permit higher-throughput CyDCE, and a massively parallel ~25,000-capillary system has been designed for pangenomic scans in large human populations. DCE has been used to define quantitative point mutational spectra for the study of a wide variety of genetic phenomena: errors of DNA polymerases, mutations induced in human cells by chemicals and irradiation, testing of human gene-common disease associations, and the discovery of the origins of point mutations in human development and carcinogenesis. PMID:18600220

  10. Modelling the structure of sludge aggregates

    PubMed Central

    Smoczyński, Lech; Ratnaweera, Harsha; Kosobucka, Marta; Smoczyński, Michał; Kalinowski, Sławomir; Kvaal, Knut

    2016-01-01

    The structure of sludge is closely associated with the process of wastewater treatment. Synthetic dyestuff wastewater and sewage were coagulated using the PAX and PIX methods, and electro-coagulated on aluminium electrodes. The wastewater treatment processes were supported with an organic polymer. Images of the surface structures of the investigated sludge were obtained using scanning electron microscopy (SEM). Software image analysis permitted plotting log A vs. log P, where A is the surface area and P is the perimeter, for the individual objects comprising the structure of the sludge. The resulting database confirmed the 'self-similarity' of the structural objects in the studied groups of sludge, which enabled calculation of their fractal dimension and the proposal of models for these objects. A quantitative description of the sludge aggregates permitted proposing a mechanism for the processes responsible for their formation. The paper also discusses the impact of the structure of the investigated sludge on sedimentation and on dehydration of the thickened sludge after sedimentation. PMID:26549812
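
    The fractal-dimension step described above can be sketched as a log-log regression. This is a hedged illustration with synthetic area/perimeter pairs, not the paper's SEM data; it assumes the standard area-perimeter relation for self-similar boundaries, P ∝ A^(D/2).

```python
# Hedged sketch: estimating a perimeter-based fractal dimension from
# the slope of log P vs. log A over a set of aggregate objects, as in
# the abstract's image-analysis step. Areas and perimeters are synthetic.
import math

def fractal_dimension(areas, perims):
    """Fit log P = (D/2) * log A + c by least squares; return D."""
    xs = [math.log(a) for a in areas]
    ys = [math.log(p) for p in perims]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return 2.0 * slope

# Synthetic self-similar objects generated with D = 1.4: P ∝ A**(D/2).
areas = [10.0, 50.0, 200.0, 1000.0]
perims = [a ** 0.7 for a in areas]
print(round(fractal_dimension(areas, perims), 2))  # 1.4
```

A smooth (Euclidean) boundary gives D ≈ 1; rougher, more convoluted aggregate boundaries push D toward 2.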

  11. Quantitative Analyses of the Modes of Deformation in Engineering Thermoplastics

    NASA Astrophysics Data System (ADS)

    Landes, B. G.; Bubeck, R. A.; Scott, R. L.; Heaney, M. D.

    1998-03-01

    Synchrotron-based real-time small-angle X-ray scattering (RTSAXS) studies have been performed on rubber-toughened engineering thermoplastics with amorphous and semi-crystalline matrices. Scattering patterns measured at successive 3-ms time intervals were analyzed to determine the plastic strain due to crazing. Simultaneous measurement of the absorption of the primary beam by the sample permits the total plastic strain to be computed concurrently. The plastic strain due to other deformation mechanisms (e.g., particle cavitation and macroscopic shear yield) can be determined from the difference between the total and craze-derived plastic strains. The contribution from macroscopic shear deformation can be determined from video-based optical data measured simultaneously with the X-ray data. These types of time-resolved experiments generate prodigious quantities of data, the analysis of which can considerably delay the determination of key results. A newly developed software package that runs in Windows® 95 permits rapid analysis of the relative contributions of the deformation modes from these time-resolved experiments. Examples of using these techniques on ABS-type and QUESTRA® syndiotactic polystyrene engineering resins will be given.

  12. A programmable light engine for quantitative single molecule TIRF and HILO imaging.

    PubMed

    van 't Hoff, Marcel; de Sars, Vincent; Oheim, Martin

    2008-10-27

    We report on a simple yet powerful implementation of objective-type total internal reflection fluorescence (TIRF) and highly inclined and laminated optical sheet (HILO, a type of dark-field) illumination. Instead of focusing the illuminating laser beam to a single spot close to the edge of the microscope objective, we scan the focused spot in a circular orbit during the acquisition of a fluorescence image, thereby illuminating the sample from various directions. We measure parameters relevant for quantitative image analysis during fluorescence image acquisition by capturing an image of the excitation light distribution in an equivalent objective back focal plane (BFP). Operating at scan rates above 1 MHz, our programmable light engine allows directional averaging by circularly spinning the spot, even for sub-millisecond exposure times. We show that restoring the symmetry of TIRF/HILO illumination reduces scattering and produces an evenly lit field-of-view that affords on-line analysis of evanescent-field-excited fluorescence without pre-processing. Utilizing crossed acousto-optical deflectors, our device generates arbitrary intensity profiles in the BFP, permitting variable-angle, multi-color illumination, or objective lenses to be rapidly exchanged.

  13. Combinatorial peptide libraries and biometric score matrices permit the quantitative analysis of specific and degenerate interactions between clonotypic TCR and MHC peptide ligands.

    PubMed

    Zhao, Y; Gran, B; Pinilla, C; Markovic-Plese, S; Hemmer, B; Tzou, A; Whitney, L W; Biddison, W E; Martin, R; Simon, R

    2001-08-15

    The interaction of TCRs with MHC peptide ligands can be highly flexible, so that many different peptides are recognized by the same TCR in the context of a single restriction element. We provide a quantitative description of such interactions, which allows the identification of T cell epitopes and molecular mimics. The response of T cell clones to positional scanning synthetic combinatorial libraries is analyzed with a mathematical approach that is based on a model of independent contribution of individual amino acids to peptide Ag recognition. This biometric analysis compares the information derived from these libraries composed of trillions of decapeptides with all the millions of decapeptides contained in a protein database to rank and predict the most stimulatory peptides for a given T cell clone. We demonstrate the predictive power of the novel strategy and show that, together with gene expression profiling by cDNA microarrays, it leads to the identification of novel candidate autoantigens in the inflammatory autoimmune disease, multiple sclerosis.
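
    The independent-contribution model underlying the biometric matrix described above can be sketched directly: a peptide's predicted stimulatory score is the sum of per-position amino-acid weights, and a protein database is ranked by that score. The 3-mer alphabet and all weights below are made-up toy values, not the paper's decapeptide matrices.

```python
# Hedged sketch: the independent-contribution (positionally additive)
# scoring model behind biometric matrix analysis. A peptide's score is
# the sum of per-position amino-acid weights; candidate epitopes are
# ranked by score. Matrix values are hypothetical toy numbers.

# score_matrix[position][amino_acid] -> positional contribution.
score_matrix = [
    {"A": 1.2, "G": 0.1, "L": 0.5},
    {"A": 0.3, "G": 0.9, "L": 0.2},
    {"A": 0.1, "G": 0.4, "L": 1.5},
]

def peptide_score(peptide):
    """Sum independent positional contributions."""
    return sum(score_matrix[i][aa] for i, aa in enumerate(peptide))

def rank_database(peptides):
    """Rank candidate epitopes by predicted score, best first."""
    return sorted(peptides, key=peptide_score, reverse=True)

db = ["AGL", "GGA", "LAL", "AAA"]
print(rank_database(db))  # → ['AGL', 'LAL', 'AAA', 'GGA']
```

Additivity is what makes the approach tractable: trillions of decapeptides never need to be tested individually, because any sequence can be scored from a small table of positional responses.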

  14. 78 FR 7447 - Endangered Species; Receipt of Applications for Permit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-01

    ...) Those supported by quantitative information or studies; and (2) Those that include citations to, and... Tapiridae Ursidae Accipitridae Anatidae (does not include Hawaiian duck or Hawaiian goose) Bucerotidae...

  15. Analysis of spectra using correlation functions

    NASA Technical Reports Server (NTRS)

    Beer, Reinhard; Norton, Robert H.

    1988-01-01

    A novel method is presented for the quantitative analysis of spectra based on the properties of the cross correlation between a real spectrum and either a numerical synthesis or laboratory simulation. A new goodness-of-fit criterion called the heteromorphic coefficient H is proposed that has the property of being zero when a fit is achieved and varying smoothly through zero as the iteration proceeds, providing a powerful tool for automatic or near-automatic analysis. It is also shown that H can be rendered substantially noise-immune, permitting the analysis of very weak spectra well below the apparent noise level and, as a byproduct, providing Doppler shift and radial velocity information with excellent precision. The technique is in regular use in the Atmospheric Trace Molecule Spectroscopy (ATMOS) project and operates in an interactive, realtime computing environment with turn-around times of a few seconds or less.
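
    One ingredient of the correlation-based analysis described above, the Doppler-shift estimate, can be sketched as locating the peak of the cross-correlation between an observed spectrum and a synthetic template. This is a hedged, simplified illustration (integer-sample shifts, synthetic Gaussian lines), not the ATMOS heteromorphic-coefficient algorithm itself.

```python
# Hedged sketch: estimating a spectral (Doppler) shift by finding the
# lag that maximizes the cross-correlation between an observed spectrum
# and a synthetic template. Both spectra are synthetic Gaussian lines.
import math

def cross_corr_shift(obs, tmpl, max_lag):
    """Return the integer lag maximizing the cross-correlation."""
    best_lag, best_val = 0, float("-inf")
    n = len(obs)
    for lag in range(-max_lag, max_lag + 1):
        v = sum(obs[i] * tmpl[i - lag]
                for i in range(max(0, lag), min(n, n + lag)))
        if v > best_val:
            best_lag, best_val = lag, v
    return best_lag

line = lambda c: [math.exp(-((i - c) ** 2) / 8.0) for i in range(200)]
template = line(100.0)
observed = line(107.0)  # same line, shifted by 7 samples
print(cross_corr_shift(observed, template, 20))  # 7
```

Because the correlation sums over the whole band, the peak location is far more noise-tolerant than fitting any single line, which is the property the abstract exploits for weak spectra.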

  16. 75 FR 22162 - Receipt of Applications for Permit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-27

    ... agency decisions are: (1) Those supported by quantitative information or studies; and (2) Those that... to import the sport- hunted trophy of one male bontebok (Damaliscus pygargus pygargus) culled from a...

  17. Analysis of molecular assemblies by flow cytometry: determinants of Gαi1 and βγ binding

    NASA Astrophysics Data System (ADS)

    Sarvazyan, Noune A.; Neubig, Richard R.

    1998-05-01

    We report here a novel application of flow cytometry for the quantitative analysis of the high-affinity interaction between membrane proteins, both in detergent solutions and when reconstituted into lipid vesicles. The approach is further advanced to permit the analysis of binding to expressed protein complexes in native cell membranes. The heterotrimeric G protein links extracellularly activated transmembrane receptors to intracellular effectors in signal transduction. Upon activation, the α and βγ subunits of the G protein undergo a dissociation/association cycle at the cell membrane interface. The binding parameters of solubilized G protein α and βγ subunits have been defined, but little is known quantitatively about their interactions in the membrane. Using a novel flow cytometry approach, the binding of low nanomolar concentrations of fluorescein-labeled Gαi1 (F-α) to βγ, both in detergent solution and in a lipid environment, was quantitatively compared. Unlabeled βγ reconstituted in biotinylated phospholipid vesicles bound F-α tightly (Kd 6-12 nM), while the affinity for biotinylated βγ in Lubrol was even higher (Kd of 2.9 nM). The application of this approach to proteins expressed in native cell membranes will advance our understanding of G protein function in the context of receptor and effector interactions. More generally, this approach can be applied to study the interaction of any fluorescently labeled protein with a membrane protein that can be expressed in Sf9 plasma membranes.
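
    The reported Kd values come from fitting saturation binding data to a one-site model, bound = Bmax·[L]/(Kd + [L]). The sketch below is a hedged stand-in: noiseless synthetic data generated with Kd = 6 nM (echoing the vesicle result) and a simple grid search instead of the nonlinear least-squares fit an actual analysis would use.

```python
# Hedged sketch: recovering a dissociation constant from one-site
# saturation binding data, bound = Bmax*[L]/(Kd + [L]), via a simple
# grid search. Data are synthetic, generated with Kd = 6 nM.

def bound(conc, bmax, kd):
    return bmax * conc / (kd + conc)

concs = [0.5, 1, 2, 4, 8, 16, 32, 64]          # nM free ligand
data = [bound(c, 1.0, 6.0) for c in concs]     # noiseless synthetic data

def fit_kd(concs, data, bmax=1.0):
    """Grid-search Kd (0.01-20 nM) minimizing squared residuals."""
    best_kd, best_sse = None, float("inf")
    for i in range(1, 2001):
        kd = i / 100.0
        sse = sum((bound(c, bmax, kd) - y) ** 2
                  for c, y in zip(concs, data))
        if sse < best_sse:
            best_kd, best_sse = kd, sse
    return best_kd

print(fit_kd(concs, data))  # 6.0
```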

  18. Detection, monitoring, and quantitative analysis of wildfires with the BIRD satellite

    NASA Astrophysics Data System (ADS)

    Oertel, Dieter A.; Briess, Klaus; Lorenz, Eckehard; Skrbek, Wolfgang; Zhukov, Boris

    2004-02-01

    Increasing concern about the environment and interest in avoiding losses have led to growing demands on spaceborne fire detection, monitoring and quantitative parameter estimation of wildfires. The global change research community intends to quantify the amount of gaseous and particulate matter emitted from vegetation fires, peat fires and coal seam fires. The DLR Institute of Space Sensor Technology and Planetary Exploration (Berlin-Adlershof) developed a small satellite called BIRD (Bi-spectral Infrared Detection), which carries a sensor package specially designed for fire detection. BIRD was launched as a piggy-back satellite on October 22, 2001 with ISRO's Polar Satellite Launch Vehicle (PSLV). It circles the Earth in a polar, sun-synchronous orbit at an altitude of 572 km and provides unique data for detailed analysis of high-temperature events on the Earth's surface. The BIRD sensor package is dedicated to high-resolution and reliable fire recognition; active fire analysis is possible in the sub-pixel domain. The leading channel for fire detection and monitoring is the MIR channel at 3.8 μm. The rejection of false alarms is based on procedures using MIR/NIR (Middle Infrared/Near Infrared) and MIR/TIR (Middle Infrared/Thermal Infrared) radiance ratio thresholds. Unique results of BIRD wildfire detection and analysis over fire-prone regions in Australia and Asia will be presented. BIRD successfully demonstrates innovative fire recognition technology for small satellites, permitting retrieval of quantitative characteristics of actively burning wildfires, such as the equivalent fire temperature, fire area, radiative energy release, fire front length and fire front strength.
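
    The radiance-ratio false-alarm rejection described above reduces to simple threshold tests on a hot-pixel candidate. The sketch below is illustrative only: the threshold values and radiances are made up, not BIRD's operational settings.

```python
# Hedged sketch: ratio-threshold false-alarm rejection. A hot-spot
# candidate is confirmed only if its MIR/NIR and MIR/TIR radiance
# ratios both exceed thresholds. Values are illustrative, not BIRD's.

MIR_NIR_MIN = 2.0   # illustrative thresholds
MIR_TIR_MIN = 1.5

def is_fire(mir, nir, tir):
    """Confirm a hot-spot candidate via radiance-ratio tests."""
    return (mir / nir) > MIR_NIR_MIN and (mir / tir) > MIR_TIR_MIN

# Sun glint is bright in NIR, so the MIR/NIR test rejects it.
print(is_fire(mir=8.0, nir=1.0, tir=3.0))   # True: fire-like
print(is_fire(mir=8.0, nir=6.0, tir=3.0))   # False: glint-like
```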

  19. 78 FR 73877 - Endangered Species; Marine Mammals; Receipt of Applications for Permit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-09

    ... decisions are: (1) Those supported by quantitative information or studies; and (2) Those that include... (does not include Hawaiian goose or Hawaiian duck) Cathartidae Gruidae Sturnidae (does not include...

  20. 33 CFR 325.1 - Applications for permits.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... assure that the potential applicant is fully aware of the substance (both quantitative and qualitative... Protection, Research and Sanctuaries Act of 1972, as amended, and sections 9 and 10 of the Rivers and Harbors...

  1. 33 CFR 325.1 - Applications for permits.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... assure that the potential applicant is fully aware of the substance (both quantitative and qualitative... Protection, Research and Sanctuaries Act of 1972, as amended, and sections 9 and 10 of the Rivers and Harbors...

  2. 33 CFR 325.1 - Applications for permits.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... assure that the potential applicant is fully aware of the substance (both quantitative and qualitative... Protection, Research and Sanctuaries Act of 1972, as amended, and sections 9 and 10 of the Rivers and Harbors...

  3. 33 CFR 325.1 - Applications for permits.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... assure that the potential applicant is fully aware of the substance (both quantitative and qualitative... Protection, Research and Sanctuaries Act of 1972, as amended, and sections 9 and 10 of the Rivers and Harbors...

  4. 33 CFR 325.1 - Applications for permits.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... assure that the potential applicant is fully aware of the substance (both quantitative and qualitative... Protection, Research and Sanctuaries Act of 1972, as amended, and sections 9 and 10 of the Rivers and Harbors...

  5. 76 FR 54480 - Endangered Species; Receipt of Applications for Permit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-01

    ... quantitative information or studies; and (2) Those that include citations to, and analyses of, the applicable... sport- hunted trophy of one male bontebok (Damaliscus pygargus pygargus) culled from a captive herd...

  6. A synthesis theory for self-oscillating adaptive systems /SOAS/

    NASA Technical Reports Server (NTRS)

    Horowitz, I.; Smay, J.; Shapiro, A.

    1974-01-01

    A quantitative synthesis theory is presented for the Self-Oscillating Adaptive System (SOAS), whose nonlinear element has a static, odd character with hard saturation. The synthesis theory is based upon the quasilinear properties of the SOAS response to forced inputs, which permits the extension of quantitative linear feedback theory to the SOAS. A reasonable definition of optimum design is shown to be the minimization of the limit cycle frequency. The great advantage of the SOAS is its zero sensitivity to pure gain changes. However, quasilinearity and control of the limit cycle amplitude at the system output impose additional constraints which partially or completely cancel this advantage, depending on the numerical values of the design parameters. By means of narrow-band filtering, an additional factor is introduced which permits a trade-off between filter complexity and limit cycle frequency minimization.

  7. Laser ablation-inductively coupled plasma mass spectrometry for the characterization of pigments in prehistoric rock art.

    PubMed

    Resano, Martin; García-Ruiz, Esperanza; Alloza, Ramiro; Marzo, Maria P; Vandenabeele, Peter; Vanhaecke, Frank

    2007-12-01

    In this work, several red-colored paintings of post-Paleolithic schematic style found in 10 different shelters in the vicinity of the Vero River (Huesca) were sampled and subjected to analysis by means of scanning electron microscopy-energy-dispersive X-ray spectrometry (SEM-EDX), Raman spectroscopy, and laser ablation-inductively coupled plasma mass spectrometry (LA-ICPMS). The goal of this research was to obtain meaningful information on the samples' composition, in order to establish differences or similarities among them. The combined use of these techniques proved beneficial: the Raman data permitted structural information to be obtained on the compounds present (hematite was identified as the main pigment, whereas calcite and gypsum are the main components of the substrate layer, as well as of the accretions that covered the pigments), while the quantitative values obtained by SEM made it possible to use Ca as an internal reference during LA-ICPMS analysis. However, it was this latter technique that provided the most relevant data for fingerprinting purposes. Its potential for obtaining spatially resolved information allowed multielement quantitative analysis of the pigment layer, in spite of the presence of superficial accretions. The sensitivity of the technique permitted the determination of more than 40 elements spanning a wide concentration range (from the microgram-per-gram to the 10% level) with minimum sample consumption (approximately 900 ng per sample, corresponding to five replicates). Finally, in order to establish significant differences, only those elements showing a high correlation with Fe (As, Co, Mo, Sb, Tl, and Zr, in this case) were selected, as these are expected to have been truly present in the original pigment, while others could have migrated into the pigment layer over time. Using this information, it seems feasible to discriminate between various paint pots, as demonstrated for the samples under investigation.

  8. Detection of aspen/conifer forest mixes from multitemporal LANDSAT digital data. [Bear River Range, Rocky Mountains

    NASA Technical Reports Server (NTRS)

    Merola, J. A.; Jaynes, R. A.; Harniss, R. O.

    1983-01-01

    Aspen, conifer and mixed aspen/conifer forests were mapped for a 15-quadrangle study area in the Utah-Idaho Bear River Range using LANDSAT multispectral scanner (MSS) data. The digital MSS data were utilized to devise quantitative indices which correlate with apparently stable and seral aspen forests. The extent to which a two-date LANDSAT MSS analysis may permit the delineation of different categories of aspen/conifer forest mix was explored. Multitemporal analyses of MSS data led to the identification of early, early to mid, mid to late, and late seral stages of aspen/conifer forest mixing.

  9. [The procedure for documentation of digital images in forensic medical histology].

    PubMed

    Putintsev, V A; Bogomolov, D V; Fedulova, M V; Gribunov, Iu P; Kul'bitskiĭ, B N

    2012-01-01

    This paper is devoted to the novel computer technologies employed in the study of histological preparations. These technologies make it possible to visualize digital images, structure the data obtained, and store the results in computer memory. The authors emphasize the necessity of properly documenting digital images obtained during forensic-histological studies and propose a procedure for preparing electronic documents in conformity with the relevant technical and legal requirements. It is concluded that the use of digital images as a new study object makes it possible to obviate the drawbacks inherent in working with traditional preparations and to pass from descriptive microscopy to quantitative analysis.

  10. Golden proportion for maxillofacial surgery in Orientals.

    PubMed

    Kawakami, S; Tsukada, S; Hayashi, H; Takada, Y; Koubayashi, S

    1989-11-01

    The facial position and balance of eyes, nose, and mouth in typical Japanese individuals were investigated, based on the golden proportion for each of these relationships. We found that Japanese tend to have a longer upper lip and shorter chin length compared with Caucasians. We believe that this tendency represents a general facial characteristic of the Oriental population. Each ratio obtained from determinations by our method was used for preoperative and postoperative aesthetic analysis in maxillofacial surgery. This method is considered useful because it permitted us to understand quantitatively the positional relationship and the balance of eyes, nose, and mouth in the face and to make comparisons with typical subjects.
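
    The ratio-based aesthetic analysis can be illustrated with a short sketch; the function name and the sample measurements below are hypothetical, not taken from the paper:

```python
# Golden-ratio check for facial proportions (illustrative sketch).
PHI = (1 + 5 ** 0.5) / 2  # golden ratio, ~1.618

def deviation_from_phi(longer, shorter):
    """Percent deviation of the measured ratio longer/shorter from the
    golden ratio; 0 means a perfect golden proportion."""
    return 100.0 * (longer / shorter - PHI) / PHI

# Hypothetical measurements (mm), e.g. lower-face height vs. chin height,
# compared before and after surgery:
pre_op = deviation_from_phi(66.0, 38.0)
post_op = deviation_from_phi(64.0, 39.5)
```

    A comparison of such deviations before and after surgery gives the kind of quantitative positional assessment the abstract describes.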

  11. EasyFRAP-web: a web-based tool for the analysis of fluorescence recovery after photobleaching data.

    PubMed

    Koulouras, Grigorios; Panagopoulos, Andreas; Rapsomaniki, Maria A; Giakoumakis, Nickolaos N; Taraviras, Stavros; Lygerou, Zoi

    2018-06-13

    Understanding protein dynamics is crucial in order to elucidate protein function and interactions. Advances in modern microscopy facilitate the exploration of the mobility of fluorescently tagged proteins within living cells. Fluorescence recovery after photobleaching (FRAP) is an increasingly popular functional live-cell imaging technique which enables the study of the dynamic properties of proteins at a single-cell level. As an increasing number of labs generate FRAP datasets, there is a need for fast, interactive and user-friendly applications that analyze the resulting data. Here we present easyFRAP-web, a web application that simplifies the qualitative and quantitative analysis of FRAP datasets. EasyFRAP-web permits quick analysis of FRAP datasets through an intuitive web interface with interconnected analysis steps (experimental data assessment, different types of normalization and estimation of curve-derived quantitative parameters). In addition, easyFRAP-web provides dynamic and interactive data visualization and data and figure export for further analysis after every step. We test easyFRAP-web by analyzing FRAP datasets capturing the mobility of the cell cycle regulator Cdt2 in the presence and absence of DNA damage in cultured cells. We show that easyFRAP-web yields results consistent with previous studies and highlights cell-to-cell heterogeneity in the estimated kinetic parameters. EasyFRAP-web is platform-independent and is freely accessible at: https://easyfrap.vmnet.upatras.gr/.
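
    The kinds of normalization easyFRAP-web applies can be illustrated with a minimal sketch. The exact formulas implemented by the tool are not reproduced here; the functions below show two normalizations commonly applied to FRAP curves (double normalization against a whole-cell reference, then full-scale rescaling), with illustrative names:

```python
def double_normalize(roi, ref, n_pre):
    """Double normalization of a FRAP recovery curve.

    roi:   bleached-region intensities over time
    ref:   whole-cell (reference) intensities, correcting for
           acquisition bleaching
    n_pre: number of pre-bleach frames

    Each time point is scaled by the reference signal and expressed
    relative to the mean pre-bleach intensity."""
    roi_pre = sum(roi[:n_pre]) / n_pre
    ref_pre = sum(ref[:n_pre]) / n_pre
    return [(ref_pre / r) * (i / roi_pre) for i, r in zip(roi, ref)]

def full_scale_normalize(curve, n_pre):
    """Rescale a normalized curve so the first post-bleach point is 0 and
    the pre-bleach level is 1, making recovery curves comparable across
    cells regardless of bleaching depth."""
    pre = sum(curve[:n_pre]) / n_pre
    post0 = curve[n_pre]
    return [(v - post0) / (pre - post0) for v in curve]

# Two pre-bleach frames, bleach at frame 2, partial recovery afterwards:
dn = double_normalize([100, 100, 30, 50, 70],
                      [200, 200, 190, 190, 190], n_pre=2)
fs = full_scale_normalize(dn, n_pre=2)
```

    Curve-derived parameters such as mobile fraction and half-time of recovery are then estimated from the full-scale normalized curve.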

  12. Fast analysis of wood preservers using laser induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Uhl, A.; Loebe, K.; Kreuchwig, L.

    2001-06-01

    Laser-induced breakdown spectroscopy (LIBS) is used for the investigation of wood preservers in timber and in furniture. Both laboratory experiments and practical applications in recycling facilities and on a building site demonstrate the new possibilities for the fast detection of harmful agents in wood. A commercial system was developed for mobile laser-plasma analysis as well as for industrial use in sorting plants. The universal measuring principle, in combination with Echelle optics, permits truly simultaneous multi-element analysis in the range of 200-780 nm with a resolution of a few picometers. It enables the user to detect main and trace elements in wood within a few seconds, nearly independent of the matrix, given that different kinds of wood show a similar elemental composition. Sample preparation is not required. The quantitative analysis of inorganic wood preservers (containing, e.g. Cu, Cr, B, As, Pb, Hg) was performed accurately using carbon as the reference element. The detection limits for heavy metals in wood are shown to lie in the ppm range. Additional information is given concerning the quantitative analysis: statistical data, e.g. the standard deviation (S.D.), were determined, and calibration curves were used for each particular element. A comparison between ICP-AES and LIBS is given using depth-profile correction factors that account for the different penetration depths, and hence the different wood volumes, analyzed by the two analytical methods.
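
    Internal standardization of the kind described, with carbon as the reference element, can be sketched as follows. This is an illustrative least-squares calibration of analyte-to-carbon intensity ratios, not the software actually used in the study:

```python
def calibrate(ratios, concentrations):
    """Fit a least-squares line c = slope * ratio + intercept through
    calibration points (I_analyte / I_C, known concentration)."""
    n = len(ratios)
    mx = sum(ratios) / n
    my = sum(concentrations) / n
    sxx = sum((x - mx) ** 2 for x in ratios)
    sxy = sum((x - mx) * (y - my) for x, y in zip(ratios, concentrations))
    slope = sxy / sxx
    return slope, my - slope * mx

def quantify(i_analyte, i_carbon, slope, intercept):
    """Predict analyte concentration from measured line intensities,
    normalized to the carbon reference line to cancel shot-to-shot
    variation in ablated mass and plasma conditions."""
    return slope * (i_analyte / i_carbon) + intercept

# Hypothetical calibration standards (intensity ratio vs. ppm):
slope, intercept = calibrate([0.1, 0.2, 0.3], [10.0, 20.0, 30.0])
```

    Normalizing to an internal reference line is what makes the analysis "nearly independent of the matrix", since both lines respond to the same ablation fluctuations.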

  13. 21 CFR 314.126 - Adequate and well-controlled studies.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... permits a valid comparison with a control to provide a quantitative assessment of drug effect. The... data analyses performed. (c) The Director of the Center for Drug Evaluation and Research may, on the...

  14. 21 CFR 314.126 - Adequate and well-controlled studies.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... permits a valid comparison with a control to provide a quantitative assessment of drug effect. The... data analyses performed. (c) The Director of the Center for Drug Evaluation and Research may, on the...

  15. 21 CFR 314.126 - Adequate and well-controlled studies.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... permits a valid comparison with a control to provide a quantitative assessment of drug effect. The... data analyses performed. (c) The Director of the Center for Drug Evaluation and Research may, on the...

  16. 21 CFR 314.126 - Adequate and well-controlled studies.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... permits a valid comparison with a control to provide a quantitative assessment of drug effect. The... data analyses performed. (c) The Director of the Center for Drug Evaluation and Research may, on the...

  17. 21 CFR 314.126 - Adequate and well-controlled studies.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... permits a valid comparison with a control to provide a quantitative assessment of drug effect. The... data analyses performed. (c) The Director of the Center for Drug Evaluation and Research may, on the...

  18. Associative image analysis: a method for automated quantification of 3D multi-parameter images of brain tissue

    PubMed Central

    Bjornsson, Christopher S; Lin, Gang; Al-Kofahi, Yousef; Narayanaswamy, Arunachalam; Smith, Karen L; Shain, William; Roysam, Badrinath

    2009-01-01

    Brain structural complexity has confounded prior efforts to extract quantitative image-based measurements. We present a systematic ‘divide and conquer’ methodology for analyzing three-dimensional (3D) multi-parameter images of brain tissue to delineate and classify key structures, and compute quantitative associations among them. To demonstrate the method, thick (~100 μm) slices of rat brain tissue were labeled using 3 – 5 fluorescent signals, and imaged using spectral confocal microscopy and unmixing algorithms. Automated 3D segmentation and tracing algorithms were used to delineate cell nuclei, vasculature, and cell processes. From these segmentations, a set of 23 intrinsic and 8 associative image-based measurements was computed for each cell. These features were used to classify astrocytes, microglia, neurons, and endothelial cells. Associations among cells and between cells and vasculature were computed and represented as graphical networks to enable further analysis. The automated results were validated using a graphical interface that permits investigator inspection and corrective editing of each cell in 3D. Nuclear counting accuracy was >89%, and cell classification accuracy ranged from 81–92% depending on cell type. We present a software system named FARSIGHT implementing our methodology. Its output is a detailed XML file containing measurements that may be used for diverse quantitative hypothesis-driven and exploratory studies of the central nervous system. PMID:18294697

  19. Quantification of integrated HIV DNA by repetitive-sampling Alu-HIV PCR on the basis of Poisson statistics.

    PubMed

    De Spiegelaere, Ward; Malatinkova, Eva; Lynch, Lindsay; Van Nieuwerburgh, Filip; Messiaen, Peter; O'Doherty, Una; Vandekerckhove, Linos

    2014-06-01

    Quantification of integrated proviral HIV DNA by repetitive-sampling Alu-HIV PCR is a candidate virological tool to monitor the HIV reservoir in patients. However, the experimental procedures and data analysis of the assay are complex and hinder its widespread use. Here, we provide an improved and simplified data analysis method by adopting binomial and Poisson statistics. A modified analysis method based on Poisson statistics was used to analyze the binomial data of positive and negative reactions from a 42-replicate Alu-HIV PCR, using dilutions of an integration standard and samples from 57 HIV-infected patients. Results were compared with the quantitative output of the previously described Alu-HIV PCR method. Poisson-based quantification of the Alu-HIV PCR was linearly correlated with the standard dilution series, indicating that absolute quantification with the Poisson method is a valid alternative for data analysis of repetitive-sampling Alu-HIV PCR data. Quantitative outputs of patient samples assessed by the Poisson method correlated with the previously described Alu-HIV PCR analysis, indicating that this method is a valid alternative for quantifying integrated HIV DNA. Poisson-based analysis of the Alu-HIV PCR data enables absolute quantification without the need for a standard dilution curve. Implementation of the CI estimation permits improved qualitative analysis of the data and provides a statistical basis for the required minimal number of technical replicates. © 2014 The American Association for Clinical Chemistry.
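
    The Poisson single-hit model underlying this kind of analysis can be sketched briefly: if targets distribute randomly across replicate reactions, the fraction of negative reactions estimates exp(-lam), so lam = -ln(k_neg / n). The function names below are illustrative, not taken from the assay's software, and the confidence interval uses a simple Wald approximation on the negative fraction as an assumed shortcut:

```python
import math

def poisson_estimate(n_replicates, n_negative):
    """Mean targets per reaction from the fraction of negative
    replicates, via the Poisson single-hit model:
    P(negative) = exp(-lam)  =>  lam = -ln(k_neg / n)."""
    p_neg = n_negative / n_replicates
    if p_neg <= 0 or p_neg > 1:
        raise ValueError("need at least one negative reaction")
    return -math.log(p_neg)

def poisson_ci(n_replicates, n_negative, z=1.96):
    """Approximate 95% CI: a normal (Wald) interval on the negative
    fraction, propagated through the log transform."""
    p = n_negative / n_replicates
    se = math.sqrt(p * (1 - p) / n_replicates)
    lo = max(p - z * se, 1e-12)
    hi = min(p + z * se, 1.0)
    return (-math.log(hi), -math.log(lo))

# Example: 42 replicates, 30 of them negative
lam = poisson_estimate(42, 30)
ci = poisson_ci(42, 30)
```

    The CI width as a function of the replicate count is what gives a statistical basis for choosing the minimal number of technical replicates.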

  20. Multidimensional analysis of the frequencies and rates of cytokine secretion from single cells by quantitative microengraving.

    PubMed

    Han, Qing; Bradshaw, Elizabeth M; Nilsson, Björn; Hafler, David A; Love, J Christopher

    2010-06-07

    The large diversity of cells that comprise the human immune system requires methods that can resolve the individual contributions of specific subsets to an immunological response. Microengraving is a process that uses a dense, elastomeric array of microwells to generate microarrays of proteins secreted from large numbers of individual live cells (approximately 10(4)-10(5) cells/assay). In this paper, we describe an approach based on this technology to quantify the rates of secretion from single immune cells. Numerical simulations of the microengraving process indicated an operating regime between 30 min and 4 h that permits quantitative analysis of the rates of secretion. Through experimental validation, we demonstrate that microengraving can provide quantitative measurements of both the frequencies and the distribution in rates of secretion for up to four cytokines simultaneously released from individual viable primary immune cells. The experimental limits of detection ranged from 0.5 to 4 molecules/s for IL-6, IL-17, IFNgamma, IL-2, and TNFalpha. These multidimensional measures resolve the number and intensities of responses by cells exposed to stimuli with greater sensitivity than single-parameter assays for cytokine release. We show that cells from different donors exhibit distinct responses, based on both the frequency and magnitude of cytokine secretion, when stimulated under different activating conditions. Primary T cells with specific profiles of secretion can also be recovered after microengraving for subsequent expansion in vitro. These examples demonstrate the utility of quantitative, multidimensional profiles of single cells for analyzing the diversity and dynamics of immune responses in vitro and for identifying rare cells from clinical samples.

  1. An improved design of electrodes for measurement of streaming potentials on wet bone in vitro and in vivo.

    PubMed

    Cochran, G V; Dell, D G; Palmieri, V R; Johnson, M W; Otter, M W; Kadaba, M P

    1989-01-01

    Streaming potentials are generated by mechanical stress in wet bone and may constitute a control mechanism for bone remodeling. Measurement of streaming potentials in bone has attracted considerable effort in past years, but quantitative studies have been hampered by relatively poor repeatability when using Ag/AgCl electrodes that contact bone via a wick moistened with electrolyte. Improvement has now been achieved with an electrode design that limits the specific area of contact of an agar/salt bridge by means of a Silastic seal, thus permitting the same equipotential surface to be contacted for each set of measurements. This reduces variations caused by bone structure and impedance, and facilitates quantitative comparisons of the response of bone samples to selected variables. The new design also permits considerable qualitative improvement in recordings made from bone during locomotor function in experimental animals in vivo.

  2. Imaging spectroscopy of solar radio burst fine structures.

    PubMed

    Kontar, E P; Yu, S; Kuznetsov, A A; Emslie, A G; Alcock, B; Jeffrey, N L S; Melnik, V N; Bian, N H; Subramanian, P

    2017-11-15

    Solar radio observations provide a unique diagnostic of the outer solar atmosphere. However, the inhomogeneous turbulent corona strongly affects the propagation of the emitted radio waves, so decoupling the intrinsic properties of the emitting source from the effects of radio wave propagation has long been a major challenge in solar physics. Here we report quantitative spatial and frequency characterization of solar radio burst fine structures observed with the Low Frequency Array, an instrument with high-time resolution that also permits imaging at scales much shorter than those corresponding to radio wave propagation in the corona. The observations demonstrate that radio wave propagation effects, and not the properties of the intrinsic emission source, dominate the observed spatial characteristics of radio burst images. These results permit more accurate estimates of source brightness temperatures, and open opportunities for quantitative study of the mechanisms that create the turbulent coronal medium through which the emitted radiation propagates.

  3. From ClinicalTrials.gov trial registry to an analysis-ready database of clinical trial results.

    PubMed

    Cepeda, M Soledad; Lobanov, Victor; Berlin, Jesse A

    2013-04-01

    The ClinicalTrials.gov web site provides a convenient interface to look up study results, but it does not allow downloading data in a format that can be readily used for quantitative analyses. Our objective was to develop a system that automatically downloads study results from ClinicalTrials.gov and provides an interface to retrieve study results in a spreadsheet format ready for analysis. Sherlock® identifies studies by intervention, population, or outcome of interest and, in seconds, creates an analytic database of study results ready for analysis. The outcome classification algorithms used in Sherlock were validated against a classification by an expert. An automatically updated, analysis-ready database dramatically extends the utility of the ClinicalTrials.gov trial registry: it increases the speed of comparative research, reduces the need for manual extraction of data, and permits answering a vast array of questions.

  4. Detecting Autophagy and Autophagy Flux in Chronic Myeloid Leukemia Cells Using a Cyto-ID Fluorescence Spectrophotometric Assay.

    PubMed

    Guo, Sujuan; Pridham, Kevin J; Sheng, Zhi

    2016-01-01

    Autophagy is a catabolic process whereby cellular components are degraded to fuel cells for longer survival during stress. Hence, autophagy plays a vital role in determining cell fate and is central to homeostasis and to the pathogenesis of many human diseases, including chronic myeloid leukemia (CML). It has been well established that autophagy is important for leukemogenesis as well as for drug resistance in CML. Thus, autophagy is an intriguing therapeutic target. However, current approaches that detect autophagy lack reliability and often fail to provide quantitative measurements. To overcome this hurdle and facilitate the development of autophagy-related therapies, we have recently developed an autophagy assay termed the Cyto-ID fluorescence spectrophotometric assay. This method uses a cationic fluorescent dye, Cyto-ID, which specifically labels autophagic compartments and is detected with a spectrophotometer to permit large-scale, quantitative analysis. As such, it allows rapid, reliable, and quantitative detection of autophagy and estimation of autophagy flux. In this chapter, we provide technical details of this method and step-by-step protocols for measuring autophagy or autophagy flux in CML cell lines as well as in primary hematopoietic cells.

  5. Dissection and Downstream Analysis of Zebra Finch Embryos at Early Stages of Development

    PubMed Central

    Murray, Jessica R.; Stanciauskas, Monika E.; Aralere, Tejas S.; Saha, Margaret S.

    2014-01-01

    The zebra finch (Taeniopygia guttata) has become an increasingly important model organism in many areas of research including toxicology [1,2], behavior [3], and memory and learning [4-6]. As the only songbird with a sequenced genome, the zebra finch has great potential for use in developmental studies; however, the early stages of zebra finch development have not been well studied. The lack of research in zebra finch development can be attributed to the difficulty of dissecting the small egg and embryo. The following dissection method minimizes embryonic tissue damage, which allows for investigation of morphology and gene expression at all stages of embryonic development. This permits both bright-field and fluorescence-quality imaging of embryos, as well as their use in molecular procedures such as in situ hybridization (ISH), cell proliferation assays, and RNA extraction for quantitative assays such as quantitative real-time PCR (qtRT-PCR). This technique allows investigators to study early stages of development that were previously difficult to access. PMID:24999108

  6. PyQuant: A Versatile Framework for Analysis of Quantitative Mass Spectrometry Data*

    PubMed Central

    Mitchell, Christopher J.; Kim, Min-Sik; Na, Chan Hyun; Pandey, Akhilesh

    2016-01-01

    Quantitative mass spectrometry data necessitates an analytical pipeline that captures the accuracy and comprehensiveness of the experiments. Currently, data analysis is often coupled to specific software packages, which restricts the analysis to a given workflow and precludes a more thorough characterization of the data by other complementary tools. To address this, we have developed PyQuant, a cross-platform mass spectrometry data quantification application that is compatible with existing frameworks and can be used as a stand-alone quantification tool. PyQuant supports most types of quantitative mass spectrometry data including SILAC, NeuCode, 15N, 13C, or 18O and chemical methods such as iTRAQ or TMT and provides the option of adding custom labeling strategies. In addition, PyQuant can perform specialized analyses such as quantifying isotopically labeled samples where the label has been metabolized into other amino acids and targeted quantification of selected ions independent of spectral assignment. PyQuant is capable of quantifying search results from popular proteomic frameworks such as MaxQuant, Proteome Discoverer, and the Trans-Proteomic Pipeline in addition to several standalone search engines. We have found that PyQuant routinely quantifies a greater proportion of spectral assignments, with increases ranging from 25–45% in this study. Finally, PyQuant is capable of complementing spectral assignments between replicates to quantify ions missed because of lack of MS/MS fragmentation or that were omitted because of issues such as spectra quality or false discovery rates. This results in an increase of biologically useful data available for interpretation. In summary, PyQuant is a flexible mass spectrometry data quantification platform that is capable of interfacing with a variety of existing formats and is highly customizable, which permits easy configuration for custom analysis. PMID:27231314

  7. Fluorometric enzymatic assay of L-arginine

    NASA Astrophysics Data System (ADS)

    Stasyuk, Nataliya; Gayda, Galina; Yepremyan, Hasmik; Stepien, Agnieszka; Gonchar, Mykhailo

    2017-01-01

    The enzymes of L-arginine (hereafter Arg) metabolism are promising tools for the elaboration of selective methods for quantitative Arg analysis. In our study we propose an enzymatic method for Arg assay based on fluorometric monitoring of ammonia, the final product of Arg splitting by human liver arginase I (hereafter arginase), isolated from a recombinant yeast strain, and commercial urease. The selective analysis of ammonia (at 415 nm under excitation at 360 nm) is based on its reaction with o-phthalaldehyde (OPA) in the presence of sulfite in alkaline medium; these conditions prevent OPA from reacting with any amino acid. The linearity range of the fluorometric arginase-urease-OPA method is from 100 nM to 6 μM, with a limit of detection of 34 nM Arg. The method was used for the quantitative determination of Arg in a pooled sample of blood serum. The results obtained proved to be in good correlation with the reference enzymatic method and literature data. The proposed arginase-urease-OPA method, being sensitive, economical, selective, and suitable for both routine and micro-volume formats, can be used in clinical diagnostics for the simultaneous determination of Arg as well as urea and ammonia in serum samples.
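
    A limit of detection like the one quoted above is typically derived from the variability of blank measurements and the slope of the calibration curve. The sketch below uses the common 3-sigma criterion; this is an assumption for illustration, since the paper's exact convention is not stated here, and the numbers are hypothetical:

```python
import statistics

def limit_of_detection(blank_signals, slope, k=3.0):
    """LOD = k * sd(blank) / slope, with k = 3 the common IUPAC choice.
    slope is the sensitivity of the fluorescence calibration curve
    (signal units per unit concentration)."""
    return k * statistics.stdev(blank_signals) / slope

# Hypothetical repeated blank readings and calibration slope:
lod = limit_of_detection([10.0, 10.2, 9.8, 10.1, 9.9], slope=2.0)
```

    A steeper calibration slope or tighter blanks both lower the LOD, which is why sensitive fluorometric detection reaches the low-nanomolar range.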

  8. Molecular and agronomic analysis of intraspecific variability in Capsicum baccatum var. pendulum accessions.

    PubMed

    Leite, P S S; Rodrigues, R; Silva, R N O; Pimenta, S; Medeiros, A M; Bento, C S; Gonçalves, L S A

    2016-10-05

    Capsicum baccatum is one of the most important chili peppers in South America, since this region is considered to be the center of origin and diversity of the species. In Brazil, C. baccatum has been widely explored by family farmers, and there are different local names for each fruit phenotype, such as cambuci and dedo-de-moça (lady's finger). Although very popular among farmers and consumers, C. baccatum has been less extensively studied than other Capsicum species. This study describes the phenotypic and genotypic variability in C. baccatum var. pendulum accessions. Twenty-nine accessions from the Universidade Estadual do Norte Fluminense Darcy Ribeiro gene bank and one commercial genotype ('BRS-Mari') were evaluated for 53 morphoagronomic descriptors (31 qualitative and 22 quantitative traits). In addition, accessions were genotyped using 30 microsatellite primers. Three accessions from the C. annuum complex were included in the molecular characterization. Nine of the 31 qualitative descriptors were monomorphic, while all quantitative descriptors differed highly significantly between accessions (P < 0.01). Using the unweighted pair group method with arithmetic averages (UPGMA), four groups were obtained based on multicategorical variables and five groups based on quantitative variables. In the genotyping analysis, 12 polymorphic simple sequence repeat primers amplified in C. baccatum, with dissimilarity between accessions ranging from 0.13 to 0.91, permitting the formation of two distinct groups in the Bayesian analysis. These results indicate wide variability among the accessions when phenotypic and genotypic data are compared, and reveal distinct patterns of dissimilarity between matrices, indicating that both steps are valuable for the characterization of C. baccatum var. pendulum accessions.

  9. NEW 3D TECHNIQUES FOR RANKING AND PRIORITIZATION OF CHEMICAL INVENTORIES

    EPA Science Inventory

    New three-dimensional quantitative structure activity (3-D QSAR) techniques for prioritizing chemical inventories for endocrine activity will be presented. The Common Reactivity Pattern (COREPA) approach permits identification of common steric and/or electronic patterns associate...

  10. Growth and characterization of binary and pseudo-binary 3-5 compounds exhibiting non-linear optical behavior. Undergraduate research opportunities in microgravity science and technology

    NASA Technical Reports Server (NTRS)

    Witt, August F.

    1992-01-01

    In line with the specified objectives, a Bridgman-type growth configuration was developed in which unavoidable end effects (conventionally leading to growth-interface relocation) are compensated by commensurate input-power changes; the growth rate on a microscale is predictable and unaffected by changes in heat transfer conditions. To permit quantitative characterization of the growth furnace cavity (hot zone), a 3-D thermal field mapping technique based on the thermal image is being tested for temperatures up to 1100 C. Computational NIR absorption analysis was modified to permit characterization of semi-insulating single crystals. Work on the growth and characterization of bismuth silicate was initiated; growth of BSO (Bi12SiO20) seed material by the Czochralski technique is currently in progress. Undergraduate research currently in progress includes ground-based measurements of the wetting behavior (contact angles) of semiconductor melts on substrates consisting of potential confinement materials for solidification experiments in a reduced-gravity environment. Hardware modifications required for execution of the wetting experiments in a KC-135 facility are being developed.

  11. Soil resources and potential for agricultural development in Bahr El Jebel in southern Sudan, Jonglei Canal project area

    NASA Technical Reports Server (NTRS)

    Myers, V. I.; Moore, D. G.; Abdel-Hady, M. A.; Abdel-Samie, A. G.; Elshazly, E. M. (Principal Investigator); Youvis, H.; Worcester, B. K.; Klingebiel, A. A.; Elshazly, M. M.; Hamad, M. A.

    1978-01-01

    The author has identified the following significant results. Fourteen LANDSAT scenes were used to produce mosaics of the 167,474 sq km study area. These were black-and-white MSS 7 images and false-color composite images. Five major soil-landscape units were delineated on the mosaics, and these were subdivided into a total of 40 soil mapping units. Aerial reconnaissance was useful in defining boundaries between mapping units and in estimating the proportion of the various soils composing each mapping unit. Ground surveying permitted first-hand observation of major soils and sampling for quantitative laboratory analysis. Soil interpretations were made, including properties, potentials, and limitations.

  12. Relating residue in raccoon feces to food consumed

    USGS Publications Warehouse

    Greenwood, R.J.

    1979-01-01

    Feeding tests were conducted with captive raccoons (Procyon lotor) to permit more meaningful interpretation of food habit data obtained from fecal analysis. Ten diverse types of natural foods were offered in 20 tests. Digestibility coefficients were calculated that ranged from 3.6 for dry sunflowers, where considerable residue was recovered, to infinity for earthworms and boned meat, where no residue was recovered. The influence of differences in both food and animal behavior on digestibility coefficients was significant (ANOVA, P < 0.001). The use of digestibility coefficients to adjust quantitative estimates of fecal residue or to predict biomass consumed is of questionable value with raccoons, due to variability in foods consumed and in the behavior of individual animals.

  13. Innovative Applications of Laser Scanning and Rapid Prototype Printing to Rock Breakdown Experiments

    NASA Technical Reports Server (NTRS)

    Bourke, Mary; Viles, Heather; Nicoll, Joe; Lyew-Ayee, Parris; Ghent, Rebecca; Holmlund, James

    2008-01-01

    We present the novel application of two technologies for use in rock breakdown experiments: close-range, ground-based 3D triangulation scanning and rapid prototype printing. These techniques aid analyses of form-process interactions across the range of scales relevant to breakdown (microns to meters). This is achieved through (a) the creation of DEMs (which permit quantitative description and analysis of rock surface morphology and morphological change) and (b) the production of more realistically shaped experimental blocks. We illustrate the use of these techniques, alongside appropriate data analysis routines, in experiments designed to investigate the persistence of fluvially derived features in the face of subsequent wind abrasion and weathering. These techniques have a range of potential applications in experimental field- and lab-based geomorphic studies beyond those specifically outlined here.

  14. Computerized Doppler Tomography and Spectrum Analysis of Carotid Artery Flow

    PubMed Central

    Morton, Paul; Goldman, Dave; Nichols, W. Kirt

    1981-01-01

Contrast angiography remains the definitive study in the evaluation of atherosclerotic occlusive vascular disease. However, a safer technique for serial screening of symptomatic patients and for routine follow-up is necessary. Computerized pulsed Doppler ultrasonic arteriography is a noninvasive technique developed by Miles [6] for imaging lateral, antero-posterior and transverse sections of the carotid artery. We augmented this system with new software and hardware to analyze the three-dimensional blood flow data. The system now provides information about the location of the occlusive process in the artery and a semi-quantitative evaluation of the degree of obstruction. In addition, we interfaced a digital signal analyzer to the system, which permits spectrum analysis of the pulsed Doppler signal. This addition has allowed us to identify lesions which are not yet hemodynamically significant.

  15. Noble Gas Temperature Proxy for Climate Change

    EPA Science Inventory

    Noble gases in groundwater appear to offer a practical approach for quantitatively determining past surface air temperatures over recharge areas for any watershed. The noble gas temperature (NGT) proxy should then permit a paleothermometry of a region over time. This terrestria...

  16. PREDICTING SUBSURFACE CONTAMINANT TRANSPORT AND TRANSFORMATION: CONSIDERATIONS FOR MODEL SELECTION AND FIELD VALIDATION

    EPA Science Inventory

    Predicting subsurface contaminant transport and transformation requires mathematical models based on a variety of physical, chemical, and biological processes. The mathematical model is an attempt to quantitatively describe observed processes in order to permit systematic forecas...

  17. Human Spaceflight Architecture Model (HSFAM) Data Dictionary

    NASA Technical Reports Server (NTRS)

    Shishko, Robert

    2016-01-01

HSFAM is a data model based on the DoDAF 2.02 data model with some fit-for-purpose extensions. These extensions are designed to permit quantitative analyses regarding stakeholder concerns about technical feasibility, configuration and interface issues, and budgetary and/or economic viability.

  18. MATERIALS SUPPORTING THE NEW RECREATIONAL WATER QUALITY CRITERIA FOR PATHOGENS

    EPA Science Inventory

EPA is developing new, rapid methods for monitoring water quality at beaches to determine adequacy of water quality for swimming. The methods being developed rely upon quantitative polymerase chain reaction technology. They will permit real-time decisions regarding beach closures...

  19. Clinical Pedodontics: An Approach Based on Comprehensive Care.

    ERIC Educational Resources Information Center

Bennett, Carroll G.; And Others

    1981-01-01

    The University of Florida uses a comprehensive care system to teach clinical pedodontics. Several block clinics permit further experience with children. Details of the program are described, and quantitative results of patient treatment are compared with those of other clinical pedodontics programs. (MSE)

  20. Hypothesis exploration with visualization of variance

    PubMed Central

    2014-01-01

Background The Consortium for Neuropsychiatric Phenomics (CNP) at UCLA was an investigation into the biological bases of traits such as memory and response inhibition phenotypes, exploring whether they are linked to syndromes including ADHD, bipolar disorder, and schizophrenia. An aim of the consortium was to move from traditional categorical approaches for psychiatric syndromes towards more quantitative approaches based on large-scale analysis of the space of human variation. It represented an application of phenomics—wide-scale, systematic study of phenotypes—to neuropsychiatry research. Results This paper reports on a system for exploration of hypotheses in data obtained from the LA2K, LA3C, and LA5C studies in CNP. ViVA is a system for exploratory data analysis using novel mathematical models and methods for visualization of variance. An example of these methods is called VISOVA, a combination of visualization and analysis of variance, with the flavor of exploration associated with ANOVA in biomedical hypothesis generation. It permits visual identification of phenotype profiles—patterns of values across phenotypes—that characterize groups. Visualization enables screening and refinement of hypotheses about the variance structure of sets of phenotypes. Conclusions The ViVA system was designed for exploration of neuropsychiatric hypotheses by interdisciplinary teams. Automated visualization in ViVA supports ‘natural selection’ on a pool of hypotheses and permits deeper understanding of the statistical architecture of the data. Large-scale perspective of this kind could lead to better neuropsychiatric diagnostics. PMID:25097666
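The variance decomposition underlying a VISOVA-style display is ordinary one-way ANOVA. A minimal pure-Python sketch of the F statistic across phenotype groups follows; the function name and toy data are illustrative, not taken from the ViVA system:

```python
def one_way_anova_f(groups):
    """One-way ANOVA F statistic for a list of groups (each a list of
    phenotype scores): ratio of between-group to within-group mean squares."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(x for g in groups for x in g) / n
    # Between-group sum of squares: group sizes times squared mean offsets.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: squared deviations from each group mean.
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Two well-separated phenotype groups yield a large F statistic.
print(one_way_anova_f([[1.0, 1.1, 0.9], [5.0, 5.2, 4.8]]))
```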

  1. Guidance for Product Category Rule Development, Version 1.0

    EPA Science Inventory

    Environmental claims based on life cycle assessment (LCA) can provide quantitative, full life cycle information on products in a format that can permit comparisons and thereby inform purchasing decisions. In recent years, a number of standards and guides have emerged for making b...

  2. 7 CFR 1767.15 - General instructions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... by such detailed information as will permit ready identification, analysis, and verification of all... utility's records shall be so kept as to permit ready analysis by prescribed accounts (by direct reference to sources of original entry to the extent practicable) and to permit preparation of financial and...

  3. 7 CFR 1767.15 - General instructions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... by such detailed information as will permit ready identification, analysis, and verification of all... utility's records shall be so kept as to permit ready analysis by prescribed accounts (by direct reference to sources of original entry to the extent practicable) and to permit preparation of financial and...

  4. 7 CFR 1767.15 - General instructions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... by such detailed information as will permit ready identification, analysis, and verification of all... utility's records shall be so kept as to permit ready analysis by prescribed accounts (by direct reference to sources of original entry to the extent practicable) and to permit preparation of financial and...

  5. Isolation of Circulating Plasma Cells in Multiple Myeloma Using CD138 Antibody-Based Capture in a Microfluidic Device

    NASA Astrophysics Data System (ADS)

    Qasaimeh, Mohammad A.; Wu, Yichao C.; Bose, Suman; Menachery, Anoop; Talluri, Srikanth; Gonzalez, Gabriel; Fulciniti, Mariateresa; Karp, Jeffrey M.; Prabhala, Rao H.; Karnik, Rohit

    2017-04-01

    The necessity for bone marrow aspiration and the lack of highly sensitive assays to detect residual disease present challenges for effective management of multiple myeloma (MM), a plasma cell cancer. We show that a microfluidic cell capture based on CD138 antigen, which is highly expressed on plasma cells, permits quantitation of rare circulating plasma cells (CPCs) in blood and subsequent fluorescence-based assays. The microfluidic device is based on a herringbone channel design, and exhibits an estimated cell capture efficiency of ~40-70%, permitting detection of <10 CPCs/mL using 1-mL sample volumes, which is difficult using existing techniques. In bone marrow samples, the microfluidic-based plasma cell counts exhibited excellent correlation with flow cytometry analysis. In peripheral blood samples, the device detected a baseline of 2-5 CD138+ cells/mL in healthy donor blood, with significantly higher numbers in blood samples of MM patients in remission (20-24 CD138+ cells/mL), and yet higher numbers in MM patients exhibiting disease (45-184 CD138+ cells/mL). Analysis of CPCs isolated using the device was consistent with serum immunoglobulin assays that are commonly used in MM diagnostics. These results indicate the potential of CD138-based microfluidic CPC capture as a useful ‘liquid biopsy’ that may complement or partially replace bone marrow aspiration.
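Under a simple thinning model (an illustrative assumption, not the authors' analysis), the reported capture efficiency and healthy-donor baseline imply only a handful of captured cells per milliliter, which is why the low detection limit matters:

```python
def expected_captured_cells(conc_per_ml, volume_ml, efficiency):
    """Expected number of cells captured, assuming capture acts as a simple
    Bernoulli thinning of the cells in the processed volume (illustrative)."""
    return conc_per_ml * volume_ml * efficiency

# At the reported healthy-donor baseline of ~2-5 CD138+ cells/mL and the
# estimated ~40-70% capture efficiency, a 1-mL sample yields very few cells:
low = expected_captured_cells(2, 1.0, 0.4)   # 0.8
high = expected_captured_cells(5, 1.0, 0.7)  # ~3.5
print(low, high)
```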

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cooper, W. James; Albertson, R Craig; Jacob, Rick E.

Here we present a re-description of Abudefduf luridus and reassign it to the genus Similiparma. We supplement traditional diagnoses and descriptions of this species with quantitative anatomical data collected from a family-wide geometric morphometric analysis of head morphology (44 species representing all 30 damselfish genera) and data from cranial micro-CT scans of fishes in the genus Similiparma. The use of geometric morphometric analyses (and other methods of shape analysis) permits detailed comparisons between the morphology of specific taxa and the anatomical diversity that has arisen in an entire lineage. This provides a particularly useful supplement to traditional description methods and we recommend the use of such techniques by systematists. Similiparma and its close relatives constitute a branch of the damselfish phylogenetic tree that predominantly inhabits rocky reefs in the Atlantic and Eastern Pacific, as opposed to the more commonly studied damselfishes that constitute a large portion of the ichthyofauna on all coral-reef communities.

  7. Automatic structured grid generation using Gridgen (some restrictions apply)

    NASA Technical Reports Server (NTRS)

    Chawner, John R.; Steinbrenner, John P.

    1995-01-01

    The authors have noticed in the recent grid generation literature an emphasis on the automation of structured grid generation. The motivation behind such work is clear; grid generation is easily the most despised task in the grid-analyze-visualize triad of computational analysis (CA). However, because grid generation is closely coupled to both the design and analysis software and because quantitative measures of grid quality are lacking, 'push button' grid generation usually results in a compromise between speed, control, and quality. Overt emphasis on automation obscures the substantive issues of providing users with flexible tools for generating and modifying high quality grids in a design environment. In support of this paper's tongue-in-cheek title, many features of the Gridgen software are described. Gridgen is by no stretch of the imagination an automatic grid generator. Despite this fact, the code does utilize many automation techniques that permit interesting regenerative features.

  8. Experimental and Numerical Analysis of Microstructures and Stress States of Shot-Peened GH4169 Superalloys

    NASA Astrophysics Data System (ADS)

    Hu, Dianyin; Gao, Ye; Meng, Fanchao; Song, Jun; Wang, Rongqiao

    2018-04-01

    Combining experiments and finite element analysis (FEA), a systematic study was performed to analyze the microstructural evolution and stress states of shot-peened GH4169 superalloy over a variety of peening intensities and coverages. A dislocation density evolution model was integrated into the representative volume FEA model to quantitatively predict microstructural evolution in the surface layers and compared with experimental results. It was found that surface roughness and through-depth residual stress profile are more sensitive to shot-peening intensity compared to coverage due to the high kinetic energy involved. Moreover, a surface nanocrystallization layer was discovered in the top surface region of GH4169 for all shot-peening conditions. However, the grain refinement was more intensified under high shot-peening coverage, under which enough time was permitted for grain refinement. The grain size gradient predicted by the numerical framework showed good agreement with experimental observations.

  9. Body Composition.

    ERIC Educational Resources Information Center

    Mayhew, Jerry L.

    1981-01-01

    Body composition refers to the types and amounts of tissues which make up the body. The most acceptable method for assessing body composition is underwater weighing. A subcutaneous skinfold provides a quantitative measurement of fat below the skin. The skinfold technique permits a valid estimate of the body's total fat content. (JN)

  10. Utilities of gossip across organizational levels : Multilevel selection, free-riders, and teams.

    PubMed

    Kniffin, Kevin M; Wilson, David Sloan

    2005-09-01

    Gossip is a subject that has been studied by researchers from an array of disciplines with various foci and methods. We measured the content of language use by members of a competitive sports team across 18 months, integrating qualitative ethnographic methods with quantitative sampling and analysis. We hypothesized that the use of gossip will vary significantly depending on whether it is used for self-serving or group-serving purposes. Our results support a model of gossip derived from multilevel selection theory that expects gossip to serve group-beneficial rules when rewards are partitioned at the group level on a scale that permits mutual monitoring. We integrate our case study with earlier studies of gossip conducted by anthropologists, psychologists, and management researchers.

  11. Ultraviolet absorption: Experiment MA-059. [measurement of atmospheric species concentrations

    NASA Technical Reports Server (NTRS)

    Donahue, T. M.; Hudson, R. D.; Rawlins, W. T.; Anderson, J.; Kaufman, F.; Mcelroy, M. B.

    1977-01-01

    A technique devised to permit the measurement of atmospheric species concentrations is described. This technique involves the application of atomic absorption spectroscopy and the quantitative observation of resonance fluorescence in which atomic or molecular species scatter resonance radiation from a light source into a detector. A beam of atomic oxygen and atomic nitrogen resonance radiation, strong unabsorbable oxygen and nitrogen radiation, and visual radiation was sent from Apollo to Soyuz. The density of atomic oxygen and atomic nitrogen between the two spacecraft was measured by observing the amount of resonance radiation absorbed when the line joining Apollo and Soyuz was perpendicular to their velocity with respect to the ambient atmosphere. Results of postflight analysis of the resonance fluorescence data are discussed.

  12. [Methods for Reducing Laser Speckles to Achieve Even Illumination of the Microscope Field of View in Biophysical Studies].

    PubMed

    Barsky, V E; Lysov, Yu P; Yegorov, E E; Yurasov, D A; Mamaev, D D; Yurasov, R A; Cherepanov, A V; Chudinov, A V; Smoldovskaya, O V; Arefieva, A S; Rubina, A Yu; Zasedatelev, A S

    2015-01-01

The aim of this work was to compare different speckle reduction techniques. It was shown that devices based on liquid crystals achieve only partial reduction of speckle contrast. In quantitative luminescence microscopy, mechanical devices that spread the laser beam across the field of view proved more efficient. Laser speckle noise was virtually eliminated with the developed and manufactured mechanical device comprising a fiber-optic ring light guide and a vibrator that moves the optical fiber ends relative to the laser diode during measurements. The method, developed for the analysis of microarrays, was successfully applied to the problem of speckle reduction.
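The averaging effect such a moving-fiber device exploits can be sketched numerically: averaging N independent fully developed speckle patterns lowers the speckle contrast C = sigma/mean roughly as 1/sqrt(N). All names, pixel counts, and parameters below are illustrative:

```python
import random
import statistics

def speckle_contrast(intensities):
    """Speckle contrast C = sigma / mean of the intensity distribution.
    Fully developed speckle gives C ~ 1; even illumination gives C ~ 0."""
    return statistics.pstdev(intensities) / statistics.fmean(intensities)

def average_patterns(n_patterns, n_pixels=5000, seed=1):
    """Average n independent fully developed speckle patterns (exponential
    intensity statistics) pixel by pixel, mimicking what a moving fiber
    bundle approximates within one camera exposure."""
    rng = random.Random(seed)
    return [statistics.fmean(rng.expovariate(1.0) for _ in range(n_patterns))
            for _ in range(n_pixels)]

c1 = speckle_contrast(average_patterns(1))
c25 = speckle_contrast(average_patterns(25))
print(c1, c25)  # c25 is roughly c1 / 5, i.e. contrast falls as 1/sqrt(N)
```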

  13. In-Line Detection and Measurement of Molecular Contamination in Semiconductor Process Solutions

    NASA Astrophysics Data System (ADS)

    Wang, Jason; West, Michael; Han, Ye; McDonald, Robert C.; Yang, Wenjing; Ormond, Bob; Saini, Harmesh

    2005-09-01

    This paper discusses a fully automated metrology tool for detection and quantitative measurement of contamination, including cationic, anionic, metallic, organic, and molecular species present in semiconductor process solutions. The instrument is based on an electrospray ionization time-of-flight mass spectrometer (ESI-TOF/MS) platform. The tool can be used in diagnostic or analytical modes to understand process problems in addition to enabling routine metrology functions. Metrology functions include in-line contamination measurement with near real-time trend analysis. This paper discusses representative organic and molecular contamination measurement results in production process problem solving efforts. The examples include the analysis and identification of organic compounds in SC-1 pre-gate clean solution; urea, NMP (N-Methyl-2-pyrrolidone) and phosphoric acid contamination in UPW; and plasticizer and an organic sulfur-containing compound found in isopropyl alcohol (IPA). It is expected that these unique analytical and metrology capabilities will improve the understanding of the effect of organic and molecular contamination on device performance and yield. This will permit the development of quantitative correlations between contamination levels and process degradation. It is also expected that the ability to perform routine process chemistry metrology will lead to corresponding improvements in manufacturing process control and yield, the ability to avoid excursions and will improve the overall cost effectiveness of the semiconductor manufacturing process.

  14. An Integrative Platform for Three-dimensional Quantitative Analysis of Spatially Heterogeneous Metastasis Landscapes

    NASA Astrophysics Data System (ADS)

    Guldner, Ian H.; Yang, Lin; Cowdrick, Kyle R.; Wang, Qingfei; Alvarez Barrios, Wendy V.; Zellmer, Victoria R.; Zhang, Yizhe; Host, Misha; Liu, Fang; Chen, Danny Z.; Zhang, Siyuan

    2016-04-01

    Metastatic microenvironments are spatially and compositionally heterogeneous. This seemingly stochastic heterogeneity provides researchers great challenges in elucidating factors that determine metastatic outgrowth. Herein, we develop and implement an integrative platform that will enable researchers to obtain novel insights from intricate metastatic landscapes. Our two-segment platform begins with whole tissue clearing, staining, and imaging to globally delineate metastatic landscape heterogeneity with spatial and molecular resolution. The second segment of our platform applies our custom-developed SMART 3D (Spatial filtering-based background removal and Multi-chAnnel forest classifiers-based 3D ReconsTruction), a multi-faceted image analysis pipeline, permitting quantitative interrogation of functional implications of heterogeneous metastatic landscape constituents, from subcellular features to multicellular structures, within our large three-dimensional (3D) image datasets. Coupling whole tissue imaging of brain metastasis animal models with SMART 3D, we demonstrate the capability of our integrative pipeline to reveal and quantify volumetric and spatial aspects of brain metastasis landscapes, including diverse tumor morphology, heterogeneous proliferative indices, metastasis-associated astrogliosis, and vasculature spatial distribution. Collectively, our study demonstrates the utility of our novel integrative platform to reveal and quantify the global spatial and volumetric characteristics of the 3D metastatic landscape with unparalleled accuracy, opening new opportunities for unbiased investigation of novel biological phenomena in situ.
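The "spatial filtering-based background removal" stage named in SMART 3D can be illustrated with a toy one-dimensional version (the function and window size are hypothetical, not the pipeline's actual implementation): estimate a slowly varying background with a moving average, subtract it, and keep the positive residual.

```python
def remove_background(profile, window=5):
    """Toy spatial-filtering background removal: subtract a moving-average
    background estimate from a 1-D intensity profile and clip negatives,
    so only features sharper than the window survive."""
    n = len(profile)
    half = window // 2
    out = []
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        background = sum(profile[lo:hi]) / (hi - lo)
        out.append(max(0.0, profile[i] - background))
    return out

# A bright spike on a flat background survives; the background does not.
signal = [10, 10, 10, 10, 50, 10, 10, 10, 10]
print(remove_background(signal))
```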

  15. Quantitative analysis of three-dimensional biological cells using interferometric microscopy

    NASA Astrophysics Data System (ADS)

    Shaked, Natan T.; Wax, Adam

    2011-06-01

    Live biological cells are three-dimensional microscopic objects that constantly adjust their sizes, shapes and other biophysical features. Wide-field digital interferometry (WFDI) is a holographic technique that is able to record the complex wavefront of the light which has interacted with in-vitro cells in a single camera exposure, where no exogenous contrast agents are required. However, simple quasi-three-dimensional holographic visualization of the cell phase profiles need not be the end of the process. Quantitative analysis should permit extraction of numerical parameters which are useful for cytology or medical diagnosis. Using a transmission-mode setup, the phase profile represents the multiplication between the integral refractive index and the thickness of the sample. These coupled variables may not be distinct when acquiring the phase profiles of dynamic cells. Many morphological parameters which are useful for cell biologists are based on the cell thickness profile rather than on its phase profile. We first overview methods to decouple the cell thickness and its refractive index using the WFDI-based phase profile. Then, we present a whole-cell-imaging approach which is able to extract useful numerical parameters on the cells even in cases where decoupling of cell thickness and refractive index is not possible or desired.
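The coupling described above is linear: in transmission mode the measured phase is phi = (2*pi/lambda) * (n_cell - n_medium) * h. Once the integral refractive index has been decoupled by one of the methods surveyed, thickness follows directly. The numbers below are illustrative, not from the paper:

```python
import math

def thickness_from_phase(phase_rad, wavelength_um, n_cell, n_medium):
    """Recover cell thickness (micrometers) from a transmission-mode phase
    measurement, assuming the integral refractive index n_cell is already
    known, i.e. decoupled from thickness."""
    return phase_rad * wavelength_um / (2 * math.pi * (n_cell - n_medium))

# Example: HeNe illumination at 632.8 nm, hypothetical indices and phase.
phase = 2.5  # radians, a hypothetical measured phase value
print(thickness_from_phase(phase, 0.6328, 1.38, 1.33))  # ~5.04 micrometers
```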

  16. Analysis of doxorubicin distribution in MCF-7 cells treated with drug-loaded nanoparticles by combination of two fluorescence-based techniques, confocal spectral imaging and capillary electrophoresis.

    PubMed

    Gautier, Juliette; Munnier, Emilie; Soucé, Martin; Chourpa, Igor; Douziech Eyrolles, Laurence

    2015-05-01

The intracellular distribution of the anticancer drug doxorubicin (DOX) was followed qualitatively by fluorescence confocal spectral imaging (FCSI) and quantitatively by capillary electrophoresis (CE). FCSI permits the localization of the major fluorescent species in cell compartments, with spectral shifts indicating the polarity of the respective environment. However, distinction between drug and metabolites by FCSI is difficult due to their similar fluorochromes, and direct quantification of their fluorescence is complicated by quantum yield variation between different subcellular environments. On the other hand, capillary electrophoresis with fluorescence detection (CE-LIF) is a quantitative method capable of separating doxorubicin and its metabolites. In this paper, we propose a method for determining drug and metabolite concentration in enriched nuclear and cytosolic fractions of cancer cells by CE-LIF, and we compare these data with those of FCSI. Significant differences in the subcellular distribution of DOX are observed between the drug administered as a molecular solution or as a suspension of drug-loaded iron oxide nanoparticles coated with polyethylene glycol. Comparative analysis of the CE-LIF vs FCSI data may lead to a tentative calibration of this latter method in terms of DOX fluorescence quantum yields in the nucleus and more or less polar regions of the cytosol.

  17. Anthropometric Measurements Usage in Medical Sciences

    PubMed Central

    Utkualp, Nevin; Ercan, Ilker

    2015-01-01

Morphometry, a quantitative approach to variations and changes in the forms of organisms, describes the relationship between the human body and disease. Scientists of every civilization up to the present day have examined the human body using anthropometric methods, and anthropometric data are accordingly used in many contexts to screen for or monitor disease. Anthropometry, a branch of morphometry, is the study of the size and shape of the components of biological forms and their variations in populations. Morphometrics can also be defined as the quantitative analysis of biological forms. The field has developed rapidly over the last two decades, to the extent that we now distinguish between traditional morphometrics and the more recent geometric morphometrics. Advances in imaging technology have preserved a greater amount of morphological information and have permitted the analysis of this information. The oldest and most commonly used of these methods is radiography; with developments in this area, CT and MRI have also come into use for screening the internal organs. Morphometric measurements are now widely used in medicine for the diagnosis, follow-up, and treatment of disease, and their use in cosmetology is increasing every day. PMID:26413519

  18. Quantitative sampling of conformational heterogeneity of a DNA hairpin using molecular dynamics simulations and ultrafast fluorescence spectroscopy

    PubMed Central

    Voltz, Karine; Léonard, Jérémie; Touceda, Patricia Tourón; Conyard, Jamie; Chaker, Ziyad; Dejaegere, Annick; Godet, Julien; Mély, Yves; Haacke, Stefan; Stote, Roland H.

    2016-01-01

    Molecular dynamics (MD) simulations and time resolved fluorescence (TRF) spectroscopy were combined to quantitatively describe the conformational landscape of the DNA primary binding sequence (PBS) of the HIV-1 genome, a short hairpin targeted by retroviral nucleocapsid proteins implicated in the viral reverse transcription. Three 2-aminopurine (2AP) labeled PBS constructs were studied. For each variant, the complete distribution of fluorescence lifetimes covering 5 orders of magnitude in timescale was measured and the populations of conformers experimentally observed to undergo static quenching were quantified. A binary quantification permitted the comparison of populations from experimental lifetime amplitudes to populations of aromatically stacked 2AP conformers obtained from simulation. Both populations agreed well, supporting the general assumption that quenching of 2AP fluorescence results from pi-stacking interactions with neighboring nucleobases and demonstrating the success of the proposed methodology for the combined analysis of TRF and MD data. Cluster analysis of the latter further identified predominant conformations that were consistent with the fluorescence decay times and amplitudes, providing a structure-based rationalization for the wide range of fluorescence lifetimes. Finally, the simulations provided evidence of local structural perturbations induced by 2AP. The approach presented is a general tool to investigate fine structural heterogeneity in nucleic acid and nucleoprotein assemblies. PMID:26896800

  19. A droplet-merging platform for comparative functional analysis of M1 and M2 macrophages in response to E. coli-induced stimuli.

    PubMed

    Hondroulis, Evangelia; Movila, Alexandru; Sabhachandani, Pooja; Sarkar, Saheli; Cohen, Noa; Kawai, Toshihisa; Konry, Tania

    2017-03-01

Microfluidic droplets are used to isolate cell pairs and prevent crosstalk with neighboring cells, while permitting free motility and interaction within the confined space. Dynamic analysis of cellular heterogeneity in droplets has provided insights in various biological processes. Droplet manipulation methods such as fusion and fission make it possible to precisely regulate the localized environment of a cell in a droplet and deliver reagents as required. Droplet fusion strategies achieved by passive mechanisms preserve cell viability and are easier to fabricate and operate. Here, we present a simple and effective method for the co-encapsulation of polarized M1 and M2 macrophages with Escherichia coli (E. coli) by passive merging in an integrated droplet generation, merging, and docking platform. This approach facilitated live cell profiling of effector immune functions in situ and quantitative functional analysis of macrophage heterogeneity. Biotechnol. Bioeng. 2017;114: 705-709. © 2016 Wiley Periodicals, Inc.

  20. Analysis of Protein Kinetics Using Fluorescence Recovery After Photobleaching (FRAP).

    PubMed

    Giakoumakis, Nickolaos Nikiforos; Rapsomaniki, Maria Anna; Lygerou, Zoi

    2017-01-01

    Fluorescence recovery after photobleaching (FRAP) is a cutting-edge live-cell functional imaging technique that enables the exploration of protein dynamics in individual cells and thus permits the elucidation of protein mobility, function, and interactions at a single-cell level. During a typical FRAP experiment, fluorescent molecules in a defined region of interest within the cell are bleached by a short and powerful laser pulse, while the recovery of the fluorescence in the region is monitored over time by time-lapse microscopy. FRAP experimental setup and image acquisition involve a number of steps that need to be carefully executed to avoid technical artifacts. Equally important is the subsequent computational analysis of FRAP raw data, to derive quantitative information on protein diffusion and binding parameters. Here we present an integrated in vivo and in silico protocol for the analysis of protein kinetics using FRAP. We focus on the most commonly encountered challenges and technical or computational pitfalls and their troubleshooting so that valid and robust insight into protein dynamics within living cells is gained.
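A minimal sketch of the quantitative step, assuming the common single-exponential recovery model (the function names and parameter values below are illustrative, not the protocol's own code): simulate a noiseless recovery curve, then invert the model to recover the characteristic time tau.

```python
import math

def frap_recovery(t, mobile_fraction, tau, pre_bleach=1.0):
    """Single-exponential FRAP recovery model (a common simplification):
    after the bleach, fluorescence rebounds toward
    mobile_fraction * pre_bleach with characteristic time tau."""
    return pre_bleach * mobile_fraction * (1.0 - math.exp(-t / tau))

def estimate_tau(times, values, plateau):
    """Estimate tau from noiseless recovery data by inverting the model,
    tau = t / -ln(1 - F(t)/plateau), averaged over the time points."""
    taus = [t / -math.log(1.0 - v / plateau)
            for t, v in zip(times, values) if 0.0 < v < plateau]
    return sum(taus) / len(taus)

times = [1.0, 2.0, 3.0, 4.0, 5.0]
values = [frap_recovery(t, mobile_fraction=0.8, tau=3.0) for t in times]
print(estimate_tau(times, values, plateau=0.8))  # ~3.0, recovering tau
```

With experimental (noisy) data one would fit the model by nonlinear least squares instead; the closed-form inversion here only shows what the fit extracts.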

  1. Further Analysis of Subtypes of Automatically Reinforced SIB: A Replication and Quantitative Analysis of Published Datasets

    PubMed Central

    Hagopian, Louis P.; Rooker, Griffin W.; Zarcone, Jennifer R.; Bonner, Andrew C.; Arevalo, Alexander R.

    2017-01-01

    Hagopian, Rooker, and Zarcone (2015) evaluated a model for subtyping automatically reinforced self-injurious behavior (SIB) based on its sensitivity to changes in functional analysis conditions and the presence of self-restraint. The current study tested the generality of the model by applying it to all datasets of automatically reinforced SIB published from 1982 to 2015. We identified 49 datasets that included sufficient data to permit subtyping. Similar to the original study, Subtype-1 SIB was generally amenable to treatment using reinforcement alone, whereas Subtype-2 SIB was not. Conclusions could not be drawn about Subtype-3 SIB due to the small number of datasets. Nevertheless, the findings support the generality of the model and suggest that sensitivity of SIB to disruption by alternative reinforcement is an important dimension of automatically reinforced SIB. Findings also suggest that automatically reinforced SIB should no longer be considered a single category and that additional research is needed to better understand and treat Subtype-2 SIB. PMID:28032344

  2. Quantitative weaknesses of the Marcus-Hush theory of electrode kinetics revealed by Reverse Scan Square Wave Voltammetry: The reduction of 2-methyl-2-nitropropane at mercury microelectrodes

    NASA Astrophysics Data System (ADS)

    Laborda, Eduardo; Wang, Yijun; Henstridge, Martin C.; Martínez-Ortiz, Francisco; Molina, Angela; Compton, Richard G.

    2011-08-01

    The Marcus-Hush and Butler-Volmer kinetic electrode models are compared experimentally by studying the reduction of 2-methyl-2-nitropropane in acetonitrile at mercury microelectrodes using Reverse Scan Square Wave Voltammetry. This technique is found to be very sensitive to the electrode kinetics and to permit critical comparison of the two models. The Butler-Volmer model satisfactorily fits the experimental data whereas Marcus-Hush does not quantitatively describe this redox system.
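The Butler-Volmer rate law that fit the data has the form k = k0 * exp(-alpha * f * eta) with f = F/RT, so each fixed increment of overpotential multiplies the rate by a constant factor (Tafel behavior); the symmetric Marcus-Hush model instead predicts curvature at large overpotential. A sketch with illustrative parameter values:

```python
import math

F_CONST, R_CONST, T = 96485.0, 8.314, 298.15
f = F_CONST / (R_CONST * T)  # ~38.9 V^-1 at room temperature

def bv_reduction_rate(k0, alpha, eta):
    """Butler-Volmer reduction rate constant k = k0 * exp(-alpha * f * eta),
    where eta is the overpotential in volts (negative drives reduction)."""
    return k0 * math.exp(-alpha * f * eta)

# With alpha = 0.5, each additional -0.118 V multiplies the rate ~10x:
k1 = bv_reduction_rate(1e-2, 0.5, -0.118)
k2 = bv_reduction_rate(1e-2, 0.5, -0.236)
print(k1 / 1e-2, k2 / k1)  # both ratios ~10
```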

  3. Shame, Guilt, and Depressive Symptoms: A Meta-Analytic Review

    ERIC Educational Resources Information Center

    Kim, Sangmoon; Thibodeau, Ryan; Jorgensen, Randall S.

    2011-01-01

    Recent theoretical and empirical work has facilitated the drawing of sharp conceptual distinctions between shame and guilt. A clear view of these distinctions has permitted development of a research literature aimed at evaluating the differential associations of shame and guilt with depressive symptoms. This study quantitatively summarized the…

  4. 76 FR 36934 - Endangered Species; Marine Mammals; Receipt of Applications for Permit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-23

    ... quantitative information or studies; and (2) Those that include citations to, and analyses of, the applicable... bears (Ursus maritimus) by adjusting the video camera equipment and conducting aerial surveys using FLIR (forward looking infrared) and ground-truth surveys with snowmobiles near dens for the purpose of...

  5. Mapping urban revitalization: using GIS spatial analysis to evaluate a new housing policy.

    PubMed

    Perkins, Douglas D; Larsen, Courtney; Brown, Barbara B

    2009-01-01

This longitudinal, multimethod study uses geographical information system (GIS) software to evaluate the community-wide impact of a neighborhood revitalization project. Unsystematic visual examination and analysis of GIS maps are offered as a complementary tool to quantitative analysis, and one that is much more compelling, meaningful, and effective in presentation to community and nonscientific professional audiences. The centerpiece of the intervention was the development of a new, middle-class housing subdivision in an area that was declining physically and economically. This represents three major urban/housing policy directions: (1) the emphasis on home ownership for working-class families, (2) the deconcentration of poverty through development of mixed-income neighborhoods, and (3) the cleanup and redevelopment of contaminated, former industrial brownfields. Resident survey responses, objective environmental assessment observations, and building permit data were collected, geocoded at the address level, and aggregated to the block level on 60 street blocks in the older neighborhoods surrounding the new housing in two waves: during site clearing and housing construction (Time 1: 1993-95) and three years post-completion (Time 2: 1998-99). Variables mapped include (a) Time 1-2 change in self-reported home repairs and improvements, (b) change in the assessed physical condition of yards and exteriors of 925 individual residential properties, (c) change in residents' home pride, and (d) a city archive of building permits at Time 2. Physical conditions improved overall in the neighborhood, but spatial analysis of the maps suggests that the spillover effects, if any, of the new housing were geographically limited and included unintended negative psychological consequences. Results argue for greater use of GIS and the street-block level in community research, and of psychological and behavioral variables in planning research and decisions.

  6. The evaluation of different sorbents for the preconcentration of phenoxyacetic acid herbicides and their metabolites from soils.

    PubMed

    Moret, Sònia; Sánchez, Juan M; Salvadó, Victòria; Hidalgo, Manuela

    2005-12-16

A procedure using alkaline extraction, solid-phase extraction (SPE) and HPLC is developed to analyze the polar herbicides 2,4-dichlorophenoxyacetic acid (2,4-D) and 4-chloro-2-methylphenoxyacetic acid (MCPA) together with their main metabolites in soils. An ion-pairing HPLC method is used for the determination, as it permits the baseline separation of these highly polar herbicides and their main metabolites. The use of a highly cross-linked polystyrene-divinylbenzene (PS-DVB) sorbent gives the best results for the analysis of these compounds. This sorbent allows the direct preconcentration of the analytes at the high pH values obtained after quantitative alkaline extraction of the herbicides from soil samples. Different parameters are evaluated for the SPE preconcentration step. The high polarity of the main analytes of interest (2,4-D and MCPA) makes it necessary to work at low flow rates (≤0.5 mL min⁻¹) in order for these compounds to be retained by the PS-DVB sorbent. A two-stage desorption from the SPE sorbent is required to obtain the analytes in solvents that are appropriate for HPLC determination. A first desorption with a 50:50 methanol:water mixture elutes the most polar analytes (2,4-D, MCPA and 2CP). The second elution step, with methanol, permits the analysis of the other phenol derivatives. The humic and fulvic substances present in the soil are not efficiently retained by PS-DVB sorbents at alkaline pH and so do not interfere in the analysis. This method has been successfully applied in the analysis of soil samples from a golf course treated with a commercial product containing esters of 2,4-D and MCPA as the active components.

  7. PyQuant: A Versatile Framework for Analysis of Quantitative Mass Spectrometry Data.

    PubMed

    Mitchell, Christopher J; Kim, Min-Sik; Na, Chan Hyun; Pandey, Akhilesh

    2016-08-01

Quantitative mass spectrometry data necessitates an analytical pipeline that captures the accuracy and comprehensiveness of the experiments. Currently, data analysis is often coupled to specific software packages, which restricts the analysis to a given workflow and precludes a more thorough characterization of the data by other complementary tools. To address this, we have developed PyQuant, a cross-platform mass spectrometry data quantification application that is compatible with existing frameworks and can be used as a stand-alone quantification tool. PyQuant supports most types of quantitative mass spectrometry data including SILAC, NeuCode, ¹⁵N, ¹³C, or ¹⁸O and chemical methods such as iTRAQ or TMT and provides the option of adding custom labeling strategies. In addition, PyQuant can perform specialized analyses such as quantifying isotopically labeled samples where the label has been metabolized into other amino acids and targeted quantification of selected ions independent of spectral assignment. PyQuant is capable of quantifying search results from popular proteomic frameworks such as MaxQuant, Proteome Discoverer, and the Trans-Proteomic Pipeline in addition to several standalone search engines. We have found that PyQuant routinely quantifies a greater proportion of spectral assignments, with increases ranging from 25-45% in this study. Finally, PyQuant is capable of complementing spectral assignments between replicates to quantify ions missed because of lack of MS/MS fragmentation or that were omitted because of issues such as spectra quality or false discovery rates. This results in an increase of biologically useful data available for interpretation. In summary, PyQuant is a flexible mass spectrometry data quantification platform that is capable of interfacing with a variety of existing formats and is highly customizable, which permits easy configuration for custom analysis.
© 2016 by The American Society for Biochemistry and Molecular Biology, Inc.

  8. Boundary element analysis of corrosion problems for pumps and pipes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miyasaka, M.; Amaya, K.; Kishimoto, K.

    1995-12-31

Three-dimensional (3D) and axi-symmetric boundary element methods (BEM) were developed to quantitatively estimate cathodic protection and macro-cell corrosion. For 3D analysis, a multiple-region method (MRM) was developed in addition to a single-region method (SRM). The validity and usefulness of the BEMs were demonstrated by comparing numerical results with experimental data from galvanic corrosion systems of a cylindrical model and a seawater pipe, and from a cathodic protection system of an actual seawater pump. It was shown that a highly accurate analysis could be performed for fluid machines handling seawater with complex 3D fields (e.g. seawater pump) by taking account of flow rate and time dependencies of the polarization curve. Compared to the 3D BEM, the axi-symmetric BEM permitted large reductions in numbers of elements and nodes, which greatly simplified analysis of axi-symmetric fields such as pipes. Computational accuracy and CPU time were compared between analyses using two approximation methods for polarization curves: a logarithmic-approximation method and a linear-approximation method.

  9. Accurate single-shot quantitative phase imaging of biological specimens with telecentric digital holographic microscopy.

    PubMed

    Doblas, Ana; Sánchez-Ortiga, Emilio; Martínez-Corral, Manuel; Saavedra, Genaro; Garcia-Sucerquia, Jorge

    2014-04-01

The advantages of using a telecentric imaging system in digital holographic microscopy (DHM) to study biological specimens are highlighted. To this end, the performances of nontelecentric DHM and telecentric DHM are evaluated from the quantitative phase imaging (QPI) point of view. The evaluated stability of the microscope allows single-shot QPI in DHM by using telecentric imaging systems. Quantitative phase maps of a section of the head of the Drosophila melanogaster fly and of red blood cells are obtained via single-shot DHM with no numerical postprocessing. With these maps we show that the use of telecentric DHM provides a larger field of view for a given magnification and permits more accurate QPI measurements with fewer computational operations.

  10. Precise Quantitation of MicroRNA in a Single Cell with Droplet Digital PCR Based on Ligation Reaction.

    PubMed

    Tian, Hui; Sun, Yuanyuan; Liu, Chenghui; Duan, Xinrui; Tang, Wei; Li, Zhengping

    2016-12-06

MicroRNA (miRNA) analysis in a single cell is extremely important because it allows deep understanding of the exact correlation between the miRNAs and cell functions. Herein, we wish to report a highly sensitive and precisely quantitative assay for miRNA detection based on ligation-based droplet digital polymerase chain reaction (ddPCR), which permits the quantitation of miRNA in a single cell. In this ligation-based ddPCR assay, two target-specific oligonucleotide probes are simply designed to be complementary to each half of the target miRNA sequence, which avoids the sophisticated design of reverse transcription and provides high specificity to discriminate a single-base difference among miRNAs with simple operations. After the miRNA-templated ligation, the ddPCR partitions individual ligated products into water-in-oil droplets and digitally counts the fluorescence-positive and -negative droplets after PCR amplification for quantification of the target molecules, which provides precise quantitation and robustness to variation in PCR efficiency. By integrating the advantages of the precise quantification of ddPCR and the simplicity of the ligation-based PCR, the proposed method can sensitively measure let-7a miRNA with a detection limit of 20 aM (12 copies per microliter), and even a single-base difference can be discriminated in let-7 family members. More importantly, due to its high selectivity and sensitivity, the proposed method can achieve precise quantitation of miRNAs in single-cell lysate. Therefore, the ligation-based ddPCR assay may serve as a useful tool to exactly reveal the miRNAs' actions in a single cell, which is of great importance for the study of miRNAs' biofunction as well as for related biomedical studies.
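    The digital counting step described above reduces to a standard Poisson correction: target copies distribute randomly across droplets, so the mean copies per droplet is recovered from the fraction of negative droplets as λ = -ln(negatives/total). A minimal pure-Python sketch; the ~0.85 nL droplet volume and the example counts are illustrative assumptions, not values from the study:

    ```python
    import math

    def ddpcr_concentration(positives, total_droplets, droplet_volume_nl=0.85):
        """Estimate copies per microliter from ddPCR droplet counts.

        Uses the Poisson correction lambda = -ln(negatives / total), where
        lambda is the mean number of target copies per droplet.
        """
        negatives = total_droplets - positives
        if negatives <= 0 or positives < 0:
            raise ValueError("need at least one negative droplet to apply Poisson correction")
        lam = -math.log(negatives / total_droplets)  # mean copies per droplet
        return lam / (droplet_volume_nl * 1e-3)      # nL -> uL, giving copies/uL

    # e.g. 2,000 fluorescence-positive droplets out of 15,000 accepted droplets
    conc = ddpcr_concentration(2000, 15000)
    ```

    Because the correction is logarithmic, concentration estimates remain accurate even when some droplets contain more than one copy, which is what gives ddPCR its robustness to PCR-efficiency variation.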

  11. Large Advanced Space Systems (LASS) computer-aided design program additions

    NASA Technical Reports Server (NTRS)

    Farrell, C. E.

    1982-01-01

The LSS preliminary and conceptual design requires extensive iterative analysis because of the effects of structural, thermal, and control intercoupling. A computer-aided design program that will permit integrating and interfacing of required large space system (LSS) analyses is discussed. The primary objective of this program is the implementation of modeling techniques and analysis algorithms that permit interactive design and tradeoff studies of LSS concepts. Eight software modules were added to the program. The existing rigid body controls module was modified to include solar pressure effects. The new model generator modules and appendage synthesizer module are integrated (interfaced) to permit interactive definition and generation of LSS concepts. The mass properties module permits interactive specification of discrete masses and their locations. The other modules permit interactive analysis of orbital transfer requirements, antenna primary beam n, and attitude control requirements.

  12. 21 CFR 860.7 - Determination of safety and effectiveness.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... diagnosis with a control in such a fashion as to permit quantitative evaluation. The precise nature of the... Devices and Radiological Health, the Center for Biologics Evaluation and Research, or the Center for Drug Evaluation and Research, as applicable, need not be resubmitted, but may be incorporated by reference. [43 FR...

  13. 21 CFR 860.7 - Determination of safety and effectiveness.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... diagnosis with a control in such a fashion as to permit quantitative evaluation. The precise nature of the... Devices and Radiological Health, the Center for Biologics Evaluation and Research, or the Center for Drug Evaluation and Research, as applicable, need not be resubmitted, but may be incorporated by reference. [43 FR...

  14. 21 CFR 860.7 - Determination of safety and effectiveness.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... diagnosis with a control in such a fashion as to permit quantitative evaluation. The precise nature of the... Devices and Radiological Health, the Center for Biologics Evaluation and Research, or the Center for Drug Evaluation and Research, as applicable, need not be resubmitted, but may be incorporated by reference. [43 FR...

  15. 21 CFR 860.7 - Determination of safety and effectiveness.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... diagnosis with a control in such a fashion as to permit quantitative evaluation. The precise nature of the... Devices and Radiological Health, the Center for Biologics Evaluation and Research, or the Center for Drug Evaluation and Research, as applicable, need not be resubmitted, but may be incorporated by reference. [43 FR...

  16. 21 CFR 860.7 - Determination of safety and effectiveness.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... diagnosis with a control in such a fashion as to permit quantitative evaluation. The precise nature of the... Devices and Radiological Health, the Center for Biologics Evaluation and Research, or the Center for Drug Evaluation and Research, as applicable, need not be resubmitted, but may be incorporated by reference. [43 FR...

  17. Quantitation & Case-Study-Driven Inquiry to Enhance Yeast Fermentation Studies

    ERIC Educational Resources Information Center

    Grammer, Robert T.

    2012-01-01

    We propose a procedure for the assay of fermentation in yeast in microcentrifuge tubes that is simple and rapid, permitting assay replicates, descriptive statistics, and the preparation of line graphs that indicate reproducibility. Using regression and simple derivatives to determine initial velocities, we suggest methods to compare the effects of…
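    The regression step mentioned above can be sketched in a few lines: fit a least-squares slope to the earliest points of the progress curve to estimate the initial velocity. A minimal pure-Python sketch; the gas-volume readings are invented illustrative data, not measurements from the paper:

    ```python
    def initial_velocity(times, values, n_points=4):
        """Least-squares slope over the first n_points of a progress curve,
        i.e. the initial reaction velocity (d[product]/dt near t = 0)."""
        t, y = times[:n_points], values[:n_points]
        n = len(t)
        mt, my = sum(t) / n, sum(y) / n
        num = sum((ti - mt) * (yi - my) for ti, yi in zip(t, y))
        den = sum((ti - mt) ** 2 for ti in t)
        return num / den  # slope in value-units per time-unit

    # gas volume (mL) read every 2 min from one microcentrifuge-tube assay
    rate = initial_velocity([0, 2, 4, 6, 8], [0.00, 0.11, 0.20, 0.31, 0.35])
    ```

    Restricting the fit to the first few points approximates the derivative at time zero, before substrate depletion bends the curve, which is why initial velocities from replicate tubes are directly comparable.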

  18. Puffed cereals with added chamomile - quantitative analysis of polyphenols and optimization of their extraction method.

    PubMed

    Blicharski, Tomasz; Oniszczuk, Anna; Olech, Marta; Oniszczuk, Tomasz; Wójtowicz, Agnieszka; Krawczyk, Wojciech; Nowak, Renata

    2017-05-11

Introduction. Functional food plays an important role in the prevention, management and treatment of chronic diseases. One of the most interesting techniques of functional food production is extrusion-cooking. Functional foods may include such items as puffed cereals, breads and beverages that are fortified with vitamins, some nutraceuticals and herbs. Due to their pharmacological activity, chamomile flowers are among the most popular components added to functional food. The aim was the quantitative analysis of polyphenolic antioxidants, as well as the comparison of various methods for the extraction of phenolic compounds from corn puffed cereals, puffed cereals with an addition of chamomile (3, 5, 10 and 20%), and from Chamomillae anthodium. Materials and Methods. Two modern extraction methods, ultrasound-assisted extraction (UAE) at 40 °C and 60 °C and accelerated solvent extraction (ASE) at 100 °C and 120 °C, were used for the isolation of polyphenols from functional food. Analysis of flavonoids and phenolic acids was carried out using reversed-phase high-performance liquid chromatography with electrospray ionization mass spectrometry (LC-ESI-MS/MS). Results and Conclusions. For most of the analyzed compounds, the highest yields were obtained by ultrasound-assisted extraction. The highest temperature during the ultrasonification process (60 °C) increased the efficiency of extraction without degradation of polyphenols. UAE readily reaches extraction equilibrium and therefore permits shorter extraction times, reducing the energy input. Furthermore, UAE meets the requirements of 'Green Chemistry'.

  19. Rapid and sensitive analysis of multiple bioactive constituents in tripterygium glycosides tablets using liquid chromatography coupled with time-of-flight mass spectrometry.

    PubMed

    Su, Meng-xiang; Zhou, Wen-di; Lan, Juan; Di, Bin; Hang, Tai-jun

    2015-03-01

A simultaneous determination method based on liquid chromatography coupled with time-of-flight mass spectrometry was developed for the analysis of 11 bioactive constituents in tripterygium glycosides tablets, a prescription used in China for immune and inflammatory disorders. The analysis was fully optimized on a 1.8 μm particle size C18 column with linear gradient elution, permitting good separation of the 11 analytes and two internal standards in 21 min. The quantitation of each target constituent was carried out using narrow-window extracted ion chromatograms with a ±10 ppm extraction window, yielding good linearity (r² > 0.996) with a linear range of 10-1000 ng/mL. The limits of quantitation were low, ranging from 0.25 to 5.02 ng/mL for the 11 analytes, and the precision and repeatability were better than 1.6 and 5.3%, respectively. The recoveries obtained were in the acceptable range of 93.4-107.4%. This method was successfully applied to quantify the 11 bioactive constituents in commercial samples produced by nine pharmaceutical manufacturers in order to profile the quality of these preparations. The overall results demonstrate that the contents of the 11 bioactive constituents varied considerably between samples; therefore, the quality, clinical safety, and efficacy of this drug need further research and evaluation. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
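    The ±10 ppm extraction window above works out to a simple mass tolerance: the half-width of the window is m/z × ppm × 10⁻⁶. A minimal sketch; the example m/z value is illustrative, not taken from the paper:

    ```python
    def ppm_window(mz, ppm=10.0):
        """(low, high) m/z bounds of a +/- ppm extracted-ion-chromatogram window."""
        half_width = mz * ppm * 1e-6
        return mz - half_width, mz + half_width

    # an illustrative precursor ion at m/z 500.2500 with a +/-10 ppm window
    low, high = ppm_window(500.25)
    ```

    Expressing the tolerance in ppm rather than absolute Da keeps the window proportional to mass, matching the relative mass accuracy of time-of-flight instruments across the m/z range.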

  20. Adipose tissue MRI for quantitative measurement of central obesity.

    PubMed

    Poonawalla, Aziz H; Sjoberg, Brett P; Rehm, Jennifer L; Hernando, Diego; Hines, Catherine D; Irarrazaval, Pablo; Reeder, Scott B

    2013-03-01

To validate adipose tissue magnetic resonance imaging (atMRI) for rapid, quantitative volumetry of visceral adipose tissue (VAT) and total adipose tissue (TAT). Data were acquired on normal adults and clinically overweight girls with Institutional Review Board (IRB) approval/parental consent using sagittal 6-echo 3D-spoiled gradient-echo (SPGR) (26-sec single-breath-hold) at 3T. Fat-fraction images were reconstructed with quantitative corrections, permitting measurement of a physiologically based fat-fraction threshold in normals to identify adipose tissue, for automated measurement of TAT, and semiautomated measurement of VAT. TAT accuracy was validated using oil phantoms and in vivo TAT/VAT measurements validated with manual segmentation. Group comparisons were performed between normals and overweight girls using TAT, VAT, VAT-TAT-ratio (VTR), body-mass-index (BMI), waist circumference, and waist-hip-ratio (WHR). Oil phantom measurements were highly accurate (<3% error). The measured adipose fat-fraction threshold was 96% ± 2%. VAT and TAT correlated strongly with manual segmentation (normals r² ≥ 0.96, overweight girls r² ≥ 0.99). VAT segmentation required 30 ± 11 minutes/subject (14 ± 5 sec/slice) using atMRI, versus 216 ± 73 minutes/subject (99 ± 31 sec/slice) manually. Group discrimination was significant using WHR (P < 0.001) and VTR (P = 0.004). The atMRI technique permits rapid, accurate measurements of TAT, VAT, and VTR. Copyright © 2012 Wiley Periodicals, Inc.
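    The thresholding step above can be illustrated directly: voxels whose fat fraction meets the measured threshold (96% in the study) are classified as adipose tissue, and volume follows from the voxel count. A minimal pure-Python sketch over a toy 2D fat-fraction map; the voxel volume is an invented illustrative value:

    ```python
    def adipose_mask(fat_fraction, threshold=0.96):
        """Binary adipose mask: True wherever the fat fraction meets the threshold."""
        return [[ff >= threshold for ff in row] for row in fat_fraction]

    def adipose_volume_ml(mask, voxel_volume_ml):
        """Adipose volume = number of masked voxels x volume per voxel."""
        return sum(sum(row) for row in mask) * voxel_volume_ml

    ff_map = [[0.98, 0.40],
              [0.97, 0.99]]
    mask = adipose_mask(ff_map)                          # 3 of 4 voxels are adipose
    vol = adipose_volume_ml(mask, voxel_volume_ml=0.02)  # assumed voxel size
    ```

    In practice the threshold itself is measured per cohort (hence the 96% ± 2% figure), so hard-coding it is only appropriate for a sketch like this one.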

  1. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify and...

  2. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify and...

  3. Quantitative fibronectin to help decision-making in women with symptoms of preterm labour (QUIDS) part 1: Individual participant data meta-analysis and health economic analysis.

    PubMed

    Stock, Sarah J; Wotherspoon, Lisa M; Boyd, Kathleen A; Morris, Rachel K; Dorling, Jon; Jackson, Lesley; Chandiramani, Manju; David, Anna L; Khalil, Asma; Shennan, Andrew; Hodgetts Morton, Victoria; Lavender, Tina; Khan, Khalid; Harper-Clarke, Susan; Mol, Ben W; Riley, Richard D; Norrie, John; Norman, Jane E

    2018-04-07

The aim of the QUIDS study is to develop a decision support tool for the management of women with symptoms and signs of preterm labour, based on a validated prognostic model using quantitative fetal fibronectin (qfFN) concentration, in combination with clinical risk factors. The study will evaluate the Rapid fFN 10Q System (Hologic, Marlborough, Massachusetts) which quantifies fFN in a vaginal swab. In part 1 of the study, we will develop and internally validate a prognostic model using an individual participant data (IPD) meta-analysis of existing studies containing women with symptoms of preterm labour alongside fFN measurements and pregnancy outcome. An economic analysis will be undertaken to assess potential cost-effectiveness of the qfFN prognostic model. The primary endpoint will be the ability of the prognostic model to rule out spontaneous preterm birth within 7 days. Six eligible studies were identified by systematic review of the literature and five agreed to provide their IPD (n=5 studies, 1783 women and 139 events of preterm delivery within 7 days of testing). The study is funded by the National Institute for Health Research Health Technology Assessment programme (HTA 14/32/01). It has been approved by the West of Scotland Research Ethics Committee (16/WS/0068). CRD42015027590. Protocol version 2, date 1 November 2016. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  4. Analysis of sesquiterpenes in Valeriana officinalis by capillary electrophoresis.

    PubMed

    Mikell, J R; Ganzera, M; Khan, I A

    2001-12-01

A capillary electrophoresis (CE) method permitting the determination of the main sesquiterpenes in Valeriana officinalis has been developed. A separation of valerenic acid and its hydroxy and acetoxy derivatives, three compounds characteristic of the species, was achieved using a 40 mM phosphate-borate buffer at pH 8.5, which contained 10% isopropanol as organic modifier. The applied temperature and voltage were 35 °C and 17.5 kV, respectively. This setup allowed a baseline separation of the three compounds within 8 min, with a detection limit of 5.8 μg/mL or less. Of six market products analyzed, only one contained a detectable amount of the marker compounds, with 0.54% hydroxyvalerenic acid and 0.13% valerenic acid, respectively. The quantitative results were comparable to those obtained by HPLC.

  5. Miniaturized matrix solid-phase dispersion followed by liquid chromatography-tandem mass spectrometry for the quantification of synthetic dyes in cosmetics and foodstuffs used or consumed by children.

    PubMed

    Guerra, Eugenia; Llompart, Maria; Garcia-Jares, Carmen

    2017-12-22

Miniaturized matrix solid-phase dispersion (MSPD) followed by liquid chromatography tandem mass spectrometry (LC-MS/MS) has been proposed for the simultaneous analysis of different classes of synthetic dyes in confectionery and cosmetics intended for or mostly consumed by children. Selected compounds include most of the permitted dyes as food additives as well as some of the most frequently used to color cosmetic products in accordance with the respective European directives. The MSPD procedure was optimized by means of experimental design, allowing an effective, rapid and simple extraction of dyes with low sample and reagent consumption (0.1 g of sample and 2 mL of elution solvent). LC-MS/MS was optimized for good resolution, selectivity and sensitivity using a low ionic strength mobile phase (3 mM NH₄Ac-methanol). Method performance was demonstrated in real samples showing good linearity (R ≥ 0.9928) and intra- and inter-day precision (%RSD ≤ 15%). Method LODs were ≤0.952 μg g⁻¹ and ≤0.476 μg g⁻¹ for confectionery and cosmetic samples, respectively. Recoveries of compounds from nine different matrices were quantitative. The validated method was successfully applied to 24 commercial samples (14 cosmetics and 10 foods) in which 9 of the selected dyes were found at concentrations up to 989 μg g⁻¹, exceeding in some cases the regulated maximum permitted limits. A non-permitted dye, Acid Orange 7, was found in one candy. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Quantitative measurements of magnetic vortices using position resolved diffraction in Lorentz STEM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zaluzec, N. J.

    2002-03-05

A number of electron column techniques have been developed over the last forty years to permit visualization of magnetic fields in specimens. These include: Fresnel imaging, Differential Phase Contrast, Electron Holography and Lorentz STEM. In this work we have extended the LSTEM methodology using Position Resolved Diffraction (PRD) to quantitatively measure the in-plane electromagnetic fields of thin film materials. The experimental work reported herein has been carried out using the ANL AAEM HB603Z 300 kV FEG instrument. In this instrument, the electron optical column was operated in a zero field mode, at the specimen, where the objective lens is turned off and the probe forming lens functions were reallocated to the C1, C2, and C3 lenses. Post specimen lenses (P1, P2, P3, P4) were used to magnify the transmitted electrons to a YAG screen, which was then optically transferred to a Hamamatsu ORCA ER CCD array. This CCD was interfaced to an EmiSpec Data Acquisition System and the data was subsequently transferred to an external computer system for detailed quantitative analysis. In Position Resolved Diffraction mode, we digitally step a focused electron probe across the region of interest of the specimen while at the same time recording the complete diffraction pattern at each point in the scan.

  7. WASTE ANALYSIS PLAN REVIEW ADVISOR - AN INTELLIGENT DATABASE TO ASSIST RCRA PERMIT REVIEWERS

    EPA Science Inventory

    The Waste Analysis Plan Review Advisor (WAPRA) system assists in the review of the Waste Analysis Plan Section of RCRA Part B facility permit applications. Specifically, this program automates two functions of the waste analysis plan review. First, the system checks all wastes wh...

  8. Data Science Innovations That Streamline Development, Documentation, Reproducibility, and Dissemination of Models in Computational Thermodynamics: An Application of Image Processing Techniques for Rapid Computation, Parameterization and Modeling of Phase Diagrams

    NASA Astrophysics Data System (ADS)

    Ghiorso, M. S.

    2014-12-01

Computational thermodynamics (CT) represents a collection of numerical techniques that are used to calculate quantitative results from thermodynamic theory. In the Earth sciences, CT is most often applied to estimate the equilibrium properties of solutions, to calculate phase equilibria from models of the thermodynamic properties of materials, and to approximate irreversible reaction pathways by modeling these as a series of local equilibrium steps. The thermodynamic models that underlie CT calculations relate the energy of a phase to temperature, pressure and composition. These relationships are not intuitive and they are seldom well constrained by experimental data; often, intuition must be applied to generate a robust model that satisfies the expectations of use. As a consequence of this situation, the models and databases that support CT applications in geochemistry and petrology are tedious to maintain as new data and observations arise. What is required to make the process more streamlined and responsive is a computational framework that permits the rapid generation of observable outcomes from the underlying data/model collections, and importantly, the ability to update and re-parameterize the constitutive models through direct manipulation of those outcomes. CT procedures that take models/data to the experiential reference frame of phase equilibria involve function minimization, gradient evaluation, the calculation of implicit lines, curves and surfaces, contour extraction, and other related geometrical measures. All these procedures are the mainstay of image processing analysis. Since the commercial escalation of video game technology, open source image processing libraries have emerged (e.g., VTK) that permit real time manipulation and analysis of images. These tools find immediate application to CT calculations of phase equilibria by permitting rapid calculation and real time feedback between model outcome and the underlying model parameters.

  9. Understanding and scaffolding Danish schoolteachers' motivation for using classroom-based physical activity: study protocol for a mixed methods study.

    PubMed

    Knudsen, Louise Stjerne; Skovgaard, Thomas; Bredahl, Thomas

    2018-03-14

The benefits of physical activity for children's mental and physical health, and its positive effects on academic achievement, are well established. Research also emphasises that schools could provide a natural setting for regular physical activity. There is, however, a limited amount of knowledge about teachers' views when it comes to integrating physical activity as part of teaching. The aim of this study is to understand teachers' motivation for integrating physical activity as part of teaching and to assess their need for guidance and support. The study uses an explanatory sequential mixed-methods design. Schools from across Denmark are included in the sample. The design comprises two separate phases: a quantitative and a qualitative phase. The quantitative phase is guided by self-determination theory, with teachers' motivation measured using the Work Task Motivation Scale for Teachers. The theory of scaffolding guides the qualitative phase, which consists of in-depth interviews with participants selected from the quantitative phase based on levels of motivation and on demographic information. In accordance with the study aims, the analysis of data will identify teachers' internal and external levels of motivation. The purpose of the qualitative phase is to enhance understanding of teachers' motivation and of their need for support in the use of physical activity in teaching. All relevant ethics approvals have been acquired. All participants in this study will provide written informed consent prior to data collection. All data emerging from the quantitative and qualitative phases will be anonymised for analysis. Ethics approval was requested from the Regional Committee on Health Research Ethics for Southern Denmark (approval ID S-20162000-40) and from the Danish Data Protection Agency (approval ID 16/15491). The study was deemed not notifiable by both authorities. NCT02894346; Pre-results.
© Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  10. Quantitative Imaging in Cancer Evolution and Ecology

    PubMed Central

    Grove, Olya; Gillies, Robert J.

    2013-01-01

    Cancer therapy, even when highly targeted, typically fails because of the remarkable capacity of malignant cells to evolve effective adaptations. These evolutionary dynamics are both a cause and a consequence of cancer system heterogeneity at many scales, ranging from genetic properties of individual cells to large-scale imaging features. Tumors of the same organ and cell type can have remarkably diverse appearances in different patients. Furthermore, even within a single tumor, marked variations in imaging features, such as necrosis or contrast enhancement, are common. Similar spatial variations recently have been reported in genetic profiles. Radiologic heterogeneity within tumors is usually governed by variations in blood flow, whereas genetic heterogeneity is typically ascribed to random mutations. However, evolution within tumors, as in all living systems, is subject to Darwinian principles; thus, it is governed by predictable and reproducible interactions between environmental selection forces and cell phenotype (not genotype). This link between regional variations in environmental properties and cellular adaptive strategies may permit clinical imaging to be used to assess and monitor intratumoral evolution in individual patients. This approach is enabled by new methods that extract, report, and analyze quantitative, reproducible, and mineable clinical imaging data. However, most current quantitative metrics lack spatialness, expressing quantitative radiologic features as a single value for a region of interest encompassing the whole tumor. In contrast, spatially explicit image analysis recognizes that tumors are heterogeneous but not well mixed and defines regionally distinct habitats, some of which appear to harbor tumor populations that are more aggressive and less treatable than others. 
By identifying regional variations in key environmental selection forces and evidence of cellular adaptation, clinical imaging can enable us to define intratumoral Darwinian dynamics before and during therapy. Advances in image analysis will place clinical imaging in an increasingly central role in the development of evolution-based patient-specific cancer therapy. © RSNA, 2013 PMID:24062559

  11. Quantitative microbiome profiling links gut community variation to microbial load.

    PubMed

    Vandeputte, Doris; Kathagen, Gunter; D'hoe, Kevin; Vieira-Silva, Sara; Valles-Colomer, Mireia; Sabino, João; Wang, Jun; Tito, Raul Y; De Commer, Lindsey; Darzi, Youssef; Vermeire, Séverine; Falony, Gwen; Raes, Jeroen

    2017-11-23

    Current sequencing-based analyses of faecal microbiota quantify microbial taxa and metabolic pathways as fractions of the sample sequence library generated by each analysis. Although these relative approaches permit detection of disease-associated microbiome variation, they are limited in their ability to reveal the interplay between microbiota and host health. Comparative analyses of relative microbiome data cannot provide information about the extent or directionality of changes in taxa abundance or metabolic potential. If microbial load varies substantially between samples, relative profiling will hamper attempts to link microbiome features to quantitative data such as physiological parameters or metabolite concentrations. Saliently, relative approaches ignore the possibility that altered overall microbiota abundance itself could be a key identifier of a disease-associated ecosystem configuration. To enable genuine characterization of host-microbiota interactions, microbiome research must exchange ratios for counts. Here we build a workflow for the quantitative microbiome profiling of faecal material, through parallelization of amplicon sequencing and flow cytometric enumeration of microbial cells. We observe up to tenfold differences in the microbial loads of healthy individuals and relate this variation to enterotype differentiation. We show how microbial abundances underpin both microbiota variation between individuals and covariation with host phenotype. Quantitative profiling bypasses compositionality effects in the reconstruction of gut microbiota interaction networks and reveals that the taxonomic trade-off between Bacteroides and Prevotella is an artefact of relative microbiome analyses. Finally, we identify microbial load as a key driver of observed microbiota alterations in a cohort of patients with Crohn's disease, here associated with a low-cell-count Bacteroides enterotype (as defined through relative profiling).
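
    The core arithmetic of quantitative profiling, scaling each taxon's sequence fraction by the flow-cytometry-derived microbial load, can be sketched as follows; the taxa and cell counts below are invented for illustration and are not data from the study:

```python
# Sketch of converting relative microbiome profiles to quantitative ones by
# scaling each taxon's sequence fraction by the measured microbial load.
# Taxa names and load values are invented for illustration.
def quantitative_profile(relative_abundances, microbial_load_cells_per_g):
    """Scale relative fractions (summing to 1) to absolute cell counts per gram."""
    return {taxon: fraction * microbial_load_cells_per_g
            for taxon, fraction in relative_abundances.items()}

# Two hypothetical samples with a tenfold difference in total load: the
# Bacteroides *fraction* differs tenfold, yet its absolute abundance is equal.
sample_a = quantitative_profile({"Bacteroides": 0.5, "Prevotella": 0.5}, 1e11)
sample_b = quantitative_profile({"Bacteroides": 0.05, "Prevotella": 0.95}, 1e12)
```

    This illustrates why a compositional shift seen in relative profiling (Bacteroides "dropping" from 50% to 5%) can be an artefact of load variation rather than a real change in that taxon.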

  12. Assessment of metabolic bone diseases by quantitative computed tomography

    NASA Technical Reports Server (NTRS)

    Richardson, M. L.; Genant, H. K.; Cann, C. E.; Ettinger, B.; Gordan, G. S.; Kolb, F. O.; Reiser, U. J.

    1985-01-01

    Advances in the radiologic sciences have permitted the development of numerous noninvasive techniques for measuring the mineral content of bone, with varying degrees of precision, accuracy, and sensitivity. The techniques of standard radiography, radiogrammetry, photodensitometry, Compton scattering, neutron activation analysis, single and dual photon absorptiometry, and quantitative computed tomography (QCT) are described and reviewed in depth. Results from previous cross-sectional and longitudinal QCT investigations are given. The authors then describe a current investigation in which they studied 269 subjects, including 173 normal women, 34 patients with hyperparathyroidism, 24 patients with steroid-induced osteoporosis, and 38 men with idiopathic osteoporosis. Spinal quantitative computed tomography, radiogrammetry, and single photon absorptiometry were performed, and a spinal fracture index was calculated on all patients. The authors found a disproportionate loss of spinal trabecular mineral compared to appendicular mineral in the men with idiopathic osteoporosis and the patients with steroid-induced osteoporosis. They observed roughly equivalent mineral loss in both the appendicular and axial regions in the hyperparathyroid patients. The appendicular cortical measurements correlated moderately well with each other but less well with spinal trabecular QCT. The spinal fracture index correlated well with QCT and less well with the appendicular measurements. Knowledge of appendicular cortical mineral status is important in its own right but is not a valid predictor of axial trabecular mineral status, which may be disproportionately decreased in certain diseases. Quantitative CT provides a reliable means of assessing the latter region of the skeleton, correlates well with the spinal fracture index (a semiquantitative measurement of end-organ failure), and offers the clinician a sensitive means of following the effects of therapy.

  13. A pyrosequencing assay for the quantitative methylation analysis of the PCDHB gene cluster, the major factor in neuroblastoma methylator phenotype.

    PubMed

    Banelli, Barbara; Brigati, Claudio; Di Vinci, Angela; Casciano, Ida; Forlani, Alessandra; Borzì, Luana; Allemanni, Giorgio; Romani, Massimo

    2012-03-01

    Epigenetic alterations are hallmarks of cancer and powerful biomarkers, whose clinical utilization is made difficult by the absence of standardization and of common methods of data interpretation. The coordinate methylation of many loci in cancer is defined as 'CpG island methylator phenotype' (CIMP) and identifies clinically distinct groups of patients. In neuroblastoma (NB), CIMP is defined by a methylation signature, which includes different loci, but its predictive power on outcome is entirely recapitulated by the PCDHB cluster only. We have developed a robust and cost-effective pyrosequencing-based assay that could facilitate the clinical application of CIMP in NB. This assay permits the unbiased simultaneous amplification and sequencing of 17 out of 19 genes of the PCDHB cluster for quantitative methylation analysis, taking into account all the sequence variations. As some of these variations were at CpG doublets, we bypassed the data interpretation conducted by the methylation analysis software to assign the corrected methylation value at these sites. The final result of the assay is the mean methylation level of 17 gene fragments in the protocadherin B cluster (PCDHB) cluster. We have utilized this assay to compare the methylation levels of the PCDHB cluster between high-risk and very low-risk NB patients, confirming the predictive value of CIMP. Our results demonstrate that the pyrosequencing-based assay herein described is a powerful instrument for the analysis of this gene cluster that may simplify the data comparison between different laboratories and, in perspective, could facilitate its clinical application. Furthermore, our results demonstrate that, in principle, pyrosequencing can be efficiently utilized for the methylation analysis of gene clusters with high internal homologies.

  14. Quantitative Evaluation of Cisplatin Uptake in Sensitive and Resistant Individual Cells by Single-Cell ICP-MS (SC-ICP-MS).

    PubMed

    Corte Rodríguez, M; Álvarez-Fernández García, R; Blanco, E; Bettmer, J; Montes-Bayón, M

    2017-11-07

    One of the main limitations of Pt-based therapy in cancer is the development of drug resistance, which can be accompanied by a significant reduction of the intracellular platinum concentration. Thus, intracellular Pt concentration could be considered a biomarker of cisplatin resistance. In this work, an alternative method to address intracellular Pt concentration in individual cells is explored to permit the evaluation of different cell models and alternative therapies in a relatively fast way. For this aim, total Pt analysis in single cells has been implemented using a total consumption nebulizer coupled to inductively coupled plasma mass spectrometric detection (ICP-MS). The efficiency of the proposed device has been evaluated in combination with flow cytometry and turned out to be around 25% (cells entering the ICP-MS from the cells in suspension). Quantitative uptake studies of a nontoxic Tb-containing compound by individual cells were conducted and the results compared to those obtained by bulk analysis of the same cells. Both sets of data were statistically comparable. Thus, final application of the developed methodology to the comparative uptake of Pt-species in cisplatin resistant and sensitive cell lines (A2780cis and A2780) was conducted. The results obtained revealed the potential of this analytical strategy to differentiate between cell lines of different sensitivity to the drug, which might be of high medical interest.
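
    The ~25% figure quoted above is, in SC-ICP-MS practice, a transport efficiency: cell events detected divided by cells introduced during the acquisition. A minimal sketch with invented numbers (the function name and example values are ours, not the authors'):

```python
# Sketch of the transport-efficiency calculation in single-cell ICP-MS:
# the fraction of cells in suspension that actually reach the plasma.
# All numeric inputs below are invented for illustration.
def transport_efficiency(events_detected, acquisition_time_s,
                         cells_per_ml, uptake_ml_per_min):
    """Cell events detected divided by cells introduced during acquisition."""
    cells_introduced = cells_per_ml * (uptake_ml_per_min / 60.0) * acquisition_time_s
    return events_detected / cells_introduced

# e.g. 500 events in 120 s from a 1e5 cells/mL suspension at 10 uL/min uptake:
eff = transport_efficiency(500, 120, 1e5, 0.01)
```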

  15. Quantitative sampling of conformational heterogeneity of a DNA hairpin using molecular dynamics simulations and ultrafast fluorescence spectroscopy.

    PubMed

    Voltz, Karine; Léonard, Jérémie; Touceda, Patricia Tourón; Conyard, Jamie; Chaker, Ziyad; Dejaegere, Annick; Godet, Julien; Mély, Yves; Haacke, Stefan; Stote, Roland H

    2016-04-20

    Molecular dynamics (MD) simulations and time resolved fluorescence (TRF) spectroscopy were combined to quantitatively describe the conformational landscape of the DNA primary binding sequence (PBS) of the HIV-1 genome, a short hairpin targeted by retroviral nucleocapsid proteins implicated in the viral reverse transcription. Three 2-aminopurine (2AP) labeled PBS constructs were studied. For each variant, the complete distribution of fluorescence lifetimes covering 5 orders of magnitude in timescale was measured and the populations of conformers experimentally observed to undergo static quenching were quantified. A binary quantification permitted the comparison of populations from experimental lifetime amplitudes to populations of aromatically stacked 2AP conformers obtained from simulation. Both populations agreed well, supporting the general assumption that quenching of 2AP fluorescence results from pi-stacking interactions with neighboring nucleobases and demonstrating the success of the proposed methodology for the combined analysis of TRF and MD data. Cluster analysis of the latter further identified predominant conformations that were consistent with the fluorescence decay times and amplitudes, providing a structure-based rationalization for the wide range of fluorescence lifetimes. Finally, the simulations provided evidence of local structural perturbations induced by 2AP. The approach presented is a general tool to investigate fine structural heterogeneity in nucleic acid and nucleoprotein assemblies. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  16. 78 FR 27421 - Endangered and Threatened Wildlife and Plants; Receipt of Application for Incidental Take Permit...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-10

    ... gopher tortoise. In advance of the progression of the mining operations into future phases, quantitative surveys will be conducted for the skinks and gopher tortoises to determine the occupancy and extent of occupancy within these suitable areas. The completion of these surveys will be subject to the guidelines at...

  17. Investigating Adult Language Input and Young Children's Responses in Naturalistic Environments: An Observational Framework

    ERIC Educational Resources Information Center

    Marinac, Julie V.; Woodyatt, Gail C.; Ozanne, Anne E.

    2008-01-01

    This paper reports the design and trial of an original Observational Framework for quantitative investigation of young children's responses to adult language in their typical language learning environments. The Framework permits recording of both the response expectation of the adult utterances, and the degree of compliance in the child's…

  18. SEPARATION OF THORIUM FROM RARE EARTHS WITH TANNIN (in Russian)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vlodavets, N.I.

    1959-03-01

    Thorium is quantitatively precipitated with tannin in a 0.005 N solution of nitric or hydrochloric acid. This permits its separation from trivalent rare earths, which are not precipitated with tannin in such relatively weak acid solutions. The accuracy of the determinations is that usual for gravimetric determinations of macroquantities of elements. (auth)
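
    The final gravimetric step implied here can be illustrated with the usual oxide conversion; the assumption that the tannin precipitate is ignited to ThO2 before weighing is ours, based on standard gravimetric practice rather than this abstract:

```python
# Gravimetric factor for thorium determined by weighing ThO2. The
# ignition-to-ThO2 step is an assumption (standard practice), not stated above.
TH, O = 232.038, 15.999  # atomic masses, g/mol

def thorium_mass_mg(tho2_mass_mg):
    """Convert a weighed mass of ThO2 to elemental Th via Th/ThO2 (~0.8788)."""
    return tho2_mass_mg * TH / (TH + 2 * O)
```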

  19. Teacher Perceptions of the Impact of the Data Team Process on Core Instructional Practices

    ERIC Educational Resources Information Center

    Schwanenberger, Michael; Ahearn, Christopher

    2013-01-01

    This paper documents the results of a mixed method study of teachers who participated in a survey and focus groups in a K-12 southwestern suburban school district during the 2011-2012 school year. The mixed method design contains elements of both qualitative and quantitative approaches, permitting the authors to collect qualitative and…

  20. Quantitation of stress echocardiography by tissue Doppler and strain rate imaging: a dream come true?

    PubMed

    Galderisi, Maurizio; Mele, Donato; Marino, Paolo Nicola

    2005-01-01

    Tissue Doppler (TD) is an ultrasound tool providing a quantitative assessment of left ventricular regional myocardial function in different modalities. Spectral pulsed wave (PW) TD, performed online during the examination, measures instantaneous myocardial velocities. By means of color TD, velocity images are digitally stored for subsequent off-line analysis and mean myocardial velocities are measured. An implementation of color TD includes strain rate imaging (SRI), based on post-processing conversion of regional velocities into local myocardial deformation rate (strain rate) and percent deformation (strain). These three modalities have been applied to stress echocardiography for quantitative evaluation of regional left ventricular function and detection of ischemia and viability. They present advantages and limitations. PWTD does not permit the simultaneous assessment of multiple walls and therefore is not compatible with clinical stress echocardiography, although it could be used in a laboratory setting. Color TD provides a spatial map of velocity throughout the myocardium but its results are strongly affected by the frame rate. Both color TD and PWTD are also influenced by overall cardiac motion and tethering from adjacent segments and require reference velocity values for interpretation of regional left ventricular function. High frame rate (i.e. >150 frames/s) post-processing-derived SRI can potentially overcome these limitations, since measurements of myocardial deformation do not show any significant apex-to-base gradient. Preliminary studies have shown encouraging results about the ability of SRI to detect ischemia and viability, in terms of both strain rate changes and/or evidence of post-systolic thickening. SRI is, however, Doppler-dependent and time-consuming. Further technical refinements are needed to improve its application and introduce new ultrasound modalities to overcome the limitations of the Doppler-derived deformation analysis.

  1. Mapping differential interactomes by affinity purification coupled with data-independent mass spectrometry acquisition.

    PubMed

    Lambert, Jean-Philippe; Ivosev, Gordana; Couzens, Amber L; Larsen, Brett; Taipale, Mikko; Lin, Zhen-Yuan; Zhong, Quan; Lindquist, Susan; Vidal, Marc; Aebersold, Ruedi; Pawson, Tony; Bonner, Ron; Tate, Stephen; Gingras, Anne-Claude

    2013-12-01

    Characterizing changes in protein-protein interactions associated with sequence variants (e.g., disease-associated mutations or splice forms) or following exposure to drugs, growth factors or hormones is critical to understanding how protein complexes are built, localized and regulated. Affinity purification (AP) coupled with mass spectrometry permits the analysis of protein interactions under near-physiological conditions, yet monitoring interaction changes requires the development of a robust and sensitive quantitative approach, especially for large-scale studies in which cost and time are major considerations. We have coupled AP to data-independent mass spectrometric acquisition (sequential window acquisition of all theoretical spectra, SWATH) and implemented an automated data extraction and statistical analysis pipeline to score modulated interactions. We used AP-SWATH to characterize changes in protein-protein interactions imparted by the HSP90 inhibitor NVP-AUY922 or melanoma-associated mutations in the human kinase CDK4. We show that AP-SWATH is a robust label-free approach to characterize such changes and propose a scalable pipeline for systems biology studies.

  2. One-Cell Doubling Evaluation by Living Arrays of Yeast, ODELAY!

    DOE PAGES

    Herricks, Thurston; Dilworth, David J.; Mast, Fred D.; ...

    2016-11-16

    Cell growth is a complex phenotype widely used in systems biology to gauge the impact of genetic and environmental perturbations. Due to the magnitude of genome-wide studies, resolution is often sacrificed in favor of throughput, creating a demand for scalable, time-resolved, quantitative methods of growth assessment. We present ODELAY (One-cell Doubling Evaluation by Living Arrays of Yeast), an automated and scalable growth analysis platform. High measurement density and single-cell resolution provide a powerful tool for large-scale multiparameter growth analysis based on the modeling of microcolony expansion on solid media. Pioneered in yeast but applicable to other colony forming organisms, ODELAY extracts the three key growth parameters (lag time, doubling time, and carrying capacity) that define microcolony expansion from single cells, simultaneously permitting the assessment of population heterogeneity. The utility of ODELAY is illustrated using yeast mutants, revealing a spectrum of phenotypes arising from single and combinatorial growth parameter perturbations.
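
    The three growth parameters named above can be extracted from a single microcolony's log2 area trajectory roughly as follows; this tangent-line sketch is our illustration of the idea, not the ODELAY implementation (which fits a parametric growth model), and the synthetic trajectory shapes are assumed:

```python
import numpy as np

def growth_parameters(t_hours, log2_area):
    """Estimate (lag time, doubling time, carrying capacity) from one
    microcolony's log2 area over time -- an illustrative sketch only."""
    slopes = np.gradient(log2_area, t_hours)   # doublings per hour at each point
    i = int(np.argmax(slopes))                 # steepest (exponential) phase
    doubling_time = 1.0 / slopes[i]            # hours per doubling
    # Lag: where the max-slope tangent line crosses the initial log2 area.
    lag = t_hours[i] - (log2_area[i] - log2_area[0]) / slopes[i]
    # Carrying capacity: total doublings achieved at the plateau.
    carrying_capacity = log2_area[-1] - log2_area[0]
    return lag, doubling_time, carrying_capacity
```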

  3. One-Cell Doubling Evaluation by Living Arrays of Yeast, ODELAY!

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herricks, Thurston; Dilworth, David J.; Mast, Fred D.

    Cell growth is a complex phenotype widely used in systems biology to gauge the impact of genetic and environmental perturbations. Due to the magnitude of genome-wide studies, resolution is often sacrificed in favor of throughput, creating a demand for scalable, time-resolved, quantitative methods of growth assessment. We present ODELAY (One-cell Doubling Evaluation by Living Arrays of Yeast), an automated and scalable growth analysis platform. High measurement density and single-cell resolution provide a powerful tool for large-scale multiparameter growth analysis based on the modeling of microcolony expansion on solid media. Pioneered in yeast but applicable to other colony forming organisms, ODELAY extracts the three key growth parameters (lag time, doubling time, and carrying capacity) that define microcolony expansion from single cells, simultaneously permitting the assessment of population heterogeneity. The utility of ODELAY is illustrated using yeast mutants, revealing a spectrum of phenotypes arising from single and combinatorial growth parameter perturbations.

  4. [Methods of the multivariate statistical analysis of so-called polyetiological diseases using the example of coronary heart disease].

    PubMed

    Lifshits, A M

    1979-01-01

    General characteristics of multivariate statistical analysis (MSA) are given. Methodical premises and criteria for the selection of an adequate MSA method applicable to pathoanatomic investigations of the epidemiology of multicausal diseases are presented. The experience of using MSA with computers and standard computing programs in studies of coronary artery atherosclerosis, based on material from 2060 autopsies, is described. The combined use of four MSA methods (sequential, correlational, regressional, and discriminant) made it possible to quantitate the contribution of each of the 8 examined risk factors to the development of atherosclerosis. The most important factors were found to be age, arterial hypertension, and heredity. Occupational hypodynamia and increased fatness were more important in men, whereas diabetes mellitus was more important in women. Accounting for this combination of risk factors by MSA methods provides a more reliable prognosis of the likelihood of fatal coronary heart disease than does the degree of coronary atherosclerosis alone.
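
    One of the MSA methods mentioned, discriminant analysis, can be sketched in modern form as a two-class Fisher discriminant over risk-factor vectors; the synthetic data and three-variable setup below are our illustration, not the 1979 procedure:

```python
import numpy as np

def fisher_discriminant(X0, X1):
    """Fisher's linear discriminant for two classes (e.g. fatal-outcome vs
    other cases, each row a vector of risk-factor values). Returns the
    weight vector and midpoint threshold: classify as class 1 if w @ x > t."""
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    # Pooled within-class scatter matrix
    Sw = (np.cov(X0, rowvar=False) * (len(X0) - 1)
          + np.cov(X1, rowvar=False) * (len(X1) - 1))
    w = np.linalg.solve(Sw, mu1 - mu0)
    threshold = w @ (mu0 + mu1) / 2.0
    return w, threshold

# Synthetic demo: two well-separated groups in three "risk factor" dimensions.
rng = np.random.default_rng(7)
controls = rng.normal(0.0, 1.0, size=(200, 3))
cases = rng.normal(2.0, 1.0, size=(200, 3))
w, threshold = fisher_discriminant(controls, cases)
```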

  5. Evaluation of amino-acid racemization/epimerization dating using radiocarbon-dated fossil land snails

    NASA Astrophysics Data System (ADS)

    Goodfriend, Glenn A.

    1987-08-01

    The relation between age and amino-acid epimer ratios (alloisoleucine/isoleucine, A/I) of Holocene land snails was quantitatively evaluated through 14C and amino-acid analysis of 33 samples from fluvial and colluvial sediments and rodent middens in the Northern Negev Desert of Israel. A/I is strongly correlated with 14C ages in fluvial and rodent midden deposits (r = 0.95 and 0.94, respectively), permitting age estimates from A/I ratios with precisions of ±700 and ±660 yr. The correlation is weaker in colluvial deposits (r = 0.74), and age estimates from A/I ratios are correspondingly less precise (±1580 yr). This probably results from delayed burial, which exposes the shells to intense radiation on the desert surface. Because of the generally strong relation between age and A/I, amino-acid epimerization analysis of individual shells can be used to identify mixed-age deposits and to reconstruct species chronologies from mixed-age deposits.
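
    The calibration described, regressing 14C age on A/I and quoting a precision from the residual scatter, can be sketched as below; the linear form and the numbers in the example are illustrative assumptions (epimerization kinetics are generally nonlinear over longer timescales):

```python
import numpy as np

def calibrate_aminostratigraphy(ai_ratios, c14_ages_yr):
    """Fit age as a linear function of A/I; return an age predictor and the
    residual standard deviation (analogous to the quoted +/-700 yr precision).
    A linear model is an illustrative assumption, not the paper's fit."""
    slope, intercept = np.polyfit(ai_ratios, c14_ages_yr, 1)
    residuals = np.asarray(c14_ages_yr) - (slope * np.asarray(ai_ratios) + intercept)
    precision = residuals.std(ddof=2)  # two parameters estimated
    return (lambda ai: slope * ai + intercept), precision
```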

  6. Systematic review and meta-analysis: tools for the information age.

    PubMed

    Weatherall, Mark

    2017-11-01

    The amount of available biomedical information is vast and growing. Natural limitations of the way clinicians and researchers approach this treasure trove of information comprise difficulties locating the information, and once located, cognitive biases may lead to inappropriate use of the information. Systematic reviews and meta-analyses represent important tools in the information age to improve knowledge and action. Systematic reviews represent a census approach to identifying literature to avoid non-response bias. They are a necessary prelude to producing combined quantitative summaries of associations or treatment effects. Meta-analysis comprises the arithmetical techniques for producing combined summaries from individual study reports. Careful, thoughtful and rigorous use of these tools is likely to enhance knowledge and action. Use of standard guidelines, such as the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, or embedding these activities within collaborative groups such as the Cochrane Collaboration, are likely to lead to more useful systematic review and meta-analysis reporting. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  7. A Method for Identification and Analysis of Non-Overlapping Myeloid Immunophenotypes in Humans

    PubMed Central

    Gustafson, Michael P.; Lin, Yi; Maas, Mary L.; Van Keulen, Virginia P.; Johnston, Patrick B.; Peikert, Tobias; Gastineau, Dennis A.; Dietz, Allan B.

    2015-01-01

    The development of flow cytometric biomarkers in human studies and clinical trials has been slowed by inconsistent sample processing, use of cell surface markers, and reporting of immunophenotypes. Additionally, the function(s) of distinct cell types as biomarkers cannot be accurately defined without the proper identification of homogeneous populations. As such, we developed a method for the identification and analysis of human leukocyte populations by the use of eight 10-color flow cytometric protocols in combination with novel software analyses. This method utilizes un-manipulated biological sample preparation that allows for the direct quantitation of leukocytes and non-overlapping immunophenotypes. We specifically designed myeloid protocols that enable us to define distinct phenotypes that include mature monocytes, granulocytes, circulating dendritic cells, immature myeloid cells, and myeloid derived suppressor cells (MDSCs). We also identified CD123 as an additional distinguishing marker for the phenotypic characterization of immature LIN-CD33+HLA-DR- MDSCs. Our approach permits the comprehensive analysis of all peripheral blood leukocytes and yields data that are highly amenable to standardization across inter-laboratory comparisons for human studies. PMID:25799053

  8. Element analysis: a wavelet-based method for analysing time-localized events in noisy time series.

    PubMed

    Lilly, Jonathan M

    2017-04-01

    A method is derived for the quantitative analysis of signals that are composed of superpositions of isolated, time-localized 'events'. Here, these events are taken to be well represented as rescaled and phase-rotated versions of generalized Morse wavelets, a broad family of continuous analytic functions. Analysing a signal composed of replicates of such a function using another Morse wavelet allows one to directly estimate the properties of events from the values of the wavelet transform at its own maxima. The distribution of events in general power-law noise is determined in order to establish significance based on an expected false detection rate. Finally, an expression for an event's 'region of influence' within the wavelet transform permits the formation of a criterion for rejecting spurious maxima due to numerical artefacts or other unsuitable events. Signals can then be reconstructed based on a small number of isolated points on the time/scale plane. This method, termed element analysis, is applied to the identification of long-lived eddy structures in ocean currents as observed by along-track measurements of sea surface elevation from satellite altimetry.
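
    The maxima-finding step at the heart of this approach can be sketched as follows; we substitute a simple real Morlet-style wavelet for the generalized Morse family of the paper, and the test signal is an invented isolated Gaussian "event":

```python
import numpy as np

def wavelet_event_maxima(signal, scales):
    """Return (index, scale, magnitude) for local maxima of |CWT| at each scale.
    A real Morlet-style wavelet stands in for the generalized Morse wavelets
    used in element analysis proper."""
    maxima = []
    for s in scales:
        t = np.arange(-4 * s, 4 * s + 1)
        wavelet = np.exp(-0.5 * (t / s) ** 2) * np.cos(5.0 * t / s)
        w = np.abs(np.convolve(signal, wavelet / np.sqrt(s), mode="same"))
        is_peak = (w[1:-1] > w[:-2]) & (w[1:-1] > w[2:])  # strict local maxima
        for idx in np.where(is_peak)[0] + 1:
            maxima.append((idx, s, w[idx]))
    return maxima

# An isolated Gaussian 'event' at index 128: its strongest transform maximum
# should land near that location on the time/scale plane.
x = np.exp(-0.5 * ((np.arange(256) - 128) / 5.0) ** 2)
events = wavelet_event_maxima(x, scales=[3, 5, 8])
best = max(events, key=lambda e: e[2])
```

    In the full method these maxima would then be screened against a noise-derived false-detection threshold and each event's region of influence, which this sketch omits.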

  9. Quantitative interaction analysis permits molecular insights into functional NOX4 NADPH oxidase heterodimer assembly.

    PubMed

    O'Neill, Sharon; Mathis, Magalie; Kovačič, Lidija; Zhang, Suisheng; Reinhardt, Jürgen; Scholz, Dimitri; Schopfer, Ulrich; Bouhelal, Rochdi; Knaus, Ulla G

    2018-06-08

    Protein-protein interactions critically regulate many biological systems, but quantifying functional assembly of multipass membrane complexes in their native context is still challenging. Here, we combined modeling-assisted protein modification and information from human disease variants with a minimal-size fusion tag, split-luciferase-based approach to probe assembly of the NADPH oxidase 4 (NOX4)-p22 phox enzyme, an integral membrane complex with unresolved structure, which is required for electron transfer and generation of reactive oxygen species (ROS). Integrated analyses of heterodimerization, trafficking, and catalytic activity identified determinants for the NOX4-p22 phox interaction, such as heme incorporation into NOX4 and hot spot residues in transmembrane domains 1 and 4 in p22 phox. Moreover, their effect on NOX4 maturation and ROS generation was analyzed. We propose that this reversible and quantitative protein-protein interaction technique with its small split-fragment approach will provide a protein engineering and discovery tool not only for NOX research, but also for other intricate membrane protein complexes, and may thereby facilitate new drug discovery strategies for managing NOX-associated diseases. © 2018 by The American Society for Biochemistry and Molecular Biology, Inc.

  10. Nitrate Waste Treatment Sampling and Analysis Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vigil-Holterman, Luciana R.; Martinez, Patrick Thomas; Garcia, Terrence Kerwin

    2017-07-05

    This plan is designed to outline the collection and analysis of nitrate salt-bearing waste samples required by the New Mexico Environment Department- Hazardous Waste Bureau in the Los Alamos National Laboratory (LANL) Hazardous Waste Facility Permit (Permit).

  11. Mission-oriented requirements for updating MIL-H-8501: Calspan proposed structure and rationale

    NASA Technical Reports Server (NTRS)

    Chalk, C. R.; Radford, R. C.

    1985-01-01

    This report documents the effort by Arvin/Calspan Corporation to formulate a revision of MIL-H-8501A in terms of Mission-Oriented Flying Qualities Requirements for Military Rotorcraft. Emphasis is placed on development of a specification structure which will permit addressing Operational Missions and Flight Phases, Flight Regions, Classification of Required Operational Capability, Categorization of Flight Phases, and Levels of Flying Qualities. A number of definitions are established to permit addressing the rotorcraft state, flight envelopes, environments, and the conditions under which degraded flying qualities are permitted. Tentative requirements are drafted for Required Operational Capability Class 1. Also included is a Background Information and Users Guide for the draft specification structure proposed for the MIL-H-8501A revision. The report also contains a discussion of critical data gaps and attempts to prioritize these data gaps and to suggest experiments that should be performed to generate data needed to support formulation of quantitative design criteria for the additional Operational Capability Classes 2, 3, and 4.

  12. Quantitative Species Measurements In Microgravity Combustion Flames

    NASA Technical Reports Server (NTRS)

    Chen, Shin-Juh; Pilgrim, Jeffrey S.; Silver, Joel A.; Piltch, Nancy D.

    2003-01-01

    The capability of models and theories to accurately predict and describe the behavior of low gravity flames can only be verified by quantitative measurements. Although video imaging, simple temperature measurements, and velocimetry methods have provided useful information in many cases, there is still a need for quantitative species measurements. Over the past decade, we have been developing high sensitivity optical absorption techniques to permit in situ, non-intrusive, absolute concentration measurements for both major and minor flames species using diode lasers. This work has helped to establish wavelength modulation spectroscopy (WMS) as an important method for species detection within the restrictions of microgravity-based measurements. More recently, in collaboration with Prof. Dahm at the University of Michigan, a new methodology combining computed flame libraries with a single experimental measurement has allowed us to determine the concentration profiles for all species in a flame. This method, termed ITAC (Iterative Temperature with Assumed Chemistry) was demonstrated for a simple laminar nonpremixed methane-air flame at both 1-g and at 0-g in a vortex ring flame. In this paper, we report additional normal and microgravity experiments which further confirm the usefulness of this approach. We also present the development of a new type of laser. This is an external cavity diode laser (ECDL) which has the unique capability of high frequency modulation as well as a very wide tuning range. This will permit the detection of multiple species with one laser while using WMS detection.

  13. Army Response Letter & Analysis - signed January 19, 2001

    EPA Pesticide Factsheets

    A response to the letter, which requested a review of the proposed decision by the Army Corps of Engineers Baltimore District to issue four Department of the Army permits to Baltimore County (3 permits) and Anne Arundel County (1 permit), MD.

  14. NDE in aerospace-requirements for science, sensors and sense.

    PubMed

    Heyman, J S

    1989-01-01

    The complexity of modern NDE (nondestructive evaluation) arises from four main factors: quantitative measurement science, physical models for computational analysis, realistic interfacing with engineering decisions, and direct access to management priorities. Recent advances in the four factors of NDE are addressed. Physical models of acoustic propagation are presented that have led to the development of measurement technologies advancing the ability to assure that materials and structures will perform as designed. In addition, a brief discussion is given of current research for future mission needs such as smart structures that sense their own health. Such advances permit projects to integrate design for inspection into their plans, bringing NDE into engineering and management priorities. The measurement focus is on ultrasonics with generous case examples. Problem solutions highlighted include critical stress in fasteners, residual stress in steel, NDE laminography, and solid rocket motor NDE.

  15. Characterizing user requirements for future land observing satellites

    NASA Technical Reports Server (NTRS)

    Barker, J. L.; Cressy, P. J.; Schnetzler, C. C.; Salomonson, V. V.

    1981-01-01

    An objective procedure was developed for identifying probable sensor and mission characteristics for an operational satellite land observing system. Requirements were systematically compiled, quantified and scored by type of use, from surveys of federal, state, local and private communities. Incremental percent increases in expected value of data were estimated for critical system improvements. Comparisons with costs permitted selection of a probable sensor system, from a set of 11 options, with the following characteristics: 30 meter spatial resolution in 5 bands and 15 meters in 1 band, spectral bands nominally at Thematic Mapper (TM) bands 1 through 6 positions, and 2-day data turnaround for receipt of imagery. Improvements are suggested for both the form of questions and the procedures for analysis of future surveys in order to provide a more quantitatively precise definition of sensor and mission requirements.

  16. Magnetic Propulsion of Microswimmers with DNA-Based Flagellar Bundles.

    PubMed

    Maier, Alexander M; Weig, Cornelius; Oswald, Peter; Frey, Erwin; Fischer, Peer; Liedl, Tim

    2016-02-10

    We show that DNA-based self-assembly can serve as a general and flexible tool to construct artificial flagella of several micrometers in length and only tens of nanometers in diameter. By attaching the DNA flagella to biocompatible magnetic microparticles, we provide a proof of concept demonstration of hybrid structures that, when rotated in an external magnetic field, propel by means of a flagellar bundle, similar to self-propelling peritrichous bacteria. Our theoretical analysis predicts that flagellar bundles that possess a length-dependent bending stiffness should exhibit a superior swimming speed compared to swimmers with a single appendage. The DNA self-assembly method permits the realization of these improved flagellar bundles in good agreement with our quantitative model. DNA flagella with well-controlled shape could fundamentally increase the functionality of fully biocompatible nanorobots and extend the scope and complexity of active materials.

  17. Use of keyword hierarchies to interpret gene expression patterns.

    PubMed

    Masys, D R; Welsh, J B; Lynn Fink, J; Gribskov, M; Klacansky, I; Corbeil, J

    2001-04-01

    High-density microarray technology permits the quantitative and simultaneous monitoring of thousands of genes. The interpretation challenge is to extract relevant information from this large amount of data. A growing variety of statistical analysis approaches are available to identify clusters of genes that share common expression characteristics, but provide no information regarding the biological similarities of genes within clusters. The published literature provides a potential source of information to assist in interpretation of clustering results. We describe a data mining method that uses indexing terms ('keywords') from the published literature linked to specific genes to present a view of the conceptual similarity of genes within a cluster or group of interest. The method takes advantage of the hierarchical nature of Medical Subject Headings used to index citations in the MEDLINE database, and the registry numbers applied to enzymes.
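The keyword-linkage idea can be sketched as follows: given the set of indexing terms attached to each gene, keywords shared by most genes in a cluster summarize its conceptual coherence. The gene-to-keyword mapping below is hypothetical, standing in for MeSH terms mined from MEDLINE citations:

```python
from collections import Counter

# Hypothetical gene -> keyword sets; in the method described, these
# would come from MeSH indexing of MEDLINE citations linked to each gene.
gene_keywords = {
    "GENE_A": {"Apoptosis", "Cell Cycle", "Phosphorylation"},
    "GENE_B": {"Apoptosis", "Cell Cycle"},
    "GENE_C": {"Apoptosis", "Signal Transduction"},
}

def shared_keywords(cluster, mapping, min_fraction=0.5):
    """Return keywords annotated to at least `min_fraction` of the genes
    in a cluster, as a rough view of the cluster's biological theme."""
    counts = Counter(kw for gene in cluster for kw in mapping.get(gene, ()))
    threshold = min_fraction * len(cluster)
    return {kw for kw, c in counts.items() if c >= threshold}

common = shared_keywords(["GENE_A", "GENE_B", "GENE_C"], gene_keywords)
```

The MeSH hierarchy the abstract mentions would additionally let near-synonymous terms be rolled up to a shared parent heading before counting; that step is omitted here.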

  18. NDE in aerospace - Requirements for science, sensors and sense

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S.

    1989-01-01

    The complexity of modern nondestructive evaluation (NDE) arises from four main factors: quantitative measurement science, physical models for computational analysis, realistic interfacing with engineering decisions, and direct access to management priorities. Recent advances in the four factors of NDE are addressed. Physical models of acoustic propagation are presented that have led to the development of measurement technologies advancing the ability to assure that materials and structures will perform as designed. In addition, a brief discussion is given of current research for future mission needs such as smart structures that sense their own health. Such advances permit projects to integrate design for inspection into their plans, bringing NDE into engineering and management priorities. The measurement focus is on ultrasonics with generous case examples. Problem solutions highlighted include critical stress in fasteners, residual stress in steel, NDE laminography, and solid rocket motor NDE.

  19. Characterisation and quantification of phenolic compounds of extra-virgin olive oils according to their geographical origin by a rapid and resolutive LC-ESI-TOF MS method.

    PubMed

    Ouni, Youssef; Taamalli, Ameni; Gómez-Caravaca, Ana Maria; Segura-Carretero, Antonio; Fernández-Gutiérrez, Alberto; Zarrouk, Mokhtar

    2011-08-01

    The phenolic compounds present in seven samples of olive fruits were analysed by a rapid and resolutive LC-ESI-TOF MS method. All samples were collected during the normal picking period for olive oil production, in central and south Tunisia, and were obtained from the Oueslati variety cultivated in different olive growing areas. In the Tunisian samples, 22 compounds were characterised by LC-ESI-TOF MS analysis. Results showed no qualitative differences in the phenolic fractions between virgin olive oils from different geographical regions. However, significant quantitative differences were observed in a large number of phenolic compounds. These results permit the phenolic fraction to be used as an indicator of geographical origin. Copyright © 2011 Elsevier Ltd. All rights reserved.
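A minimal sketch of how quantitative phenolic profiles could serve as an origin indicator, using a nearest-centroid rule. All compound concentrations and centroids are invented for illustration and are not taken from the study:

```python
# Hypothetical per-region mean phenolic profiles (arbitrary units for
# three compounds, e.g. an oleuropein derivative, tyrosol, luteolin).
centroids = {
    "central": (120.0, 35.0, 8.0),
    "south":   (60.0, 55.0, 15.0),
}

def classify_origin(profile):
    """Assign a sample to the region whose mean profile is nearest
    in squared Euclidean distance."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda region: dist2(profile, centroids[region]))

region = classify_origin((115.0, 38.0, 9.0))   # close to the "central" centroid
```

A real chemometric treatment would normalize the variables and validate the classifier on held-out samples; the sketch only shows the distance rule.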

  20. Systems biology and clinical cytomics: The 10th Leipziger Workshop and the 3rd International Workshop on Slide-Based Cytometry, Leipzig, Germany, April 2005.

    PubMed

    Tárnok, Attila; Valet, Günther K; Emmrich, Frank

    2006-01-01

    Despite very significant technical and software improvements in flow cytometry (FCM) since the 1980s, the demand became evident for a cytometric technology combining both quantitative cell analysis and morphological documentation in Cytomics. Improvements in microtechnology and computing nowadays permit similarly quantitative and stoichiometric single cell-based high-throughput analyses by microscopic instruments, like Slide-Based Cytometry (SBC). SBC and related techniques offer unique tools to perform complex immunophenotyping, thereby enabling diagnostic procedures during early disease stages. Multicolor or polychromatic analysis of cells by SBC is of special importance not only as a cytomics technology platform but also because of the low quantities of reagents and biological material required. The exact knowledge of the location of each cell on the slide permits repetitive restaining and reanalysis of specimens. Various separate measurements of the same specimen can ultimately be fused into one database, increasing the information obtained per cell. Relocation and optical evaluation of cells, a typical SBC feature, can be of integral importance for cytometric analysis, since artifacts can be excluded and the morphology of measured cells can be documented. Progress in cell analytics: in SBC, new horizons are opened by new techniques of structural and functional analysis at high resolution, ranging from the intracellular and membrane level (confocal microscopy, nanoscopy, total internal reflection fluorescence microscopy (TIRFM)) and the tissue level (tissomics) to the organ and organism level (in vivo cytometry, optical whole-body imaging). Predictive medicine aims at the detection of changes in a patient's state prior to the manifestation of disease or complications. Such instances concern immune consequences of surgery, noninfectious posttraumatic shock in intensive care patients, or the pretherapeutic identification of high-risk patients in cancer cytostatic therapy. Preventive anti-infectious or anti-shock therapy, as well as curative chemotherapy in combination with stem cell transplantation, may provide better survival chances for patients at concomitant cost containment. Predictive medicine-guided optimization of therapy could lead to individualized medicine that gives significant therapeutic effect and may lower or abrogate potential therapeutic side effects. The 10th Leipziger Workshop, combined with the 3rd International Workshop on SBC, aimed to offer new methods in Image- and Slide-Based Cytometry for solutions in clinical research. It moved towards practical applications in the clinic and the clinical laboratory. This development will be continued in 2006 at the upcoming Leipziger Workshop and the International Workshop on Slide-Based Cytometry.

  1. Performance analysis of improved methodology for incorporation of spatial/spectral variability in synthetic hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Scanlan, Neil W.; Schott, John R.; Brown, Scott D.

    2004-01-01

    Synthetic imagery has traditionally been used to support sensor design by enabling design engineers to pre-evaluate image products during the design and development stages. Increasingly, exploitation analysts are looking to synthetic imagery as a way to develop and test exploitation algorithms before image data are available from new sensors. Even when sensors are available, synthetic imagery can significantly aid in algorithm development by providing a wide range of "ground truthed" images with varying illumination, atmospheric, viewing and scene conditions. One limitation of synthetic data is that the background variability is often too bland. It does not exhibit the spatial and spectral variability present in real data. In this work, four fundamentally different texture modeling algorithms will first be implemented as necessary into the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model environment. Two of the models to be tested are variants of a statistical Z-Score selection model, while the remaining two involve a texture synthesis and a spectral end-member fractional abundance map approach, respectively. A detailed comparative performance analysis of each model will then be carried out on several texturally significant regions of the resultant synthetic hyperspectral imagery. The quantitative assessment of each model will utilize a set of three performance metrics that have been derived from spatial Gray Level Co-Occurrence Matrix (GLCM) analysis, hyperspectral Signal-to-Clutter Ratio (SCR) measures, and a new concept termed the Spectral Co-Occurrence Matrix (SCM) metric which permits the simultaneous measurement of spatial and spectral texture. Previous research efforts on the validation and performance analysis of texture characterization models have been largely qualitative in nature, based on conducting visual inspections of synthetic textures in order to judge the degree of similarity to the original sample texture imagery.
The quantitative measures used in this study will in combination attempt to determine which texture characterization models best capture the correct statistical and radiometric attributes of the corresponding real image textures in both the spatial and spectral domains. The motivation for this work is to refine our understanding of the complexities of texture phenomena so that an optimal texture characterization model that can accurately account for these complexities can be eventually implemented into a synthetic image generation (SIG) model. Further, conclusions will be drawn regarding which of the candidate texture models are able to achieve realistic levels of spatial and spectral clutter, thereby permitting more effective and robust testing of hyperspectral algorithms in synthetic imagery.
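The GLCM-based texture measurement mentioned above can be sketched in a few lines, here computing the standard co-occurrence contrast statistic on a toy image (this is the generic GLCM definition, not the paper's specific metric set):

```python
import numpy as np

def glcm(image, dx=1, dy=0, levels=8):
    """Gray Level Co-Occurrence Matrix for a single pixel offset
    (dx, dy), normalized to a joint probability. `image` must hold
    integer gray levels in [0, levels)."""
    m = np.zeros((levels, levels))
    h, w = image.shape
    for y in range(h - dy):
        for x in range(w - dx):
            m[image[y, x], image[y + dy, x + dx]] += 1
    return m / m.sum()

def contrast(p):
    """GLCM contrast: sum_ij (i - j)^2 p(i, j); larger for rougher texture."""
    i, j = np.indices(p.shape)
    return float(((i - j) ** 2 * p).sum())

# Toy 4-level image of four homogeneous quadrants; only pairs that
# straddle a quadrant edge contribute to the contrast.
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [2, 2, 3, 3],
                [2, 2, 3, 3]])
c = contrast(glcm(img, levels=4))
```

Production code would accumulate several offsets and angles and add further statistics (energy, homogeneity, correlation); the single-offset case shows the core bookkeeping.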

  2. Performance analysis of improved methodology for incorporation of spatial/spectral variability in synthetic hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Scanlan, Neil W.; Schott, John R.; Brown, Scott D.

    2003-12-01

    Synthetic imagery has traditionally been used to support sensor design by enabling design engineers to pre-evaluate image products during the design and development stages. Increasingly, exploitation analysts are looking to synthetic imagery as a way to develop and test exploitation algorithms before image data are available from new sensors. Even when sensors are available, synthetic imagery can significantly aid in algorithm development by providing a wide range of "ground truthed" images with varying illumination, atmospheric, viewing and scene conditions. One limitation of synthetic data is that the background variability is often too bland. It does not exhibit the spatial and spectral variability present in real data. In this work, four fundamentally different texture modeling algorithms will first be implemented as necessary into the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model environment. Two of the models to be tested are variants of a statistical Z-Score selection model, while the remaining two involve a texture synthesis and a spectral end-member fractional abundance map approach, respectively. A detailed comparative performance analysis of each model will then be carried out on several texturally significant regions of the resultant synthetic hyperspectral imagery. The quantitative assessment of each model will utilize a set of three performance metrics that have been derived from spatial Gray Level Co-Occurrence Matrix (GLCM) analysis, hyperspectral Signal-to-Clutter Ratio (SCR) measures, and a new concept termed the Spectral Co-Occurrence Matrix (SCM) metric which permits the simultaneous measurement of spatial and spectral texture. Previous research efforts on the validation and performance analysis of texture characterization models have been largely qualitative in nature, based on conducting visual inspections of synthetic textures in order to judge the degree of similarity to the original sample texture imagery.
The quantitative measures used in this study will in combination attempt to determine which texture characterization models best capture the correct statistical and radiometric attributes of the corresponding real image textures in both the spatial and spectral domains. The motivation for this work is to refine our understanding of the complexities of texture phenomena so that an optimal texture characterization model that can accurately account for these complexities can be eventually implemented into a synthetic image generation (SIG) model. Further, conclusions will be drawn regarding which of the candidate texture models are able to achieve realistic levels of spatial and spectral clutter, thereby permitting more effective and robust testing of hyperspectral algorithms in synthetic imagery.

  3. MATERIALS SUPPORTING THE NEW RECREATIONAL ...

    EPA Pesticide Factsheets

    EPA is developing new, rapid methods for monitoring water quality at beaches to determine adequacy of water quality for swimming. The methods being developed rely upon quantitative polymerase chain reaction technology. They will permit real time decisions regarding beach closures. The methods are supported by a series of epidemiology studies evaluating the rate of GI illness resulting from swimming events. Implementation of BEACH Act amendments.

  4. Quantitative scintigraphy in diagnosis and management of plantar fasciitis (Calcaneal periostitis): concise communication

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sewell, J.R.; Black, C.M.; Chapman, A.H.

    1980-07-01

    We have found that Tc-99m methylene diphosphonate imaging of the heel is of diagnostic value in the painful heel syndrome, permitting positive identification of the site of inflammation in cases where radiography is unhelpful. With this technique, tracer uptake in the heel is susceptible to quantification, allowing a serial and objective assessment of response to therapy.
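The kind of serial, objective quantification described can be sketched as a simple region-of-interest uptake ratio: mean counts per pixel in the heel lesion relative to a reference region, compared across scans. Counts and region sizes below are hypothetical:

```python
def uptake_ratio(roi_counts, roi_pixels, ref_counts, ref_pixels):
    """Mean counts per pixel in the lesion region of interest divided
    by the same quantity in a reference region, so scans acquired at
    different times or doses can be compared."""
    return (roi_counts / roi_pixels) / (ref_counts / ref_pixels)

# Hypothetical serial study of one patient:
baseline = uptake_ratio(9000, 150, 4000, 200)   # before therapy
followup = uptake_ratio(5400, 150, 4000, 200)   # after therapy
improved = followup < baseline                  # uptake declining toward normal
```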

  5. 76 FR 36608 - Self-Regulatory Organizations; NYSE Arca, Inc.; Order Granting Approval of Proposed Rule Change...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-22

    .... This model will enable the Sub-Adviser to evaluate, rank, and select the appropriate mix of investments... becoming, risky. The Sub-Adviser will use a quantitative metric to rank and select the appropriate mix of... imposes a duty of due diligence on its Equity Trading Permit Holders to learn the essential facts relating...

  6. From crystal chemistry to colloid stability

    NASA Astrophysics Data System (ADS)

    Gilbert, B.; Burrows, N.; Penn, R. L.

    2008-12-01

    Aqueous suspensions of ferrihydrite nanoparticles form a colloid with properties that can be understood using classical theories but which additionally exhibit the distinctive phenomenon of nanocluster formation. While the use of in situ light and x-ray scattering methods permits the quantitative determination of colloid stability, interparticle interactions, and cluster or aggregate geometry, there are currently few approaches to predict the colloidal behavior of mineral nanoparticles. A longstanding goal of aqueous geochemistry is the rationalization and prediction of the chemical properties of hydrated mineral interfaces from knowledge of interface structure at the molecular scale. Because interfacial acid-base reactions typically lead to the formation of a net electrostatic charge at the surfaces of oxide, hydroxide, and oxyhydroxide mineral surfaces, quantitative descriptions of this behavior have the potential to permit the prediction of long-range interactions between mineral particles. We will evaluate the feasibility of this effort by constructing a model for surface charge formation for ferrihydrite that combines recent insights into the crystal structure of this phase and proposed methods for estimating the pKa of acidic surface groups. We will test the ability of this model to predict the colloidal stability of ferrihydrite suspensions as a function of solution chemistry.
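A minimal sketch, under strong simplifying assumptions, of predicting pH-dependent surface charge from surface-group pKa values via the Henderson-Hasselbalch relation. Site densities, pKa values, and charges below are illustrative, not ferrihydrite data, and real surface complexation models also account for electrostatic feedback:

```python
def site_charge_fraction(pH, pKa):
    """Fraction of an acidic surface site that is deprotonated,
    from the Henderson-Hasselbalch relation."""
    return 1.0 / (1.0 + 10 ** (pKa - pH))

def net_surface_charge(pH, sites):
    """Net charge per nm^2 from a list of
    (density per nm^2, pKa, charge deprotonated, charge protonated)."""
    total = 0.0
    for density, pKa, q_deprot, q_prot in sites:
        f = site_charge_fraction(pH, pKa)
        total += density * (f * q_deprot + (1.0 - f) * q_prot)
    return total

# Two hypothetical site types: an acidic -OH(-1/2) group and a more
# basic -OH2(+1/2) group, with made-up densities and pKa values.
sites = [(3.0, 6.0, -0.5, 0.0),
         (2.0, 9.0, 0.0, 0.5)]
charge_acidic = net_surface_charge(4.0, sites)    # below both pKa: net positive
charge_basic = net_surface_charge(11.0, sites)    # above both pKa: net negative
```

The pH at which the net charge crosses zero plays the role of the point of zero charge, the key quantity for predicting aggregation versus stability.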

  7. Quantitative Multispectral Analysis Of Discrete Subcellular Particles By Digital Imaging Fluorescence Microscopy (DIFM)

    NASA Astrophysics Data System (ADS)

    Dorey, C. K.; Ebenstein, David B.

    1988-10-01

    Subcellular localization of multiple biochemical markers is readily achieved through their characteristic autofluorescence or through use of appropriately labelled antibodies. Recent development of specific probes has permitted elegant studies of calcium and pH in living cells. However, each of these methods measured fluorescence at one wavelength; precise quantitation of multiple fluorophores at individual sites within a cell has not been possible. Using DIFM, we have achieved spectral analysis of discrete subcellular particles 1-2 μm in diameter. The fluorescence emission is broken into narrow bands by an interference monochromator and visualized through the combined use of a silicon intensified target (SIT) camera, a microcomputer based framegrabber with 8 bit resolution, and a color video monitor. Image acquisition, processing, analysis and display are under software control. The digitized image can be corrected for the spectral distortions induced by the wavelength dependent sensitivity of the camera, and the displayed image can be enhanced or presented in pseudocolor to facilitate discrimination of variation in pixel intensity of individual particles. For rapid comparison of the fluorophore composition of granules, a ratio image is produced by dividing the image captured at one wavelength by that captured at another. In the resultant ratio image, a granule which has a fluorophore composition different from the majority is selectively colored. This powerful system has been utilized to obtain spectra of endogenous autofluorescent compounds in discrete cellular organelles of human retinal pigment epithelium, and to measure immunohistochemically labelled components of the extracellular matrix associated with the human optic nerve.
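The ratio-image step can be sketched in a few lines; the two tiny synthetic "wavelength" images below stand in for frames captured through the interference monochromator, and the 2x-median outlier rule is illustrative:

```python
import numpy as np

# Synthetic intensity frames at two emission wavelengths; the
# bottom-right "granule" has a different fluorophore mix.
wl1 = np.array([[10.0, 10.0],
                [10.0, 50.0]])   # intensity at wavelength 1
wl2 = np.array([[10.0, 10.0],
                [10.0, 10.0]])   # intensity at wavelength 2

ratio = wl1 / np.maximum(wl2, 1e-6)        # guard against division by zero
outliers = ratio > 2.0 * np.median(ratio)  # granules with unusual composition
```

In the instrument described, pixels flagged this way would be rendered in a distinct pseudocolor so anomalous granules stand out on the video monitor.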

  8. Quantitative Analysis of Thermal Anomalies in the DFDP-2B Borehole, New Zealand

    NASA Astrophysics Data System (ADS)

    Janků-Čápová, Lucie; Sutherland, Rupert; Townend, John

    2017-04-01

    The DFDP-2B borehole, which was drilled in the Whataroa Valley, South Island, New Zealand in late 2014, provides a unique opportunity to study the conditions in the hanging wall of a plate boundary fault, the Alpine Fault, which is late in its seismic cycle. The high geothermal gradient of >125°C/km encountered in the borehole drew attention to the thermal structure of the valley, as well as of the Alpine Fault's hanging wall as a whole. A detailed analysis of temperature logs measured during drilling of the DFDP-2B borehole reveals two distinct portions of the signal containing information on different processes. The long-wavelength portion of the temperature signal, i.e. the overall trend (hundreds of metres), reflects the response of the rock environment to the disturbance caused by drilling and permits an estimation of the thermal diffusivity of the rock based on the rate of temperature recovery. The short-wavelength (tens of metres to tens of centimetres) signal represents the local anomalies caused by lithological variations or, more importantly, by fluid flow into or out of the borehole along fractures. By analysing these distinct features, we can identify anomalous zones that manifest in other wireline data (resistivity, BHTV) and are likely attributable to permeable fractures. Here we present a novel method of quantitative analysis of the short-wavelength temperature anomalies. This method provides a precise and objective way to determine the position, width and amplitude of temperature anomalies and facilitates the interpretation of temperature logs, which is of particular importance in estimating flow in a fractured aquifer.
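The separation of the log into long- and short-wavelength components can be sketched on synthetic data with a local-median trend; window size, threshold, and the anomaly itself are illustrative and not the authors' method:

```python
import numpy as np

# Synthetic temperature log: a smooth conductive gradient plus one
# local cool excursion standing in for fluid inflow along a fracture.
depth = np.arange(0.0, 500.0, 1.0)     # depth in metres
temp = 15.0 + 0.125 * depth            # long-wavelength background trend
temp[200:205] -= 3.0                   # short-wavelength cool anomaly

half = 25                              # half-width of the trend window (m)
residual = np.zeros_like(temp)
for i in range(half, len(temp) - half):
    # The running median tracks the long-wavelength trend; subtracting
    # it leaves the short-wavelength anomaly component.
    residual[i] = temp[i] - np.median(temp[i - half:i + half + 1])

anomalous = residual < -1.0            # illustrative inflow threshold (K)
```

From the flagged interval one can then read off the position, width, and amplitude of the anomaly, which is the quantitative characterization the abstract describes.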

  9. Methods and theory in bone modeling drift: comparing spatial analyses of primary bone distributions in the human humerus.

    PubMed

    Maggiano, Corey M; Maggiano, Isabel S; Tiesler, Vera G; Chi-Keb, Julio R; Stout, Sam D

    2016-01-01

    This study compares two novel methods quantifying bone shaft tissue distributions, and relates observations on human humeral growth patterns for applications in anthropological and anatomical research. Microstructural variation in compact bone occurs due to developmental and mechanically adaptive circumstances that are 'recorded' by forming bone and are important for interpretations of growth, health, physical activity, adaptation, and identity in the past and present. Those interpretations hinge on a detailed understanding of the modeling process by which bones achieve their diametric shape, diaphyseal curvature, and general position relative to other elements. Bone modeling is a complex aspect of growth, potentially causing the shaft to drift transversely through formation and resorption on opposing cortices. Unfortunately, the specifics of modeling drift are largely unknown for most skeletal elements. Moreover, bone modeling has seen little quantitative methodological development compared with secondary bone processes, such as intracortical remodeling. The techniques proposed here, starburst point-count and 45° cross-polarization hand-drawn histomorphometry, permit the statistical and populational analysis of human primary tissue distributions and provide similar results despite being suitable for different applications. This analysis of a pooled archaeological and modern skeletal sample confirms the importance of extreme asymmetry in bone modeling as a major determinant of microstructural variation in diaphyses. Specifically, humeral drift is posteromedial in the human humerus, accompanied by a significant rotational trend. In general, results encourage the usage of endocortical primary bone distributions as an indicator and summary of bone modeling drift, enabling quantitative analysis by direction and proportion in other elements and populations. © 2015 Anatomical Society.

  10. Traditional healers and the potential for collaboration with the national tuberculosis programme in Vanuatu: results from a mixed methods study

    PubMed Central

    2014-01-01

    Background This study was conducted in the Pacific island nation of Vanuatu. Our objective was to assess knowledge, attitudes and practice of traditional healers who treat lung diseases and tuberculosis (TB), including their willingness to collaborate with the national TB programme. Methods This was a descriptive study using both qualitative and quantitative methods. Quantitative analysis was based on the responses provided to closed-ended questions, and we used descriptive analysis (frequencies) to describe the knowledge, attitudes and practice of the traditional healers towards TB. Qualitative analysis was based on open-ended questions permitting fuller explanations. We used thematic analysis and developed a posteriori inductive categories to draw original and unbiased conclusions. Results Nineteen traditional healers were interviewed; 18 were male. Fifteen of the healers reported treating short wind (a local term to describe lung, chest or breathing illnesses) which they attributed to food, alcohol, smoking or pollution from contact with menstrual blood, and a range of other physical and spiritual causes. Ten said that they would treat TB with leaf medicine. Four traditional healers said that they would not treat TB. Twelve of the healers had referred someone to a hospital for a strong wet-cough and just over half of the healers (9) reported a previous collaboration with the Government health care system. Eighteen of the traditional healers would be willing to collaborate with the national TB programme, with or without compensation. Conclusions Traditional healers in Vanuatu treat lung diseases including TB. Many have previously collaborated with the Government funded health care system, and almost all of them indicated a willingness to collaborate with the national TB programme. The engagement of traditional healers in TB management should be considered, using an evidence based and culturally sensitive approach. PMID:24758174

  11. Methodology for cork plank characterization (Quercus suber L.) by near-infrared spectroscopy and image analysis

    NASA Astrophysics Data System (ADS)

    Prades, Cristina; García-Olmo, Juan; Romero-Prieto, Tomás; García de Ceca, José L.; López-Luque, Rafael

    2010-06-01

    The procedures used today to characterize cork plank for the manufacture of cork bottle stoppers continue to be based on a traditional, manual method that is highly subjective. Furthermore, there is no specific legislation regarding cork classification. The objective of this viability study is to assess the potential of near-infrared spectroscopy (NIRS) technology for characterizing cork plank according to the following variables: aspect or visual quality, porosity, moisture and geographical origin. In order to calculate the porosity coefficient, an image analysis program was specifically developed in Visual Basic language for a desktop scanner. A set comprising 170 samples from two geographical areas of Andalusia (Spain) was classified into eight quality classes by visual inspection. Spectra were obtained in the transverse and tangential sections of the cork planks using an NIRSystems 6500 SY II reflectance spectrophotometer. The quantitative calibrations showed cross-validation coefficients of determination of 0.47 for visual quality, 0.69 for porosity and 0.66 for moisture. The results obtained using NIRS technology are promising considering the heterogeneity and variability of a natural product such as cork, even though the standard error of cross validation (SECV) in the quantitative analysis is greater than the standard error of laboratory (SEL) for the three variables. The qualitative analysis regarding geographical origin achieved very satisfactory results. Applying these methods in industry will permit quality control procedures to be automated, as well as the establishment of correlations between the different classification systems currently used in the sector. These methods can be implemented in the cork chain of custody certification and will also provide a more objective tool for assessing the economic value of the product.
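The porosity-coefficient step of the image analysis can be sketched as a simple threshold-and-count on a grayscale scan: dark pixels are counted as pores and expressed as a fraction of the section area. The image and threshold below are illustrative, not the study's Visual Basic implementation:

```python
import numpy as np

def porosity_coefficient(gray, pore_threshold=80):
    """Percentage of pixels darker than `pore_threshold` (0-255 scale),
    taken as the pore fraction of a scanned cork section."""
    pores = gray < pore_threshold
    return 100.0 * pores.sum() / gray.size

# Synthetic 10x10 section: light cork matrix with one dark lenticel.
sample = np.full((10, 10), 200, dtype=np.uint8)
sample[2:4, 3:8] = 30                  # a 2x5-pixel pore
p = porosity_coefficient(sample)       # 10 of 100 pixels are pore
```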

  12. Forensic applications of ambient ionization mass spectrometry.

    PubMed

    Ifa, Demian R; Jackson, Ayanna U; Paglia, Giuseppe; Cooks, R Graham

    2009-08-01

    This review highlights and critically assesses forensic applications in the developing field of ambient ionization mass spectrometry. Ambient ionization methods permit the ionization of samples outside the mass spectrometer in the ordinary atmosphere, with minimal sample preparation. Several ambient ionization methods have been created since 2004 and they utilize different mechanisms to create ions for mass-spectrometric analysis. Forensic applications of these techniques, covering the analysis of toxic industrial compounds, chemical warfare agents, illicit drugs and formulations, explosives, foodstuffs, inks, fingerprints, and skin, are reviewed. The minimal sample pretreatment needed is illustrated with examples of analysis from complex matrices (e.g., food) on various substrates (e.g., paper). The low limits of detection achieved by most of the ambient ionization methods for compounds of forensic interest readily offer qualitative confirmation of chemical identity; in some cases quantitative data are also available. The forensic applications of ambient ionization methods are a growing research field and there are still many types of applications which remain to be explored, particularly those involving on-site analysis. Aspects of ambient ionization currently undergoing rapid development include molecular imaging and increased detection specificity through simultaneous chemical reaction and ionization by addition of appropriate chemical reagents.

  13. New insights on the Dronino iron meteorite by double-pulse micro-Laser-Induced Breakdown Spectroscopy

    NASA Astrophysics Data System (ADS)

    Tempesta, Gioacchino; Senesi, Giorgio S.; Manzari, Paola; Agrosì, Giovanna

    2018-06-01

    Two fragments of an iron meteorite shower named Dronino were characterized by a novel technique, Double-Pulse micro-Laser-Induced Breakdown Spectroscopy (DP-μLIBS), combined with an optical microscope. This technique permitted a fast and detailed analysis of the fragments' chemical composition and allowed their alteration-state differences and the cooling rate of the meteorite to be determined. Qualitative analysis indicated the presence of Fe, Ni and Co in both fragments, whereas the elements Al, Ca, Mg, Si and, for the first time, Li were detected only in one fragment and were related to its post-fall alteration and contamination by weathering processes. Quantitative analysis data obtained using the calibration-free (CF) LIBS method showed good agreement with those obtained by traditional methods generally applied to meteorite analysis, i.e. Energy Dispersive Spectroscopy coupled to Scanning Electron Microscopy (EDS-SEM), also performed in this study, and Electron Probe Microanalysis (EMPA) (literature data). The local and coupled variability of Ni and Co (increase of Ni and decrease of Co) determined for the unaltered portions exhibiting plessite texture suggested the occurrence of solid-state diffusion processes under a slow cooling rate for the Dronino meteorite.
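The calibration-free LIBS approach rests in part on a Boltzmann plot: the slope of ln(I·λ/(g·A)) versus upper-level energy E gives -1/(kT), from which the plasma temperature follows without calibration standards. A minimal sketch with synthetic emission lines (hypothetical spectroscopic constants, not Dronino data), constructed so the recovered temperature can be checked against the input:

```python
import math

K_B_EV = 8.617e-5  # Boltzmann constant, eV/K

def boltzmann_temperature(lines):
    """Least-squares slope of ln(I*lam/(g*A)) versus upper-level energy
    E (eV) for (I, lam, g, A, E) tuples; returns temperature in kelvin."""
    xs = [line[4] for line in lines]
    ys = [math.log(i * lam / (g * a)) for i, lam, g, a, _e in lines]
    n = len(lines)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -1.0 / (K_B_EV * slope)

# Synthetic lines for a 10,000 K plasma (g, A, lambda are made up):
T_TRUE = 10000.0
lines = []
for e_ev, g, a, lam in [(3.0, 5, 1e8, 400e-9),
                        (4.0, 7, 2e8, 420e-9),
                        (5.0, 3, 5e7, 390e-9)]:
    intensity = (g * a / lam) * math.exp(-e_ev / (K_B_EV * T_TRUE))
    lines.append((intensity, lam, g, a, e_ev))

T_est = boltzmann_temperature(lines)   # recovers ~10,000 K
```

Full CF-LIBS then combines the fitted temperature with the Saha equation and closure normalization to obtain element concentrations; only the temperature step is sketched here.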

  14. Next-generation sequencing facilitates quantitative analysis of wild-type and Nrl−/− retinal transcriptomes

    PubMed Central

    Brooks, Matthew J.; Rajasimha, Harsha K.; Roger, Jerome E.

    2011-01-01

    Purpose Next-generation sequencing (NGS) has revolutionized systems-based analysis of cellular pathways. The goals of this study are to compare NGS-derived retinal transcriptome profiling (RNA-seq) to microarray and quantitative reverse transcription polymerase chain reaction (qRT–PCR) methods and to evaluate protocols for optimal high-throughput data analysis. Methods Retinal mRNA profiles of 21-day-old wild-type (WT) and neural retina leucine zipper knockout (Nrl−/−) mice were generated by deep sequencing, in triplicate, using Illumina GAIIx. The sequence reads that passed quality filters were analyzed at the transcript isoform level with two methods: Burrows–Wheeler Aligner (BWA) followed by ANOVA, and TopHat followed by Cufflinks. qRT–PCR validation was performed using TaqMan and SYBR Green assays. Results Using an optimized data analysis workflow, we mapped about 30 million sequence reads per sample to the mouse genome (build mm9) and identified 16,014 transcripts in the retinas of WT and Nrl−/− mice with the BWA workflow and 34,115 transcripts with the TopHat workflow. RNA-seq data confirmed stable expression of 25 known housekeeping genes, and 12 of these were validated with qRT–PCR. RNA-seq data had a linear relationship with qRT–PCR for more than four orders of magnitude and a goodness of fit (R2) of 0.8798. Approximately 10% of the transcripts showed differential expression between the WT and Nrl−/− retina, with a fold change ≥1.5 and p value <0.05. Altered expression of 25 genes was confirmed with qRT–PCR, demonstrating the high degree of sensitivity of the RNA-seq method. Hierarchical clustering of differentially expressed genes uncovered several as yet uncharacterized genes that may contribute to retinal function. Data analysis with the BWA and TopHat workflows revealed a significant overlap yet provided complementary insights into transcriptome profiling.
Conclusions Our study represents the first detailed analysis of retinal transcriptomes, with biologic replicates, generated by RNA-seq technology. The optimized data analysis workflows reported here should provide a framework for comparative investigations of expression profiles. Our results show that NGS offers a comprehensive and more accurate quantitative and qualitative evaluation of mRNA content within a cell or tissue. We conclude that RNA-seq based transcriptome characterization would expedite genetic network analyses and permit the dissection of complex biologic functions. PMID:22162623
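    The differential-expression criterion quoted above (fold change ≥1.5, p value <0.05) reduces to a simple filter. The sketch below is illustrative only; the transcript names and values are hypothetical, not data from the study.

```python
def is_differentially_expressed(fold_change, p_value,
                                fc_cutoff=1.5, p_cutoff=0.05):
    """Apply the abstract's thresholds to one transcript.

    A fold change below 1 (down-regulation) passes when its reciprocal
    exceeds the cutoff, i.e. the criterion is on magnitude either way.
    """
    magnitude = max(fold_change, 1.0 / fold_change)
    return magnitude >= fc_cutoff and p_value < p_cutoff

# Hypothetical (transcript, fold change, p value) triples:
transcripts = [
    ("Rho",    6.0, 0.001),  # strongly up-regulated
    ("Gapdh",  1.1, 0.800),  # housekeeping gene: unchanged
    ("Opn1sw", 0.4, 0.002),  # down-regulated (1/0.4 = 2.5-fold)
]
hits = [name for name, fc, p in transcripts
        if is_differentially_expressed(fc, p)]
```

    Applied to these made-up triples, only the unchanged housekeeping gene is filtered out, while both up- and down-regulated transcripts are retained.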

  15. Risk Reduction Modeling of High Pathogenicity Avian Influenza Virus Titers in Nonpasteurized Liquid Egg Obtained from Infected but Undetected Chicken Flocks.

    PubMed

    Weaver, J Todd; Malladi, Sasidhar; Spackman, Erica; Swayne, David E

    2015-11-01

    Control of highly pathogenic avian influenza (HPAI) outbreaks in poultry has traditionally involved the establishment of disease containment zones, where poultry products may move out of a zone only under permit. Nonpasteurized liquid egg (NPLE) is one such commodity for which movements may be permitted, considering inactivation of HPAI virus via pasteurization. Active surveillance testing at the flock level, using targeted matrix-gene real-time reverse transcriptase-polymerase chain reaction (RRT-PCR) testing, has been incorporated into HPAI emergency response plans as the primary on-farm diagnostic test procedure to detect HPAI in poultry and is considered to be a key risk mitigation measure. To inform decisions regarding the potential movement of NPLE to a pasteurization facility, average HPAI virus concentrations in NPLE produced from an HPAI-virus-infected, but undetected, commercial table-egg-layer flock were estimated for three HPAI virus strains using quantitative simulation models. Pasteurization under newly proposed international design standards (5 log10 reduction) is predicted to inactivate HPAI virus in NPLE to a very low concentration of less than 1 embryo infectious dose (EID50) per mL, considering the predicted virus titers in NPLE from a table-egg flock under active surveillance. Dilution of HPAI virus from contaminated eggs in eggs from the same flock, and in a 40,000 lb tanker-truck load of NPLE containing eggs from disease-free flocks, was also considered. Risk assessment can be useful in the evaluation of commodity-specific risk mitigation measures to facilitate safe trade in animal products from countries experiencing outbreaks of highly transmissible animal diseases. © 2015 Society for Risk Analysis.
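    The log-reduction and dilution arguments in this record come down to simple arithmetic. The sketch below uses hypothetical titers and volumes, not the study's model outputs.

```python
def titer_after_log_reduction(titer_eid50_per_ml, log10_reduction):
    """Virus concentration remaining after a given log10 inactivation."""
    return titer_eid50_per_ml / 10 ** log10_reduction

def titer_after_dilution(titer_eid50_per_ml, contaminated_ml, total_ml):
    """Dilution of contaminated NPLE into a larger pooled volume."""
    return titer_eid50_per_ml * contaminated_ml / total_ml

# Suppose pooled NPLE carried 1e4 EID50/mL before treatment (hypothetical):
post_pasteurization = titer_after_log_reduction(1e4, 5)
# With a 5 log10 process this falls below 1 EID50/mL, as the record predicts.
```

    Dilution works the same way: contaminated product mixed into a tanker load of clean NPLE lowers the average titer in proportion to the volume ratio.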

  16. Report: Substantial Changes Needed in Implementation and Oversight of Title V Permits If Program Goals Are To Be Fully Realized

    EPA Pesticide Factsheets

    Report #2005-P-00010, March 9, 2005. Our analysis identified concerns with five key aspects of Title V permits, including permit clarity, statements of basis, monitoring provisions, annual compliance certifications, and practical enforceability.

  17. A Workstation for Interactive Display and Quantitative Analysis of 3-D and 4-D Biomedical Images

    PubMed Central

    Robb, R.A.; Heffeman, P.B.; Camp, J.J.; Hanson, D.P.

    1986-01-01

    The capability to extract objective and quantitatively accurate information from 3-D radiographic biomedical images has not kept pace with the capabilities to produce the images themselves. This is an ironic paradox: the new 3-D and 4-D imaging capabilities promise greater specificity and sensitivity (i.e., precise objective discrimination and accurate quantitative measurement of body tissue characteristics and function) in clinical diagnostic and basic investigative imaging procedures than ever before possible, yet the momentous advances in computer and electronic imaging technology that made these 3-D imaging capabilities possible have not been concomitantly developed for their full exploitation. Therefore, we have developed a powerful new microcomputer-based system which permits detailed investigation and evaluation of 3-D and 4-D (dynamic 3-D) biomedical images. The system comprises a special workstation to which all the information in a large 3-D image database is accessible for rapid display, manipulation, and measurement. The system provides important capabilities for simultaneously representing and analyzing both structural and functional data and their relationships in various organs of the body. This paper provides a detailed description of this system, as well as some of the rationale, background, theoretical concepts, and practical considerations related to system implementation.

  18. Capture of the volatile carbonyl metabolite of flecainide on 2,4-dinitrophenylhydrazine cartridge for quantitation by stable-isotope dilution mass spectrometry coupled with chromatography

    PubMed Central

    Prokai, Laszlo; Szarka, Szabolcs; Wang, Xiaoli; Prokai-Tatrai, Katalin

    2012-01-01

    Carbonyl compounds are common byproducts of many metabolic processes. These volatile chemical entities are usually derivatized before mass spectrometric analysis to enhance the sensitivity of their detection. The classically used reagent for this purpose is 2,4-dinitrophenylhydrazine (DNPH), which forms the corresponding hydrazones. When DNPH is immobilized on specific cartridges, it permits solvent-free collection and simultaneous derivatization of aldehydes and ketones from gaseous samples. The utility of this approach was tested by assembling a simple apparatus for the in vitro generation of trifluoroacetaldehyde (TFAA) and its subsequent capture on the attached DNPH cartridge. TFAA was generated via cytochrome P450-catalyzed dealkylation of flecainide, an antiarrhythmic agent, in pooled human liver microsomes. Stable-isotope dilution mass spectrometry coupled with GC and LC using negative chemical ionization (NCI) and electrospray ionization (ESI) was evaluated for quantitative analyses. To eliminate isotope effects observed with the use of deuterium-labeled DNPH, we selected its 15N4-labeled analog to synthesize the appropriate TFAA adduct as internal standard. Quantitation by GC–NCI-MS using selected-ion monitoring outperformed LC–ESI-MS methods considering limits of detection and linearity of the assays. The microsomal metabolism of 1.5 μmol of flecainide for 1.5 h resulted in 2.6 ± 0.5 μg TFAA-DNPH, corresponding to 9.3 ± 1.7 nmol TFAA, captured by the cartridge. PMID:22342210

  19. Capture of the volatile carbonyl metabolite of flecainide on 2,4-dinitrophenylhydrazine cartridge for quantitation by stable-isotope dilution mass spectrometry coupled with chromatography.

    PubMed

    Prokai, Laszlo; Szarka, Szabolcs; Wang, Xiaoli; Prokai-Tatrai, Katalin

    2012-04-06

    Carbonyl compounds are common byproducts of many metabolic processes. These volatile chemicals are usually derivatized before mass spectrometric analysis to enhance the sensitivity of their detection. The classically used reagent for this purpose is 2,4-dinitrophenylhydrazine (DNPH), which forms the corresponding hydrazones. When DNPH is immobilized on specific cartridges, it permits solvent-free collection and simultaneous derivatization of aldehydes and ketones from gaseous samples. The utility of this approach was tested by assembling a simple apparatus for the in vitro generation of trifluoroacetaldehyde (TFAA) and its subsequent capture on the attached DNPH cartridge. TFAA was generated via cytochrome P450-catalyzed dealkylation of flecainide, an antiarrhythmic agent, in pooled human liver microsomes. Stable-isotope dilution mass spectrometry coupled with GC and LC using negative chemical ionization (NCI) and electrospray ionization (ESI) was evaluated for quantitative analyses. To eliminate isotope effects observed with the use of deuterium-labeled DNPH, we selected its 15N4-labeled analog to synthesize the appropriate TFAA adduct as internal standard. Quantitation by GC-NCI-MS using selected-ion monitoring outperformed LC-ESI-MS methods considering limits of detection and linearity of the assays. The microsomal metabolism of 1.5 μmol of flecainide for 1.5 h resulted in 2.6 ± 0.5 μg TFAA-DNPH, corresponding to 9.3 ± 1.7 nmol TFAA, captured by the cartridge. Copyright © 2012 Elsevier B.V. All rights reserved.

  20. A Combined XRD/XRF Instrument for Lunar Resource Assessment

    NASA Technical Reports Server (NTRS)

    Vaniman, D. T.; Bish, D. L.; Chipera, S. J.; Blacic, J. D.

    1992-01-01

    Robotic surface missions to the Moon should be capable of measuring mineral as well as chemical abundances in regolith samples. Although much is already known about the lunar regolith, our data are far from comprehensive. Most of the regolith samples returned to Earth for analysis had lost the upper surface, or it was intermixed with deeper regolith. This upper surface is the part of the regolith most recently exposed to the solar wind; as such, it will be important to resource assessment. In addition, it may be far easier to mine and process the uppermost few centimeters of regolith over a broad area than to engage in deep excavation of a smaller area. The most direct means of analyzing the regolith surface will be by studies in situ. Beyond the impact-origin regolith surfaces, the Fe-rich glasses of mare pyroclastic deposits are of resource interest but are inadequately known; none of the extensive surface-exposed pyroclastic deposits of the Moon have been systematically sampled, although we know something about such deposits from the Apollo 17 site. Because of the potential importance of pyroclastic deposits, methods to quantify glass as well as mineral abundances will be important to resource evaluation. Combined X-ray diffraction (XRD) and X-ray fluorescence (XRF) analysis will address many resource characterization problems on the Moon. XRF methods are valuable for obtaining full major-element abundances with high precision. Such data, collected in parallel with quantitative mineralogy, permit unambiguous determination of both mineral and chemical abundances where concentrations are high enough to be of resource grade. Collection of both XRD and XRF data from a single sample provides simultaneous chemical and mineralogic information. These data can be used to correlate quantitative chemistry and mineralogy as a set of simultaneous linear equations, the solution of which can lead to full characterization of the sample.
The use of Rietveld methods for XRD data analysis can provide a powerful tool for quantitative mineralogy and for obtaining crystallographic data on complex minerals.
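    The "simultaneous linear equations" step above can be illustrated with a toy system: bulk chemistry equals a mineral-composition matrix times the vector of mineral abundances, so abundances follow from a least-squares solve. The two-mineral system and its oxide weight fractions below are hypothetical placeholders, not lunar data.

```python
import numpy as np

# Rows are oxides (SiO2, FeO); columns are minerals. Entries give the
# weight fraction of each oxide in each mineral (hypothetical values).
A = np.array([
    [0.43, 0.34],
    [0.00, 0.29],
])

# Bulk XRF chemistry produced by a 60/40 mineral mixture:
abundances_true = np.array([0.6, 0.4])
bulk_chemistry = A @ abundances_true

# Recover the mineral abundances from the measured bulk chemistry:
abundances, *_ = np.linalg.lstsq(A, bulk_chemistry, rcond=None)
```

    With more oxides than minerals the same least-squares solve averages over measurement noise rather than fitting it exactly.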

  1. Potential Air Pollutant Emissions and Permitting Classifications for Two Biorefinery Process Designs in the United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eberle, Annika; Bhatt, Arpit; Zhang, Yimin

    Advanced biofuel production facilities (biorefineries), such as those envisioned by the United States (U.S.) Renewable Fuel Standard and U.S. Department of Energy's research and development programs, often lack historical air pollutant emissions data, which can pose challenges for obtaining air emission permits that are required for construction and operation. To help fill this knowledge gap, we perform a thorough regulatory analysis and use engineering process designs to assess the applicability of federal air regulations and quantify air pollutant emissions for two feasibility-level biorefinery designs. We find that without additional emission-control technologies both biorefineries would likely be required to obtain major source permits under the Clean Air Act's New Source Review program. The permitting classification (so-called 'major' or 'minor') has implications for the time and effort required for permitting and therefore affects the cost of capital and the fuel selling price. Consequently, we explore additional technically feasible emission-control technologies and process modifications that have the potential to reduce emissions to achieve a minor source permitting classification. Finally, our analysis of air pollutant emissions and controls can assist biorefinery developers with the air permitting process and inform regulatory agencies about potential permitting pathways for novel biorefinery designs.

  2. Potential Air Pollutant Emissions and Permitting Classifications for Two Biorefinery Process Designs in the United States

    DOE PAGES

    Eberle, Annika; Bhatt, Arpit; Zhang, Yimin; ...

    2017-04-26

    Advanced biofuel production facilities (biorefineries), such as those envisioned by the United States (U.S.) Renewable Fuel Standard and U.S. Department of Energy's research and development programs, often lack historical air pollutant emissions data, which can pose challenges for obtaining air emission permits that are required for construction and operation. To help fill this knowledge gap, we perform a thorough regulatory analysis and use engineering process designs to assess the applicability of federal air regulations and quantify air pollutant emissions for two feasibility-level biorefinery designs. We find that without additional emission-control technologies both biorefineries would likely be required to obtain major source permits under the Clean Air Act's New Source Review program. The permitting classification (so-called 'major' or 'minor') has implications for the time and effort required for permitting and therefore affects the cost of capital and the fuel selling price. Consequently, we explore additional technically feasible emission-control technologies and process modifications that have the potential to reduce emissions to achieve a minor source permitting classification. Finally, our analysis of air pollutant emissions and controls can assist biorefinery developers with the air permitting process and inform regulatory agencies about potential permitting pathways for novel biorefinery designs.

  3. Potential Air Pollutant Emissions and Permitting Classifications for Two Biorefinery Process Designs in the United States.

    PubMed

    Eberle, Annika; Bhatt, Arpit; Zhang, Yimin; Heath, Garvin

    2017-06-06

    Advanced biofuel production facilities (biorefineries), such as those envisioned by the United States (U.S.) Renewable Fuel Standard and U.S. Department of Energy's research and development programs, often lack historical air pollutant emissions data, which can pose challenges for obtaining air emission permits that are required for construction and operation. To help fill this knowledge gap, we perform a thorough regulatory analysis and use engineering process designs to assess the applicability of federal air regulations and quantify air pollutant emissions for two feasibility-level biorefinery designs. We find that without additional emission-control technologies both biorefineries would likely be required to obtain major source permits under the Clean Air Act's New Source Review program. The permitting classification (so-called "major" or "minor") has implications for the time and effort required for permitting and therefore affects the cost of capital and the fuel selling price. Consequently, we explore additional technically feasible emission-control technologies and process modifications that have the potential to reduce emissions to achieve a minor source permitting classification. Our analysis of air pollutant emissions and controls can assist biorefinery developers with the air permitting process and inform regulatory agencies about potential permitting pathways for novel biorefinery designs.

  4. Differential diagnosis of lung carcinoma with three-dimensional quantitative molecular vibrational imaging

    NASA Astrophysics Data System (ADS)

    Gao, Liang; Hammoudi, Ahmad A.; Li, Fuhai; Thrall, Michael J.; Cagle, Philip T.; Chen, Yuanxin; Yang, Jian; Xia, Xiaofeng; Fan, Yubo; Massoud, Yehia; Wang, Zhiyong; Wong, Stephen T. C.

    2012-06-01

    The advent of molecularly targeted therapies requires effective identification of the various cell types of non-small cell lung carcinomas (NSCLC). Currently, cell type diagnosis is performed using small biopsies or cytology specimens that are often insufficient for molecular testing after morphologic analysis. Thus, the ability to rapidly recognize different cancer cell types, with minimal tissue consumption, would accelerate diagnosis and preserve tissue samples for subsequent molecular testing in targeted therapy. We report a label-free molecular vibrational imaging framework enabling three-dimensional (3-D) image acquisition and quantitative analysis of cellular structures for identification of NSCLC cell types. This diagnostic imaging system employs superpixel-based 3-D nuclear segmentation for extracting such disease-related features as nuclear shape, volume, and cell-cell distance. These features are used to characterize cancer cell types using machine learning. Using fresh unstained tissue samples derived from cell lines grown in a mouse model, the platform showed greater than 97% accuracy for diagnosis of NSCLC cell types within a few minutes. As an adjunct to subsequent histology tests, our novel system would allow fast delineation of cancer cell types with minimum tissue consumption, potentially facilitating on-the-spot diagnosis, while preserving specimens for additional tests. Furthermore, 3-D measurements of cellular structure permit evaluation closer to the native state of cells, creating an alternative to traditional 2-D histology specimen evaluation, potentially increasing accuracy in diagnosing cell type of lung carcinomas.

  5. 4-D segmentation and normalization of 3He MR images for intrasubject assessment of ventilated lung volumes

    NASA Astrophysics Data System (ADS)

    Contrella, Benjamin; Tustison, Nicholas J.; Altes, Talissa A.; Avants, Brian B.; Mugler, John P., III; de Lange, Eduard E.

    2012-03-01

    Although 3He MRI permits compelling visualization of the pulmonary air spaces, quantitation of absolute ventilation is difficult due to confounds such as field inhomogeneity and relative intensity differences between image acquisitions, the latter complicating longitudinal investigations of ventilation variation with respiratory alterations. To address these difficulties, we present a 4-D segmentation and normalization approach for intra-subject quantitative analysis of hyperpolarized 3He lung MRI. After normalization, which combines bias correction and relative intensity scaling between longitudinal data, the lung volume time series is partitioned by iterating between modeling the combined intensity histogram as a Gaussian mixture model and modulating the spatially heterogeneous tissue-class assignments through Markov random field modeling. The algorithm was retrospectively evaluated on a cohort of 10 asthmatics, aged 19-25 years, in whom spirometry and 3He MR ventilation images were acquired both before and after respiratory exacerbation by a bronchoconstricting agent (methacholine). Acquisition was repeated under the same conditions 7 to 467 days (mean ± standard deviation: 185 ± 37.2) later. Several techniques were evaluated for matching intensities between the pre- and post-methacholine images, with 95th-percentile histogram matching demonstrating superior correlations with spirometry measures. Subsequent analysis evaluated segmentation parameters for assessing ventilation change in this cohort. Current findings also support previous research that areas of poor ventilation in response to bronchoconstriction are relatively consistent over time.
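    The 95th-percentile intensity matching mentioned in this record amounts to a single rescaling; the sketch below is an illustrative assumption, with synthetic arrays standing in for the 3He MR images rather than the authors' implementation.

```python
import numpy as np

def match_95th_percentile(baseline, followup):
    """Scale followup so its 95th percentile equals the baseline's."""
    scale = np.percentile(baseline, 95) / np.percentile(followup, 95)
    return followup * scale

rng = np.random.default_rng(0)
baseline = rng.uniform(0.0, 100.0, size=(32, 32))  # stand-in for a scan
followup = baseline * 2.5                          # same scene, higher gain
matched = match_95th_percentile(baseline, followup)
```

    Anchoring on a high percentile rather than the maximum makes the rescaling robust to a few bright outlier voxels.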

  6. Addressing the amorphous content issue in quantitative phase analysis: the certification of NIST standard reference material 676a.

    PubMed

    Cline, James P; Von Dreele, Robert B; Winburn, Ryan; Stephens, Peter W; Filliben, James J

    2011-07-01

    A non-diffracting surface layer exists at any boundary of a crystal and can comprise a mass fraction of several percent in a finely divided solid. This has led to the long-standing issue of amorphous content in standards for quantitative phase analysis (QPA). NIST standard reference material (SRM) 676a is a corundum (α-Al(2)O(3)) powder, certified with respect to phase purity for use as an internal standard in powder diffraction QPA. The amorphous content of SRM 676a is determined by comparing diffraction data from mixtures with samples of silicon powders that were engineered to vary their specific surface area. Under the (supported) assumption that the thickness of an amorphous surface layer on Si was invariant, this provided a method to control the crystalline/amorphous ratio of the silicon components of 50/50 weight mixtures of SRM 676a with silicon. Powder diffraction experiments utilizing neutron time-of-flight and 25 keV and 67 keV X-ray energies quantified the crystalline phase fractions from a series of specimens. Results from Rietveld analyses, which included a model for extinction effects in the silicon, of these data were extrapolated to the limit of zero amorphous content of the Si powder. The certified phase purity of SRM 676a is 99.02% ± 1.11% (95% confidence interval). This novel certification method permits quantification of amorphous content for any sample of interest, by spiking with SRM 676a.
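    The extrapolation logic behind the certification can be sketched briefly: if the amorphous surface layer has fixed thickness, the silicon's amorphous fraction scales with its specific surface area, so fitting the apparent results against surface area and reading off the intercept estimates the zero-amorphous-content limit. All numbers below are invented for illustration, not the certification data.

```python
import numpy as np

# Apparent results for 50/50 mixtures with Si powders of increasing
# specific surface area (all values invented for illustration):
surface_area = np.array([1.0, 2.0, 4.0, 8.0])         # m^2/g
apparent_purity = np.array([98.6, 98.2, 97.4, 95.8])  # wt%

# Linear fit; the intercept is the zero-surface-area (zero amorphous
# silicon) limit of the apparent purity:
slope, intercept = np.polyfit(surface_area, apparent_purity, 1)
```

    In practice the certified value also carries an uncertainty interval from the regression, as the 99.02% ± 1.11% figure in the record reflects.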

  7. Control of Toxic Chemicals in Puget Sound, Phase 3: Study Of Atmospheric Deposition of Air Toxics to the Surface of Puget Sound

    DTIC Science & Technology

    2007-01-01

    Atmospheric deposition directly to Puget Sound was an important source of PAHs, polybrominated diphenyl ethers (PBDEs), and heavy metals. The report also considers atmospheric fluxes, PAH source apportionment, and the influence of meteorology (e.g., temperature inversions) on air quality during the wet season. A semi-quantitative apportionment study permitted a first-order characterization of sources.

  8. Sweat collection capsule

    NASA Technical Reports Server (NTRS)

    Greenleaf, J. E.; Delaplaine, R. W. (Inventor)

    1980-01-01

    A sweat collection capsule permitting quantitative collection of sweat is described. The device consists of a frame held immobile on the skin, a closure secured to the frame and absorbent material located next to the skin in a cavity formed by the frame and the closure. The absorbent material may be removed from the device by removing the closure from the frame while the frame is held immobile on the skin.

  9. Structural design criteria applicable to a space shuttle

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The structural criteria are limited to general and mission-oriented criteria and are not configuration specific. Care was taken to ensure that the criteria will not restrict configuration development and will not establish the overall risk level. In some instances, margins of confidence are indicated, not only because experience shows them to be necessary but also because technology now permits quantitative values to be established.

  10. Analysis of unmitigated large break loss of coolant accidents using MELCOR code

    NASA Astrophysics Data System (ADS)

    Pescarini, M.; Mascari, F.; Mostacci, D.; De Rosa, F.; Lombardo, C.; Giannetti, F.

    2017-11-01

    In the framework of severe accident research activity developed by ENEA, a MELCOR nodalization of a generic 900 MWe Pressurized Water Reactor has been developed. The aim of this paper is to present the analysis of MELCOR code calculations concerning two independent unmitigated large break loss of coolant accident transients occurring in the cited type of reactor. In particular, the analysis and comparison of the transients initiated by an unmitigated double-ended cold leg rupture and an unmitigated double-ended hot leg rupture in loop 1 of the primary cooling system are presented herein. This activity has been performed focusing specifically on the in-vessel phenomenology that characterizes this kind of accident. The analysis of the thermal-hydraulic transient phenomena and the core degradation phenomena is therefore presented here. The analysis of the calculated data shows the capability of the code to reproduce the phenomena typical of these transients and permits their phenomenological study. A first sequence of main events is presented and shows that the cold leg break transient proceeds faster than the hot leg break transient because of the position of the break. Further analyses are in progress to quantitatively assess the results of the code nodalization for accident management strategy definition and fission product source term evaluation.

  11. Fraunhofer line-depth sensing applied to water

    NASA Technical Reports Server (NTRS)

    Stoertz, G. E.

    1969-01-01

    An experimental Fraunhofer line discriminator (FLD) is basically an airborne fluorometer, capable of quantitatively measuring the concentration of fluorescent substances dissolved in water. It must be calibrated against standards and supplemented by ground-truth data on turbidity and on the approximate vertical distribution of the fluorescent substance. Quantitative use requires that it be known in advance which substance is the source of the luminescence emission; qualitative sensing, or detection of luminescence, is also possible. The two approaches are fundamentally different, having different purposes, different applications, and different instruments. When used for sensing of Rhodamine WT dye in coastal waters and estuaries, the FLD senses in the spectral region permitting nearly maximum depth of light penetration.

  12. Localized Chemical Remodeling for Live Cell Imaging of Protein-Specific Glycoform.

    PubMed

    Hui, Jingjing; Bao, Lei; Li, Siqiao; Zhang, Yi; Feng, Yimei; Ding, Lin; Ju, Huangxian

    2017-07-03

    Live cell imaging of protein-specific glycoforms is important for the elucidation of glycosylation mechanisms and the identification of disease states. The currently used metabolic oligosaccharide engineering (MOE) technology routinely permits global chemical remodeling (GCM) of a carbohydrate site of interest, but it can exert unnecessary whole-cell-scale perturbation and raise unpredictable metabolic-efficiency issues. A localized chemical remodeling (LCM) strategy for efficient and reliable access to protein-specific glycoform information is reported. The proof-of-concept protocol, developed for MUC1-specific terminal galactose/N-acetylgalactosamine (Gal/GalNAc), combines affinity binding, off-on switchable catalytic activity, and proximity catalysis to create a reactive handle for bioorthogonal labeling and imaging. Noteworthy assay features of LCM compared with MOE include minimal target-cell perturbation, a short reaction timeframe, effectiveness as a molecular ruler, and quantitative analysis capability. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. a Migration Well Model for the Binding of Ligands to Heme Proteins.

    NASA Astrophysics Data System (ADS)

    Beece, Daniel Kenneth

    The binding of carbon monoxide and dioxygen to heme proteins can be viewed as occurring in distinct stages: diffusion in the solvent, migration through the matrix, and occupation of the pocket before the final binding step. A model is presented which can explain the dominant kinetic behavior of several different heme protein-ligand systems. The model assumes that a ligand molecule in the solvent sequentially encounters discrete energy barriers on the way to the binding site. The rate to surmount each barrier is distributed, except for the pseudo-first-order rate corresponding to the step into the protein from the solvent. The migration through the matrix is equivalent to a small number of distinct jumps. Quantitative analysis of the data permits estimates of the barrier heights, preexponentials, and solvent coupling factors for each rate. A migration coefficient and a matrix occupation factor are defined.

  14. Cost benefit analysis of the transfer of NASA remote sensing technology to the state of Georgia

    NASA Technical Reports Server (NTRS)

    Zimmer, R. P. (Principal Investigator); Wilkins, R. D.; Kelly, D. L.; Brown, D. M.

    1977-01-01

    The author has identified the following significant results. First-order benefits can generally be quantified, thus allowing quantitative comparisons of candidate land cover data systems. A meaningful dollar evaluation of LANDSAT can be made by a cost comparison with equally effective data systems. Users of LANDSAT data can be usefully categorized as performing three general functions: planning, permitting, and enforcing. The value of LANDSAT data to the State of Georgia is most sensitive to three parameters: discount rate, digitization cost, and photo acquisition cost. Under a constrained budget, LANDSAT could provide digitized land cover information roughly seven times more frequently than could otherwise be obtained. Thus the services derived from LANDSAT data have a positive net present value in comparison with the baseline system, and under a constrained budget the LANDSAT system could provide more frequent information than could otherwise be obtained.
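    The discount-rate sensitivity noted above is the usual behavior of a net-present-value calculation, sketched minimally below; the cash flows are invented illustrations, not the study's figures.

```python
def net_present_value(rate, cash_flows):
    """Discount yearly cash flows (year 0 first) at the given rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

flows = [-100.0, 40.0, 40.0, 40.0]  # up-front cost, then yearly benefits
npv_low_rate = net_present_value(0.05, flows)
npv_high_rate = net_present_value(0.15, flows)
# A higher discount rate shrinks the present value of future benefits.
```

    Because the benefits of a data system arrive over many years while costs come early, a higher discount rate systematically lowers its valuation, which is why the result is most sensitive to that parameter.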

  15. [Knowledge transfer : pious hopes or how to evade the impasse?].

    PubMed

    Wälti-Bolliger, Marianne; Needham, Ian; Halfens, Ruud

    2007-09-01

    The utilisation of research results in nursing is demanded by new legislation in Switzerland. Scientific literature on the diffusion of nursing research results demonstrates that this issue is difficult and complex. This study explores the perceptions of practice-based tutors of nursing students in the French-speaking region of Switzerland concerning such transfer practice. The aim is to find indicators which may permit adequate decision making in order to facilitate nursing practice based on scientific research results. The Barriers scale of Funk et al., which refers to Rogers' diffusion theory, was employed as the measurement instrument. The data were subjected to quantitative and qualitative analysis. The results demonstrate obstacles relating to the practice-based tutors themselves, to the professional environment, to research, and to paths of communication. Various suggestions are raised by the respondents in all these areas.

  16. Moving Beyond “China in Africa”: Insights from Zambian Immigration Data

    PubMed Central

    Postel, Hannah

    2017-01-01

    China’s growing presence in Africa is not news: the expansion of bilateral trade and investment ties has garnered intense media and political focus over the past decade. However, less is known about the people accompanying these increasingly intensive flows of goods and capital. This paper focuses on Zambia, drawing on multiple primary datasets to shed light on both the scale and nature of Chinese migration to the continent. Two years of Department of Immigration employment-permit data serve as the basis for the first quantitative analysis of the “Chinese” in “Africa,” illuminating the increasing diversity of this population flow. While the growing Chinese presence in Africa is often viewed as a coherent neocolonialist strategy planned and implemented by the Chinese state, this paper demonstrates that it is in fact typified by a multitude of both public and private actors with independent motives. PMID:29151983

  17. [The implementation of computer model in research of dynamics of proliferation of cells of thyroid gland follicle].

    PubMed

    Abduvaliev, A A; Gil'dieva, M S; Khidirov, B N; Saĭdalieva, M; Khasanov, A A; Musaeva, Sh N; Saatov, T S

    2012-04-01

    The article deals with the results of computational experiments on the dynamics of proliferation of thyroid gland follicle cells in the normal condition and in malignant neoplasm. The model studies demonstrated that a chronic increase in the proliferation parameter of thyroid follicle cells results in abnormal behavior of the cell cenosis numbers of the thyroid follicle: the stationary state breaks down, auto-oscillations arise and give way to irregular oscillations with unpredictable cell proliferation, and ultimately to the "black hole" effect. It is demonstrated that the available medical-biological experimental data and theoretical propositions concerning the structural-functional organization of the thyroid gland at the cellular level permit the development of mathematical models for quantitative analysis of the cell cenosis numbers of the thyroid follicle in normal conditions. The technique of modeling regulatory mechanisms of living systems and the equations of cell cenosis regulation were used.

  18. Quantitative three-dimensional ice roughness from scanning electron microscopy

    NASA Astrophysics Data System (ADS)

    Butterfield, Nicholas; Rowe, Penny M.; Stewart, Emily; Roesel, David; Neshyba, Steven

    2017-03-01

    We present a method for inferring surface morphology of ice from scanning electron microscope images. We first develop a novel functional form for the backscattered electron intensity as a function of ice facet orientation; this form is parameterized using smooth ice facets of known orientation. Three-dimensional representations of rough surfaces are retrieved at approximately micrometer resolution using Gauss-Newton inversion within a Bayesian framework. Statistical analysis of the resulting data sets permits characterization of ice surface roughness with a much higher statistical confidence than previously possible. A survey of results in the range -39°C to -29°C shows that characteristics of the roughness (e.g., Weibull parameters) are sensitive not only to the degree of roughening but also to the symmetry of the roughening. These results suggest that roughening characteristics obtained by remote sensing and in situ measurements of atmospheric ice clouds can potentially provide more facet-specific information than has previously been appreciated.
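
    The roughness statistics above are summarized by Weibull parameters. As a sketch of how shape and scale might be recovered from a sample of roughness values, here is a method-of-moments estimator; the paper's actual estimation procedure is not specified in the abstract, and the data below are synthetic:

```python
import math, random

def weibull_mom(samples):
    """Method-of-moments Weibull (shape k, scale lam) from positive samples."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    cv2 = var / mean ** 2  # squared coefficient of variation

    def cv2_of(k):  # CV^2 implied by shape k (decreasing in k)
        g1 = math.gamma(1 + 1 / k)
        return math.gamma(1 + 2 / k) / g1 ** 2 - 1

    lo, hi = 0.1, 50.0  # bisect for the shape matching the sample CV^2
    for _ in range(100):
        mid = (lo + hi) / 2
        if cv2_of(mid) > cv2:
            lo = mid
        else:
            hi = mid
    k = (lo + hi) / 2
    lam = mean / math.gamma(1 + 1 / k)
    return k, lam

random.seed(0)
# Synthetic "roughness" sample from a known Weibull(scale=2.0, shape=1.5)
data = [random.weibullvariate(2.0, 1.5) for _ in range(5000)]
k_hat, lam_hat = weibull_mom(data)
print(f"shape ~ {k_hat:.2f}, scale ~ {lam_hat:.2f}")
```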

  19. Gene expression profiling in response to TBT exposure in small abalone Haliotis diversicolor.

    PubMed

    Jia, Xiwei; Zou, Zhihua; Wang, Guodong; Wang, Shuhong; Wang, Yilei; Zhang, Ziping

    2011-10-01

    In this study, we investigated the gene expression profile of small abalone, Haliotis diversicolor, under tributyltin (TBT) exposure using a cDNA microarray containing 2473 unique transcripts. In total, 107 up-regulated genes and 41 down-regulated genes were found. For further investigation of candidate genes from the microarray data and EST analysis, quantitative real-time PCR was performed at 6 h, 24 h, 48 h, 96 h and 192 h of TBT exposure. 26 genes were found to be significantly differentially expressed over the time course, 3 of them unknown. Some gene homologues, such as cellulase endo-beta-1,4-glucanase, ferritin subunit 1 and thiolester containing protein II CG7052-PB, might be good biomarker candidates for TBT monitoring. The identification of stress response genes and their expression profiles will permit detailed investigation of the defense responses of small abalone. Published by Elsevier Ltd.

  20. Apparatus For Measuring The Concentration Of A Species At A Distance

    DOEpatents

    Rice, Steven F.; Allendorf, Mark D.

    2006-04-11

    Corrosion of refractory silica brick and air quality issues due to particulate emissions are two important glass manufacturing issues that have been tied to sodium vapor and its transport throughout the melt tank. Knowledge of the relationship between tank operating conditions and tank atmosphere sodium levels is therefore an important consideration in correcting corrosion and air quality issues. However, until recently, direct quantitative measurement of sodium levels has been limited to extractive sampling methods followed by laboratory analysis. Excimer laser induced fragmentation (ELIF) fluorescence spectroscopy is a technique that permits the measurement of volatilized NaOH in high temperature environments on a timescale of less than one second. The development of this method and the construction of field-portable instrumentation for glass furnace applications are herein disclosed. The method is shown to be effective in full-scale industrial settings. Characteristics of the method are outlined, including equipment configuration, detection sensitivity, and calibration methodology.

  1. Simultaneous Determination of Preservatives in Dairy Products by HPLC and Chemometric Analysis

    PubMed Central

    Zamani Mazdeh, Fatemeh; Sasanfar, Sima; Chalipour, Anita; Pirhadi, Elham; Yahyapour, Ghazal; Mohammadi, Armin; Rostami, Akram; Amini, Mohsen

    2017-01-01

    Cheese and yogurt are two kinds of nutritious dairy products that are used worldwide. The major preservatives in dairy products are sodium benzoate, potassium sorbate, and natamycin. The maximum permitted levels for these additives in cheese and yogurt are established by Iranian national standards. In this study, we developed a method to detect these preservatives in dairy products simultaneously by reversed-phase chromatography with UV detection at 220 nm. The method was performed on a C18 column with ammonium acetate buffer (pH = 5) and acetonitrile (73 : 27 v/v) as the mobile phase. The method was applied to 195 samples across 5 kinds of commercial cheeses and yogurts. The results demonstrated sufficient separation; the limits of detection (LOD) and quantitation (LOQ) ranged from 0.326 to 0.520 mg/kg and 0.989 to 1.575 mg/kg for benzoate and sorbate, respectively. The correlation coefficient of each calibration curve was mostly higher than 0.997. All samples contained sodium benzoate in various ranges. Natamycin and sorbate were detected in a considerable number of samples, although, according to the Iranian national standard, only sorbate is permitted as an added preservative in processed cheeses. In order to control the quality of dairy products, determination of preservatives is necessary. PMID:28473855
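
    LOD and LOQ figures of the kind reported above are conventionally derived from the calibration curve via the ICH formulas LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual (or blank) standard deviation and S the calibration slope; whether the authors used exactly this rule is an assumption, and the numbers below are hypothetical:

```python
def lod_loq(residual_sd, slope):
    """ICH-style detection limits from a calibration line:
    LOD = 3.3*sigma/S, LOQ = 10*sigma/S."""
    return 3.3 * residual_sd / slope, 10.0 * residual_sd / slope

# Hypothetical calibration for sodium benzoate (peak area vs mg/kg)
sigma, slope = 0.8, 7.5
lod, loq = lod_loq(sigma, slope)
print(f"LOD = {lod:.3f} mg/kg, LOQ = {loq:.3f} mg/kg")
```

With these illustrative inputs the sketch yields values inside the ranges the abstract reports.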

  2. Interpretation of long- and short-wavelength magnetic anomalies

    USGS Publications Warehouse

    DeNoyer, John M.; Barringer, Anthony R.

    1980-01-01

    Magsat was launched on October 30, 1979. More than a decade of examining existing data, devising appropriate models of the global magnetic field, and extending methods for interpreting long-wavelength magnetic anomalies preceded this launch. Magnetic data collected by satellite can be interpreted using a method of analysis that quantitatively describes the magnetic field resulting from three-dimensional geologic structures bounded by an arbitrary number of polygonal faces. Each face may have any orientation and three or more sides. At each point of the external field, the component normal to each face is obtained by using an expression for the solid angle subtended by a generalized polygon. The "cross" of tangential components is relatively easy to obtain for the same polygons. No approximations related to orbit height have been made that would restrict the dimensions of the polygons relative to the distance from the external field points. This permits the method to be used to model shorter-wavelength anomalies obtained from aircraft or ground surveys. The magnetic fields for all the structures considered are determined in the same rectangular coordinate system. The coordinate system is independent of the orientation of geologic trends and permits multiple structures or bodies to be included in the same magnetic field calculation. This single reference system also simplifies adjustments in position and direction to account for Earth curvature in regional interpretation.
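
    The solid-angle expression at the core of such polygonal-face methods can be illustrated for the simplest face, a triangle, using the standard Van Oosterom-Strackee formula; this is a generic sketch, not the paper's own generalized-polygon expression:

```python
import math

def triangle_solid_angle(r1, r2, r3):
    """Solid angle subtended at the origin by a triangle with vertex
    vectors r1, r2, r3 (Van Oosterom & Strackee formula)."""
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def cross(a, b):
        return (a[1]*b[2] - a[2]*b[1],
                a[2]*b[0] - a[0]*b[2],
                a[0]*b[1] - a[1]*b[0])
    def norm(a): return math.sqrt(dot(a, a))
    n1, n2, n3 = norm(r1), norm(r2), norm(r3)
    num = dot(r1, cross(r2, r3))
    den = (n1*n2*n3 + dot(r1, r2)*n3 + dot(r1, r3)*n2 + dot(r2, r3)*n1)
    return 2 * math.atan2(num, den)

# Sanity check: one face of a cube centred on the origin subtends
# 4*pi/6 steradians; split the face z = 1 into two triangles.
a, b, c, d = (1, 1, 1), (-1, 1, 1), (-1, -1, 1), (1, -1, 1)
omega = triangle_solid_angle(a, b, c) + triangle_solid_angle(a, c, d)
print(omega, 4 * math.pi / 6)
```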

  3. Programmable living material containing reporter micro-organisms permits quantitative detection of oligosaccharides.

    PubMed

    Mora, Carlos A; Herzog, Antoine F; Raso, Renzo A; Stark, Wendelin J

    2015-08-01

    The increasing molecular understanding of many diseases today permits the development of new diagnostic methods. However, few easy-to-handle and inexpensive tools exist for common diseases such as food disorders. Here we present a living-material-based analytical sensor (LiMBAS) containing genetically modified bacteria (Escherichia coli) immobilized and protected in a thin layer between a nanoporous membrane and a polymer support membrane for facile quantification of disease-relevant oligosaccharides. The bacteria were engineered to fluoresce in response to the analyte to reveal its diffusion behavior when using a blue-light source and optical filter. We demonstrated that the diffusion zone diameter was related semi-logarithmically to the analyte concentration. LiMBAS could accurately quantify lactose or galactose in undiluted food samples and was able to measure food-intolerance-relevant concentrations in the range of 1-1000 mM, requiring a sample volume of only 1-10 μL. LiMBAS was storable at 4 °C for at least seven days without losing functionality. A wide range of genetic tools for E. coli is readily available, allowing the material to be reprogrammed to serve as a biosensor for other molecules. In combination with smartphones, automated diagnostic analysis becomes feasible, which would also allow untrained people to use LiMBAS. Copyright © 2015 Elsevier Ltd. All rights reserved.
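
    The reported semi-logarithmic relation between diffusion-zone diameter and analyte concentration implies a simple calibration workflow: fit diameter = a + b·ln(concentration), then invert for unknowns. A sketch with hypothetical calibration points (not the paper's data):

```python
import math

def fit_semilog(concs, diams):
    """Least-squares fit of diameter = a + b*ln(concentration)."""
    xs = [math.log(c) for c in concs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(diams) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, diams))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

def invert(a, b, diameter):
    """Estimate concentration from a measured zone diameter."""
    return math.exp((diameter - a) / b)

# Hypothetical calibration: zone diameter (mm) at known lactose levels (mM)
concs = [1, 10, 100, 1000]
diams = [2.1, 4.0, 6.2, 8.1]
a, b = fit_semilog(concs, diams)
print(f"sample with a 5.0 mm zone ~ {invert(a, b, 5.0):.0f} mM")
```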

  4. An ecological framework for informing permitting decisions on scientific activities in protected areas

    PubMed Central

    Saarman, Emily T.; Owens, Brian; Murray, Steven N.; Weisberg, Stephen B.; Field, John C.; Nielsen, Karina J.

    2018-01-01

    There are numerous reasons to conduct scientific research within protected areas, but research activities may also negatively impact organisms and habitats, and thus conflict with a protected area’s conservation goals. We developed a quantitative ecological decision-support framework that estimates these potential impacts so managers can weigh costs and benefits of proposed research projects and make informed permitting decisions. The framework generates quantitative estimates of the ecological impacts of the project and the cumulative impacts of the proposed project and all other projects in the protected area, and then compares the estimated cumulative impacts of all projects with policy-based acceptable impact thresholds. We use a series of simplified equations (models) to assess the impacts of proposed research to: a) the population of any targeted species, b) the major ecological assemblages that make up the community, and c) the physical habitat that supports protected area biota. These models consider both targeted and incidental impacts to the ecosystem and include consideration of the vulnerability of targeted species, assemblages, and habitats, based on their recovery time and ecological role. We parameterized the models for a wide variety of potential research activities that regularly occur in the study area using a combination of literature review and expert judgment with a precautionary approach to uncertainty. We also conducted sensitivity analyses to examine the relationships between model input parameters and estimated impacts to understand the dominant drivers of the ecological impact estimates. Although the decision-support framework was designed for and adopted by the California Department of Fish and Wildlife for permitting scientific studies in the state-wide network of marine protected areas (MPAs), the framework can readily be adapted for terrestrial and freshwater protected areas. PMID:29920527
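
    The framework's core logic, estimate per-project impacts, accumulate them across all permitted projects, and compare against a policy threshold, can be sketched as follows. The scoring fields and numbers are illustrative stand-ins, not the actual parameters of the adopted CDFW model:

```python
def cumulative_impact(projects):
    """Sum per-project impact scores, weighting each activity's intensity
    by the vulnerability of what it touches (higher = slower recovery)."""
    return sum(p["intensity"] * p["vulnerability"] for p in projects)

def permit_decision(existing, proposed, threshold):
    """Approve only if existing plus proposed impacts stay within the
    policy-based acceptable-impact threshold."""
    total = cumulative_impact(existing) + cumulative_impact([proposed])
    return total <= threshold, total

# Hypothetical scores for projects already permitted in an MPA
existing = [{"intensity": 2.0, "vulnerability": 1.5},
            {"intensity": 1.0, "vulnerability": 3.0}]
proposed = {"intensity": 1.5, "vulnerability": 2.0}
ok, total = permit_decision(existing, proposed, threshold=10.0)
print(ok, total)
```

In the real framework the impact models are separate equations for targeted species, assemblages, and habitat; this sketch collapses them into one score to show the threshold comparison.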

  5. 77 FR 75429 - Notice of Availability of Proposed National Pollutant Discharge Elimination System (NPDES...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-20

    ... produced water. These changes are discussed in more detail below, and in the fact sheet accompanying the... part, the proposed permit is very similar to the 2004 permit. The major changes from the 2004 permit... limits and monitoring requirements for produced water based on an updated reasonable potential analysis...

  6. 78 FR 70075 - Notice of Permit Applications Received Under the Antarctic Conservation Act of 1978 (Public Law...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-22

    ... establishment of a permit system for various activities in Antarctica and designation of certain animals and certain geographic areas a requiring special protection. The regulations establish such a permit system to..., stable isotope analysis, and DNA extraction. Data would be used to reconstruct seal population dynamics...

  7. [Access to health care for an induced abortion: qualitative and quantitative approaches].

    PubMed

    Bajos, N; Moreau, C; Ferrand, M; Bouyer, J

    2003-12-01

    Despite recent studies showing evidence that the organisation of the French health care system raises problems of access to abortion, far too little is known about the actual conditions of access and the views of women on the difficulties they experience when they attend an abortion clinic. In this article, we discuss the complementarity of materials from two surveys, one qualitative and the other quantitative, in the study of patterns of care for an abortion. The qualitative survey included 51 women who reported a history of induced abortion, selected from a qualitative study on unintended pregnancy in France. The quantitative survey included 480 women who had an abortion in the past 10 years. These women were selected from a representative sample of 2863 women aged 18 to 44 who participated in a study on contraception and abortion. The variety of patterns of care for an abortion, the rarity of dysfunctions in the health care system, and the importance of the first professional contacted, demonstrated in the qualitative survey, were confirmed in the quantitative survey. The quantitative survey enabled quantifying the distribution of the different patterns of care. It also permitted identification of factors associated with the choice of the first professional contacted and with the type of subsequent pattern of care. The qualitative survey permitted exploration of these patterns of care and highlighted the interaction between the women's request and the perceived legitimacy of that request. Difficulties of access seemed to be linked to the lack of support women experienced in the process of finding an abortion clinic. Results suggest that general practitioners are less well informed about the procedures required for an abortion than other professionals.
However, the qualitative survey also shows that problems of access cannot be reduced to professionals' lack of information, as their practice was also linked to their own representation of abortion and their perception of the legitimacy of the women's request. Our results underline the need for a clear health policy with two priorities: improving the visibility of the health care supply for abortion and promoting information delivered to health care professionals.

  8. Financial impact of two different ways of evaluating early virological response to peginterferon-alpha-2b plus ribavirin therapy in treatment-naive patients with chronic hepatitis C virus genotype 1.

    PubMed

    Buti, Maria; Casado, Miguel A; Fosbrook, Leslie; Esteban, Rafael

    2005-01-01

    Patients infected with chronic hepatitis C virus (HCV) genotype 1 are the least responsive to peginterferon (pegIFN) and ribavirin therapy. The monitoring of early virological response (EVR) is therefore an important tool for quickly identifying non-responders, permitting therapy discontinuation and avoiding adverse effects and costs. To analyse the financial impact, in treatment-naive patients infected with HCV genotype 1, of two different measurement techniques for evaluating the EVR during pegIFN-alpha-2b plus ribavirin therapy, and to compare the results with a 48-week standard course of therapy with pegIFN-alpha-2b plus ribavirin without EVR measurement. A budget impact model was constructed using a decision-tree analysis. EVR was defined as a >2 log decline in HCV RNA levels at week 12 (tested with two quantitative HCV RNA tests) or undetectable HCV core antigen (HCV core Ag) protein levels at week 12 (one HCV core Ag test). Clinical data were taken from multicentre trials and costs from the published literature (euro, 2003 values). The analysis was carried out from the perspective of the Spanish healthcare system and therefore only direct costs were considered. The base-case scenario assumed that a potential study population of 18,504 people in Spain with chronic HCV genotype 1 would be eligible for treatment with pegIFN-alpha-2b plus ribavirin. In the base case, the most effective strategy was testing EVR by HCV core Ag. This resulted in 12,745 patients reaching a sustained virological response (SVR) at an overall cost of 243.98 million euro (19,142 euro per SVR). Conversely, quantitative HCV RNA testing resulted in 11,776 patients with an SVR at a cost of 232.73 million euro (19,763 euro per SVR). The incremental cost per successfully treated patient with HCV core Ag testing versus quantitative HCV RNA testing was 11,597 euro.
One-way sensitivity analyses demonstrated that changes in the study parameters did not modify the outcomes, except when increasing the EVR or SVR of strategy 2 or when decreasing the EVR or SVR of strategy 3. This model suggests, with its underlying assumptions and data, that the assessment of EVR at week 12 by HCV core Ag testing in chronic HCV patients infected with genotype 1 permits identification of those patients expected to achieve an SVR with pegIFN-alpha-2b and ribavirin, resulting in a lower overall cost to the Spanish healthcare system than HCV RNA testing or no testing at all.
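
    The cost-effectiveness arithmetic above can be reproduced directly. Recomputing from the abstract's rounded figures gives roughly 19,143 euro per SVR for the core Ag strategy and an incremental cost near 11,610 euro, close to the reported 19,142 and 11,597 euro; the small differences reflect rounding in the published inputs:

```python
def cost_per_svr(total_cost, n_svr):
    """Average cost per sustained virological response."""
    return total_cost / n_svr

def icer(cost_a, svr_a, cost_b, svr_b):
    """Incremental cost per additional successfully treated patient."""
    return (cost_a - cost_b) / (svr_a - svr_b)

# Figures from the abstract (euro, 2003 values)
core_ag = (243.98e6, 12745)   # HCV core Ag strategy
rna     = (232.73e6, 11776)   # quantitative HCV RNA strategy

print(f"{cost_per_svr(*core_ag):,.0f} euro per SVR (core Ag)")
print(f"ICER core Ag vs RNA: {icer(core_ag[0], core_ag[1], *rna):,.0f} euro")
```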

  9. Quantitative dual-probe microdialysis: mathematical model and analysis.

    PubMed

    Chen, Kevin C; Höistad, Malin; Kehr, Jan; Fuxe, Kjell; Nicholson, Charles

    2002-04-01

    Steady-state microdialysis is a widely used technique to monitor the concentration changes and distributions of substances in tissues. To obtain more information about brain tissue properties from microdialysis, a dual-probe approach was applied to infuse and sample the radiotracer, [3H]mannitol, simultaneously both in agar gel and in the rat striatum. Because the molecules released by one probe and collected by the other must diffuse through the interstitial space, the concentration profile exhibits dynamic behavior that permits the assessment of the diffusion characteristics in the brain extracellular space and the clearance characteristics. In this paper a mathematical model for dual-probe microdialysis was developed to study brain interstitial diffusion and clearance processes. Theoretical expressions for the spatial distribution of the infused tracer in the brain extracellular space and the temporal concentration at the probe outlet were derived. A fitting program was developed using the simplex algorithm, which finds local minima of the standard deviations between experiments and theory by adjusting the relevant parameters. The theoretical curves accurately fitted the experimental data and generated realistic diffusion parameters, implying that the mathematical model is capable of predicting the interstitial diffusion behavior of [3H]mannitol and that it will be a valuable quantitative tool in dual-probe microdialysis.
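
    The fitting idea can be sketched with a steady-state point-source diffusion-with-clearance profile, C(r) = Q·exp(-r·sqrt(k/D))/(4·pi·D·r), a simplified stand-in for the paper's dual-probe model, and parameters recovered by minimizing squared error. The paper uses a simplex algorithm; a coarse grid search substitutes for it here, and all numbers are synthetic:

```python
import math

def point_source_conc(r, Q, D, k):
    """Steady-state concentration at distance r from a point source of
    strength Q with diffusion coefficient D and first-order clearance k."""
    return Q / (4 * math.pi * D * r) * math.exp(-r * math.sqrt(k / D))

def fit_grid(rs, cs, Q, D_grid, k_grid):
    """Pick (D, k) minimising the sum of squared errors over a grid."""
    best = None
    for D in D_grid:
        for k in k_grid:
            sse = sum((point_source_conc(r, Q, D, k) - c) ** 2
                      for r, c in zip(rs, cs))
            if best is None or sse < best[0]:
                best = (sse, D, k)
    return best[1], best[2]

# Synthetic "measurements" generated from known D = 5e-6, k = 1e-4
Q, D_true, k_true = 1.0, 5e-6, 1e-4
rs = [0.05, 0.10, 0.15, 0.20]
cs = [point_source_conc(r, Q, D_true, k_true) for r in rs]
D_grid = [1e-6 * i for i in range(1, 11)]
k_grid = [1e-5 * i for i in range(1, 21)]
D_hat, k_hat = fit_grid(rs, cs, Q, D_grid, k_grid)
print(D_hat, k_hat)  # should land on the grid points nearest the truth
```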

  10. Placebos in 19th century medicine: a quantitative analysis of the BMJ.

    PubMed

    Raicek, Jacqueline E; Stone, Bradley H; Kaptchuk, Ted J

    2012-12-18

    To provide the first quantitative data on the use of the term "placebo" in the 19th century. Computer search of BMJ's archival database from January 1840 (the first issue) through December 1899 for uses of the words "placebo(s)." Grounded theory was used to categorise the implications of uses of the term. 71 citations contained the term "placebo(s)." Of these, 22 (31%) used the term to mean "no effect" or as a general pejorative term, 18 (25%) portrayed placebo treatment as permitting the unfolding of the natural history (the normal waxing and waning of illness), 14 (20%) described placebo as important to satisfy patients, 7 (10%) described it as fulfilling a physician's performance role, 3 (4%) described its use to buy time, 3 (4%) described its use for financial gain, 2 (3%) used it in a manner similar to a placebo control, and only one implied that placebo could have a clinical effect. Only one citation mentioned telling the patient about his placebo treatment. Nineteenth century physicians had diverse a priori assumptions about placebos. These findings remind us that contemporary medicine needs to use rigorous science to separate fact from its own beliefs concerning the "provision of care." As in previous generations, ethical issues concerning placebos continue to challenge medicine.

  11. Quantitative imaging for discovery and assembly of the metabo-regulome

    PubMed Central

    Okumoto, Sakiko; Takanaga, Hitomi; Frommer, Wolf B.

    2009-01-01

    Little is known about regulatory networks that control metabolic flux in plant cells. Detailed understanding of regulation is crucial for synthetic biology. The difficulty of measuring metabolites with cellular and subcellular precision is a major roadblock. New tools have been developed for monitoring extracellular, cytosolic, organellar and vacuolar ion and metabolite concentrations with a time resolution of milliseconds to hours. Genetically encoded sensors allow quantitative measurement of steady-state concentrations of ions, signaling molecules and metabolites and their respective changes over time. Fluorescence resonance energy transfer (FRET) sensors exploit conformational changes in polypeptides as a proxy for analyte concentrations. Subtle effects of analyte binding on the conformation of the recognition element are translated into a FRET change between two fused green fluorescent protein (GFP) variants, enabling simple monitoring of analyte concentrations using fluorimetry or fluorescence microscopy. Fluorimetry provides information averaged over cell populations, while microscopy detects differences between cells or populations of cells. The genetically encoded sensors can be targeted to subcellular compartments or the cell surface. Confocal microscopy ultimately permits observation of gradients or local differences within a compartment. The FRET assays can be adapted to high-throughput analysis to screen mutant populations in order to systematically identify signaling networks that control individual steps in metabolic flux. PMID:19138219

  12. Late paleozoic fusulinoidean gigantism driven by atmospheric hyperoxia.

    PubMed

    Payne, Jonathan L; Groves, John R; Jost, Adam B; Nguyen, Thienan; Moffitt, Sarah E; Hill, Tessa M; Skotheim, Jan M

    2012-09-01

    Atmospheric hyperoxia, with pO(2) in excess of 30%, has long been hypothesized to account for late Paleozoic (360-250 million years ago) gigantism in numerous higher taxa. However, this hypothesis has not been evaluated statistically because comprehensive size data have not been compiled previously at sufficient temporal resolution to permit quantitative analysis. In this study, we test the hyperoxia-gigantism hypothesis by examining the fossil record of fusulinoidean foraminifers, a dramatic example of protistan gigantism with some individuals exceeding 10 cm in length and surpassing their relatives by six orders of magnitude in biovolume. We assembled and examined comprehensive regional and global, species-level datasets containing 270 and 1823 species, respectively. A statistical model of size evolution forced by atmospheric pO(2) is conclusively favored over alternative models based on random walks or a constant tendency toward size increase. Moreover, the ratios of volume to surface area in the largest fusulinoideans are consistent in magnitude and trend with a mathematical model based on oxygen transport limitation. We further validate the hyperoxia-gigantism model through an examination of modern foraminiferal species living along a measured gradient in oxygen concentration. These findings provide the first quantitative confirmation of a direct connection between Paleozoic gigantism and atmospheric hyperoxia. © 2012 The Author(s). Evolution © 2012 The Society for the Study of Evolution.

  13. A PCR method for the detection and differentiation of Lentinus edodes and Trametes versicolor in defined-mixed cultures used for wastewater treatment.

    PubMed

    García-Mena, Jaime; Cano-Ramirez, Claudia; Garibay-Orijel, Claudio; Ramirez-Canseco, Sergio; Poggi-Varaldo, Héctor M

    2005-06-01

    A PCR-based method for the quantitative detection of Lentinus edodes and Trametes versicolor, two ligninolytic fungi applied for wastewater treatment and bioremediation, was developed. Genomic DNA was used to optimize a PCR method targeting the conserved copper-binding sequence of laccase genes. The method allowed the quantitative detection and differentiation of these fungi in single and defined-mixed cultures after fractionation of the PCR products by electrophoresis in agarose gels. Amplified products of about 150 bp for L. edodes, and about 200 bp for T. versicolor were purified and cloned. The PCR method showed a linear detection response in the 1.0 microg-1 ng range. The same method was tested with genomic DNA from a third fungus (Phanerochaete chrysosporium), yielding a fragment of about 400 bp. Southern-blot and DNA sequence analysis indicated that a specific PCR product was amplified from each genome, and that these corresponded to sequences of laccase genes. This PCR protocol permits the detection and differentiation of three ligninolytic fungi by amplifying DNA fragments of different sizes using a single pair of primers, without further enzymatic restriction of the PCR products. This method has potential use in the monitoring, evaluation, and improvement of fungal cultures used in wastewater treatment processes.

  14. Factors associated with the patient safety climate at a teaching hospital

    PubMed Central

    Luiz, Raíssa Bianca; Simões, Ana Lúcia de Assis; Barichello, Elizabeth; Barbosa, Maria Helena

    2015-01-01

    Objectives: to investigate the association between patient safety climate scores and socio-demographic and professional variables. Methods: an observational, cross-sectional, quantitative study, conducted at a large public teaching hospital. The Safety Attitudes Questionnaire, translated and validated for Brazil, was used. Data analysis used the Statistical Package for the Social Sciences software. In the bivariate analysis, we used Student's t-test, analysis of variance, and Spearman's correlation (α=0.05). To identify predictors of the safety climate scores, multiple linear regression was used, with the safety climate domain as the main outcome (α=0.01). Results: most participants were women, nursing staff, who worked in direct care of adult patients in critical areas, without a graduate degree and without any other employment. The mean and median total scores of the instrument were 61.8 (SD=13.7) and 63.3, respectively. The variable professional role was identified as a factor associated with the safety climate for the domain perception of service management and hospital management (p=0.01). Conclusion: the identification of factors associated with the safety climate permits the construction of strategies for safe practices in hospitals. PMID:26487138

  15. Element analysis: a wavelet-based method for analysing time-localized events in noisy time series

    PubMed Central

    2017-01-01

    A method is derived for the quantitative analysis of signals that are composed of superpositions of isolated, time-localized ‘events’. Here, these events are taken to be well represented as rescaled and phase-rotated versions of generalized Morse wavelets, a broad family of continuous analytic functions. Analysing a signal composed of replicates of such a function using another Morse wavelet allows one to directly estimate the properties of events from the values of the wavelet transform at its own maxima. The distribution of events in general power-law noise is determined in order to establish significance based on an expected false detection rate. Finally, an expression for an event’s ‘region of influence’ within the wavelet transform permits the formation of a criterion for rejecting spurious maxima due to numerical artefacts or other unsuitable events. Signals can then be reconstructed based on a small number of isolated points on the time/scale plane. This method, termed element analysis, is applied to the identification of long-lived eddy structures in ocean currents as observed by along-track measurements of sea surface elevation from satellite altimetry. PMID:28484325

  16. Stable isotope analysis of molecular oxygen from silicates and oxides using CO2 laser extraction

    NASA Technical Reports Server (NTRS)

    Perry, Eugene

    1996-01-01

    A laser-excited system for determination of the oxygen isotope composition of small quantities of silicate and oxide minerals was constructed and tested at JSC. This device is the first reported to use a commercially available helium cryostat to transfer and purify oxygen gas quantitatively within the system. The system uses oxygen gas instead of the conventional CO2 for mass spectrometer analyses. This modification of technique permits determination of all three stable oxygen isotopes, an essential requirement for oxygen isotope analysis of meteoritic material. Tests of the system included analysis of standard silicate materials NBS 28 and UWMG2 garnet, six SNC meteorites, and inclusions and chondrules from the Allende meteorite. Calibration with terrestrial standards was excellent. Meteorite values are close to published values and show no evidence of terrestrial oxygen contamination. The one limitation observed is that, in some runs on fine-grained SNC matrix material, sample results were affected by other samples in the sample holder within the reaction chamber. This reemphasizes the need for special precautions in dealing with fine-grained, reactive samples. Performance of the JSC instrument compares favorably with that of any other instrument currently producing published oxygen isotope data.

  17. Automated Tracking of Quantitative Assessments of Tumor Burden in Clinical Trials1

    PubMed Central

    Rubin, Daniel L; Willrett, Debra; O'Connor, Martin J; Hage, Cleber; Kurtz, Camille; Moreira, Dilvan A

    2014-01-01

    There are two key challenges hindering effective use of quantitative assessment of imaging in cancer response assessment: 1) Radiologists usually describe the cancer lesions in imaging studies subjectively and sometimes ambiguously, and 2) it is difficult to repurpose imaging data, because lesion measurements are not recorded in a format that permits machine interpretation and interoperability. We have developed a freely available software platform on the basis of open standards, the electronic Physician Annotation Device (ePAD), to tackle these challenges in two ways. First, ePAD facilitates the radiologist in carrying out cancer lesion measurements as part of routine clinical trial image interpretation workflow. Second, ePAD records all image measurements and annotations in a data format that permits repurposing image data for analyses of alternative imaging biomarkers of treatment response. To determine the impact of ePAD on radiologist efficiency in quantitative assessment of imaging studies, a radiologist evaluated computed tomography (CT) imaging studies from 20 subjects having one baseline and three consecutive follow-up imaging studies with and without ePAD. The radiologist made measurements of target lesions in each imaging study using Response Evaluation Criteria in Solid Tumors 1.1 criteria, initially with the aid of ePAD, and then, after a 30-day washout period, the exams were reread without ePAD. The mean total time required to review the images and summarize measurements of target lesions was 15% (P < .039) shorter using ePAD than without using this tool. In addition, it was possible to rapidly reanalyze the images to explore lesion cross-sectional area as an alternative imaging biomarker to linear measures. We conclude that ePAD appears promising to potentially improve reader efficiency for quantitative assessment of CT examinations, and it may enable discovery of future novel image-based biomarkers of cancer treatment response. PMID:24772204

  18. Integrating service development with evaluation in telehealthcare: an ethnographic study.

    PubMed

    Finch, Tracy; May, Carl; Mair, Frances; Mort, Maggie; Gask, Linda

    2003-11-22

    To identify issues that facilitate the successful integration of evaluation and development of telehealthcare services. Ethnographic study using various qualitative research techniques to obtain data from several sources, including in-depth semistructured interviews, project steering group meetings, and public telehealthcare meetings. Seven telehealthcare evaluation projects (four randomised controlled trials and three pragmatic service evaluations) in the United Kingdom, studied over two years. Projects spanned a range of specialties: dermatology, psychiatry, respiratory medicine, cardiology, and oncology. Clinicians, managers, technical experts, and researchers involved in the projects. Key problems in successfully integrating evaluation and service development in telehealthcare are, firstly, defining existing clinical practices (and anticipating changes) in ways that permit measurement; secondly, managing the additional workload and conflicting responsibilities brought about by combining clinical and research duties (including managing risk); and, thirdly, understanding the various perspectives on effectiveness and the limitations of evaluation results beyond the context of the research study. Combined implementation and evaluation of telehealthcare systems is complex, and its difficulty is often underestimated. The distinction between quantitative outcomes and the workability of the system is important for producing evaluative knowledge that is of practical value. More pragmatic approaches to evaluation, which permit both quantitative and qualitative methods, are required to improve the quality of such research and its relevance for service provision in the NHS.

  19. Doctoral training in statistics, measurement, and methodology in psychology: replication and extension of Aiken, West, Sechrest, and Reno's (1990) survey of PhD programs in North America.

    PubMed

    Aiken, Leona S; West, Stephen G; Millsap, Roger E

    2008-01-01

    In a survey of all PhD programs in psychology in the United States and Canada, the authors documented the quantitative methodology curriculum (statistics, measurement, and research design) to examine the extent to which innovations in quantitative methodology have diffused into the training of PhDs in psychology. In all, 201 psychology PhD programs (86%) participated. This survey replicated and extended a previous survey (L. S. Aiken, S. G. West, L. B. Sechrest, & R. R. Reno, 1990), permitting examination of curriculum development. Most training supported laboratory and not field research. The median of 1.6 years of training in statistics and measurement was mainly devoted to the modally 1-year introductory statistics course, leaving little room for advanced study. Curricular enhancements were noted in statistics and to a minor degree in measurement. Additional coverage of both fundamental and innovative quantitative methodology is needed. The research design curriculum has largely stagnated, a cause for great concern. Elite programs showed no overall advantage in quantitative training. Forces that support curricular innovation are characterized. Human capital challenges to quantitative training, including recruiting and supporting young quantitative faculty, are discussed. Steps must be taken to bring innovations in quantitative methodology into the curriculum of PhD programs in psychology. PsycINFO Database Record (c) 2008 APA, all rights reserved.

  20. Quantitative characterization of color Doppler images: reproducibility, accuracy, and limitations.

    PubMed

    Delorme, S; Weisser, G; Zuna, I; Fein, M; Lorenz, A; van Kaick, G

    1995-01-01

    A computer-based quantitative analysis for color Doppler images of complex vascular formations is presented. The red-green-blue (RGB) signal from an Acuson XP10 is frame-grabbed and digitized. By matching each image pixel with the color bar, color pixels are identified and assigned to the corresponding flow velocity (color value). Data analysis consists of delineation of a region of interest and calculation of the relative number of color pixels in this region (color pixel density) as well as the mean color value. The mean color value was compared to flow velocities in a flow phantom. The thyroid and carotid artery in a volunteer were repeatedly examined by a single examiner to assess intra-observer variability. The thyroids in five healthy controls were examined by three experienced physicians to assess the extent of inter-observer variability and observer bias. The correlation between the mean color value and flow velocity ranged from 0.94 to 0.96 for a range of velocities determined by pulse repetition frequency. The average deviation of the mean color value from the flow velocity was 22% to 41%, depending on the selected pulse repetition frequency (range of deviations, -46% to +66%). Flow velocity was underestimated with an inadequately low pulse repetition frequency or an inadequately high reject threshold. An overestimation occurred with an inadequately high pulse repetition frequency. The highest intra-observer variability was 22% (relative standard deviation) for the color pixel density, and 9.1% for the mean color value. The inter-observer variation was approximately 30% for the color pixel density, and 20% for the mean color value. In conclusion, computer-assisted image analysis permits an objective description of color Doppler images. However, the user must be aware that image acquisition under in vivo conditions as well as physical and instrumental factors may considerably influence the results.
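
    The pixel-classification step described above — matching each pixel against the color bar and deriving the color pixel density and mean color value within a region of interest — can be sketched as below. This is an illustrative reconstruction, not the authors' code: the colorbar entries, calibrated velocities, matching tolerance, and toy image are all invented for the example.

```python
import numpy as np

def quantify_doppler(image, colorbar, velocities, roi_mask, tol=30.0):
    """image: (H, W, 3) RGB frame; colorbar: (K, 3) RGB entries calibrated to
    `velocities` (K,); roi_mask: (H, W) boolean region of interest. A pixel
    counts as a color (flow) pixel when it lies within `tol` (RGB distance) of
    some colorbar entry; it is then assigned that entry's velocity (color value)."""
    px = image[roi_mask].astype(float)                        # (N, 3) ROI pixels
    d = np.linalg.norm(px[:, None, :] - colorbar[None, :, :].astype(float), axis=2)
    nearest = d.argmin(axis=1)
    is_color = d.min(axis=1) <= tol
    density = is_color.mean()                                 # color pixel density
    mean_value = velocities[nearest[is_color]].mean() if is_color.any() else 0.0
    return density, mean_value

# toy frame: top half pure red (calibrated to +10 cm/s), bottom half gray tissue
colorbar = np.array([[255, 0, 0], [0, 0, 255]])
velocities = np.array([10.0, -10.0])
img = np.full((4, 4, 3), 128, dtype=np.uint8)
img[:2] = (255, 0, 0)
density, mean_value = quantify_doppler(img, colorbar, velocities,
                                       np.ones((4, 4), dtype=bool))
```

    In the toy frame half the region of interest matches the color bar, so the color pixel density is 0.5 and the mean color value is the calibrated +10 cm/s.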

  1. Chromium localization in plant tissues of Lycopersicum esculentum Mill using ICP-MS and ion microscopy (SIMS)

    NASA Astrophysics Data System (ADS)

    Mangabeira, Pedro Antonio; Gavrilov, Konstantin L.; Almeida, Alex-Alan Furtado de; Oliveira, Arno Heeren; Severo, Maria Isabel; Rosa, Tiago Santana; Silva, Delmira da Costa; Labejof, Lise; Escaig, Françoise; Levi-Setti, Riccardo; Mielke, Marcelo Schramm; Loustalot, Florence Grenier; Galle, Pierre

    2006-03-01

    High-resolution imaging secondary ion mass spectrometry (HRI-SIMS) in combination with inductively coupled plasma mass spectrometry (ICP-MS) was utilised to determine specific sites of chromium concentration in tomato plant tissues (roots, stems and leaves). The tissues were obtained from plants grown for 2 months under hydroponic conditions with Cr added in the form of a chromium salt (CrCl3·6H2O) at concentrations of 25 and 50 mg/L. The chemical fixation procedure used permits localization only of insoluble or strongly bound Cr components in tomato plant tissue. No quantitative SIMS analysis was performed in this work. HRI-SIMS analysis revealed that the transport of chromium is restricted to the vascular system of roots, stems and leaves. No Cr was detected in the epidermis, palisade parenchyma or spongy parenchyma cells of the leaves. The SIMS-300 spectra obtained from the tissues confirm the HRI-SIMS observations. The roots, and especially the walls of xylem vessels, were identified as the principal site of chromium accumulation in tomato plants.

  2. Adhesion, friction, wear, and lubrication research by modern surface science techniques.

    NASA Technical Reports Server (NTRS)

    Keller, D. V., Jr.

    1972-01-01

    The field of surface science has undergone intense revitalization with the introduction of low-energy electron diffraction, Auger electron spectroscopy, ellipsometry, and other surface analytical techniques that have been refined over the last decade. These developments have permitted submono- and monolayer structure analysis as well as chemical identification and quantitative analysis. The application of a number of these techniques to the solution of problems in the fields of friction, lubrication, and wear is examined in detail for the particular case of iron, and more generally to illustrate how the accumulation of pure data will contribute toward the establishment of the physicochemical concepts required to understand the mechanisms operating in friction systems. In the case of iron, LEED, Auger, and microcontact studies have established that hydrogen and light-saturated organic vapors do not establish interfaces that prevent iron from welding, whereas oxygen and some oxygen and sulfur compounds reduce welding as well as the coefficient of friction. Interpretation of these data suggests a mechanism of sulfur interaction in lubricating systems.

  3. Quantitation of polycyclic aromatic hydrocarbons (PAH4) in cocoa and chocolate samples by an HPLC-FD method.

    PubMed

    Raters, Marion; Matissek, Reinhard

    2014-11-05

    As a consequence of the maximum levels for PAH4 (the sum of four polycyclic aromatic hydrocarbons: benzo[a]anthracene, chrysene, benzo[b]fluoranthene, and benzo[a]pyrene) permitted in cocoa beans and derived products as of 2013, a high-performance liquid chromatography method with fluorescence detection (HPLC-FD) was developed and adapted to the complex cocoa butter matrix to enable simultaneous determination of the PAH4. The resulting method was subsequently validated successfully. It meets the criteria of Regulation (EU) No. 836/2011 for methods of PAH4 analysis and is hence well suited for monitoring compliance with the maximum levels applicable under Regulation (EU) No. 835/2011. Within the scope of this work, a total of 218 samples of raw cocoa, cocoa masses, and cocoa butter from several sample years (1999-2012), of various origins and treatments, as well as cocoa and chocolate products, were analyzed for the occurrence of PAH4. In summary, the current PAH contamination level of cocoa products can be deemed very low overall.

  4. Polynomial Conjoint Analysis of Similarities: A Model for Constructing Polynomial Conjoint Measurement Algorithms.

    ERIC Educational Resources Information Center

    Young, Forrest W.

    A model permitting construction of algorithms for the polynomial conjoint analysis of similarities is presented. This model, which is based on concepts used in nonmetric scaling, permits one to obtain the best approximate solution. The concepts used to construct nonmetric scaling algorithms are reviewed. Finally, examples of algorithmic models for…

  5. Integrated Analysis and Tools for Land Subsidence Surveying and Monitoring: a Semi-Quantitative Approach

    NASA Astrophysics Data System (ADS)

    Mosconi, A.; Pozzoli, A.; Meroni, A.; Gagliano, S.

    2015-10-01

    This paper presents an integrated approach to land subsidence monitoring that combines measurements from different sensors. Eni S.p.A., the main Italian oil and gas company, routinely surveys the land using state-of-the-art and innovative techniques, and a method able to integrate the results is an important and timely topic. Monitoring today relies on multi-sensor platforms, making measurement integration a necessity. Combining the different data sources must be done carefully, exploiting the strengths of each technique. An integrated analysis allows the interpretation of simultaneous time series of data from different sources and attempts to separate the individual contributions to subsidence. To this end, Exelis VIS, in collaboration with Eni S.p.A., customized PISAV (Permanent Interferometric Scatterometer Analysis and Visualization), an ENVI extension able to combine all the different data collected in the surveys. This article presents some significant examples of the potential of this tool in oil and gas activities: a hydrocarbon storage field, where comparison between SAR measurements and production volumes reveals a correlation between the two in a few steps; and a hydrocarbon production field with a Satellite Survey Unit (S.S.U.), where SAR, CGPS, piezometers and assestimeters measure in the same area at the same time, giving the opportunity to analyse the data jointly. In the integrated analysis performed with PISAV a mathematically rigorous study is not always possible, and a semi-quantitative approach is then the only method for interpreting the results. In the first test case, a strong correlation between injected hydrocarbon volume and vertical displacement was highlighted; in the second, the integrated analysis offered several advantages for monitoring land subsidence: it permits a first qualitative differentiation of the natural and anthropic components of subsidence, and it gives each measurement greater reliability and coverage by drawing on the strong points of each technique.

  6. Xrcc1-dependent and Ku-dependent DNA double-strand break repair kinetics in Arabidopsis plants.

    PubMed

    Charbonnel, Cyril; Gallego, Maria E; White, Charles I

    2010-10-01

    Double-strand breakage (DSB) of DNA involves loss of information on the two strands of the DNA fibre and thus cannot be repaired by simple copying of the complementary strand which is possible with single-strand DNA damage. Homologous recombination (HR) can precisely repair DSB using another copy of the genome as template and non-homologous recombination (NHR) permits repair of DSB with little or no dependence on DNA sequence homology. In addition to the well-characterised Ku-dependent non-homologous end-joining (NHEJ) pathway, much recent attention has been focused on Ku-independent NHR. The complex interrelationships and regulation of NHR pathways remain poorly understood, even more so in the case of plants, and we present here an analysis of Ku-dependent and Ku-independent repair of DSB in Arabidopsis thaliana. We have characterised an Arabidopsis xrcc1 mutant and developed quantitative analysis of the kinetics of appearance and loss of γ-H2AX foci as a tool to measure DSB repair in dividing root tip cells of γ-irradiated plants in vivo. This approach has permitted determination of DSB repair kinetics in planta following a short pulse of γ-irradiation, establishing the existence of a Ku-independent, Xrcc1-dependent DSB repair pathway. Furthermore, our data show a role for Ku80 during the first minutes post-irradiation and that Xrcc1 also plays such a role, but only in the absence of Ku. The importance of Xrcc1 is, however, clearly visible at later times in the presence of Ku, showing that alternative end-joining plays an important role in DSB repair even in the presence of active NHEJ. © 2010 The Authors. Journal compilation © 2010 Blackwell Publishing Ltd.

  7. Application of Nursing Process and Its Affecting Factors among Nurses Working in Mekelle Zone Hospitals, Northern Ethiopia

    PubMed Central

    Hagos, Fisseha; Alemseged, Fessehaye; Balcha, Fikadu; Berhe, Semarya; Aregay, Alemseged

    2014-01-01

    Background. The nursing process is considered an appropriate method to explain the essence of nursing, its scientific bases, technologies, and humanist assumptions; it encourages critical thinking and creativity and permits problem solving in professional practice. Objective. To assess the application of the nursing process and its affecting factors in Mekelle Zone hospitals. Methods. A cross-sectional design employing quantitative and qualitative methods was conducted in Mekelle Zone hospitals in March 2011. Qualitative data were collected from 14 head nurses of six hospitals, and quantitative data were collected from 200 nurses selected by a simple random sampling technique from the six hospitals, proportional to their size. SPSS version 16.1 and thematic analysis were used for the quantitative and qualitative data, respectively. Results. The majority, 180 (90%), of the respondents had poor knowledge, while 99.5% had a positive attitude towards the nursing process. All of the respondents said that they did not use the nursing process during provision of care to their patients at the time of the study. The majority (75%) of the respondents said that the nurse-to-patient ratio was not optimal for applying the nursing process. Conclusion and Recommendation. The nursing process is not yet applied in any of the six hospitals. The findings revealed that nurses' knowledge of the nursing process is not adequate to put it into practice and that the high patient-to-nurse ratio hinders its application. The studied hospitals should give serious consideration to the application of the nursing process by motivating nurses and by monitoring and evaluating its progress. PMID:24649360

  8. Watershed Planning within a Quantitative Scenario Analysis Framework.

    PubMed

    Merriam, Eric R; Petty, J Todd; Strager, Michael P

    2016-07-24

    There is a critical need for tools and methodologies capable of managing aquatic systems within heavily impacted watersheds. Current efforts often fall short as a result of an inability to quantify and predict complex cumulative effects of current and future land use scenarios at relevant spatial scales. The goal of this manuscript is to provide methods for conducting a targeted watershed assessment that enables resource managers to produce landscape-based cumulative effects models for use within a scenario analysis management framework. Sites are first selected for inclusion within the watershed assessment by identifying sites that fall along independent gradients and combinations of known stressors. Field and laboratory techniques are then used to obtain data on the physical, chemical, and biological effects of multiple land use activities. Multiple linear regression analysis is then used to produce landscape-based cumulative effects models for predicting aquatic conditions. Lastly, methods for incorporating cumulative effects models within a scenario analysis framework for guiding management and regulatory decisions (e.g., permitting and mitigation) within actively developing watersheds are discussed and demonstrated for 2 sub-watersheds within the mountaintop mining region of central Appalachia. The watershed assessment and management approach provided herein enables resource managers to facilitate economic and development activity while protecting aquatic resources and producing opportunity for net ecological benefits through targeted remediation.
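
    A minimal sketch of the modeling step named in the abstract — multiple linear regression of stream condition on landscape stressors, then reuse of the fitted model to score a land-use scenario — is given below. The stressor names, coefficients, and data are hypothetical; the point is only the fit-then-predict structure of scenario analysis.

```python
import numpy as np

def fit_cumulative_effects(X, y):
    """Ordinary least squares of a biological condition score on stressor
    gradients (intercept prepended); returns the coefficient vector."""
    A = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta

def predict(beta, X):
    return np.column_stack([np.ones(len(X)), X]) @ beta

# hypothetical stressor gradients per site: % mined area, % residential area
rng = np.random.default_rng(1)
X = rng.uniform(0, 50, size=(40, 2))
y = 90.0 - 0.8 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(0, 2, 40)  # synthetic scores

beta = fit_cumulative_effects(X, y)
current = np.array([[10.0, 5.0]])
scenario = np.array([[25.0, 5.0]])        # proposed mining-expansion scenario
delta = predict(beta, scenario) - predict(beta, current)
```

    The predicted change in condition score under the scenario (`delta`) is the quantity a manager would weigh in permitting or mitigation decisions; with real data the stressor set and model form would come from the targeted watershed assessment.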

  9. 4-D spatiotemporal analysis of ultrasound contrast agent dispersion for prostate cancer localization: a feasibility study.

    PubMed

    Schalk, Stefan G; Demi, Libertario; Smeenge, Martijn; Mills, David M; Wallace, Kirk D; de la Rosette, Jean J M C H; Wijkstra, Hessel; Mischi, Massimo

    2015-05-01

    Currently, nonradical treatment for prostate cancer is hampered by the lack of reliable diagnostics. Contrast-ultrasound dispersion imaging (CUDI) has recently shown great potential as a prostate cancer imaging technique. CUDI estimates the local dispersion of intravenously injected contrast agents, imaged by transrectal dynamic contrast-enhanced ultrasound (DCE-US), to detect angiogenic processes related to tumor growth. The best CUDI results have so far been obtained by similarity analysis of the contrast kinetics in neighboring pixels. To date, CUDI has been investigated in 2-D only. In this paper, an implementation of 3-D CUDI based on spatiotemporal similarity analysis of 4-D DCE-US is described. Different from 2-D methods, 3-D CUDI permits analysis of the entire prostate using a single injection of contrast agent. To perform 3-D CUDI, a new strategy was designed to estimate the similarity in the contrast kinetics at each voxel, and data processing steps were adjusted to the characteristics of 4-D DCE-US images. The technical feasibility of 4-D DCE-US in 3-D CUDI was assessed and confirmed. Additionally, in a preliminary validation in two patients, dispersion maps by 3-D CUDI were quantitatively compared with those by 2-D CUDI and with 12-core systematic biopsies with promising results.

  10. Failure of Standard Training Sets in the Analysis of Fast-Scan Cyclic Voltammetry Data.

    PubMed

    Johnson, Justin A; Rodeberg, Nathan T; Wightman, R Mark

    2016-03-16

    The use of principal component regression, a multivariate calibration method, in the analysis of in vivo fast-scan cyclic voltammetry data allows for separation of overlapping signal contributions, permitting evaluation of the temporal dynamics of multiple neurotransmitters simultaneously. To accomplish this, the technique relies on information about current-concentration relationships across the scan-potential window gained from analysis of training sets. The ability of the constructed models to resolve analytes depends critically on the quality of these data. Recently, the use of standard training sets obtained under conditions other than those of the experimental data collection (e.g., with different electrodes, animals, or equipment) has been reported. This study evaluates the analyte resolution capabilities of models constructed using this approach from both a theoretical and experimental viewpoint. A detailed discussion of the theory of principal component regression is provided to inform this discussion. The findings demonstrate that the use of standard training sets leads to misassignment of the current-concentration relationships across the scan-potential window. This directly results in poor analyte resolution and, consequently, inaccurate quantitation, which may lead to erroneous conclusions being drawn from experimental data. Thus, it is strongly advocated that training sets be obtained under the experimental conditions to allow for accurate data analysis.
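
    The principal component regression procedure the paper analyzes can be summarized in a few lines: project mean-centered training voltammograms onto their leading principal components, regress the known concentrations on the resulting scores, and apply the same projection to unknowns. The sketch below uses synthetic overlapping "voltammograms" invented for illustration; it shows the calibration mathematics only, and (per the paper's argument) the current-concentration relationships it learns are only valid for data collected under the same conditions as the training set.

```python
import numpy as np

def pcr_fit(A_train, C_train, k=2):
    """Principal component regression: A_train (m voltammograms x p potentials),
    C_train (m x n analyte concentrations). Returns what's needed to predict."""
    a_mean, c_mean = A_train.mean(axis=0), C_train.mean(axis=0)
    U, s, Vt = np.linalg.svd(A_train - a_mean, full_matrices=False)
    Vk = Vt[:k]                                   # retained principal components
    T = (A_train - a_mean) @ Vk.T                 # scores of the training set
    K, *_ = np.linalg.lstsq(T, C_train - c_mean, rcond=None)
    return Vk, K, a_mean, c_mean

def pcr_predict(A, Vk, K, a_mean, c_mean):
    return (A - a_mean) @ Vk.T @ K + c_mean

# two overlapping pure-component "voltammograms" across 50 potential steps
p = np.arange(50)
S = np.vstack([np.exp(-0.5 * ((p - 15) / 6.0) ** 2),
               np.exp(-0.5 * ((p - 30) / 6.0) ** 2)])
rng = np.random.default_rng(2)
C_train = np.array([[c1, c2] for c1 in (0.0, 1.0, 2.0) for c2 in (0.0, 1.0, 2.0)])
A_train = C_train @ S + 0.001 * rng.standard_normal((len(C_train), 50))

model = pcr_fit(A_train, C_train)
pred = pcr_predict(np.array([[0.7, 1.3]]) @ S, *model)
```

    Because the test voltammogram here is generated from the same pure-component responses as the training set, the two analyte concentrations are recovered accurately; a "standard" training set recorded with a different electrode would misassign those responses, which is the failure mode the paper documents.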

  11. Automated sample preparation using membrane microtiter extraction for bioanalytical mass spectrometry.

    PubMed

    Janiszewski, J; Schneider, P; Hoffmaster, K; Swyden, M; Wells, D; Fouda, H

    1997-01-01

    The development and application of membrane solid phase extraction (SPE) in a 96-well microtiter plate format is described for the automated analysis of drugs in biological fluids. The small bed volume of the membrane allows elution of the analyte in a very small solvent volume, permitting direct HPLC injection and eliminating the need for the time-consuming solvent evaporation step. A programmable liquid handling station (Quadra 96) was modified to automate all SPE steps. To avoid drying of the SPE bed and to enhance analytical precision, a novel protocol for performing the condition, load, and wash steps in rapid succession was utilized. A block of 96 samples can now be extracted in 10 min, about 30 times faster than manual solvent extraction or single-cartridge SPE methods. This processing speed complements the high-throughput speed of contemporary high-performance liquid chromatography/mass spectrometry (HPLC/MS) analysis. The quantitative analysis of a test analyte (ziprasidone) in plasma demonstrates the utility and throughput of membrane SPE in combination with HPLC/MS. The results obtained with the current automated procedure compare favorably with those obtained using solvent and traditional solid phase extraction methods. The method has been used for the analysis of numerous drug prototypes in biological fluids to support drug discovery efforts.

  12. NeuroCa: integrated framework for systematic analysis of spatiotemporal neuronal activity patterns from large-scale optical recording data

    PubMed Central

    Jang, Min Jee; Nam, Yoonkey

    2015-01-01

    Abstract. Optical recording facilitates monitoring the activity of a large neural network at the cellular scale, but the analysis and interpretation of the collected data remain challenging. Here, we present a MATLAB-based toolbox, named NeuroCa, for the automated processing and quantitative analysis of large-scale calcium imaging data. Our tool includes several computational algorithms to extract the calcium spike trains of individual neurons from the calcium imaging data in an automatic fashion. Two algorithms were developed to decompose the imaging data into the activity of individual cells and subsequently detect calcium spikes from each neuronal signal. Applying our method to dense networks in dissociated cultures, we were able to obtain the calcium spike trains of ∼1000 neurons in a few minutes. Further analyses using these data permitted the quantification of neuronal responses to chemical stimuli as well as functional mapping of spatiotemporal patterns in neuronal firing within the spontaneous, synchronous activity of a large network. These results demonstrate that our method not only automates time-consuming, labor-intensive tasks in the analysis of neural data obtained using optical recording techniques but also provides a systematic way to visualize and quantify the collective dynamics of a network in terms of its cellular elements. PMID:26229973
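
    A toy version of the spike-detection stage — computing ΔF/F against a percentile baseline and marking upward threshold crossings as spike onsets — is sketched below. NeuroCa's actual algorithms are more elaborate (they also decompose the movie into individual cells); the trace, baseline percentile, and threshold here are illustrative assumptions, not the toolbox's parameters.

```python
import numpy as np

def dff(trace, baseline_pct=20):
    # ΔF/F relative to a low-percentile estimate of the baseline fluorescence
    f0 = np.percentile(trace, baseline_pct)
    return (trace - f0) / f0

def detect_spike_onsets(trace, thresh=0.5):
    """Indices where ΔF/F first crosses `thresh` upward (calcium spike onsets)."""
    d = dff(trace)
    above = d > thresh
    return np.flatnonzero(above[1:] & ~above[:-1]) + 1

# synthetic fluorescence trace: baseline 100 with two exponential calcium transients
trace = np.full(1000, 100.0)
for t0 in (200, 600):
    trace[t0:] += 100.0 * np.exp(-np.arange(1000 - t0) / 50.0)

onsets = detect_spike_onsets(trace)
```

    Repeating this per neuron yields the spike trains from which network-level quantities (stimulus responses, synchrony maps) are then computed.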

  13. Guidelines for improving the reproducibility of quantitative multiparameter immunofluorescence measurements by laser scanning cytometry on fixed cell suspensions from human solid tumors.

    PubMed

    Shackney, Stanley; Emlet, David R; Pollice, Agnese; Smith, Charles; Brown, Kathryn; Kociban, Deborah

    2006-01-01

    Laser scanning cytometry (LSC) is a versatile technology that makes it possible to perform multiple measurements on individual cells and correlate them cell by cell with other cellular features. It would be highly desirable to be able to perform reproducible, quantitative, correlated cell-based immunofluorescence studies on individual cells from human solid tumors. However, such studies can be challenging because of the presence of large numbers of cell aggregates and other confounding factors. Techniques have been developed to deal with cell aggregates in data sets collected by LSC. Experience has also been gained in addressing other key technical and methodological issues that can affect the reproducibility of such cell-based immunofluorescence measurements. We describe practical aspects of cell sample collection, cell fixation and staining, protocols for performing multiparameter immunofluorescence measurements by LSC, use of controls and reference samples, and approaches to data analysis that we have found useful in improving the accuracy and reproducibility of LSC data obtained in human tumor samples. We provide examples of the potential advantages of LSC in examining quantitative aspects of cell-based analysis. Improvements in the quality of cell-based multiparameter immunofluorescence measurements make it possible to extract useful information from relatively small numbers of cells. This, in turn, permits the performance of multiple multicolor panels on each tumor sample. With links among the different panels that are provided by overlapping measurements, it is possible to develop increasingly more extensive profiles of intracellular expression of multiple proteins in clinical samples of human solid tumors. Examples of such linked panels of measurements are provided.
Advances in methodology can improve cell-based multiparameter immunofluorescence measurements on cell suspensions from human solid tumors by LSC for use in prognostic and predictive clinical applications. Copyright (c) 2005 Wiley-Liss, Inc.

  14. Improved phase sensitivity in spectral domain phase microscopy using line-field illumination and self phase-referencing

    PubMed Central

    Yaqoob, Zahid; Choi, Wonshik; Oh, Seungeun; Lue, Niyom; Park, Yongkeun; Fang-Yen, Christopher; Dasari, Ramachandra R.; Badizadegan, Kamran; Feld, Michael S.

    2010-01-01

    We report a quantitative phase microscope based on spectral domain optical coherence tomography and line-field illumination. The line illumination allows a self phase-referencing method to reject common-mode phase noise. The quantitative phase microscope also features a separate reference arm, permitting the use of high numerical aperture (NA > 1) microscope objectives for high-resolution phase measurement at multiple points along the line of illumination. We demonstrate that the path-length sensitivity of the instrument can be as good as 41 pm/Hz, which makes it suitable for nanometer-scale study of cell motility. We present the detection of natural motions of the cell surface and two-dimensional surface profiling of a HeLa cell. PMID:19550464

  15. Positron emission tomography

    NASA Astrophysics Data System (ADS)

    Yamamoto, Y. Lucas; Thompson, Christopher J.; Diksic, Mirko; Meyer, Ernest; Feindel, William H.

    One of the most exciting new technologies introduced in the last 10 yr is positron emission tomography (PET). PET provides quantitative, three-dimensional images for the study of specific biochemical and physiological processes in the human body. This approach is analogous to quantitative in-vivo autoradiography but has the added advantage of permitting non-invasive in vivo studies. PET scanning requires a small cyclotron to produce short-lived positron emitting isotopes such as oxygen-15, carbon-11, nitrogen-13 and fluorine-18. Proper radiochemical facilities and advanced computer equipment are also needed. Most important, PET requires a multidisciplinary scientific team of physicists, radiochemists, mathematicians, biochemists and physicians. This review analyzes the most recent trends in the imaging technology, radiochemistry, methodology and clinical applications of positron emission tomography.

  16. Alternatives to the Randomized Controlled Trial

    PubMed Central

    West, Stephen G.; Duan, Naihua; Pequegnat, Willo; Gaist, Paul; Des Jarlais, Don C.; Holtgrave, David; Szapocznik, José; Fishbein, Martin; Rapkin, Bruce; Clatts, Michael; Mullen, Patricia Dolan

    2008-01-01

    Public health researchers are addressing new research questions (e.g., effects of environmental tobacco smoke, Hurricane Katrina) for which the randomized controlled trial (RCT) may not be a feasible option. Drawing on the potential outcomes framework (Rubin Causal Model) and Campbellian perspectives, we consider alternative research designs that permit relatively strong causal inferences. In randomized encouragement designs, participants are randomly invited to participate in one of the treatment conditions, but are allowed to decide whether to receive treatment. In quantitative assignment designs, treatment is assigned on the basis of a quantitative measure (e.g., need, merit, risk). In observational studies, treatment assignment is unknown and presumed to be nonrandom. Major threats to the validity of each design and statistical strategies for mitigating those threats are presented. PMID:18556609

  17. Kinetics of Water Loss from Cells at Subzero Temperatures and the Likelihood of Intracellular Freezing

    PubMed Central

    Mazur, Peter

    1963-01-01

    The survival of various cells subjected to low temperature exposure is higher when they are cooled slowly. This increase is consistent with the view that slow cooling decreases the probability of intracellular freezing by permitting water to leave the cell rapidly enough to keep the protoplasm at its freezing point. The present study derives a quantitative relation between the amount of water in a cell and temperature. The relation is a differential equation involving cooling rate, surface-volume ratio, membrane permeability to water, and the temperature coefficient of the permeability constant. Numerical solutions to this equation give calculated water contents which permit predictions as to the likelihood of intracellular ice formation. Both the calculated water contents and the predictions on internal freezing are consistent with the experimental observations of several investigators. PMID:14085017
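    Mazur's differential relation itself is not reproduced in this record. As an illustrative sketch only (the parameter values and the Clausius-Clapeyron approximation of the vapor-pressure ratio below are assumptions, not Mazur's published constants), a simple Euler integration of a Mazur-style water-efflux equation shows the key qualitative result, that faster cooling leaves more water inside the cell:

```python
import math  # not strictly needed here, kept for extensions (e.g. exp-dependent Lp)

def water_volume_vs_temp(B, Lp=1e-13, A=1e-9, V0=1e-15,
                         Tf=273.15, T_end=253.15, dT=0.01):
    """Euler-integrate a Mazur-style water-efflux ODE (illustrative units).

    B  : cooling rate (K per unit time); Lp : hydraulic permeability
    A  : cell surface area; V0 : initial intracellular water volume
    Returns a list of (temperature, water volume) samples from Tf down to T_end.
    """
    R = 8.314      # gas constant, J/(mol K)
    Lf = 6008.0    # molar heat of fusion of ice, J/mol
    vw = 1.8e-5    # molar volume of water, m^3/mol
    T, V = Tf, V0
    out = [(T, V)]
    while T > T_end and V > 0:
        # Clausius-Clapeyron estimate of ln(p_e/p_i); negative below Tf,
        # since external ice has a lower vapor pressure than supercooled water
        ln_ratio = (Lf / R) * (1.0 / Tf - 1.0 / T)
        dVdt = Lp * A * (R * T / vw) * ln_ratio   # < 0: water leaves the cell
        V = max(V + dVdt * (dT / B), 0.0)         # time step dt = dT / B
        T -= dT
        out.append((T, V))
    return out
```

With a high cooling rate the cell has little time to dehydrate and retains most of its water (raising the likelihood of intracellular freezing), while slow cooling lets the volume track the falling temperature downward.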

  18. Selective functionalization of the mesopores of SBA-15

    DOE PAGES

    Webb, Jonathan D.; Seki, Tomohiro; Goldston, Jennifer F.; ...

    2014-10-23

    In this study, a method has been developed that permits the highly selective functionalization of the interior and exterior surfaces of the ubiquitous mesoporous material SBA-15. The key step is reloading the as-synthesized material with the structure-directing agent, Pluronic® P123, prior to selective functionalization of the external surface with a silylating agent. This new approach represents a significant improvement over literature procedures. Results from physisorption analyses as well as solid-state NMR permit a detailed, quantitative assessment of functionalized SBA-15. This work also provides insight into the stability of the silyl layer during extraction procedures, an issue often neglected in other studies but of significant importance, as decomposition of this layer could introduce new silanols and reduce the effectiveness of any selective grafting procedure.

  19. Water balance model for Kings Creek

    NASA Technical Reports Server (NTRS)

    Wood, Eric F.

    1990-01-01

    Particular attention is given to the spatial variability that affects the representation of water balance at the catchment scale in the context of macroscale water-balance modeling. Remotely sensed data are employed for parameterization, and the resulting model is developed so that subgrid spatial variability is preserved and therefore influences the grid-scale fluxes of the model. The model permits the quantitative evaluation of the surface-atmospheric interactions related to the large-scale hydrologic water balance.

  20. Rapid and inexpensive body fluid identification by RNA profiling-based multiplex High Resolution Melt (HRM) analysis

    PubMed Central

    Hanson, Erin K.; Ballantyne, Jack

    2014-01-01

    Positive identification of the nature of biological material present on evidentiary items can be crucial for understanding the circumstances surrounding a crime. However, traditional protein-based methods do not permit the identification of all body fluids and tissues, and thus molecular-based strategies for the conclusive identification of all forensically relevant biological fluids and tissues need to be developed. Messenger RNA (mRNA) profiling is an example of such a molecular-based approach. Current mRNA body fluid identification assays involve capillary electrophoresis (CE) or quantitative RT-PCR (qRT-PCR) platforms, each with its own limitations. Both platforms require the use of expensive fluorescently labeled primers or probes. CE-based assays require separate amplification and detection steps, thus increasing the analysis time. For qRT-PCR assays, only 3-4 markers can be included in a single reaction since each requires a different fluorescent dye. To simplify mRNA profiling assays, and reduce the time and cost of analysis, we have developed single- and multiplex body fluid High Resolution Melt (HRM) assays for the identification of common forensically relevant biological fluids and tissues. The incorporated biomarkers include IL19 (vaginal secretions), IL1F7 (skin), ALAS2 (blood), MMP10 (menstrual blood), HTN3 (saliva) and TGM4 (semen). The HRM assays require only unlabeled PCR primers and a single saturating intercalating fluorescent dye (Eva Green). Each body-fluid-specific marker can easily be identified by the presence of a distinct melt peak. Usually, HRM assays are used to detect variants or isoforms for a single gene target. However, we have uniquely developed duplex and triplex HRM assays to permit the simultaneous detection of multiple targets per reaction. Here we describe the development and initial performance evaluation of these HRM assays.
The results demonstrate the potential use of HRM assays for rapid, and relatively inexpensive, screening of biological evidence. PMID:24715968
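    The record does not describe the melt-curve analysis itself. As a minimal sketch of the melt-peak idea only (synthetic data and a hypothetical Tm of 80 °C, not the authors' assay software), the melt temperature can be located as the peak of the negative first derivative of fluorescence versus temperature:

```python
import math

def melt_peak(temps, fluorescence):
    """Return the temperature where -dF/dT is largest (the melt peak, Tm)."""
    best_t, best_slope = None, float("-inf")
    for i in range(1, len(temps)):
        # finite-difference slope of the melt curve between adjacent samples
        slope = -(fluorescence[i] - fluorescence[i - 1]) / (temps[i] - temps[i - 1])
        if slope > best_slope:
            best_slope, best_t = slope, 0.5 * (temps[i] + temps[i - 1])
    return best_t

# Synthetic melt curve: fluorescence drops sigmoidally around a true Tm of 80 degC
temps = [70.0 + 0.1 * i for i in range(201)]
fluor = [1.0 / (1.0 + math.exp((t - 80.0) / 0.8)) for t in temps]
```

In a multiplex reaction, each body-fluid marker would contribute its own peak at a distinct temperature, which is what makes the markers separable with a single intercalating dye.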

  1. [Computer-assisted analysis of the results of training in internal medicine].

    PubMed

    Vrbová, H; Spunda, M

    1991-06-01

    Analysis of the results of teaching clinical disciplines has, in the long run, an impact on the standard and value of medical care. It requires the processing of quantitative and qualitative data, and the selection of indicators to be followed up, together with the procedures used to process them, is of fundamental importance. The submitted investigation is an example of how computer techniques can be used to process the results of an effectiveness analysis in teaching internal medicine. As an indicator of effectiveness, the authors selected the percentage of students who had an opportunity during the given period of their studies to observe a certain pathological condition; data were collected by questionnaire survey. The approach differentiates the students' experience (whether the student examined the patient himself or the patient was only demonstrated) and the place of observation (university teaching hospital or regional non-teaching hospital attachment). It also permits forming sub-groups of respondents, combining them as desired, and comparing their results. The described computer programme support comprises primary processing of the output of the questionnaire survey: the questionnaires are transformed and stored by groups of respondents in data files of suitable format (programme SDFORM), and the results are processed and presented as an output listing or interactively on the display (programme SDRESULT). Using the above programmes, the authors processed the results of a survey made among students during and after completion of their studies, covering a series of 70 recommended pathological conditions. As an example, the authors compare results of observations of 20 selected pathological conditions important for diagnosis and therapy in primary care in the final stage of the medical course in 1981 and 1985.

  2. Rapid and inexpensive body fluid identification by RNA profiling-based multiplex High Resolution Melt (HRM) analysis.

    PubMed

    Hanson, Erin K; Ballantyne, Jack

    2013-01-01

    Positive identification of the nature of biological material present on evidentiary items can be crucial for understanding the circumstances surrounding a crime. However, traditional protein-based methods do not permit the identification of all body fluids and tissues, and thus molecular-based strategies for the conclusive identification of all forensically relevant biological fluids and tissues need to be developed. Messenger RNA (mRNA) profiling is an example of such a molecular-based approach. Current mRNA body fluid identification assays involve capillary electrophoresis (CE) or quantitative RT-PCR (qRT-PCR) platforms, each with its own limitations. Both platforms require the use of expensive fluorescently labeled primers or probes. CE-based assays require separate amplification and detection steps, thus increasing the analysis time. For qRT-PCR assays, only 3-4 markers can be included in a single reaction since each requires a different fluorescent dye. To simplify mRNA profiling assays, and reduce the time and cost of analysis, we have developed single- and multiplex body fluid High Resolution Melt (HRM) assays for the identification of common forensically relevant biological fluids and tissues. The incorporated biomarkers include IL19 (vaginal secretions), IL1F7 (skin), ALAS2 (blood), MMP10 (menstrual blood), HTN3 (saliva) and TGM4 (semen). The HRM assays require only unlabeled PCR primers and a single saturating intercalating fluorescent dye (Eva Green). Each body-fluid-specific marker can easily be identified by the presence of a distinct melt peak. Usually, HRM assays are used to detect variants or isoforms for a single gene target. However, we have uniquely developed duplex and triplex HRM assays to permit the simultaneous detection of multiple targets per reaction. Here we describe the development and initial performance evaluation of these HRM assays.
The results demonstrate the potential use of HRM assays for rapid, and relatively inexpensive, screening of biological evidence.

  3. Addressing the amorphous content issue in quantitative phase analysis : the certification of NIST SRM 676a.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cline, J. P.; Von Dreele, R. B.; Winburn, R.

    2011-07-01

    A non-diffracting surface layer exists at any boundary of a crystal and can comprise a mass fraction of several percent in a finely divided solid. This has led to the long-standing issue of amorphous content in standards for quantitative phase analysis (QPA). NIST standard reference material (SRM) 676a is a corundum (α-Al₂O₃) powder, certified with respect to phase purity for use as an internal standard in powder diffraction QPA. The amorphous content of SRM 676a is determined by comparing diffraction data from mixtures with samples of silicon powders that were engineered to vary their specific surface area. Under the (supported) assumption that the thickness of an amorphous surface layer on Si was invariant, this provided a method to control the crystalline/amorphous ratio of the silicon components of 50/50 weight mixtures of SRM 676a with silicon. Powder diffraction experiments utilizing neutron time-of-flight and 25 keV and 67 keV X-ray energies quantified the crystalline phase fractions from a series of specimens. Results from Rietveld analyses of these data, which included a model for extinction effects in the silicon, were extrapolated to the limit of zero amorphous content of the Si powder. The certified phase purity of SRM 676a is 99.02% ± 1.11% (95% confidence interval). This novel certification method permits quantification of amorphous content for any sample of interest, by spiking with SRM 676a.

  4. Addressing the Amorphous Content Issue in Quantitative Phase Analysis: The Certification of NIST Standard Reference Material 676a

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J Cline; R Von Dreele; R Winburn

    2011-12-31

    A non-diffracting surface layer exists at any boundary of a crystal and can comprise a mass fraction of several percent in a finely divided solid. This has led to the long-standing issue of amorphous content in standards for quantitative phase analysis (QPA). NIST standard reference material (SRM) 676a is a corundum (α-Al₂O₃) powder, certified with respect to phase purity for use as an internal standard in powder diffraction QPA. The amorphous content of SRM 676a is determined by comparing diffraction data from mixtures with samples of silicon powders that were engineered to vary their specific surface area. Under the (supported) assumption that the thickness of an amorphous surface layer on Si was invariant, this provided a method to control the crystalline/amorphous ratio of the silicon components of 50/50 weight mixtures of SRM 676a with silicon. Powder diffraction experiments utilizing neutron time-of-flight and 25 keV and 67 keV X-ray energies quantified the crystalline phase fractions from a series of specimens. Results from Rietveld analyses of these data, which included a model for extinction effects in the silicon, were extrapolated to the limit of zero amorphous content of the Si powder. The certified phase purity of SRM 676a is 99.02% ± 1.11% (95% confidence interval). This novel certification method permits quantification of amorphous content for any sample of interest, by spiking with SRM 676a.

  5. Characterization and quantitative analysis of surfactants in textile wastewater by liquid chromatography/quadrupole-time-of-flight mass spectrometry.

    PubMed

    González, Susana; Petrović, Mira; Radetic, Maja; Jovancic, Petar; Ilic, Vesna; Barceló, Damià

    2008-05-01

    A method based on the application of ultra-performance liquid chromatography (UPLC) coupled to hybrid quadrupole-time-of-flight mass spectrometry (QqTOF-MS) with an electrospray (ESI) interface has been developed for the screening and confirmation of several anionic and non-ionic surfactants: linear alkylbenzenesulfonates (LAS), alkylsulfate (AS), alkylethersulfate (AES), dihexyl sulfosuccinate (DHSS), alcohol ethoxylates (AEOs), coconut diethanolamide (CDEA), nonylphenol ethoxylates (NPEOs), and their degradation products (nonylphenol carboxylate (NPEC), octylphenol carboxylate (OPEC), 4-nonylphenol (NP), 4-octylphenol (OP) and NPEO sulfate (NPEO-SO4). The developed methodology permits reliable quantification combined with high-accuracy confirmation based on the accurate mass of the (de)protonated molecules in the TOFMS mode. For further confirmation of the identity of the detected compounds, the QqTOF mode was used. Accurate masses of product ions obtained by performing collision-induced dissociation (CID) of the (de)protonated molecules of parent compounds were matched with the ions obtained for a standard solution. The method was applied for the quantitative analysis and high-accuracy confirmation of surfactants in complex mixtures in effluents from the textile industry. Positive identification of the target compounds was based on accurate mass measurement of the base peak, at least one product ion and the LC retention time of the analyte compared with that of a standard. The surfactants most frequently found in these textile effluents were NPEO and NPEO-SO4, in concentrations ranging from 0.93 to 5.68 mg/L for NPEO and 0.06 to 4.30 mg/L for NPEO-SO4. AEOs were also identified.

  6. WIPP Hazardous Waste Facility Permit Update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kehrman, B.; Most, W.

    2006-07-01

    The Waste Isolation Pilot Plant (WIPP) Hazardous Waste Facility Permit (HWFP) was issued on October 27, 1999 [1]. Since that time, the WIPP has sought modifications to clarify the permit language, provide alternative methods for meeting permit requirements and to update permit conditions. Significant advancements have been made in transuranic (TRU) waste management as the result of modifications to the HWFP. Among these advancements is a modification to obtain a drum age criteria (DAC) value to perform headspace gas sampling on drums to be super-compacted and placed in a 100-gallon overpack drum. In addition, the Section 311 permit modification request that would allow for more efficient waste characterization, and the modification to authorize the shipment and disposal of Remote-Handled (RH) TRU waste, were merged together and submitted to the regulator as the Consolidated Permit Modification Request (PMR). The submittal of the Consolidated PMR came at the request of the regulator as part of responses to Notices of Deficiency (NODs) for the separate PMRs which had been submitted in previous years. Section 311 of the fiscal year 2004 Energy and Water Developments Appropriations Act (Public Law 108-137) [2] directs the Department of Energy to submit a permit modification that limits waste confirmation to radiography or visual examination of a statistical subpopulation of containers. Section 311 also specifically directs that disposal room performance standards be met by monitoring for volatile organic compounds in the underground disposal rooms. This statute translates into the elimination of other waste confirmation methods such as headspace gas sampling and analysis and solids sampling and analysis. These methods, as appropriate, will continue to be used by the generator sites during hazardous waste determinations or characterization activities. This modification is expected to reduce the overall cost of waste analysis by hundreds of millions of dollars [3]. Combining both the Section 311 and RH TRU waste permit modification requests allows both the regulator and DOE to expedite action on the modification requests. The Combined PMR reduces costs by having only one administrative process for both modification requests. (authors)

  7. Methods for Kinetic and Thermodynamic Analysis of Aminoacyl-tRNA Synthetases

    PubMed Central

    Francklyn, Christopher S.; First, Eric A.; Perona, John J.; Hou, Ya-Ming

    2008-01-01

    The accuracy of protein synthesis relies on the ability of aminoacyl-tRNA synthetases (aaRSs) to discriminate between true and near-cognate substrates. To date, analysis of aaRS function, including identification of aaRS residues participating in amino acid and tRNA discrimination, has largely relied on the steady state kinetic pyrophosphate exchange and aminoacylation assays. Pre-steady state kinetic studies investigating a more limited set of aaRS systems have also been undertaken to assess the energetic contributions of individual enzyme-substrate interactions, particularly in the adenylation half reaction. More recently, a renewed interest in the use of rapid kinetics approaches for aaRSs has led to their application to several new aaRS systems, resulting in the identification of mechanistic differences that distinguish the two structurally distinct aaRS classes. Here, we review the techniques for thermodynamic and kinetic analysis of aaRS function. Following a brief survey of methods for the preparation of materials and for steady state kinetic analysis, this review will describe pre-steady state kinetic methods employing rapid quench and stopped-flow fluorescence for analysis of the activation and aminoacyl transfer reactions. Application of these methods to any aaRS system allows the investigator to derive detailed kinetic mechanisms for the activation and aminoacyl transfer reactions, permitting issues of substrate specificity, stereochemical mechanism, and inhibitor interaction to be addressed in a rigorous and quantitative fashion. PMID:18241792

  8. Bone-marrow densitometry: Assessment of marrow space of human vertebrae by single energy high resolution-quantitative computed tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peña, Jaime A.; Damm, Timo; Bastgen, Jan

    Purpose: Accurate noninvasive assessment of vertebral bone marrow fat fraction is important for diagnostic assessment of a variety of disorders and therapies known to affect marrow composition. Moreover, it provides a means to correct fat-induced bias of single energy quantitative computed tomography (QCT) based bone mineral density (BMD) measurements. The authors developed new segmentation and calibration methods to obtain quantitative surrogate measures of marrow-fat density in the axial skeleton. Methods: The authors developed and tested two high resolution-QCT (HR-QCT) based methods which permit segmentation of bone voids in between trabeculae, hypothesizing that they are representative of bone marrow space. The methods permit calculation of marrow content in units of mineral equivalent marrow density (MeMD). The first method is based on global thresholding and peeling (GTP) to define a volume of interest away from the transition between trabecular bone and marrow. The second method, morphological filtering (MF), uses spherical elements of different radii (0.1–1.2 mm) and automatically places them in between trabeculae to identify regions with large trabecular interspace, the bone-void space. To determine their performance, data were compared ex vivo to high-resolution peripheral CT (HR-pQCT) images as the gold standard. The performance of the methods was tested on a set of excised human vertebrae with intact bone marrow tissue representative of an elderly population with low BMD. Results: 86% (GTP) and 87% (MF) of the voxels identified as true marrow space on HR-pQCT images were correctly identified on HR-QCT images, and thus these volumes of interest can be considered to be representative of true marrow space. Within this volume, MeMD was estimated with residual errors of 4.8 mg/cm³ corresponding to accuracy errors in fat fraction on the order of 5% both for GTP and MF methods.
    Conclusions: The GTP and MF methods on HR-QCT images permit noninvasive localization and densitometric assessment of marrow fat with residual accuracy errors sufficient to study disorders and therapies known to affect bone marrow composition. Additionally, the methods can be used to correct BMD for fat-induced bias. Application and testing in vivo and in longitudinal studies are warranted to determine the clinical performance and value of these methods.

  9. Novel cardiac magnetic resonance biomarkers: native T1 and extracellular volume myocardial mapping.

    PubMed

    Cannaò, Paola Maria; Altabella, Luisa; Petrini, Marcello; Alì, Marco; Secchi, Francesco; Sardanelli, Francesco

    2016-04-28

    Cardiac magnetic resonance (CMR) is a non-invasive diagnostic tool playing a key role in the assessment of cardiac morphology and function as well as in tissue characterization. Late gadolinium enhancement is a fundamental CMR technique for detecting focal or regional abnormalities such as scar tissue, replacement fibrosis, or inflammation using qualitative, semi-quantitative, or quantitative methods, but it does not allow evaluation of the whole myocardium in the presence of diffuse disease. The novel T1 mapping approach permits a quantitative assessment of the entire myocardium, providing a voxel-by-voxel map of native T1 relaxation time, obtained before the intravenous administration of gadolinium-based contrast material. Combining T1 data obtained before and after contrast injection, it is also possible to calculate the voxel-by-voxel extracellular volume (ECV), resulting in another myocardial parametric map. This article describes technical challenges and clinical perspectives of these two novel CMR biomarkers: myocardial native T1 and ECV mapping.
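    The ECV computation mentioned in this record follows the conventional relation (the hematocrit Hct is measured from a blood sample; this is the standard formula in the CMR literature, not a method specific to this review):

```latex
\Delta R1 = \frac{1}{T1_{\text{post}}} - \frac{1}{T1_{\text{pre}}},
\qquad
\text{ECV} = (1 - \text{Hct}) \cdot
\frac{\Delta R1_{\text{myocardium}}}{\Delta R1_{\text{blood}}}
```

Here ΔR1 is computed separately for myocardium and blood pool from the pre- and post-contrast T1 maps, voxel by voxel for the myocardial map.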

  10. Reverse transcription-polymerase chain reaction molecular testing of cytology specimens: Pre-analytic and analytic factors.

    PubMed

    Bridge, Julia A

    2017-01-01

    The introduction of molecular testing into cytopathology laboratory practice has expanded the types of samples considered feasible for identifying genetic alterations that play an essential role in cancer diagnosis and treatment. Reverse transcription-polymerase chain reaction (RT-PCR), a sensitive and specific technical approach for amplifying a defined segment of RNA after it has been reverse-transcribed into its DNA complement, is commonly used in clinical practice for the identification of recurrent or tumor-specific fusion gene events. Real-time RT-PCR (quantitative RT-PCR), a technical variation, also permits the quantitation of products generated during each cycle of the polymerase chain reaction process. This review addresses qualitative and quantitative pre-analytic and analytic considerations of RT-PCR as they relate to various cytologic specimens. An understanding of these aspects of genetic testing is central to attaining optimal results in the face of the challenges that cytology specimens may present. Cancer Cytopathol 2017;125:11-19. © 2016 American Cancer Society.
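    For context on the qRT-PCR quantitation mentioned above, the standard relative-quantification arithmetic is Livak's 2^(-ΔΔCt) method (a general convention in the field, not a procedure described in this review; the default efficiency of 2.0 assumes ~100% PCR efficiency):

```python
def relative_quantity(ct_target_sample, ct_ref_sample,
                      ct_target_calib, ct_ref_calib, efficiency=2.0):
    """Livak 2^(-ddCt) relative quantification.

    Ct values: threshold cycles for the target and a reference (housekeeping)
    gene, in the sample of interest and in a calibrator sample.
    Returns the target's abundance relative to the calibrator.
    """
    # Normalize target to reference within each sample, then compare samples
    ddct = (ct_target_sample - ct_ref_sample) - (ct_target_calib - ct_ref_calib)
    return efficiency ** (-ddct)
```

For example, a target that crosses threshold two cycles earlier (after reference-gene normalization) than in the calibrator is reported as 4-fold more abundant.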

  11. Levels of reconstruction as complementarity in mixed methods research: a social theory-based conceptual framework for integrating qualitative and quantitative research.

    PubMed

    Carroll, Linda J; Rothe, J Peter

    2010-09-01

    Like other areas of health research, public health problems such as injuries and injury prevention are increasingly studied with qualitative methods. The integration of qualitative and quantitative research (mixed methods) is also beginning to assume a more prominent role in public health studies, and it has great potential for gaining a broad and comprehensive understanding of injuries and their prevention. However, qualitative and quantitative research methods are based on two inherently different paradigms, and their integration requires a conceptual framework that permits the unity of these two methods. We present a theory-driven framework for viewing qualitative and quantitative research, which enables us to integrate them in a conceptually sound and useful manner. This framework has its foundation within the philosophical concept of complementarity, as espoused in the physical and social sciences, and draws on Bergson's metaphysical work on the 'ways of knowing'. Through understanding how data are constructed and reconstructed, and the different levels of meaning that can be ascribed to qualitative and quantitative findings, we can use a mixed-methods approach to gain conceptually sound, holistic knowledge about injury phenomena that will enhance our development of relevant and successful interventions.

  12. Phased laser diode array permits selective excitation of ultrasonic guided waves in coated bone-mimicking tubes

    NASA Astrophysics Data System (ADS)

    Moilanen, Petro; Salmi, Ari; Kilappa, Vantte; Zhao, Zuomin; Timonen, Jussi; Hæggström, Edward

    2017-10-01

    This paper validates simulation predictions, which state that specific modes could be enhanced in quantitative ultrasonic bone testing. Tunable selection of ultrasonic guided wave excitation is useful in non-destructive testing since it permits the mediation of energy into diagnostically useful modes while reducing the energy mediated into disturbing contributions. For instance, it is often challenging to distinguish and extract the useful modes from ultrasound signals measured in bone covered by soft tissue. We show that a laser diode array can selectively excite ultrasound in bone-mimicking phantoms. A fiber-coupled diode array (4 elements) illuminated two solid tubes (2-3 mm wall thickness) surrounded by an opaque soft-tissue-mimicking elastomer coating (5 mm thick). A predetermined time delay matching the selected mode and frequency was employed between the outputs of the elements. The generated ultrasound was detected by a 215 kHz piezo receiver. Our results suggest that this array reduces the disturbances caused by the elastomer cover and thus paves the way for non-contact in vivo guided-wave ultrasound assessment of human bones. The implementation is small, inexpensive, and robust in comparison with conventional pulsed lasers.

  13. Constraining OCT with Knowledge of Device Design Enables High Accuracy Hemodynamic Assessment of Endovascular Implants.

    PubMed

    O'Brien, Caroline C; Kolandaivelu, Kumaran; Brown, Jonathan; Lopes, Augusto C; Kunio, Mie; Kolachalama, Vijaya B; Edelman, Elazer R

    2016-01-01

    Stacking cross-sectional intravascular images permits three-dimensional rendering of endovascular implants, yet introduces between-frame uncertainties that limit characterization of device placement and the hemodynamic microenvironment. In a porcine coronary stent model, we demonstrate enhanced OCT reconstruction with preservation of between-frame features through fusion with angiography and a priori knowledge of stent design. Strut positions were extracted from sequential OCT frames. Reconstruction with standard interpolation generated discontinuous stent structures. By computationally constraining interpolation to known stent skeletons fitted to 3D 'clouds' of OCT-Angio-derived struts, implant anatomy was resolved, accurately rendering features from implant diameter and curvature (n = 1 vessel; r² = 0.91 and 0.90, respectively) to individual strut-wall configurations (average displacement error ~15 μm). This framework facilitated hemodynamic simulation (n = 1 vessel), showing the critical importance of accurate anatomic rendering in characterizing both quantitative and basic qualitative flow patterns. Discontinuities with standard approaches systematically introduced noise and bias, poorly capturing regional flow effects. In contrast, the enhanced method preserved multi-scale (local strut to regional stent) flow interactions, demonstrating the impact of regional contexts in defining the hemodynamic consequence of local deployment errors. Fusion of planar angiography and knowledge of device design permits enhanced OCT image analysis of in situ tissue-device interactions. Given emerging interests in simulation-derived hemodynamic assessment as surrogate measures of biological risk, such fused modalities offer a new window into patient-specific implant environments.

  14. Power and sample-size estimation for microbiome studies using pairwise distances and PERMANOVA.

    PubMed

    Kelly, Brendan J; Gross, Robert; Bittinger, Kyle; Sherrill-Mix, Scott; Lewis, James D; Collman, Ronald G; Bushman, Frederic D; Li, Hongzhe

    2015-08-01

    The variation in community composition between microbiome samples, termed beta diversity, can be measured by pairwise distance based on either presence-absence or quantitative species abundance data. PERMANOVA, a permutation-based extension of multivariate analysis of variance to a matrix of pairwise distances, partitions within-group and between-group distances to permit assessment of the effect of an exposure or intervention (grouping factor) upon the sampled microbiome. Within-group distance and exposure/intervention effect size must be accurately modeled to estimate statistical power for a microbiome study that will be analyzed with pairwise distances and PERMANOVA. We present a framework for PERMANOVA power estimation tailored to marker-gene microbiome studies that will be analyzed by pairwise distances, which includes: (i) a novel method for distance matrix simulation that permits modeling of within-group pairwise distances according to pre-specified population parameters; (ii) a method to incorporate effects of different sizes within the simulated distance matrix; (iii) a simulation-based method for estimating PERMANOVA power from simulated distance matrices; and (iv) an R statistical software package that implements the above. Matrices of pairwise distances can be efficiently simulated to satisfy the triangle inequality and incorporate group-level effects, which are quantified by the adjusted coefficient of determination, omega-squared (ω²). From simulated distance matrices, available PERMANOVA power or necessary sample size can be estimated for a planned microbiome study. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
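    The authors' R package is not reproduced here. As a minimal sketch of the underlying test only (Anderson-style distance-based pseudo-F with a permutation p-value, assuming a symmetric distance matrix and a single grouping factor), one might write:

```python
import numpy as np

def permanova(D, labels, n_perm=999, seed=0):
    """One-way PERMANOVA on a pairwise distance matrix.

    D: (n, n) symmetric distance matrix; labels: one group label per sample.
    Returns (pseudo-F, permutation p-value).
    """
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)
    n = len(labels)
    groups = np.unique(labels)
    a = len(groups)
    D2 = D ** 2

    def pseudo_F(lab):
        # Total sum of squares from all pairwise squared distances
        ss_total = D2[np.triu_indices(n, 1)].sum() / n
        # Within-group sum of squares, scaled by each group's size
        ss_within = 0.0
        for g in groups:
            idx = np.where(lab == g)[0]
            sub = D2[np.ix_(idx, idx)]
            ss_within += sub[np.triu_indices(len(idx), 1)].sum() / len(idx)
        ss_between = ss_total - ss_within
        return (ss_between / (a - 1)) / (ss_within / (n - a))

    f_obs = pseudo_F(labels)
    perm_f = np.array([pseudo_F(rng.permutation(labels)) for _ in range(n_perm)])
    p = (1 + (perm_f >= f_obs).sum()) / (n_perm + 1)
    return f_obs, p
```

Power estimation in the authors' framework then amounts to running such a test over many simulated distance matrices with a planted group effect and counting the fraction of significant results.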

  15. Analysis of permit vehicle loads in Wisconsin.

    DOT National Transportation Integrated Search

    2009-09-30

    This study evaluated the impact of the 250-kip Wisconsin Standard Permit Vehicle against the overloaded vehicles operating on Wisconsin roads in recent years. The evaluation was conducted using three sets of data: 1) overloaded vehicle records within...

  16. A Probabilistic Framework for the Validation and Certification of Computer Simulations

    NASA Technical Reports Server (NTRS)

    Ghanem, Roger; Knio, Omar

    2000-01-01

    The paper presents a methodology for quantifying, propagating, and managing the uncertainty in the data required to initialize computer simulations of complex phenomena. The purpose of the methodology is to permit the quantitative assessment of a certification level to be associated with the predictions from the simulations, as well as the design of a data acquisition strategy to achieve a target level of certification. The value of a methodology that can address these issues is obvious, especially in light of the trends in the availability of computational resources and in sensor technology. These two trends make it possible to probe physical phenomena both with physical sensors and with complex models at previously inconceivable levels. With these new abilities arises the need to develop the knowledge to integrate the information from sensors and computer simulations. This is achieved in the present work by tracing both activities back to a level of abstraction that highlights their commonalities, thus allowing them to be manipulated in a mathematically consistent fashion. In particular, the mathematical theory underlying computer simulations has long been associated with partial differential equations and functional analysis concepts such as Hilbert spaces and orthogonal projections. By relying on a probabilistic framework for the modeling of data, a Hilbert space framework emerges that permits the modeling of coefficients in the governing equations as random variables, or equivalently, as elements in a Hilbert space. This permits the development of an approximation theory for probabilistic problems that parallels deterministic approximation theory. According to this formalism, the solution of the problem is identified by its projection on a basis in the Hilbert space of random variables, as opposed to more traditional techniques where the solution is approximated by its first- or second-order statistics.
The present representation, in addition to capturing significantly more information than the traditional approach, facilitates the linkage between different interacting stochastic systems as is typically observed in real-life situations.
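    The projection idea can be made concrete with a small, hedged sketch: a random quantity u(ξ), with ξ standard normal, is expanded in probabilists' Hermite polynomials (an orthogonal basis in the Hilbert space of such random variables), and its coefficients are recovered by Monte Carlo projection. The target function and sample count below are illustrative, not from the paper.

```python
# Sketch: project u(xi), xi ~ N(0,1), onto probabilists' Hermite
# polynomials He_k, using E[He_k^2] = k! and Monte Carlo expectations.
import random

def hermite(k, x):
    # probabilists' Hermite polynomials via the recurrence
    # He_0 = 1, He_1 = x, He_{n+1} = x * He_n - n * He_{n-1}
    h0, h1 = 1.0, x
    if k == 0:
        return h0
    for n in range(1, k):
        h0, h1 = h1, x * h1 - n * h0
    return h1

def pce_coefficients(u, order, n_samples=50000, seed=0):
    # coefficient c_k = E[u(xi) * He_k(xi)] / k!
    rng = random.Random(seed)
    xs = [rng.gauss(0.0, 1.0) for _ in range(n_samples)]
    coeffs = []
    fact = 1
    for k in range(order + 1):
        if k > 0:
            fact *= k
        proj = sum(u(x) * hermite(k, x) for x in xs) / n_samples
        coeffs.append(proj / fact)
    return coeffs

# Example: u(xi) = xi^2 has the exact expansion 1*He_0 + 0*He_1 + 1*He_2.
```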

  17. Embryonic Explant Culture: Studying Effects of Regulatory Molecules on Gene Expression in Craniofacial Tissues.

    PubMed

    Närhi, Katja

    2017-01-01

    The ex vivo culture of embryonic tissue explants permits the continuous monitoring of growth and morphogenesis at specific embryonic stages. The functions of soluble regulatory molecules can be analyzed by introducing them into culture medium or locally with beads to the tissue. Gene expression in the manipulated tissue explants can be analyzed using in situ hybridization, quantitative PCR, and reporter constructs combined to organ culture to examine the functions of the signaling molecules.

  18. Rainbow schlieren and its applications

    NASA Technical Reports Server (NTRS)

    Howes, W. L.

    1984-01-01

    In this modification of the schlieren apparatus the knife-edge is replaced by a radial-rainbow filter with a transparent center and an opaque surround. Consequently, refractive-index inhomogeneities in the test section appear varicolored, whereas uniform regions appear white. The rainbow schlieren is simple and easy to use, and it accentuates detail in the inhomogeneities more than the ordinary schlieren. It permits quantitative evaluation of certain refractive-index distributions, including turbulence, by simple calculations from observations of hue rather than irradiance.

  19. Symposium Proceedings on Quantitative Feedback Theory Held in Fairborn, Ohio on 2-4 August 1992.

    DTIC Science & Technology

    1992-08-01

    modification. This permits a drastic reduction in the cost of feedback, in terms of loop bandwidth and effect of sensor noise. This is the first... High-frequency Bound (UHB), but its main use is to ensure that at high frequencies the controlled system cannot go unstable and has sufficient noise... a 5-cascaded multiple-loop feedback system giving significant reductions in sensor noise amplification (peak reduced by a factor of 4), is

  20. Integrating service development with evaluation in telehealthcare: an ethnographic study

    PubMed Central

    Finch, Tracy; May, Carl; Mair, Frances; Mort, Maggie; Gask, Linda

    2003-01-01

    Objectives To identify issues that facilitate the successful integration of evaluation and development of telehealthcare services. Design Ethnographic study using various qualitative research techniques to obtain data from several sources, including in-depth semistructured interviews, project steering group meetings, and public telehealthcare meetings. Setting Seven telehealthcare evaluation projects (four randomised controlled trials and three pragmatic service evaluations) in the United Kingdom, studied over two years. Projects spanned a range of specialties—dermatology, psychiatry, respiratory medicine, cardiology, and oncology. Participants Clinicians, managers, technical experts, and researchers involved in the projects. Results and discussion Key problems in successfully integrating evaluation and service development in telehealthcare are, firstly, defining existing clinical practices (and anticipating changes) in ways that permit measurement; secondly, managing additional workload and conflicting responsibilities brought about by combining clinical and research responsibilities (including managing risk); and, thirdly, understanding various perspectives on effectiveness and the limitations of evaluation results beyond the context of the research study. Conclusions Combined implementation and evaluation of telehealthcare systems is complex, and is often underestimated. The distinction between quantitative outcomes and the workability of the system is important for producing evaluative knowledge that is of practical value. More pragmatic approaches to evaluation, that permit both quantitative and qualitative methods, are required to improve the quality of such research and its relevance for service provision in the NHS. PMID:14630758

  1. Fully automated, internally controlled quantification of hepatitis B Virus DNA by real-time PCR by use of the MagNA Pure LC and LightCycler instruments.

    PubMed

    Leb, Victoria; Stöcher, Markus; Valentine-Thon, Elizabeth; Hölzl, Gabriele; Kessler, Harald; Stekel, Herbert; Berg, Jörg

    2004-02-01

    We report on the development of a fully automated real-time PCR assay for the quantitative detection of hepatitis B virus (HBV) DNA in plasma with EDTA (EDTA plasma). The MagNA Pure LC instrument was used for automated DNA purification and automated preparation of PCR mixtures. Real-time PCR was performed on the LightCycler instrument. An internal amplification control was devised as a PCR competitor and was introduced into the assay at the stage of DNA purification to permit monitoring for sample adequacy. The detection limit of the assay was found to be 200 HBV DNA copies/ml, with a linear dynamic range of 8 orders of magnitude. When samples from the European Union Quality Control Concerted Action HBV Proficiency Panel 1999 were examined, the results were found to be in acceptable agreement with the HBV DNA concentrations of the panel members. In a clinical laboratory evaluation of 123 EDTA plasma samples, a significant correlation was found with the results obtained by the Roche HBV Monitor test on the Cobas Amplicor analyzer within the dynamic range of that system. In conclusion, the newly developed assay has a markedly reduced hands-on time, permits monitoring for sample adequacy, and is suitable for the quantitative detection of HBV DNA in plasma in a routine clinical laboratory.
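    Quantitation over a linear dynamic range of many orders of magnitude, as in the assay above, conventionally relies on a log-linear standard curve relating the threshold cycle (Ct) to the log of the input copy number. The sketch below illustrates that generic calibration step; it is not the published assay's actual calibration, and the numbers used are illustrative (a slope near -3.32 corresponds to perfect doubling per cycle).

```python
# Generic real-time PCR standard curve: Ct = slope * log10(copies) + intercept.
import math

def fit_standard_curve(copies, ct):
    # least-squares fit of Ct against log10(copy number)
    x = [math.log10(c) for c in copies]
    n = len(x)
    mx, my = sum(x) / n, sum(ct) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, ct))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

def copies_from_ct(ct, slope, intercept):
    # invert the curve to quantify an unknown sample
    return 10 ** ((ct - intercept) / slope)
```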

  2. Loudspeaker line array educational demonstration.

    PubMed

    Anderson, Brian E; Moser, Brad; Gee, Kent L

    2012-03-01

    This paper presents a physical demonstration of an audio-range line array used to teach interference of multiple sources in a classroom or laboratory exercise setting. Software has been developed that permits real-time control and steering of the array. The graphical interface permits a user to vary the frequency, steer the angular response by phase shading, and reduce sidelobes through amplitude shading. An inexpensive, eight-element loudspeaker array has been constructed to test the control program. Directivity measurements of this array in an anechoic chamber and in a large classroom are presented. These measurements have good agreement with theoretical directivity predictions, thereby allowing its use as a quantitative learning tool for advanced students as well as a qualitative demonstration of arrays in other settings. Portions of this paper are directed toward educators who may wish to implement a similar demonstration for their advanced undergraduate or graduate level course in acoustics. © 2012 Acoustical Society of America
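    The theoretical directivity predictions mentioned above follow from summing the phasor contributions of the individual elements. The sketch below is a standard far-field line-array model, not the authors' software: element count, spacing, and frequency are illustrative parameters, phase shading appears as the steering term, and amplitude shading as the optional weights.

```python
# Far-field directivity of an n-element line array (standard textbook model).
import cmath
import math

def directivity(theta, n=8, d=0.1, freq=2000.0, c=343.0, steer_deg=0.0,
                weights=None):
    # theta: observation angle (rad) from broadside; d: element spacing (m)
    # phase shading steers the main lobe toward steer_deg; amplitude
    # weights taper the aperture to reduce sidelobes
    k = 2 * math.pi * freq / c
    w = weights or [1.0] * n
    steer = math.sin(math.radians(steer_deg))
    p = sum(w[m] * cmath.exp(1j * m * k * d * (math.sin(theta) - steer))
            for m in range(n))
    return abs(p) / sum(w)   # normalized so the look direction gives 1.0
```

    Triangular or cosine-tapered weights lower the sidelobes at the cost of a wider main lobe, which is the amplitude-shading trade-off the interface exposes.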

  3. [Quantitative surface analysis of Pt-Co, Cu-Au and Cu-Ag alloy films by XPS and AES].

    PubMed

    Li, Lian-Zhong; Zhuo, Shang-Jun; Shen, Ru-Xiang; Qian, Rong; Gao, Jie

    2013-11-01

    To improve the accuracy of quantitative AES analysis, we combined XPS with AES and studied how to reduce the error of AES quantitation, selecting Pt-Co, Cu-Au, and Cu-Ag binary alloy thin films as samples and using XPS to correct the AES results by adjusting the Auger relative sensitivity factors until the two techniques gave similar compositions. We then verified the accuracy of AES quantitation with the revised sensitivity factors on further samples with different composition ratios; the corrected relative sensitivity factors reduced the error of AES quantitative analysis to less than 10%. Peak definition is difficult in the integral form of the AES spectrum, since choosing the starting and ending points that bound the characteristic Auger peak area involves great uncertainty. To simplify the analysis, we therefore also processed the data in differential form, performed quantitation on the basis of peak-to-peak height instead of peak area, corrected the relative sensitivity factors, and again verified the accuracy on samples with different composition ratios. In this case the analytical error of AES quantitation was reduced to less than 9%. These results show that the accuracy of AES quantitative analysis can be greatly improved by using XPS to correct the Auger sensitivity factors, since matrix effects are then taken into account. The good consistency obtained demonstrates the feasibility of the method.
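    The underlying arithmetic is the standard relative-sensitivity-factor quantification used by both XPS and AES. The sketch below shows that formula and one simple way to back out corrected sensitivity factors from a reference composition (e.g., an XPS result); it is a generic illustration, not the authors' procedure, and the function names and numbers are hypothetical.

```python
# Relative-sensitivity-factor quantification: c_i = (I_i/S_i) / sum_j (I_j/S_j)

def atomic_fractions(intensities, sensitivities):
    # convert measured peak intensities to atomic fractions
    ratios = [i / s for i, s in zip(intensities, sensitivities)]
    total = sum(ratios)
    return [r / total for r in ratios]

def corrected_sensitivity(intensities, reference_fractions):
    # effective sensitivity factors that reproduce a reference composition
    # (e.g., from XPS), normalized so the first element's factor is 1.0
    s0 = intensities[0] / reference_fractions[0]
    return [(i / f) / s0 for i, f in zip(intensities, reference_fractions)]
```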

  4. Quantitative X-ray dark-field and phase tomography using single directional speckle scanning technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Hongchang, E-mail: hongchang.wang@diamond.ac.uk; Kashyap, Yogesh; Sawhney, Kawal

    2016-03-21

    X-ray dark-field contrast tomography can provide important supplementary information about the interior of a sample beyond conventional absorption tomography. Recently, the X-ray speckle based technique has been proposed to provide qualitative two-dimensional dark-field imaging with a simple experimental arrangement. In this letter, we deduce a relationship between the second moment of the scattering angle distribution and the cross-correlation degradation of the speckle, and establish a quantitative basis for X-ray dark-field tomography using the single directional speckle scanning technique. In addition, the phase contrast images can be simultaneously retrieved, permitting tomographic reconstruction, which yields enhanced contrast in weakly absorbing materials. Such a complementary tomography technique can allow systematic investigation of complex samples containing both soft and hard materials.

  5. Determination of salicylic acid by HPLC in plasma and saliva from children with juvenile chronic arthritis.

    PubMed

    Legaz, M E; Acitores, E; Valverde, F

    1992-12-01

    A high performance liquid chromatography (HPLC) method has been developed for measuring salicylic acid in the plasma and saliva of children with juvenile chronic arthritis (JCA). Samples were extracted with diethyl ether and, after drying, redissolved in methanol to be chromatographed. Quantitation of salicylic acid was performed by reverse-phase HPLC on a Spherisorb ODS-2 column, using methanol:water:acetic acid as the mobile phase. The phenolic compound was monitored by absorbance at 237 nm. Linearity between the injected mass and the detector response was established. This method was applied to compare concentrations of salivary and plasma salicylic acid. The method also permitted the quantitation of salivary salicylate as a non-invasive, indirect means of monitoring the concentration of plasma salicylate in patients with JCA.

  6. Crossing the Barriers: An Analysis of Permitting Barriers to Geothermal Development and Potential Improvement Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Levine, Aaron L; Young, Katherine R

    Developers have identified many non-technical barriers to geothermal power development, including permitting. Activities required for permitting, such as the associated environmental reviews, can take a considerable amount of time and delay project development. This paper discusses the impacts to geothermal development timelines due to the permitting challenges, including the regulatory framework, environmental review process, and ancillary permits. We identified barriers that have the potential to prevent geothermal development or delay timelines and defined improvement scenarios that could assist in expediting geothermal development and permitting timelines and lead to the deployment of additional geothermal resources by 2030 and 2050: (1) the creation of a centralized federal geothermal permitting office and utilization of state permit coordination offices as well as (2) an expansion of existing categorical exclusions applicable to geothermal development on Bureau of Land Management public lands to include the oil and gas categorical exclusions passed as part of the Energy Policy Act of 2005. We utilized the Regional Energy Deployment System (ReEDS) and the Geothermal Electricity Technology Evaluation Model (GETEM) to forecast baseline geothermal deployment based on previous analysis of geothermal project development and permitting timelines. The model results forecast that reductions in geothermal project timelines can have a significant impact on geothermal deployment. For example, using the ReEDS model, we estimated that reducing timelines by two years, perhaps due to the creation of a centralized federal geothermal permitting office and utilization of state permit coordination offices, could result in deployment of an additional 204 MW by 2030 and 768 MW by 2050 - a 13% improvement when compared to the business as usual scenario.
The model results forecast that a timeline improvement of four years - for example with an expansion of existing categorical exclusions coupled with the creation of a centralized federal geothermal permitting office and utilization of state permit coordination offices - could result in deployment of an additional 2,529 MW of geothermal capacity by 2030 and 6,917 MW of geothermal capacity by 2050 - an improvement of 116% when compared to the business as usual scenario. These results suggest that reducing development timelines could be a large driver in the deployment of geothermal resources.

  7. Stage 2 tool user’s manual.

    DOT National Transportation Integrated Search

    2017-08-01

    The purpose of the Permitted Overweight Truck Corridor Analysis Tool (referred to in this document as the Stage 2 Tool) is to evaluate existing or to create new proposed overweight (OW) truck corridors to estimate the permitted OW truck, pavement, br...

  8. Oversize/overweight permitting practices review : phase II.

    DOT National Transportation Integrated Search

    2013-02-01

    This study explores a more detailed analysis of the permitting process in the Mid-Atlantic Region and delves into operational practice, and the theory and history of the practice among states. The states' practices examined in greater detail include C...

  9. Using mixed methods to assess fidelity of delivery and its influencing factors in a complex self-management intervention for people with osteoarthritis and low back pain.

    PubMed

    Toomey, Elaine; Matthews, James; Hurley, Deirdre A

    2017-08-04

    Despite an increasing awareness of the importance of fidelity of delivery within complex behaviour change interventions, it is often poorly assessed. This mixed methods study aimed to establish the fidelity of delivery of a complex self-management intervention and explore the reasons for these findings using a convergent/triangulation design. Feasibility trial of the Self-management of Osteoarthritis and Low back pain through Activity and Skills (SOLAS) intervention (ISRCTN49875385), delivered in primary care physiotherapy. 60 SOLAS sessions were delivered across seven sites by nine physiotherapists. Fidelity of delivery of prespecified intervention components was evaluated using (1) audio-recordings (n=60), direct observations (n=24) and self-report checklists (n=60) and (2) individual interviews with physiotherapists (n=9). Quantitatively, fidelity scores were calculated using percentage means and SD of components delivered. Associations between fidelity scores and physiotherapist variables were analysed using Spearman's correlations. Interviews were analysed using thematic analysis to explore potential reasons for fidelity scores. Integration of quantitative and qualitative data occurred at an interpretation level using triangulation. Quantitatively, fidelity scores were high for all assessment methods; with self-report (92.7%) consistently higher than direct observations (82.7%) or audio-recordings (81.7%). There was significant variation between physiotherapists' individual scores (69.8% - 100%). Both qualitative and quantitative data (from physiotherapist variables) found that physiotherapists' knowledge (Spearman's association at p=0.003) and previous experience (p=0.008) were factors that influenced their fidelity. The qualitative data also postulated participant-level (eg, individual needs) and programme-level factors (eg, resources) as additional elements that influenced fidelity. The intervention was delivered with high fidelity. 
This study contributes to the limited evidence regarding fidelity assessment methods within complex behaviour change interventions. The findings suggest a combination of quantitative methods is suitable for the assessment of fidelity of delivery. A mixed methods approach provided a more insightful understanding of fidelity and its influencing factors. ISRCTN49875385; Pre-results. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
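    The quantitative side of the assessment above reduces to two small calculations: a per-session fidelity percentage (components delivered out of those prespecified) and a Spearman rank correlation between fidelity scores and therapist variables. The sketch below shows both in generic form; it is illustrative, not the study's analysis code, and the tie-free Spearman formula is a simplification.

```python
# Fidelity-of-delivery scoring and Spearman rank correlation (no tie handling).

def fidelity_score(delivered):
    # percentage of prespecified intervention components delivered
    # in one session, from a 0/1 checklist
    return 100.0 * sum(delivered) / len(delivered)

def spearman_rho(x, y):
    # classic formula rho = 1 - 6*sum(d^2) / (n*(n^2-1)), valid without ties
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank + 1
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))
```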

  10. Interspecific competition in plants: how well do current methods answer fundamental questions?

    PubMed

    Connolly, J; Wayne, P; Bazzaz, F A

    2001-02-01

    Accurately quantifying and interpreting the processes and outcomes of competition among plants is essential for evaluating theories of plant community organization and evolution. We argue that many current experimental approaches to quantifying competitive interactions introduce size bias, which may significantly impact the quantitative and qualitative conclusions drawn from studies. Size bias generally arises when estimates of competitive ability are erroneously influenced by the initial size of competing individuals. We employ a series of quantitative thought experiments to demonstrate the potential for size bias in analysis of four traditional experimental designs (pairwise, replacement series, additive series, and response surfaces) either when only final measurements are available or when both initial and final measurements are collected. We distinguish three questions relevant to describing competitive interactions: Which species dominates? Which species gains? and How do species affect each other? The choice of experimental design and measurements greatly influences the scope of inference permitted. Conditions under which the latter two questions can give biased information are tabulated. We outline a new approach to characterizing competition that avoids size bias and that improves the concordance between research question and experimental design. The implications of the choice of size metrics used to quantify both the initial state and the responses of elements in interspecific mixtures are discussed. The relevance of size bias in competition studies with organisms other than plants is also discussed.
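    The paper's quantitative thought experiments are not reproduced here, but the core of size bias can be illustrated with a minimal sketch: two "species" growing exponentially at identical relative growth rates (RGR) but from different initial sizes. A final-size-only comparison suggests a competitive difference; an initial-size-adjusted comparison shows none. All numbers are hypothetical.

```python
# Size-bias illustration: identical growth, different starting sizes.
import math

def final_size(w0, rgr, t):
    # exponential growth: w(t) = w0 * exp(rgr * t)
    return w0 * math.exp(rgr * t)

wa = final_size(1.0, 0.05, 30)   # species A: small initial size
wb = final_size(2.0, 0.05, 30)   # species B: large initial size

naive_ratio = wb / wa            # final-only design: B looks twice as "competitive"
rgr_a = math.log(wa / 1.0) / 30  # initial + final measurements recover
rgr_b = math.log(wb / 2.0) / 30  # identical relative growth rates
```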

  11. Development of an ELISA for the Detection of Azaspiracids.

    PubMed

    Samdal, Ingunn A; Løvberg, Kjersti E; Briggs, Lyn R; Kilcoyne, Jane; Xu, Jianyan; Forsyth, Craig J; Miles, Christopher O

    2015-09-09

    Azaspiracids (AZAs) are a group of biotoxins that cause food poisoning in humans. These toxins are produced by small marine dinoflagellates such as Azadinium spinosum and accumulate in shellfish. Ovine polyclonal antibodies were produced and used to develop an ELISA for quantitating AZAs in shellfish, algal cells, and culture supernatants. Immunizing antigens were prepared from synthetic fragments of the constant region of AZAs, while plate coating antigen was prepared from AZA-1. The ELISA provides a sensitive and rapid analytical method for screening large numbers of samples. It has a working range of 0.45-8.6 ng/mL and a limit of quantitation for total AZAs in whole shellfish at 57 μg/kg, well below the maximum permitted level set by the European Commission. The ELISA has good cross-reactivity to AZA-1-10, -33, and -34 and 37-epi-AZA-1. Naturally contaminated Irish mussels gave similar results whether they were cooked or uncooked, indicating that the ELISA also detects 22-carboxy-AZA metabolites (e.g., AZA-17 and AZA-19). ELISA results showed excellent correlation with LC-MS/MS analysis, both for mussel extract spiked with AZA-1 and for naturally contaminated Irish mussels. The assay is therefore well suited to screening for AZAs in shellfish samples intended for human consumption, as well as for studies on AZA metabolism.
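    ELISA quantitation against a sigmoidal standard curve is commonly done with a four-parameter logistic (4PL) fit, interpolating unknowns within the working range. The sketch below shows that generic 4PL evaluation and inversion; it is not the published assay's fitted curve, and the parameter values in the test are hypothetical.

```python
# Generic four-parameter logistic (4PL) standard curve for immunoassays.

def four_pl(x, a, b, c, d):
    # response at concentration x: a = response at zero dose,
    # d = response at infinite dose, c = inflection point (EC50), b = slope
    return d + (a - d) / (1 + (x / c) ** b)

def inverse_four_pl(y, a, b, c, d):
    # interpolate an unknown's concentration from its measured response
    # (valid only for responses strictly between d and a)
    return c * ((a - d) / (y - d) - 1) ** (1 / b)
```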

  12. Light scattering application for quantitative estimation of apoptosis

    NASA Astrophysics Data System (ADS)

    Bilyy, Rostyslav O.; Stoika, Rostyslav S.; Getman, Vasyl B.; Bilyi, Olexander I.

    2004-05-01

    Estimation of cell proliferation and apoptosis is a focus of instrumental methods used in modern biomedical sciences. The present study concerns monitoring of the functional state of cells, specifically the development of their programmed death, or apoptosis. The available methods for this purpose are either very expensive or require time-consuming operations, and their specificity and sensitivity are frequently insufficient for conclusions usable in diagnostics or treatment monitoring. We propose a novel method for apoptosis measurement based on quantitative determination of the cellular functional state that takes the physical characteristics of the cells into account. The method uses a patented device, the laser microparticle analyser PRM-6, to analyze light scattering by microparticles, including cells. It offers quick, quantitative, simple (no complicated preliminary cell processing), and relatively cheap measurement of apoptosis in a cell population. The elaborated method was used to study apoptosis in murine leukemia cells of the L1210 line and human lymphoblastic leukemia cells of the K562 line. The results obtained with the proposed method permitted measuring the cell number in a tested sample and detecting and quantitatively characterizing the functional state of the cells, in particular measuring the ratio of apoptotic cells in suspension.

  13. Guidelines for a graph-theoretic implementation of structural equation modeling

    USGS Publications Warehouse

    Grace, James B.; Schoolmaster, Donald R.; Guntenspergen, Glenn R.; Little, Amanda M.; Mitchell, Brian R.; Miller, Kathryn M.; Schweiger, E. William

    2012-01-01

    Structural equation modeling (SEM) is increasingly being chosen by researchers as a framework for gaining scientific insights from the quantitative analyses of data. New ideas and methods emerging from the study of causality, influences from the field of graphical modeling, and advances in statistics are expanding the rigor, capability, and even purpose of SEM. Guidelines for implementing the expanded capabilities of SEM are currently lacking. In this paper we describe new developments in SEM that we believe constitute a third-generation of the methodology. Most characteristic of this new approach is the generalization of the structural equation model as a causal graph. In this generalization, analyses are based on graph theoretic principles rather than analyses of matrices. Also, new devices such as metamodels and causal diagrams, as well as an increased emphasis on queries and probabilistic reasoning, are now included. Estimation under a graph theory framework permits the use of Bayesian or likelihood methods. The guidelines presented start from a declaration of the goals of the analysis. We then discuss how theory frames the modeling process, requirements for causal interpretation, model specification choices, selection of estimation method, model evaluation options, and use of queries, both to summarize retrospective results and for prospective analyses. The illustrative example presented involves monitoring data from wetlands on Mount Desert Island, home of Acadia National Park. Our presentation walks through the decision process involved in developing and evaluating models, as well as drawing inferences from the resulting prediction equations. In addition to evaluating hypotheses about the connections between human activities and biotic responses, we illustrate how the structural equation (SE) model can be queried to understand how interventions might take advantage of an environmental threshold to limit Typha invasions. 
The guidelines presented provide for an updated definition of the SEM process that subsumes the historical matrix approach under a graph-theory implementation. The implementation is also designed to permit complex specifications and to be compatible with various estimation methods. Finally, they are meant to foster the use of probabilistic reasoning in both retrospective and prospective considerations of the quantitative implications of the results.

  14. Fast and accurate modeling of stray light in optical systems

    NASA Astrophysics Data System (ADS)

    Perrin, Jean-Claude

    2017-11-01

    The first problem to be solved in most optical designs with respect to stray light is that of internal reflections on the surfaces of individual lenses and mirrors, and on the detector itself. The stray light ratio can be considerably reduced by taking stray light into account during optimization, to find solutions in which the irradiance due to these ghosts is kept to the minimum possible value. Unfortunately, the routines available in most optical design software packages, for example CODE V, do not by themselves permit exact quantitative calculation of the stray light due to these ghosts. The engineer in charge of the optical design is therefore confronted with the problem of using two different packages: one for design and optimization, for example CODE V, and one for stray light analysis, for example ASAP. This makes a complete optimization very complex. Nevertheless, using special techniques and combinations of the routines available in CODE V, it is possible to build a software macro tool that performs such an analysis quickly and accurately, including Monte-Carlo ray tracing or taking diffraction effects into account. The analysis can be done in a few minutes, compared to hours with other packages.

  15. Phenotype–genotype correlation in Hirschsprung disease is illuminated by comparative analysis of the RET protein sequence

    PubMed Central

    Kashuk, Carl S.; Stone, Eric A.; Grice, Elizabeth A.; Portnoy, Matthew E.; Green, Eric D.; Sidow, Arend; Chakravarti, Aravinda; McCallion, Andrew S.

    2005-01-01

    The ability to discriminate between deleterious and neutral amino acid substitutions in the genes of patients remains a significant challenge in human genetics. The increasing availability of genomic sequence data from multiple vertebrate species allows sequence conservation and the physicochemical properties of residues to be used for functional prediction. In this study, the RET receptor tyrosine kinase serves as a model disease gene in which a broad spectrum (≥116) of disease-associated mutations has been identified among patients with Hirschsprung disease and multiple endocrine neoplasia type 2. We report the alignment of the human RET protein sequence with the orthologous sequences of 12 non-human vertebrates (eight mammalian, one avian, and three teleost species), their comparative analysis, the evolutionary topology of the RET protein, and predicted tolerance for all published missense mutations. We show that, although evolutionary conservation alone provides significant information to predict the effect of a RET mutation, a model that combines comparative sequence data with analysis of physicochemical properties in a quantitative framework provides far greater accuracy. Although the ability to discern the impact of a mutation is imperfect, our analyses permit substantial discrimination between predicted functional classes of RET mutations and disease severity even for a multigenic disease such as Hirschsprung disease. PMID:15956201

  16. The use of neural networks and texture analysis for rapid objective selection of regions of interest in cytoskeletal images.

    PubMed

    Derkacs, Amanda D Felder; Ward, Samuel R; Lieber, Richard L

    2012-02-01

    Understanding cytoskeletal dynamics in living tissue is prerequisite to understanding mechanisms of injury, mechanotransduction, and mechanical signaling. Real-time visualization is now possible using transfection with plasmids that encode fluorescent cytoskeletal proteins. Using this approach with the muscle-specific intermediate filament protein desmin, we found that a green fluorescent protein-desmin chimeric protein was unevenly distributed throughout the muscle fiber, resulting in some image areas that were saturated as well as others that lacked any signal. Our goal was to analyze the muscle fiber cytoskeletal network quantitatively in an unbiased fashion. To objectively select areas of the muscle fiber that are suitable for analysis, we devised a method that provides objective classification of regions of images of striated cytoskeletal structures into "usable" and "unusable" categories. This method consists of a combination of spatial analysis of the image using Fourier methods along with a boosted neural network that "decides" on the quality of the image based on previous training. We trained the neural network using the expert opinion of three scientists familiar with these types of images. We found that this method was over 300 times faster than manual classification and that it permitted objective and accurate classification of image regions.
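    The spatial-analysis half of the method above exploits the fact that usable regions of striated cytoskeletal images show a strong periodic component in the Fourier domain. The sketch below extracts the dominant period of a 1-D intensity profile with a naive DFT, a crude stand-in for the Fourier features fed to the classifier; it is illustrative only and not the authors' implementation.

```python
# Dominant spatial period of a 1-D intensity profile via a naive DFT.
import cmath
import math

def dominant_period(profile):
    # returns the period (in pixels) of the strongest non-DC Fourier
    # component; a strong, short period suggests intact striations
    n = len(profile)
    mean = sum(profile) / n
    x = [v - mean for v in profile]          # remove the DC component
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2 + 1):
        coef = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                   for t in range(n))
        if abs(coef) > best_mag:
            best_k, best_mag = k, abs(coef)
    return n / best_k
```

    A region classifier could threshold on the magnitude at the expected sarcomere period, with the neural network arbitrating borderline cases as described above.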

  17. Modeling and replicating statistical topology and evidence for CMB nonhomogeneity

    PubMed Central

    Agami, Sarit

    2017-01-01

    Under the banner of “big data,” the detection and classification of structure in extremely large, high-dimensional, data sets are two of the central statistical challenges of our times. Among the most intriguing new approaches to this challenge is “TDA,” or “topological data analysis,” one of the primary aims of which is providing nonmetric, but topologically informative, preanalyses of data which make later, more quantitative, analyses feasible. While TDA rests on strong mathematical foundations from topology, in applications, it has faced challenges due to difficulties in handling issues of statistical reliability and robustness, often leading to an inability to make scientific claims with verifiable levels of statistical confidence. We propose a methodology for the parametric representation, estimation, and replication of persistence diagrams, the main diagnostic tool of TDA. The power of the methodology lies in the fact that even if only one persistence diagram is available for analysis—the typical case for big data applications—the replications permit conventional statistical hypothesis testing. The methodology is conceptually simple and computationally practical, and provides a broadly effective statistical framework for persistence diagram TDA analysis. We demonstrate the basic ideas on a toy example, and the power of the parametric approach to TDA modeling in an analysis of cosmic microwave background (CMB) nonhomogeneity. PMID:29078301

  18. Forest Connectivity Regions of Canada Using Circuit Theory and Image Analysis

    PubMed Central

    Pelletier, David; Lapointe, Marc-Élie; Wulder, Michael A.; White, Joanne C.; Cardille, Jeffrey A.

    2017-01-01

    Ecological processes are increasingly well understood over smaller areas, yet information regarding interconnections and the hierarchical nature of ecosystems remains less studied and understood. Information on connectivity over large areas with high resolution source information provides for both local detail and regional context. The emerging capacity to apply circuit theory to create maps of omnidirectional connectivity provides an opportunity for improved and quantitative depictions of forest connectivity, supporting the formation and testing of hypotheses about the density of animal movement, ecosystem structure, and related links to natural and anthropogenic forces. In this research, our goal was to delineate regions where connectivity regimes are similar across the boreal region of Canada using new quantitative analyses for characterizing connectivity over large areas (e.g., millions of hectares). Utilizing the Earth Observation for Sustainable Development of forests (EOSD) circa 2000 Landsat-derived land-cover map, we created and analyzed a national-scale map of omnidirectional forest connectivity at 25 m resolution over 10000 tiles of 625 km2 each, spanning the forested regions of Canada. Using image recognition software to detect corridors, pinch points, and barriers to movement at multiple spatial scales in each tile, we developed a simple measure of the structural complexity of connectivity patterns in omnidirectional connectivity maps. We then mapped the Circuitscape resistance distance measure and used it in conjunction with the complexity data to study connectivity characteristics in each forested ecozone. Ecozone boundaries masked substantial systematic patterns in connectivity characteristics; a new classification of connectivity patterns uncovered six clear groups of forest connectivity patterns in Canada. The resulting maps allow exploration of omnidirectional forest connectivity patterns at full resolution while permitting quantitative analyses of connectivity over broad areas, informing modeling, planning, and monitoring efforts. PMID:28146573

  19. EQUIFAT: A novel scoring system for the semi-quantitative evaluation of regional adipose tissues in Equidae.

    PubMed

    Morrison, Philippa K; Harris, Patricia A; Maltin, Charlotte A; Grove-White, Dai; Argo, Caroline McG

    2017-01-01

    Anatomically distinct adipose tissues represent variable risks to metabolic health in man and some other mammals. Quantitative imaging of internal adipose depots is problematic in large animals and associations between regional adiposity and health are poorly understood. This study aimed to develop and test a semi-quantitative system (EQUIFAT) which could be applied to regional adipose tissues. Anatomically-defined, photographic images of adipose depots (omental, mesenteric, epicardial, rump) were collected from 38 animals immediately post-mortem. Images were ranked and depot-specific descriptors were developed (1 = no fat visible; 5 = excessive fat present). Nuchal-crest and ventro-abdominal-retroperitoneal adipose depot depths (cm) were transformed to categorical 5-point scores. The repeatability and reliability of EQUIFAT were independently tested by 24 observers. When half scores were permitted, inter-observer agreement was substantial (average κw: mesenteric, 0.79; omental, 0.79; rump, 0.61) or moderate (average κw: epicardial, 0.60). Intra-observer repeatability was tested by 8 observers on 2 occasions. Kappa analysis indicated perfect (omental and mesenteric) and substantial agreement (epicardial and rump) between attempts. A further 207 animals were evaluated ante-mortem (age, height, breed-type, gender, body condition score [BCS]) and again immediately post-mortem (EQUIFAT scores, carcass weight). Multivariable, random-effect linear regression models were fitted (breed as random effect; BCS as outcome variable). Only height, carcass weight, and omental and retroperitoneal EQUIFAT scores remained as explanatory variables in the final model. The EQUIFAT scores developed here demonstrate clear functional differences between regional adipose depots, and future studies could be directed towards describing associations between adiposity and disease risk in surgical and post-mortem situations.
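    The inter-observer agreement statistic reported above can be sketched as a linearly weighted Cohen's kappa. The ratings below are hypothetical, and the study's exact weighting scheme (including its handling of half scores) is not specified in the abstract:

```python
import numpy as np

def weighted_kappa(r1, r2, n_cat=5):
    """Linearly weighted Cohen's kappa for two raters on an ordinal scale.

    Ratings are integers in 1..n_cat; disagreement weights grow linearly
    with the distance between the two scores.
    """
    obs = np.zeros((n_cat, n_cat))
    for a, b in zip(r1, r2):
        obs[a - 1, b - 1] += 1
    obs /= obs.sum()
    exp = np.outer(obs.sum(axis=1), obs.sum(axis=0))  # agreement expected by chance
    i, j = np.indices((n_cat, n_cat))
    w = np.abs(i - j) / (n_cat - 1)                   # linear disagreement weights
    return 1 - (w * obs).sum() / (w * exp).sum()

# Hypothetical scores for ten depots from two observers (1 = no fat ... 5 = excessive)
obs_a = [1, 2, 2, 3, 3, 4, 4, 5, 5, 3]
obs_b = [1, 2, 3, 3, 3, 4, 5, 5, 4, 3]
print(round(weighted_kappa(obs_a, obs_b), 2))  # 0.78, "substantial" agreement
```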

  20. EQUIFAT: A novel scoring system for the semi-quantitative evaluation of regional adipose tissues in Equidae

    PubMed Central

    Morrison, Philippa K.; Harris, Patricia A.; Maltin, Charlotte A.; Grove-White, Dai; Argo, Caroline McG.

    2017-01-01

    Anatomically distinct adipose tissues represent variable risks to metabolic health in man and some other mammals. Quantitative imaging of internal adipose depots is problematic in large animals and associations between regional adiposity and health are poorly understood. This study aimed to develop and test a semi-quantitative system (EQUIFAT) which could be applied to regional adipose tissues. Anatomically-defined, photographic images of adipose depots (omental, mesenteric, epicardial, rump) were collected from 38 animals immediately post-mortem. Images were ranked and depot-specific descriptors were developed (1 = no fat visible; 5 = excessive fat present). Nuchal-crest and ventro-abdominal-retroperitoneal adipose depot depths (cm) were transformed to categorical 5-point scores. The repeatability and reliability of EQUIFAT were independently tested by 24 observers. When half scores were permitted, inter-observer agreement was substantial (average κw: mesenteric, 0.79; omental, 0.79; rump, 0.61) or moderate (average κw: epicardial, 0.60). Intra-observer repeatability was tested by 8 observers on 2 occasions. Kappa analysis indicated perfect (omental and mesenteric) and substantial agreement (epicardial and rump) between attempts. A further 207 animals were evaluated ante-mortem (age, height, breed-type, gender, body condition score [BCS]) and again immediately post-mortem (EQUIFAT scores, carcass weight). Multivariable, random-effect linear regression models were fitted (breed as random effect; BCS as outcome variable). Only height, carcass weight, and omental and retroperitoneal EQUIFAT scores remained as explanatory variables in the final model. The EQUIFAT scores developed here demonstrate clear functional differences between regional adipose depots, and future studies could be directed towards describing associations between adiposity and disease risk in surgical and post-mortem situations. PMID:28296956

  1. Aligning oversize/overweight permit fees with agency costs : critical issues.

    DOT National Transportation Integrated Search

    2013-08-01

    This project provides an elementary analysis of issues and a proposed framework for the state to evaluate cost recovery options due to OSOW operations. The authors provide a review of current permitting practices, provide a sampling of fee structur...

  2. Photochemical Energy Storage and Electrochemically Triggered Energy Release in the Norbornadiene-Quadricyclane System: UV Photochemistry and IR Spectroelectrochemistry in a Combined Experiment.

    PubMed

    Brummel, Olaf; Waidhas, Fabian; Bauer, Udo; Wu, Yanlin; Bochmann, Sebastian; Steinrück, Hans-Peter; Papp, Christian; Bachmann, Julien; Libuda, Jörg

    2017-07-06

    The two valence isomers norbornadiene (NBD) and quadricyclane (QC) enable solar energy storage in a single molecule system. We present a new photoelectrochemical infrared reflection absorption spectroscopy (PEC-IRRAS) experiment, which allows monitoring of the complete energy storage and release cycle by in situ vibrational spectroscopy. Both processes were investigated: the photochemical conversion from NBD to QC using the photosensitizer 4,4'-bis(dimethylamino)benzophenone (Michler's ketone, MK), and the electrochemically triggered cycloreversion from QC to NBD. Photochemical conversion was obtained with characteristic conversion times on the order of 500 ms. All experiments were performed under full potential control in a thin-layer configuration with a Pt(111) working electrode. The vibrational spectra of NBD, QC, and MK were analyzed in the fingerprint region, permitting quantitative analysis of the spectroscopic data. We determined selectivities for both the photochemical conversion and the electrochemical cycloreversion and identified the critical steps that limit the reversibility of the storage cycle.

  3. A case study examination of structure and function in a state health department chronic disease unit.

    PubMed

    Alongi, Jeanne

    2015-04-01

    I explored the structural and operational practices of the chronic disease prevention and control unit of a state health department and proposed a conceptual model of structure, function, and effectiveness for future study. My exploratory case study examined 7 elements of organizational structure and practice. My interviews with staff and external stakeholders of a single chronic disease unit yielded quantitative and qualitative data that I coded by perspective, process, relationship, and activity. I analyzed these for patterns and emerging themes. Chi-square analysis revealed significant correlations of collaboration with goal ambiguity, political support, and responsiveness, and of evidence-based decisions with goal ambiguity and responsiveness. Although my study design did not permit conclusions about causality, my findings suggested that some elements of the model might facilitate effectiveness for chronic disease units and should be studied further. My findings might have important implications for identifying levers around which capacity can be built that may strengthen effectiveness.

  4. A quantitative description of normal AV nodal conduction curve in man.

    PubMed

    Teague, S; Collins, S; Wu, D; Denes, P; Rosen, K; Arzbaecher, R

    1976-01-01

    The AV nodal conduction curve generated by the atrial extrastimulus technique has been described only qualitatively in man, making clinical comparison of known normal curves with those of suspected AV nodal dysfunction difficult. Also, the effects of physiological and pharmacological interventions have not been quantifiable. In 50 patients with normal AV conduction as defined by normal AH (less than 130 ms), normal AV nodal effective and functional refractory periods (less than 380 and less than 500 ms), and absence of demonstrable dual AV nodal pathways, we found that conduction curves (at sinus rhythm or longest paced cycle length) can be described by an exponential equation of the form Δ = Ae^(−Bx). In this equation, Δ is the increase in AV nodal conduction time of an extrastimulus compared to that of a regular beat, and x is the extrastimulus interval. Taking the natural logarithm yields a straight line in the semilogarithmic plane, thus permitting the constants A and B to be easily determined by least-squares regression analysis with a hand calculator.
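    The fitting procedure described above translates directly into code: taking the natural logarithm turns the exponential into a straight line, so ordinary least squares recovers the two constants. A minimal Python sketch on synthetic data (the values of A and B below are invented for illustration):

```python
import numpy as np

def fit_av_curve(x, delta):
    """Fit delta = A * exp(-B * x) by least squares on ln(delta).

    x     : extrastimulus coupling intervals (ms)
    delta : increase in AV nodal conduction time over a regular beat (ms)
    """
    x = np.asarray(x, float)
    delta = np.asarray(delta, float)
    slope, intercept = np.polyfit(x, np.log(delta), 1)  # ln(delta) = ln(A) - B*x
    return np.exp(intercept), -slope                    # A, B

# Synthetic curve with A = 500 ms and B = 0.01 /ms, plus mild noise
rng = np.random.default_rng(0)
x = np.linspace(250, 600, 15)
delta = 500 * np.exp(-0.01 * x) * rng.normal(1.0, 0.02, x.size)
A, B = fit_av_curve(x, delta)
print(A, B)  # recovers approximately 500 and 0.01
```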

  5. Optimum thermal infrared bands for mapping general rock type and temperature from space

    NASA Technical Reports Server (NTRS)

    Holmes, Q. A.; Nueesch, D. R.; Vincent, R. K.

    1980-01-01

    A study was carried out to determine quantitatively the number and location of spectral bands required to perform general rock type discrimination from spaceborne imaging sensors using only thermal infrared measurements. Beginning with laboratory spectra collected under idealized conditions from relatively well-characterized homogeneous samples, a radiative transfer model was used to transform ground exitance values into the corresponding spectral radiance at the top of the atmosphere. Taking sensor noise into account, analysis of these data revealed that three 1 micron wide spectral bands would permit independent estimations of rock type and sample temperature from a satellite infrared multispectral scanner. This study, which ignores the mixing of terrain elements within the instantaneous field of view of a satellite scanner, indicates that the location of three spectral bands at 8.1-9.1, 9.5-10.5, and 11.0-12.0 microns, and the employment of appropriate preprocessing to minimize atmospheric effects make it possible to predict general rock type and temperature for a variety of atmospheric states and temperatures.

  6. A quantitative review of overjustification effects in persons with intellectual and developmental disabilities.

    PubMed

    Levy, Allison; DeLeon, Iser G; Martinez, Catherine K; Fernandez, Nathalie; Gage, Nicholas A; Sigurdsson, Sigurdur Óli; Frank-Crawford, Michelle A

    2017-04-01

    The overjustification hypothesis suggests that extrinsic rewards undermine intrinsic motivation. Extrinsic rewards are common in strengthening behavior in persons with intellectual and developmental disabilities; we examined overjustification effects in this context. A literature search yielded 65 data sets permitting comparison of responding during an initial no-reinforcement phase to a subsequent no-reinforcement phase, separated by a reinforcement phase. We used effect sizes to compare response levels in these two no-reinforcement phases. Overall, the mean effect size did not differ from zero; levels in the second no-reinforcement phase were equally likely to be higher or lower than in the first. However, in contrast to the overjustification hypothesis, levels were higher in the second no-reinforcement phase when comparing the single no-reinforcement sessions immediately before and after reinforcement. Outcomes consistent with the overjustification hypothesis were somewhat more likely when the target behavior occurred at relatively higher levels prior to reinforcement. © 2016 Society for the Experimental Analysis of Behavior.

  7. Clinical Imaging of Bone Microarchitecture with HR-pQCT

    PubMed Central

    Nishiyama, Kyle K.; Shane, Elizabeth

    2014-01-01

    Osteoporosis, a disease characterized by loss of bone mass and structural deterioration, is currently diagnosed by dual-energy x-ray absorptiometry (DXA). However, DXA does not provide information about bone microstructure, which is a key determinant of bone strength. Recent advances in imaging permit the assessment of bone microstructure in vivo using high-resolution peripheral quantitative computed tomography (HR-pQCT). From these data, novel image processing techniques can be applied to characterize bone quality and strength. To date, most HR-pQCT studies are cross-sectional comparing subjects with and without fracture. These studies have shown that HR-pQCT is capable of discriminating fracture status independent of DXA. Recent longitudinal studies present new challenges in terms of analyzing the same region of interest and multisite calibrations. Careful application of analysis techniques and educated clinical interpretation of HR-pQCT results have improved our understanding of various bone-related diseases and will no doubt continue to do so in the future. PMID:23504496

  8. Stress and efficiency studies in EFG

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The goals of this program were: (1) to define minimum stress configurations for silicon sheet growth at high speeds; (2) to quantify the electrical activity of dislocations and their limits on minority carrier diffusion length in deformed silicon; and (3) to study reasons for degradation of lifetime with increases in doping level in edge-defined film-fed growth (EFG) materials. A finite element model was developed for calculating residual stress with plastic deformation. A finite element model relating EFG control variables to the temperature field of the sheet was verified, permitting prediction of the profiles and stresses encountered in EFG systems. A residual stress measurement technique was developed for finite size EFG material blanks using shadow Moire interferometry. Transient creep response of silicon was investigated in the temperature range between 800 and 1400 C in the strain and strain-rate regimes of interest in stress analysis of sheet growth. Quantitative relationships were established between minority carrier diffusion length and dislocation densities using Electron Beam Induced Current (EBIC) measurements in FZ silicon deformed in four-point bending tests.

  9. Observation of the time-course for peptidoglycan lipid intermediate II polymerization by Staphylococcus aureus monofunctional transglycosylase.

    PubMed

    Braddick, Darren; Sandhu, Sandeep; Roper, David I; Chappell, Michael J; Bugg, Timothy D H

    2014-08-01

    The polymerization of lipid intermediate II by the transglycosylase activity of penicillin-binding proteins (PBPs) represents an important target for antibacterial action, but limited methods are available for quantitative assay of this reaction or for screening potential inhibitors. A new labelling method for lipid II polymerization products using Sanger's reagent (1-fluoro-2,4-dinitrobenzene), followed by gel permeation HPLC analysis, has permitted the observation of intermediate polymerization products for Staphylococcus aureus monofunctional transglycosylase MGT. Peak formation is inhibited by 6 µM ramoplanin or enduracidin. Characterization by mass spectrometry indicates the formation of tetrasaccharide and octasaccharide intermediates, but not a hexasaccharide intermediate, suggesting a dimerization of a lipid-linked tetrasaccharide. Numerical modelling of the time-course data supports a kinetic model involving addition to lipid-linked tetrasaccharide of either lipid II or lipid-linked tetrasaccharide. Observation of free octasaccharide suggests that hydrolysis of the undecaprenyl diphosphate lipid carrier occurs at this stage in peptidoglycan transglycosylation. © 2014 The Authors.
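    The kind of kinetic scheme the abstract describes can be sketched numerically. The model below assumes two dimerization steps, lipid II to lipid-linked tetrasaccharide and tetrasaccharide to octasaccharide, with invented rate constants and arbitrary units; it is an illustration of time-course modelling, not the authors' fitted model:

```python
import numpy as np

def simulate(k1=0.05, k2=0.02, L0=1.0, dt=0.01, t_end=200.0):
    """Euler integration of a minimal polymerization scheme:
        2 L -> T   (rate k1 * L^2)   lipid II dimerizes to tetrasaccharide
        2 T -> O   (rate k2 * T^2)   tetrasaccharides dimerize to octasaccharide
    """
    L, T, O = L0, 0.0, 0.0
    times, traj = [], []
    for step in range(int(t_end / dt)):
        v1 = k1 * L * L
        v2 = k2 * T * T
        L += -2 * v1 * dt
        T += (v1 - 2 * v2) * dt
        O += v2 * dt
        times.append(step * dt)
        traj.append((L, T, O))
    return np.array(times), np.array(traj)

t, y = simulate()
# Disaccharide-unit mass balance: L + 2T + 4O should stay constant
mass = y[:, 0] + 2 * y[:, 1] + 4 * y[:, 2]
print(mass[0], mass[-1])
```

The mass-balance check is a useful sanity test for any such scheme; fitting the rate constants to measured peak time courses would be the next step.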

  10. A Case Study Examination of Structure and Function in a State Health Department Chronic Disease Unit

    PubMed Central

    2015-01-01

    Objectives. I explored the structural and operational practices of the chronic disease prevention and control unit of a state health department and proposed a conceptual model of structure, function, and effectiveness for future study. Methods. My exploratory case study examined 7 elements of organizational structure and practice. My interviews with staff and external stakeholders of a single chronic disease unit yielded quantitative and qualitative data that I coded by perspective, process, relationship, and activity. I analyzed these for patterns and emerging themes. Results. Chi-square analysis revealed significant correlations of collaboration with goal ambiguity, political support, and responsiveness, and of evidence-based decisions with goal ambiguity and responsiveness. Conclusions. Although my study design did not permit conclusions about causality, my findings suggested that some elements of the model might facilitate effectiveness for chronic disease units and should be studied further. My findings might have important implications for identifying levers around which capacity can be built that may strengthen effectiveness. PMID:25689211

  11. Chemical fingerprinting of Valeriana species: simultaneous determination of valerenic acids, flavonoids, and phenylpropanoids using liquid chromatography with ultraviolet detection.

    PubMed

    Navarrete, Andres; Avula, Bharathi; Choi, Young-Whan; Khan, Ikhlas A

    2006-01-01

    The roots and rhizomes of various Valeriana species are currently used as a sleeping aid or mild sedative. A liquid chromatography method has been developed that permits the analysis of chlorogenic acid, lignans, flavonoids, valerenic acids, and valepotriates in various valerian samples. The best results were obtained with a Phenomenex Luna C18(2) column using gradient elution with a mobile phase consisting of water with 0.05% phosphoric acid and 2-100% acetonitrile-methanol (1 + 1) with 0.05% phosphoric acid. The flow rate was 0.8 mL/min and ultraviolet detection was at 207, 225, 254, 280, and 325 nm. Different valerian species and commercial products showed remarkable quantitative variations. Chlorogenic acid (0.2-1.2%), 3 lignans, linarin (0.002-0.24%), and valepotriates were detected in all the Valeriana species analyzed. The highest amounts of valerenic acids were detected in V. officinalis L., trace amounts in V. sitchensis, and none in the other species analyzed.

  12. Optimum thermal infrared bands for mapping general rock type and temperature from space

    NASA Technical Reports Server (NTRS)

    Holmes, Q. A.; Nuesch, D. R.

    1978-01-01

    A study was carried out to determine quantitatively the number and locations of spectral bands required to perform general rock-type discrimination from spaceborne imaging sensors using only thermal infrared measurements. Beginning with laboratory spectra collected under idealized conditions from relatively well characterized, homogeneous samples, a radiative transfer model was employed to transform ground exitance values into the corresponding spectral radiance at the top of the atmosphere. Taking sensor noise into account, analysis of these data revealed that three 1 micrometer wide spectral bands would permit independent estimations of rock-type and sample temperature from a satellite infrared multispectral scanner. This study indicates that the location of three spectral bands at 8.1-9.1 micrometers, 9.5-10.5 micrometers, and 11.0-12.0 micrometers, and the employment of appropriate preprocessing to minimize atmospheric effects make it possible to predict general rock-type and temperature for a variety of atmospheric states and temperatures.

  13. Simple procedures for enrichment of chlorinated aromatic pollutants from fat, water and milk for subsequent analysis by high-resolution methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Egestad, B.; Curstedt, T.; Sjoevall, J.

    1982-01-01

    Procedures for enrichment of non-volatile chlorinated aromatic pollutants from fat, water and milk are described. ¹⁴C-DDT was used as a model compound in recovery experiments. A several thousand-fold enrichment of DDT added to butter was achieved by two consecutive straight-phase chromatographies on Lipidex 5000. Trace amounts of DDT in liter volumes of water could be quantitatively extracted by rapid filtration through 2 ml beds of Lipidex 1000. A batch extraction procedure permitted enrichment of DDT from milk after addition of n-pentylamine, methanol and water. DDT could then be eluted from the gel with retention of more than 90% of the lipids. A reversed-phase system with Lipidex 5000 could be used for separation of TCDD from DDT and PCBs. The liquid-gel chromatographic procedures are simple and suitable for clean-up of samples prior to application of high-resolution methods.

  14. Design of an impact evaluation using a mixed methods model--an explanatory assessment of the effects of results-based financing mechanisms on maternal healthcare services in Malawi.

    PubMed

    Brenner, Stephan; Muula, Adamson S; Robyn, Paul Jacob; Bärnighausen, Till; Sarker, Malabika; Mathanga, Don P; Bossert, Thomas; De Allegri, Manuela

    2014-04-22

    In this article we present a study design to evaluate the causal impact of providing supply-side performance-based financing incentives in combination with a demand-side cash transfer component on equitable access to and quality of maternal and neonatal healthcare services. This intervention is introduced to selected emergency obstetric care facilities and catchment area populations in four districts in Malawi. We here describe and discuss our study protocol with regard to the research aims, the local implementation context, and our rationale for selecting a mixed methods explanatory design with a quasi-experimental quantitative component. The quantitative research component consists of a controlled pre- and post-test design with multiple post-test measurements. This allows us to quantitatively measure 'equitable access to healthcare services' at the community level and 'healthcare quality' at the health facility level. Guided by a theoretical framework of causal relationships, we determined a number of input, process, and output indicators to evaluate both intended and unintended effects of the intervention. Overall causal impact estimates will result from a difference-in-difference analysis comparing selected indicators across intervention and control facilities/catchment populations over time. To further explain the heterogeneity of quantitatively observed effects and to understand the experiential dimensions of financial incentives on clients and providers, we designed a qualitative component in line with the overall explanatory mixed methods approach. This component consists of in-depth interviews and focus group discussions with providers, service users, non-users, and policy stakeholders. In this explanatory design, comprehensive understanding of expected and unexpected effects of the intervention on both access and quality will emerge through careful triangulation at two levels: across multiple quantitative elements and across quantitative and qualitative elements. Combining a traditional quasi-experimental controlled pre- and post-test design with an explanatory mixed methods model permits an additional assessment of organizational and behavioral changes affecting complex processes. Through this impact evaluation approach, our design will not only create robust evidence measures for the outcome of interest, but also generate insights on how and why the investigated interventions produce certain intended and unintended effects, allowing for a more in-depth evaluation approach.
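    The difference-in-difference estimator at the heart of the quantitative component can be sketched in a few lines. The facility scores below are invented for illustration:

```python
import numpy as np

def did_estimate(pre_t, post_t, pre_c, post_c):
    """Difference-in-differences: the (post - pre) change in the treated group
    minus the (post - pre) change in controls removes time trends shared by
    both groups, isolating the intervention effect under the parallel-trends
    assumption."""
    return (np.mean(post_t) - np.mean(pre_t)) - (np.mean(post_c) - np.mean(pre_c))

# Hypothetical facility-level quality scores (0-100)
pre_treated,  post_treated = [62, 58, 65], [74, 70, 78]
pre_control,  post_control = [60, 61, 59], [64, 66, 63]
print(did_estimate(pre_treated, post_treated, pre_control, post_control))
```

In a real evaluation this contrast would be embedded in a regression with facility and time effects, but the core arithmetic is exactly this double difference.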

  15. Application of Earth Resources Technology Satellite data to urban development and regional planning: Test site, County of Los Angeles

    NASA Technical Reports Server (NTRS)

    Raje, S. (Principal Investigator); Economy, R.; Mcknight, J. S.; Garofalo, P.

    1973-01-01

    The author has identified the following significant results. Significant results have been obtained from the analyses of ERTS-1 imagery from five cycles over Test Site SR 124 by classical photointerpretation and by an interactive hybrid multispectral information extraction system (GEMS). Photointerpretation has produced over 25 overlays at 1:1,000,000 scale depicting regional relations and urban structure in terms of several hundred linear and areal features. A possible new fault lineament has been discovered on the northern slope of the Santa Monica mountains. GEMS analysis of the ERTS-1 products has provided new or improved information in the following planning data categories: urban vegetation; land cover segregation; manmade and natural impact monitoring; urban design; land suitability. ERTS-1 data analysis has allowed planners to establish trends that directly impact planning policies. For example, detectable grading and new construction sites quantitatively indicated the extent, direction, and rate of urban expansion, enabling planners to forecast demand and growth patterns on a regional scale. This new source of information will not only make current methods more efficient, but will also permit entirely new planning methodologies to be employed.

  16. Quality engineering and control semiannual progress report, November--December 1977 and January--April 1978

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carpenter, R.L.

    1978-10-10

    Determination of ppm concentrations of dibutyl phosphate in aqueous solutions of nitric acid or sodium carbonate is possible after an extraction step with carbon tetrachloride. Ester cleavage of the fatty triglyceride constituents is the principal radiolytic reaction in irradiation of a dolphin-head oil base precision watch oil. Use of a special container and a vacuum-brazing furnace permits the high-temperature dynamic determination of volatile surface contamination from samples of large geometric surface areas. Status of the mass spectrometer development project is discussed. A mass spectroscopy method provided rapid, specific, and quantitative results for environmental monitoring of polychlorinated biphenyl compounds. A sequential procedure for improved separation of actinides in soil samples was developed. Testing of packings in intimate contact with acid was accomplished by differential thermal analysis. Analysis results obtained on the XRD-6 x-ray unit are discussed for Nb in U-Nb alloys and for Cr and Ni in stainless steels. Correlation coefficients of 0.994 or better resulted for the determinations of Cr, Mn, and Ni in the stainless steel standards employed in the statistical evaluation of the Quanta Metrix x-ray unit.

  17. Bile Salt-induced Biofilm Formation in Enteric Pathogens: Techniques for Identification and Quantification.

    PubMed

    Nickerson, Kourtney P; Faherty, Christina S

    2018-05-06

    Biofilm formation is a dynamic, multistage process that occurs in bacteria under harsh environmental conditions or times of stress. For enteric pathogens, a significant stress response is induced during gastrointestinal transit and upon bile exposure, a normal component of human digestion. To overcome the bactericidal effects of bile, many enteric pathogens form a biofilm hypothesized to permit survival when transiting through the small intestine. Here we present methodologies to define biofilm formation through solid-phase adherence assays as well as extracellular polymeric substance (EPS) matrix detection and visualization. Furthermore, biofilm dispersion assessment is presented to mimic the analysis of events triggering release of bacteria during the infection process. Crystal violet staining is used to detect adherent bacteria in a high-throughput 96-well plate adherence assay. EPS production assessment is determined by two assays, namely microscopy staining of the EPS matrix and semi-quantitative analysis with a fluorescently-conjugated polysaccharide binding lectin. Finally, biofilm dispersion is measured through colony counts and plating. Positive data from multiple assays support the characterization of biofilms and can be utilized to identify bile salt-induced biofilm formation in other bacterial strains.
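    The crystal violet adherence readout can be sketched as a simple plate-analysis routine. The plate layout, OD values, and the 2x-blank cutoff below are invented for illustration; published assays typically derive positivity cutoffs from the distribution of negative-control readings:

```python
import numpy as np

def biofilm_calls(od, blank_wells, sample_wells, cutoff=2.0):
    """Call biofilm-positive wells from crystal-violet absorbance readings.

    od           : 8x12 array of OD570 values from a 96-well plate
    blank_wells  : (row, col) positions of media-only controls
    sample_wells : (row, col) positions of inoculated wells
    A well is called positive when its OD reaches `cutoff` times the blank mean.
    """
    od = np.asarray(od, float)
    blank = np.mean([od[r, c] for r, c in blank_wells])
    return blank, {(r, c): od[r, c] / blank >= cutoff for r, c in sample_wells}

plate = np.full((8, 12), 0.05)   # uniform background staining
plate[1, 1] = 0.40               # strong biofilm former
plate[1, 2] = 0.07               # weak adherence, below cutoff
blank, calls = biofilm_calls(plate, [(0, 0), (0, 1)], [(1, 1), (1, 2)])
print(blank, calls)
```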

  18. Quantitative analysis of major elements in silicate minerals and glasses by micro-PIXE

    USGS Publications Warehouse

    Campbell, J.L.; Czamanske, G.K.; MacDonald, L.; Teesdale, W.J.

    1997-01-01

    The Guelph micro-PIXE facility has been modified to accommodate a second Si(Li) X-ray detector which records the spectrum due to light major elements (11 ≤ Z ≤ 20) with no deleterious effects from scattered 3 MeV protons. Spectra have been recorded from 30 well-characterized materials, including a broad range of silicate minerals and both natural and synthetic glasses. Sodium is mobile in some of the glasses, but not in the studied mineral lattices. The mean value of the instrumental constant H for each of the elements Mg, Al, and Si in these materials is systematically 6-8% lower than the H-value measured for the pure metals. Normalization factors are derived which permit the matrix corrections requisite for trace-element measurements in silicates to be based upon pure metal standards for Mg, Al and Si, supplemented by well-established, silicate mineral standards for the elements Na, K and Ca. Rigorous comparisons of electron microprobe and micro-PIXE analyses for the entire, 30-sample suite demonstrate the ability of micro-PIXE to produce accurate analysis for the light major elements in silicates. © 1997 Elsevier Science B.V.
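    The normalization idea in this record can be sketched numerically. The H-values below are invented for illustration (the abstract states only that the silicate-derived H runs 6-8% below the pure-metal value):

```python
# Illustrative only: hypothetical instrumental constants H measured on
# pure-metal standards and on silicate standards for the light elements.
H_metal = {"Mg": 1.00, "Al": 1.02, "Si": 0.98}      # hypothetical values
H_silicate = {"Mg": 0.93, "Al": 0.95, "Si": 0.91}   # hypothetical values

# Normalization factor per element: re-anchors a pure-metal standard
# so it can serve for matrix corrections in silicate trace-element work.
norm = {el: H_silicate[el] / H_metal[el] for el in H_metal}
for el, f in norm.items():
    print(el, round(f, 3))   # factors near 0.92-0.93, i.e. 6-8% low
```

    A trace-element concentration computed against a pure-metal H would then be multiplied by the corresponding factor before reporting.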

  19. EEG power and coherence while male adults watch emotional video films.

    PubMed

    Schellberg, D; Besthorn, C; Klos, T; Gasser, T

    1990-10-01

    Quantitative analysis of EEG recorded at F3, F4, T3, T4, P3 and P4 was performed for a group of healthy right-handed male adults (n = 9) viewing video films varying in their inductiveness on the affective valence dimension. Digital EOG-correction permitted the inclusion of trials with eye movements. Muscle artifacts were statistically treated by means of analysis of covariance (ANCOVA). The configuration of topographically motivated EEG parameters corresponded to the subjective valence rating of different video films. Low broad band coherences (COHs) ranked films along the subjective ratings within each hemisphere by the fronto-temporal COHs and interhemispherically by the T4-T3 COH, as did, restricted to the right hemisphere, similarity of beta 2 band power topography over time. High frequencies may be involved in the processing and low frequencies in the transmission of differential affective information, whose integration appeared to draw on resources of both hemispheres. Alpha 2 and beta 1 COHs were sensitive to variations in an integrality/disassociation dimension with regard to the arrangement of verbal-visual affective cues. Power fluctuations at frontal leads pointed to difficulties in interpreting interhemispheric EEG asymmetries in emotion research, if information on time dynamics is discarded.

  20. SFC-APLI-(TOF)MS: Hyphenation of Supercritical Fluid Chromatography to Atmospheric Pressure Laser Ionization Mass Spectrometry.

    PubMed

    Klink, Dennis; Schmitz, Oliver Johannes

    2016-01-05

    Atmospheric-pressure laser ionization mass spectrometry (APLI-MS) is a powerful method for the analysis of polycyclic aromatic hydrocarbon (PAH) molecules, which are ionized in a selective and highly sensitive way via resonance-enhanced multiphoton ionization. APLI was presented in 2005 and has been hyphenated successfully to chromatographic separation techniques like high performance liquid chromatography (HPLC) and gas chromatography (GC). In order to expand the portfolio of chromatographic couplings to APLI, the aim of this work was to construct a new hyphenation setup of APLI and supercritical fluid chromatography (SFC). Here, we demonstrate the first hyphenation of SFC and APLI in a simply designed setup, with attention to the different optimization steps needed to ensure a sensitive analysis. The new setup permits qualitative and quantitative determination of native and also more polar PAH molecules. As a result of the altered ambient characteristics within the source enclosure, the quantification of 1-hydroxypyrene (1-HP) in human urine is possible without prior derivatization. The limit of detection for 1-HP by SFC-APLI-(TOF)MS was found to be 0.5 μg L(-1), which is lower than the 1-HP concentrations found in exposed persons.

  1. Determination of H2O and CO2 concentrations in fluid inclusions in minerals using laser decrepitation and capacitance manometer analysis

    NASA Technical Reports Server (NTRS)

    Yonover, R. N.; Bourcier, W. L.; Gibson, E. K.

    1985-01-01

    Water and carbon dioxide concentrations within individual and selected groups of fluid inclusions in quartz were analyzed by using laser decrepitation and quantitative capacitance manometer determination. The useful limit of detection (calculated as ten times the typical background level) is about 5 x 10(-10) mol of H2O and 5 x 10(-11) mol of CO2; this H2O content translates into an aqueous fluid inclusion approximately 25 micrometers in diameter. CO2/H2O determinations for 38 samples (100 separate measurements) have a range of H2O amounts of 5.119 x 10(-9) to 1.261 x 10(-7) mol; CO2 amounts of 7.216 x 10(-10) to 1.488 x 10(-8) mol, and CO2/H2O mole ratios of 0.011 to 1.241. Replicate mole ratio determinations of CO2/H2O for three identical (?) clusters of inclusions in quartz have average mole ratios of 0.0305 +/- 0.0041 (1 sigma). Our method offers much promise for analysis of individual fluid inclusions, is sensitive, is selective when the laser energy is not so great as to melt the mineral (laser pits approximately 50 micrometers in diameter), and permits rapid analysis (approximately 1 h per sample analysis).
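    The quoted detection limit can be cross-checked against the quoted inclusion size with the volume arithmetic the record implies. The spherical geometry and pure-water density are assumptions for the sake of the check:

```python
import math

# Cross-check: does ~5e-10 mol of H2O match a ~25 micrometre aqueous
# inclusion? Assumes a spherical inclusion of pure water (1.0 g/cm^3).
def inclusion_moles(diameter_um, density_g_cm3=1.0, molar_mass=18.015):
    """Moles of H2O in a spherical aqueous inclusion of given diameter."""
    r_cm = (diameter_um / 2.0) * 1e-4            # micrometres -> centimetres
    volume_cm3 = (4.0 / 3.0) * math.pi * r_cm ** 3
    return volume_cm3 * density_g_cm3 / molar_mass

print(f"{inclusion_moles(25.0):.1e} mol")   # ~4.5e-10, near the 5e-10 limit
```

    The cube-law dependence on diameter means halving the inclusion size drops the H2O content eightfold, which is why the detection limit sets a hard floor on analyzable inclusion diameter.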

  2. Oral pathology follow-up by means of micro-Raman spectroscopy on tissue and blood serum samples: an application of wavelet and multivariate data analysis

    NASA Astrophysics Data System (ADS)

    Delfino, I.; Camerlingo, C.; Zenone, F.; Perna, G.; Capozzi, V.; Cirillo, N.; Gaeta, G. M.; De Mol, E.; Lepore, M.

    2009-02-01

    Pemphigus vulgaris (PV) is a potentially fatal autoimmune disease that causes blistering of the skin and oral cavity. It is characterized by disruption of cell-cell adhesion within the suprabasal layers of epithelium, a phenomenon termed acantholysis. Patients with PV develop IgG autoantibodies against normal constituents of the intercellular substance of keratinocytes. The mechanisms by which such autoantibodies induce blisters are not clearly understood. The qualitative analysis of such effects provides important clues in the search for a specific diagnosis, and the quantitative analysis of biochemical abnormalities is important in measuring the extent of the disease process, designing therapy and evaluating the efficacy of treatment. Improved diagnostic techniques could permit the recognition of more subtle forms of disease and reveal clinically unapparent incipient lesions, so that progression of potentially severe forms could be reversed with appropriate treatment. In this paper, we report the results of our micro-Raman spectroscopy study on tissue and blood serum samples from ill, recovered and under-therapy PV patients. The complexity of the differences among their characteristic Raman spectra has required a specific strategy to obtain reliable information on the illness stage of the patients. For this purpose, wavelet techniques and advanced multivariate analysis methods have been developed and applied to the experimental Raman spectra. Promising results have been obtained.

  3. A review of post-nuclear-catastrophe management

    NASA Astrophysics Data System (ADS)

    Nifenecker, Hervé

    2015-07-01

    The purpose of this paper is to make radioactive risk more generally understandable. To that end, we compare it to smoking tobacco. Further, we show that the concept of loss of life expectancy permits a quantitative comparison between various aggressions. The demystification of radioactive risk should lead to basic changes in post-catastrophe management, allowing victims to choose whether or not to leave contaminated areas. A less emotional appreciation of radioactive risks should lead to the adaptation of legal practices when dealing with probabilistic situations.

  4. An overview of contemporary nuclear cardiology.

    PubMed

    Lewin, Howard C; Sciammarella, Maria G; Watters, Thomas A; Alexander, Herbert G

    2004-01-01

    Myocardial perfusion single photon emission computed tomography (SPECT) is a widely utilized noninvasive imaging modality for the diagnosis, prognosis, and risk stratification of coronary artery disease. It is clearly superior to the traditional planar technique in terms of imaging contrast and consequent diagnostic and prognostic yield. The strength of SPECT images is largely derived from the three-dimensional, volumetric nature of its image. Thus, this modality permits three-dimensional assessment and quantitation of the perfused myocardium and functional assessment through electrocardiographic gating of the perfusion images.

  5. Rainbow Schlieren

    NASA Technical Reports Server (NTRS)

    Howes, W. L.

    1983-01-01

    The rainbow schlieren is an apparatus in which the usual schlieren knife edge cutoff is replaced by a radial rainbow filter with a transparent center and an opaque surround. With this apparatus most refractive index nonuniformities in the test section appear varicolored whereas uniformities appear white. The rainbow schlieren is simple, easy to use, and relatively inexpensive and gives much greater detail regarding nonuniformities than does the ordinary schlieren. Moreover, the rainbow schlieren permits quantitative evaluation of certain refractive index distributions, including those involving turbulence, by simple calculations.

  6. Real-time monitoring of volatile organic compounds using chemical ionization mass spectrometry

    DOEpatents

    Mowry, Curtis Dale; Thornberg, Steven Michael

    1999-01-01

    A system for on-line quantitative monitoring of volatile organic compounds (VOCs) includes pressure reduction means for carrying a gaseous sample from a first location to a measuring input location maintained at a low pressure, the system utilizing active feedback to keep both the vapor flow and pressure to a chemical ionization mode mass spectrometer constant. A multiple input manifold for VOC and gas distribution permits a combination of calibration gases or samples to be applied to the spectrometer.

  7. Sensitivity Study for Long Term Reliability

    NASA Technical Reports Server (NTRS)

    White, Allan L.

    2008-01-01

    This paper illustrates using Markov models to establish system and maintenance requirements for small electronic controllers where the goal is a high probability of continuous service for a long period of time. The system and maintenance items considered are quality of components, various degrees of simple redundancy, redundancy with reconfiguration, diagnostic levels, periodic maintenance, and preventive maintenance. Markov models permit a quantitative investigation with comparison and contrast. An element of special interest is the use of conditional probability to study the combination of imperfect diagnostics and periodic maintenance.
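    A minimal sketch of how a Markov model quantifies such trade-offs (this is not the paper's actual model): a duplex controller with a hypothetical per-unit failure rate `lam` and repair rate `mu`, integrated through the Kolmogorov forward equations with an absorbing system-failure state:

```python
# States: 0 = both units up, 1 = one unit up, 2 = system failed (absorbing).
# Fixed-step Euler integration of the Kolmogorov forward equations.
def duplex_reliability(lam, mu, t_end, dt=0.1):
    """Probability of continuous service up to t_end for a duplex system."""
    p0, p1, p2 = 1.0, 0.0, 0.0           # start with both units working
    for _ in range(int(t_end / dt)):
        d0 = -2.0 * lam * p0 + mu * p1   # lose a unit / recover one
        d1 = 2.0 * lam * p0 - (lam + mu) * p1
        d2 = lam * p1                    # second failure before repair
        p0, p1, p2 = p0 + d0 * dt, p1 + d1 * dt, p2 + d2 * dt
    return 1.0 - p2

# Comparing maintenance policies: repair (mu > 0) versus none.
print(duplex_reliability(lam=1e-4, mu=0.1, t_end=1e4))
print(duplex_reliability(lam=1e-4, mu=0.0, t_end=1e4))
```

    The comparison makes the paper's point concrete: the same redundancy with repair yields a long-mission reliability close to 1, while without repair it decays substantially over the same interval.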

  8. Systems analysis of iron metabolism: the network of iron pools and fluxes

    PubMed Central

    2010-01-01

    Background Every cell of the mammalian organism needs iron as trace element in numerous oxido-reductive processes as well as for transport and storage of oxygen. The very versatility of ionic iron makes it a toxic entity which can catalyze the production of radicals that damage vital membranous and macromolecular assemblies in the cell. The mammalian organism therefore maintains a complex regulatory network of iron uptake, excretion and intra-body distribution. Intracellular regulation in different cell types is intertwined with a global hormonal signalling structure. Iron deficiency as well as excess of iron are frequent and serious human disorders. They can affect every cell, but also the organism as a whole. Results Here, we present a kinetic model of the dynamic system of iron pools and fluxes. It is based on ferrokinetic data and chemical measurements in C57BL6 wild-type mice maintained on iron-deficient, iron-adequate, or iron-loaded diet. The tracer iron levels in major tissues and organs (16 compartments) were followed for 28 days. The evaluation resulted in a whole-body model of fractional clearance rates. The analysis permits calculation of absolute flux rates in the steady-state, of iron distribution into different organs, of tracer-accessible pool sizes and of residence times of iron in the different compartments in response to three states of iron-repletion induced by the dietary regime. Conclusions This mathematical model presents a comprehensive physiological picture of mice under three different diets with varying iron contents. The quantitative results reflect systemic properties of iron metabolism: dynamic closedness, hierarchy of time scales, switch-over response and dynamics of iron storage in parenchymal organs. Therefore, we could assess which parameters will change under dietary perturbations and study in quantitative terms when those changes take place. PMID:20704761
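    Two steady-state identities underlying such fractional-clearance evaluations can be illustrated with invented numbers (the pool size and clearance rate below are hypothetical, not the paper's data):

```python
# Steady-state bookkeeping for one compartment of a pool-and-flux model:
#   absolute efflux  = fractional clearance rate * pool size
#   residence time   = pool size / total efflux
def efflux(fractional_clearance_per_day, pool_size):
    """Absolute flux out of a pool at steady state."""
    return fractional_clearance_per_day * pool_size

def residence_time(pool_size, total_efflux):
    """Mean time an iron atom spends in the pool at steady state."""
    return pool_size / total_efflux

plasma_iron = 2.0    # pool size (arbitrary units), hypothetical
k_clear = 12.0       # fractional clearance per day, hypothetical
flux = efflux(k_clear, plasma_iron)        # 24 units/day
print(residence_time(plasma_iron, flux))   # 1/12 day, i.e. 2 hours
```

    Chaining these identities across all 16 compartments is what turns measured fractional clearance rates into the absolute flux rates and residence times the abstract reports.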

  9. Implementation of a generic SFC-MS method for the quality control of potentially counterfeited medicinal cannabis with synthetic cannabinoids.

    PubMed

    Jambo, Hugues; Dispas, Amandine; Avohou, Hermane T; André, Sébastien; Hubert, Cédric; Lebrun, Pierre; Ziemons, Éric; Hubert, Philippe

    2018-06-05

    In this study, we describe the development of an SFC-MS method for the quality control of cannabis plants that could be potentially adulterated with synthetic cannabinoids. Considering the high number of already available synthetic cannabinoids and the high rate of development of novel structures, we aimed to develop a generic method suitable for the analysis of a large panel of substances using seventeen synthetic cannabinoids from multiple classes as model compounds. Firstly, a suitable column was chosen after a screening phase. Secondly, optimal operating conditions were obtained following a robust optimization strategy based on a design of experiments and design space methodology (DoE-DS). Finally, the quantitative performances of the method were assessed with a validation according to the total error approach. The developed method has a run time of 9.4 min. It uses a simple modifier composition of methanol with 2% H2O and requires minimal sample preparation. It can chromatographically separate natural cannabinoids (except THC-A and CBD-A) from the synthetics assessed. Also, the use of mass spectrometry provides sensitivity and specificity. Moreover, this quality by design (QbD) approach permits the tuning of the method (within the DS) during routine analysis to achieve a desirable separation since the future compounds that should be analyzed could be unknown. The method was validated for the quantitation of a selected synthetic cannabinoid in fiber-type cannabis matrix over the range of 2.5%-7.5% (w/w) with LOD value as low as 14.4 ng/mL. This generic method should be easy to implement in customs or QC laboratories in the context of counterfeit drug tracking. Copyright © 2018 Elsevier B.V. All rights reserved.

  10. pH optimization for a reliable quantification of brain tumor cell and tissue extracts with (1)H NMR: focus on choline-containing compounds and taurine.

    PubMed

    Robert, O; Sabatier, J; Desoubzdanne, D; Lalande, J; Balayssac, S; Gilard, V; Martino, R; Malet-Martino, M

    2011-01-01

    The aim of this study was to define the optimal pH for (1)H nuclear magnetic resonance (NMR) spectroscopy analysis of perchloric acid or methanol-chloroform-water extracts from brain tumor cells and tissues. The systematic study of the proton chemical shift variations as a function of pH of 13 brain metabolites in model solutions demonstrated that recording (1)H NMR spectra at pH 10 allowed resolving resonances that are overlapped at pH 7, especially in the 3.2-3.3 ppm choline-containing-compounds region. (1)H NMR analysis of extracts at pH 7 or 10 showed that quantitative measurements of lactate, alanine, glutamate, glutamine (Gln), creatine + phosphocreatine and myo-inositol (m-Ino) can be readily performed at both pHs. The concentrations of glycerophosphocholine, phosphocholine and choline that are crucial metabolites for tumor brain malignancy grading were accurately measured at pH 10 only. Indeed, the resonances of their trimethylammonium moieties are cleared of any overlapping signal, especially those of taurine (Tau) and phosphoethanolamine. The four non-ionizable Tau protons, resonating as a singlet in a non-congested spectral region, permit an easier and more accurate quantitation of this apoptosis marker at pH 10 than at pH 7, where the triplet at 3.43 ppm can be overlapped with the signals of glucose or have an intensity too low to be measured. Glycine concentration was determined indirectly at both pHs after subtracting the contribution of the overlapped signals of m-Ino at pH 7 or Gln at pH 10.
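    The pH dependence exploited here follows a Henderson-Hasselbalch-type weighting: under fast exchange, the observed shift is the population average of the protonated and deprotonated forms. The pKa and the deprotonated-form shift below are illustrative assumptions (only the 3.43 ppm triplet position is quoted in the abstract):

```python
# Fast-exchange titration sketch: observed 1H shift as the
# population-weighted average of protonated (HA) and deprotonated (A-)
# limiting shifts. pKa = 9.0 and delta_A = 3.25 ppm are illustrative.
def observed_shift(pH, pKa, delta_HA, delta_A):
    frac_A = 1.0 / (1.0 + 10 ** (pKa - pH))   # deprotonated fraction
    return (1.0 - frac_A) * delta_HA + frac_A * delta_A

for pH in (7.0, 10.0):
    print(pH, round(observed_shift(pH, pKa=9.0, delta_HA=3.43, delta_A=3.25), 3))
```

    This is why raising the pH to 10 can move a resonance out of a congested region: a unit pH change around the pKa redistributes the populations and shifts the averaged peak by a substantial fraction of the full titration range.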

  11. Accurate quantification of chromosomal lesions via short tandem repeat analysis using minimal amounts of DNA.

    PubMed

    Jann, Johann-Christoph; Nowak, Daniel; Nolte, Florian; Fey, Stephanie; Nowak, Verena; Obländer, Julia; Pressler, Jovita; Palme, Iris; Xanthopoulos, Christina; Fabarius, Alice; Platzbecker, Uwe; Giagounidis, Aristoteles; Götze, Katharina; Letsch, Anne; Haase, Detlef; Schlenk, Richard; Bug, Gesine; Lübbert, Michael; Ganser, Arnold; Germing, Ulrich; Haferlach, Claudia; Hofmann, Wolf-Karsten; Mossner, Maximilian

    2017-09-01

    Cytogenetic aberrations such as deletion of chromosome 5q (del(5q)) represent key elements in routine clinical diagnostics of haematological malignancies. Currently established methods such as metaphase cytogenetics, FISH or array-based approaches have limitations due to their dependency on viable cells, high costs or semi-quantitative nature. Importantly, they cannot be used on low abundance DNA. We therefore aimed to establish a robust and quantitative technique that overcomes these shortcomings. For precise determination of del(5q) cell fractions, we developed an inexpensive multiplex-PCR assay requiring only nanograms of DNA that simultaneously measures allelic imbalances of 12 independent short tandem repeat markers. Application of this method to n=1142 samples from n=260 individuals revealed strong intermarker concordance (R²=0.77-0.97) and reproducibility (mean SD: 1.7%). Notably, the assay showed accurate quantification via standard curve assessment (R²>0.99) and high concordance with paired FISH measurements (R²=0.92) even with subnanogram amounts of DNA. Moreover, cytogenetic response was reliably confirmed in del(5q) patients with myelodysplastic syndromes treated with lenalidomide. While the assay demonstrated good diagnostic accuracy in receiver operating characteristic analysis (area under the curve: 0.97), we further observed robust correlation between bone marrow and peripheral blood samples (R²=0.79), suggesting its potential suitability for less-invasive clonal monitoring. In conclusion, we present an adaptable tool for quantification of chromosomal aberrations, particularly in problematic samples, which should be easily applicable to further tumour entities. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
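    The allelic-imbalance logic can be sketched as follows, under the simplifying assumption that one allele of a heterozygous STR marker lies on the deleted 5q segment, so its signal scales with the non-deleted cell fraction. The peak areas and the averaging scheme are hypothetical, not the published assay:

```python
# Sketch: estimating the del(5q) clone fraction f from STR peak areas.
# For a pure heterozygous deletion, deleted_peak/retained_peak = 1 - f.
def clone_fraction(retained_peak, deleted_peak):
    """Deleted-clone fraction from one heterozygous marker's peak areas."""
    ratio = deleted_peak / retained_peak
    return max(0.0, min(1.0, 1.0 - ratio))   # clamp noise to [0, 1]

def mean_clone_fraction(marker_peaks):
    """Average over independent informative markers, as in a multiplex."""
    estimates = [clone_fraction(a, b) for a, b in marker_peaks]
    return sum(estimates) / len(estimates)

# Three hypothetical markers, each consistent with a ~60% deleted clone:
print(mean_clone_fraction([(1000, 400), (800, 310), (1200, 490)]))
```

    Averaging over many independent markers is what gives a multiplex assay its reported precision: single-marker noise in the peak ratio is beaten down by the number of informative loci.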

  12. Flortaucipir tau PET imaging in semantic variant primary progressive aphasia.

    PubMed

    Makaretz, Sara J; Quimby, Megan; Collins, Jessica; Makris, Nikos; McGinnis, Scott; Schultz, Aaron; Vasdev, Neil; Johnson, Keith A; Dickerson, Bradford C

    2017-10-06

    The semantic variant of primary progressive aphasia (svPPA) is typically associated with frontotemporal lobar degeneration (FTLD) with TAR DNA-binding protein (TDP)-43-positive neuropil threads and dystrophic neurites (type C), and is only rarely due to a primary tauopathy or Alzheimer's disease. We undertook this study to investigate the localisation and magnitude of the presumed tau positron emission tomography (PET) tracer [18F]Flortaucipir (FTP; also known as T807 or AV1451) in patients with svPPA, hypothesising that most patients would not show tracer uptake different from controls. FTP and [11C]Pittsburgh compound B PET imaging as well as MRI were performed in seven patients with svPPA and in 20 controls. FTP signal was analysed by visual inspection and by quantitative comparison to controls, with and without partial volume correction. All seven patients showed elevated FTP uptake in the anterior temporal lobe with a leftward asymmetry that was not observed in healthy controls. This elevated FTP signal, largely co-localised with atrophy, was evident on both visual inspection and quantitative cortical surface-based analysis. Five patients were amyloid negative, one was amyloid positive and one has an unknown amyloid status. In this series of patients with clinical profiles, structural MRI and amyloid PET imaging typical for svPPA, FTP signal was unexpectedly elevated with a spatial pattern localised to areas of atrophy. This raises questions about the possible off-target binding of this tracer to non-tau molecules associated with neurodegeneration. Further investigation with autopsy analysis will help illuminate the binding target(s) of FTP in cases of suspected FTLD-TDP neuropathology. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  13. Crater studies: Part A: lunar crater morphometry

    USGS Publications Warehouse

    Pike, Richard J.

    1973-01-01

    Morphometry, the quantitative study of shape, complements the visual observation and photointerpretation in analyzing the most outstanding landforms of the Moon, its craters (refs. 32-1 and 32-2). All three of these interpretative tools, which were developed throughout the long history of telescopic lunar study preceding the Apollo Program, will continue to be applicable to crater analysis until detailed field work becomes possible. Although no large (>17.5 km diameter) craters were examined in situ on any of the Apollo landings, the photographs acquired from the command modules will markedly strengthen results of less direct investigations of the craters. For morphometry, the most useful materials are the orbital metric and panoramic photographs from the final three Apollo missions. These photographs permit preparation of contour maps, topographic profiles, and other numerical data that accurately portray for the first time the surface geometry of lunar craters of all sizes. Interpretations of craters no longer need be compromised by inadequate topographic data. In the pre-Apollo era, hypotheses for the genesis of lunar craters usually were constructed without any numerical descriptive data. Such speculations will have little credibility unless supported by accurate, quantitative data, especially those generated from Apollo orbital photographs. This paper presents a general study of the surface geometry of 25 far-side craters and a more detailed study of rim-crest evenness for 15 near-side and far-side craters. Analysis of this preliminary sample of Apollo 15 and 17 data, which includes craters between 1.5 and 275 km in diameter, suggests that most genetic interpretations of craters made from pre-Apollo topographic measurements may require no drastic revision. All measurements were made from topographic profiles generated on a stereoplotter at the Photogrammetric Unit of the U.S. Geological Survey, Center of Astrogeology, Flagstaff, Arizona.

  14. Spectrometer gun

    DOEpatents

    Waechter, David A.; Wolf, Michael A.; Umbarger, C. John

    1985-01-01

    A hand-holdable, battery-operated, microprocessor-based spectrometer gun includes a low-power matrix display and sufficient memory to permit both real-time observation and extended analysis of detected radiation pulses. Universality of the incorporated signal processing circuitry permits operation with various detectors having differing pulse detection and sensitivity parameters.

  15. Localisation and semi-quantitative measurement of lipocortin 1 in rat anterior pituitary cells by fluorescence-activated cell analysis/sorting and electron microscopy.

    PubMed

    Christian, H C; Flower, R J; Morris, J F; Buckingham, J C

    1999-09-01

    Lipocortin 1 (LC1, also called annexin 1), a Ca(2+)- and phospholipid-binding protein, is an important mediator of glucocorticoid action in the anterior pituitary gland. Previous studies based on immunoprecipitation and Western blot analysis suggest that LC1 is found intracellularly both in the cytoplasm and in association with membranes and also on the cell surface where it attaches to the membrane by a Ca(2+)-dependent mechanism. However, as yet it is unclear which anterior pituitary cell types express the protein. Accordingly, we have developed a method based on a combination of fluorescence activated cell (FAC) analysis/sorting and electron microscopy to detect and quantify intracellular LC1 in rat anterior pituitary cells and to identify the cell types in which it is expressed. In addition, we have measured cell surface LC1 and examined the influence of glucocorticoids on the cellular disposition of the protein. Anterior pituitary cells were dispersed with collagenase. For experiments measuring intracellular LC1, three cell fixation/permeabilisation methods were examined initially, i.e. (1) Zamboni's fluid (30 min) and Triton-X-100 (0.12%, 1 or 12 h); (2) paraformaldehyde (2%, 1 h) and Triton-X-100 (0.2%, 10 min); and (3) paraformaldehyde (0.2%, 15 min) and saponin (0.1%, 5 min). The protocol using paraformaldehyde/Triton-X-100 provided optimal preservation of cell ultrastructure and of LC1 immunoreactivity (ir-LC1) while also effectively permeabilising the cells; it was therefore used in subsequent studies. Using an anti-LC1 monoclonal antibody as a probe, 82+/-5% of the secretory cells in the heterogeneous anterior pituitary cell preparation were shown by FAC analysis to display specific fluorescence for intracellular ir-LC1. Morphological analysis and immunogold-histochemistry of cells separated by FAC sorting identified corticotrophs, lactotrophs, somatotrophs and gonadotrophs in the population displaying LC1 immunofluorescence. 
LC1 was also detected on the surface of anterior pituitary cells by FACS analysis. Incubation of anterior pituitary cells with dexamethasone or corticosterone (0.1 and 1.0 microM) prior to fixation and analysis produced a significant, concentration-dependent decrease in intracellular ir-LC1 and a concomitant increase in the amount of ir-LC1 detected on the surface of the cells; the effects of the two steroids were indistinguishable quantitatively. In conclusion, we report a novel method which permits (1) the detection and semi-quantitative measurement of intracellular and surface LC1 in anterior pituitary cells; and (2) the identification of the cell types in which the protein is found.

  16. A Charge Coupled Device Imaging System For Ophthalmology

    NASA Astrophysics Data System (ADS)

    Rowe, R. Wanda; Packer, Samuel; Rosen, James; Bizais, Yves

    1984-06-01

    A digital camera system has been constructed for obtaining reflectance images of the fundus of the eye with monochromatic light. Images at wavelengths in the visible and near infrared regions of the spectrum are recorded by a charge-coupled device array and transferred to a computer. A variety of image processing operations are performed to restore the pictures, correct for distortions in the image formation process, and extract new and diagnostically useful information. The steps involved in calibrating the system to permit quantitative measurement of fundus reflectance are discussed. Three clinically important applications of such a quantitative system are addressed: the characterization of changes in the optic nerve arising from glaucoma, the diagnosis of choroidal melanoma through spectral signatures, and the early detection and improved management of diabetic retinopathy by measurement of retinal tissue oxygen saturation.

  17. B-ALL minimal residual disease flow cytometry: an application of a novel method for optimization of a single-tube model.

    PubMed

    Shaver, Aaron C; Greig, Bruce W; Mosse, Claudio A; Seegmiller, Adam C

    2015-05-01

    Optimizing a clinical flow cytometry panel can be a subjective process dependent on experience. We develop a quantitative method to make this process more rigorous and apply it to B lymphoblastic leukemia/lymphoma (B-ALL) minimal residual disease (MRD) testing. We retrospectively analyzed our existing three-tube, seven-color B-ALL MRD panel and used our novel method to develop an optimized one-tube, eight-color panel, which was tested prospectively. The optimized one-tube, eight-color panel resulted in greater efficiency of time and resources with no loss in diagnostic power. Constructing a flow cytometry panel using a rigorous, objective, quantitative method permits optimization and avoids problems of interdependence and redundancy in a large, multiantigen panel. Copyright© by the American Society for Clinical Pathology.

  18. Quantitative NDE applied to composites and metals

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S.; Winfree, William P.; Parker, F. Raymond; Heath, D. Michele; Welch, Christopher S.

    1989-01-01

    Research at the NASA/Langley Research Center concerning quantitative NDE of composites and metals is reviewed. The relationship between ultrasonics and polymer cure is outlined. NDE models are presented, which can be used to develop measurement technologies for characterizing the curing of a polymer system for composite materials. The models can be used to determine the glass transition temperature, the degree of cure, and the cure rate. The application of the model to control autoclave processing of composite materials is noted. Consideration is given to the use of thermal diffusion models combined with controlled thermal input measurements to determine the thermal diffusivity of materials. Also, a two-dimensional physical model is described that permits delaminations in samples of Space Shuttle Solid Rocket Motors to be detected in thermograms in the presence of cooling effects and uneven heating.

  19. 15 CFR 971.204 - Environmental and use conflict analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    .... If the permit area lies within the area of NOAA's Deep Ocean Mining Environmental Study (DOMES), the... REGULATIONS OF THE ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR COMMERCIAL RECOVERY PERMITS... Administrator to prepare an environmental impact statement (EIS) on the proposed mining activities, and to...

  20. 15 CFR 971.204 - Environmental and use conflict analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    .... If the permit area lies within the area of NOAA's Deep Ocean Mining Environmental Study (DOMES), the... REGULATIONS OF THE ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR COMMERCIAL RECOVERY PERMITS... Administrator to prepare an environmental impact statement (EIS) on the proposed mining activities, and to...

  1. 15 CFR 971.204 - Environmental and use conflict analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    .... If the permit area lies within the area of NOAA's Deep Ocean Mining Environmental Study (DOMES), the... REGULATIONS OF THE ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR COMMERCIAL RECOVERY PERMITS... Administrator to prepare an environmental impact statement (EIS) on the proposed mining activities, and to...

  2. 15 CFR 971.204 - Environmental and use conflict analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    .... If the permit area lies within the area of NOAA's Deep Ocean Mining Environmental Study (DOMES), the... REGULATIONS OF THE ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR COMMERCIAL RECOVERY PERMITS... Administrator to prepare an environmental impact statement (EIS) on the proposed mining activities, and to...

  3. Spectrometer gun

    DOEpatents

    Waechter, D.A.; Wolf, M.A.; Umbarger, C.J.

    1981-11-03

    A hand-holdable, battery-operated, microprocessor-based spectrometer gun is described that includes a low-power matrix display and sufficient memory to permit both real-time observation and extended analysis of detected radiation pulses. Universality of the incorporated signal processing circuitry permits operation with various detectors having differing pulse detection and sensitivity parameters.

  4. Image-guided convection-enhanced delivery of muscimol to the primate brain

    PubMed Central

    Heiss, John D.; Walbridge, Stuart; Asthagiri, Ashok R.; Lonser, Russell R.

    2009-01-01

    Object Muscimol is a potent γ-aminobutyric acid-A (GABAA) receptor agonist that temporarily and selectively suppresses neurons. Targeted muscimol suppression of neuronal structures could provide insight into the pathophysiology and treatment of a variety of neurologic disorders. To determine if muscimol delivered to the brain by convection-enhanced delivery (CED) could be monitored using a co-infused surrogate magnetic resonance (MR)-imaging tracer, we perfused the striata of primates with tritiated muscimol and gadolinium-DTPA. Methods Three primates underwent convective co-infusion of 3H-muscimol (0.8 μM) and gadolinium-DTPA (~5 mM) into the bilateral striata. Primates underwent serial MR-imaging during infusion and animals were sacrificed immediately after infusion. Post-mortem quantitative autoradiography and histological analyses were performed. Results MR-imaging revealed that infusate (tritiated muscimol and gadolinium-DTPA) distribution was clearly discernible from the non-infused parenchyma. Real-time MR-imaging of the infusion revealed the precise region of anatomic perfusion in each animal. Imaging analysis during infusion revealed that the distribution volume of infusate increased linearly (R=0.92) with volume of infusion. Overall, the mean (±S.D.) ratio of volume of distribution to volume of infusion was 8.2±1.3. Autoradiographic analysis revealed that MR-imaging of gadolinium-DTPA closely correlated with the distribution of 3H-muscimol and precisely estimated its volume of distribution (mean difference in volume of distribution, 7.4%). Quantitative autoradiograms revealed that muscimol was homogeneously distributed over the perfused region in a square-shaped concentration profile. Conclusions Muscimol can be effectively delivered to clinically relevant volumes of the primate brain. Moreover, the distribution of muscimol can be tracked by co-infusion of gadolinium-DTPA using MR-imaging. 
The ability to accurately monitor and control the anatomic extent of muscimol distribution during its convection-enhanced delivery will enhance safety, permit correlations of muscimol distribution with clinical effect, and should lead to an improved understanding of the pathophysiologic processes underlying a variety of neurologic disorders. PMID:19715424

  5. Computable visually observed phenotype ontological framework for plants

    PubMed Central

    2011-01-01

    Background The ability to search for and precisely compare similar phenotypic appearances within and across species has vast potential in plant science and genetic research. The difficulty in doing so lies in the fact that many visual phenotypic data, especially visually observed phenotypes that often times cannot be directly measured quantitatively, are in the form of text annotations, and these descriptions are plagued by semantic ambiguity, heterogeneity, and low granularity. Though several bio-ontologies have been developed to standardize phenotypic (and genotypic) information and permit comparisons across species, these semantic issues persist and prevent precise analysis and retrieval of information. A framework suitable for the modeling and analysis of precise computable representations of such phenotypic appearances is needed. Results We have developed a new framework called the Computable Visually Observed Phenotype Ontological Framework for plants. This work provides a novel quantitative view of descriptions of plant phenotypes that leverages existing bio-ontologies and utilizes a computational approach to capture and represent domain knowledge in a machine-interpretable form. This is accomplished by means of a robust and accurate semantic mapping module that automatically maps high-level semantics to low-level measurements computed from phenotype imagery. The framework was applied to two different plant species with semantic rules mined and an ontology constructed. Rule quality was evaluated and showed high quality rules for most semantics. This framework also facilitates automatic annotation of phenotype images and can be adopted by different plant communities to aid in their research. Conclusions The Computable Visually Observed Phenotype Ontological Framework for plants has been developed for more efficient and accurate management of visually observed phenotypes, which play a significant role in plant genomics research. 
The uniqueness of this framework is its ability to bridge the knowledge of informaticians and plant science researchers by translating descriptions of visually observed phenotypes into standardized, machine-understandable representations, thus enabling the development of advanced information retrieval and phenotype annotation analysis tools for the plant science community. PMID:21702966

  6. Personalized Medicine and Opioid Analgesic Prescribing for Chronic Pain: Opportunities and Challenges

    PubMed Central

    Bruehl, Stephen; Apkarian, A. Vania; Ballantyne, Jane C.; Berger, Ann; Borsook, David; Chen, Wen G.; Farrar, John T.; Haythornthwaite, Jennifer A.; Horn, Susan D.; Iadarola, Michael J.; Inturrisi, Charles E.; Lao, Lixing; Mackey, Sean; Mao, Jianren; Sawczuk, Andrea; Uhl, George R.; Witter, James; Woolf, Clifford J.; Zubieta, Jon-Kar; Lin, Yu

    2013-01-01

    Use of opioid analgesics for pain management has increased dramatically over the past decade, with corresponding increases in negative sequelae including overdose and death. There is currently no well-validated objective means of accurately identifying patients likely to experience good analgesia with low side effects and abuse risk prior to initiating opioid therapy. This paper discusses the concept of data-based personalized prescribing of opioid analgesics as a means to achieve this goal. Strengths, weaknesses, and potential synergism of traditional randomized placebo-controlled trial (RCT) and practice-based evidence (PBE) methodologies as means to acquire the clinical data necessary to develop validated personalized analgesic prescribing algorithms are reviewed. Several predictive factors that might be incorporated into such algorithms are briefly discussed, including genetic factors, differences in brain structure and function, differences in neurotransmitter pathways, and patient phenotypic variables such as negative affect, sex, and pain sensitivity. Currently available research is insufficient to inform development of quantitative analgesic prescribing algorithms. However, responder subtype analyses made practical by the large numbers of chronic pain patients in proposed collaborative PBE pain registries, in conjunction with follow-up validation RCTs, may eventually permit development of clinically useful analgesic prescribing algorithms. Perspective Current research is insufficient to base opioid analgesic prescribing on patient characteristics. Collaborative PBE studies in large, diverse pain patient samples in conjunction with follow-up RCTs may permit development of quantitative analgesic prescribing algorithms that could optimize opioid analgesic effectiveness and mitigate the risks of opioid-related abuse and mortality. PMID:23374939

  7. Trimethylation enhancement using diazomethane (TrEnDi): rapid on-column quaternization of peptide amino groups via reaction with diazomethane significantly enhances sensitivity in mass spectrometry analyses via a fixed, permanent positive charge.

    PubMed

    Wasslen, Karl V; Tan, Le Hoa; Manthorpe, Jeffrey M; Smith, Jeffrey C

    2014-04-01

    Defining cellular processes relies heavily on elucidating the temporal dynamics of proteins. To this end, mass spectrometry (MS) is an extremely valuable tool; different MS-based quantitative proteomics strategies have emerged to map protein dynamics over the course of stimuli. Herein, we disclose our novel MS-based quantitative proteomics strategy with unique analytical characteristics. By passing ethereal diazomethane over peptides on strong cation exchange resin within a microfluidic device, peptides react to contain fixed, permanent positive charges. Modified peptides display improved ionization characteristics and dissociate via tandem mass spectrometry (MS(2)) to form strong a2 fragment ion peaks. Process optimization and determination of reactive functional groups enabled a priori prediction of MS(2) fragmentation patterns for modified peptides. The strategy was tested on digested bovine serum albumin (BSA) and successfully quantified a peptide that was not observable prior to modification. Our method ionizes peptides regardless of proton affinity, thus decreasing ion suppression and permitting predictable multiple reaction monitoring (MRM)-based quantitation with improved sensitivity.

  8. Power and sample-size estimation for microbiome studies using pairwise distances and PERMANOVA

    PubMed Central

    Kelly, Brendan J.; Gross, Robert; Bittinger, Kyle; Sherrill-Mix, Scott; Lewis, James D.; Collman, Ronald G.; Bushman, Frederic D.; Li, Hongzhe

    2015-01-01

    Motivation: The variation in community composition between microbiome samples, termed beta diversity, can be measured by pairwise distance based on either presence–absence or quantitative species abundance data. PERMANOVA, a permutation-based extension of multivariate analysis of variance to a matrix of pairwise distances, partitions within-group and between-group distances to permit assessment of the effect of an exposure or intervention (grouping factor) upon the sampled microbiome. Within-group distance and exposure/intervention effect size must be accurately modeled to estimate statistical power for a microbiome study that will be analyzed with pairwise distances and PERMANOVA. Results: We present a framework for PERMANOVA power estimation tailored to marker-gene microbiome studies that will be analyzed by pairwise distances, which includes: (i) a novel method for distance matrix simulation that permits modeling of within-group pairwise distances according to pre-specified population parameters; (ii) a method to incorporate effects of different sizes within the simulated distance matrix; (iii) a simulation-based method for estimating PERMANOVA power from simulated distance matrices; and (iv) an R statistical software package that implements the above. Matrices of pairwise distances can be efficiently simulated to satisfy the triangle inequality and incorporate group-level effects, which are quantified by the adjusted coefficient of determination, omega-squared (ω2). From simulated distance matrices, available PERMANOVA power or necessary sample size can be estimated for a planned microbiome study. Availability and implementation: http://github.com/brendankelly/micropower. Contact: brendank@mail.med.upenn.edu or hongzhe@upenn.edu PMID:25819674
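The power-estimation recipe this abstract describes (simulate data under pre-specified parameters, run PERMANOVA, count rejections) can be sketched in Python. This is an illustrative sketch, not the micropower package's API: it simulates Euclidean embeddings and derives pairwise distance matrices from them, rather than simulating distance matrices directly as the paper does, and all function names are our own.

```python
import numpy as np

def permanova_f(dist, labels):
    """Pseudo-F statistic from a matrix of pairwise distances (Anderson 2001):
    partition total squared distances into within- and between-group parts."""
    n = len(labels)
    d2 = dist ** 2
    ss_total = d2[np.triu_indices(n, k=1)].sum() / n
    ss_within = 0.0
    groups = np.unique(labels)
    for g in groups:
        idx = np.flatnonzero(labels == g)
        sub = d2[np.ix_(idx, idx)]
        ss_within += sub[np.triu_indices(len(idx), k=1)].sum() / len(idx)
    a = len(groups)
    return ((ss_total - ss_within) / (a - 1)) / (ss_within / (n - a))

def permanova_p(dist, labels, n_perm=199, rng=None):
    """Permutation p-value: shuffle group labels and recompute pseudo-F."""
    rng = np.random.default_rng(rng)
    f_obs = permanova_f(dist, labels)
    exceed = sum(permanova_f(dist, rng.permutation(labels)) >= f_obs
                 for _ in range(n_perm))
    return (exceed + 1) / (n_perm + 1)

def estimated_power(n_per_group=20, effect=1.0, n_sim=30, alpha=0.05, seed=0):
    """Fraction of simulated two-group studies whose PERMANOVA p-value
    reaches alpha (simulation-based power)."""
    rng = np.random.default_rng(seed)
    labels = np.repeat([0, 1], n_per_group)
    hits = 0
    for _ in range(n_sim):
        x = rng.normal(size=(2 * n_per_group, 5))  # points in a 5-D "community space"
        x[n_per_group:, 0] += effect               # group-level shift = the effect
        dist = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
        hits += permanova_p(dist, labels, rng=rng) <= alpha
    return hits / n_sim
```

Raising the group shift drives the estimated power toward 1, while a zero shift leaves the rejection rate near the nominal alpha, mirroring how necessary sample size is read off from such simulations.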

  9. Closing the door on pharma? A national survey of family medicine residencies regarding industry interactions.

    PubMed

    Fugh-Berman, Adriane; Brown, Steven R; Trippett, Rachel; Bell, Alicia M; Clark, Paige; Fleg, Anthony; Siwek, Jay

    2011-05-01

    To assess the extent and type of interactions U.S. family medicine residencies permit industry to have with medical students and residents. In 2008, the authors e-mailed a four-question survey to residency directors or coordinators at all 460 accredited U.S. family medicine residencies concerning the types of industry support and interaction permitted. The authors conducted quantitative and qualitative analyses of survey responses and written comments. Residencies that did not permit any industry food, gifts, samples, or support of residency activities were designated "pharma-free." The survey response rate was 62.2% (286/460). Among responding family medicine residencies, 52.1% refused drug samples, 48.6% disallowed industry gifts or food, 68.5% forbade industry-sponsored residency activities, and 44.1% denied industry access to students and residents at the family medicine center. Seventy-five residencies (26.2%) were designated as "pharma-free." Medical-school-based and medical-school-administered residencies were no more likely than community-based residencies to be pharma-free. Among the 211 programs that permitted interaction, 68.7% allowed gifts or food, 61.1% accepted drug samples, 71.1% allowed industry representatives access to trainees in the family medicine center, and 37.9% allowed industry-sponsored residency activities. Respondents commented on challenges inherent to limiting industry interactions. Many programs noted recent changes in plans or practices. Most family medicine residencies limit industry interaction with trainees. Because industry interactions can have adverse effects on rational prescribing, residency programs should assess the benefits and harms of these relationships. Copyright © by the Association of American Medical Colleges.

  10. 7 CFR 301.80-4 - Issuance and cancellation of certificates and permits.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... this subpart. (d) Scientific permits to allow the interstate movement of regulated articles, and... may be issued for any regulated articles (except soil samples for processing, testing, or analysis) by... destination under all Federal domestic plant quarantines applicable to such articles and: (1) Have originated...

  11. 7 CFR 301.80-4 - Issuance and cancellation of certificates and permits.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... this subpart. (d) Scientific permits to allow the interstate movement of regulated articles, and... may be issued for any regulated articles (except soil samples for processing, testing, or analysis) by... destination under all Federal domestic plant quarantines applicable to such articles and: (1) Have originated...

  12. 75 FR 54117 - Building Energy Standards Program: Preliminary Determination Regarding Energy Efficiency...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-03

    ... Response to Comments on Previous Analysis C. Summary of the Comparative Analysis 1. Quantitative Analysis 2... preliminary quantitative analysis are specific building designs, in most cases with specific spaces defined... preliminary determination. C. Summary of the Comparative Analysis DOE carried out both a broad quantitative...

  13. Air Permitting Implications of a Biorefinery Producing Raw Bio-Oil in Comparison with Producing Gasoline and Diesel Blendstocks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhatt, Arpit H; Zhang, Yi Min

    A biorefinery, considered a chemical process plant under the Clean Air Act permitting program, could be classified as a major or minor source based on the size of the facility and the magnitude of regulated pollutants emitted. Our previous analysis indicates that a biorefinery using a fast pyrolysis conversion process to produce finished gasoline and diesel blendstocks, with a capacity of processing 2,000 dry metric tons of biomass per day, would likely be classified as a major source because several regulated pollutants (such as particulate matter, sulfur dioxide, and nitrogen oxides) are estimated to exceed the 100 tons per year (tpy) major source threshold applicable to chemical process plants. Being subject to a major source classification could pose additional challenges in obtaining an air permit in a timely manner before the biorefinery can start construction. Recent developments propose an alternative approach: utilize the bio-oil produced via fast pyrolysis by shipping it to an existing petroleum refinery, where the raw bio-oil can be blended with petroleum-based feedstocks (e.g., vacuum gas oil) to produce gasoline and diesel blendstocks with renewable content. Without having to hydrotreat raw bio-oil, a biorefinery is likely to reduce its potential-to-emit (PTE) to below the 100 tpy major source threshold, and therefore expedite its permitting process. We compare the PTE estimates for the two biorefinery designs, with and without hydrotreating of bio-oils, examine the implications for air permit classification, and discuss the best available control technology requirements for the major-source biorefinery that retains hydrotreating. Our analysis is expected to provide useful information to new biofuel project developers in identifying opportunities to overcome challenges associated with air permitting.

  14. APPLICATION OF RADIOISOTOPES TO THE QUANTITATIVE CHROMATOGRAPHY OF FATTY ACIDS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Budzynski, A.Z.; Zubrzycki, Z.J.; Campbell, I.G.

    1959-10-31

    The paper reports work done on the use of 131I, 65Zn, 90Sr, 95Zr, and 144Ce for the quantitative estimation of fatty acids on paper chromatograms, and for determination of the degree of unsaturation of components of resolved fatty acid mixtures. 131I is used to iodinate unsaturated fatty acids, and the amount of such acids is determined from the radiochromatogram. The degree of unsaturation of fatty acids is determined by estimation of the specific activity of spots. The other isotopes have been examined for their suitability for estimation of total amounts of fatty acids by formation of insoluble radioactive soaps held on the chromatogram. In particular, work is reported on the quantitative estimation of saturated fatty acids by measurement of the activity of their insoluble soaps with radioactive metals. Various quantitative relationships are described between the amount of fatty acid in a spot and such parameters as radiometrically estimated spot length, width, maximum intensity, and integrated spot activity. A convenient detection apparatus for taking radiochromatograms is also described. In conjunction with conventional chromatographic methods for resolving fatty acids, the method permits estimation of the composition of fatty acid mixtures obtained from biological material. (auth)

  15. Modern projection of the old electroscope for nuclear radiation quantitative work and demonstrations

    NASA Astrophysics Data System (ADS)

    Oliveira Bastos, Rodrigo; Baltokoski Boch, Layara

    2017-11-01

    Although quantitative measurements in radioactivity teaching and research are often believed to be possible only with high technology, early work in this area was fully accomplished with very simple apparatus such as zinc sulphide screens and electroscopes. This article presents an experimental practice using the electroscope, a very simple apparatus that has been widely used for educational purposes, although generally for qualitative work. The main objective is to show that radioactivity can be measured not only in qualitative demonstrations, but also in quantitative experimental practices. The experimental set-up is a low-cost ion chamber connected to an electroscope in a configuration very similar to that used by Marie and Pierre Curie, Rutherford, Geiger, Pacini, Hess and other great researchers from the era of the great discoveries in nuclear and high-energy particle physics. An electroscope leaf is filmed and projected, permitting the collection of quantitative data for measuring the half-life of 220Rn emanating from lantern mantles. The article presents the experimental procedures and the expected results, indicating that the experiment may provide support for nuclear physics classes. These practices could spread widely to university or school teaching laboratories, and the apparatus has the potential to support the development of new teaching activities for nuclear physics.
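The quantitative core of the experiment, recovering the 220Rn half-life from the filmed leaf's discharge, amounts to fitting an exponential decay to rate-versus-time data. A minimal sketch (the function name and the log-linear fit are our illustration, not the article's procedure):

```python
import numpy as np

def half_life_from_rates(times_s, rates):
    """Estimate a half-life from measurements of decay rate vs. time by a
    log-linear least-squares fit: rate(t) = A * exp(-lam * t), so
    ln(rate) = ln(A) - lam * t and half-life = ln(2) / lam."""
    slope, _intercept = np.polyfit(np.asarray(times_s, float), np.log(rates), 1)
    return np.log(2) / -slope
```

On noiseless synthetic data generated with the published 220Rn half-life of about 55.6 s, the fit recovers that value; applied to rates read off the projected leaf, the same fit yields the experimental estimate.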

  16. The other side of the broken window: a methodology that translates building permits into an ecometric of investment by community members.

    PubMed

    O'Brien, Daniel Tumminelli; Montgomery, Barrett W

    2015-03-01

    Much research has focused on physical disorder in urban neighborhoods as evidence that the community does not maintain local norms and spaces. Little attention has been paid to the opposite: indicators of proactive investment in the neighborhood's upkeep. This manuscript presents a methodology that translates a database of approved building permits into an ecometric of investment by community members, establishing basic content, criteria for reliability, and construct validity. A database from Boston, MA contained 150,493 permits spanning 2.5 years, each record including the property to be modified, permit type, and date issued. Investment was operationalized as the proportion of properties in a census block group that underwent an addition or renovation, excluding larger developments involving the demolition or construction of a building. The reliability analysis found that robust measures could be generated every 6 months, and that longitudinal analysis could differentiate between trajectories across neighborhoods. The validity analysis supported two hypotheses: investment was best predicted by homeownership and median income; and maintained an independent relationship with measures of physical disorder despite controlling for demographics, implying that it captures the other end of a spectrum of neighborhood maintenance. Possible uses for the measure in research and policy are discussed.
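The operationalization described above, the proportion of properties per census block group with an addition or renovation permit in each 6-month window, can be sketched with pandas. The column names, permit-type vocabulary, and window labels below are illustrative assumptions, not the authors' schema:

```python
import pandas as pd

def investment_ecometric(permits, properties_per_block_group):
    """Proportion of properties in each census block group with at least one
    addition/renovation permit per 6-month window (an 'investment' ecometric).
    `properties_per_block_group` is a dict of block group -> property count."""
    df = permits[permits["permit_type"].isin(["addition", "renovation"])].copy()
    half = (df["date_issued"].dt.month > 6).astype(int) + 1        # H1 or H2
    df["window"] = df["date_issued"].dt.year.astype(str) + "H" + half.astype(str)
    # Count each property at most once per block group per window.
    counts = (df.drop_duplicates(["block_group", "property_id", "window"])
                .groupby(["block_group", "window"])["property_id"].count())
    denom = counts.index.get_level_values("block_group").map(properties_per_block_group)
    return counts / denom
```

Excluding larger permit types (demolition, new construction) before counting mirrors the paper's restriction of the measure to upkeep by community members rather than development activity.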

  17. Requiring Pollutant Discharge Permits for Pesticide Applications that Deposit Residues in Surface Waters

    PubMed Central

    Centner, Terence; Eberhart, Nicholas

    2014-01-01

    Agricultural producers and public health authorities apply pesticides to control pests that damage crops and carry diseases. Due to the toxic nature of most pesticides, they are regulated by governments. Regulatory provisions require pesticides to be registered and restrictions operate to safeguard human health and the environment. Yet pesticides used near surface waters pose dangers to non-target species and drinking water supplies leading some governments to regulate discharges of pesticides under pollution discharge permits. The dual registration and discharge permitting provisions are burdensome. In the United States, agricultural interest groups are advancing new legislation that would exempt pesticide residues from water permitting requirements. An analysis of the dangers posed by pesticide residues in drinking water leads to a conclusion that both pesticide registration and pollutant discharge permitting provisions are needed to protect human health and aquatic species. PMID:24814945

  18. Mini-Column Ion-Exchange Separation and Atomic Absorption Quantitation of Nickel, Cobalt, and Iron: An Undergraduate Quantitative Analysis Experiment.

    ERIC Educational Resources Information Center

    Anderson, James L.; And Others

    1980-01-01

    Presents an undergraduate quantitative analysis experiment, describing an atomic absorption quantitation scheme that is fast, sensitive and comparatively simple relative to other titration experiments. (CS)

  19. Improving statistical inference on pathogen densities estimated by quantitative molecular methods: malaria gametocytaemia as a case study.

    PubMed

    Walker, Martin; Basáñez, María-Gloria; Ouédraogo, André Lin; Hermsen, Cornelus; Bousema, Teun; Churcher, Thomas S

    2015-01-16

    Quantitative molecular methods (QMMs) such as quantitative real-time polymerase chain reaction (q-PCR), reverse-transcriptase PCR (qRT-PCR) and quantitative nucleic acid sequence-based amplification (QT-NASBA) are increasingly used to estimate pathogen density in a variety of clinical and epidemiological contexts. These methods are often classified as semi-quantitative, yet estimates of reliability or sensitivity are seldom reported. Here, a statistical framework is developed for assessing the reliability (uncertainty) of pathogen densities estimated using QMMs and the associated diagnostic sensitivity. The method is illustrated with quantification of Plasmodium falciparum gametocytaemia by QT-NASBA. The reliability of pathogen (e.g. gametocyte) densities, and the accompanying diagnostic sensitivity, estimated by two contrasting statistical calibration techniques, are compared; a traditional method and a mixed model Bayesian approach. The latter accounts for statistical dependence of QMM assays run under identical laboratory protocols and permits structural modelling of experimental measurements, allowing precision to vary with pathogen density. Traditional calibration cannot account for inter-assay variability arising from imperfect QMMs and generates estimates of pathogen density that have poor reliability, are variable among assays and inaccurately reflect diagnostic sensitivity. The Bayesian mixed model approach assimilates information from replica QMM assays, improving reliability and inter-assay homogeneity, providing an accurate appraisal of quantitative and diagnostic performance. Bayesian mixed model statistical calibration supersedes traditional techniques in the context of QMM-derived estimates of pathogen density, offering the potential to improve substantially the depth and quality of clinical and epidemiological inference for a wide variety of pathogens.
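The "traditional" calibration the paper critiques amounts to fitting a single linear standard curve and inverting it for unknown samples. A minimal sketch (an illustrative signal model and our own function names, not the paper's QT-NASBA specifics):

```python
import numpy as np

def fit_standard_curve(log10_density, signal):
    """Least-squares line through the standards: signal = a + b * log10(density)."""
    b, a = np.polyfit(log10_density, signal, 1)   # slope first, then intercept
    return a, b

def invert_curve(a, b, signal):
    """Inverse prediction: estimated log10 density for unknown samples."""
    return (np.asarray(signal, dtype=float) - a) / b
```

This single-curve inversion treats each assay in isolation, which is exactly the weakness the abstract identifies: it cannot pool information across replicate assays or let precision vary with pathogen density, as the Bayesian mixed model does.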

  20. Species Determination and Quantitation in Mixtures Using MRM Mass Spectrometry of Peptides Applied to Meat Authentication

    PubMed Central

    Gunning, Yvonne; Watson, Andrew D.; Rigby, Neil M.; Philo, Mark; Peazer, Joshua K.; Kemsley, E. Kate

    2016-01-01

    We describe a simple protocol for identifying and quantifying the two components in binary mixtures of species possessing one or more similar proteins. Central to the method is the identification of 'corresponding proteins' in the species of interest, in other words proteins that are nominally the same but possess species-specific sequence differences. When subject to proteolysis, corresponding proteins will give rise to some peptides which are likewise similar but with species-specific variants. These are 'corresponding peptides'. Species-specific peptides can be used as markers for species determination, while pairs of corresponding peptides permit relative quantitation of two species in a mixture. The peptides are detected using multiple reaction monitoring (MRM) mass spectrometry, a highly specific technique that enables peptide-based species determination even in complex systems. In addition, the ratio of MRM peak areas deriving from corresponding peptides supports relative quantitation. Since corresponding proteins and peptides will, in the main, behave similarly in both processing and in experimental extraction and sample preparation, the relative quantitation should remain comparatively robust. In addition, this approach does not need the standards and calibrations required by absolute quantitation methods. The protocol is described in the context of red meats, which have convenient corresponding proteins in the form of their respective myoglobins. This application is relevant to food fraud detection: the method can detect 1% weight for weight of horse meat in beef. The corresponding protein, corresponding peptide (CPCP) relative quantitation using MRM peak area ratios gives good estimates of the weight for weight composition of a horse plus beef mixture. PMID:27685654
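The relative-quantitation step described above reduces to a peak-area ratio. A one-line sketch, assuming (as the corresponding protein, corresponding peptide argument does) approximately equal response factors for the paired peptides:

```python
def species_fraction(area_species_a, area_species_b):
    """Estimated w/w fraction of species A in a binary mixture, from the MRM
    peak areas of a corresponding-peptide pair. Assumes the two corresponding
    peptides have approximately equal response factors."""
    return area_species_a / (area_species_a + area_species_b)
```

For example, a horse-myoglobin peptide peak area of 1 against a beef peak area of 99 implies an estimated 1% w/w horse content, the detection level the abstract reports.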

  1. Species Determination and Quantitation in Mixtures Using MRM Mass Spectrometry of Peptides Applied to Meat Authentication.

    PubMed

    Gunning, Yvonne; Watson, Andrew D; Rigby, Neil M; Philo, Mark; Peazer, Joshua K; Kemsley, E Kate

    2016-09-20

    We describe a simple protocol for identifying and quantifying the two components in binary mixtures of species possessing one or more similar proteins. Central to the method is the identification of 'corresponding proteins' in the species of interest, in other words proteins that are nominally the same but possess species-specific sequence differences. When subject to proteolysis, corresponding proteins will give rise to some peptides which are likewise similar but with species-specific variants. These are 'corresponding peptides'. Species-specific peptides can be used as markers for species determination, while pairs of corresponding peptides permit relative quantitation of two species in a mixture. The peptides are detected using multiple reaction monitoring (MRM) mass spectrometry, a highly specific technique that enables peptide-based species determination even in complex systems. In addition, the ratio of MRM peak areas deriving from corresponding peptides supports relative quantitation. Since corresponding proteins and peptides will, in the main, behave similarly in both processing and in experimental extraction and sample preparation, the relative quantitation should remain comparatively robust. In addition, this approach does not need the standards and calibrations required by absolute quantitation methods. The protocol is described in the context of red meats, which have convenient corresponding proteins in the form of their respective myoglobins. This application is relevant to food fraud detection: the method can detect 1% weight for weight of horse meat in beef. The corresponding protein, corresponding peptide (CPCP) relative quantitation using MRM peak area ratios gives good estimates of the weight for weight composition of a horse plus beef mixture.

  2. What Can Be Learned from Nuclear Resonance Vibrational Spectroscopy: Vibrational Dynamics and Hemes

    PubMed Central

    2017-01-01

    Nuclear resonance vibrational spectroscopy (NRVS; also known as nuclear inelastic scattering, NIS) is a synchrotron-based method that reveals the full spectrum of vibrational dynamics for Mössbauer nuclei. Another major advantage, in addition to its completeness (no arbitrary optical selection rules), is the unique selectivity of NRVS. The basics of this recently developed technique are first introduced with descriptions of the experimental requirements and data analysis including the details of mode assignments. We discuss the use of NRVS to probe 57Fe at the center of heme and heme protein derivatives yielding the vibrational density of states for the iron. The application to derivatives with diatomic ligands (O2, NO, CO, CN–) shows the strong capabilities of identifying mode character. The availability of the complete vibrational spectrum of iron allows the identification of modes not available by other techniques. This permits the correlation of frequency with other physical properties. A significant example is the correlation we find between the Fe–Im stretch in six-coordinate Fe(XO) hemes and the trans Fe–N(Im) bond distance, not possible previously. NRVS also provides uniquely quantitative insight into the dynamics of the iron. For example, it provides a model-independent means of characterizing the strength of iron coordination. Prediction of the temperature-dependent mean-squared displacement from NRVS measurements yields a vibrational “baseline” for Fe dynamics that can be compared with results from techniques that probe longer time scales to yield quantitative insights into additional dynamical processes. PMID:28921972

  3. Short-term techniques for monitoring coral reefs: Review, results, and recommendations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mann, G.S.; Hunte, W.

    1994-12-31

    The health of coral reefs is in question on a global scale. The degradation of reefs has been attributed to both natural (e.g., El Niño, crown-of-thorns starfish, and hurricanes) and anthropogenic (e.g., sedimentation, nutrient overloading, oil spills, and thermal pollution) factors. Demonstrating the deleterious effects of lethal factors has not been difficult. However, it has been more difficult to quantitatively link those factors which do not cause rapid coral mortality to reef degradation. Classic techniques, such as cross-transplantation and x-ray analysis of growth bands, have proven to be successful bioassessments of chronic exposure to stressful conditions. The resolution of these techniques generally limits their usefulness, as only long-term exposure (months to years) can provide quantitative differences between impacted and control conditions. Short-term monitoring techniques using corals have received relatively little attention from researchers. Two short-term methods have been successfully used to discriminate polluted from less-polluted sites in Barbados. The first is based on adult growth in several coral species. The second focuses on growth and survival of newly-settled juvenile corals. Both methods allowed discrimination in less than two weeks. These methods and others need to be evaluated and standardized in order to permit better, more efficient monitoring of the world's reefs. Recommendations will be made on what life-history characteristics should be considered when choosing a coral species for use in bioassessment studies.

  4. Variation in amino acid and lipid composition of latent fingerprints.

    PubMed

    Croxton, Ruth S; Baron, Mark G; Butler, David; Kent, Terry; Sears, Vaughn G

    2010-06-15

    The enhancement of latent fingerprints, both at the crime scene and in the laboratory using an array of chemical, physical and optical techniques, permits their use for identification. Despite the plethora of techniques available, there are occasions when latent fingerprints are not successfully enhanced. An understanding of latent fingerprint chemistry and behaviour will aid the improvement of current techniques and the development of novel ones. In this study the amino acid and fatty acid content of 'real' latent fingerprints collected on a non-porous surface was analysed by gas chromatography-mass spectrometry; squalene was also quantified. Hexadecanoic acid, octadecanoic acid and cis-9-octadecenoic acid were the most abundant fatty acids in all samples. There was, however, wide variation in the relative amounts of each fatty acid in each sample. It was clearly demonstrated that touching sebum-rich areas of the face immediately prior to fingerprint deposition resulted in a significant increase in the amount of fatty acids and squalene deposited in the resulting 'groomed' fingerprints. Serine was the most abundant amino acid identified, followed by glycine, alanine and aspartic acid. The significant quantitative differences between the 'natural' and 'groomed' fingerprint samples seen for fatty acids were not observed in the case of the amino acids. This study demonstrates the variation in latent fingerprint composition between individuals and the impact of the sampling protocol on the quantitative analysis of fingerprints. (c) 2010 Elsevier Ireland Ltd. All rights reserved.

  5. Technical overview of the millimeter-wave imaging reflectometer on the DIII-D tokamak (invited)

    DOE PAGES

    Muscatello, Christopher M.; Domier, Calvin W.; Hu, Xing; ...

    2014-07-22

    The two-dimensional mm-wave imaging reflectometer (MIR) on DIII-D is a multi-faceted device for diagnosing electron density fluctuations in fusion plasmas. Its multi-channel, multi-frequency capabilities and high sensitivity permit visualization and quantitative diagnosis of density perturbations, including correlation length, wavenumber, mode propagation velocity, and dispersion. The two-dimensional capabilities of MIR are made possible with twelve vertically separated sightlines and four-frequency operation (corresponding to four radial channels). The 48-channel DIII-D MIR system has a tunable source that can be stepped in 500 µs increments over a range of 56 to 74 GHz. An innovative optical design keeps both on-axis and off-axis channels focused at the cutoff surface, permitting imaging over an extended poloidal region. The integrity of the MIR optical design is confirmed by comparing Gaussian beam calculations to laboratory measurements of the transmitter beam pattern and receiver antenna patterns.

  6. [Current radionuclear methods in the diagnosis of regional myocardial circulation disorders].

    PubMed

    Felix, R; Winkler, C

    1977-01-29

    Among nuclear medical diagnostic procedures a distinction can be made between non-invasive and invasive methods. The non-invasive methods serve either to image the still viable myocardium ("cold spot" technique) or for direct visualization of recently infarcted myocardial tissue ("hot spot" technique). These methods have the advantage of simple handling and good reproducibility. Side effects and risks are thus far unknown. Improvement of spatial resolution should be aimed at in the future and would greatly increase diagnostic and topographic certainty. The invasive procedures always require catheterization of the coronary arteries. This is the reason why they can be performed only with coronary arteriography. The xenon "wash-out" technique permits, with some restrictions, quantitative measurement of the regional flow rate. The "inflow technique" permits determination of perfusion distribution. The possibilities of the "double-radionuclide" scintigram are discussed. For measurement of activity distribution, stationary detectors are generally preferred. In the case of the time-activity curves with the xenon "wash-out" technique, single detectors offer certain advantages.

  7. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    NASA Technical Reports Server (NTRS)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  8. An analysis of the accessibility of video lottery terminals: the case of Montréal.

    PubMed

    Robitaille, Eric; Herjean, Patrick

    2008-01-18

    Researchers and public health officials in Canada, the United States and Australia have for some time noted broader geographic accessibility to gambling establishments, above all in socioeconomically underprivileged communities. This increase in availability could lead to more and more gambling problems. This article presents, from an ecological perspective, a spatial analysis of the geographic accessibility of sites possessing a VLT permit in the Montréal area (Montréal Island, the South Shore and Laval), combined with an indicator of the vulnerability to gambling (socioeconomic and demographic components) of populations at the level of certain neighbourhood units (dissemination areas). With the recent development of geographic information systems (GIS), it is now possible to ascertain accessibility to services much more accurately, for example by taking into account the configuration of the road network. The findings of our analysis reveal widespread geographic accessibility to sites possessing a VLT permit in the downtown area and in pericentral districts. In some neighbourhood units, a site possessing a VLT permit may be within a three-minute walk. In the region studied overall, average walking time to a VLT site is nine minutes. Access to this type of service on foot is usually limited in the outskirts. However, a number of clusters of sites possessing VLT permits are found along certain axial highways. Local spatial autocorrelation analyses suggest a significant link between walking accessibility to sites possessing VLT permits and the vulnerability of the communities: in a number of neighbourhood units with ready access to VLTs, the populations display high vulnerability. These findings reveal that accessibility to sites possessing a VLT permit is often linked to the vulnerability (socioeconomic and demographic components) of communities.
Reliance in our analyses on neighbourhood units with fairly small areas enabled us to emphasize the rectilinear dimension of the spatial distribution of sites possessing VLT permits. This is a significant link that public health officials must consider when elaborating programs to combat pathological gambling.

  9. Errors in retarding potential analyzers caused by nonuniformity of the grid-plane potential.

    NASA Technical Reports Server (NTRS)

    Hanson, W. B.; Frame, D. R.; Midgley, J. E.

    1972-01-01

    One aspect of the degradation in performance of retarding potential analyzers caused by potential depressions in the retarding grid is quantitatively estimated from laboratory measurements and theoretical calculations. A simple expression is obtained that permits the use of laboratory measurements of grid properties to make first-order corrections to flight data. Systematic positive errors in ion temperature of approximately 16% for the Ogo 4 instrument and 3% for the Ogo 6 instrument are deduced. The effects of the transverse electric fields arising from the grid potential depressions are not treated.

  10. The inventory for déjà vu experiences assessment. Development, utility, reliability, and validity.

    PubMed

    Sno, H N; Schalken, H F; de Jonghe, F; Koeter, M W

    1994-01-01

    In this article the development, utility, reliability, and validity of the Inventory for Déjà vu Experiences Assessment (IDEA) are described. The IDEA is a 23-item self-administered questionnaire consisting of a general section of nine questions and a qualitative section of 14 questions. The latter questions comprise 48 topics. The questionnaire appeared to be a user-friendly instrument with satisfactory to good reliability and validity. The IDEA permits the study of quantitative and qualitative characteristics of déjà vu experiences.

  11. A university teaching simulation facility

    NASA Technical Reports Server (NTRS)

    Stark, Lawrence; Kim, Won-Soo; Tendick, Frank; Tyler, Mitchell; Hannaford, Blake; Barakat, Wissam; Bergengruen, Olaf; Braddi, Louis; Eisenberg, Joseph; Ellis, Stephen

    1987-01-01

    An experimental telerobotics (TR) simulation is described suitable for studying human operator (HO) performance. Simple manipulator pick-and-place and tracking tasks allowed quantitative comparison of a number of calligraphic display viewing conditions. A number of control modes could be compared in this TR simulation, including displacement, rate, and acceleratory control using position and force joysticks. A homeomorphic controller turned out to be no better than joysticks; the adaptive properties of the HO can apparently permit quite good control over a variety of controller configurations and control modes. Training by optimal control example seemed helpful in preliminary experiments.

  12. Quantitative investigation of inappropriate regression model construction and the importance of medical statistics experts in observational medical research: a cross-sectional study.

    PubMed

    Nojima, Masanori; Tokunaga, Mutsumi; Nagamura, Fumitaka

    2018-05-05

    To investigate under what circumstances inappropriate use of 'multivariate analysis' is likely to occur and to identify the population that needs more support with medical statistics. The frequency of inappropriate regression model construction in multivariate analysis and related factors were investigated in observational medical research publications. The inappropriate algorithm of using only variables that were significant in univariate analysis was estimated to occur at 6.4% (95% CI 4.8% to 8.5%). This was observed in 1.1% of the publications with a medical statistics expert (hereinafter 'expert') as the first author, in 3.5% if an expert was included as coauthor and in 12.2% if experts were not involved. In the publications where the number of cases was 50 or less and the study did not include experts, inappropriate algorithm usage was observed with a high proportion of 20.2%. The OR of the involvement of experts for this outcome was 0.28 (95% CI 0.15 to 0.53). A further analysis showed that the involvement of experts and the implementation of inappropriate multivariate analysis are associated at the nation level (R=-0.652). Based on the results of this study, the benefit of participation of medical statistics experts is obvious. Experts should be involved for proper confounding adjustment and interpretation of statistical models. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
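
    The inappropriate algorithm the paper quantifies can be made concrete with a small simulation (all data synthetic). The sketch below screens five candidate predictors by their univariate significance and keeps only the survivors, which is shown purely to illustrate the criticized procedure, not to endorse it.

```python
# Simulated illustration of univariate screening before multivariable
# modelling: keep predictors whose simple-regression t statistic exceeds
# the approximate 5% two-sided critical value. A proper analysis would
# instead select variables on substantive grounds (e.g. known confounders).
import numpy as np

rng = np.random.default_rng(0)
n = 50
x = rng.normal(size=(n, 5))                    # five candidate predictors
y = x[:, 0] + 0.5 * x[:, 1] + rng.normal(size=n)

t_crit = 2.01                                  # approx. critical t for df = 48
kept = []
for j in range(x.shape[1]):
    r = np.corrcoef(x[:, j], y)[0, 1]
    t = r * np.sqrt((n - 2) / (1 - r ** 2))    # t statistic of the slope
    if abs(t) > t_crit:
        kept.append(j)
print("predictors surviving the univariate screen:", kept)
```

    The screen can easily discard true confounders whose effect only appears after adjustment, which is one reason the paper flags this workflow, especially in small samples without statistician involvement.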

  13. 3D MR flow analysis in realistic rapid-prototyping model systems of the thoracic aorta: comparison with in vivo data and computational fluid dynamics in identical vessel geometries.

    PubMed

    Canstein, C; Cachot, P; Faust, A; Stalder, A F; Bock, J; Frydrychowicz, A; Küffer, J; Hennig, J; Markl, M

    2008-03-01

    The knowledge of local vascular anatomy and function in the human body is of high interest for the diagnosis and treatment of cardiovascular disease. A comprehensive analysis of the hemodynamics in the thoracic aorta is presented based on the integration of flow-sensitive 4D MRI with state-of-the-art rapid prototyping technology and computational fluid dynamics (CFD). Rapid prototyping was used to transform aortic geometries as measured by contrast-enhanced MR angiography into realistic vascular models with large anatomical coverage. Integration into a flow circuit with patient-specific pulsatile in-flow conditions and application of flow-sensitive 4D MRI permitted detailed analysis of local and global 3D flow dynamics in a realistic vascular geometry. Visualization of characteristic 3D flow patterns and quantitative comparisons of the in vitro experiments with in vivo data and CFD simulations in identical vascular geometries were performed to evaluate the accuracy of vascular model systems. The results indicate the potential of such patient-specific model systems for detailed experimental simulation of realistic vascular hemodynamics. Further studies are warranted to examine the influence of refined boundary conditions of the human circulatory system such as fluid-wall interaction and their effect on normal and pathological blood flow characteristics associated with vascular geometry. (c) 2008 Wiley-Liss, Inc.

  14. A county-level cross-sectional analysis of positive deviance to assess multiple population health outcomes in Indiana.

    PubMed

    Hendryx, Michael; Guerra-Reyes, Lucia; Holland, Benjamin D; McGinnis, Michael Dean; Meanwell, Emily; Middlestadt, Susan E; Yoder, Karen M

    2017-10-11

    To test a positive deviance method to identify counties that are performing better than statistical expectations on a set of population health indicators. Quantitative, cross-sectional county-level secondary analysis of risk variables and outcomes in Indiana. Data are analysed using multiple linear regression to identify counties performing better or worse than expected given traditional risk indicators, with a focus on 'positive deviants' or counties performing better than expected. Counties in Indiana (n=92) constitute the unit of analysis. Per cent adult obesity, per cent fair/poor health, low birth weight per cent, per cent with diabetes, years of potential life lost, colorectal cancer incidence rate and circulatory disease mortality rate. County performance that outperforms expectations is for the most part outcome specific. But there are a few counties that performed particularly well across most measures. The positive deviance approach provides a means for state and local public health departments to identify places that show better health outcomes despite demographic, social, economic or behavioural disadvantage. These places may serve as case studies or models for subsequent investigations to uncover best practices in the face of adversity and generalise effective approaches to other areas. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
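
    The positive-deviance logic reduces to examining regression residuals: fit outcomes on risk indicators, then flag units that do better than the model predicts. The sketch below uses synthetic county data with invented variables.

```python
# Minimal sketch of positive-deviance identification (synthetic data):
# regress a county health outcome on risk indicators, then flag counties
# whose observed outcome is better than predicted (negative residual when
# lower outcome values are better). Variables are invented.
import numpy as np

rng = np.random.default_rng(1)
n_counties = 92
risk = rng.uniform(0, 1, size=(n_counties, 3))   # e.g. poverty, smoking, unemployment
outcome = risk @ np.array([10.0, 5.0, 3.0]) + rng.normal(0, 2, n_counties)

# ordinary least squares with an intercept
X = np.column_stack([np.ones(n_counties), risk])
beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
residuals = outcome - X @ beta

# positive deviants: at least one residual-SD better than expected
deviants = np.where(residuals < -residuals.std())[0]
print(f"{len(deviants)} counties perform better than expected")
```

    As in the study, such counties become candidates for case studies: the residual says only that they outperform their risk profile, not why.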

  15. Protein Signaling Networks from Single Cell Fluctuations and Information Theory Profiling

    PubMed Central

    Shin, Young Shik; Remacle, F.; Fan, Rong; Hwang, Kiwook; Wei, Wei; Ahmad, Habib; Levine, R.D.; Heath, James R.

    2011-01-01

    Protein signaling networks among cells play critical roles in a host of pathophysiological processes, from inflammation to tumorigenesis. We report on an approach that integrates microfluidic cell handling, in situ protein secretion profiling, and information theory to determine an extracellular protein-signaling network and the role of perturbations. We assayed 12 proteins secreted from human macrophages that were subjected to lipopolysaccharide challenge, which emulates the macrophage-based innate immune responses against Gram-negative bacteria. We characterize the fluctuations in protein secretion of single cells, and of small cell colonies (n = 2, 3, …), as a function of colony size. Measuring the fluctuations permits a validation of the conditions required for the application of a quantitative version of Le Chatelier's principle, as derived using information theory. This principle provides a quantitative prediction of the role of perturbations and allows a characterization of a protein-protein interaction network. PMID:21575571

  16. Quantitative water quality with LANDSAT and Skylab

    NASA Technical Reports Server (NTRS)

    Yarger, H. L.; Mccauley, J. R.

    1975-01-01

    Correlation studies were completed between LANDSAT Multispectral Scanner (MSS) band ratios derived from computer compatible tape (CCT) and 170 water samples taken from three large Kansas reservoirs, coincident with 16 different LANDSAT passes over a 13 month period. The following conclusions were obtained: (1) LANDSAT MSS reflectance levels are useful for quantitative measurement of suspended solids up to at least 900 ppm, (2) MSS band ratios derived from CCT can measure suspended solids with 67% confidence level accuracy of 12 ppm over the range 0-80 ppm and 35 ppm over the range 0-900 ppm, (3) suspended solids contour maps can be easily constructed from CCT for water bodies larger than approximately 100 acres, (4) ratioing suppresses MSS reflectance level dependence on seasonal sun angle variation and permits measurement of suspended load the year round in the middle latitudes. Skylab imagery from a single pass over three reservoirs compares favorably to LANDSAT results up to 100 ppm.
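
    Why band ratioing suppresses sun-angle dependence can be shown with a toy calibration: a multiplicative illumination factor applied to both bands cancels in their ratio, leaving the sediment signal. Band values, noise levels, and coefficients below are invented.

```python
# Hedged sketch of band-ratio calibration for suspended solids. The
# illumination factor multiplies both bands, so it cancels in the ratio;
# a least-squares fit then maps ratio to concentration. All numbers are
# synthetic and purely illustrative.
import numpy as np

rng = np.random.default_rng(2)
true_ss = rng.uniform(0, 80, 40)                 # suspended solids, ppm
illum = rng.uniform(0.7, 1.3, 40)                # seasonal sun-angle factor
band5 = illum * (20 + 0.5 * true_ss) + rng.normal(0, 0.5, 40)  # brightens with sediment
band7 = illum * 25.0                             # reference band, roughly constant
ratio = band5 / band7                            # illumination largely cancels

# linear least-squares calibration: suspended solids vs. band ratio
A = np.column_stack([np.ones_like(ratio), ratio])
coef, *_ = np.linalg.lstsq(A, true_ss, rcond=None)
rmse = np.sqrt(np.mean((A @ coef - true_ss) ** 2))
print(f"calibration RMSE: {rmse:.2f} ppm")
```

    Fitting concentration against a single band instead of the ratio would leave the illumination factor in the signal and degrade the calibration, which mirrors the paper's conclusion (4).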

  17. Multisite formative assessment for the Pathways study to prevent obesity in American Indian schoolchildren

    PubMed Central

    Gittelsohn, Joel; Evans, Marguerite; Story, Mary; Davis, Sally M; Metcalfe, Lauve; Helitzer, Deborah L; Clay, Theresa E

    2016-01-01

    We describe the formative assessment process, using an approach based on social learning theory, for the development of a school-based obesity-prevention intervention into which cultural perspectives are integrated. The feasibility phase of the Pathways study was conducted in multiple settings in 6 American Indian nations. The Pathways formative assessment collected both qualitative and quantitative data. The qualitative data identified key social and environmental issues and enabled local people to express their own needs and views. The quantitative, structured data permitted comparison across sites. Both types of data were integrated by using a conceptual and procedural model. The formative assessment results were used to identify and rank the behavioral risk factors that were to become the focus of the Pathways intervention and to provide guidance on developing common intervention strategies that would be culturally appropriate and acceptable to all sites. PMID:10195601

  18. Quantitative Analysis of High-Quality Officer Selection by Commandants Career-Level Education Board

    DTIC Science & Technology

    2017-03-01

    Naval Postgraduate School, Monterey, California, thesis. Approved for public release; distribution is unlimited. Excerpts: "...due to Marines being evaluated before the end of their initial service commitment. Our research utilizes quantitative variables to analyze the..." "...not provide detailed information why. B. LIMITATIONS: The photograph analysis in this research is strictly limited to a quantitative analysis in..."

  19. A simple way to unify multicriteria decision analysis (MCDA) and stochastic multicriteria acceptability analysis (SMAA) using a Dirichlet distribution in benefit-risk assessment.

    PubMed

    Saint-Hilary, Gaelle; Cadour, Stephanie; Robert, Veronique; Gasparini, Mauro

    2017-05-01

    Quantitative methodologies have been proposed to support decision making in drug development and monitoring. In particular, multicriteria decision analysis (MCDA) and stochastic multicriteria acceptability analysis (SMAA) are useful tools to assess the benefit-risk ratio of medicines according to the performances of the treatments on several criteria, accounting for the preferences of the decision makers regarding the relative importance of these criteria. However, even in its probabilistic form, MCDA requires exact elicitation of the criteria weights by the decision makers, which may be difficult to achieve in practice. SMAA allows for more flexibility and can be used with unknown or partially known preferences, but it is less popular due to its increased complexity and the high degree of uncertainty in its results. In this paper, we propose a simple model as a generalization of MCDA and SMAA, by applying a Dirichlet distribution to the weights of the criteria and by making its parameters vary. This single model encompasses both MCDA and SMAA and allows for a more extended exploration of the benefit-risk assessment of treatments. The precision of its results depends on the precision parameter of the Dirichlet distribution, which can naturally be interpreted as the strength of confidence of the decision makers in their elicitation of preferences. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
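
    The unifying role of the Dirichlet precision parameter can be sketched numerically: weights drawn with a large precision cluster at the elicited values (MCDA-like behavior), while a small precision spreads them out (SMAA-like behavior). Treatment scores and elicited weights below are invented.

```python
# Sketch of Dirichlet-weighted benefit-risk comparison under assumed
# numbers: draw criteria weights from Dirichlet(precision * elicited) and
# estimate the probability that treatment A beats treatment B on weighted
# utility. High precision ~ MCDA with fixed weights; low ~ SMAA.
import numpy as np

rng = np.random.default_rng(3)
elicited = np.array([0.5, 0.3, 0.2])        # elicited mean weights, 3 criteria
scores = np.array([[0.8, 0.4, 0.6],         # treatment A per criterion
                   [0.6, 0.7, 0.5]])        # treatment B per criterion

def prob_a_best(precision, n_draws=5000):
    """Monte Carlo P(utility_A > utility_B) with Dirichlet-random weights."""
    w = rng.dirichlet(precision * elicited, size=n_draws)
    utilities = w @ scores.T                # shape (n_draws, 2)
    return float(np.mean(utilities[:, 0] > utilities[:, 1]))

print("high precision (MCDA-like):", prob_a_best(1000.0))
print("low precision  (SMAA-like):", prob_a_best(2.0))
```

    With these illustrative scores A has the higher utility at the elicited weights, so the high-precision probability is near 1, while the low-precision probability stays well below it, reflecting weight uncertainty.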

  20. National citation patterns of NEJM, The Lancet, JAMA and The BMJ in the lay press: a quantitative content analysis.

    PubMed

    Casino, Gonzalo; Rius, Roser; Cobo, Erik

    2017-11-12

    To analyse the total number of newspaper articles citing the four leading general medical journals and to describe national citation patterns. Quantitative content analysis. Full text of 22 general newspapers in 14 countries over the period 2008-2015, collected from LexisNexis. The 14 countries have been categorised into four regions: the USA, the UK, Western World (European countries other than the UK, plus Australia, New Zealand and Canada) and Rest of the World (other countries). Press citations of four medical journals (two American: NEJM and JAMA; two British: The Lancet and The BMJ) in 22 newspapers. British and American newspapers cited some of the four analysed medical journals about three times a week in 2008-2015 (weekly mean 3.2 and 2.7 citations, respectively); the newspapers from other Western countries did so about once a week (weekly mean 1.1), and those from the Rest of the World cited them about once a month (monthly mean 1.1). The New York Times cited above all other newspapers (weekly mean 4.7). The analysis showed the existence of three national citation patterns in the daily press: American newspapers cited mostly American journals (70.0% of citations), British newspapers cited mostly British journals (86.5%) and the rest of the analysed press cited more British journals than American ones. The Lancet was the most cited journal in the press of almost all Western countries outside the USA and the UK. Multivariate correspondence analysis confirmed the national patterns and showed that over 85% of the citation data variability is retained in just one single new variable: the national dimension. British and American newspapers are the ones that cite the four analysed medical journals most often, showing a domestic preference for their respective national journals; non-British and non-American newspapers show a common international citation pattern.
© Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  1. Watershed-based point sources permitting strategy and dynamic permit-trading analysis.

    PubMed

    Ning, Shu-Kuang; Chang, Ni-Bin

    2007-09-01

    Permit-trading policy in a total maximum daily load (TMDL) program may provide an additional avenue to produce environmental benefit, which closely approximates what would be achieved through a command and control approach, with relatively lower costs. One of the important considerations that might affect an effective trading mechanism is determining the dynamic transaction prices and trading ratios in response to seasonal changes of assimilative capacity in the river. Advanced studies associated with multi-temporal, spatially varied trading ratios among point sources to manage water pollution hold considerable potential for industries and policy makers alike. This paper aims to present an integrated simulation and optimization analysis for generating spatially varied trading ratios and evaluating seasonal transaction prices accordingly. It is designed to configure a permit-trading structure basin-wide and provide decision makers with a wealth of cost-effective, technology-oriented, risk-informed, and community-based management strategies. The case study, seamlessly integrating a QUAL2E simulation model with an optimal waste load allocation (WLA) scheme in a designated TMDL study area, helps in understanding how environmental resource values vary over space and time. The pollutants of concern in this region, which are eligible for trading, mainly include both biochemical oxygen demand (BOD) and ammonia-nitrogen (NH3-N). The problem solution, as a consequence, suggests an array of waste load reduction targets in a well-defined WLA scheme and exhibits a dynamic permit-trading framework among different sub-watersheds in the study area. Research findings gained in this paper may extend to any transferable dynamic-discharge permit (TDDP) program worldwide.
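
    One common way to express spatially varied trading ratios, sketched here with invented transfer coefficients rather than the paper's QUAL2E outputs, is as the ratio of the two sources' impacts at a critical reach; the ratio then shifts between seasons as assimilative capacity changes.

```python
# Illustrative sketch of spatially varied, seasonal trading ratios. The
# transfer coefficients (impact at the critical reach per unit discharge)
# are hypothetical; in the paper they would come from water-quality
# simulation of the river under seasonal flow conditions.
def trading_ratio(impact_i, impact_j):
    """Units of source-j load reduction needed to offset one unit from
    source i, given each source's impact per unit discharge."""
    return impact_i / impact_j

low_flow = {"upstream": 0.9, "downstream": 0.5}    # hypothetical coefficients
high_flow = {"upstream": 0.6, "downstream": 0.45}

for season, tc in [("low flow", low_flow), ("high flow", high_flow)]:
    r = trading_ratio(tc["upstream"], tc["downstream"])
    print(f"{season}: 1 upstream unit trades for {r:.2f} downstream units")
```

    Because the coefficients change with seasonal flow, so do the ratios, which is the dynamic element the paper's framework is built around.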

  2. PIXE analysis on Maya blue in Prehispanic and colonial mural paintings

    NASA Astrophysics Data System (ADS)

    Sánchez del Río, M.; Martinetto, P.; Solís, C.; Reyes-Valerio, C.

    2006-08-01

    Particle induced X-ray emission (PIXE) experiments have been carried out at the AGLAE facility (Paris) on several mural samples containing Maya blue from different Prehispanic archaeological sites (Cacaxtla, El Tajín, Tamuin, Santa Cecilia Acatitlán) and from several colonial convents in the Mexican plateau (Jiutepec, Totimehuacán, Tezontepec and Cuauhtinchán). The analysis of the concentration of several elements permitted extraction of some information on the technique used for painting the mural, usually fresco. Principal component analysis permitted classification of the samples into groups. This grouping is discussed in relation to geographic and historic data.

  3. Large-Scale and Deep Quantitative Proteome Profiling Using Isobaric Labeling Coupled with Two-Dimensional LC-MS/MS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gritsenko, Marina A.; Xu, Zhe; Liu, Tao

    Comprehensive, quantitative information on abundances of proteins and their post-translational modifications (PTMs) can potentially provide novel biological insights into disease pathogenesis and therapeutic intervention. Herein, we introduce a quantitative strategy utilizing isobaric stable isotope-labelling techniques combined with two-dimensional liquid chromatography-tandem mass spectrometry (2D-LC-MS/MS) for large-scale, deep quantitative proteome profiling of biological samples or clinical specimens such as tumor tissues. The workflow includes isobaric labeling of tryptic peptides for multiplexed and accurate quantitative analysis, basic reversed-phase LC fractionation and concatenation for reduced sample complexity, and nano-LC coupled to high resolution and high mass accuracy MS analysis for high confidence identification and quantification of proteins. This proteomic analysis strategy has been successfully applied for in-depth quantitative proteomic analysis of tumor samples, and can also be used for integrated proteome and PTM characterization, as well as comprehensive quantitative proteomic analysis across samples from large clinical cohorts.

  4. Large-Scale and Deep Quantitative Proteome Profiling Using Isobaric Labeling Coupled with Two-Dimensional LC-MS/MS.

    PubMed

    Gritsenko, Marina A; Xu, Zhe; Liu, Tao; Smith, Richard D

    2016-01-01

    Comprehensive, quantitative information on abundances of proteins and their posttranslational modifications (PTMs) can potentially provide novel biological insights into disease pathogenesis and therapeutic intervention. Herein, we introduce a quantitative strategy utilizing isobaric stable isotope-labeling techniques combined with two-dimensional liquid chromatography-tandem mass spectrometry (2D-LC-MS/MS) for large-scale, deep quantitative proteome profiling of biological samples or clinical specimens such as tumor tissues. The workflow includes isobaric labeling of tryptic peptides for multiplexed and accurate quantitative analysis, basic reversed-phase LC fractionation and concatenation for reduced sample complexity, and nano-LC coupled to high resolution and high mass accuracy MS analysis for high confidence identification and quantification of proteins. This proteomic analysis strategy has been successfully applied for in-depth quantitative proteomic analysis of tumor samples and can also be used for integrated proteome and PTM characterization, as well as comprehensive quantitative proteomic analysis across samples from large clinical cohorts.

  5. An assessment of optical properties of dissolved organic material as quantitative source indicators in the Santa Ana River basin, Southern California

    USGS Publications Warehouse

    Bergamaschi, Brian A.; Kalve, Erica; Guenther, Larry; Mendez, Gregory O.; Belitz, Kenneth

    2005-01-01

    The ability to rapidly, reliably, and inexpensively characterize sources of dissolved organic material (DOM) in watersheds would allow water management agencies to more quickly identify problems in water sources, and to more efficiently allocate water resources by, for example, permitting real-time identification of high-quality water suitable for ground-water recharge, or poor-quality water in need of mitigation. This study examined the feasibility of using easily measurable intrinsic optical properties, absorbance and fluorescence spectra, as quantitative indicators of DOM sources and, thus, a predictor of water quality. The study focused on the Santa Ana River Basin, in southern California, USA, which comprises an area of dense urban development and an area of intense dairy production. Base flow in the Santa Ana Basin is primarily tertiary treated wastewater discharge. Available hydrologic data indicate that urban and agricultural runoff degrades water quality during storm events by introducing pathogens, nutrients, and other contaminants, including significant amounts of DOM. These conditions provide the basis for evaluating the use of DOM optical properties as a tracer of DOM from different sources. Sample spectra representing four principal DOM sources were identified among all samples collected in 1999 on the basis of basin hydrology and the distribution of spectral variability within all the sample data. A linear mixing model provided quantitative estimates of relative endmember contribution to sample spectra for monthly, storm, and diurnal samples. The spectral properties of the four sources (endmembers), Pristine Water, Wastewater, Urban Water, and Dairy Water, accounted for 94 percent of the variability in optical properties observed in the study, suggesting that all important DOM sources were represented.
The scale and distribution of the residual spectra, that not explained by the endmembers, suggested that the endmember spectra selected did not adequately represent Urban Water base flow. However, model assignments of sources generally agreed well with those expected, based on sampling location and hydrology. The results suggest that with a fuller characterization of the endmember spectra, analysis of optical properties will provide rapid quantitative estimates of the relative contribution of DOM sources in the Santa Ana Basin.
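
The linear unmixing step can be sketched numerically. In the sketch below, the four endmember names follow the abstract, but every spectrum, mixing fraction, and noise level is invented for illustration; the study's actual model and data are not reproduced here.

```python
import numpy as np

# Hypothetical endmember spectra (rows: wavelengths, columns: sources).
# The source names follow the abstract; all numbers are illustrative.
wavelengths = np.linspace(250, 450, 21)          # nm
E = np.column_stack([
    np.exp(-(wavelengths - 280) ** 2 / 300.0),   # "Pristine Water"
    np.exp(-(wavelengths - 320) ** 2 / 300.0),   # "Wastewater"
    np.exp(-(wavelengths - 360) ** 2 / 300.0),   # "Urban Water"
    np.exp(-(wavelengths - 400) ** 2 / 300.0),   # "Dairy Water"
])

# Synthetic sample spectrum: a known mixture plus mild measurement noise.
true_frac = np.array([0.1, 0.6, 0.2, 0.1])
rng = np.random.default_rng(0)
sample = E @ true_frac + rng.normal(0.0, 1e-3, wavelengths.size)

# Least-squares estimate of endmember contributions, reported as fractions.
frac, *_ = np.linalg.lstsq(E, sample, rcond=None)
frac = np.clip(frac, 0.0, None)   # crude stand-in for a nonnegativity constraint
frac /= frac.sum()
print(np.round(frac, 2))
```

The clip-and-renormalize step is only a stand-in for a properly constrained fit; a real analysis would use nonnegative least squares or a similar constrained solver.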

  6. The analysis of temperature effect and temperature compensation of MOEMS accelerometer based on a grating interferometric cavity

    NASA Astrophysics Data System (ADS)

    Han, Dandan; Bai, Jian; Lu, Qianbo; Lou, Shuqi; Jiao, Xufen; Yang, Guoguang

    2016-08-01

A temperature drift attributable to temperature variation adversely influences the output performance of an accelerometer. In this paper, a quantitative analysis of the temperature effect and the temperature compensation of a MOEMS accelerometer, which is composed of a grating interferometric cavity and a micromachined sensing chip, is proposed. A finite-element-method (FEM) approach is applied to simulate the deformation of the sensing chip of the MOEMS accelerometer at temperatures from -20°C to 70°C. The deformation changes the distance between the grating and the sensing chip, ultimately modulating the output intensities. A static temperature model describing the temperature characteristics of the accelerometer is established from the simulation results, and a temperature compensation based on this model is put forward, which can improve the output performance of the accelerometer. The model permits estimation of the temperature effect for accelerometers of this type that contain a micromachined sensing chip. Comparison of the output intensities with and without temperature compensation indicates that the compensation improves the stability of the output intensities of the MOEMS accelerometer based on a grating interferometric cavity.
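
The static-model compensation described above can be sketched as a low-order polynomial fit to the temperature-induced drift, subtracted from the raw output. This is a hedged illustration: the drift coefficients, reference temperature, and readings are invented, not the paper's calibration data.

```python
import numpy as np

# Simulated drift of output intensity vs. temperature (illustrative values,
# standing in for what a FEM simulation or oven calibration would provide).
T = np.linspace(-20.0, 70.0, 10)                 # temperature, deg C
drift = 0.8 + 4e-3 * T + 2e-5 * T ** 2           # simulated intensity drift

# Static temperature model: 2nd-order polynomial fit to the simulated drift.
model = np.poly1d(np.polyfit(T, drift, deg=2))

# Compensation: remove the modeled drift relative to a 25 deg C reference.
T_op = 35.0
raw_output = 0.95
compensated = raw_output - (model(T_op) - model(25.0))
print(round(float(compensated), 4))
```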

  7. Trace-level screening of dichlorophenols in processed dairy milk by headspace gas chromatography.

    PubMed

    Gras, Kaelyn; Luong, Jim; Gras, Ronda; Shellie, Robert A

    2016-10-01

A headspace gas chromatographic approach based on flame ionization detection has been successfully developed for the determination of parts-per-billion levels of 2,4-dichlorophenol and 2,6-dichlorophenol in processed dairy milk. Under certain environmental conditions, these compounds are produced by the reductive dechlorination of pentachlorophenol. Maintaining a highly inert chromatographic system and employing a recently commercialized inert capillary column permits the analysis of 2,4-dichlorophenol and 2,6-dichlorophenol without derivatization. Further, a detection limit improvement of more than a factor of two was achieved by adding sodium sulfate to substantially decrease the solute partition coefficient in the matrix. A detection limit of 1 ng/g and a limit of quantitation of 2 ng/g were attained, and a complete analysis can be conducted in < 13 min. Reproducibility of area counts over a range from 20 to 200 ng/g and over a period of 2 days was found to be less than 6% (n = 20). A linear range from 5 to 500 ng/g with a correlation coefficient of at least 0.9992 was obtained for 2,4-dichlorophenol and 2,6-dichlorophenol. Spike recoveries from 10 to 500 ng/g for all the analytes ranged from 92 to 102%. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
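
The linearity and spike-recovery checks reported for such a method can be sketched as follows. The peak areas, calibration slope, and noise below are simulated for illustration, not measured data from the study.

```python
import numpy as np

# Simulated calibration over the reported 5-500 ng/g linear range.
conc = np.array([5.0, 20.0, 50.0, 100.0, 200.0, 500.0])      # ng/g
rng = np.random.default_rng(3)
area = 12.3 * conc + 4.0 + rng.normal(0.0, 5.0, conc.size)   # detector counts

# Calibration line and correlation coefficient.
slope, intercept = np.polyfit(conc, area, 1)
r = np.corrcoef(conc, area)[0, 1]

# Spike recovery: back-calculate a sample spiked at 100 ng/g from the fit.
measured_area = 12.3 * 100.0 + 4.0           # simulated response of the spike
back_calc = (measured_area - intercept) / slope              # ng/g
recovery_pct = 100.0 * back_calc / 100.0
print(round(float(r), 4), round(float(recovery_pct), 1))
```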

  8. Elemental Impurities in Pharmaceutical Excipients.

    PubMed

    Li, Gang; Schoneker, Dave; Ulman, Katherine L; Sturm, Jason J; Thackery, Lisa M; Kauffman, John F

    2015-12-01

Control of elemental impurities in pharmaceutical materials is currently undergoing a transition from control based on concentrations in components of drug products to control based on permitted daily exposures in drug products. Within the pharmaceutical community, there is uncertainty regarding the impact of these changes on manufacturers of drug products. This uncertainty is fueled in part by a lack of publicly available information on elemental impurity levels in common pharmaceutical excipients. This paper summarizes a recent survey of elemental impurity levels in common pharmaceutical excipients as well as some drug substances. A widely applicable analytical procedure was developed and was shown to be suitable for analysis of elements that are subject to United States Pharmacopoeia Chapter <232> and International Conference on Harmonization's Q3D Guideline on Elemental Impurities. The procedure utilizes microwave-assisted digestion of pharmaceutical materials and inductively coupled plasma mass spectrometry for quantitative analysis of these elements. The procedure was applied to 190 samples from 31 different excipients and 15 samples from eight drug substances provided through the International Pharmaceutical Excipient Council of the Americas. The results of the survey indicate that, for the materials included in the study, relatively low levels of elemental impurities are present. © 2015 The Authors. Journal of Pharmaceutical Sciences published by Wiley Periodicals, Inc. and the American Pharmacists Association.
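
The shift from component concentrations to permitted daily exposures (PDEs) amounts to a simple unit conversion against a daily dose. A minimal sketch follows; the PDE values quoted are the widely published ICH Q3D oral limits for the four Class 1 elements and should be verified against the current guideline, while the measured concentrations and daily dose are invented.

```python
# ICH Q3D oral PDEs for the Class 1 elements, in ug/day (verify against the
# current guideline before any real use).
PDE_ORAL_UG_PER_DAY = {"Cd": 5.0, "Pb": 5.0, "As": 15.0, "Hg": 30.0}

def daily_exposure(conc_ug_per_g, daily_dose_g):
    """Daily exposure (ug/day) = concentration (ug/g) x daily dose (g/day)."""
    return {el: c * daily_dose_g for el, c in conc_ug_per_g.items()}

# Hypothetical ICP-MS results for one excipient, at a 10 g/day maximum dose.
measured = {"Cd": 0.02, "Pb": 0.05, "As": 0.10, "Hg": 0.01}  # ug/g
exposure = daily_exposure(measured, daily_dose_g=10.0)
compliant = {el: exposure[el] <= PDE_ORAL_UG_PER_DAY[el] for el in exposure}
print(exposure, compliant)
```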

  9. Integrative Functional Genomics for Systems Genetics in GeneWeaver.org.

    PubMed

    Bubier, Jason A; Langston, Michael A; Baker, Erich J; Chesler, Elissa J

    2017-01-01

The abundance of existing functional genomics studies permits an integrative approach to interpreting and resolving the results of diverse systems genetics studies. However, a major challenge lies in assembling and harmonizing heterogeneous data sets across species for facile comparison to the positional candidate genes and coexpression networks that come from systems genetic studies. GeneWeaver is an online database and suite of tools at www.geneweaver.org that allows for fast aggregation and analysis of gene set-centric data. GeneWeaver contains curated experimental data together with resource-level data such as GO annotations, MP annotations, and KEGG pathways, along with persistent stores of user-entered data sets. These can be entered directly into GeneWeaver or transferred from widely used resources such as GeneNetwork.org. Data are analyzed using statistical tools and advanced graph algorithms to discover new relations, prioritize candidate genes, and generate functional hypotheses. Here we use GeneWeaver to find genes common to multiple gene sets, prioritize candidate genes from a quantitative trait locus, and characterize a set of differentially expressed genes. Coupling a large multispecies repository of curated and empirical functional genomics data to fast computational tools allows for the rapid integrative analysis of heterogeneous data for interpreting and extrapolating systems genetics results.
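
The gene-set-centric operations described, finding genes common to multiple sets and prioritizing candidates by set membership, reduce to set algebra. A toy sketch follows; the gene symbols and set names are invented and do not come from GeneWeaver.

```python
from collections import Counter

# Hypothetical gene sets: a QTL positional candidate list, a differential
# expression result, and an annotation-derived set.
gene_sets = {
    "QTL_chr2": {"Ahr", "Comt", "Gabra2", "Slc6a4"},
    "diff_expr": {"Comt", "Gabra2", "Bdnf"},
    "GO_behavior": {"Gabra2", "Bdnf", "Drd2"},
}

# Genes common to every set.
common = set.intersection(*gene_sets.values())

# Prioritize candidates by how many sets contain them.
counts = Counter(g for s in gene_sets.values() for g in s)
ranked = sorted(counts, key=counts.get, reverse=True)
print(common, ranked[:3])
```

GeneWeaver's actual tools operate on far larger curated sets with graph algorithms, but the intersection-and-count idea above is the conceptual core of the "genes common to multiple gene sets" use case.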

  10. Hydrodynamic Properties of Planing Surfaces and Flying Boats

    NASA Technical Reports Server (NTRS)

    Sokolov, N. A.

    1950-01-01

The study of the hydrodynamic properties of the planing bottoms of flying boats and seaplane floats is at the present time based exclusively on the curves of towing tests conducted in tanks. In order to provide a rational basis for the test procedure in tanks and practical design data, a theoretical study must be made of the flow at the step, and relations must be derived that show not only qualitatively but quantitatively the interrelations of the various factors involved. The general solution of the problem of the development of hydrodynamic forces during the motion of a seaplane float or flying boat is very difficult, for it requires a three-dimensional solution, which does not always permit reducing the analysis to workable computation formulas. On the other hand, the problem is complicated by the fact that the analysis is concerned with two fluid mediums, namely air and water, which have a surface of density discontinuity between them. The theoretical and experimental investigations on the hydrodynamics of a ship cannot be completely carried over to the design of floats and flying-boat hulls, because of the difference in the shape of the contour lines of the bodies and because of the entirely different flow conditions from the hydrodynamic viewpoint.

  11. Detection of inflammatory cytokines using a fiber optic microsphere immunoassay array

    NASA Astrophysics Data System (ADS)

    Blicharz, Timothy M.; Walt, David R.

    2006-10-01

    A multiplexed fiber optic microsphere-based immunoassay array capable of simultaneously measuring five inflammatory cytokines has been developed. Five groups of amine-functionalized 3.1 micron microspheres were internally encoded with five distinct concentrations of a europium dye and converted to cytokine probes by covalently coupling monoclonal capture antibodies specific for human VEGF, IFN-gamma, RANTES, IP-10, and Eotaxin-3 to the microspheres via glutaraldehyde chemistry. The microspheres were pooled and loaded into a 1 mm diameter fiber optic bundle containing ~50,000 individual etched microwells, producing the multiplexed cytokine immunoassay array. Multiple arrays can be created from a single microsphere pool for high throughput sample analysis. Sandwich fluoroimmunoassays were performed by incubating the probe array in a sample, followed by incubation in a mixture of biotin-labeled detection antibodies that are complementary to the five cytokines. Finally, universal detection of each protein was performed using a fluorescence imaging system after briefly immersing the array in a solution of fluorophore-labeled streptavidin. The multiplexed cytokine array has been shown to respond selectively to VEGF, IFNgamma, RANTES, IP-10, and Eotaxin-3, permitting multiplexed quantitative analysis. Ultimately, the multiplexed cytokine array will be utilized to evaluate the potential of using saliva as a noninvasive diagnostic fluid for pulmonary inflammatory diseases such as asthma.

  12. rpb2 is a reliable reference gene for quantitative gene expression analysis in the dermatophyte Trichophyton rubrum.

    PubMed

    Jacob, Tiago R; Peres, Nalu T A; Persinoti, Gabriela F; Silva, Larissa G; Mazucato, Mendelson; Rossi, Antonio; Martinez-Rossi, Nilce M

    2012-05-01

The selection of reference genes used for data normalization to quantify gene expression by real-time PCR amplifications (qRT-PCR) is crucial for the accuracy of this technique. In spite of this, little information regarding such genes for qRT-PCR is available for gene expression analyses in pathogenic fungi. Thus, we investigated the suitability of eight candidate reference genes in isolates of the human dermatophyte Trichophyton rubrum subjected to several environmental challenges, such as drug exposure, interaction with human nail and skin, and heat stress. The stability of these genes was determined by the geNorm, NormFinder and BestKeeper programs. The gene with the most stable expression in the majority of the conditions tested was rpb2 (DNA-dependent RNA polymerase II), which was validated in three T. rubrum strains. Moreover, the combination of the rpb2 and chs1 (chitin synthase) genes provided the most reliable qRT-PCR data normalization in T. rubrum under a broad range of biological conditions. To the best of our knowledge, this is the first report on the selection of reference genes for qRT-PCR data normalization in dermatophytes, and the results of these studies should permit further analysis of gene expression under several experimental conditions, with improved accuracy and reliability.
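
A geNorm-style stability measure M, the quantity these programs rank candidates by, can be sketched in miniature: for each gene, M is the mean standard deviation of its log2 expression ratios against every other candidate across samples, with lower M meaning more stable. The expression values below are simulated; this is the published geNorm idea in outline, not the authors' code.

```python
import numpy as np

# Simulated expression of three candidate reference genes across 8 samples.
# rpb2 is simulated as stable, actb as unstable (illustrative only).
rng = np.random.default_rng(1)
n_samples = 8
expr = {
    "rpb2": 100 * 2 ** rng.normal(0, 0.05, n_samples),  # stable
    "chs1": 80 * 2 ** rng.normal(0, 0.08, n_samples),   # fairly stable
    "actb": 120 * 2 ** rng.normal(0, 0.5, n_samples),   # unstable
}

def genorm_M(expr):
    """geNorm-style stability: mean pairwise sd of log2 ratios (lower = better)."""
    genes = list(expr)
    M = {}
    for g in genes:
        sds = [np.std(np.log2(expr[g] / expr[h])) for h in genes if h != g]
        M[g] = float(np.mean(sds))
    return M

M = genorm_M(expr)
best = min(M, key=M.get)
print(best, {g: round(m, 3) for g, m in M.items()})
```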

  13. Market power, private information and the optimal scale of pollution permit markets with application to North Carolina’s Neuse River

    USDA-ARS?s Scientific Manuscript database

    We extend the analysis of optimal scale in pollution permit markets by allowing for both market power and private information. The effect of these considerations on optimal scale is determined by analyzing pollution of nitrogen from Waste Water Treatment Plants (WWTP) into North Carolina’s Neuse Riv...

  14. 76 FR 54163 - Proximity Detection Systems for Continuous Mining Machines in Underground Coal Mines

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-31

    ... analysis of fatalities and non-fatal accidents during the 1984 through 2010 period indicates that many of... under 30 CFR 18.82 and issued an experimental permit on May 30, 2003. After several revisions, the... Geosteering Tramguard TM System, which MSHA tested in June 2005 under an experimental permit on a remote...

  15. 78 FR 11628 - Fisheries of the Caribbean, Gulf of Mexico, and South Atlantic; Snapper-Grouper Fishery of the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-19

    ... Administration (NOAA), Commerce. ACTION: Notice of receipt of an application for an exempted fishing permit; request for comments. SUMMARY: NMFS announces the receipt of an application for an exempted fishing permit... collected from the fish for genetic analysis, and age and growth studies. Additional information on the...

  16. M113A1 Day/Night Movement Rate Analysis

    DTIC Science & Technology

    1975-06-01

d. Each vehicle traversed the course using the path of the preceding vehicle, i.e., no "free play" in selecting a route was permitted. The test course varied from

  17. 76 FR 8997 - Notice of Decision To Issue Permits for the Importation of Fresh Strawberries From Jordan Into...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-16

    ...] Notice of Decision To Issue Permits for the Importation of Fresh Strawberries From Jordan Into the... continental United States of fresh strawberries from Jordan. Based on the findings of a pest risk analysis... strawberries from Jordan. DATES: Effective Date: February 16, 2011. FOR FURTHER INFORMATION CONTACT: Ms. Donna...

  18. Diagnostic value of (99m)Tc-3PRGD2 scintimammography for differentiation of malignant from benign breast lesions: Comparison of visual and semi-quantitative analysis.

    PubMed

    Chen, Qianqian; Xie, Qian; Zhao, Min; Chen, Bin; Gao, Shi; Zhang, Haishan; Xing, Hua; Ma, Qingjie

    2015-01-01

To compare the diagnostic value of visual and semi-quantitative analysis of technetium-99m poly(ethylene glycol)-arginine-glycine-aspartic acid ((99m)Tc-3PRGD2) scintimammography (SMG) for better differentiation of benign from malignant breast masses, and also to investigate the incremental role of the semi-quantitative index of SMG. A total of 72 patients with breast lesions were included in the study. Technetium-99m-3PRGD2 SMG was performed with single photon emission computed tomography (SPET) at 60 min after intravenous injection of 749 ± 86 MBq of the radiotracer. Images were evaluated by visual interpretation and by semi-quantitative indices of tumor to non-tumor (T/N) ratios, which were compared with pathology results. Receiver operating characteristics (ROC) curve analyses were performed to determine the optimal visual grade, to calculate cut-off values of semi-quantitative indices, and to compare visual and semi-quantitative diagnostic values. Among the 72 patients, 89 lesions were confirmed by histopathology after fine needle aspiration biopsy or surgery, 48 malignant and 41 benign lesions. The mean T/N ratio of (99m)Tc-3PRGD2 SMG in malignant lesions was significantly higher than that in benign lesions (P<0.05). When visual grade 2 was used as the cut-off value for the detection of primary breast cancer, the sensitivity, specificity and accuracy were 81.3%, 70.7%, and 76.4%, respectively. When a T/N ratio of 2.01 was used as the cut-off value, the sensitivity, specificity and accuracy were 79.2%, 75.6%, and 77.5%, respectively. According to ROC analysis, the area under the curve for semi-quantitative analysis was higher than that for visual analysis, but the difference was not statistically significant (P=0.372). Compared with visual analysis or semi-quantitative analysis alone, the sensitivity, specificity and accuracy of visual analysis combined with semi-quantitative analysis in diagnosing primary breast cancer were higher, being 87.5%, 82.9%, and 85.4%, respectively. The area under the curve was 0.891. Results of the present study suggest that semi-quantitative and visual analyses gave statistically similar results. The semi-quantitative analysis provided incremental value additive to visual analysis of (99m)Tc-3PRGD2 SMG for the detection of breast cancer. Notably, when the tumor was located in the medial part of the breast, semi-quantitative analysis gave better diagnostic results.
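
The cut-off evaluation and ROC area computation used in such comparisons can be sketched directly. The T/N values below are simulated (only the 2.01 cut-off and the group sizes of 48 and 41 mirror the abstract); the AUC is computed via the Mann-Whitney statistic, one standard nonparametric estimator.

```python
import numpy as np

# Simulated T/N ratios for 48 malignant and 41 benign lesions (illustrative).
rng = np.random.default_rng(2)
tn_malignant = rng.normal(2.8, 0.6, 48)
tn_benign = rng.normal(1.7, 0.5, 41)

# Sensitivity and specificity at the 2.01 cut-off.
cutoff = 2.01
sens = float(np.mean(tn_malignant > cutoff))
spec = float(np.mean(tn_benign <= cutoff))

# AUC = P(malignant score > benign score), ties counted half
# (the Mann-Whitney estimator of the ROC area).
diff = tn_malignant[:, None] - tn_benign[None, :]
auc = float(np.mean(diff > 0) + 0.5 * np.mean(diff == 0))
print(round(sens, 3), round(spec, 3), round(auc, 3))
```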

  19. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shortle, J. F.; Allocco, M.

    2005-01-01

Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize the hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty," that describes the complexity in modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
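
The three-metric prioritization can be sketched as a simple scoring rule. This is only one plausible reading of the framework, with invented scenarios, invented 1-5 scales, and an invented combining function; the paper's actual scheme may differ.

```python
# Hypothetical hazard scenarios scored on 1-5 scales:
# (name, severity, likelihood, modeling_difficulty);
# lower modeling difficulty = more tractable for quantitative analysis.
scenarios = [
    ("wake encounter on parallel approach", 5, 3, 2),
    ("runway incursion",                    5, 2, 4),
    ("minor taxiway scrape",                2, 4, 1),
]

def priority(sev, lik, diff):
    # High severity and likelihood raise priority; high modeling
    # difficulty lowers it (an assumed, illustrative combining rule).
    return sev * lik / diff

ranked = sorted(scenarios, key=lambda s: priority(*s[1:]), reverse=True)
print([name for name, *_ in ranked])
```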

  20. Numerical simulations for quantitative analysis of electrostatic interaction between atomic force microscopy probe and an embedded electrode within a thin dielectric: meshing optimization, sensitivity to potential distribution and impact of cantilever contribution

    NASA Astrophysics Data System (ADS)

    Azib, M.; Baudoin, F.; Binaud, N.; Villeneuve-Faure, C.; Bugarin, F.; Segonds, S.; Teyssedre, G.

    2018-04-01

Recent experimental results demonstrated that an electrostatic force distance curve (EFDC) can be used for space charge probing in thin dielectric layers. A main advantage of the method is claimed to be its sensitivity to charge localization, which, however, needs to be substantiated by numerical simulations. In this paper, we have developed a model which permits us to compute an EFDC accurately by using the most sophisticated and accurate geometry for the atomic force microscopy probe. To avoid simplifications and in order to reproduce experimental conditions, the EFDC has been simulated for a system constituted of a polarized electrode embedded in a thin dielectric layer (SiNx). The individual contributions of forces on the tip and on the cantilever have been analyzed separately to account for possible artefacts. The EFDC sensitivity to potential distribution is studied through the change in electrode shape, namely the width and the depth. Finally, the numerical results have been compared with experimental data.

  1. Analysis of Protein Interactions with Picomolar Binding Affinity by Fluorescence-Detected Sedimentation Velocity

    PubMed Central

    2014-01-01

    The study of high-affinity protein interactions with equilibrium dissociation constants (KD) in the picomolar range is of significant interest in many fields, but the characterization of stoichiometry and free energy of such high-affinity binding can be far from trivial. Analytical ultracentrifugation has long been considered a gold standard in the study of protein interactions but is typically applied to systems with micromolar KD. Here we present a new approach for the study of high-affinity interactions using fluorescence detected sedimentation velocity analytical ultracentrifugation (FDS-SV). Taking full advantage of the large data sets in FDS-SV by direct boundary modeling with sedimentation coefficient distributions c(s), we demonstrate detection and hydrodynamic resolution of protein complexes at low picomolar concentrations. We show how this permits the characterization of the antibody–antigen interactions with low picomolar binding constants, 2 orders of magnitude lower than previously achieved. The strongly size-dependent separation and quantitation by concentration, size, and shape of free and complex species in free solution by FDS-SV has significant potential for studying high-affinity multistep and multicomponent protein assemblies. PMID:24552356
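
The practical constraint motivating FDS-SV, that characterizing a picomolar KD requires working at picomolar concentrations, follows directly from mass-action equilibrium: well above KD nearly everything is bound regardless of affinity, so KD is only resolvable near it. A short worked sketch (the 20 pM KD is an illustrative value in the range discussed, not a measured constant):

```python
import math

def fraction_bound(a0, kd):
    """Fraction of A in complex for A + B <-> AB with equal totals a0 = b0.

    [AB] = x solves (a0 - x)^2 = kd * x, i.e.
    x^2 - (2*a0 + kd)*x + a0^2 = 0 (smaller root).
    Concentrations in molar units.
    """
    x = ((2 * a0 + kd) - math.sqrt((2 * a0 + kd) ** 2 - 4 * a0 ** 2)) / 2
    return x / a0

kd = 20e-12  # illustrative 20 pM dissociation constant
for a0 in (1e-9, 20e-12, 1e-12):
    print(f"{a0:.0e} M: fraction bound = {fraction_bound(a0, kd):.2f}")
```

At 1 nM the complex is nearly saturated (uninformative about KD), while at low picomolar totals the bound fraction varies steeply with concentration, which is why picomolar detection sensitivity is the enabling capability here.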

  2. Signal improvement in multiphoton microscopy by reflection with simple mirrors near the sample

    NASA Astrophysics Data System (ADS)

    Rehberg, Markus; Krombach, Fritz; Pohl, Ulrich; Dietzel, Steffen

    2010-03-01

In conventional fluorescence or confocal microscopy, emitted light is generated not only in the focal plane but also above and below it. The situation is different in multiphoton-induced fluorescence and multiphoton-induced higher harmonic generation. Here, restriction of signal generation to a single focal point permits all emitted photons to contribute to image formation if collected, regardless of their path through the specimen. Often, the intensity of the emitted light is rather low in biological specimens. We present a method to significantly increase the fraction of photons collected by an epi (backward) detector by placing a simple mirror, an aluminum-coated coverslip, directly under the sample. Samples investigated include fluorescent test slides, collagen gels, and thin-layered, intact mouse skeletal muscles. Quantitative analysis revealed an intensity increase of second- and third-harmonic generated signal in skeletal muscle of ninefold and sevenfold, respectively, and of fluorescent signal in test slides of up to twofold. Our approach thus allows significant signal improvement also for situations where forward detection is impossible, e.g., due to the anatomy of animals in intravital microscopy.

  3. Monitoring nekton as a bioindicator in shallow estuarine habitats

    USGS Publications Warehouse

    Raposa, K.B.; Roman, C.T.; Heltshe, J.F.

    2003-01-01

Long-term monitoring of estuarine nekton has many practical and ecological benefits but efforts are hampered by a lack of standardized sampling procedures. This study provides a rationale for monitoring nekton in shallow (< 1 m), temperate, estuarine habitats and addresses some important issues that arise when developing monitoring protocols. Sampling in seagrass and salt marsh habitats is emphasized due to the susceptibility of each habitat to anthropogenic stress and to the abundant and rich nekton assemblages that each habitat supports. Extensive sampling with quantitative enclosure traps that estimate nekton density is suggested. These gears have a high capture efficiency in most habitats and are small enough (e.g., 1 m²) to permit sampling in specific microhabitats. Other aspects of nekton monitoring are discussed, including spatial and temporal sampling considerations, station selection, sample size estimation, and data collection and analysis. Developing and initiating long-term nekton monitoring programs will help evaluate natural and human-induced changes in estuarine nekton over time and advance our understanding of the interactions between nekton and the dynamic estuarine environment.

  4. Development of an ultra high performance liquid chromatography method for determining triamcinolone acetonide in hydrogels using the design of experiments/design space strategy in combination with process capability index.

    PubMed

    Oliva, Alexis; Monzón, Cecilia; Santoveña, Ana; Fariña, José B; Llabrés, Matías

    2016-07-01

An ultra high performance liquid chromatography method was developed and validated for the quantitation of triamcinolone acetonide in an injectable ophthalmic hydrogel, in order to determine the contribution of analytical method error to the content uniformity measurement. During the development phase, the design of experiments/design space strategy was used. For this, the free R environment was used as an alternative to commercial software, providing a fast, efficient tool for data analysis. The process capability index was used to find the permitted level of variation for each factor and to define the design space. All these aspects were analyzed and discussed under different experimental conditions by the Monte Carlo simulation method. Subsequently, a pre-study validation procedure was performed in accordance with the International Conference on Harmonization guidelines. The validated method was applied for the determination of uniformity of dosage units, and the reasons for variability (inhomogeneity and analytical method error) were analyzed based on the overall uncertainty. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
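
The process capability index used here to judge permitted variation is conventionally Cpk = min(USL - mean, mean - LSL) / (3 * sigma). A minimal sketch with invented specification limits and assay results (not the study's data):

```python
import statistics

# Hypothetical assay results, as percent of label claim, with illustrative
# specification limits of 95-105%.
measurements = [99.2, 100.1, 99.8, 100.4, 99.6, 100.0, 99.9, 100.3]
LSL, USL = 95.0, 105.0

mean = statistics.fmean(measurements)
sigma = statistics.stdev(measurements)     # sample standard deviation

# Cpk: distance from the mean to the nearer spec limit, in units of 3*sigma.
cpk = min(USL - mean, mean - LSL) / (3 * sigma)
print(round(cpk, 2))
```

A Cpk above roughly 1.33 is a common rule-of-thumb threshold for a capable process; the paper's Monte Carlo step essentially repeats this calculation over simulated variation in each factor.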

  5. On-orbit performance of the Landsat 8 Operational Land Imager

    USGS Publications Warehouse

    Micijevic, Esad; Vanderwerff, Kelly; Scaramuzza, Pat; Morfitt, Ron; Barsi, Julia A.; Levy, Raviv

    2014-01-01

The Landsat 8 satellite was launched on February 11, 2013, to systematically collect multispectral images for detection and quantitative analysis of changes on the Earth’s surface. The collected data are stored at the U.S. Geological Survey (USGS) Earth Resources Observation and Science (EROS) Center and continue the longest archive of medium resolution Earth images. There are two imaging instruments onboard the satellite: the Operational Land Imager (OLI) and the Thermal InfraRed Sensor (TIRS). This paper summarizes the radiometric performance of the OLI, including bias stability, system noise, saturation, and other artifacts observed in its data during the first 1.5 years on orbit. Detector noise levels remain low and signal-to-noise ratios high, greatly exceeding the requirements. Impulse noise and saturation are present in imagery, but have a negligible effect on Landsat 8 products. Oversaturation happens occasionally, but the affected detectors quickly restore their nominal responsivity. Overall, the OLI performs very well on orbit and provides high quality products to the user community. © (2014) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.

  6. Study of the surfactant role in latex-aerogel systems by scanning transmission electron microscopy on aqueous suspensions.

    PubMed

    Perret, A; Foray, G; Masenelli-Varlot, K; Maire, E; Yrieix, B

    2018-01-01

For insulation applications, boards thinner than 2 cm are under design with specific thermal conductivities lower than 15 mW m⁻¹ K⁻¹. This requires binding slightly hydrophobic aerogels, which are highly nanoporous granular materials. To reach this step and ensure insulation board durability at the building scale, it is compulsory to design, characterise and analyse the microstructure at the nanoscale. It is indeed necessary to understand how the solid material is formed from a liquid suspension. This issue is addressed in this paper through wet-STEM experiments carried out in an Environmental Scanning Electron Microscope (ESEM). Latex-surfactant binary blends and latex-surfactant-aerogel ternary systems are studied, with two surfactants of very different chemical structures. Image analysis is used to distinguish the different components and obtain quantitative morphological parameters which describe the sample architecture. The evolution of such morphological parameters during water evaporation permits a good understanding of the role of the surfactant. © 2017 The Authors Journal of Microscopy © 2017 Royal Microscopical Society.

  7. Mass Defect Labeling of Cysteine for Improving Peptide Assignment in Shotgun Proteomic Analyses

    PubMed Central

    Hernandez, Hilda; Niehauser, Sarah; Boltz, Stacey A.; Gawandi, Vijay; Phillips, Robert S.; Amster, I. Jonathan

    2006-01-01

A method for improving the identification of peptides in a shotgun proteome analysis using accurate mass measurement has been developed. The improvement is based upon the derivatization of cysteine residues with a novel reagent, 2,4-dibromo-(2′-iodo)acetanilide. The derivatization changes the mass defect of cysteine-containing proteolytic peptides in a manner that increases their identification specificity. Peptide masses were measured using matrix-assisted laser desorption/ionization Fourier transform ion cyclotron resonance mass spectrometry. Reactions with protein standards show that the derivatization of cysteine is rapid and quantitative, and the data suggest that the derivatized peptides are more easily ionized or detected than unlabeled cysteine-containing peptides. The reagent was tested on a 15N-metabolically labeled proteome from M. maripaludis. Proteins were identified by their accurate mass values and from their nitrogen stoichiometry. A total of 47% of the labeled peptides were identified versus 27% for the unlabeled peptides. This procedure permits the identification of proteins from the M. maripaludis proteome that are not usually observed by the standard protocol and shows that better protein coverage is obtained with this methodology. PMID:16689545

  8. Anharmonic Effects on Vibrational Spectra Intensities: Infrared, Raman, Vibrational Circular Dichroism and Raman Optical Activity

    PubMed Central

    Bloino, Julien; Biczysko, Malgorzata; Barone, Vincenzo

    2017-01-01

The aim of this paper is twofold. First, we want to report the extension of our virtual multifrequency spectrometer (VMS) to anharmonic intensities for Raman Optical Activity (ROA) with the full inclusion of first- and second-order resonances for both frequencies and intensities in the framework of the generalized second-order vibrational perturbation theory (GVPT2) for all kinds of vibrational spectroscopies. Then, from a more general point of view, we want to present and validate the performance of VMS for the parallel analysis of different vibrational spectra for medium-sized molecules (IR, Raman, VCD, ROA) including both mechanical and electric/magnetic anharmonicity. For the well-known methyloxirane benchmark, careful selection of density functional, basis set, and resonance thresholds permitted a qualitative and quantitative vis-à-vis comparison between experimental and computed band positions and shapes. Next, the whole series of halogenated azetidinones is analyzed, showing that it is now possible to interpret different spectra in terms of electronegativity, polarizability, and hindrance variation between closely related substituents, chiral spectroscopies being particularly effective in this connection. PMID:26580121

  9. Fiber estimation and tractography in diffusion MRI: Development of simulated brain images and comparison of multi-fiber analysis methods at clinical b-values

    PubMed Central

    Wilkins, Bryce; Lee, Namgyun; Gajawelli, Niharika; Law, Meng; Leporé, Natasha

    2015-01-01

    Advances in diffusion-weighted magnetic resonance imaging (DW-MRI) have led to many alternative diffusion sampling strategies and analysis methodologies. A common objective among methods is estimation of white matter fiber orientations within each voxel, as doing so permits in-vivo fiber-tracking and the ability to study brain connectivity and networks. Knowledge of how DW-MRI sampling schemes affect fiber estimation accuracy, and consequently tractography and the ability to recover complex white-matter pathways, as well as differences between results due to choice of analysis method and which method(s) perform optimally for specific data sets, all remain important problems, especially as tractography-based studies become common. In this work we begin to address these concerns by developing sets of simulated diffusion-weighted brain images which we then use to quantitatively evaluate the performance of six DW-MRI analysis methods in terms of estimated fiber orientation accuracy, false-positive (spurious) and false-negative (missing) fiber rates, and fiber-tracking. The analysis methods studied are: 1) a two-compartment “ball and stick” model (BSM) (Behrens et al., 2003); 2) a non-negativity constrained spherical deconvolution (CSD) approach (Tournier et al., 2007); 3) analytical q-ball imaging (QBI) (Descoteaux et al., 2007); 4) q-ball imaging with Funk-Radon and Cosine Transform (FRACT) (Haldar and Leahy, 2013); 5) q-ball imaging within constant solid angle (CSA) (Aganj et al., 2010); and 6) a generalized Fourier transform approach known as generalized q-sampling imaging (GQI) (Yeh et al., 2010). We investigate these methods using 20, 30, 40, 60, 90 and 120 evenly distributed q-space samples of a single shell, and focus on a signal-to-noise ratio (SNR = 18) and diffusion-weighting (b = 1000 s/mm2) common to clinical studies. 
We found the BSM and CSD methods consistently yielded the least fiber orientation error and simultaneously the greatest detection rate of fibers. Fiber detection rate was found to be the most distinguishing characteristic between the methods, and a significant factor for complete recovery of tractography through complex white-matter pathways. For example, while all methods recovered similar tractography of prominent white matter pathways of limited fiber crossing, CSD (which had the highest fiber detection rate, especially for voxels containing three fibers) recovered the greatest number of fibers and the largest fraction of correct tractography for a complex three-fiber crossing region. The synthetic data sets, ground truth, and tools for quantitative evaluation are publicly available on the NITRC website as the project “Simulated DW-MRI Brain Data Sets for Quantitative Evaluation of Estimated Fiber Orientations” at http://www.nitrc.org/projects/sim_dwi_brain. PMID:25555998
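
    The per-voxel accuracy metric in an evaluation like this must respect the antipodal symmetry of fiber orientations (a fiber along +v is the same fiber as along -v). A minimal sketch of such an angular-error measure, using illustrative vectors rather than the paper's data:

```python
import math

def angular_error_deg(u, v):
    """Angular error between two fiber orientations. Orientations are
    antipodally symmetric, so we take |dot| before the arccos."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    c = min(1.0, abs(dot) / (nu * nv))  # clamp guards float round-off
    return math.degrees(math.acos(c))

# True fiber along x; estimated fiber tilted 10 degrees in the x-y plane.
t = math.radians(10.0)
est = (math.cos(t), math.sin(t), 0.0)
err = angular_error_deg((1.0, 0.0, 0.0), est)
```

    With this convention an estimate pointing exactly opposite the true fiber scores zero error, as it should for an orientation (rather than a direction) estimate.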

  10. Fiber estimation and tractography in diffusion MRI: development of simulated brain images and comparison of multi-fiber analysis methods at clinical b-values.

    PubMed

    Wilkins, Bryce; Lee, Namgyun; Gajawelli, Niharika; Law, Meng; Leporé, Natasha

    2015-04-01

    Advances in diffusion-weighted magnetic resonance imaging (DW-MRI) have led to many alternative diffusion sampling strategies and analysis methodologies. A common objective among methods is estimation of white matter fiber orientations within each voxel, as doing so permits in-vivo fiber-tracking and the ability to study brain connectivity and networks. Knowledge of how DW-MRI sampling schemes affect fiber estimation accuracy, tractography and the ability to recover complex white-matter pathways, differences between results due to choice of analysis method, and which method(s) perform optimally for specific data sets, all remain important problems, especially as tractography-based studies become common. In this work, we begin to address these concerns by developing sets of simulated diffusion-weighted brain images which we then use to quantitatively evaluate the performance of six DW-MRI analysis methods in terms of estimated fiber orientation accuracy, false-positive (spurious) and false-negative (missing) fiber rates, and fiber-tracking. The analysis methods studied are: 1) a two-compartment "ball and stick" model (BSM) (Behrens et al., 2003); 2) a non-negativity constrained spherical deconvolution (CSD) approach (Tournier et al., 2007); 3) analytical q-ball imaging (QBI) (Descoteaux et al., 2007); 4) q-ball imaging with Funk-Radon and Cosine Transform (FRACT) (Haldar and Leahy, 2013); 5) q-ball imaging within constant solid angle (CSA) (Aganj et al., 2010); and 6) a generalized Fourier transform approach known as generalized q-sampling imaging (GQI) (Yeh et al., 2010). We investigate these methods using 20, 30, 40, 60, 90 and 120 evenly distributed q-space samples of a single shell, and focus on a signal-to-noise ratio (SNR = 18) and diffusion-weighting (b = 1000 s/mm²) common to clinical studies. We found that the BSM and CSD methods consistently yielded the least fiber orientation error and simultaneously the greatest detection rate of fibers.
Fiber detection rate was found to be the most distinguishing characteristic between the methods, and a significant factor for complete recovery of tractography through complex white-matter pathways. For example, while all methods recovered similar tractography of prominent white matter pathways of limited fiber crossing, CSD (which had the highest fiber detection rate, especially for voxels containing three fibers) recovered the greatest number of fibers and the largest fraction of correct tractography for complex three-fiber crossing regions. The synthetic data sets, ground truth, and tools for quantitative evaluation are publicly available on the NITRC website as the project "Simulated DW-MRI Brain Data Sets for Quantitative Evaluation of Estimated Fiber Orientations" at http://www.nitrc.org/projects/sim_dwi_brain. Copyright © 2014 Elsevier Inc. All rights reserved.

  11. Anxiety, depression, and health-related quality of life in heterozygous familial hypercholesterolemia: A systematic review and meta-analysis.

    PubMed

    Akioyamen, Leo E; Genest, Jacques; Shan, Shubham D; Inibhunu, Happy; Chu, Anna; Tu, Jack V

    2018-06-01

    Heterozygous familial hypercholesterolemia (FH) is a common genetic disease predisposing affected individuals to a high risk of cardiovascular disease. Yet, considerable uncertainty exists regarding its impact on psychosocial well-being. We performed a systematic review and meta-analysis of the association between FH and symptoms of anxiety and depression, and health-related quality of life (HRQL). We searched MEDLINE, EMBASE, Global Health, the Cochrane Library, PsycINFO, and PubMed for peer-reviewed literature published in English between January 1, 1990 and January 1, 2018. Quantitative and qualitative studies were eligible if they included patients with confirmed FH and evaluated its association with symptoms of anxiety or depression, or HRQL. We performed a narrative synthesis of studies, including thematic analysis of qualitative studies, and where data permitted, random-effects meta-analysis reporting standardized mean differences (SMD) and 95% confidence intervals. We found 10 eligible studies measuring HRQL, depression, and anxiety. Random-effects meta-analysis of 4 (n = 4293) and 5 studies (n = 5098), respectively, showed that patients with FH had slightly lower symptoms of anxiety (SMD: -0.29 [95% CI: -0.53, -0.04]) and mental HRQL (SMD: -0.10 [95% CI: -0.20, -0.00]) relative to general population controls. No significant differences existed in depressive symptoms (SMD: 0.04 [95% CI: -0.12, 0.19]) or physical HRQL scores (SMD: 0.02 [95% CI: -0.09, 0.12]). Our systematic review suggests that patients with FH may report small but measurable differences in anxiety symptoms and mental HRQL. Copyright © 2018 Elsevier Inc. All rights reserved.
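
    Random-effects pooling of SMDs with 95% CIs, as reported here, is commonly done with the DerSimonian-Laird estimator of between-study variance; the abstract does not say which estimator was used, so the following is a generic sketch with made-up study effects, not the paper's data:

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate with DerSimonian-Laird tau^2."""
    w = [1.0 / v for v in variances]           # fixed-effect weights
    sw = sum(w)
    ybar = sum(wi * yi for wi, yi in zip(w, effects)) / sw
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, effects))
    k = len(effects)
    c = sw - sum(wi * wi for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)         # between-study variance
    wr = [1.0 / (v + tau2) for v in variances] # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(wr, effects)) / sum(wr)
    se = math.sqrt(1.0 / sum(wr))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical SMDs and variances for four studies (illustration only).
smd, ci = dersimonian_laird([-0.4, -0.2, -0.35, -0.1],
                            [0.01, 0.02, 0.015, 0.01])
```

    When the heterogeneity statistic Q falls below its degrees of freedom, tau² is truncated at zero and the estimate collapses to the fixed-effect result.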

  12. Automated Quantification of Hematopoietic Cell – Stromal Cell Interactions in Histological Images of Undecalcified Bone

    PubMed Central

    Zehentmeier, Sandra; Cseresnyes, Zoltan; Escribano Navarro, Juan; Niesner, Raluca A.; Hauser, Anja E.

    2015-01-01

    Confocal microscopy is the method of choice for the analysis of localization of multiple cell types within complex tissues such as the bone marrow. However, the analysis and quantification of cellular localization is difficult, as in many cases it relies on manual counting, thus bearing the risk of introducing a rater-dependent bias and reducing interrater reliability. Moreover, it is often difficult to judge whether the co-localization between two cells results from random positioning, especially when cell types differ strongly in the frequency of their occurrence. Here, a method for unbiased quantification of cellular co-localization in the bone marrow is introduced. The protocol describes the sample preparation used to obtain histological sections of whole murine long bones including the bone marrow, as well as the staining protocol and the acquisition of high-resolution images. An analysis workflow spanning from the recognition of hematopoietic and non-hematopoietic cell types in 2-dimensional (2D) bone marrow images to the quantification of the direct contacts between those cells is presented. This also includes a neighborhood analysis, to obtain information about the cellular microenvironment surrounding a certain cell type. In order to evaluate whether co-localization of two cell types is the mere result of random cell positioning or reflects preferential associations between the cells, a simulation tool suitable for testing this hypothesis for both hematopoietic and stromal cells is used. This approach is not limited to the bone marrow, and can be extended to other tissues to permit reproducible, quantitative analysis of histological data. PMID:25938636
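
    The random-positioning test can be sketched as a Monte Carlo simulation: scatter the two cell populations uniformly at random over the section and ask how often the observed number of direct contacts arises by chance. This is an illustrative stand-in for the protocol's simulation tool, with invented counts and dimensions:

```python
import random

def contact_count(a, b, d):
    """Number of A-B cell pairs closer than distance d (contact proxy)."""
    d2 = d * d
    return sum(1 for ax, ay in a for bx, by in b
               if (ax - bx) ** 2 + (ay - by) ** 2 < d2)

def random_contact_p(observed, n_a, n_b, width, height, d,
                     trials=500, seed=1):
    """Empirical p-value: fraction of random placements whose contact
    count reaches the observed count."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        a = [(rng.uniform(0, width), rng.uniform(0, height)) for _ in range(n_a)]
        b = [(rng.uniform(0, width), rng.uniform(0, height)) for _ in range(n_b)]
        if contact_count(a, b, d) >= observed:
            hits += 1
    return hits / trials

# 40 observed contacts between 20 and 30 cells on a 100x100 um field
# (hypothetical numbers) is far above the random expectation.
p = random_contact_p(observed=40, n_a=20, n_b=30,
                     width=100.0, height=100.0, d=5.0)
```

    A small p suggests preferential association; a large p means the observed co-localization is compatible with random positioning.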

  13. Development of High Speed Imaging and Analysis Techniques for Compressible Dynamic Stall

    NASA Technical Reports Server (NTRS)

    Chandrasekhara, M. S.; Carr, L. W.; Wilder, M. C.; Davis, Sanford S. (Technical Monitor)

    1996-01-01

    Dynamic stall has limited the flight envelope of helicopters for many years. The problem has been studied in the laboratory as well as in flight, but most research, even in the laboratory, has been restricted to surface measurement techniques such as pressure transducers or skin friction gauges, except at low speed. From this research, it became apparent that flow visualization tests performed at Mach numbers representing actual flight conditions were needed if the complex physics associated with dynamic stall was to be properly understood. However, visualization of the flow field under compressible conditions required carefully aligned and meticulously reconstructed holographic interferometry. As part of a long-range effort focused on exposing the physics of compressible dynamic stall, a research wind tunnel was developed at NASA Ames Research Center which permits visual access to the full flow field surrounding an oscillating airfoil during compressible dynamic stall. Initially, a stroboscopic schlieren technique was used for visualization of the stall process, but the primary research tool has been point diffraction interferometry (PDI), a technique carefully optimized for use in this project. A review of the development of PDI will be presented in the full paper. One of the most valuable aspects of PDI is the fact that interferograms are produced in real time on a continuous basis. The use of a rapidly pulsed laser makes this practical; a discussion of this approach will be presented in the full paper. This rapid pulsing (up to 40,000 pulses/sec) produces interferograms of the rapidly developing dynamic stall field in sufficient resolution (both in space and time) that the fluid physics of the compressible dynamic stall flowfield can be quantitatively determined, including the gradients of pressure in space and time.
This permits analysis of the influence of pitch rate, Mach number, Reynolds number, amplitude of oscillation, and other parameters on the dynamic stall process. When interferograms can be captured in real time, real-time mapping of a developing unsteady flow such as dynamic stall becomes a possibility. This has been achieved in the present case through the use of a high-speed drum camera combined with electronic circuitry, which has resulted in a series of interferograms obtained during a single cycle of dynamic stall; images obtained at the rate of 20 kHz will be presented as part of the formal presentation. Interferometry has been available for a long time; however, most of its use has been limited to visualization. The present research has focused on the use of interferograms for quantitative mapping of the flow over oscillating airfoils. Instantaneous pressure distributions can now be obtained semi-automatically, making practical the analysis of the thousands of interferograms that are produced in this research. A review of the techniques that have been developed as part of this research effort will be presented in the final paper.

  14. A conceptual analysis of the application of tradable permits to biodiversity conservation.

    PubMed

    Wissel, Silvia; Wätzold, Frank

    2010-04-01

    Tradable permits have been applied in many areas of environmental policy and may be a response to increasing calls for flexible conservation instruments that successfully conserve biodiversity while allowing for economic development. The idea behind applying tradable permits to conservation is that developers wishing to turn land to economic purposes, thereby destroying valuable habitat, may only do so if they submit a permit to the conservation agency showing that habitat of at least the equivalent ecological value is restored elsewhere. The developer himself does not need to carry out the restoration, but may buy a permit from a third party, thus allowing a market to emerge. Nevertheless, the application of tradable permits to biodiversity conservation is a complex issue because destroyed and restored habitats are likely to differ. There may be various trade-offs between the ecological requirements that destroyed and restored habitats be as similar as possible, and the need for a certain level of market activity to have a functioning trading system. The success of tradable permits as an instrument for reconciling the conflicts between economic development and conservation depends on the existence of certain economic, institutional, and ecological preconditions, for example, a functioning institutional framework, sufficient expert knowledge, and adequate monitoring and enforcement mechanisms.

  15. Spatial Conflict of Mining Land in Tolitoli District -Province of Central Sulawesi

    NASA Astrophysics Data System (ADS)

    Suwarno, Y.; Windiastuti, R.

    2018-05-01

    Spatial planning is supposed to be applied in the use of space, so that there is no overlapping space utilization. In fact, there are still overlapping uses of land between mining and plantation areas, as well as with forest areas. The purpose of this study was to identify the conflicts that occurred due to overlapping permits given to mining and plantation companies, and also with respect to forest status. The method used was to overlay the Mining Business Permit map with the Plantation Business Permit map, and also with the Forest Area Map. In Tolitoli District there were 23 mining business permit holders with 7 types of mining commodities, covering a total area of 81,503.54 hectares. In addition, there were 5 companies holding plantation business permits, mostly for palm oil, and only 2 companies with rubber and sengon wood commodities, with a total area of 80,005.35 hectares. From the result of the spatial analysis, it was found that there was an overlapping area of 22,869.70 hectares, while an area of 118,072.93 hectares did not overlap. The Mining Business Permit overlapped with the Plantation Business Permit over an area of 18,853.32 hectares, and 4,301.77 hectares were located in Protected Forest and Nature Reserve areas.
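
    The overlay analysis was presumably done in a GIS, but the bookkeeping reduces to set intersection of map footprints. A toy raster sketch with hypothetical 1-hectare cells, not the study's actual geometries:

```python
# Each permit footprint as a set of raster cells; one cell = 1 hectare
# (hypothetical grid, for illustration only).
CELL_HA = 1.0

mining = {(x, y) for x in range(0, 10) for y in range(0, 6)}      # 60 cells
plantation = {(x, y) for x in range(7, 14) for y in range(0, 6)}  # 42 cells
protected = {(x, y) for x in range(0, 3) for y in range(4, 8)}

# Overlap areas are just intersection cardinalities times cell area.
mining_plantation = len(mining & plantation) * CELL_HA
mining_protected = len(mining & protected) * CELL_HA
```

    Vector overlays in a real GIS do the same accounting with polygon intersection instead of cell counting, which is where figures like 18,853.32 hectares come from.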

  16. Multiple Interactive Pollutants in Water Quality Trading

    NASA Astrophysics Data System (ADS)

    Sarang, Amin; Lence, Barbara J.; Shamsai, Abolfazl

    2008-10-01

    Efficient environmental management calls for the consideration of multiple pollutants, for which two main types of transferable discharge permit (TDP) program have been described: separate permits that manage each pollutant individually in separate markets, with each permit based on the quantity of the pollutant or its environmental effects, and weighted-sum permits that aggregate several pollutants as a single commodity to be traded in a single market. In this paper, we perform a mathematical analysis of TDP programs for multiple pollutants that jointly affect the environment (i.e., interactive pollutants) and demonstrate the practicality of this approach for cost-efficient maintenance of river water quality. For interactive pollutants, the relative weighting factors are functions of the water quality impacts, marginal damage function, and marginal treatment costs at optimality. We derive the optimal set of weighting factors required by this approach for important scenarios for multiple interactive pollutants and propose using an analytical elasticity of substitution function to estimate damage functions for these scenarios. We evaluate the applicability of this approach using a hypothetical example that considers two interactive pollutants. We compare the weighted-sum permit approach for interactive pollutants with individual permit systems and TDP programs for multiple additive pollutants. We conclude by discussing practical considerations and implementation issues that result from the application of weighted-sum permit programs.
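
    A weighted-sum permit aggregates several pollutants into one traded commodity: each source's permit requirement is a weighted sum of its discharges. A minimal sketch with hypothetical weights and loads (the paper derives the optimal weights from marginal damages and treatment costs; the numbers below are illustrative):

```python
def weighted_sum_permits(discharges, weights):
    """Permit demand per source under a weighted-sum TDP scheme: the
    weighted sum of that source's discharges of each pollutant."""
    return {src: sum(weights[p] * q for p, q in load.items())
            for src, load in discharges.items()}

# Hypothetical relative weighting factors for two interactive pollutants.
weights = {"BOD": 1.0, "NH3": 2.5}
demand = weighted_sum_permits(
    {"plant_A": {"BOD": 100.0, "NH3": 20.0},
     "plant_B": {"BOD": 40.0, "NH3": 60.0}},
    weights)
```

    Under separate permits the two pollutants would trade in two markets; here a single market clears on the aggregate quantity.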

  17. How to use sequence analysis for life course epidemiology? An example on HIV-positive Sub-Saharan migrants in France.

    PubMed

    Gosselin, Anne; Desgrées du Loû, Annabel; Lelièvre, Eva

    2018-06-01

    Life course epidemiology is now an established field in social epidemiology; in sociodemography, the quantitative analysis of biographies has recently experienced a significant shift from event history analysis to sequence analysis. The purpose of this article is to introduce and adapt this methodology to a social epidemiology question, taking as an example the impact of HIV diagnosis on Sub-Saharan migrants' residential trajectories in the Paris region. The sample consists of 640 migrants born in Sub-Saharan Africa receiving HIV care. They were interviewed in healthcare facilities in the Paris region within the PARCOURS project, conducted from 2012 to 2013, using life event history calendars, which recorded year by year their health, family and residential histories. We introduce a two-step methodological approach consisting of (1) sequence analysis by optimal matching to build a typology of migrants' residential pathways before and after diagnosis, and (2) a Cox model of the probability of experiencing changes in the residential situation. The seven-cluster typology shows that for a majority, the HIV diagnosis did not entail changes in residential situation. However, 30% of the migrants experienced a change in their residential situation at the time of diagnosis. The Cox model analysis reveals that this residential change was in fact moving in with one's partner (HR 2.99, P<0.001) rather than network rejection. This original combination of sequence analysis and Cox models is a powerful approach that could be applied to other themes and constitutes a new addition to the life course epidemiology toolbox. NCT02566148. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
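
    Step (1), sequence analysis by optimal matching, computes pairwise dissimilarities between state sequences with a dynamic program (an edit distance with insertion/deletion and substitution costs); the resulting distance matrix is then clustered into a typology. A minimal sketch with hypothetical yearly housing states and costs:

```python
def optimal_matching(seq_a, seq_b, indel=1.0, sub=2.0):
    """Optimal-matching distance between two state sequences: minimum
    total cost of insertions, deletions, and substitutions (classic DP)."""
    n, m = len(seq_a), len(seq_b)
    d = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = i * indel
    for j in range(1, m + 1):
        d[0][j] = j * indel
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = 0.0 if seq_a[i - 1] == seq_b[j - 1] else sub
            d[i][j] = min(d[i - 1][j] + indel,      # deletion
                          d[i][j - 1] + indel,      # insertion
                          d[i - 1][j - 1] + cost)   # match/substitution
    return d[n][m]

# Hypothetical yearly states: "O" own place, "H" hosted, "S" unstable.
dist = optimal_matching("OOHHS", "OOOHS")
```

    In practice the substitution costs are often derived from observed transition rates rather than set to a constant, and the full pairwise matrix feeds a hierarchical clustering.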

  18. A low-cost gradient system for high-performance liquid chromatography. Quantitation of complex pharmaceutical raw materials.

    PubMed

    Erni, F; Frei, R W

    1976-09-29

    A device is described that makes use of an eight-port motor valve to generate step gradients on the low-pressure side of a piston pump with a low dead volume. Such a gradient device with an automatic control unit, which also permits repetition of previous steps, can be built for about half the cost of a gradient system with two pumps. Applications of this gradient unit to the separation of complex mixtures of glycosides and alkaloids are discussed and compared with separation systems using two high-pressure pumps. The gradients, which are used on reversed-phase material with solvent mixtures of water and completely miscible organic solvents, are suitable for quantitative routine control of pharmaceutical products. The reproducibility of retention data is excellent over several months and, with the use of loop injectors, major components can be determined quantitatively with a reproducibility of better than 2% (relative standard deviation). The step gradient selector valve can also be used as an introduction system for very large sample volumes. Up to 1 l can be injected, and samples with concentrations of less than 1 ppb can be determined with good reproducibility.

  19. A risk/cost analysis of alternative screening intervals for occupational tuberculosis infection.

    PubMed

    Nicas, M

    1998-02-01

    The Centers for Disease Control and Prevention (CDC) recommends that new health care employees receive a baseline skin test for Mycobacterium tuberculosis (M. tb) infection and that testing be repeated periodically. However, CDC does not explain the quantitative basis for its suggested screening intervals. This article examines the efficacy of alternative screening intervals for workers subject to different annual rates of M. tb infection and estimates the costs. An equation is developed for the cumulative risk of tuberculosis (TB) at 12 years given a specified annual rate of infection (ARI), screening interval, and a combined proportion (p) of successful skin testing and antibiotic prophylaxis. Equations for the total cost of screening and the cost per disease case prevented are provided. Results assume: (a) costs of $10 per skin test and $10,000 per TB disease case; (b) p = 0.88; and (c) an acceptable cumulative TB risk of 1 per 1000. For ARIs that might be deemed low (0.2% to 0.5%) and medium (1%), CDC screening intervals of 12 months and 6-12 months, respectively, minimize the cost per disease case prevented but permit residual disease risks greater than 1 per 1000. Recommended screening intervals are (i) 6 months for low-risk employee groups and (ii) 3 months for medium- and high-risk (e.g., ARIs of > or = 5%) groups. Interval (i) limits risk to 1 per 1000 and is approximately 50% shorter than the CDC interval for a low-risk group. Interval (ii), which is 67% shorter than the CDC interval for medium-risk groups but equal to that recommended for high-risk groups, permits a risk above 1 per 1000, but is likely the shortest feasible interval.
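
    The paper's actual risk equation is not reproduced in the abstract; the sketch below is a simplified stand-in showing how the cumulative 12-year risk falls as the screening interval shrinks. The progression hazard `h` and the interval/2 mean-wait approximation are assumptions for illustration only:

```python
import math

def cumulative_tb_risk(ari, interval_yr, p, years=12, h=0.05):
    """Illustrative stand-in for the published risk equation: each year a
    worker is infected with probability `ari`; a fraction `p` of infections
    is caught and cleared at the next screen, but progression to disease
    (constant hazard `h`/yr) may occur during the mean wait of interval/2;
    the remaining (1 - p) of infections go unhandled for the full horizon."""
    per_infection = (1 - p) * (1 - math.exp(-h * years)) \
        + p * (1 - math.exp(-h * interval_yr / 2))
    expected_infected = 1 - (1 - ari) ** years
    return expected_infected * per_infection

# Halving the interval from 12 to 6 months lowers the residual risk.
r6 = cumulative_tb_risk(ari=0.005, interval_yr=0.5, p=0.88)
r12 = cumulative_tb_risk(ari=0.005, interval_yr=1.0, p=0.88)
```

    The qualitative behavior (shorter intervals buy lower residual risk at higher screening cost) is what drives the paper's comparison of intervals against the 1-per-1000 target.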

  20. Determination of methylmercury and inorganic mercury in water samples by slurry sampling cold vapor atomic absorption spectrometry in a flow injection system after preconcentration on silica C(18) modified.

    PubMed

    Segade, Susana Río; Tyson, Julian F

    2007-03-15

    A novel method for the preconcentration of methylmercury and inorganic mercury from water samples was developed, involving the determination of ng l⁻¹ levels of analytes retained on a silica C18 solid sorbent, after complexation with ammonium pyrrolidine dithiocarbamate (APDC), by slurry sampling cold vapor atomic absorption spectrometry (SS-CVAAS) in a flow injection (FI) system. Several variables affecting either the retention of both mercury species, such as APDC concentration, silica C18 amount, and agitation times, or their determination, including hydrochloric acid concentration in the suspension medium, peristaltic pump speed and argon flow-rate, were optimized. A Plackett-Burman saturated factorial design permitted differentiation of the parameters influencing the preconcentration efficiency, which were then optimized by the sequential simplex method. The contact time between the mercury-containing solution and APDC required to reach efficient sorption was decreased from 26 to 3 min by the use of sonication instead of magnetic stirring. The use of a 1 mol dm⁻³ hydrochloric acid suspension medium and 0.75% (m/v) sodium borohydride reducing agent permitted the selective determination of methylmercury. The combination of 5 mol dm⁻³ hydrochloric acid and 10⁻⁴% (m/v) sodium borohydride was used for the selective determination of inorganic mercury. The detection limits achieved for methylmercury and inorganic mercury determination under optimum conditions were 0.96 and 0.25 ng l⁻¹, respectively. The reliability of the proposed method for the determination of both mercury species in waters was checked by the analysis of samples spiked with known concentrations of methylmercury and inorganic mercury; quantitative recoveries were obtained.

  1. Pan-European climate at convection-permitting scale: a model intercomparison study

    NASA Astrophysics Data System (ADS)

    Berthou, Ségolène; Kendon, Elizabeth J.; Chan, Steven C.; Ban, Nikolina; Leutwyler, David; Schär, Christoph; Fosser, Giorgia

    2018-03-01

    We investigate the effect of using convection-permitting models (CPMs) spanning a pan-European domain on the representation of the precipitation distribution at a climatic scale. In particular, we compare two 2.2 km models with two 12 km models run by ETH Zürich (ETH-12 km and ETH-2.2 km) and the Met Office (UKMO-12 km and UKMO-2.2 km). The two CPMs yield qualitatively similar differences to the precipitation climatology compared to the 12 km models, despite using different dynamical cores and different parameterization packages. A quantitative analysis confirms that the CPMs give the largest differences compared to the 12 km models in the hourly precipitation distribution in regions and seasons where convection is a key process: in summer across the whole of Europe and in autumn over the Mediterranean Sea and coasts. Mean precipitation is increased over high orography, with an increased amplitude of the diurnal cycle. We highlight that both CPMs show an increased number of moderate to intense short-lasting events and a decreased number of longer-lasting low-intensity events everywhere, correcting (and often over-correcting) biases in the 12 km models. The overall hourly distribution and the intensity of the most intense events are improved in Switzerland and to a lesser extent in the UK but deteriorate in Germany. The timing of the peak in the diurnal cycle of precipitation is improved. At the daily time-scale, differences in the precipitation distribution are less clear, but the greater Alpine region stands out with the largest differences. Also, Mediterranean autumnal intense events are better represented at the daily time-scale in both 2.2 km models, due to improved representation of mesoscale processes.

  2. Cloud and Aerosol Retrieval for the 2001 GLAS Satellite Lidar Mission

    NASA Technical Reports Server (NTRS)

    Hart, William D.; Palm, Stephen P.; Spinhirne, James D.

    2000-01-01

    The Geoscience Laser Altimeter System (GLAS) is scheduled for launch in July of 2001 aboard the Ice, Cloud and Land Elevation Satellite (ICESat). In addition to being a precision altimeter for mapping the height of the Earth's ice sheets, GLAS will be an atmospheric lidar, sensitive enough to detect gaseous, aerosol, and cloud backscatter signals, at horizontal and vertical resolutions of 175 and 75 m, respectively. GLAS will be the first lidar to produce temporally continuous atmospheric backscatter profiles with nearly global coverage (94-degree orbital inclination). With a projected operational lifetime of five years, GLAS will collect approximately six billion lidar return profiles. The large volume of data dictates that operational analysis algorithms, which need to keep pace with the data yield of the instrument, must be efficient. We therefore need to evaluate the ability of operational algorithms to detect atmospheric constituents that affect global climate, and to quantify, in a statistical manner, the accuracy and precision of GLAS cloud and aerosol observations. Our poster presentation will show the results of modeling studies that are designed to reveal the effectiveness and sensitivity of GLAS in detecting various atmospheric cloud and aerosol features. The studies consist of analyzing simulated lidar returns. Simulation cases are constructed either from idealized renditions of atmospheric cloud and aerosol layers or from data obtained by the NASA ER-2 Cloud Lidar System (CLS). The fabricated renditions permit quantitative evaluation of the ability of operational algorithms to retrieve cloud and aerosol parameters. The use of observational data permits the evaluation of performance for actual atmospheric conditions. The intended outcome of the presentation is that the climatology community will be able to use the results of these studies to evaluate and quantify the impact of GLAS data upon atmospheric modeling efforts.

  3. Deficient Contractor Business Systems: Applying the Value at Risk (VaR) Model to Earned Value Management Systems

    DTIC Science & Technology

    2013-06-30

    QUANTITATIVE RISK ANALYSIS The use of quantitative cost risk analysis tools can be valuable in measuring numerical risk to the government (Galway, 2004)... assessment of the EVMS itself. Galway (2004) practically linked project quantitative risk assessment to EVM by focusing on cost, schedule, and... Galway, L. (2004, February). Quantitative risk analysis for project management: A critical review (RAND Working Paper WR-112-RC).

  4. 78 FR 13304 - Notice of Decision To Issue Permits for the Importation of Strawberry Fruit From Egypt Into the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-27

    ...] Notice of Decision To Issue Permits for the Importation of Strawberry Fruit From Egypt Into the... continental United States of fresh strawberry fruit from Egypt. Based on the findings of a pest risk analysis... strawberry fruit from Egypt. DATES: Effective Date: February 27, 2013. FOR FURTHER INFORMATION CONTACT: Mr...

  5. What Are We Doing When We Translate from Quantitative Models?

    PubMed Central

    Critchfield, Thomas S; Reed, Derek D

    2009-01-01

    Although quantitative analysis (in which behavior principles are defined in terms of equations) has become common in basic behavior analysis, translational efforts often examine everyday events through the lens of narrative versions of laboratory-derived principles. This approach to translation, although useful, is incomplete because equations may convey concepts that are difficult to capture in words. To support this point, we provide a nontechnical introduction to selected aspects of quantitative analysis; consider some issues that translational investigators (and, potentially, practitioners) confront when attempting to translate from quantitative models; and discuss examples of relevant translational studies. We conclude that, where behavior-science translation is concerned, the quantitative features of quantitative models cannot be ignored without sacrificing conceptual precision, scientific and practical insights, and the capacity of the basic and applied wings of behavior analysis to communicate effectively. PMID:22478533
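
    As an example of what an equation conveys that a narrative gloss loses, consider Mazur's hyperbolic discounting model, a standard quantitative model in behavior analysis (chosen here for illustration; the abstract does not single it out):

```python
def hyperbolic_value(amount, delay, k):
    """Mazur's hyperbolic discounting model: V = A / (1 + k*D)."""
    return amount / (1.0 + k * delay)

# The narrative version ("delayed rewards are worth less") omits what the
# equation makes explicit: value falls steeply at short delays and only
# slowly at long ones, which is why preference reversals can occur.
v_short = hyperbolic_value(100.0, 1.0, k=0.1)
v_long = hyperbolic_value(100.0, 30.0, k=0.1)
```

    The curvature parameter k, invisible in the narrative rendering, is exactly the kind of quantitative feature the authors argue translation cannot afford to ignore.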

  6. The scaling law of human travel - A message from George

    NASA Astrophysics Data System (ADS)

    Brockmann, Dirk; Hufnagel, Lars

    The dispersal of individuals of a species is the key driving force of various spatiotemporal phenomena which occur on geographical scales. It can synchronize populations of interacting species, stabilize them, and diversify gene pools.1-3 The geographic spread of human infectious diseases such as influenza, measles and the recent severe acute respiratory syndrome (SARS) is essentially promoted by human travel, which occurs on many length scales and is sustained by a variety of means of transportation.4-8 In the light of increasing international trade, intensified human traffic, and an imminent influenza A pandemic, knowledge of the dynamical and statistical properties of human dispersal is of fundamental and acute importance.7,9,10 A quantitative statistical theory for human travel and concomitant reliable forecasts would substantially improve and extend existing prevention strategies. Despite its crucial role, a quantitative assessment of human dispersal remains elusive, and the opinion that humans disperse diffusively still prevails in many models.11 In this chapter we report on a recently developed technique which permits a solid and quantitative assessment of human dispersal on geographical scales.12 The key idea is to infer the statistical properties of human travel by analysing the geographic circulation of individual bank notes, for which comprehensive datasets are collected at online bill-tracking websites. The analysis shows that the distribution of traveling distances decays as a power law, indicating that the movement of bank notes is reminiscent of superdiffusive, scale-free random walks known as Lévy flights.13 Secondly, the probability of remaining in a small, spatially confined region for a time T is dominated by heavy tails which attenuate superdiffusive dispersal. We will show that the dispersal of bank notes can be described on many spatiotemporal scales by a two-parameter continuous time random walk (CTRW) model with surprising accuracy.
We will provide a brief introduction to continuous time random walk theory14 and will show that human dispersal is an ambivalent, effectively superdiffusive process.
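
    The two-parameter CTRW picture described above (heavy-tailed jump lengths plus heavy-tailed waiting times) can be sketched in a few lines. This is a minimal illustrative simulation, not the implementation from ref. 12; the exponents `beta` and `alpha` are placeholder values, not the ones fitted to the bank-note data.

```python
import random

def pareto_sample(rng, x_min, exponent):
    """Inverse-transform sample from a power-law tail:
    P(X > x) = (x / x_min) ** (-exponent)."""
    u = 1.0 - rng.random()          # u in (0, 1], avoids division by zero
    return x_min * u ** (-1.0 / exponent)

def ctrw_trajectory(n_steps, rng, beta=1.6, alpha=0.6):
    """Minimal 1-D continuous-time random walk: heavy-tailed jump
    lengths (tail exponent beta) and heavy-tailed waiting times
    (tail exponent alpha) -- the two parameters of the model class."""
    x, t = 0.0, 0.0
    path = [(t, x)]
    for _ in range(n_steps):
        jump = pareto_sample(rng, 1.0, beta)   # how far the walker moves
        wait = pareto_sample(rng, 1.0, alpha)  # how long it rests first
        x += jump if rng.random() < 0.5 else -jump
        t += wait
        path.append((t, x))
    return path
```

    Rare, very long jumps (Lévy-flight behavior) come from the `beta` tail, while long rests in a confined region come from the `alpha` tail; the interplay of the two produces the ambivalent, effectively superdiffusive dispersal discussed in the text.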

  7. Analysis of airborne MAIS imaging spectrometric data for mineral exploration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang Jinnian; Zheng Lanfen; Tong Qingxi

    1996-11-01

    The high spectral resolution imaging spectrometric system made quantitative analysis and mapping of surface composition possible. The key issue is the quantitative approach for analysis of surface parameters from imaging spectrometer data. This paper describes the methods and stages of quantitative analysis: (1) extracting surface reflectance from the imaging spectrometer image, where laboratory and in-flight field measurements are conducted for calibration of the imaging spectrometer data, and atmospheric correction is used to obtain ground reflectance via the empirical line method and radiative transfer modeling; (2) determining the quantitative relationship between absorption band parameters from the imaging spectrometer data and the chemical composition of minerals; (3) spectral comparison between spectra from a spectral library and spectra derived from the imagery. A wavelet-analysis-based spectrum-matching technique for quantitative analysis of imaging spectrometer data has been developed. Airborne MAIS imaging spectrometer data were used for analysis, and the results have been applied to mineral and petroleum exploration in the Tarim Basin area, China. 8 refs., 8 figs.
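
    Step (1), the empirical line method, amounts to a band-wise linear regression from at-sensor radiance to field-measured reflectance over calibration targets (typically one bright and one dark site). A minimal sketch, with all function names and numbers illustrative rather than taken from the paper:

```python
def empirical_line_fit(radiance, reflectance):
    """Fit reflectance = gain * radiance + offset by ordinary least
    squares over the calibration targets of one spectral band."""
    n = len(radiance)
    mx = sum(radiance) / n
    my = sum(reflectance) / n
    sxx = sum((x - mx) ** 2 for x in radiance)
    sxy = sum((x - mx) * (y - my) for x, y in zip(radiance, reflectance))
    gain = sxy / sxx
    offset = my - gain * mx
    return gain, offset

def to_reflectance(image_band, gain, offset):
    """Apply the band-wise correction to every pixel of one band."""
    return [gain * dn + offset for dn in image_band]
```

    For example, fitting targets measured at radiances 10, 20, 30, 40 with reflectances 0.05, 0.15, 0.25, 0.35 yields gain 0.01 and offset -0.05, which then converts every pixel of that band to ground reflectance.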

  8. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method.

    PubMed

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-02-01

    To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil.
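
    The paired t-test used to compare the two quantification methods reduces to the statistic t = mean(d) / (sd(d) / sqrt(n)) over the per-sample differences d. A minimal sketch (illustrative, not the authors' code):

```python
import math

def paired_t_statistic(method_a, method_b):
    """Paired t-test statistic for two quantification methods applied
    to the same samples. Returns (t, degrees of freedom)."""
    d = [a - b for a, b in zip(method_a, method_b)]
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)  # sample variance
    t = mean_d / math.sqrt(var_d / n)
    return t, n - 1
```

    The statistic would then be compared against the two-tailed critical value for n - 1 degrees of freedom; a |t| below that value is what "no statistically significant difference" means here.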

  9. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method

    PubMed Central

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-01-01

    Objective To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. Methods TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Results Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. Conclusions The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil. PMID:25182282

  10. [The scientific revolution in medicine of second half of XX - early XXI centuries: occurrence of new conceptions about human organism and essence of diseases].

    PubMed

    Stepin, V S; Zatravkin, S N

    2016-01-01

    The article presents the results of an analysis of the works of eminent Russian physiologists and pathologists of the XX-XXI centuries. The analysis was carried out on the basis of the concept of the structure and dynamics of scientific cognition developed by one of the authors of the present article. It permits the affirmation that, during the second half of the XX and the early XXI centuries, medicine underwent, and continues to undergo, transformations whose character and scope fully correspond to a scientific revolution, and that the new conceptions arising and becoming established in medicine bear all the signs of the post-non-classical type of scientific rationality.

  11. Mean-trajectory approximation for electronic and vibrational-electronic nonlinear spectroscopy

    NASA Astrophysics Data System (ADS)

    Loring, Roger F.

    2017-04-01

    Mean-trajectory approximations permit the calculation of nonlinear vibrational spectra from semiclassically quantized trajectories on a single electronically adiabatic potential surface. By describing electronic degrees of freedom with classical phase-space variables and subjecting these to semiclassical quantization, mean-trajectory approximations may be extended to compute both nonlinear electronic spectra and vibrational-electronic spectra. A general mean-trajectory approximation for both electronic and nuclear degrees of freedom is presented, and the results for purely electronic and for vibrational-electronic four-wave mixing experiments are quantitatively assessed for harmonic surfaces with linear electronic-nuclear coupling.

  12. Orthoclinostatic test as one of the methods for evaluating the human functional state

    NASA Technical Reports Server (NTRS)

    Doskin, V. A.; Gissen, L. D.; Bomshteyn, O. Z.; Merkin, E. N.; Sarychev, S. B.

    1980-01-01

    The possible use of different methods to evaluate autonomic regulation in hygienic studies was examined, and the simplest and most objective tests were selected. It is shown that the use of the optimized standards not only makes it possible to detect unfavorable shifts earlier, but also permits a quantitative characterization of the degree of impairment in the state of the organism. Precise interpretation of the observed shifts is possible. Results indicate that the standards can serve as one of the criteria for evaluating the state and can be widely used in hygienic practice.

  13. Age discrimination among basalt flows using digitally enhanced LANDSAT imagery. [Saudi Arabia]

    NASA Technical Reports Server (NTRS)

    Blodget, H. W.; Brown, G. F.

    1984-01-01

    Digitally enhanced LANDSAT MSS data were used to discriminate among basalt flows of historical to Tertiary age at a test site in northwestern Saudi Arabia. Spectral signatures compared favorably with a field-defined classification that permits discrimination among five groups of basalt flows on the basis of geomorphic criteria. Characteristics that contributed to age definition include: surface texture, weathering, color, drainage evolution, and khabrah development. The inherent gradation in the evolution of geomorphic parameters, however, makes visual extrapolation between areas subjective. Therefore, incorporation of spectrally-derived volcanic units into the mapping process should produce more quantitatively consistent age groupings.

  14. ASSAY OF POLY-β-HYDROXYBUTYRIC ACID

    PubMed Central

    Law, John H.; Slepecky, Ralph A.

    1961-01-01

    Law, John H. (Harvard University, Cambridge, Mass.) and Ralph A. Slepecky. Assay of poly-β-hydroxybutyric acid. J. Bacteriol. 82:33–36. 1961—A convenient spectrophotometric assay of bacterial poly-β-hydroxybutyric acid has been devised. Quantitative conversion of poly-β-hydroxybutyric acid to crotonic acid by heating in concentrated sulfuric acid and determination of the ultraviolet absorption of the product permits an accurate determination of this material in quantities down to 5 μg. This method has been used to follow the production of poly-β-hydroxybutyric acid by Bacillus megaterium strain KM. PMID:13759651
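
    The assay's final step, converting the ultraviolet absorbance of the crotonic acid product into an amount of polymer, follows the Beer-Lambert law (c = A / (ε·l)). A hedged sketch: the molar absorptivity must come from standards run alongside the samples, and the function name and the numbers in the usage note are purely illustrative.

```python
def phb_micrograms(absorbance, epsilon_l_per_mol_cm, volume_ml,
                   path_cm=1.0, mw_crotonic=86.09):
    """Estimate micrograms of poly-beta-hydroxybutyrate (as crotonic
    acid equivalents) from the UV absorbance of the acid-conversion
    product, via Beer-Lambert: concentration = A / (epsilon * path).
    86.09 g/mol is the molecular weight of crotonic acid (C4H6O2)."""
    mol_per_l = absorbance / (epsilon_l_per_mol_cm * path_cm)
    grams = mol_per_l * (volume_ml / 1000.0) * mw_crotonic
    return grams * 1e6  # grams -> micrograms
```

    For instance, an absorbance of 0.5 with an (illustrative) ε of 1×10⁴ L mol⁻¹ cm⁻¹ in a 3.0 mL cuvette volume corresponds to roughly 13 μg of material.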

  15. Quantitative Data Analysis--In the Graduate Curriculum

    ERIC Educational Resources Information Center

    Albers, Michael J.

    2017-01-01

    A quantitative research study collects numerical data that must be analyzed to help draw the study's conclusions. Teaching quantitative data analysis is not teaching number crunching, but teaching a way of critical thinking for how to analyze the data. The goal of data analysis is to reveal the underlying patterns, trends, and relationships of a…

  16. A simple approach to quantitative analysis using three-dimensional spectra based on selected Zernike moments.

    PubMed

    Zhai, Hong Lin; Zhai, Yue Yuan; Li, Pei Zhen; Tian, Yue Li

    2013-01-21

    A very simple approach to quantitative analysis is proposed based on the technology of digital image processing using three-dimensional (3D) spectra obtained by high-performance liquid chromatography coupled with a diode array detector (HPLC-DAD). As the region-based shape features of a grayscale image, Zernike moments with inherently invariance property were employed to establish the linear quantitative models. This approach was applied to the quantitative analysis of three compounds in mixed samples using 3D HPLC-DAD spectra, and three linear models were obtained, respectively. The correlation coefficients (R(2)) for training and test sets were more than 0.999, and the statistical parameters and strict validation supported the reliability of established models. The analytical results suggest that the Zernike moment selected by stepwise regression can be used in the quantitative analysis of target compounds. Our study provides a new idea for quantitative analysis using 3D spectra, which can be extended to the analysis of other 3D spectra obtained by different methods or instruments.
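
    Region-based Zernike moments of a grayscale image, the shape features used here for 3D HPLC-DAD spectra rendered as images, can be computed directly from their definition: project the image onto the Zernike basis over the unit disk and take the rotation-invariant magnitude. A minimal pure-Python sketch (unoptimized, and not the authors' implementation):

```python
import cmath
import math

def radial_poly(n, m, rho):
    """Zernike radial polynomial R_nm(rho), m >= 0, n - m even."""
    r = 0.0
    for s in range((n - m) // 2 + 1):
        num = (-1) ** s * math.factorial(n - s)
        den = (math.factorial(s) * math.factorial((n + m) // 2 - s)
               * math.factorial((n - m) // 2 - s))
        r += num / den * rho ** (n - 2 * s)
    return r

def zernike_moment(image, n, m):
    """|Z_nm| of a square grayscale image (list of rows), with pixels
    mapped onto the unit disk; pixels outside the disk are ignored."""
    size = len(image)
    c = (size - 1) / 2.0
    total = 0.0 + 0.0j
    for yi, row in enumerate(image):
        for xi, val in enumerate(row):
            x, y = (xi - c) / c, (yi - c) / c
            rho = math.hypot(x, y)
            if rho > 1.0:
                continue
            theta = math.atan2(y, x)
            total += val * radial_poly(n, abs(m), rho) * cmath.exp(-1j * m * theta)
    return abs(total * (n + 1) / math.pi)
```

    In a workflow like the paper's, a vector of such magnitudes over selected (n, m) orders would serve as the predictors of the linear quantitative models, with stepwise regression picking which moments to keep.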

  17. A Quantitative Analysis Method of Trabecular Pattern in a Bone

    NASA Astrophysics Data System (ADS)

    Idesawa, Masanori; Yatagai, Toyohiko

    1982-11-01

    The orientation and density of the trabecular pattern observed in a bone are closely related to its mechanical properties, and diseases of a bone appear as changes of the orientation and/or density distribution of its trabecular patterns. These have so far been treated from a qualitative point of view because no quantitative analysis method had been established. In this paper, the authors propose and investigate quantitative methods for analyzing the density and orientation of trabecular patterns observed in a bone. These methods give an index for evaluating the orientation of a trabecular pattern quantitatively; they have been applied to the trabecular pattern observed in a head of femur, and their utility is confirmed. Key Words: Index of pattern orientation, Trabecular pattern, Pattern density, Quantitative analysis
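
    One common way to quantify texture orientation of this kind is a structure-tensor (gradient-based) index. The paper's own indices are not reproduced here, so the following is only an illustrative sketch of the general idea: sum the outer products of image gradients, then read dominant orientation and an anisotropy index off the 2x2 tensor's eigenstructure.

```python
import math

def orientation_index(image):
    """Dominant gradient orientation (radians) and an anisotropy index
    in [0, 1] from central-difference gradients summed into a 2x2
    structure tensor. Index near 0: isotropic texture; near 1: strongly
    oriented texture (the oriented structures run perpendicular to the
    returned gradient direction)."""
    h, w = len(image), len(image[0])
    jxx = jyy = jxy = 0.0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = (image[y][x + 1] - image[y][x - 1]) / 2.0
            gy = (image[y + 1][x] - image[y - 1][x]) / 2.0
            jxx += gx * gx
            jyy += gy * gy
            jxy += gx * gy
    tr = jxx + jyy
    det = jxx * jyy - jxy * jxy
    disc = math.sqrt(max(tr * tr / 4.0 - det, 0.0))
    lam1, lam2 = tr / 2.0 + disc, tr / 2.0 - disc   # tensor eigenvalues
    theta = 0.5 * math.atan2(2.0 * jxy, jxx - jyy)
    coherence = (lam1 - lam2) / (lam1 + lam2) if tr > 0 else 0.0
    return theta, coherence
```

    A horizontal intensity ramp (purely vertical structures) gives coherence 1.0 at orientation 0, while a featureless region gives coherence 0.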

  18. IMCS reflight certification requirements and design specifications

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The requirements for reflight certification are established. Software requirements encompass the software programs that are resident in the PCC, DEP, PDSS, EC, or any related GSE. A design approach for the reflight software packages is recommended. These designs will be of sufficient detail to permit the implementation of reflight software. The PDSS/IMC Reflight Certification system provides the tools and mechanisms for the user to perform the reflight certification test procedures, test data capture, test data display, and test data analysis. The system as defined will be structured to permit maximum automation of reflight certification procedures and test data analysis.

  19. Wavelength Selection Method Based on Differential Evolution for Precise Quantitative Analysis Using Terahertz Time-Domain Spectroscopy.

    PubMed

    Li, Zhi; Chen, Weidong; Lian, Feiyu; Ge, Hongyi; Guan, Aihong

    2017-12-01

    Quantitative analysis of component mixtures is an important application of terahertz time-domain spectroscopy (THz-TDS) and has attracted broad interest in recent research. Although the accuracy of quantitative analysis using THz-TDS is affected by a host of factors, wavelength selection from the sample's THz absorption spectrum is the most crucial component. The raw spectrum consists of signals from the sample and scattering and other random disturbances that can critically influence the quantitative accuracy. For precise quantitative analysis using THz-TDS, the signal from the sample needs to be retained while the scattering and other noise sources are eliminated. In this paper, a novel wavelength selection method based on differential evolution (DE) is investigated. By performing quantitative experiments on a series of binary amino acid mixtures using THz-TDS, we demonstrate the efficacy of the DE-based wavelength selection method, which yields an error rate below 5%.
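
    Differential evolution itself is a small population-based optimizer (DE/rand/1/bin in its basic form). The sketch below is a generic minimal implementation, not the authors' wavelength-selection code: in their setting the objective would score a candidate set of wavelengths by the resulting calibration error, whereas the demonstration here minimizes a simple sphere function.

```python
import random

def differential_evolution(objective, bounds, pop_size=20, f=0.8,
                           cr=0.9, generations=100, seed=0):
    """Minimal DE/rand/1/bin. Each candidate is a real vector (e.g. one
    weight per wavelength, with weights above a threshold marking the
    wavelengths kept for the calibration model)."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    scores = [objective(ind) for ind in pop]
    for _ in range(generations):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)          # force one mutated gene
            trial = []
            for j in range(dim):
                if rng.random() < cr or j == j_rand:
                    v = pop[a][j] + f * (pop[b][j] - pop[c][j])
                    lo, hi = bounds[j]
                    trial.append(min(max(v, lo), hi))  # clip to bounds
                else:
                    trial.append(pop[i][j])
            s = objective(trial)
            if s <= scores[i]:                   # greedy selection
                pop[i], scores[i] = trial, s
    best = min(range(pop_size), key=scores.__getitem__)
    return pop[best], scores[best]
```

    For example, `differential_evolution(lambda x: sum(v * v for v in x), [(-5, 5)] * 3)` drives all three components toward zero.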

  20. What Really Happens in Quantitative Group Research? Results of a Content Analysis of Recent Quantitative Research in "JSGW"

    ERIC Educational Resources Information Center

    Boyle, Lauren H.; Whittaker, Tiffany A.; Eyal, Maytal; McCarthy, Christopher J.

    2017-01-01

    The authors conducted a content analysis on quantitative studies published in "The Journal for Specialists in Group Work" ("JSGW") between 2012 and 2015. This brief report provides a general overview of the current practices of quantitative group research in counseling. The following study characteristics are reported and…
